HOW SWIR IS SOLVING THE LOW VISIBILITY CHALLENGE FOR ADAS AND AV
WHITEPAPER | AUTOMOTIVE
TABLE OF CONTENTS

01 SETTING THE STAGE - ADAS SYSTEMS AND ROAD VISIBILITY

02 PITFALLS AND SHORTCOMINGS IN IMPROVING DRIVER SAFETY
  i. Sensor Fusion Blind Spots
  ii. Adverse Weather and Low Light Conditions
  iii. Detectable and Undetectable Hazards

03 WHAT IS THE LOW VISIBILITY CHALLENGE?

04 SOLVING THE PROBLEM WITH SWIR
  i. What is the SWIR Spectrum?
  ii. SWIR Benefits
     a. Sensors in Focus - SWIR vs. Fog
     b. Sensors in Focus - SWIR vs. Night Time
     c. Sensors in Focus - SWIR vs. Dust
     d. Hazard Detection - Pedestrians and Animals on the Road
     e. Remote Material Sensing
     f. Seeing Through Glass
     g. Reusing Existing AI Algorithms
     h. Ease of Integration
     i. Eye Safety
  iii. A Deep Dive Into the Sensor Solutions: InGaAs-Based vs CMOS-Based SWIR Cameras

05 TRIEYE RAVEN CMOS BASED SWIR SENSOR
  i. HD Imaging - Seeing Clearly in all Conditions
  ii. The Cost Differential
  iii. Scalability
  iv. Reliability
  v. Small Form Factor

06 READY TO SEE IN SWIR?

07 CONCLUSION - THE FUTURE OF SWIR
01 SETTING THE STAGE
           ADAS SYSTEMS AND ROAD VISIBILITY
While autonomous vehicles are still far from being available to the average driver, most cars on the
market offer Advanced Driver-Assistance Systems (ADAS), semi-autonomous features that significantly
enhance vehicle reliability and safety.

Most collisions (94%) occur as a result of human error; ADAS applications have therefore been developed to increase driver safety and decrease the risk of an accident. ADAS features such as lane marking detection, pedestrian warning, and emergency braking were all developed to enable drivers to make better decisions. These applications have proven their value and have helped reduce the number of road fatalities.

However, as valuable as ADAS technology has proven to be at improving driver safety, there is still a long way to go. ADAS technology urgently needs better sensors that can assist drivers in identifying visible and invisible hazards, enabling these advanced systems to better “see” the world in front of and around them.

Today, driver-assistance systems are not functional in low light and adverse weather conditions. A recent AAA study of automated emergency braking systems with pedestrian detection found that none of the systems tested was able to detect an adult pedestrian crossing in front of the vehicle at night. The low visibility challenge must be solved, because most severe road accidents happen in these conditions. In fact, even though the number of miles driven at night is substantially lower than during the daytime, a disproportionate share of fatalities occurs after dark: in the United States, almost half (49%) of road fatalities occur at night, making nocturnal driving roughly twice as dangerous as daytime driving.

The fact that ADAS technology on the market today can reduce the likelihood and severity
of an accident is undisputed, but until now, it has not been able to offer a consistent solution when it
is most needed.

FIGURE 1: SENSOR FUSION UNDER A CLEAR BLUE SKY
[Diagram: the camera/radar sensor fusion capability map - resolution, contrast, cost-effectiveness, 3D scene, remote material sensing, detection and recognition range - alongside ADAS functions such as adaptive cruise control, emergency braking, pedestrian detection, collision avoidance, traffic sign recognition, lane departure warning, cross traffic alert, blind spot detection, rear collision warning, and surround view.]
02 PITFALLS AND SHORTCOMINGS
   IN IMPROVING DRIVER SAFETY
I. SENSOR FUSION BLIND SPOTS
Sensor fusion is the synergy of information from a variety of sensors. Most vehicles manufactured today combine two primary types of sensors for ADAS applications: standard visible-light (VIS) cameras and radar. Additionally, there is the LiDAR sensor, which is considered primarily applicable to higher automation applications.

VIS camera sensors are used for detecting lane markings, signs, vehicles, and pedestrians, whereas radar sensors are used to measure the distance to an object and its speed. LiDAR sensors measure the distance to objects at a higher resolution than radar, but are extremely expensive and difficult to scale and integrate.

In mass market applications, pricing is a crucial factor in decision-making. Within the automotive industry this consideration is even more crucial: ADAS components need to be cost-effective in order to be considered by Original Equipment Manufacturers (OEMs). In some cases, decision makers are forced to choose pricing over the integration of cutting-edge technologies. This intensifies the industry’s need for a cost-effective, reliable and safe system that can support HD vision in all driving conditions.

II. ADVERSE WEATHER AND LOW LIGHT CONDITIONS
A major hurdle in building a truly safe ADAS solution has been the creation of a sensor that can see through fog, haze and dust, as well as in the dark. This is an urgent concern, considering that 75% of pedestrian fatalities occur at night and the technology currently on the market has proven not to be up to the task of providing a clear view of the road ahead in low visibility driving conditions.

On a clear day, existing products can offer the required resolution, contrast, detection and recognition
range. However, they fail to compensate for common instances of diminished visibility. This is the most
critical challenge preventing a consistent and effective solution. Even when combining existing sensors
into one system, they are still unable to operate under all weather and lighting conditions.

As a result, driver assistance systems simply do not have the critical information about their immediate environment that they require to make smart and safe decisions. The danger this presents cannot be overstated, particularly in light of the fact that most severe road accidents happen in low visibility conditions.
FIGURE 2: SENSOR FUSION FAILS UNDER COMMON LOW VISIBILITY CONDITIONS
[Diagram: the same sensor fusion capability map and ADAS functions degraded under glare, dust, fog and haze, darkness, and smoke.]

III. DETECTABLE AND UNDETECTABLE HAZARDS
In addition to poor performance in adverse weather and low light conditions, the existing sensor fusion solution has shortfalls in accurate object detection. Even under optimal visibility conditions, its detection of hazards such as pedestrians wearing dark clothes or dark-furred animals crossing the road is extremely deficient and poses a serious danger.

Furthermore, while standard VIS cameras can differentiate between a person and an animal, they cannot always differentiate between a real person and a picture of a person, putting object detection at risk of “phantom” objects and other attacks. Being able to distinguish visible hazards on the road, like debris, pedestrians, and animals darting into the road, as well as hazards that the human eye cannot detect, such as black ice, is vital to saving lives.

Currently, invisible hazards such as black ice or oil slicks on asphalt are impossible to detect from a safe
distance. In adverse weather and low light conditions it can be an even greater challenge to identify
these invisible dangers on the road that threaten vehicle safety.

FIGURE 3: SENSOR FUSION LIMITED DETECTION CAPABILITIES OF UNSEEN HAZARDS
[Diagram: VIS camera and radar versus hazards such as an animal crossing, a pedestrian, an oil slick, and ice on the road.]
03 WHAT IS THE LOW
   VISIBILITY CHALLENGE?
Addressing the low visibility challenge is of the utmost importance. According to the Federal Highway Administration (FHWA), over 5 million crashes take place annually, 22% of which are weather-related. The ability to see while driving in snow, rain, fog and sandstorms would cut the number of accidents and significantly reduce the number of severe injuries and fatalities.

Night is also a critical factor in vehicle safety, as it is when pedestrians, drivers and passengers are at their most vulnerable. A disproportionate number of fatal road injuries occur after dark, and accident statistics across the United States reinforce the urgency of overcoming the limitations ADAS technologies face in nocturnal driving. According to NHTSA, the passenger vehicle occupant fatality rate at nighttime is about three times higher than the daytime rate. Nationwide, almost half (49%) of road fatalities occur at night.

Unfortunately, the sensor fusion systems available on the market do not work at the times they are most needed - when visibility is compromised. It is therefore critical for ADAS technology to evolve to a point where driving in bad weather and lighting conditions is no longer an activity that risks lives.

To this end, TriEye has developed the Raven CMOS-based SWIR solution, which takes a significant step
forward in improving visibility under challenging driving conditions.

FIGURE 4: CRITICAL REASONS FOR CRASHES INVESTIGATED IN THE NATIONAL MOTOR VEHICLE CRASH CAUSATION SURVEY CONDUCTED BY NHTSA
Source: crashstats.nhtsa.dot.gov
[Pie charts: environment-related critical reasons (slick roads such as ice or debris, glare, view obstructions, other road-related conditions, fog/rain/snow, other weather-related conditions, signs/signals, road design) and driver-related critical reasons (recognition error, decision error, performance error, non-performance error such as sleep, other).]
04 SOLVING THE PROBLEM
   WITH SWIR
I. WHAT IS THE SWIR SPECTRUM?
Short-Wave Infrared (SWIR) refers to a specific wavelength range, from 1000nm to 1600nm. For reference, standard cameras usually operate in the 0.4-0.75µm range. SWIR enables a number of applications that were not previously possible using visible light, and with great results. In contrast to visible light, light in the SWIR band has a lower scattering coefficient, meaning that it is scattered significantly less, so a SWIR camera can perceive what standard cameras in the visible spectrum are not able to see. Thus, “Seeing Beyond the Visible”.

Reliable and consistent, the SWIR camera can offer superior sight, functionality, and operability under
all weather and lighting conditions. In the automotive market, it has the potential to enhance human
driver capabilities and assist in the detection of previously invisible hazards on the road, a sensing
capability that was not available at a low cost until now.

Another significant advantage of SWIR is that it is a useful tool for remote material sensing. By comparing
the relative intensities of carefully chosen spectral bands, the differences between spectral signatures
are revealed. This allows for differentiation between target materials such as ice or water covered
regions on an asphalt surface (referenced in Section E).

[Spectrum diagram: VISIBLE 0.4-0.75 µm | NIR 0.75-1.0 µm | SWIR 1.0-1.6 µm]

II. SWIR BENEFITS
A. SENSORS IN FOCUS - SWIR VS. FOG
Fog is caused by supersaturation of the air close to the earth’s surface, which creates suspended droplets of moisture. It can be incredibly hazardous, reducing visibility to just a few meters. While a standard VIS camera image might be completely obscured by fog, which creates a white “screen” that blinds it, a SWIR camera can see through fog because of its high penetration coefficient.

SWIR cameras are able to see over longer ranges than visible cameras because light is less scattered at SWIR wavelengths, increasing detection range significantly.

[Image comparison in fog: visible camera vs. SWIR camera]
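To make the attenuation argument concrete, here is a small illustrative sketch (not based on TriEye data) that applies the Beer-Lambert law to compare how much light survives a given one-way path through fog in the visible band versus the SWIR band. The extinction coefficients are hypothetical placeholders chosen only to show the shape of the effect.

```python
# Illustrative only: Beer-Lambert attenuation through fog for two bands.
# The extinction coefficients below are hypothetical, not measured values.
import math

def transmission(extinction_per_m: float, range_m: float) -> float:
    """Fraction of light surviving a one-way path of range_m metres."""
    return math.exp(-extinction_per_m * range_m)

BETA_VISIBLE = 0.030  # hypothetical extinction coefficient in fog, visible band [1/m]
BETA_SWIR = 0.012     # hypothetical, lower extinction in the SWIR band [1/m]

for dist in (50, 100, 200):
    print(f"{dist:4d} m  visible: {transmission(BETA_VISIBLE, dist):6.1%}  "
          f"SWIR: {transmission(BETA_SWIR, dist):6.1%}")
```

With these placeholder numbers, only about 0.2% of visible light survives a 200 m path while roughly 9% of SWIR light does, which is why usable detection range grows so sharply when scattering drops.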

B. SENSORS IN FOCUS - SWIR VS. NIGHT TIME
Driving at night presents a unique set of dangers, as a disproportionate share of fatal accidents occur after dark. At night, a VIS camera, just like the human driver, has difficulty recognizing objects, people, animals, and other vehicles due to the lack of photons in low light. This drastically limits the car’s vision to the narrow area that is illuminated by its headlights.

A SWIR camera, however, can achieve significant visibility distance in the dark, offering effective night vision capabilities and greatly lowering the risk of collision.

[Image comparison at night: visible camera vs. SWIR camera]


C. SENSORS IN FOCUS - SWIR VS. DUST
Dust and sandstorms are common phenomena in arid and semi-arid regions. These unpredictable
conditions present a particularly dangerous threat to drivers. Sandstorms can cause thick clouds
of dust that obscure the road and can lead to zero visibility conditions. A SWIR camera is able to
offer better visibility in such scenarios because the light is less scattered in the SWIR spectrum compared
to the visible spectrum.

D. HAZARD DETECTION - PEDESTRIANS AND ANIMALS ON THE ROAD
Arguably, one of the most unpredictable road hazards is unexpected animals or pedestrians in the road. This is especially true in low visibility conditions, where AI algorithms might not be able to detect and recognize them. The challenge becomes even greater if the person is wearing dark clothing or a dark-furred animal is present. The enhanced vision capabilities of the SWIR camera substantially improve detection algorithm performance, thereby reducing the risk of a collision in such a scenario and increasing the safety of all road users.

In the image below (Figure 5), the VIS camera shows pedestrians wearing dark clothing, which makes them hard to detect even in broad daylight. Yet their clothing appears much lighter with the SWIR camera. This is because SWIR responds to the material of an object, not its color: cotton appears bright white in the SWIR spectrum, making SWIR a beneficial tool for hazard detection.

FIGURE 5: ANIMAL AND PEDESTRIAN DETECTION AT DAY TIME
[Image comparison: visible camera (objects undetected) vs. SWIR camera (objects detected)]

As also seen in Figure 5, a dark-furred animal is even harder to see with a VIS camera. The animal’s dark appearance is due to the high melanin content of its fur, which absorbs the light instead of reflecting it back to the camera. As the wavelength of the light increases from the VIS to the SWIR spectrum, melanin absorption decreases by more than an order of magnitude (Figure 6). This is a significant challenge, taking into account that more than 1.3 million deer-related accidents occur every year in the United States alone.

FIGURE 6: MELANIN ABSORPTION
[Chart: melanin absorption as a function of wavelength]

SWIR sensors can enhance the ability of ADAS systems to recognize animals and pedestrians at all
times, significantly improving road safety and dramatically reducing these collision statistics.

E. REMOTE MATERIAL SENSING
Identification and detection are major problems for VIS cameras; a dark spot on the road could be a
shadow, a puddle, or a pothole. Car reflections off buildings could be identified as an oncoming vehicle
and confuse the sensor. SWIR technology offers superior and more accurate detection because SWIR
cameras can detect materials and are not confused by colors.

Every material has a unique spectral response defined by its chemical and physical characteristics
that can be detected by comparing the relative intensities of carefully chosen spectral bands. This is
known as remote material sensing.
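To illustrate the band-comparison idea (this is a conceptual sketch, not TriEye's algorithm), the snippet below computes a normalized difference between two co-registered SWIR sub-band images and maps it to a coarse surface label. The band pairing and thresholds are hypothetical and would have to be calibrated against measured spectral signatures of dry asphalt, water, and ice.

```python
# Conceptual sketch of remote material sensing via band ratios.
# Band choice and thresholds are hypothetical placeholders.
import numpy as np

def band_ratio_index(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Normalized difference between two co-registered SWIR band images."""
    a = band_a.astype(np.float32)
    b = band_b.astype(np.float32)
    return (a - b) / np.clip(a + b, 1e-6, None)

def classify_surface(index: np.ndarray) -> np.ndarray:
    """Map the per-pixel index to a coarse surface label (hypothetical thresholds)."""
    labels = np.full(index.shape, "dry asphalt", dtype=object)
    labels[index > 0.15] = "water"  # band_b absorbed more strongly than band_a
    labels[index > 0.35] = "ice"    # absorption feature shifts further for ice
    return labels

# Example with synthetic 4x4 band images:
rng = np.random.default_rng(0)
index = band_ratio_index(rng.uniform(0.2, 1.0, (4, 4)), rng.uniform(0.2, 1.0, (4, 4)))
print(classify_surface(index))
```

The useful property is that a normalized band ratio depends on the material's spectral signature rather than on overall scene brightness, which is what lets ice and water stand out against asphalt regardless of color.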


There are hazards which are invisible to both the driver and ADAS sensor fusion solutions, where the ADAS system isn’t able to detect the hazard or alert the driver until it’s too late to prevent an accident - e.g. when the vehicle is already slipping on black ice. This is a major challenge for self-driving cars and human drivers alike, exposing them to significant risks. SWIR cameras allow for the detection of unseen or obscured hazards at longer ranges.

In real world scenarios this capability is invaluable, since it enables differentiation between target materials such as ice- or water-covered regions on an asphalt surface. As shown in Figure 7, a SWIR camera allows drivers to distinguish black ice on the asphalt with the same ease as they would differentiate between red and blue.

FIGURE 7: ASPHALT THROUGH WATER AND ICE
[Image comparison: visible camera vs. SWIR camera]
Source: Shigeki Nakauchi, Ken Nishino, and Takuya Yamashita, Toyohashi University of Technology, Japan

F. SEEING THROUGH GLASS
SWIR technology is able to see through glass. Unlike LiDAR and radar sensors that require visible
exterior mounting on the car roof or bumper, SWIR cameras can be mounted behind the windshield or
in the car headlight. This is the preferred location for car manufacturers since it allows for maximum
image clarity and doesn’t require exterior design modifications.

Moreover, windshield positioning offers an optimal view of the road and it means that the manufacturer
does not need to be concerned with implementing a cleaning system.


G. REUSING EXISTING AI ALGORITHMS
SWIR image data can be processed with the same algorithms that were developed for VIS cameras. Using existing deep learning algorithms simplifies the development process, saving significant time and resources, as the algorithms do not need to be developed from scratch, a process that requires driving millions of miles physically and trillions of miles virtually.

FIGURE 8: OBJECT RECOGNITION
[Image comparison: visible camera vs. SWIR camera]

For example, for SWIR object recognition, open-source algorithms developed for visible imagery (YOLO in TensorFlow) were applied to almost 4,000 SWIR camera frames, as shown in Figure 8.

FIGURE 9: SEMANTIC ROAD SEGMENTATION
[Image comparison: visible camera vs. SWIR camera]

Another example is semantic road segmentation based on a Fully Convolutional Network (the FCN-8 architecture developed at Berkeley), using a VGG16 backbone pre-trained on ImageNet, as shown in Figure 9.
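As a rough illustration of this reuse (and not TriEye's actual pipeline), the sketch below runs a detector trained purely on visible-light images over a single-channel SWIR frame by replicating that channel to three channels. It uses torchvision's pretrained Faster R-CNN as a readily available stand-in for the YOLO/TensorFlow setup mentioned above; the normalization and channel replication are assumptions made for illustration.

```python
# Illustrative sketch: reusing a VIS-trained detector on SWIR image data.
# Faster R-CNN stands in for the YOLO/TensorFlow setup mentioned in the text.
import numpy as np
import torch
from torchvision.models import detection

model = detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_on_swir(swir_frame: np.ndarray) -> dict:
    """swir_frame: HxW array of raw SWIR intensities (any bit depth)."""
    frame = swir_frame.astype(np.float32)
    span = float(frame.max() - frame.min())
    frame = (frame - frame.min()) / (span if span > 0 else 1.0)
    # Replicate the single SWIR channel so the VIS-trained network
    # receives the 3-channel input it expects.
    tensor = torch.from_numpy(np.stack([frame] * 3))  # shape (3, H, W)
    with torch.no_grad():
        return model([tensor])[0]  # dict with boxes, labels, scores

# Example with a synthetic 12-bit frame:
result = detect_on_swir(np.random.randint(0, 4096, (480, 640)))
print(result["boxes"].shape, result["scores"][:5])
```

The same channel-replication step would apply to the segmentation example: the SWIR frame is simply fed to the network in place of an RGB image.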

H. EASE OF INTEGRATION
SWIR cameras collect and produce regular image data, similar to the data produced by a standard VIS camera. This simple output type allows engineers to easily integrate it with existing systems currently used by car manufacturers and to process the image using similar methods.
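As a minimal illustration of how little adaptation this output needs, the sketch below converts a raw SWIR frame into a standard 8-bit image that an existing VIS pipeline could consume unchanged. The frame layout assumed here (12-bit samples stored in 16-bit little-endian words) is an assumption made for this example; the actual Raven output formats are listed in the development kit specifications later in this document.

```python
# Minimal sketch: turning a raw SWIR frame into a standard 8-bit image.
# Frame layout (12-bit samples in 16-bit little-endian words) is assumed.
import numpy as np

def raw12_to_uint8(raw_bytes: bytes, width: int, height: int) -> np.ndarray:
    frame = np.frombuffer(raw_bytes, dtype="<u2").reshape(height, width)
    # Drop the 4 least significant bits to fit the 12-bit range into 8 bits.
    return (frame >> 4).astype(np.uint8)

# Example with a synthetic frame buffer:
buf = np.random.randint(0, 4096, (480, 640), dtype=np.uint16).tobytes()
image = raw12_to_uint8(buf, width=640, height=480)
print(image.shape, image.dtype)  # (480, 640) uint8, ready for a VIS pipeline
```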


I. EYE SAFETY
Illumination sources that operate in the SWIR spectrum are considered ‘eye-safe’, meaning that light at these wavelengths is largely absorbed before it reaches the retina of the human eye (Class 1 laser eye-safe). SWIR illumination can be operated at optical powers three orders of magnitude higher than shorter wavelengths while remaining Class 1 eye-safe.

FIGURE 10: CLASS 1 EYE SAFETY ACCORDING TO THE INTERNATIONAL STANDARD IEC 60825 REGULATIONS


III. A DEEP DIVE INTO THE SENSOR SOLUTIONS:
     INGAAS-BASED VS CMOS-BASED SWIR CAMERAS
Multiple industries have been harnessing SWIR’s technological capabilities for decades using an exotic compound semiconductor, Indium Gallium Arsenide (InGaAs). However, InGaAs has a low production yield and its fabrication involves multiple, complex steps, making the technology prohibitively expensive.

The industries using this technology include defense, science, and aerospace, sectors that can afford
the high cost. Due to its high price and long lead time, InGaAs-based cameras are not suitable for
mass-market applications. Additionally, the cameras can be large and cumbersome which is also an
entry barrier to mass-market adoption.

Conversely, CMOS-based cameras can be manufactured with far greater ease and at a cost one
thousand times lower than that of InGaAs-based cameras. They are more reliable, scalable, energy
efficient, provide much higher resolution and can be miniaturized and integrated into a mobile phone
camera. Standard CMOS-based sensors today are not able to sense the SWIR spectrum because their
sensitivity is extremely low above 1µm.

Based on advanced nanophotonics research, TriEye’s cutting edge technology enables SWIR sensing
on a CMOS-based sensor for mass-production. This unique combination enables vision in adverse
weather and nighttime conditions, and allows for better visualization of potential hazards on the road;
a significant technological breakthrough in the automotive sensor industry.

05 TRIEYE RAVEN CMOS BASED
   SWIR SENSOR
I. HD IMAGING - SEEING CLEARLY IN ALL CONDITIONS
TriEye’s innovative camera is able to produce HD imaging of the driving scene, similar to that of a
VIS camera, but with incomparable efficacy under common low-visibility conditions. Delivering high-
resolution image data, it allows for safer and more reliable ADAS in low visibility conditions, better
mapping of the car surroundings and a higher object detection rate. This enables ADAS applications
such as emergency braking systems and pedestrian warning tools to operate consistently, offering peak
visibility day or night and in the most extreme weather.

II. THE COST DIFFERENTIAL
Until now, available SWIR technology had a significantly high cost, tens of thousands of USD, due to
a lack of compatibility with CMOS-based sensors, limited wafer size, the need for die to die bonding,
low yield, and more.

TriEye’s patent-pending technology, which includes compatibility with a CMOS-based sensor, is able to overcome these obstacles and eliminate the exorbitant manufacturing costs of using InGaAs, a material characterized by its scarcity, low production yield, and a complex and expensive manufacturing process. This has enabled TriEye to reduce costs to roughly one-thousandth of the current industry rate.

III. SCALABILITY
TriEye is partnering with a global leading CMOS foundry which fabricates TriEye’s sensor based on the
company’s unique design. This partnership allows for mass production using existing tools, in order
to support the automotive industry’s scale requirements, while adhering to its strict regulatory and
safety requirements.

IV. RELIABILITY
TriEye’s sensor is highly reliable, as it uses existing fabrication tools that were already developed for
CMOS-based sensors worldwide. This manufacturing process has long been proven to deliver stability
and uniformity of the sensor and camera performance over time. Thus, TriEye’s CMOS-based SWIR
camera allows for transformative sight technology that is more reliable and robust.


V. SMALL FORM FACTOR
TriEye's Raven CMOS-based sensor meets the size standard required for automotive applications,
allowing for a simple and flexible integration consistent with vehicle design requirements.

SENSOR MODALITIES FUNCTIONALITY
[Comparison table: Camera, Radar, LiDAR, and TriEye rated against the ability to see in all weather and lighting conditions, resolution, cost, use of existing AI capabilities, remote material sensing, size, and mounting position.]
06 READY TO SEE IN SWIR?
TriEye's CMOS-based Short-Wave Infrared sensor technology is the world's first cost-effective
and scalable solution. This enables unprecedented machine vision capabilities suited for a wide
variety of automotive use cases.

TriEye’s SWIR sensor is quickly becoming a key component in highly automated driving systems. From detecting vulnerable road users at all times to identifying otherwise invisible hazards on the road (e.g. black ice), this innovative technology can supercharge the capabilities and efficacy of automotive vision systems.

TRIEYE OVI DEVELOPMENT KIT
SWIR cameras collect and process image data, similar to the data produced by a standard VIS camera.
This type of output is easily integrated with existing systems, currently used by car manufacturers,
that process the image using similar methods.

SPECIFICATIONS

HW
- Modular stacked board devkit
  - Sensor board
  - Connectivity board
  - Frame Grabber board
  - ISP board (optional)*
  - Serializer/De-serializer board (optional)
- USB3 and External Sync cables
- Serial cable (optional)
- Size: 62 x 62 x 55 mm
- C-mount

TRIEYE RAVEN SENSOR
- Sensor Spectrum: 0.4 – 1.6µm
- Resolution: 1284 x 960
- Frame Rate: Up to 120 FPS
- Shutter: Global/Rolling
- Output format: 8/10/12-bit Raw

SENSOR CONTROL
- Exposure control
- FPS control
- ROI support
- Binning support

OPTICS (OPTIONAL)
- Several options available (contact sales)

ILLUMINATION (OPTIONAL)
- Several options available (contact sales)

SW
- Viewer-Recorder-Controller (VRC) tool
- DevWareX Raven Plugin
- TriEye API (TEA)
- ISP SW

ADDITIONAL SW FEATURES
- Image preview
- Sync with another camera
- Illumination control
- Save images as BMP/Tiff
- Save videos as AVI/MP4
- Internal registers access
- Image analysis capabilities
- Diagnostic information analysis

OS SUPPORTED
- Windows
- Linux*

[Block diagram: HW inside - Sensor Board, HW ISP Board (optional), Connectivity Board, Frame Grabber; interfaces: Ext Trigger, Power, USB3]
07 CONCLUSION
        THE FUTURE OF SWIR
TriEye is solving the low visibility challenge for ADAS and Autonomous Vehicles. Specifically, the company is developing a SWIR sensing solution that can deliver image data even under the most challenging visibility conditions. Furthermore, TriEye’s CMOS-based SWIR sensor is an industry first in enabling remote material sensing for both visible and invisible hazards, thus mitigating risks on the road and dramatically enhancing safety.

TriEye’s Raven SWIR camera enhances human driver capabilities and assists in the detection of invisible
hazards on the road. The Raven’s innovative technology allows life-saving ADAS applications such as
emergency braking systems and pedestrian protection systems to operate in different environmental
scenes and under the toughest weather and lighting conditions. This cost-effective solution delivers
unprecedented vision capabilities that bring the autonomous vehicles industry closer to safer and
better ADAS and AV.

TriEye brings SWIR sensing capabilities to a much broader audience - it is changing the way humanity perceives the world around us by enabling another channel of crucial data, providing a clear image where the human eye or even a standard visible camera cannot “see”.

While our primary target market is the automotive industry, our technology can reshape a wide range of other sectors. The TriEye mass-market SWIR sensing solution can be utilized across numerous verticals with real-life applications: security, mobile, agriculture and even medical. Some were cost-prohibitive in the past and some will be discovered as our new sensing modality comes to life.

To learn more contact us at sales@trieye.tech or visit www.trieye.tech

ABOUT TRIEYE
TriEye is a fabless semiconductor company developing a unique, mass-market, and affordable
SWIR sensing technology based on academic research in nanophotonics. TriEye's team are experts
in SWIR technology specializing in device physics, process design, electro-optics, and deep learning.

SEEING BEYOND
 THE VISIBLE

  CONTACT US

 info@trieye.tech
 www.trieye.tech