The World of Near Infrared Light & Its Applications

What Is Infrared Light?

This is the Visible Spectrum (above). It's the only part of the electromagnetic spectrum that our eyes can see. The wavelengths of the visible spectrum range from 380 to 750 nanometers (nm). The shortest wavelengths on the left-hand side of the spectrum (380nm) are perceived as violet, and the longest wavelengths on the right-hand side (750nm) are perceived as red. Our eyes can't see anything outside this range. Anything below 380 nanometers is known as ultraviolet (UV) light, and anything above 750 nanometers is known as infrared (IR) light.

Furthermore, infrared light is divided into different categories: near-infrared (750-2500nm), mid-infrared (2500-4000nm), and far-infrared (beyond 4000nm). Near-infrared and far-infrared have completely different properties and uses.
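For the curious, here's a tiny sketch that lays those boundaries out explicitly. The cutoffs are the ones used in this article; different fields draw the lines slightly differently:

```python
# A tiny helper that makes the band boundaries above explicit.
# These cutoffs are this article's convention; other fields
# draw the lines slightly differently.

def classify(wavelength_nm: float) -> str:
    if wavelength_nm < 380:
        return "ultraviolet (UV)"
    if wavelength_nm <= 750:
        return "visible light"
    if wavelength_nm <= 2500:
        return "near-infrared (NIR)"
    if wavelength_nm <= 4000:
        return "mid-infrared (MIR)"
    return "far-infrared (FIR)"

print(classify(550))   # -> visible light
print(classify(905))   # -> near-infrared (NIR)
```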

In this blog post, let's start by deepening our understanding of near-infrared light using a very familiar example.

 

 

Introducing a State-of-the-Art "Near-Infrared" Device... The iPhone!!

 

Surprised? First off, let's briefly go over what NIR light is. The primary characteristics of NIR light are that it "has properties close to visible light" and is "invisible to the human eye." Since the wavelength of near-infrared light is close to that of visible light, mass-produced devices such as image sensors and cameras can be used as-is or with simple modifications. Additionally, because we can't see near-infrared light, it doesn't irritate our eyes; on the contrary, multiple peer-reviewed clinical studies have found that it can help heal eye injuries, reduce inflammation, and protect against vision loss.

To give you an idea of near-infrared applications in our daily lives, we have prepared an iPhone and a near-infrared camera.

We call this camera a "near-infrared" camera, but it's really just a regular digital camera that we modified. Digital cameras have silicon sensors that can pick up not only visible light but also wavelengths from 750nm up to about 1000nm. Remember, we can't see anything above 750nm. For that reason, these cameras normally include an "IR Cut Filter" that blocks wavelengths above 750nm so the image better matches what our eyes see. So, we modified this camera by removing the IR cut filter and installing an "IR Transmission Filter" in its place that only lets wavelengths longer than 760nm through.

Ta-da! We now have a Near-Infrared camera.
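To make the before-and-after of the filter swap concrete, here's a toy model of the two filters. The 750nm and 760nm cutoffs are the figures above; real filters roll off gradually rather than cutting off sharply:

```python
# A toy model of the filter swap. The 750nm and 760nm cutoffs come from
# this article; real optical filters roll off gradually.

def stock_ir_cut_filter(wavelength_nm: float) -> bool:
    """The factory filter: pass visible light, block anything above 750nm."""
    return wavelength_nm <= 750

def ir_transmission_filter(wavelength_nm: float) -> bool:
    """The replacement filter: only pass wavelengths longer than 760nm."""
    return wavelength_nm > 760

for wl in (550, 850):  # green visible light vs. near-infrared
    print(f"{wl}nm  stock: {stock_ir_cut_filter(wl)}  modified: {ir_transmission_filter(wl)}")
```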

Now back to the main subject. First, let's unlock this iPhone using Face ID. The moment we touch the power button, the iPhone is already unlocked. It's as if there were no authentication process at all. However, if we look at the iPhone through our NIR camera…

 

 Flash!  Flash!  Flash!

Now, when pointed at a wall...

It turns out the source of all that flashing is a projected pattern of countless dots.

This flashing occurs whenever the iPhone runs its facial recognition software, creates Memojis, and so on.

 

This is the secret behind the iPhone's "TrueDepth" function. Its job is to measure the exact distance of the subject from the camera, which then allows it to perceive the three-dimensional shape of the object in front of it.

Because the iPhone's IR camera and the dot projector are in separate locations, the dots are projected from a slightly different angle than the one the IR camera views them from. As a result, each dot captured by the IR camera shifts horizontally depending on the distance of the subject. By calculating the degree of this shift, the distance to each dot can be determined. This 3D measurement method is known as the Active Stereo Method.
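If you'd like to see the math, here's a minimal sketch of the disparity-to-distance calculation behind active stereo. The focal length, baseline, and pixel shift are invented for illustration; they are not Apple's actual TrueDepth specifications:

```python
# A minimal sketch of the depth-from-disparity math behind active stereo.
# The focal length, baseline, and pixel shift below are invented for
# illustration; they are NOT Apple's actual TrueDepth specifications.

def depth_from_disparity(focal_length_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Pinhole stereo model: distance z = f * b / d."""
    return focal_length_px * baseline_m / disparity_px

# Example: a 600px focal length, a 2.5cm projector-to-camera baseline,
# and a dot that has shifted 30 pixels from its reference position.
z = depth_from_disparity(focal_length_px=600, baseline_m=0.025, disparity_px=30)
print(f"Estimated distance to this dot: {z:.2f} m")  # -> 0.50 m
```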

This method is used to grasp the three-dimensional shape of the subject's face and to improve the accuracy of the iPhone's face authentication. Even if you are wearing sunglasses, you can still pass authentication as long as the lenses transmit infrared light to some extent. Have you ever used Memoji? This function reads the facial muscles of the subject in front of the camera, another ability that "TrueDepth" achieves using near-infrared light.

Please feel free to watch the accompanying video, taken with our IR camera, if you would like to see the iPhone's TrueDepth function in action.

 

 

The Future: LiDAR and Autonomous Driving

In recent years, the automotive industry has reached a major turning point. Electric vehicles? That's one part of it, but the development of advanced sensor technology has also opened the way to achieving humanity's dream of fully autonomous passenger vehicles. One of the key technologies that has made this possible is called "LiDAR." LiDAR stands for "Light Detection and Ranging." The L in "LiDAR" can also stand for "Laser," and the literal Japanese translation of the term is "detection and ranging by laser."

In general, lasers are often used in distance measurement devices, such as the handheld rangefinders used by construction workers, golfers, and the military. A laser beam is fired, and when it hits an object, the reflected light is received and the distance is measured. LiDAR applies this concept and creates "Point Cloud Data" by firing thousands of laser pulses in a fraction of a second (think radar, but with invisible lasers). This makes it possible for LiDAR devices to effectively "see" the shape and distance of objects.
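At its core, the calculation is simple: light travels at a known speed, so the round-trip time of the pulse tells you the distance. Here's a minimal sketch (the timing value is an invented example, not from any real device):

```python
# A minimal sketch of the time-of-flight calculation at the heart of LiDAR.
# The timing value below is an invented example, not from any real device.

SPEED_OF_LIGHT = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_s: float) -> float:
    """The pulse travels out and back, so halve the round-trip distance."""
    return SPEED_OF_LIGHT * round_trip_s / 2

# Example: a reflected pulse arrives 667 nanoseconds after firing.
print(f"Target distance: {distance_from_round_trip(667e-9):.1f} m")  # -> ~100.0 m
```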

*Image of 3D Point Cloud Data

Now that we understand the basic mechanism of LiDAR, what kind of factors are important for it to be effectively utilized in autonomous vehicles?

For example, if the laser observation points are too far apart, a narrow obstacle may slip between the beams and go unrecognized.
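Here's a rough back-of-the-envelope calculation of that gap. The 0.1-degree angular step is an assumed value for illustration, not the spec of any particular LiDAR unit:

```python
import math

# A rough estimate of the gap between neighboring laser points at range.
# The 0.1-degree angular step is an assumed value for illustration,
# not the spec of any particular LiDAR unit.

def point_gap_m(angular_step_deg: float, range_m: float) -> float:
    """Approximate spacing between adjacent points at a given range."""
    return 2 * range_m * math.tan(math.radians(angular_step_deg) / 2)

# At 100m, points fired 0.1 degrees apart land about 17cm apart,
# wide enough for a thin pole or cable to slip between them.
print(f"Gap at 100 m: {point_gap_m(0.1, 100):.3f} m")  # -> ~0.175 m
```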

Effective range is also important. Considering the braking distance of a fast-moving car, accurate measurement out to at least 100 meters is required. However, the farther the distance, the less light is reflected back, and the more difficult the measurement becomes. One countermeasure is to increase the output power of the laser, strengthening the reflected light and making detection easier.
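As a simplified illustration of that falloff, assuming a diffuse target larger than the beam, the returned signal drops roughly with the square of the range:

```python
# A back-of-the-envelope look at why distant targets are hard to detect.
# Simplifying assumption: a diffuse target larger than the beam, so the
# returned signal falls off roughly with the square of the range.

def relative_return(range_m: float, reference_m: float = 10.0) -> float:
    """Signal strength at range_m relative to the signal at reference_m."""
    return (reference_m / range_m) ** 2

# Compared to a target 10m away, a target at 100m returns only ~1% of the light.
print(f"Relative return at 100 m: {relative_return(100):.1%}")  # -> 1.0%
```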

However, this higher-output method has a major problem: it is damaging to the human eye.

Most mainstream LiDAR devices currently use the NIR wavelength of 905nm, mainly for cost reasons. But this wavelength is close to visible light, so it can damage our retinas, potentially leading to blindness. Therefore, in recent years, partly driven by eye-safety regulations, some LiDAR makers have switched to the longer NIR wavelength of 1500nm. At 1500nm the damaging effect on our eyes is significantly reduced, if not eliminated.

However, the main disadvantage of using 1500nm is the price. Most consumer-grade silicon sensors lack the sensitivity to detect infrared light at 1500nm, so special sensors made of InGaAs (Indium Gallium Arsenide) are required. This material alone is several times more expensive than silicon. For example, the cost of a basic silicon-based automotive LiDAR system is said to be around 100,000 yen or more. So how much would a high-precision InGaAs-based LiDAR system using 1500nm cost? Balancing this trade-off between cost and performance is likely to be one of the keys to the future of autonomous vehicles.

 

 

We hope this blog post helped you learn a little about the wonderful world of infrared light and some of its applications.

And for anyone looking to reduce infrared reflections, please check out our IR Flock Sheet, which absorbs over 99% of NIR light!

 

 

 

 
