3D Sensing – New Ways of Sensing the Environment
3D Sensing – A Game-Changer
Three-dimensional (3D) sensing is a momentous scientific breakthrough. It is a depth-sensing technology that augments camera capabilities for facial and object recognition. It captures a real-world object's length, width, and height with greater clarity and depth detail than conventional 2D imaging, and this can be achieved using a number of different technologies. 3D technology delivers unique advancements in the way day-to-day activities are perceived and approached.
3D is a real game-changer as manufacturers scramble to incorporate these new advancements into consumer products such as mobile phones. 3D sensing technology mimics the human visual system using optical technology, which facilitates the emergence and integration of augmented reality, AI (Artificial Intelligence), and the Internet of Things (IoT). This creates unique opportunities in consumer applications.
Key Technologies Driving the Advancement of 3D Sensing
Each of the key technologies driving the advancement of 3D sensing has its pros and cons. Designing these new systems involves developing high-quality sensors and efficient algorithms that can leverage new and existing technologies. For example, Vertical-Cavity Surface-Emitting Lasers (VCSELs) are becoming the dominant light-source technology for 3D sensing, replacing LEDs and edge-emitting laser diodes, as they are simple, have a narrow spectrum, and remain stable over temperature. Stereoscopic vision, structured light patterns, and time of flight are the three main technologies used for 3D sensing.
Each of these technologies has its own common use cases and individual strengths, which we discuss in more detail below.
Stereoscopic Vision
Stereoscopic vision derives its structure from the way human eyes capture an image. Two cameras are placed at slightly offset positions (just like human eyes), and the two captured images are then merged into one picture in software. Small variances resulting from the different camera positions create the stereoscopic, i.e., 3D, picture. In assisted stereoscopic vision, a laser projection module projects dots onto the object or scene to help the cameras focus more easily. The captured image is processed to bring out a depth effect. For instance, this technology is used in bullet cameras installed to monitor people's movement at door entrances and other locations. FLIR Systems (U.S.) manufactures stereo vision camera systems based on stereoscopic vision technology.
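The geometry behind stereoscopic depth can be sketched numerically. Under a pinhole model with two rectified cameras, a point's depth is the focal length times the camera baseline divided by the point's disparity (its horizontal pixel shift between the two images). The function name and sample numbers below are illustrative, not taken from any particular camera system.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth in meters of a point seen by two rectified cameras.

    focal_px     -- focal length in pixels
    baseline_m   -- distance between the two camera centers, in meters
    disparity_px -- horizontal pixel shift of the point between the images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A point shifted 40 px between views, seen with a 700 px focal length
# and a 6 cm baseline, sits about 1.05 m from the cameras.
print(depth_from_disparity(700, 0.06, 40))  # -> 1.05
```

Note the inverse relationship: halving the disparity doubles the estimated depth, which is why stereo systems lose precision for distant objects.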
Structured Light Pattern
A light pattern made of lines, squares (periodic structures), or dots is projected onto an object or a scene by a laser projection module. The reflected light carries a distorted version of the pattern, which is captured by a camera mounted at an offset from the projection module. Triangulation between the projection module and the camera, using the observed pattern distortion, yields the 3D coordinates of the object or scene. The most common example is the TrueDepth camera in the iPhone X: its front-facing system adds an infrared emitter that projects over 30,000 dots in a known pattern onto the user's face. Those dots are then photographed by a dedicated infrared camera for analysis, and the analyzed image is used to unlock the phone.
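One common way such systems turn pattern distortion into depth (used, for example, in Kinect-style sensors) is to record the dot pattern once against a flat reference plane at a known distance, then convert each dot's pixel shift relative to that reference into depth by triangulation. The sketch below assumes that setup; the function name and all numbers are illustrative.

```python
def structured_light_depth(z_ref_m, focal_px, baseline_m, shift_px):
    """Depth of a dot that has shifted `shift_px` pixels from the position
    recorded against a flat reference plane at depth `z_ref_m` (meters).

    Standard triangulation relation, with the shift measured toward the
    projector for objects closer than the reference plane:
        1/Z = 1/Z_ref + shift / (f * b)
    """
    return 1.0 / (1.0 / z_ref_m + shift_px / (focal_px * baseline_m))

# With a 2 m reference plane, a 580 px focal length, and a 7.5 cm baseline,
# a dot shifted by 10 px lies roughly 1.37 m away -- closer than the plane.
print(structured_light_depth(2.0, 580, 0.075, 10))
```

A dot with zero shift is, by construction, exactly at the reference-plane depth, which is why the reference capture step matters.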
Time of Flight (ToF)
Short light pulses are emitted by a projection module, and their reflections are captured by a camera module integrated into the system. The time taken by the light to travel from the emitter to the object and back to the camera is measured; the data is then processed into coordinates, and a 3D picture is generated. In some cases, phase differences between the emitted and received signals are used to calculate the depth and motion of the detected object. Wavelength stability of the optical source over the entire operating temperature range is critical to maintaining tracking precision, as filters are typically applied in the receive path to minimize noise in the received signal. Time-of-flight camera sensors can be used for object scanning, distance measurement, indoor navigation, obstacle avoidance, gesture recognition, object tracking, volume measurement, reactive altimeters, 3D photography, and augmented reality games, among other applications.
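Both ToF variants described above reduce to simple formulas: direct ToF multiplies the round-trip time by the speed of light and halves it, while phase-based ToF converts the measured phase shift of a modulated signal into distance. A minimal sketch follows; only the speed of light is a real constant, and the sample inputs are made up.

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def direct_tof_distance(round_trip_s):
    """Distance from a directly timed pulse (emitter -> object -> camera)."""
    return C * round_trip_s / 2.0

def phase_tof_distance(phase_rad, mod_freq_hz):
    """Distance from the phase shift of a continuously modulated signal.
    The result is unambiguous only up to c / (2 * mod_freq_hz)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)

# A pulse returning after 10 ns corresponds to roughly 1.5 m.
print(direct_tof_distance(10e-9))
# A half-cycle phase shift at 20 MHz modulation corresponds to about 3.75 m.
print(phase_tof_distance(math.pi, 20e6))
```

The phase variant's ambiguity range explains why these sensors often modulate at multiple frequencies to cover longer distances.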
Applications of Technology
Over the past 10 years, the consumer electronics market has experienced a slow adoption of 3D depth-sensing technology. The first commercial application of this technology was observed in the field of gaming. Recently, 3D depth-sensing technology has widened its applicability in the field of 3D imaging and detection. The ability of mobile devices to capture pictures in 3D instead of 2D is one of the key factors contributing to the increasing applicability of 3D sensing technology.
Iris scanning, a simpler technology when compared to 3D sensing, uses mathematical pattern-recognition techniques on video images. It allows camera modules to scan the user's eye, compare the iris to an image on file, and confirm the user's identity, much as fingerprints were used in the past. Consumer applications of iris scanning technology include access control, automotive security, and mobile payments. The Samsung Galaxy S9 and S9+ have iris scanning technology embedded in them.
The applicability of 3D sensing technology has expanded in the robotics industry, owing to the increasing demand for virtualized solutions. The technology is increasingly used in consumer, automotive, drone, and industrial applications.
Source: https://www.theverge.com/2019/1/2/18164881/sony-tof-laser-depthsensing-3d-camera-report
Application in Consumer Electronics
Gesture applications translate human movements (of faces, hands, fingers, or the whole body) into symbolic directives to command gaming consoles, smart televisions, or portable computing devices. Significant use of 3D sensing is observed in the gaming industry: in virtual reality games, it lets players control the game through mere gestures, without any physical contact with the console. Many of these use cases rely on intelligent software, particularly for image-recognition tasks powered by machine learning. Smartphone companies are also focused on incorporating 3D sensors in their devices. The first generation of AR, which relies on standard rear-facing cameras, has already started to impact social media and e-commerce: it allows social media users to create playful 3D effects and helps customers of furniture manufacturers, such as IKEA, to virtually place and arrange furniture before purchase. The combination of technologies such as computer vision, signal processing, machine learning, and ToF enables the creation of new products and features.
3D sensing technology is faced with three distinct challenges in the consumer electronics industry.
- The need to accurately measure the distance between the device and the user.
- The need to correctly interpret ToF or structured-light parameters while running facial- or iris-recognition algorithms.
- The amount of power consumed in use, and the resulting impact on battery life.
Application in Automotive
3D sensing technology can increase safety by alerting the driver when people and objects are detected in the vicinity of the car; it also supports computer-assisted driving. The technology is used by long-distance drivers to monitor their own behavior: a device in the truck's interior detects the driver's motion and issues warnings. Assisted Driving/Automated Driver Assistance Systems (AD/ADAS) are integrated with 3D sensing technology to increase safety while enhancing the driving experience. AD systems mainly use ToF sensors for detection and for supporting actions. The 3D sensing technology also increases the comfort and convenience of drivers by offering them gesture-based access to infotainment controls, cockpit controls, and gear-shift controls, among others. STMicroelectronics has developed ToF 3D depth sensors that have several uses in non-gesture applications. The automotive industry is ahead of the drone industry in using ToF depth-ranging cameras; the use of 3D sensing technology in drones is still evolving.
Application in Drones
Several technology companies combine 3D sensors, software, and algorithms to provide a comprehensive solution. For example, drones may combine ToF, LiDAR, stereo vision, and ultrasonic sensors to enable them to operate on autonomous flight modes with collision avoidance systems. Listed below are various applications of 3D sensing in drones:
- Indoor Navigation
- Gesture Recognition
- Object Scanning
- Collision Avoidance
- Object Tracking
- Volume Measurement
- Surveillance of a Target Zone
- Counting Objects or People
- Fast, Precise Distance-to-Target Readings
- Augmented Reality/Virtual Reality
- Size and Shape Estimation of Objects
- Enhanced 3D Photography
The Walkera Aibao drone, along with its companion software application, allows the user to play a virtual reality flight game in the real world. The drone blends the real and virtual worlds to deliver an immersive gaming experience.
Gadget Applications
Major electronics giants such as Apple, Samsung, and Sony are incorporating 3D sensing in their gadgets to provide a uniquely rich user experience. A 3D imaging sensor that can capture images through walls is reportedly under development. This feature could be a major breakthrough for home security systems, as individuals could monitor different rooms of their homes from a distance.
The sensing feature can be targeted at the cable, broadband, and smart-home sectors. It is expected to strengthen residential security, owing to the sensor's ability to monitor the motions and activities of several individuals in a room. The feature may also reduce the installation turnaround time for equipment provided by cable companies.
Conclusion
3D sensing aims to connect devices with the real world to enhance the end-user experience. Increased adoption of 3D sensing in smartphones is driving down size, power consumption, and cost while driving up performance. This, in turn, is enabling the expanded use of 3D sensing in other end applications, such as automotive and industrial robotics. Testing these applications in the coming years will be critical, as they all have a safety or security component.