Take a closer look at one of the new iPhone 12 Pro models, or the latest iPad Pro, and you will see a small black dot near the camera lenses, about the same size as the flash. This is the lidar sensor, a new type of depth sensing that could make a difference in many interesting ways.
If Apple has its way, lidar is a term you’ll start hearing a lot now, so let’s break down what we know, what Apple will use it for, and where the technology might go next.
What does lidar mean?
Lidar stands for light detection and ranging, and it has been around for quite some time. It uses lasers to ping objects and bounce back to the laser source, measuring distance by timing the journey, or flight, of the light pulse.
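The math behind that timing is straightforward: light travels at a known, constant speed, and each pulse covers the distance twice (out and back), so halving the round-trip time gives the range. Here is a minimal sketch in Python, using a hypothetical round-trip time:

```python
# Speed of light in meters per second
C = 299_792_458

def distance_from_round_trip(seconds: float) -> float:
    """Distance to the object: the pulse travels out and back,
    so the one-way distance is half the round trip."""
    return C * seconds / 2

# A pulse returning after roughly 33 nanoseconds hit something about 5 meters away
print(distance_from_round_trip(33.36e-9))
```

The times involved are tiny (tens of nanoseconds for objects a few meters away), which is why time-of-flight sensing requires very precise timing hardware.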
How does lidar work to detect depth?
Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar sends out waves of pulsed light in a spray of infrared dots and measures each one with its sensor, creating a field of points that maps distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you might spot them with a night-vision camera.
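To sketch how that spray of dots becomes a depth map: each infrared dot reports its own round-trip time, and converting each one independently yields a grid of distances. A toy example in Python (all timing values are hypothetical):

```python
C = 299_792_458  # speed of light, m/s

# Hypothetical round-trip times (in seconds) for a 2x3 grid of infrared dots
round_trip_times = [
    [2.0e-8, 2.1e-8, 2.0e-8],
    [3.3e-8, 3.4e-8, 3.3e-8],
]

# Each dot is measured independently, producing one distance per point
depth_map = [[C * t / 2 for t in row] for row in round_trip_times]

for row in depth_map:
    print([round(d, 2) for d in row])
```

A real sensor produces a much denser grid of points, and it is this per-point depth field that lets software reconstruct, or "mesh," the shape of a room.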
Isn’t it like Face ID on iPhone?
It is, but with a longer range. The idea is the same: Apple's TrueDepth camera behind Face ID also shoots out an array of infrared lasers, but it only works up to a few meters away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at distances of up to 5 meters.
Lidar is already in many other technologies
Lidar is a technology that is popping up everywhere. It is used in self-driving cars and assisted driving. It is used in robotics and drones. Augmented reality headsets like the HoloLens have similar technology, mapping room spaces before layering virtual 3D objects into them. But lidar also has a rather long history.
Microsoft’s old depth-sensing Xbox accessory, the Kinect, was also an infrared depth-scanning camera. In fact, PrimeSense, the company that helped develop the Kinect’s technology, was acquired by Apple in 2013. Now we have Apple’s TrueDepth face-scanning sensors and the rear lidar camera.
The iPhone 12 Pro camera might work better with lidar
Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better focus in low light, up to six times faster. The lidar depth sensing will also be used to enhance night mode portraits.
Better focus is an advantage, and there is also the possibility that the iPhone 12 Pro could add more 3D photo data to images. While that element hasn’t been detailed yet, Apple’s front-facing TrueDepth depth-sensing camera has been used in a similar way by apps.
It will also greatly improve augmented reality
Lidar will allow the iPhone 12 Pro to launch AR apps much faster and build a quick map of a room to add more detail. Many developers are using lidar to hide virtual objects behind real ones (a technique called occlusion) and to place virtual objects within more complicated room layouts, such as on a table or chair.
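The idea behind occlusion can be sketched with a simple depth comparison: at each pixel, the virtual object is drawn only if it is closer to the camera than the real surface the lidar measured there. A minimal illustration in Python (all depth values are hypothetical, in meters):

```python
def is_visible(virtual_depth: float, lidar_depth: float) -> bool:
    """True when the virtual object sits in front of the
    real-world surface measured by the lidar at this pixel."""
    return virtual_depth < lidar_depth

# A virtual ball 1.2 m away, behind a real chair the lidar sees at 0.9 m:
print(is_visible(1.2, 0.9))  # the chair hides the ball at this pixel

# The same ball moved to 0.5 m, in front of the chair:
print(is_visible(0.5, 0.9))  # now the ball is drawn
```

Without a depth sensor, apps have to estimate real-world depth from camera images, which is exactly the kind of guesswork lidar removes.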
But there’s extra, longer-tail potential beyond that. Many companies dream of headsets that blend virtual and real objects: AR glasses, which a number of companies are working on, will rely on having advanced 3D maps of the world onto which virtual objects can be superimposed.
Those 3D maps are now being built with special scanners and equipment, almost like a world-scanning version of Google Maps’ camera cars. But there is a chance that people’s own devices could eventually help crowdsource that information or add extra data on the fly. Again, AR headsets such as the Magic Leap and HoloLens already pre-scan the environment before layering things onto it, and Apple’s lidar-equipped AR technology works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and could pave the way for Apple to eventually make its own glasses.
3D scanning could be the killer app
Lidar can be used to mesh rooms and 3D objects and overlay photographic images on top, a technique called photogrammetry. That could be the next wave of capture technology for practical uses, or even for social media and journalism. The ability to capture 3D data and share it with others could turn these lidar-equipped phones and tablets into 3D content capture tools. Lidar could also be used without the camera element to acquire measurements of objects and spaces.
Apple isn’t the first to explore technologies like this on a phone
Google had the same idea in mind when it created Tango, one of the first AR platforms. Tango’s advanced array of cameras included infrared sensors and could map rooms, creating 3D scans and depth maps for AR and for measuring interior spaces. Google’s Tango-equipped phones were short-lived, replaced by computer vision algorithms that estimated depth from ordinary cameras without needing the extra hardware. But Apple’s iPhone 12 Pro looks like a much more advanced successor.