Apple wants to make lidar a big deal on the iPhone 12 Pro and beyond. What it is and why it matters




The lidar sensor of the iPhone 12 Pro – the black circle at the bottom right of the camera unit – opens up AR possibilities.

Apple

Apple is getting bullish on lidar, a technology that's brand new to the iPhone 12 family, specifically the iPhone 12 Pro and iPhone 12 Pro Max. (The iPhone 12 Pro is on sale now; the iPhone 12 Pro Max follows in a few weeks.)

Take a closer look at one of the new iPhone 12 Pro models, or the latest iPad Pro, and you'll see a small black dot near the camera lenses, about the same size as the flash. That's the lidar sensor, and it's a new type of depth sensing that could make a difference in a number of interesting ways.

If Apple has its way, lidar is a term you'll start hearing a lot, so let's break down what we know, what Apple will use it for and where the technology could go next.

What does lidar mean?

Lidar stands for light detection and ranging, and it's been around for a while. It uses lasers to ping objects and measures how far away they are by timing the journey, or flight, of each light pulse back to its source.

How does lidar work to detect depth?

Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a phone with this type of lidar sends out waves of light pulses in a spray of infrared dots and measures each one with its sensor, creating a field of points that maps distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
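To make that timing concrete, here's a minimal sketch of the arithmetic involved: a pulse travels out and back, so the distance is half of the speed of light multiplied by the round-trip time. The function and sample value below are purely illustrative, not Apple's implementation.

```swift
// Illustrative only: the core time-of-flight arithmetic lidar relies on.
// A pulse travels to the object and back, so distance is half of
// (speed of light x round-trip time).

let speedOfLight = 299_792_458.0  // meters per second

/// Distance to an object, given the measured round-trip time of a light pulse.
func distance(roundTripTime: Double) -> Double {
    (speedOfLight * roundTripTime) / 2.0
}

// A pulse that returns after roughly 33 nanoseconds indicates a point
// about 5 meters away, the rated range of the iPhone 12 Pro's rear lidar.
print(distance(roundTripTime: 33.3e-9))  // ≈ 5.0 meters
```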

ipad-pro-ar

The iPad Pro released in the spring also has lidar.

Scott Stein / CNET

Isn't this like Face ID on the iPhone?

It is, but with longer range. The idea is the same: Apple's TrueDepth camera, which enables Face ID, also fires out an array of infrared lasers, but it only works up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.

Lidar is already in many other technologies

Lidar is a technology that's popping up everywhere. It's used in self-driving cars, and for assisted driving. It's used in robotics and drones. Augmented reality headsets like the HoloLens 2 have similar tech, mapping out room spaces before layering virtual 3D objects into them. And it has a pretty long history, too.

Microsoft's old depth-sensing Xbox accessory, the Kinect, was also a camera with infrared depth scanning. In fact, PrimeSense, the company that helped make the Kinect's technology, was acquired by Apple in 2013. Now we have Apple's face-scanning TrueDepth sensors and its rear lidar camera.


Remember the Kinect?

Sarah Tew / CNET

The iPhone 12 Pro camera might work better with lidar

Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better low-light focus, up to 6x faster in low-light conditions. Lidar depth sensing is also used to improve the effects of night portrait mode.

Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images. Although that element hasn't been laid out yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps.
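That depth data is already reachable in software, too. Here's a minimal ARKit sketch, assuming a lidar-equipped device running iOS 14, of how an app could read the per-frame lidar depth map; the DepthReader class is a hypothetical name for illustration.

```swift
import ARKit

// A minimal sketch of reading the lidar depth map ARKit exposes per frame
// on lidar-equipped devices. Illustrative, not Apple's internal pipeline.
class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        // .sceneDepth is only supported on devices with the lidar sensor.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of per-pixel distances in meters.
        guard let depth = frame.sceneDepth?.depthMap else { return }
        print("Depth map: \(CVPixelBufferGetWidth(depth)) x \(CVPixelBufferGetHeight(depth))")
    }
}
```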


Snapchat is already enabling AR lenses that use the iPhone 12 Pro's lidar.

Snapchat

It will also greatly improve augmented reality

Lidar will allow the iPhone 12 Pro to launch AR apps a lot faster, and build a quick map of a room to add more detail. A lot of Apple's AR updates in iOS 14 use lidar to hide virtual objects behind real ones (called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
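For a sense of how that looks in code, here's a hedged sketch using Apple's public ARKit and RealityKit APIs: scene reconstruction turns lidar data into a room mesh, and the occlusion option lets real geometry hide virtual objects. The configureLidarAR function name is illustrative, and this is a sketch rather than Apple's internal approach.

```swift
import ARKit
import RealityKit

// A sketch of the two lidar-driven ARKit features described above:
// scene reconstruction (meshing the room) and occlusion (hiding virtual
// objects behind real ones). Assumes an ARView on a lidar-capable device.
func configureLidarAR(arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Build a live triangle mesh of the room (requires the lidar sensor).
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Let real-world geometry occlude virtual content.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(config)
}
```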

But there's extra potential beyond that, with a longer tail. Lots of companies are dreaming up headsets that blend virtual and real objects: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple and others, will rely on having advanced 3D maps of the world to layer virtual objects onto.

Those 3D maps are now being built with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there's a chance that people's own devices could eventually help crowdsource that information, or add extra data on the fly. Again, AR headsets like Magic Leap and HoloLens already pre-scan your environment before layering things into it, and Apple's lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and they could pave the way for Apple to make its own glasses eventually.


A 3D room scan from Occipital's Canvas app, enabled by the depth-sensing lidar on the iPad Pro. Expect the same for the iPhone 12 Pro, and maybe more.

Occipital

3D scanning could be the killer app

Lidar can be used to mesh out rooms and 3D objects and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that info with others could turn these lidar-equipped phones and tablets into 3D-content-capture tools. Lidar could also be used without the camera element to acquire measurements of objects and spaces.
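As a sketch of what camera-free measurement could look like with Apple's public APIs, the snippet below raycasts from two screen points into the tracked scene and measures the distance between the hits. The measure function and its parameters are hypothetical names; a real measuring app would presumably do considerably more refinement.

```swift
import ARKit
import RealityKit
import simd

// A hedged sketch of lidar-style measurement: raycast from two screen
// points into the scene and measure the distance between the two hits.
func measure(between a: CGPoint, and b: CGPoint, in arView: ARView) -> Float? {
    guard
        let hitA = arView.raycast(from: a, allowing: .estimatedPlane, alignment: .any).first,
        let hitB = arView.raycast(from: b, allowing: .estimatedPlane, alignment: .any).first
    else { return nil }

    // The translation column of each transform is the hit's world position.
    let posA = hitA.worldTransform.columns.3
    let posB = hitB.worldTransform.columns.3
    return simd_distance(
        SIMD3(posA.x, posA.y, posA.z),
        SIMD3(posB.x, posB.y, posB.z)
    )  // distance in meters
}
```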


Remember Google Tango? It also had depth sensing.

Josh Miller / CNET

Apple isn’t the first to explore technologies like this on a phone

Google had this same idea in mind when Project Tango, an early AR platform that was only ever on two phones, was created. Its advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring interior spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that did estimated depth sensing on normal cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a much more advanced successor.





