
What is Apple’s new lidar technology and what can it do for the iPhone 12 Pro?




The iPhone 12 Pro’s lidar sensor – the black circle at the bottom right of the camera array – opens up AR capabilities.

Apple

The iPhone 12 and 12 Pro are on sale now, and one of the most important differences between the Pro and non-Pro models this year is a depth-sensing technology. If you look closely at one of the new iPhone 12 Pros, or the latest iPad Pro, you’ll see a small black dot near the camera lenses, about the same size as the flash. That’s the lidar sensor. It’s a technology Apple has been betting on as a way to add depth sensing and new augmented reality features to its high-end tablets and phones. It can help camera focus, too.

But why is Apple making a big deal about lidar, and what will it be able to do for you if you buy an iPhone 12 Pro or iPhone 12 Pro Max? It’s a term you’ll be hearing a lot right now, so let’s break down what we know, what Apple will use it for, and where the technology could go next.

What does lidar mean?

Lidar stands for light detection and ranging, and it’s been around for a while. It uses lasers to ping objects, which bounce the light back to the laser’s source; measuring how long the pulse’s round trip takes – its time of flight – gives the distance.
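That timing math is simple enough to sketch out. Light travels at a known speed, and the pulse covers the distance twice (out and back), so the distance is half the round trip. Here’s a minimal, illustrative Swift snippet of the arithmetic; the numbers are examples for the sake of the demo, not Apple’s actual sensor values.

```swift
import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Converts a round-trip pulse time into a one-way distance.
/// The pulse travels out and back, so the object is half the total path away.
func distance(fromRoundTripTime seconds: Double) -> Double {
    speedOfLight * seconds / 2.0
}

// A pulse returning after about 33 nanoseconds corresponds to an object
// roughly 5 meters away, about the quoted range of the iPhone 12 Pro's lidar.
print(distance(fromRoundTripTime: 33.3e-9)) // ≈ 4.99 meters
```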


The iPad Pro, which was released in the spring, has lidar too.

Scott Stein / CNET

How does lidar work to sense depth?

Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar sends out waves of light pulses in a spray of infrared dots and can measure each one with its sensor, creating a field of points that maps out distances and can “mesh” the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
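To picture what that field of points is, consider how one measured dot becomes a 3D coordinate. This is a generic pinhole-camera sketch, not Apple’s actual pipeline, and the intrinsics values below are made-up placeholders.

```swift
import simd

// Illustrative pinhole-camera intrinsics; real values come from the device.
struct Intrinsics {
    let fx: Float, fy: Float   // focal lengths, in pixels
    let cx: Float, cy: Float   // principal point (image center)
}

/// Turns one pixel's measured depth into a 3D point in camera space
/// by back-projecting it through the pinhole model.
func unproject(px: Float, py: Float, depthMeters: Float, k: Intrinsics) -> SIMD3<Float> {
    let x = (px - k.cx) / k.fx * depthMeters
    let y = (py - k.cy) / k.fy * depthMeters
    return SIMD3<Float>(x, y, depthMeters)
}

// Example: one infrared dot landing at pixel (120, 80) with a 2.5 m reading.
let k = Intrinsics(fx: 500, fy: 500, cx: 128, cy: 96)
let point = unproject(px: 120, py: 80, depthMeters: 2.5, k: k)
```

Do that for every dot in the spray and you have the point field the sensor uses to mesh out a room.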

Isn’t that like Face ID on the iPhone?

It is, but with longer range. The idea is the same: the TrueDepth camera that enables Apple’s Face ID also fires out an array of infrared lasers, but it can only work up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.


Do you remember Kinect?

Sarah Tew / CNET

Lidar is already in lots of other tech

Lidar is a technology that’s popping up everywhere. It’s used for self-driving cars, and for assisted driving. It’s used for robotics and drones. Augmented reality headsets like the HoloLens 2 have similar tech, mapping out room spaces before layering 3D virtual objects into them. But it also has a pretty long history.

Microsoft’s old depth-sensing Xbox accessory, the Kinect, was a camera that had infrared depth scanning too. In fact, PrimeSense, the company that helped make the Kinect’s technology, was acquired by Apple in 2013. Now, we have Apple’s face-scanning TrueDepth and rear lidar camera sensors.

The iPhone 12 Pro camera can work better with lidar

Time-of-flight cameras on smartphones are typically used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better focus in low light: up to six times faster in low-light conditions. The lidar depth sensing will also be used to improve the effects of night portrait mode.

Better focus is a plus, and there’s also a chance the iPhone 12 Pro could add more 3D photo data to images, too. Although that element hasn’t been laid out yet, Apple’s front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps.


Snapchat has already enabled AR lenses using the iPhone 12 Pro’s lidar.

Snapchat

It will also significantly enhance augmented reality

Lidar allows the iPhone 12 Pro to launch AR apps much more quickly and build a fast map of a room to add more detail. Many of Apple’s AR updates in iOS 14 use lidar to hide virtual objects behind real ones (a feature called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
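For developers, opting into those lidar-backed features takes only a few lines of ARKit and RealityKit configuration. Here’s a minimal sketch of how an app might turn on scene reconstruction and occlusion on a lidar-equipped device; it’s illustrative, not a complete app.

```swift
import ARKit
import RealityKit

// Configure an AR session to use lidar-backed scene reconstruction
// and occlusion. Runs on lidar-equipped devices such as the iPhone 12 Pro.
let arView = ARView(frame: .zero)
let config = ARWorldTrackingConfiguration()

// Scene reconstruction: ARKit builds a live triangle mesh of the room.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
}

// Occlusion: real-world geometry can hide virtual objects placed behind it.
arView.environment.sceneUnderstanding.options.insert(.occlusion)

arView.session.run(config)
```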

But there’s extra potential beyond that, with a longer tail. Many companies dream of headsets that will blend virtual objects with real ones: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple and others, will rely on having advanced 3D maps of the world to layer virtual objects onto.

Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there’s also a possibility that people’s own devices could eventually help gather that info, or add extra on-the-fly data. Again, AR headsets like Magic Leap and HoloLens already scan your environment before layering things into it, and Apple’s lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and could pave the way for Apple to eventually make its own glasses.


A 3D room scan from Occipital’s Canvas app, enabled by depth-sensing lidar on the iPad Pro. Expect the same for the iPhone 12 Pro, and maybe more.

Occipital

3D scanning can be the killer app

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that info with others could open up these lidar-equipped phones and tablets as 3D content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
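Those measurement-style uses build on the raw depth readings the sensor exposes. As a rough illustration, here’s how an iOS app could read a single lidar depth value (in meters) through ARKit’s scene-depth API; the `centerDepth` helper name is ours, and real code would need more error handling.

```swift
import ARKit
import CoreVideo

/// Reads the lidar depth (in meters) at the center of an ARFrame's depth map.
/// `centerDepth` is an illustrative helper, not part of ARKit itself.
func centerDepth(of frame: ARFrame) -> Float32? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

    // The buffer holds one 32-bit float per pixel, each a distance in meters.
    let centerRow = base.advanced(by: (height / 2) * rowBytes)
        .assumingMemoryBound(to: Float32.self)
    return centerRow[width / 2]
}

// Depth delivery must be enabled on the session configuration first:
// let config = ARWorldTrackingConfiguration()
// config.frameSemantics.insert(.sceneDepth)
```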

google-tango-lenovo-1905-001.jpg

Remember Google Tango? It also had depth sensing.

Josh Miller / CNET

Apple is not the first to explore technology like this on a phone

Google had this same idea in mind when it created Project Tango, an early AR platform that was only ever on two phones. The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google’s Tango-equipped phones were short-lived, replaced by computer vision algorithms that do estimated depth sensing on cameras without needing the same hardware. But Apple’s iPhone 12 Pro looks like a much more advanced successor.






Now playing: iPhone 12, iPhone 12 Mini, Pro and Pro Max explained (9:16)

