
iPhone 12 Pro camera uses lidar. What it is and why it matters




The iPhone 12 Pro’s lidar sensor – the black circle at the bottom right of the camera unit – opens up AR capabilities.

Apple

The iPhone 12 and 12 Pro are on sale now, and one of the most important differences between the Pro and non-Pro models this year is a depth-sensing technology. If you look closely at one of the new iPhone 12 Pro models, or the latest iPad Pro, you’ll see a small black dot near the camera lenses, about the size of the flash. That’s the lidar sensor. Apple is betting on it as a way to add depth sensing and new augmented reality features to its pro-end tablets and phones. It can help camera focus a lot, too.

But why is Apple making such a big deal about lidar, and what will it be able to do for you if you buy the iPhone 12 Pro or iPhone 12 Pro Max? It’s a term you’ll start hearing a lot now, so let’s break down what we know, what Apple is likely to use it for and where the technology could go next.

What does lidar mean?

Lidar stands for light detection and ranging, and it’s been around for a while. It uses lasers to ping objects, waits for the light to bounce back to the laser’s source, and measures distance by timing the travel, or flight, of the light pulse.
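
To make that concrete, here’s a minimal sketch of the time-of-flight arithmetic in Swift. It’s illustrative only: a real sensor’s signal processing is far more involved, and the example timing is a made-up number.

    // Light travels at a known, fixed speed.
    let speedOfLight = 299_792_458.0  // meters per second

    /// Distance to an object, given the round-trip time of a light pulse.
    func distance(roundTripSeconds: Double) -> Double {
        // The pulse travels out and back, so halve the round trip.
        return speedOfLight * roundTripSeconds / 2
    }

    // A pulse returning after about 33 nanoseconds implies roughly 5 meters.
    let meters = distance(roundTripSeconds: 33e-9)  // ≈ 4.95 meters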

How does lidar sense depth?

Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar technology sends out waves of light pulses in a spray of infrared dots and can measure each one with its sensor, creating a field of points that maps out distances and can “mesh” the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
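
To picture how that spray of dots becomes a mesh, each dot’s depth reading can be unprojected into a 3D point using the camera’s geometry. Here’s a hedged Swift sketch using a simple pinhole-camera model; the Intrinsics type and the sample numbers are hypothetical stand-ins, not a real sensor’s calibration.

    import simd

    // Camera intrinsics: focal lengths and principal point, in pixels.
    struct Intrinsics {
        let fx: Float, fy: Float
        let cx: Float, cy: Float
    }

    /// Unproject one depth sample at pixel (u, v) into a camera-space 3D point.
    func unproject(u: Float, v: Float, depth: Float, k: Intrinsics) -> SIMD3<Float> {
        let x = (u - k.cx) / k.fx * depth
        let y = (v - k.cy) / k.fy * depth
        return SIMD3<Float>(x, y, depth)
    }

    // One point per infrared dot; together they form a point cloud that
    // meshing algorithms can stitch into surfaces.
    let k = Intrinsics(fx: 500, fy: 500, cx: 320, cy: 240)
    let point = unproject(u: 400, v: 300, depth: 2.0, k: k)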


The iPad Pro, which was released in the spring, also has lidar.

Scott Stein / CNET

Isn’t it like Face ID on the iPhone?

It is, but with longer range. The idea is the same: the TrueDepth camera that enables Apple’s Face ID also fires out a series of infrared lasers, but it can only work up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.

Lidar is already in plenty of other tech

Lidar is a technology that’s popping up everywhere. It’s used for self-driving cars, and for assisted driving. It’s used in robotics and drones. Augmented reality headsets like the HoloLens 2 have similar technology, mapping out room spaces before layering 3D virtual objects into them. But it also has a fairly long history.

Microsoft’s old depth-sensing Xbox accessory, the Kinect, was also a camera with infrared depth scanning. In fact, PrimeSense, the company that helped make the Kinect’s technology, was acquired by Apple in 2013. Now, we have Apple’s face-scanning TrueDepth and rear lidar camera sensors.


Do you remember Kinect?

Sarah Tew / CNET

The iPhone 12 Pro camera can work better with lidar

Time-of-flight cameras on smartphones are typically used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better focus in low light, up to six times faster in low-light conditions. The lidar depth sensing will also be used to improve the effects of night portrait mode.

Better focus is a plus, and there’s also a chance the iPhone 12 Pro could add more 3D photo data to images, too. Although that element hasn’t been laid out yet, Apple’s front-facing, depth-sensing TrueDepth camera has been used in a similar way by apps.


Snapchat has already enabled AR lenses using the iPhone 12 Pro’s lidar.

Snapchat

It will also significantly enhance augmented reality

Lidar allows the iPhone 12 Pro to start AR apps much more quickly, and to build a fast map of a room to add more detail. A lot of Apple’s AR updates in iOS 14 draw on lidar to hide virtual objects behind real ones (called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
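
For developers, switching on that lidar-driven meshing and occlusion in ARKit takes only a little configuration. Here’s a minimal sketch, assuming a RealityKit ARView and omitting the surrounding app scaffolding:

    import ARKit
    import RealityKit

    func startLidarSession(in arView: ARView) {
        let config = ARWorldTrackingConfiguration()

        // Scene reconstruction (room meshing) is only offered on lidar devices.
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }

        // Ask for per-frame lidar depth data where supported (iOS 14 and up).
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }

        // Let real-world geometry hide virtual objects behind it (occlusion).
        arView.environment.sceneUnderstanding.options.insert(.occlusion)

        arView.session.run(config)
    }

With scene reconstruction enabled, ARKit builds the room mesh that occlusion and smarter object placement lean on.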

But there’s extra potential beyond that, with a longer tail. Many companies are dreaming of headsets that blend virtual objects and real ones: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and probably Apple and others, will rely on having advanced 3D maps of the world to layer virtual objects onto.

These 3D maps are now built with special scanners and equipment, almost like a world-scanning version of those Google Maps cars. But there’s a possibility that people’s own devices could eventually help gather that information, or add extra on-the-fly data. Again, AR headsets like the Magic Leap and HoloLens already scan your environment before you place things in it, and Apple’s lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part … and could pave the way for Apple to eventually make its own glasses.


A 3D room scan from Occipital’s Canvas app, enabled by depth-sensing lidar on the iPad Pro. Expect the same for the iPhone 12 Pro, and maybe more.

Occipital

3D scanning could be the killer app

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture technology for practical uses like home renovation, or even social media and journalism. The ability to capture 3D data and share that information with others could open up these lidar-equipped phones and tablets as 3D content-capture tools. Lidar can also be used without the camera element to acquire measurements for objects and spaces.
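
As a rough illustration of that measuring idea, here’s a Swift sketch that reads the lidar-derived distance at the center of the current camera frame through ARKit’s depth API. It’s a simplified example, not how any particular measuring app is built, and it assumes sceneDepth was requested when the session was configured.

    import ARKit

    /// Distance, in meters, to whatever is at the center of the camera frame.
    func centerDistance(from frame: ARFrame) -> Float? {
        guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }

        // The lidar depth map stores one 32-bit float distance per pixel.
        let centerRow = base.advanced(by: (height / 2) * rowBytes)
        return centerRow.assumingMemoryBound(to: Float32.self)[width / 2]
    }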

google-tango-lenovo-1905-001.jpg

Remember Google Tango? It also had depth sensing.

Josh Miller / CNET

Apple is not the first to explore technology like this on a phone

Google had the same idea in mind when it created Project Tango – an early AR platform that was only ever on two phones. The advanced camera array also had infrared sensors and could map rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google’s Tango-equipped phones were short-lived, replaced by computer vision algorithms that do estimated depth sensing on cameras without needing the same hardware. But Apple’s iPhone 12 Pro looks like a much more advanced successor.





