Apple is betting big on lidar, a technology that is brand new to the iPhone 12 family, specifically the iPhone 12 Pro (with the Pro Max to follow in a few weeks).
Look closely at one of the new iPhone 12 Pro models, or at the most recent iPad Pro, and you'll see a small black dot near the camera lenses, about the size of the flash. That's the lidar sensor, and it's a new type of depth sensing that could make a difference in a number of interesting ways.
If Apple has its way, lidar is a term you'll start hearing a lot, so let's break down what we know, what Apple intends to use it for and where the technology could go next.
What does lidar mean?
Lidar stands for light detection and ranging, and it has been around for a while. It uses lasers to ping objects and measures the distance by timing the travel, or flight, of each light pulse back to the source.
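To make the timing idea concrete, here is a minimal sketch of the underlying arithmetic (an illustration of the general time-of-flight principle, not Apple's actual implementation): the one-way distance is the speed of light multiplied by the measured round-trip time, divided by two.

```python
# Time-of-flight distance estimation: an illustrative sketch, not
# Apple's implementation. Distance = (speed of light x round-trip time) / 2.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_flight_time(round_trip_seconds: float) -> float:
    """Return the one-way distance in meters for a measured round-trip time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after roughly 33 nanoseconds traveled to an object
# about 5 meters away -- the stated maximum range of the rear sensor.
print(round(distance_from_flight_time(33.356e-9), 2))
```

The nanosecond-scale times involved are why dedicated hardware is needed: a camera sensor has to resolve timing differences far too fast for ordinary electronics to measure casually.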
How does lidar work to sense depth?
Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar technology sends out waves of light pulses in a spray of infrared dots and can measure each one with its sensor, creating a field of points that maps out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
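As a toy model of that spray of dots (an assumed simplification, not Apple's pipeline), imagine each dot returning its own round-trip time; converting every dot's time to a distance and arranging the results in a grid yields a depth field of the scene.

```python
# Toy sketch: turning a grid of per-dot round-trip times into a depth field.
# This is an assumed simplification for illustration, not Apple's pipeline.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_map_from_times(times_grid):
    """Convert a 2D grid of round-trip times (seconds) into depths (meters)."""
    return [[SPEED_OF_LIGHT * t / 2 for t in row] for row in times_grid]

# 2x3 grid of simulated round-trip times: the center column returns
# sooner, meaning a nearer object there.
times = [
    [20e-9, 10e-9, 20e-9],
    [20e-9, 10e-9, 20e-9],
]
depths = depth_map_from_times(times)
# Center column is closer (~1.5 m) than the edges (~3 m).
print([[round(d, 1) for d in row] for row in depths])
```

A real sensor produces far denser grids, and "meshing" then means connecting neighboring depth points into a 3D surface.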
Isn't that like Face ID on the iPhone?
It is, but with longer range. The idea is the same: Apple's TrueDepth camera, which powers Face ID, also fires out an array of infrared lasers, but it works only at short range. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.
Lidar is already in a lot of other technology
Lidar is a technology that is sprouting up everywhere. It's used in self-driving and assisted-driving cars. It's used in robotics and drones. Augmented reality headsets such as the HoloLens have similar technology, mapping out room spaces before layering 3D virtual objects into them. But lidar also has a fairly long history.
Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera with infrared depth scanning, too. In fact, PrimeSense, the company that helped create the Kinect's technology, was later acquired by Apple. Now we have Apple's face-scanning TrueDepth and rear lidar camera sensors.
The iPhone 12 Pro's camera could work better with lidar
Time-of-flight cameras on smartphones are typically used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better focus in low light, up to six times faster. The lidar depth sensing is also used to improve night portrait mode effects.
Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images, too. Although that element hasn't been laid out yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way by apps.
It will also greatly enhance augmented reality
Lidar allows the iPhone 12 Pro to launch AR apps much faster and to build a quick map of a room to add more detail. Plenty of AR experiences now use lidar to hide virtual objects behind real ones (a technique called occlusion) and to place virtual objects within more complicated room mappings, such as on a table or chair.
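The occlusion idea reduces to a per-pixel depth comparison. The sketch below shows the general logic (an assumed illustration, not ARKit's internals): a virtual object's pixel is drawn only where the object is closer to the camera than the real-world depth the lidar measured at that pixel.

```python
# Minimal occlusion sketch (assumed logic, not ARKit internals): a virtual
# object is visible at a pixel only if it sits in front of the real-world
# surface the lidar measured there.

def occlusion_mask(real_depths, virtual_depth):
    """Return True where a virtual object at virtual_depth meters is visible."""
    return [[virtual_depth < d for d in row] for row in real_depths]

# Real scene: a chair 1 m away on the left, a wall 4 m away on the right.
real = [
    [1.0, 1.0, 4.0, 4.0],
    [1.0, 1.0, 4.0, 4.0],
]
# A virtual object placed 2 m away is hidden behind the chair (False)
# but visible in front of the wall (True).
print(occlusion_mask(real, 2.0))
```

Without a depth map, an AR app has no way to make this comparison, which is why virtual objects on older phones tended to float unconvincingly in front of everything.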
But there is extra potential beyond that, with a longer tail. Many companies are dreaming up headsets that blend virtual objects with real ones: AR glasses, which a number of companies are working on, will rely on having advanced 3D maps of the world to layer virtual objects onto.
Those 3D maps are currently built with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there is a possibility that people's own devices could eventually help gather that information, or add extra on-the-fly data. Again, AR headsets like the Magic Leap and HoloLens already scan your environment before placing things in it, and Apple's lidar-equipped AR technology works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part ... and could pave the way for Apple to eventually make its own glasses.
3D scanning could be the killer app
Lidar can be used to mesh out 3D objects and spaces and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture technology for practical uses, and even for social media and journalism. The ability to capture 3D data and share that information with others could turn these lidar-equipped phones and tablets into 3D content-capture tools. Lidar can also be used without the camera element to acquire measurements for objects and spaces.
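As a small illustration of camera-free measurement (an assumed approach, not a specific app's method), once a scan yields a cloud of 3D points for an object, its rough dimensions fall out of the points' bounding box.

```python
# Sketch of measuring an object from lidar point data alone (assumed
# approach for illustration): the axis-aligned bounding box of the scanned
# points gives rough width, depth and height.

def bounding_box_dimensions(points):
    """points: list of (x, y, z) tuples in meters -> (dx, dy, dz) extents."""
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs), max(ys) - min(ys), max(zs) - min(zs))

# Simulated points scanned from a table roughly 1.2 m x 0.6 m x 0.75 m.
table_points = [
    (0.0, 0.0, 0.0), (1.2, 0.0, 0.0),
    (0.0, 0.6, 0.0), (1.2, 0.6, 0.75),
]
print(bounding_box_dimensions(table_points))
```

Real scans contain noise and thousands of points, so production apps fit planes and shapes rather than taking a raw bounding box, but the principle is the same: distances come straight from the depth data, no photo required.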
Apple is not the first to explore technology like this on a phone
Google had the same idea in mind when Project Tango, an early AR platform, was created. Its advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that do estimated depth sensing on standard cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a much more capable successor.