Apple CEO Tim Cook has said that augmented reality will "change everything".
In recent years, the technology has already affected several industries. It has helped companies become more efficient. It has opened a new way for brands to market products. It can turn neighborhoods into a virtual playground. And it can add another dimension of fun and creativity to photos and videos. As a result, AR has attracted the attention of investors.
But what is augmented reality, exactly? If you are looking for the answer, you have come to the right place.
Definition of augmented reality and its characteristics
While modern technology has brought about a "golden age" of augmented reality, the concept, and even some implementations of the technology, are not new.
In 1994, researchers Paul Milgram, Haruo Takemura, Akira Utsumi, and Fumio Kishino defined augmented reality and virtual reality as points on a spectrum, dubbing it the reality-virtuality continuum. On one end of the spectrum is the real environment, seen naturally by humans. At the opposite end of the spectrum is virtual reality, where the real environment is completely replaced by a digital one.
The points between the real environment and virtual reality are occupied by augmented reality, where hardware and software supplement the natural environment with digital content.
The researchers also coined the term mixed reality as an umbrella classification for technology that merges real and virtual environments, though Microsoft has since adopted the term for its own Mixed Reality platform for VR, confusing some consumers in recent years.
Fast-forward to the modern day, and realizing that textbook definition of augmented reality depends on a computer's environmental understanding, delivered via a connected camera, to place virtual content within the user's field of view.
One way environmental understanding is achieved is through markers, which give the computer a consistent point to track within the environment. A marker can be an image, similar to a QR code, that the computer's camera recognizes as an area for placing virtual content. Another way to establish a marker is a beacon that communicates its physical location to the AR device.
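To make marker tracking concrete, here is a minimal sketch of what happens after a square marker has been detected in a camera frame. The function name and the assumption that the marker's four corner pixels are already available are hypothetical; real toolkits (e.g., ArUco-style libraries) handle detection and full pose estimation, but the basic idea of deriving a placement anchor from the marker's corners looks like this:

```python
import numpy as np

def anchor_from_marker(corners):
    """Given the four detected corner pixels of a square marker
    (in image coordinates), return a placement anchor: the marker's
    center point and an approximate scale for the virtual content."""
    pts = np.asarray(corners, dtype=float)   # shape (4, 2)
    center = pts.mean(axis=0)                # centroid of the corners
    # Average edge length approximates the marker's apparent size,
    # letting virtual content shrink as the marker gets farther away.
    edges = np.linalg.norm(np.roll(pts, -1, axis=0) - pts, axis=1)
    scale = edges.mean()
    return center, scale

# A marker seen head-on as a 100-pixel square centered at (200, 150):
corners = [(150, 100), (250, 100), (250, 200), (150, 200)]
center, scale = anchor_from_marker(corners)
# center ≈ (200, 150), scale ≈ 100
```

Production systems go further, computing a full homography or 6-DoF pose from the corners, but the anchor-from-corners step is the core of "recognize the marker, place the content there."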
Conversely, markerless environmental understanding means building a 3D map of the environment. Early markerless augmented reality experiences required a camera that could sense the depth of the environment. Without a depth sensor, computers can use computer vision algorithms trained to estimate surfaces on which to anchor virtual content.
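As a toy illustration of surface estimation, the sketch below fits a plane to a handful of 3D feature points with least squares. This is a deliberate simplification of what real SLAM-style pipelines do (they track thousands of points across frames), but it shows the essence of "find a flat surface to anchor content on":

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of a plane z = a*x + b*y + c to 3D points —
    a toy stand-in for the surface-estimation step in markerless AR."""
    pts = np.asarray(points, dtype=float)
    # Each row of A is [x, y, 1]; we solve A @ (a, b, c) ≈ z.
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    coeffs, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
    return coeffs  # (a, b, c)

# Feature points lying on a flat floor at height z = 0.5:
floor = [(0, 0, 0.5), (1, 0, 0.5), (0, 1, 0.5), (1, 1, 0.5)]
a, b, c = fit_plane(floor)
# a ≈ 0, b ≈ 0, c ≈ 0.5 — a horizontal plane, as expected
```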
Another element of environmental understanding is occlusion, which refers to real-world objects blocking the display of virtual content from the point of view of the computer's camera and its user, enhancing the realism of the virtual content. Usually this requires a depth sensor, but advances in computer vision have demonstrated the ability to identify physical objects within the camera view without one.
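At its core, occlusion is a per-pixel depth test: a virtual pixel is drawn only if it is closer to the camera than the real scene at that pixel. A minimal sketch, assuming a depth map of the real scene is available (from a depth sensor or a depth-estimation model):

```python
import numpy as np

def composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth):
    """Draw a virtual pixel only where it is closer to the camera
    than the real scene, so real objects occlude virtual content.
    Depths are in meters."""
    mask = virt_depth < real_depth          # True where virtual content wins
    out = real_rgb.copy()
    out[mask] = virt_rgb[mask]
    return out

# 2x2 scene: real wall at 2 m, one real object at 1 m (top-left pixel);
# a virtual cube rendered at 1.5 m covering the whole frame.
real_depth = np.array([[1.0, 2.0], [2.0, 2.0]])
virt_depth = np.full((2, 2), 1.5)
real_rgb = np.zeros((2, 2, 3), dtype=np.uint8)       # black real scene
virt_rgb = np.full((2, 2, 3), 255, dtype=np.uint8)   # white virtual cube
out = composite_with_occlusion(real_rgb, real_depth, virt_rgb, virt_depth)
# Top-left pixel stays black: the 1 m real object occludes the 1.5 m cube.
```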
Finally, realistic augmented reality experiences require 3D content. Developers generally use the same game engines used to create virtual reality experiences, primarily Unity and Unreal Engine, to build augmented reality content. Along with 3D engines, AR experiences need 3D models to display in real physical environments. Models can be created in 3D modeling programs or captured through photogrammetry of real-world objects.
We have established that augmented reality experiences are delivered via computers. For the average consumer, that means smartphones and tablets, which have the cameras needed to read markers or detect surfaces and the mobility to let users point the device at their surroundings. Many current smartphones also include sensors (usually an accelerometer, a magnetometer, and a gyroscope) that allow AR software to orient the user relative to their environment and the virtual content.
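As a rough illustration of how those sensors contribute, the accelerometer alone can estimate a resting device's tilt from the direction of gravity. This is a simplified sketch: axis conventions vary by platform, and real AR frameworks fuse all three sensors (plus the camera) for a full, drift-corrected orientation.

```python
import math

def tilt_from_accelerometer(ax, ay, az):
    """Estimate device pitch and roll (in degrees) from the gravity
    vector reported by the accelerometer while the device is at rest.
    Convention assumed here: z points out of the screen."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat on a table: gravity (9.81 m/s^2) along +z only.
pitch, roll = tilt_from_accelerometer(0.0, 0.0, 9.81)
# pitch ≈ 0°, roll ≈ 0°
```

The magnetometer then supplies heading (which way is north), and the gyroscope smooths rapid rotations between accelerometer readings.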
For a more natural experience, however, head-mounted displays, namely fully immersive augmented reality headsets and smartglasses, fit the bill. In layman's terms, AR headsets are essentially wearable devices with miniature displays positioned in the user's line of sight, along with a computer either embedded in the wearable itself or connected (tethered) to an external machine. In more advanced cases, AR headsets also contain depth sensors for environmental mapping.
Augmented reality has also made its way into cars. Heads-up displays in modern car models project dashboard information, infotainment, and navigation onto the driver's windshield. As autonomous vehicles replace manually driven models, augmented reality is likely to play a role in sharing the vehicle's view of the world, such as its recognition of other cars, pedestrians, and traffic hazards, with its passengers.