How the 3D camera in the 2017 iPhone 8 might work

Let's set aside Samsung's Galaxy S8 devices, coming in April or May, which stand out for their design and for some of their internal components, like the new smart assistant built on Viv, a company Samsung acquired. Instead, let's talk about their sibling rival coming from Apple: the iPhone 8.

Many sources have spoken of Apple's desire to launch a flagship-class phone, with an OLED screen that integrates the fingerprint sensor beneath it, long-range wireless charging, a body made of glass, and, most importantly, the camera. Here we are talking not about the rear camera but the front one, which may be a three-dimensional (3D) camera.

Three-dimensional

The idea of a camera capable of capturing elements in three dimensions may seem revolutionary, and so may the technologies behind it. But the underlying concepts are simple, even fun, and the feature can be implemented with more than one technology. It could also be used in the rear camera, since nothing prevents that.

Some reports indicate that Apple wants to rely on one sensor that emits infrared light and another that receives it. When the user takes a picture, the first sensor emits infrared light invisible to the user; after it bounces back and returns to the second sensor, algorithms calculate the distance to each point and then convert the image captured by the camera into a three-dimensional one.
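The core calculation is straightforward: the round-trip time of the infrared pulse, multiplied by the speed of light and halved, gives the distance. A minimal sketch of that idea (the function name and example timing are illustrative, not from any Apple documentation):

```python
# Sketch of the time-of-flight principle: the round-trip time of an
# infrared pulse gives the distance to each point in the scene.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface from the round-trip time of a light pulse."""
    # The pulse travels to the object and back, so halve the total path.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after ~3.34 nanoseconds came from about 0.5 m away,
# roughly selfie distance.
print(round(tof_distance(3.336e-9), 2))
```

Note how tight the timing is: at selfie distances the sensor must resolve differences of nanoseconds, which is why this approach needs dedicated hardware rather than an ordinary camera sensor.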

Put simply, a camera normally captures flat, two-dimensional images, in which you cannot move around or perceive the depth of any element. That will still be true of the iPhone 8, but while a picture is being taken the device will also emit infrared light and then receive it again to determine how far away things are. Once the device knows the distance of each element, algorithms can add that depth to the flat image.
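Adding depth to a flat image amounts to mapping each pixel, together with its measured distance, back into 3D space. A minimal sketch using the standard pinhole-camera model; the intrinsic parameters (focal lengths, image centre) below are made-up example values, not the iPhone's:

```python
# Sketch of back-projection: a pixel (u, v) plus its measured depth
# becomes a 3D point, using a pinhole-camera model with example intrinsics.

def backproject(u: float, v: float, depth: float,
                fx: float = 600.0, fy: float = 600.0,
                cx: float = 320.0, cy: float = 240.0):
    """Map a pixel and its depth (metres) to a 3D point (x, y, z)."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# The image centre at 0.5 m depth sits straight ahead of the camera.
print(backproject(320.0, 240.0, 0.5))  # (0.0, 0.0, 0.5)
```

Running this over every pixel of a depth map turns the flat photo into a point cloud, which is essentially the "added dimension" the algorithms produce.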

At the same time, reports went further and spoke of a second mechanism Apple may adopt, also made up of two sensors: the first projects a matrix, an optical grid, and the second works as a camera that photographs that matrix and the way the elements distort it. This mechanism is known as Structured Light, while the first one, which relies on infrared timing, is known as Time-of-Flight.

The first sensor projects the optical grid and, as with any light, each object it strikes alters its shape. If we had a ball and shone the light at it from below, we would notice the illumination concentrated to varying degrees across the ball. The job of the second sensor, the camera, is to read this pattern, how the light falls on each element, in order to infer depth, and then send the data to the algorithms that add a dimension to the captured image.
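Because the projector and the camera sit a known distance apart, the sideways shift of each dot in the grid reveals its depth by simple triangulation. A minimal sketch of that relationship; the baseline and focal length are invented example numbers:

```python
# Sketch of structured-light triangulation: a dot projected onto a nearby
# surface appears shifted (disparity) in the camera image; depth follows
# from the projector-camera baseline and the camera's focal length.

def depth_from_disparity(disparity_px: float,
                         baseline_m: float = 0.02,
                         focal_px: float = 600.0) -> float:
    """Depth of a projected dot from its pixel shift in the camera image."""
    return baseline_m * focal_px / disparity_px

# With a 2 cm baseline, a dot shifted by 24 pixels lies half a metre away.
print(depth_from_disparity(24.0))  # 0.5
```

Closer objects shift the dots more, so disparity and depth are inversely related, which is why the formula divides by the measured shift.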

Of course, infrared could be abandoned in favour of ultrasound, which is plausible because Apple reportedly wants to use those waves in the screen to read fingerprints. The screen could thus emit ultrasound while the user takes a selfie; a sensor would read the reflections, and algorithms capable of performing those operations within a fraction of a second would convert the data into depth.
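The distance calculation is the same round-trip idea as with infrared, just with the speed of sound instead of the speed of light (the ultrasound variant is speculation in the article, not a confirmed design). Because sound is roughly a million times slower, the timing hardware can be far coarser:

```python
# Same round-trip principle as time-of-flight, but with ultrasound:
# echoes return after milliseconds rather than nanoseconds.

SPEED_OF_SOUND = 343.0  # metres per second in air at ~20 °C

def ultrasound_distance(round_trip_seconds: float) -> float:
    """Distance from the round-trip time of an ultrasonic pulse."""
    return SPEED_OF_SOUND * round_trip_seconds / 2.0

# A face 0.5 m away returns an echo after roughly 2.9 milliseconds,
# comfortably within a fraction of a second.
print(round(ultrasound_distance(2.915e-3), 3))  # 0.5
```

The millisecond-scale echo times also explain how the whole measurement fits within the fraction of a second the capture pipeline allows.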

The uses of such technology are many. Apple could first verify the user's identity not only by fingerprint but also by face, reading the details of the face and comparing them with what is stored on the device. Beyond that, the phone could produce three-dimensional images, and even use them in Augmented Reality applications, for example transforming the user's face into a cartoon character in a game.
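The face-verification idea can be sketched as comparing a vector of depth-derived facial measurements against the one enrolled on the device and accepting a match when they are close enough. Everything here (the features, the values, the threshold) is a hypothetical illustration, not Apple's actual method:

```python
# Hypothetical sketch of face verification: compare a few depth-derived
# facial measurements with the enrolled template; accept if close enough.
import math

def face_matches(live: list, enrolled: list,
                 threshold: float = 0.05) -> bool:
    """Accept if the Euclidean distance between feature vectors is small."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(live, enrolled)))
    return dist < threshold

# Invented example features: nose depth, eye spacing, jaw width (metres).
enrolled = [0.032, 0.118, 0.054]
print(face_matches([0.033, 0.117, 0.055], enrolled))  # True: same person
print(face_matches([0.060, 0.090, 0.020], enrolled))  # False: different face
```

A real system would use many more measurements and a learned model, but the principle is the same: depth data turns the face into numbers that can be compared with a stored template.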
Source: agencies