Three-dimensional technology in smartphones and tablets is nothing new. HTC and LG have both taken a crack at it with past smartphones and tablets, and the HTC One (M8) even includes a depth-of-field effect you can use to enhance the pictures you take. Let's also not forget the fabled Amazon smartphone, which will reportedly use 3D tracking technology via four extra front-facing camera sensors and which we'll finally get a look at tomorrow when Amazon unveils the phone. In truth, though, the 3D technology we have seen in smartphones and tablets so far is nothing like what we'll be seeing from the company building the tech for Google's Project Tango devices.
The company behind the tech is called Mantis Vision, and they spent some time with Business Insider showing off what it's capable of. This truly stunning technology doesn't just add cool 3D effects; it captures enough depth data to re-render an image entirely and provide a full 3D view of the subject. For example, you could take a close-up picture of a bird with its wings down (as a side note, Mantis Vision explains that the closer you are to your subject, the better the technology works), and Mantis Vision's 3D imaging could presumably reproduce that image so the bird looks like its wings are spread, as if it's about to take off. Mantis Vision demonstrated the technology to Business Insider in the example below: the original image is of a person sitting down, but using Mantis's technology they are able to re-render it so the person appears to be standing instead. It's also worth noting that the GIF below represents raw data, so by the time this type of technology makes it into our smartphones and tablets, we can expect much finer quality and sharper images.
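Mantis Vision hasn't published the details of its pipeline, but depth sensors of this kind typically produce a per-pixel depth map that gets back-projected into a 3D point cloud, which is roughly the sort of "raw data" the GIF shows. Below is a minimal sketch of that back-projection step, assuming standard pinhole camera intrinsics; the function name and parameters are illustrative only and are not part of any Mantis Vision or Project Tango SDK.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth map (in meters) into a 3D point cloud.

    depth: 2D array of per-pixel depth values from a depth sensor.
    fx, fy: focal lengths in pixels; cx, cy: principal point in pixels.
    Returns an (N, 3) array of X, Y, Z points in camera coordinates.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel grid coordinates
    z = depth
    x = (u - cx) * z / fx  # standard pinhole back-projection
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading
```

Once the scene exists as a point cloud like this, it can be rendered from viewpoints other than the one the photo was taken from, which is what makes the sitting-to-standing style re-rendering in the demo possible.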
While re-rendering images to look completely different is cool on its own, Mantis may also be able to make this technology work with any sensor on the phone, so you aren't limited to interacting with an image through the screen. A good example of this appears in one of the GIFs below, where the view of the image shifts as the device is held at different angles. This may not be exactly what we see from Google's Project Tango devices or devices modeled after them, but the underlying technology is the same, and it does a good job of illustrating what will be possible with the cameras in our mobile devices in the future.
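To give a rough sense of how a device sensor could drive the view, here is a small sketch that rotates a captured point cloud around its vertical axis based on a yaw angle, as you might read from a phone's orientation sensor. Again, this is a hypothetical illustration and not tied to any actual Mantis Vision or Project Tango API.

```python
import numpy as np

def rotate_point_cloud(points, yaw_deg):
    """Rotate an (N, 3) point cloud around the vertical (Y) axis.

    A crude stand-in for viewing captured 3D data from a new angle
    as the device's orientation changes.
    """
    theta = np.radians(yaw_deg)
    rot_y = np.array([
        [ np.cos(theta), 0.0, np.sin(theta)],
        [ 0.0,           1.0, 0.0          ],
        [-np.sin(theta), 0.0, np.cos(theta)],
    ])
    return points @ rot_y.T  # apply the rotation to every point
```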