What Is A Time-of-Flight (ToF) Camera?

Samsung’s new Galaxy Note 10+ ships with what the company calls a DepthVision Lens on the back, and while that’s just a fancy name for a Time-of-Flight (ToF) camera, the concept of a ToF camera isn’t all that well known to most people. ToF cameras go by many names, including 3D sensors and range cameras, but they all share the same inner workings.

By definition, a Time-of-Flight camera uses infrared light to measure the distance between itself and the objects in its field of view. Simply put, the camera emits infrared light, which shoots out like invisible little laser beams. These beams bounce off whatever they hit, and the time it takes for a beam to fly out and return to the sensor is its time of flight. Since the camera knows how fast light travels, it can calculate the total round-trip distance, halve it, and arrive at the distance to the object.
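To make that math concrete, here’s a minimal sketch in Kotlin. The round-trip time is an illustrative assumption; real sensors measure pulse timing or phase shifts in dedicated hardware.

```kotlin
// Time-of-flight distance math: distance = speed of light × time ÷ 2.
const val SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

fun distanceMeters(roundTripSeconds: Double): Double =
    // The beam travels out to the object and back, so halve the total path.
    SPEED_OF_LIGHT_M_PER_S * roundTripSeconds / 2.0

fun main() {
    val roundTrip = 6.67e-9 // about 6.67 nanoseconds (hypothetical reading)
    println("Object is roughly %.2f m away".format(distanceMeters(roundTrip)))
    // Prints: Object is roughly 1.00 m away
}
```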

The best part about a ToF camera isn’t even the accurate distance measurements it provides; it’s the fact that all of this calculation happens right on the camera module and doesn’t take up precious processing power on the phone. This leaves more room for the advanced computational photography modern smartphones rely on, meaning higher-quality pictures with little to no processing time.

ToF cameras started finding their way into phones last year, when the tech became more readily available to smartphone manufacturers. Up to this point, though, most phones that ship with a ToF camera, like the HONOR View20 and the Huawei P30 Pro, barely utilize it at all. In fact, the only phone that has put its ToF camera to work in a daily-use scenario is the LG G8 ThinQ, which places the camera on the front rather than the back.

LG intelligently used this camera for 3D facial mapping in its proprietary face unlock technology, which is right up there in quality and speed with Apple’s Face ID on the iPhone X series. The biggest difference is that LG only needs one camera for that trick (the ToF camera), while Apple uses several sensors, including a dot projector, to get the job done.

LG also uses it for advanced depth perception with the front-facing camera, which gives it the ability to create 3D lighting effects and other depth tricks based on the actual calculated distance between the phone and your face.
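As a hedged sketch of how one such depth trick could work: a depth map can be turned into per-pixel surface normals, which is what a synthetic light source needs in order to shade a face convincingly. The tiny depth grid and light direction below are illustrative assumptions, not LG’s actual pipeline.

```kotlin
import kotlin.math.sqrt

// Estimate surface normals from a depth map via finite differences, then
// shade each pixel against a light direction (simple Lambertian lighting).
fun relight(depth: Array<DoubleArray>, light: DoubleArray): Array<DoubleArray> {
    val out = Array(depth.size) { DoubleArray(depth[0].size) }
    for (y in 1 until depth.size - 1) {
        for (x in 1 until depth[0].size - 1) {
            // Depth gradients approximate how steeply the surface tilts.
            val dzdx = (depth[y][x + 1] - depth[y][x - 1]) / 2.0
            val dzdy = (depth[y + 1][x] - depth[y - 1][x]) / 2.0
            // Unit normal of the surface z = depth(x, y).
            val len = sqrt(dzdx * dzdx + dzdy * dzdy + 1.0)
            val (nx, ny, nz) = Triple(-dzdx / len, -dzdy / len, 1.0 / len)
            // Brightness is how directly the pixel faces the light source.
            out[y][x] = maxOf(0.0, nx * light[0] + ny * light[1] + nz * light[2])
        }
    }
    return out
}

fun main() {
    // 3x3 toy depth map (meters): a surface sloping away to the right.
    val depth = arrayOf(
        doubleArrayOf(1.0, 1.1, 1.2),
        doubleArrayOf(1.0, 1.1, 1.2),
        doubleArrayOf(1.0, 1.1, 1.2)
    )
    val lit = relight(depth, doubleArrayOf(0.0, 0.0, 1.0)) // light straight on
    println("Center pixel brightness: %.3f".format(lit[1][1])) // ≈ 0.995
}
```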

Samsung’s implementation is closer to this second half of LG’s functionality except, of course, it applies to the rear cameras on the Galaxy Note 10+, since that’s where the ToF camera sits. Samsung has been honing its depth perception for years, including the ability to change the look of the “bokeh” effect on the last two generations of its flagship smartphones.

Have you ever taken a portrait mode shot and seen that ugly, unrealistic-looking edge blurring around a subject or, worse yet, watched entire objects get incorrectly included in or excluded from the faux lens effect?

That should almost never happen on the Galaxy Note 10+, thanks to the new ToF sensor.
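Here’s a minimal sketch of why real depth data fixes those edge errors: with a per-pixel depth map, deciding what stays sharp becomes a simple distance cutoff rather than guesswork from a flat 2D image. The depth values and the 1.5 m threshold are illustrative assumptions.

```kotlin
// Mark pixels closer than the cutoff as foreground (kept sharp); the rest
// get the synthetic background blur.
fun foregroundMask(depthMeters: FloatArray, cutoffMeters: Float): BooleanArray =
    BooleanArray(depthMeters.size) { i -> depthMeters[i] < cutoffMeters }

fun main() {
    // Toy 2x3 depth map, row-major: a subject at ~0.8 m against a wall at ~2.5 m.
    val depth = floatArrayOf(0.8f, 0.8f, 2.5f, 0.8f, 2.4f, 2.5f)
    val keepSharp = foregroundMask(depth, cutoffMeters = 1.5f)
    println(keepSharp.joinToString()) // true, true, false, true, false, false
}
```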

What else should we expect from ToF cameras going forward? Google’s latest ARCore, the technology behind advanced augmented reality games and apps, now supports ToF cameras, which enhances realism in AR applications by providing better depth data straight from the camera. Now that Samsung has jumped aboard the ToF train, expect even more developers to start taking advantage of this hardware with updates and new apps in the very near future.
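For developers curious what that depth data actually looks like, Android’s Camera2 API exposes depth frames in the DEPTH16 pixel format, where each 16-bit sample packs a 13-bit range in millimeters plus a 3-bit confidence code (per the Android ImageFormat.DEPTH16 documentation). A quick decoding sketch, with a made-up sample value:

```kotlin
// Decode one DEPTH16 pixel into its range and confidence components.
fun decodeDepth16(sample: Short): Pair<Int, Int> {
    val bits = sample.toInt() and 0xFFFF
    val rangeMillimeters = bits and 0x1FFF      // lower 13 bits: distance in mm
    val confidenceCode = (bits shr 13) and 0x7  // upper 3 bits: confidence code
    return rangeMillimeters to confidenceCode
}

fun main() {
    val sample = ((1 shl 13) or 850).toShort() // hypothetical pixel: 850 mm
    val (range, confidence) = decodeDepth16(sample)
    println("Range: $range mm, confidence code: $confidence")
}
```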