Google Uses Lag Detecting Robot to Test Touchscreens

One of the criticisms levelled at Android is the lag between our finger touching the screen and the device responding. Our eyes and minds are very attuned to how responsive the world is: almost everything we touch has texture, pressure and, usually, an immediate response. Our touchscreen devices are a little different, because the texture and pressure are always the same but the response can vary. Even a small hesitation between touching the screen and the device behaving as we want can feel uncomfortable. Tolerance to this lag varies from person to person, but many of us are annoyed when a device does not appear to respond to an input, so we tap again, only for the device to then do the same thing twice!

Over the years, we’ve seen Google working to reduce the latency between a touch and the device responding. One of the most notable examples is Project Butter, introduced in Android 4.1 Jelly Bean. Project Butter is about maintaining a smooth, flowing user experience, with fewer dropped frames during animation and a more responsive device. These changes ran deeper into the operating system than simply improving animation buffering: one change mandated by Project Butter is that the processor clock speed is raised when the device detects activity on the touchscreen. This means the device is ready to go should the user take further action. It does mean the processor runs at a higher clock speed than it would on earlier versions of Android, but it also improves the responsiveness of the device.
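The touch-boost idea described above can be sketched as a simple policy: when a touch event arrives, raise the floor on the CPU clock for a short window, then let it fall back to normal. The sketch below is a hypothetical illustration in Python, not Android's actual governor code; the frequencies and the boost window are invented values.

```python
import time

class TouchBoostGovernor:
    """Hypothetical sketch of a touch-boost policy.

    On a touch event the minimum clock is raised for a short window,
    so the device is ready for whatever the user does next. All
    frequencies and the timeout are illustrative, not Android's
    real numbers.
    """

    def __init__(self, base_khz=300_000, boost_khz=1_200_000, boost_window_s=1.5):
        self.base_khz = base_khz            # normal minimum clock
        self.boost_khz = boost_khz          # raised minimum while boosted
        self.boost_window_s = boost_window_s
        self._boost_until = 0.0

    def on_touch_event(self, now=None):
        # Any touchscreen input (re)starts the boost window.
        now = time.monotonic() if now is None else now
        self._boost_until = now + self.boost_window_s

    def min_frequency_khz(self, now=None):
        # The frequency floor the scaling governor should enforce.
        now = time.monotonic() if now is None else now
        return self.boost_khz if now < self._boost_until else self.base_khz
```

With this policy, a touch at time 0 keeps the clock floor at the boosted value for 1.5 seconds, after which it drops back to the base frequency.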

How does Google measure the latency between touch and action, the lag? It turns out that the Chrome OS team uses a robot built by Finnish company OptoFidelity to measure and quantify the lag of both Android and Chrome OS devices. The YouTube clip at the bottom of this article shows the robot drawing along the screen in a series of web-based tests, which are measured by the device. High-speed video footage reveals that a line drawn on the screen is actually rendered in segments that slowly fade away. Google uses this technology to track down issues in both the software and the hardware of devices; there are other ways for Google to measure latency, but given that so much of its technology relies on a touchscreen, this could be one of the most important.
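In principle, quantifying the lag from high-speed footage comes down to comparing the timestamp at which the stylus reaches a point on the screen with the timestamp at which the display first responds there. The sketch below is a simplified illustration of that calculation with invented sample data, not OptoFidelity's actual method.

```python
def estimate_touch_latency_ms(touch_times_ms, render_times_ms):
    """Average delay between stylus contact and on-screen response.

    touch_times_ms[i]  -- when the stylus reached sample point i
    render_times_ms[i] -- when the display first responded at point i
    Both lists would come from timestamped high-speed camera frames.
    """
    lags = [r - t for t, r in zip(touch_times_ms, render_times_ms)]
    return sum(lags) / len(lags)

# Invented example: three sample points along a drawn line.
average_lag = estimate_touch_latency_ms([0, 10, 20], [95, 110, 118])
```

Averaging over many points along the robot's stroke smooths out frame-rate quantisation in the camera footage and gives a single lag figure per device.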