According to recent reports, Google has at least one set of employees with a job you are unlikely to find at almost any other company in the tech industry. The group, which operates under the label of the “NightWatch” team, is tasked with considering the cultural and situational diversity of Google’s millions of users. It then puts forward ideas for improving applications in terms of both safety and ethical values. According to its members, NightWatch has a distinct advantage in that it comprises a very diverse set of people. The team includes both men and women from a wide variety of cultures and career backgrounds, and that’s no accident. Amber Yust, a software engineer on the team, says that while it is not impossible for a less diverse team to make good ethical decisions, it is much easier when the team is more representative of the world as a whole. Unfortunately, according to Lea Kissner, the team’s current leader, this “is not what most of Silicon Valley looks like.”
One example Google puts forward to show the kind of work NightWatch is responsible for is YouTube’s face blurring tool. NightWatch originally conceived of the tool following the outbreak of protests in Syria, when videos of the events were being uploaded and the people appearing in them were put at risk. The team responded, and the smart face blurring tool for content creators was released in February of last year. However, as is usually the case, things were a lot more complicated than that. Members of the NightWatch team quickly realized that the unblurred versions of videos might still be obtained and used against protesters. So an option was added allowing users to permanently delete those versions, and the company made extra efforts to ensure users knew it existed, releasing instructions in each of the languages the service supports. The team also considered the implications of geotagging when creating the feature. Originally, the idea was put forward that users could trigger the smart blurring based on location, but that raised potential permission problems around location data collection, so the team scrapped the idea.
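YouTube’s actual implementation is not public, but automated face blurring generally boils down to detecting faces in each frame and obscuring those regions before the video is published. The sketch below illustrates that idea using OpenCV’s stock Haar cascade detector; the file names and detector parameters are illustrative assumptions, not anything drawn from Google’s tool.

```python
import cv2

# OpenCV ships a pre-trained frontal-face Haar cascade detector.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def blur_faces(frame):
    """Detect faces in a single frame and blur each detected region."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = frame[y:y + h, x:x + w]
        # A heavy Gaussian blur renders the region unrecognizable.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return frame

# Process an input video frame by frame, writing only the blurred output.
# "input.mp4" and "blurred.mp4" are placeholder file names.
cap = cv2.VideoCapture("input.mp4")
out = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if out is None:
        h, w = frame.shape[:2]
        out = cv2.VideoWriter("blurred.mp4", cv2.VideoWriter_fourcc(*"mp4v"),
                              cap.get(cv2.CAP_PROP_FPS), (w, h))
    out.write(blur_faces(frame))
cap.release()
if out is not None:
    out.release()
```

Notably, the sketch never stores an unblurred copy of the output, which mirrors the concern the team raised: any retained original is a liability, so the safest workflow deletes it or never keeps it at all.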
What perhaps makes the team’s revelation most interesting is how it clashes with a common perception of big corporations: the idea that those entities operate only with profits in mind. While the search giant likely has plenty of for-profit reasoning behind NightWatch, the team’s existence suggests the company remains committed to its “don’t be evil” mantra, even though it technically dropped the somewhat vague slogan years ago. If nothing else, it shows the company’s continued commitment to user safety.