According to various recent reports, as many as 259 people died while taking selfies in the six years between 2011 and 2017, and it takes little more than a Google search to see that the number appears to be increasing. The uptick isn't without explanation, but the simplest answer, repeated by users across the internet and linking the deaths to risky behavior, isn't always the correct one, however intuitive it may seem.
The blame can't necessarily be placed on the industry either. Smartphone makers have continued to focus on camera technology, pushing boundaries and inventing new ways to incorporate better front-facing cameras. That improves the quality of the photos users snap of themselves, their friends, and their environment, but the industry's success is built on meeting the demands of consumers.
A similar scenario can be found in the rise of selfie-related apps meant to improve or add more creativity and individuality to selfies.
Perhaps surprisingly, a study released in 2018 notes that only a handful of the deaths that have occurred while taking selfies can be blamed on high-risk activity. Others occurred under circumstances that included what the study referred to as non-risky behavior.
None of that is to say the behavior that leads to tragedy for victims and their friends or families is inexplicable or beyond remedy.
The risks of being human
The stories that tend to hit the news cycle are often related to individuals who are engaged in what could be viewed as risky scenarios, looking to snap a shot that will set them apart in an unending sea of selfies. In many cases, the deaths are the result of missteps that led to falls — although animal maulings and other accidents have occurred too.
In April alone, news surfaced of 20-year-old Andrea Norton, who fell nearly 100 feet while on an outing with fellow university students in Arkansas's Ozark National Forest. That same month, a 26-year-old Filipino victim slipped and fell almost 50 feet down a Hong Kong waterfall while snapping a selfie, and another man, a visitor from Hong Kong in his 50s, fell 1,000 feet near the Grand Canyon's Eagle Point observation area.
Each of those occurred under circumstances where the line between what could be considered high-risk and low-risk begins to blur. However, as noted above, that's not always the case. Sydney Monfries, a 22-year-old Fordham University student, had planned to photograph the New York City skyline from a campus bell tower when she fell through an opening on one of the structure's landings, later succumbing to injuries from the 40-foot fall.
There are likely many other victims who haven’t been named here and plenty of examples of mundane accidents that haven’t led to injury — fatal or not.
To some extent, the risk posed to selfie-takers can be attributed to issues with self-awareness. Some commentary has gone so far as to question the intelligence of selfie-takers. Experts warn that the first explanation oversimplifies the matter and the second is outright wrong.
Victims of selfie-related accidents are ordinarily aware of the risks presented by a given situation. Instead, the accidents stem from a complex blend of the natural desire to stand out, a more accessible worldwide audience, and inefficiencies in how the brain processes information.
The last of those explanations does tie into the apparent lack of self- and circumstantial awareness often noted in deaths associated with selfies, but from a biological standpoint rather than one related to a lack of forethought.
Human brains are relatively limited in their ability to take in stimuli and respond to changes in the surrounding environment. When a person is snapping a selfie, or any photo, their focus is in most cases drawn away from other factors that require attention. That isn't deliberate, and it isn't always easy to see in advance where danger might present itself.
Additionally, factors leading to accidents can be found in the desire for individuals to stand apart from the crowd. That’s been true of people since there have been people. Selfies offer unique and frequently stunning insights into a personal perspective on life and can offer a glimpse into who, exactly, somebody is on a highly personal level.
Nothing about that immediately excuses actions that put people at risk, such as standing too close to a wild animal's enclosure at the zoo or snapping a shot while walking along the ledge of a skyscraper. But there is societally driven pressure to get a better shot, a plethora of examples of people surviving risky selfie sessions, and a reasonable expectation that even the most intelligent people can lose track of themselves or their surroundings while capturing a selfie.
Technology in search of digital wellbeing and personal safety
Understanding what causes risk to turn into disaster is a good way to start becoming more aware of why accidents happen, so that steps can be taken to prevent them in the first place.
Technology, particularly related to smartphones, could readily be pointed to as bearing some responsibility for the amazing shots that spur some on to take selfies — dangerous or otherwise. Emerging technologies such as AR, machine vision, and better location tracking methods may also hold the key to at least a partial solution.
Solutions that improve quality of life, well-being, and safety aren't on the back burner for the companies that could provide them, either. In fact, "digital wellbeing" in the world's most popular mobile OS has been one of Google's leading goals since the launch of the search giant's Pixel 3 family of flagships.
Those efforts have chiefly centered around how to get users to be more aware of their screen time and their app usage.
Speculatively, at the far edge of the tech spectrum, an AR overlay could feasibly be used in tandem with AI-driven camera sensors to alert users who are taking a selfie in a potentially dangerous location. Far more simply, developers could introduce software that leverages camera hardware to provide selfie-takers with a wider view of their surroundings while taking images.
The hardware that could make that possible is already in at least some of the flagships available to the public. Samsung, for instance, introduced a smartphone earlier this year in its Galaxy S10 lineup that features a ToF camera sensor not too different from what's found in many self-driving vehicles, and other OEMs are following suit. Meanwhile, AI-dedicated chipsets have been in top-tier devices and many mid-rangers for more than a year.
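To make the idea concrete, here is a minimal, entirely hypothetical sketch of how software might use rear-facing ToF depth readings to warn a user backing toward a drop-off. No real phone API is used; the function name, thresholds, and mock frames are all illustrative assumptions, and a production system would need far more robust sensor fusion.

```python
# Hypothetical heuristic: if a large share of rear depth samples report
# no surface within range, the user may be backing toward an edge.
# All names and threshold values are illustrative, not a real phone API.

def dropoff_warning(depth_m, void_threshold_m=5.0, void_fraction=0.4):
    """Return True if enough depth samples exceed void_threshold_m.

    depth_m: flat list of depth samples in meters. ToF sensors commonly
    report a max-range value when nothing reflects back, so many "far"
    readings can indicate open air behind the user.
    """
    if not depth_m:
        return False
    far = sum(1 for d in depth_m if d >= void_threshold_m)
    return far / len(depth_m) >= void_fraction

# Mock frames: nearby surfaces (safe) vs. mostly open air (risky).
safe_frame = [1.2, 1.5, 2.0, 1.8, 2.2, 1.1]
risky_frame = [5.0, 5.0, 5.0, 1.2, 5.0, 5.0]

print(dropoff_warning(safe_frame))   # False: solid surfaces nearby
print(dropoff_warning(risky_frame))  # True: mostly no return within range
```

The point isn't the specific thresholds, which would need real-world tuning, but that a simple per-frame check over depth data already on some devices could feed an unobtrusive on-screen alert.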
For now, it seems that usage monitoring and battery savings will remain the primary focus of well-being efforts for the foreseeable future, and there aren't any publicly revealed projects underway to shift or expand that focus. But companies such as Google could readily repurpose the underlying technologies to help users become more aware of their surroundings too. So it may only be a matter of time before a viable solution presents itself.