Meta, Facebook’s parent company, has filed a lawsuit against Voyager Labs, alleging that it created tens of thousands of fake Facebook accounts to scrape user data and provide surveillance services to clients.
According to the legal filing, Meta alleges that Voyager Labs created more than 38,000 fake Facebook accounts to scrape information from over 600,000 Facebook users and groups. The targets included employees of non-profit organizations, universities, news media organizations, healthcare facilities, the United States armed forces, and local, state, and federal government agencies, as well as full-time parents, retirees, and union members.
Voyager also allegedly scraped data from Instagram, Twitter, YouTube, LinkedIn, and Telegram to sell and license for profit. In response, Meta disabled more than 60,000 Facebook and Instagram accounts and pages tied to Voyager Labs “on or about” January 12th. Meta is demanding that the company stop violating its terms of service and has asked a judge to permanently ban Voyager from Facebook and Instagram, claiming that Voyager’s actions have caused it “to incur damages, including investigative costs, in an amount to be proven at trial.”
“Companies like Voyager are part of an industry that provides scraping services to anyone regardless of the users they target and for what purpose, including as a way to profile people for criminal behaviour,” said Jessica Romero, Meta’s Director of Platform Enforcement and Litigation. “This industry covertly collects information that people share with their community, family, and friends, without oversight or accountability, and in a way that may implicate people’s civil rights.”
Understanding Voyager’s business model
The company bills itself as a leader in advanced AI-based investigation solutions; in practice, that means analyzing people’s social media posts to draw conclusions about them. According to The Guardian, Voyager Labs sold its services to the Los Angeles Police Department, claiming its software could predict which individuals were likely to commit crimes in the future. Experts say these technologies are deeply flawed, however, and that the underlying algorithms are far too simplistic to predict crime reliably. Voyager’s model, for example, treated signals such as Instagram usernames expressing Arab pride or tweets about Islam as potential indicators that someone was leaning toward extremist ideologies.