Amazon’s Ring Planned Neighborhood “Watch Lists” Built on Facial Recognition

26-11-2019 · Blacklisted News · 634 words

Ring, Amazon’s crime-fighting surveillance camera division, has crafted plans to use facial recognition software and its ever-expanding network of home security cameras to create AI-enabled neighborhood “watch lists,” according to internal documents reviewed by The Intercept.


The planning materials envision a seamless system whereby a Ring owner would be automatically alerted when an individual deemed “suspicious” was captured in their camera’s frame, something described as a “suspicious activity prompt.”
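For illustration only, the core of such a prompt would amount to comparing a face embedding extracted from each camera frame against a stored list and alerting on a close match. The sketch below is a minimal mock-up in Python, assuming a hypothetical embedding model and an invented similarity threshold; it does not reflect any actual Ring code or the unreleased design described in the documents.

    # Minimal sketch of a watch-list alert loop, assuming a hypothetical
    # face-embedding model. All names and thresholds here are invented;
    # nothing below reflects Ring's actual (unreleased) design.
    import numpy as np

    SIMILARITY_THRESHOLD = 0.8  # assumed cutoff; real systems tune this carefully

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        # Cosine similarity between two embedding vectors.
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def check_frame(frame_embedding, watch_list):
        # Return the id of the best watch-list match above threshold, or None.
        best_id, best_score = None, SIMILARITY_THRESHOLD
        for entry_id, listed_embedding in watch_list.items():
            score = cosine_similarity(frame_embedding, listed_embedding)
            if score > best_score:
                best_id, best_score = entry_id, score
        return best_id

    # Toy usage: random vectors stand in for embeddings a real model would produce.
    rng = np.random.default_rng(0)
    watch_list = {"entry-001": rng.normal(size=128)}
    frame = watch_list["entry-001"] + rng.normal(scale=0.1, size=128)  # near-match
    match = check_frame(frame, watch_list)
    if match is not None:
        print(f"'Suspicious activity prompt' triggered: possible match with {match}")

Even in this toy form, the design choice is visible: everything hinges on the threshold and on who controls the contents of the list, which is exactly what the documents leave unanswered.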


It’s unclear who would have access to these neighborhood watch lists, if implemented, or how exactly they would be compiled, but the documents refer repeatedly to law enforcement, and Ring has forged partnerships with police departments throughout the U.S., raising the possibility that the lists could be used to aid local authorities. The documents indicate that the lists would be available in Ring’s Neighbors app, through which Ring camera owners discuss potential porch and garage security threats with others nearby.


Ring spokesperson Yassi Shahmiri told The Intercept that “the features described are not in development or in use and Ring does not use facial recognition technology,” but would not answer further questions.


This month, in response to continued pressure from news reports and a list of questions sent by Massachusetts Sen. Edward Markey, Amazon conceded that facial recognition has been a “contemplated but unreleased feature” for Ring, but would only be added with “thoughtful design including privacy, security and user control.” Now, we know what at least some of that contemplation looked like.


Mohammad Tajsar, an attorney with the American Civil Liberties Union of Southern California, expressed concern over Ring’s willingness to plan the use of facial recognition watch lists, fearing that “giving police departments and consumers access to ‘watch listing’ capabilities on Ring devices encourages the creation of a digital redline in local neighborhoods, where cops in tandem with skeptical homeowners let machines create lists of undesirables unworthy of entrance into well-to-do areas.”


Legal scholars have long criticized the use of governmental watch lists in the United States for their potential to ensnare innocent people without due process. “When corporations create them,” said Tajsar, “the dangers are even more stark.” As difficult as it can be to obtain answers on the how and why behind a federal blacklist, American tech firms can work with even greater opacity: “Corporations often operate in an environment free from even the most basic regulation, without any transparency, with little oversight into how their products are built and used, and with no regulated mechanism to correct errors,” Tajsar said.


Mounting Concern About Ring


Once known only for its line of internet-connected doorbell cameras marketed to the geekily cautious, Ring has quickly turned into an icon of unsettling privatized surveillance. The Los Angeles company, now owned by Amazon, has been buffeted this year by reports of lax internal security, problematic law enforcement partnerships, and an overall blurring of the boundaries between public policing and private-sector engineering. Earlier this year, The Intercept published video of a special online portal Ring built so that police could access customer footage, as well as internal company emails about what Ring’s CEO described as the company’s war on “dirtbag criminals that steal our packages and rob our houses.”


Previous reporting by The Intercept and The Information revealed that Ring has at times struggled to make facial recognition work, instead relying on remote workers from Ring’s Ukraine office to manually “tag” people and objects found in customer video feeds. The automated approach to watch-listing described in the documents reviewed by The Intercept may seem less unsettling than that human-based approach, but it potentially allows for a litany of its own problems, like false positives and other forms of algorithmic bias.
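A quick base-rate calculation, using assumed numbers, shows why false positives matter at this scale: even a matcher that wrongly flags only one in a thousand innocent faces would, across a large camera network, produce thousands of bogus “suspicious” alerts every day.

    # Back-of-envelope arithmetic on false positives; every figure here is
    # an assumption chosen for illustration, not a measured Ring statistic.
    false_match_rate = 0.001       # assumed: 1 in 1,000 innocent faces misfires
    faces_per_camera_per_day = 50  # assumed foot traffic past one doorbell
    cameras = 100_000              # assumed size of a city-scale network

    daily_false_alerts = false_match_rate * faces_per_camera_per_day * cameras
    print(f"Expected false alerts per day: {daily_false_alerts:,.0f}")
    # -> 5,000 innocent people flagged daily under these assumptions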

