
Axon's Ethics Board Resigned Over Taser-Armed Drones. Then the Company Bought a Military Drone Maker


By Ese Olumhense

This article was copublished with The Markup, a nonprofit, investigative newsroom that challenges technology to serve the public good. Sign up for its newsletters here.

Less than 10 days after the Robb Elementary School shooting in Uvalde, Texas, in May 2022, Axon Enterprise CEO Rick Smith announced the company had formally started developing Taser-equipped drones. The technology, Smith argued, could potentially save lives during mass shootings by incapacitating active shooters within seconds.

For Axon, which changed its name from Taser in 2017, the concept seemed a sensible next step for stakeholders who share Axon’s public safety mission, Smith said on the company’s site.

“In brief,” he wrote, “non-lethal drones can be installed in schools and other venues and play the same role that sprinklers and other fire suppression tools do for firefighters: Preventing a catastrophic event, or at least mitigating its worst effects.”

Elsewhere, however, the announcement roused significant concern. Only a few weeks before Smith’s announcement, a majority of the members of Axon’s AI Ethics Board—which consisted of a dozen academics, attorneys, activists, and former law enforcement officials—recommended the company not move forward with a pilot study of Taser-armed drones, then called Project ION. The board had spent more than a year considering the Taser-equipped drone project but had never considered any use case in which it would be a solution to mass shootings.

Advisory board members told The Markup that Smith’s announcement was unexpected and made without consultation or input from the ethics body the company had worked relatively well with for the previous four years. In the past, the board’s work prompted Axon to ban facial recognition on its body cameras out of concern the technology could not be responsibly rolled out.

“I begged Rick not to go public with the weaponized drone plan without consulting with the board, as our operating principles required,” Barry Friedman, founder of NYU Law School’s Policing Project and former Axon ethics advisory board chair, told The Markup.

Within a week of the announcement, nine of the 12 members of Axon’s ethics board resigned, saying in a joint letter that they had “lost faith in Axon’s ability to be a responsible partner.”

“Although we all joined this Board understanding that we are advisory only—and have seen Axon reject our advice on some prior occasions—rushing ahead to embrace use of surveillance-enabled, Taser-equipped drones, especially when its Board was urging against unnecessarily precipitate action, is more than any of us can abide,” the exiting members wrote then.

In the wake of the board’s dissolution, Axon temporarily halted its Taser-drone program. Former board members, meanwhile, continued to speak out against the company’s efforts. The group released a report in January 2023 criticizing company leaders for “trading on the tragic shootings which had just occurred in Uvalde and Buffalo.” The report included a number of recommendations about Taser-drone technology, including the need for accuracy and safety thresholds, as well as local lawmakers’ approval and internal department policies governing the drones’ use.
Drones deploying force should never be autonomous, either, the ex-board members recommended; a human should make that decision.

The former members also noted that in addition to the physical risk of injury or death posed by an electroshock weapon, the proposed devices would rely on surveillance systems triggered by the sound of gunshots, posing privacy and accuracy risks. Additional surveillance in schools might also lead to increased disciplinary action, even for minor offenses, they said. There’s potential for disparate, racist impact here, too, the former members pointed out, saying, “Black students are four times more likely to attend a school with a high level of surveillance.”

Weaponized drones are also vulnerable to misuse and might increase how frequently force is used, the experts said in their report. “A growing literature on military use of drones notes the unique characteristics of remote use of force—humans appear as figures on a computer screen, and decisions to use force often are made by teams rather than by a single individual,” the experts wrote. This “could lead to dehumanization of individuals targeted by the drone and could diminish operators’ sense of personal moral culpability for their decisions, leading to increased use of force.”

However, Axon, which did not reply to a request for comment by press time, still appears to be moving forward with its armed drone plan.

“On a longer time horizon, Axon sees opportunities to explore how robotics can expand to include less-lethal robotic payloads and operations,” company leadership said in a statement on its website from April of this year. “While this is still in early concept, we believe with ample research, ethical development and identifying the most amenable use cases, this capability can positively contribute to the future of public safety.”

Then, in July, the company acquired Sky-Hero, a Belgium-based maker of drones and unmanned ground vehicles. Sky-Hero has already developed so-called “distraction” technology for some of its drones and rovers that produces the same sound pressure levels as a semiautomatic rifle, acting as a “true non-lethal flashbang,” it said in the description of a YouTube video demonstrating the tech.

In interviews with The Markup, former ethics advisory board members expressed concern about the company’s plans to continue developing weaponized drone technology.

“Vendors like Axon are selling products to public agencies for the benefit of the public, and I think they have a responsibility to consider the harms their products might cause and to try to mitigate those harms,” said Max Isaacs, a senior staff attorney at the Policing Project who worked with the board.

“Everyone deserves public safety,” Isaacs added. “Everyone wants more public safety. That’s not the question. The question is, when companies sell these products claiming all of these public safety benefits, have they proven those benefits? Has there been any independent testing? Do we know that these products are making us safer? Oftentimes the answer is no.”

Isaacs and others The Markup interviewed noted that ethics boards are an imperfect patch for the regulatory void around evolving technology like drones and artificial intelligence.

“The fact that we’re relying on companies to set up these advisory boards in order to address the harms of their products is itself a very problematic concept to me,” Isaacs said.

Though Axon leadership says it is committed to the “responsible” development of new technology, it’s not clear whether the company is still consulting with ethics experts on the plan.

The Markup found that mentions of the former AI ethics board appear to have been removed from the company’s website, including the board’s once-public recommendations and reports. The webpage axon.com/ethics, where the former board’s governing principles and work were hosted, now sends searchers to a letter from Smith announcing the company would pause its Taser-drone plan. In September 2022, the company unveiled its Ethics and Equity Advisory Council, a panel of academics and community leaders who advise Axon on “a limited number of products per year,” according to the company. Axon says the body is independent, but unlike the former ethics board, it is led by an Axon executive vice president.

The company declined to make members of this group available for interviews. Public reports from the body are not available on its website, nor is a copy of its operating principles.

Because ethical guidelines are both subjective and not legally binding, they can readily be trumped by capitalist imperatives, said Ryan Calo, a professor at the University of Washington School of Law and a former member of the ethics board at Axon.

“Ethics is important,” said Calo. “Ethics is good. But law needs to be a backstop. Law is the place where society goes to decide what’s forbidden and what’s required.”

The Arizona-based company has not shipped any Taser-equipped drones to any client, it said in its 2022–23 annual report, the most recent available. It also pledged to never build “lethal” drones.

Axon, which launched in 1993, is a premier player in the law enforcement and military technology space. Technology developed by the company, which makes Taser devices and body-worn cameras, is used by more than 95 percent of state and local law enforcement agencies in the United States, it claims in investor reports. Axon also owns the evidence.com platform, a cloud-based evidence management system for police officers and what the company calls “the world’s leading repository of law enforcement data.”

Regarding Taser drones, Axon maintains they should be developed. “We believe there is no organization in the world better suited to develop it the right way,” its most recent investor report reads.

Company leadership expects to generate more than $1.5 billion in revenue in 2023, it said in an August investor statement, and Rick Smith has set a target of reaching $2 billion by 2025. According to its own reports, Axon, which went public in 2001, has generated “over $15 billion in wealth” for its shareholders.

Still, the plan to arm drones with Tasers was not universally well received by Axon’s shareholders, some of whom criticized the company for Smith’s announcement about the weaponized drones.
A shareholder proposal submitted by the Jubitz Family Foundation, a Portland, Oregon–based foundation that promotes nonviolent alternatives to conflict, encouraged shareholders to vote to discontinue developing these drones.

“Axon proposed using AI surveillance, algorithmic predictors, and virtual reality simulations to stop mass shootings,” the proposal, which was included as part of the company’s 2022–23 annual report, reads. “Axon did not seek meaningful input from its in-house Community Advisory Coalition, AI Ethics Board, or Vice President of Community Impact prior to the announcement.”

After the ethics board’s resignations last year, “Axon has now replaced both the Community Advisory Coalition and the AI Ethics Board with a new advisory council, which Smith still does not commit to heeding,” the foundation added in its proposal.

“The rollout of this proposal demonstrates a tremendous failure of management’s self-governance procedures,” the foundation wrote, adding that the plan risked not only psychological and physical harm to children but also litigation and reputational damage.

The Jubitz Family Foundation did not respond to The Markup’s request for comment on the proposal.

In a lengthy response to the Jubitz proposal, Axon said robotic security could save lives, slashing gun-related deaths by giving police longer-range, remotely operable weapons.

“Axon is working to reduce violence and displace lethal uses of force with less-lethal alternatives that can save—rather than take—lives,” the company said.

“Based on our analysis of The Washington Post’s data set of fatal officer-involved shootings, we estimate that a more effective, longer-range handheld Taser device has the potential to reduce fatal officer-involved shootings by around 40 percent,” the company said. “When we run this same analysis looking at instances where police could have utilized a less-lethal capable drone, we estimate that a drone could likely have been used instead of lethal force in 57 percent of these fatal shootings. When we combine an advanced handheld Taser device together with remotely operated drone and robotic capabilities, we estimate that up to 72 percent of fatal shootings might be averted.” (The company did not share information about its analysis in response to questions about it.)

While Axon technology is used by major police departments and federal agencies, including the New York Police Department, the Los Angeles Police Department, the US Department of Homeland Security, and the Departments of Defense and Justice, according to the company, there isn’t proof that the products are solving the problem of police violence. According to the Washington Post database of fatal police shootings, the number of such shootings was higher in 2022 than in any of the previous seven years tracked. And recently, some police unions have argued they should be paid more just to use body cameras, a barrier to critical transparency even where these tools are available.

Axon did not respond to questions in time for publication about when Taser-equipped drones might arrive on the market, whether its new Ethics and Equity Advisory Council is advising it on the weaponized drone program, or whether the company has received any demand from school districts for these products.
However, some insight into CEO Smith’s vision for this kind of drone technology can be gleaned from his 2019 book The End of Killing.

In a chapter of the book on school safety, Smith presents readers with a fictitious scenario about a day care center shooting, a tragedy averted because of a Taser-equipped drone installed in the room and activated by an “AI algorithm … designed to constantly monitor for potential firearm discharge sounds, not all that different from the iPhones of millions of people around the world that are awakened by the ‘Hey Siri’ sound pattern,” Smith writes.

An algorithm, Smith writes, would subsequently calculate the direction of the sound and, combined with a panic-alert signal system, trigger the drop of a small drone within a second. A “computer vision algorithm” on the drone would detect muzzle flashes. Sensing a probable weapon, he added, police could remotely deploy the Taser-armed drone, shooting electrical impulses “designed to paralyze the human nervous system” toward the shooter. All of this could take place within two seconds of gunfire.
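Read as a system design, Smith’s scenario amounts to an event-driven pipeline: an acoustic trigger, a direction estimate, a drone launch, visual confirmation, and a Taser deployment that, per the former board’s recommendation, would require human authorization. The Python sketch below is purely illustrative of that sequence as described in the book; every function, object, and threshold in it is hypothetical and does not represent Axon’s actual software or hardware.

# Hypothetical sketch of the detection-and-response sequence Smith describes.
# All names, thresholds, and objects below are invented for illustration only.

from dataclasses import dataclass

@dataclass
class AcousticEvent:
    direction_deg: float        # estimated bearing of the suspected gunshot
    gunshot_probability: float  # confidence from an always-on audio classifier

def classify_sound(audio_frame):
    """Stand-in for the always-listening 'AI algorithm' in Smith's scenario."""
    return None  # a real system would run an audio model on each frame

def sees_muzzle_flash(camera_frame):
    """Stand-in for the on-drone 'computer vision algorithm' Smith mentions."""
    return False  # a real system would run a vision model on the camera feed

def respond_to_gunfire(audio_stream, panic_alert, drone, operator):
    for frame in audio_stream:
        event = classify_sound(frame)
        if event is None or event.gunshot_probability < 0.9:
            continue
        if not panic_alert.active():
            continue  # Smith pairs the acoustic trigger with a panic-alert signal
        drone.launch(toward_bearing=event.direction_deg)  # drone drops within ~1 second
        if sees_muzzle_flash(drone.camera_frame()):
            # The ex-board's recommendation: force is never autonomous; a human
            # operator must explicitly authorize any Taser deployment.
            if operator.authorizes(event):
                drone.fire_taser()
        return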



