June 15, 2013
UAVs, or unmanned aerial vehicles, are not exactly ubiquitous yet. But that future may not be far away.
The day after it was reported that Domino's is testing drones for pizza delivery, the L.A.-based Drone Dudes were on the UC San Diego campus with a remote-controlled flying camera, buzzing around the blue "Fallen Star" cottage high atop Jacobs Hall, getting shots an unassisted human would be hard-pressed to get. Inside the Jacobs School of Engineering, meanwhile, a team of students was hard at work developing their own version of a UAV, one they hope will have a number of commercial applications. Farther afield, another sort of drone might have been conducting surveillance of the U.S. border with Mexico. Yet another was probably flying a military mission over Pakistan.
Drones range dramatically, from what UC San Diego visual arts professor and chair Jordan Crandall calls the “wondrous flying machines” of DIYers and other hobbyists, to copters that can monitor endangered species or the progress of wildfires, to the massive, weaponized systems operated by the U.S. military. That’s a big range: from geek-chic to deadly. And just as big as the differences between drones are the questions about them that we, as a society, have yet to fully address.
Crandall, who co-curated the "Drones at Home" exhibition at the Qualcomm Institute (Calit2) last year, has been working on a project he calls "Unmanned." A performance piece and a book in progress, "Unmanned" primarily examines the drones used for state-sponsored surveillance and military operations. In the performance, Crandall enacts seven different characters to explore the changing nature of masculinity as warfare becomes increasingly automated.
The goal of the piece, and of Crandall's larger endeavor on drones, he says, is to provide "a philosophical framework for understanding distributed intelligence and action." What are drones? What does it mean to be automated? At what point, if ever, has decision-making been ceded to the machine?
To begin with, Crandall says, drones – even the so-called autonomous ones – are not really "unmanned." It takes hundreds of people to operate a sophisticated one; it's just that the operators (and the programmers before them) have been redistributed in time and space. The network is complex and largely invisible, he says, so much so that it's hard to grasp. Crandall hopes his work will help people visualize the network, insofar as that's possible, and serve as a starting point for the broader discussion about drones we need to have. "It is important," he said, "to create awareness and informed dialogue about drone use."
Crandall may be as fascinated with drones as the folks who tinker with them in their garages, but he also wants us to consider some thorny scenarios: “In what cases, for example, would it be acceptable to arm a drone used by the police?”
Kelly Gates, an associate professor of communication and science studies at UC San Diego, has similar questions. She is also not an outright critic of the technology – there are obvious benign and beneficial applications – but, like Crandall, she wants us to take the time to grapple with the issues drones raise.
Author of “Our Biometric Future: Facial Recognition Technology and the Culture of Surveillance,” Gates focuses her research on digital media technologies. Her main emphasis has been on the politics and social implications of computerization, particularly the automation of surveillance.
For Gates, the big cultural moment with drones came after the terrorist attacks of 9/11. Humans, she says, have been using “unmanned” aerial devices to study enemy territory since the invention of the hot-air balloon. But attaching weapons to these devices is pretty new. The recent ramp-up in scale is also unprecedented.
She is troubled by how “weaponization of drones just sort of happened,” with very little public notice or discussion.
As drones multiply and their capacities grow, she would like to see people take seriously not only the ethical issues associated with bomb strikes abroad or Big-Brother possibilities at home but also the potential invasions of privacy from such banal uses as highly precise images taken by realtors or traffic reporters.
While "there is no putting this [technology] back in the box," Gates says, we can stop adopting it so unthinkingly.
Yet she worries that “people are becoming so accustomed to surveillance that we just can’t be shocked anymore.” At this point, she wonders, is there even a meaningful concept of “privacy” left?