Compliance with the declaration would prohibit researchers from working on robots that perform search and rescue, or in the new field of “social robotics”. One of Dr. Bethel’s research projects is developing technology that uses small, human-like robots to interview children who have been abused, sexually assaulted, trafficked, or otherwise traumatized. In one of her most recent studies, 250 children and adolescents questioned about bullying were often willing to confide in a robot information they would not give an adult.
Having an investigator “drive” a robot in another room could result in less painful and more informative interviews with child survivors, said Dr. Bethel, a trained forensic interviewer.
“You need to understand the problem space before you can talk about robotics and policing,” she said. “They’re making a lot of generalizations without a lot of information.”
Dr. Crawford is one of the signatories of the open letters “No Justice, No Robots” and “Black in Computing”. “And you know whenever something like this happens or awareness is created, especially in the community I work in, I try to make sure I support it,” he said.
Dr. Jenkins declined to sign the “No Justice” statement. “I thought it was worth considering,” he said. “But in the end I thought the bigger problem was really representation in the room – in the research lab, in the classroom, on the development team and in management.” Ethics discussions should be rooted in that first, fundamental civil rights issue, he said.
Dr. Howard has not signed either statement. She reiterated her point that biased algorithms are partly the result of the skewed demographic – white, male, able-bodied – that designs and tests the software.
“If outside people with ethical values don’t work with these law enforcement agencies, who will?” she said. “If you say ‘no’, others will say ‘yes’. It is not good if there is no one in the room to say, ‘Um, I don’t think the robot should kill.’”