Tech Activists Say Computer Code Changed Human Behavior
WASHINGTON — From the algorithm that delivers your next TikTok video to the AI that powers ChatGPT, the learning between people and computer code isn’t a one-way operation. Programs glean information from user engagement, of course, but experts warn that coding technologies have changed human behavior, too, both consciously and unconsciously.
Inspired by a book of essays on lines of code, Arizona State University’s Zócalo Public Square hosted a discussion with tech activists to ponder human decision-making’s impact on the digital world — and conversely, the ways that code has impacted humanity.
“Code only means something in context,” Ethan Zuckerman, associate professor of public policy, communication and information at the University of Massachusetts Amherst, said. “Unless you know the rest of the system, what its inputs are, what its outputs are … we don’t read code like we read poetry.”
Zuckerman, who wrote the code in 1997 that would later become pop-up advertisements, described that work as “the original sin of the web,” and admitted that he and other engineers often tried to solve problems without bothering to consider the assumptions behind them.
The result is that computer code now inevitably has human interests baked into technological products, along with the racial and gender biases that come from its very human creation.
“That’s what these systems do. They replicate patterns that they see in the world,” Zuckerman said. “And the data that we are feeding into these systems is pretty fucked up. These systems tend to reproduce those biases unless we find ways to consciously correct for them.”
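Zuckerman’s point can be made concrete with a minimal, fully fabricated sketch (not from the panel): a “model” that simply learns per-group outcome rates from historical records will faithfully reproduce whatever disparity those records contain. The groups, numbers, and function names here are illustrative assumptions only.

```python
# Toy illustration: a model that learns approval rates per group from
# historical data will replicate any bias present in that data.
# All data and names here are fabricated for the sketch.
from collections import defaultdict

def train(records):
    """Learn the historical approval rate for each group
    from (group, approved) pairs, where approved is 0 or 1."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in records:
        totals[group] += 1
        approvals[group] += approved
    return {g: approvals[g] / totals[g] for g in totals}

def predict(rates, group, threshold=0.5):
    """Approve an applicant whenever the learned group rate
    clears the threshold -- i.e., repeat the historical pattern."""
    return rates[group] >= threshold

# Fabricated history: group "A" was approved 80% of the time, "B" only 30%
history = ([("A", 1)] * 80 + [("A", 0)] * 20 +
           [("B", 1)] * 30 + [("B", 0)] * 70)

rates = train(history)
print(predict(rates, "A"))  # True  -- the model inherits the old pattern
print(predict(rates, "B"))  # False -- and applies it to new applicants
```

Nothing in the code is malicious; the skew comes entirely from the data it was fed, which is exactly the failure mode Zuckerman describes.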
And while some education addresses biases like racial divides, author Charlton McIlwain said computer coding education is still not done well because there has been “no room for the complexities that live outside or beyond the computation parts.”
“Not only the 0s and the 1s — but the code that tells us how we’re thinking about how we’re framing the problem for which the computer code is built and means to solve,” he said.
While algorithms like the Police Beat algorithm from the 1960s and more recent code developed for global surveillance were meant to make the world safer by linking probable suspects and identifying crime patterns, McIlwain said they have instead made the world more biased.
“In learning from … data from an intensely unjust and unfair system, we are now locking into code racial [biases] that have been plaguing our nation for decades,” he said.
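The “locking in” McIlwain describes can be sketched as a feedback loop. This is a fabricated toy simulation, not a model of any real policing system: patrols go where past incidents were recorded, patrolling records more incidents there, and an initial disparity in the records amplifies itself.

```python
# Toy illustration (fabricated): a predictive-policing-style feedback loop.
# Patrols are dispatched to the district with the most recorded incidents;
# each patrol records one more incident there, reinforcing the prediction.
def simulate(recorded, rounds=5):
    """recorded: map of district -> historically recorded incidents.
    Returns the counts after `rounds` patrol cycles."""
    counts = dict(recorded)
    for _ in range(rounds):
        hotspot = max(counts, key=counts.get)  # predict from the record
        counts[hotspot] += 1                   # more patrols, more records
    return counts

# Two districts, nearly identical records to start -- one incident apart
print(simulate({"north": 10, "south": 9}))  # {'north': 15, 'south': 9}
```

A one-incident gap in the historical record becomes a six-incident gap after five cycles: the code never measures actual crime, only its own past outputs.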
The answer to all of this is multifaceted.
First, it isn’t enough to consider corrections in algorithmic design based on mathematics; ethics are important.
“We need generally accepted algorithmic principles,” Zuckerman said, adding, “there is no good data set that we are going to use to achieve racial justice … there’s no data to even work from.”
But changes in computer science education might be a good start.
“We need computer science students to be liberal arts students,” he said. “We need them to study larger sociotechnical systems, which means they might need to take sociology [or] anthropology; they might need to round themselves significantly more than we’re used to in computer science education.”
And the public must understand that these technologies created by humans have incorporated human faults.
“There are no objective decisions. There are human beings [training] algorithms for different purposes all the way down,” Zuckerman said. “There are no apolitical technologies. All technologies have politics.”
Kate can be reached at [email protected]