Illustration by: Eleanor Shakespeare

The search for ethics

Digital technology is reshaping media and culture. Our scholars explore how to build and use these new tools responsibly.

Safiya Umoja Noble had just begun researching the inner workings of search engines in 2009 when a colleague quipped, “You should see what happens when you Google ‘black girls.’”

Assistant Professor of Communication Safiya Noble.
Photo by: John Davis

“I am a black girl,” Noble recalled. “I have a daughter. I have lots of nieces. I immediately did the search.” 

She got back a page full of pornography.

“That started me on a deeper line of inquiry about the way in which misrepresentation happens for women of color on the internet, and the broader social consequences of that,” said Noble, assistant professor of communication. “This was happening on the internet, a forum where people think of the information they come across as objective, neutral and credible.”

Noble had previously spent 15 years in marketing and advertising, working for some of the largest Fortune 100 brands in the United States. As she was leaving corporate America and beginning graduate school at the University of Illinois at Urbana-Champaign in the late 2000s, she started scrutinizing the rise of digital technologies — Google in particular. She noticed that many of her peers were touting the “liberatory” effect that Google was having on the information space. 

“I got a lot of pushback from people in academe and people in industry who said, ‘Google is the people’s company,’ and to critique them was unfair because they were doing so much more than any other company had done to make information accessible,” Noble said.

“But as this total diversion of public goods and knowledge into privately held organizations unfolded, there were questions to be asked — like, ‘Who will truly lose?’”

Noble is among several USC Annenberg faculty asking these kinds of probing questions about our evolving landscape of digital media and information systems. Their research explores the importance of bringing a strong ethical framework to bear on new modes of information-sharing — both on the part of the profit-seeking firms creating these new tools, and on the part of individuals as responsible and sophisticated users of the tools. While scholars differ on how these new-media ethics might be implemented, they agree that academia must have a role in developing workable solutions.

For Noble, that research led to her latest book, Algorithms of Oppression: How Search Engines Reinforce Racism. One of her major findings: The results of Google searches are not value-neutral.

“In the book, I interrogate the fundamental building blocks of technology,” she said. “Computer language is a language, and as we know in the humanities, language is subjective and can be interpreted in myriad ways. 

“It is the responsibility of companies, users and regulators to ask: What’s ethical, what’s moral, what’s right, what’s oppressive, what’s fair, what’s socially just, what fosters civil rights, and what erodes human rights? All of those conversations live in the domain of ethics.”

But most who work in the information and tech industries, she argues, don’t possess the ethical training, knowledge, expertise or, in many cases, the inclination to think about what they are stewarding and how they are influencing public opinion. As people shift their attention away from journalists and academic scholars and rely instead on these media platforms for trusted information, Noble wonders who is guiding them.

“One of the most important frameworks for computer scientists is the concept of universal design, yet the philosophical underpinnings of this universality often exclude women and people of color,” she said. “And these challenges, of course, are deeply tied to ethical structures.”

Noble recognizes that over the years some engineers have tried to apply a traditional sense of ethics as they design. Nevertheless, the specific ways that vulnerable people — women and people of color — are often in the crosshairs of some of the worst abuses of technology remain to be addressed, she said.

There is no golden age 

Professor of Communication and Director of Doctoral Studies Tom Hollihan.
Photo by: John Davis

Tech companies are far from the first to face scrutiny for how the choices they make in disseminating information can sway thoughts and opinions. Ethics in communication, or the lack of them, can mean the difference between news and propaganda.

According to Thomas Hollihan, we can trace 21st-century concerns about ethics in mass communication back to the Sophists of ancient Greece, who insisted they could teach not only persuasion, but virtue as well.

An authority on political rhetoric and a former USC debate coach, Hollihan notes that, as they sought to attract and win over their pupils, Sophists increasingly relied upon a deliberate use of fallacious reasoning and exaggerated claims. This drew harsh criticism, especially from Plato.

“Plato was hostile to rhetoric,” Hollihan explained. “He thought that the orators had inflamed the passions of the people and didn’t think the public was really suited to evaluate arguments.”

Then along came Aristotle. 

“He provided a much more pragmatic and systematic view of persuasion,” said Hollihan, professor of communication and director of doctoral studies. “He talked about the psychological characteristics of audiences and the topics that most influence them.”

Aristotle proclaimed rhetoric to be an instrument that can be either harmful or profoundly helpful, depending upon the virtue and value of the rhetor who uses it. 

In other words, intent is key. Just as Aristotle underscored the need to assess the “virtue and value of the rhetor,” Hollihan believes that, when it comes to media platforms, a fundamental evaluation of their intrinsic goals should be the starting point.

Is the platform acting to fulfill narrow self-interests or promoting shared communal interests? Does it fundamentally operate with goodwill toward the people it is trying to influence? Is the platform honest and faithful in presenting the information people need to make a good decision, or has important information been withheld?

“The bottom line is, you need ethics, character and good purpose,” Hollihan said. “There are lots of examples in history of people who have been effective in winning over audiences. But, if you do it based on fear, anxiety of the other or willful ignorance, then I don’t think you can celebrate it as good rhetoric, even if it’s successful rhetoric.”

Turning to his expertise in political communication, Hollihan points to the emergence of propaganda studies during World War I.

When President Woodrow Wilson issued an executive order establishing the Committee on Public Information in 1917, he gave the new federal agency authority to use every medium available at the time to persuade Americans it was in their best interest to support the war.

“That was the first concentrated attempt to manipulate political opinion in the United States in a very systematic way,” Hollihan said. “We created this public effort to actually persuade people, in this case, to support the war: to buy Liberty Bonds, to enlist and to get behind the war effort with enthusiasm.”

At every step, society has seen technological advances leveraged to influence the masses. But the convenience or the entertainment value of these technologies has always been sufficient to overcome the initial anxiety innovation produces.

“We move on and adapt,” Hollihan said. “People adapt, society adapts. Technology adapts. And social science adapts. New developments will continue to occur, and eventually the entire situation we’re in now will turn out to have been a blip in history. I don’t think we want to be too nostalgic about a perfect golden era that never really existed.”

It’s all about your audience

Henry Jenkins agrees that as technologies emerge, we always find a way to push beyond them; that constant back-and-forth struggle is the nature of the world.

In considering the nuances of tech’s modern-day powers to persuade, however, he differs from some of his colleagues. Jenkins objects to analyses that ascribe too much power to the creators of these digital tools while treating users as if they had no power to change the situation.

Jenkins, Provost Professor of Communication, Journalism and Cinematic Arts, flat-out rejects the premise that people are “hypnotized” or “captured” by their devices. Human beings, he argues, never engage in any activity that’s meaningless to them. The pursuit of meaning is a core human urge, Jenkins insists, and it’s the social scientist’s job to understand how and why an activity is meaningful.

His research documents how young people, far from being hapless victims hijacked by their devices, are bending digital media — and every other kind of media — to their will. Jenkins’ 2016 book, By Any Media Necessary, is based on interviews with 200 youth activists who are breaking new ground in political discourse through a media-saturated vocabulary rooted in pop culture.

Jenkins’ current project, funded by a MacArthur Foundation grant, looks at youth activism through the lens of civic imagination. His 16-person research team at USC Annenberg monitors more than 30 youth-run social movements around the world, documenting how they use popular culture and new media to further their political goals. A collaborative book, Pop Culture and Civic Imagination, will be released next year.

The media skills of youth activists were on full display last spring after the mass shooting at a high school in Parkland, Florida, as outraged students took America’s gun debate into their own hands. “They went seamlessly from social media to a CNN town hall meeting to a face-to-face meeting with the president to a march on Washington and speeches on the Mall,” Jenkins said. “They even communicated through the patches on Emma Gonzalez’s jacket.”

From his perspective, the present moment is not about corporations manipulating young people. Rather, young people are taking advantage of available resources, including digital technology, and using them effectively to bring about new kinds of networked change.

Provost Professor of Communication, Journalism, Cinematic Arts and Education Henry Jenkins.
Photo by: John Davis

“That meaning may be translated into cash and exploited by corporations, but the starting point is something that kids really deeply desire to do,” Jenkins said. “My own ethical commitment is to start to figure out what it means.”

Social media, for example, fills a hole in the social fabric frayed by hypermobility. Today, Americans relocate on average 12 times across their lifespans, a figure that increased generation by generation over the 20th century. To Jenkins, people using Instagram, Snapchat or Facebook are not so much slaves to these platforms as social beings using technology to build communal cohesion and maintain social ties across long distances.

“I think of it as bringing your social network with you wherever you go — like the turtle brings the shell on its back,” Jenkins said.

One practice that has media critics worried is the autoplay promotion of videos. The 15-second countdown that queues up the next YouTube video is not long enough, psychologists say, for the human mind to make a reasoned decision about stopping or continuing.

Again, where critics see manipulation, Jenkins is on the lookout for meaning. And here he speaks from personal experience, as an avowed Netflix binge-watcher.

“Yes, I sit and watch one show after another,” he admits, “but I’m not randomly watching. I’m exploring a list of 30 shows I want to watch, because there’s that much good TV being produced. I’ve chosen them from a broader range of media content than I have ever had available. And those shows are tied to all kinds of conversations I’m having as a fan within the web-based fan communities I participate in. Some other fans are rewriting the shows, remaking them as fan fiction art, and re-creating them as cosplay.”

The important point, Jenkins said, is that, persuasive technology notwithstanding, digital content consumers are constantly making self-interested choices and curating their playlists.

“Social media drives an awful lot of YouTube circulation,” he said. To Jenkins, far more interesting than the tricks platforms use to hold captive audiences is the logic by which viewers decide what video content to recommend through their social media network.

Focusing on the platforms gives a distorted view of their power to control our brains, he believes. Studying the audience side reveals a cascade of conscious and creative responses being made by consumers, and all of the choices are meaningful to the people making them.

“So, can I be distracted?” Jenkins asked. “Yes. 

“Can I be fed misinformation? Yes, though online communities actively debunk false information that circulates through social media.

“Can I be confused, distracted, pulled in different directions? Yes. And companies can definitely make money off choices I’m making. 

“Still, I see many, many potent examples of conscious choice-making throughout the media landscape. The kind of disempowering rhetoric that seems to be dominant at the current time is not helpful for understanding and explaining the behavior we’re actually seeing.”

Time for intervention

For Noble, however, evaluating a platform’s integrity is paramount. She has found, like Aristotle, that intent is everything.

“This is what my collaborators and I are trying to do,” Noble said. “Show how the platforms work, so we can think differently about alternatives.”

One of those alternatives is Noble’s idea of a public-interest search engine.

She regards libraries — not commercial platforms intent on driving consumer behavior — as the appropriate knowledge epicenters and gatekeepers of information in a democracy.

The public mistakenly assumes Google search is like a digital public library.

“Lots of research shows that most people believe the results have been curated and vetted, that they’re fair and accurate,” she said.

Noble, whose doctoral training is in information science, wants to see a robust, publicly funded alternative to Google. She envisions something curated by teachers, librarians and subject-matter experts rather than by the bots and underpaid offshore content-moderation workers of commercial platforms.

The infrastructure for Noble’s proposed public-interest search engine already exists in the archives and databases of major institutions like the Smithsonian, the Library of Congress and top university and research libraries — a strong, integrated network with a history of collaborative toolmaking and knowledge-sharing.

She has met with leading librarians to discuss the idea and hopes to build momentum.

But in the meantime, as a sharp digital divide continues to separate the haves and have-nots in American life, Noble underscores that the have-nots are far more susceptible to, and victimized by, the misinformation and distortions perpetuated online.

Jenkins sees digital media literacy education as a crucial step to bridge this gap.

“The public needs to be informed about the choices it’s facing and the consequence of those choices,” Jenkins said. “People need to acquire skills at using the tools available to them.”

Most developed nations, he points out, now have rigorous media literacy education requirements.

“You don’t get out of British schools without passing a core task on media literacy. We have nothing like that in America,” Jenkins said. “With all we’ve heard in the last year and a half about distortions in the election by fake news and Russian interventions, we should have school systems embracing media literacy education as a central part of what they teach kids.”

In an ideal world, Jenkins would like to see digital media literacy education happening not just in American schools but everywhere people gather, including museums and churches. “Media literacy education should permeate every part of society that shapes how people make sense of the world around them,” he said.

In a world of conflicting accounts, Hollihan agrees that teaching individuals how to evaluate stories, whatever form they take, means fostering a refined sense of judgment about how faithfully characters perform the roles they are assigned.

“To some extent we have the innate ability to do this because virtually everything in our lives is introduced through stories,” he said. “Think about history: Moral lessons, even warnings from parents, all come in a storied form. We seem to have a challenge at the moment in terms of judging the claims of conflicting stories.”

Hollihan adds that critical thinking also needs to be systematically taught.

“Some studies have shown that in middle school and high school we have to take civics courses and keep up with world events,” he said. “But after that most of us are far less likely to learn anything more about how critical thinking can be applied to public policy. I think that’s a huge problem.”

Noble welcomes the advent of such mass digital education, but she rejects any implication that the burden of filtering out false, manipulative and toxic internet content should — or could — be shouldered by the public.

“That’s like saying the public should adapt to their water being poisoned. If the information environment is poisoned, that’s not on the public at large to solve,” she said.

If she could make one recommendation to Silicon Valley, Noble would insist on hiring interdisciplinary teams made up of specialists with Ph.D. and master’s degrees in fields such as communication, cultural studies, ethnic studies and women’s studies.

“It’s simply not enough to say that we need a more diverse pipeline,” she said. “Of course, we do need more women and people of color, but more importantly, we need people who are deeply trained and well educated in the humanities and social sciences who can recognize the output of technology systems. This is the place where we have an incredible shortfall.”

On the other hand, her advice for the ordinary digital-content consumer who wants to hasten progress is to insist upon more research — funded by the government, universities and taxpayers — into the potential harms of big data-based, biometric and other technologies that are being deployed on the public.

“Demand regulation the way you demand regulation of air and water quality and food safety,” Noble said. “You have a right to information technology safety, too.”

Jenkins adds that coupling regulations that encourage transparency with education around new media will begin to move the needle toward solutions. 

“We need to know what choices we are making and why.” 


This story was co-written by Emily Cavalcanti and Diane Krieger.