27 Apr 2023 WACC Europe considers agenda for the future to promote digital justice
“AI, AI, AI!” — Is this how you feel about artificial intelligence?
An online seminar on April 26 hosted by the WACC regional association in Europe offered rich perspectives for those struggling with the implications of digital transformation, for those seeking digital justice — and for those wondering what the role of Christian communicators could possibly be.
The seminar considered questions such as: How do we decolonize artificial intelligence? How do we incorporate the knowledge of Indigenous people? How do we root out racist algorithms? How can we make digital technology more accessible for all?
Surviving the AI tipping point
In her keynote presentation, Erin Green remarked that artificial intelligence is now melded into our day-to-day lives, whether we realize it or not. Green, a Canadian living in Belgium and author of Digital Justice: A Study and Action Guide, co-published by WACC and the World Council of Churches, has spent nearly 15 years studying the social and ecological impact of artificial intelligence.
“When I started doing my research, there was not so much out there about technology and artificial intelligence, or theology and technology,” she said. “But what we see within the last five years — it’s kind of exploded.”
Green outlined some independent resources and credible researchers that might be helpful for people trying to ward off the potential negative effects of artificial intelligence — and also take advantage of the positive effects.
One of the challenges is how to determine what’s fake out there, but it’s not impossible, Green said, citing a quote humorously misattributed to Abraham Lincoln: “Don’t believe everything you read on the internet.”
She mused that artificial intelligence has infiltrated our lives so much that, at times, it’s challenging even for the tech-savvy to determine what’s real, especially in a digital world that is full of attention-grabbing disclaimers and content warnings.
“What is authentic?” asked Green. “Should we label it?”
With artificial intelligence at what Green calls “a tipping point,” communicators are well positioned to have meaningful conversations so that they can feel less afraid and be more selective about how they respond.
“Facial recognition technology ends up having a strong racial bias,” Green noted. “A lot of social media algorithms feed hate speech.” Nonetheless, amid these negative effects, we can use technology to be a force for good, she said.
“When you see a commitment to ecological justice, racial justice, decolonization, inclusivity, taking care of the poor, care for creation, and so on — when you apply the digital to those concerns, you then have concern for digital justice,” Green stated. “We don’t have to reinvent the wheel. The values and commitment are already there.”
We can move forward with a sense of hope, Green concluded. “One of the things I think is really powerful is connection,” she said. “What can you add to this conversation and share it with folks — now, today?”
If communicators want to survive this “tipping point,” they have to adapt, she said. “I’m critical and I’m quite fearful, but I think the Luddite way is not the way to go.”
Digital inclusion in the global South
Emy Osorio Matorel, a Colombian who works for the Catholic Media Council (CAMECO) in Germany, spoke on digital inclusion in the global South. She began by noting that the concept of digital inclusion is quite different in the global South than in the global North, and that that difference is what makes digital inclusion a complex topic.
“Is it exclusion if I don’t have the means to pay for internet access? Is it exclusion if most of the content is in English?” she asked. “As you might know, the COVID-19 pandemic exposed so many inequities.”
The COVID-19 pandemic prevented many people from communicating with their loved ones, and today the digital divide still prevents many people from fully living their day-to-day lives. Matorel discussed many types of “digital divides,” including those related to gender, rural versus urban location, language, and age.
“Another issue that we see is how things are displayed,” said Matorel, noting that many people have access to a cell phone but not a computer. “Just having this limitation of not being able to input your data carefully or not being able to make sure you are connected to a trustful internet source — it’s a problem in itself, [one of] access and trustability.”
Even the definition of “digital literacy” has changed, noted Matorel. “We used to think about digital literacy as learning how to use a platform, how to send an email, how to open a Google doc, how to use Word,” she said. “Now we have different kinds of access. It’s not only learning how to draft a document but also how to use the right internet connection, and how to use it in a smart way, depending on the platform you’re using.”
Matorel identified several organizations already working on issues related to access, among them Access Now, which has a platform on which many civil society organizations can track internet shutdowns. “There are people who are already working on some things that we can build on, and we can form strategies with them,” she said.
Agenda for the future
Stephen Brown, WACC Europe president, moderated a discussion that gave participants an opportunity to exchange thoughts on “digital divides” and how to address them.
Jane Stranz, WACC Europe secretary, said she was particularly concerned with the issue of language. “AI assumes we all have English. What is our mother tongue in AI?”
Matorel acknowledged this assumption on the part of AI but also expressed appreciation for the translation capacities of AI. “If you think about our lives a year ago, we wouldn’t have thought some of the things we are talking about would be possible,” she said.
Jim McDonnell, a WACC director resident in England and Wales, summed up the question on everyone’s mind as the discussion drew to a close: “What is our agenda for the future?”