MozFest Community Interrogates Trustworthy AI
Following a year of being stuck at home and increasingly reliant on tech platforms, I was tasked with attending MozFest through an investigative lens, looking at how and where Trustworthy AI showed up in the work of the MozFest global community. In a moment when I had little trust to spare in the face of a global pandemic and overpowering political systems, I was curious to find the threads of trust in this conversation about artificiality. And though artificial intelligence is rooted in software engineering and data science, the conversation at MozFest often extended beyond both.
Artificial Intelligence is Human
The MozFest community first reminded me that in talking about AI, we have to talk about each other and our past. AI is often described as a separate machine that discriminates against us, that feeds us recurring content we don’t need, or that knows more about us than we know about ourselves. Some of us forget that there are humans at the other end of this machine, and that the decisions these humans make produce this artificial intelligence. In one of the sessions, immersive creator and coder Alton Glass said, “technology is an extension of us, and it's not artificial. It's an alternative intelligence to who we are as human beings.” To me, this was a powerful reframing that gets at the real issues behind untrustworthy and biased systems and machines.
The conversation about Trustworthy AI was placed in the context of history. For example, in one of the sessions, Aurum Linh broke down colonialism and racism in order to trace the lineage of algorithmic injustice and harm. The case was made at MozFest that these imbalances are not new (consider the Global North's extractive practice of taking the resources, and now the data, of the Global South), but that they are made anew under the guise of technology and innovation.
Accordingly, the conversations at MozFest flowed seamlessly between technology, climate justice, racism, colonialism, global politics and inequalities, economics, self care, neurodiversity, and community building. Because in order to talk about trust in machines, we first need to expose and address the systemic structures that obscure trust between humans.
Human rights took center stage through a community-driven framework to hold AI systems (read: people) accountable. That meant looking at whether or not AI complies with and upholds our human rights and dignity, in order to ensure that our safety, privacy, and freedoms are preserved.
Upending Current Systems of Extraction
The community at MozFest stood strong in interrogating AI, and questioned what intersectional AI can look like. In order to upend the current system of data extraction, sessions centered feminist practices and indigenous knowledge to combat the exclusionary dangers and harms embedded in AI. It was clear that in order to push back against the current tide, we need different movements, regions, and types of knowledge to work together. Everyone who is connected to the internet, and sometimes even those who are not, is impacted by AI, or, to drive my idea home, by other people’s decisions. And if these decisions continue to sit in a silo, we will move farther away from digital ecosystems that can be safe and helpful for us all.
The most powerful stories I heard during MozFest were those that relied on mutual aid to combat AI discrimination and the wielding of power. This showed up in conversations about labor, the economy, education, data governance, bias, and trust. In one of the sessions about the decentralized co-operative web, Silvia Lopez said: “You can’t really code trust, technology is there to support trust between people.” A reminder that in order to solve trust issues in technology, we need to look beyond the code, or rather at the systems that precede it.
If I hadn’t attended MozFest after this past year, I don’t know whether I would have felt that we can actually change things. But at MozFest, people were already doing just that. People from countries all over the world were generating solutions, built on trust, to the problems most relevant to them. So if you are reading this and thinking you need a pick-me-up to sustain you through difficult AI conversations about data, vaccines, COVID-19, inequality, and trust, I encourage you to watch these sessions, because they will give you hope. I would also encourage you to join the MozFest Community Slack channel and take part in this movement.
Zeina Abi Assy is the MozFest Investigative Storyteller.