We recently dipped our pinky toe into the world of Artificial Intelligence (AI). We’re enjoying the increased efficiency AI offers, but it isn’t all sunshine and lollipops. This blog offers a brief intro to AI, a look at our first experience with it, and a giggle or two.


What’s AI?

Artificial intelligence uses technology to imitate the problem-solving and decision-making capabilities of the human mind, resulting in a computer system able to perform tasks that would usually require human intelligence. Day-to-day examples include autocorrect, chatbots, search recommendations, facial recognition, navigation systems, and digital assistants.

How is AI used in the international development and humanitarian sectors?

Development agencies are adopting AI to facilitate communication with the people they serve, deliver services to hard-to-reach areas, and analyze data. They’re also using it to support poverty analysis, disaster relief, displaced populations, and even disease tracking.

AI at Databoom

At Databoom, we conduct a lot of stakeholder and expert interviews. Before we introduced AI into our process, we relied on live notes captured during interviews and recorded Zoom calls. Transcribing those recordings was incredibly time-consuming, and taking notes live prevented us from being fully “present” during interviews. Incorporating AI into our interview process has helped us stay focused during interviews, facilitate better conversations, and leave with stronger takeaways.

We adopted a small-scale AI tool, Otter.ai, which transcribes speech to text using AI and machine learning. Otter.ai offers an impressive transcription service: it differentiates voices, separates speakers, and is compatible with platforms like Zoom, Teams, and Google Meet. But here’s the fuzz on the lollipop: the implicit Western bias in its transcriptions has underscored the need for human intervention.

Here’s one recent example: Databoom is working with the Hilton Foundation and its Safe Water team to explore the path to safe, reliable, affordable water at scale in the Kabarole district of Uganda and three districts in the Amhara region of Ethiopia. This project requires conversations with country officers, relevant stakeholders, and partners.

While the human mind can quickly adjust to accents and unfamiliar pronunciations, we found that Otter.ai regularly made Western assumptions when transcribing terms. For example, the Ugandan district ‘Kabarole’ was consistently mistaken for ‘Colorado’; a partner in Uganda, ‘Aquaya’, was repeatedly transcribed as ‘Choir’; another partner, ‘WSUP’, became ‘Will Stop Soap’; and the Ethiopian region ‘Amhara’ was rendered as ‘I’m hungry.’

Otter.ai deserves credit: it has made the transcription process much more efficient. However, in our work we’re often looking for very specific information, and AI can readily omit or misrepresent material. As researchers, we must uphold the authenticity of comments and the rigor of qualitative data, and endeavor to showcase the unique places we work and the experiences people have.

As AI continues to grow, be it through transcription services like Otter.ai or digital personal assistants like Amazon’s Alexa and Apple’s Siri, it’s crucial that researchers and consumers of evidence are aware of the Western bias that pervades these tools. We must also challenge developers to make better products, built by diverse teams, that embrace diversity rather than alienate non-native and nonstandard English speakers.