A Feminist Alexa? How Students Built a More Progressive AI Assistant

Talking to our devices is becoming the default way to ask for directions, restaurant recommendations, movie times, and the weather, according to Microsoft data. Of the 2,000 people surveyed, 72% of digital assistant users said they preferred voice-based requests.

And while Alexa hasn’t exactly been a cash cow for Amazon, it’s clear that virtual assistants are becoming more and more integral to our digital lives.

But there’s a problem in how these voice-activated AIs are presented to us. In case you hadn’t noticed, the default voices for Amazon’s Alexa, Apple’s Siri, and the Google Assistant are all female. You can now select male voices, but it seems technologists have overwhelmingly designed voice assistants as “not-male.”

Other industry examples include IPSoft’s AMELIA and Cognitive Code’s SILVIA, which was integrated into Northrop Grumman’s simulation system, SADIE. Of course, the forerunner of them all was ELIZA, the 1960s-era bot within MIT’s time-sharing system.

Does it matter that they’re all gendered? Well, they are perceived as mere “assistants” to do our human bidding, so it’s not exactly empowering. Are big tech companies perpetuating stereotypes and reinforcing negative bias (hidden or overt) by portraying their AIs as “female”? Arguably so.

Then the question must be: Is there a better approach to building voice-activated AI? What are the steps to design an assistant that is not only beyond gender, but has more enlightened values driving its functionality? And are there ways to build an AI for specific communities, such as queer and trans users? 


Designing a Feminist Alexa 

(Image: Creative Computing Institute presentation screen; Credit: Creative Computing Institute)

These are the questions students tackled in the Designing a Feminist Alexa program at the Creative Computing Institute within University of the Arts London. Forty UAL students gathered with a bunch of Alexa devices and Amazon’s Alexa Voice Service tools for a three-day workshop. Their mission? To imagine and prototype personal intelligent assistants that would meet a meaningful human need and embody feminist values.

To kick off the debate on what constitutes a feminist-leaning dialog for an AI assistant, the students used the questions posed in the Feminist Chatbot Design Process. This came out of research done by Josie Young (now Program Manager at Microsoft on the Xbox Trust Team) during her MSc at Goldsmiths, University of London.

It covers everything from examining the ethics behind each technology platform choice to developing a genderless character that reminds users it’s a robot (to avoid any bias or prejudice), protecting data stores, and eliciting feedback to train the AI while improving its comprehension of the task at hand.

Then the students on the Feminist Alexa course got down to work and separated into teams. Each team brainstormed a persona for their assistant, which would meet the needs of their proposed end users. This is essentially creating a character that their audience would feel comfortable talking with. 

During this process, they also came up with sample query and response dialog patterns to reinforce their persona’s “character,” which would be developed further in the Conversation Design stage. For example, they scripted how their artificial intelligence would respond when it didn’t understand a question.

Then they moved on to creating all possible permutations of back-and-forth chat between a feminist-leaning bot and its (note: gender-neutral terminology there) human counterpart.
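To make that concrete, here is a minimal, hypothetical sketch in Python (not the students’ actual code) of how persona-driven query-and-response patterns, including an in-character “didn’t understand” reply, might be captured as plain data before moving into a conversation-design tool. The intent names and wording are invented for illustration.

```python
# Hypothetical sketch: persona-driven sample dialog patterns for a
# voice assistant, captured as plain data before conversation design.

PERSONA_RESPONSES = {
    # Intent name -> scripted responses that reinforce the persona's character.
    "greeting": [
        "Hi, I'm here whenever you need me. What's on your mind?",
    ],
    "ask_capability": [
        "I'm a bot, so I won't judge. Ask me anything.",
    ],
    # How the assistant responds when it doesn't understand a question:
    # re-prompt in character instead of a flat "I didn't understand that."
    "fallback": [
        "I didn't quite catch that. Could you say it another way?",
        "That one's new to me. Can you rephrase it?",
    ],
}

def respond(intent: str) -> str:
    """Return the first scripted response for an intent, or fall back gracefully."""
    options = PERSONA_RESPONSES.get(intent, PERSONA_RESPONSES["fallback"])
    return options[0]

if __name__ == "__main__":
    print(respond("greeting"))
    print(respond("unknown_intent"))  # unknown intents hit the fallback
```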


(Image: Voiceflow; Credit: Voiceflow)

Conversation design (CxD) is a multi-modal design skillset, consisting of voice user interaction design, interface design, motion design, visual design, and UX writing. It’s not just about the “prompts” (which include keywords to let the AI know the human has asked a specific question) or the conversational flow (to avoid “I’m sorry, I didn’t understand that”); it also reinforces the voice of the assistant.

There are many tools on the market for conversation design, including Voiceflow, platform-specific actions for Google Assistant, and skills for Alexa. The teams at UAL mapped out conversations as a flowchart first, using Whimsical, to ensure they’d covered all possible routes and responses within the dialog management stage.

CxD looks like visual scriptwriting, but it also requires excellent spatial understanding: the designer has to “see” the conversation on-screen, with all its “branches,” as it veers off into different topic areas. The assistant is also only as good as the knowledge base it draws on. It can’t tell you what it can’t access on the web.
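As a rough illustration of that branching, a conversation flow like the ones the teams mapped in Whimsical can be modeled as a small state graph. This is only a sketch; the state names, prompts, and keywords below are made up for the example, and a real dialog manager would do far more than keyword matching.

```python
# Hypothetical sketch: a branching conversation flow modeled as a state graph,
# similar in spirit to the flowcharts drawn before building the assistant.

FLOW = {
    "start": {
        "prompt": "Hi! Do you want a film recommendation or just a chat?",
        "branches": {"film": "ask_genre", "chat": "open_chat"},
    },
    "ask_genre": {
        "prompt": "Great. Any genre in mind?",
        "branches": {"documentary": "recommend", "comedy": "recommend"},
    },
    "open_chat": {"prompt": "I'm listening.", "branches": {}},
    "recommend": {"prompt": "Here's one the community rates highly.", "branches": {}},
}

def next_state(state: str, user_keyword: str) -> str:
    """Follow a branch if the keyword matches; otherwise stay put (fallback)."""
    return FLOW[state]["branches"].get(user_keyword, state)

state = "start"
for keyword in ["film", "documentary"]:
    state = next_state(state, keyword)
    print(FLOW[state]["prompt"])
```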

Cathy Pearl, Design Manager for Google Assistant and the author of Designing Voice User Interfaces, has a good explainer video on how to script a virtual assistant, including doing a “table read” and how to handle unexpected user responses.

Alternatively, one can code a conversational AI from scratch, using Python with Dialogflow, to run it on a Google Assistant-powered Android device. For more information on how that works, check out Google Staff Developer Advocate Priyanka Vergadia’s video.
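For a sense of what that from-scratch route looks like, here is a minimal Flask webhook sketch for Dialogflow ES fulfillment: Dialogflow sends the matched intent in its request payload, and the webhook answers with a fulfillmentText field. The intent name below is invented, and this is a bare-bones illustration rather than a production setup.

```python
# Minimal sketch of a Dialogflow ES fulfillment webhook in Python/Flask.
# The intent name ("GetAffirmation") is invented for illustration.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/webhook", methods=["POST"])
def webhook():
    payload = request.get_json(silent=True) or {}
    intent = payload.get("queryResult", {}).get("intent", {}).get("displayName", "")

    if intent == "GetAffirmation":
        reply = "You've got this. Want a tip for your next step?"
    else:
        reply = "I didn't quite catch that. Could you rephrase?"

    # Dialogflow ES expects a fulfillmentText field in the JSON response.
    return jsonify({"fulfillmentText": reply})

if __name__ == "__main__":
    app.run(port=8080)
```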

It all depends on the level of complexity required. Something like Voiceflow is akin to using WordPress to create a website, in that the dashboard leads you through a no-code, block-based process. Python is, well, harder.


Voice Choices

While chatbots type and respond, virtual assistants talk. This means they need vocalization styling. There are several ways of doing this. The most laborious and low-tech option is to record all possible responses to human inquiry. Less time-consuming, but not as pleasing to the human ear, is to use concatenation, which involves recording individual words and then stringing them together to form cohesive sentences.   
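As a toy illustration of concatenation, pre-recorded word clips can be stitched together into a single audio response with Python’s standard wave module. The filenames here are hypothetical, and the sketch assumes every clip was recorded with the same sample rate, channel count, and bit depth.

```python
# Toy sketch of concatenative speech: join pre-recorded word clips into one
# response file. Filenames are hypothetical; clips must share the same format.
import wave

WORD_CLIPS = ["the.wav", "weather.wav", "today.wav", "is.wav", "sunny.wav"]

with wave.open("response.wav", "wb") as out:
    params_set = False
    for path in WORD_CLIPS:
        with wave.open(path, "rb") as clip:
            if not params_set:
                out.setparams(clip.getparams())  # copy rate/channels/width once
                params_set = True
            out.writeframes(clip.readframes(clip.getnframes()))
```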

Most AI assistants right now use synthetic voices within text-to-speech (TTS) programs, such as Amazon Polly. The AI “reads” the script in the background and we hear the vocalized response. Some TTS engines (the expensive ones) can be trained to modulate their speech patterns to sound more convincing than others.
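For a rough sense of how such a TTS call works in practice, Amazon Polly can be invoked through boto3, AWS’s Python SDK. The voice ID and output filename below are just examples, and the sketch assumes AWS credentials are already configured on the machine.

```python
# Sketch: synthesize a spoken response with Amazon Polly via boto3.
# Assumes AWS credentials are configured; the voice choice is illustrative.
import boto3

polly = boto3.client("polly")

result = polly.synthesize_speech(
    Text="Nothing is too weird for me. I'm a bot. What's on your mind?",
    OutputFormat="mp3",
    VoiceId="Joanna",  # one of Polly's stock voices; swap for any available ID
)

with open("reply.mp3", "wb") as f:
    f.write(result["AudioStream"].read())
```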

A more futuristic method is to deploy speech synthesis to capture the full phoneme spectrum of a human voice as opposed to recording individual syllables. I saw this done back in 2016 at ObEN, a company that voiceprints actors for robot concierges in Vegas and on-screen talent across Asia. The clever thing about capturing voice at the phoneme level is that the virtual assistant can speak any language—it’s just 1s and 0s to the AI. 


Exporting to Platforms 

Unlike the web with its open standards, AI assistants are still a nascent and somewhat proprietary technology, so developers have to pick a platform on which to run their software. For example, using Alexa requires an Alexa Developer Account, then testing the AI on the Alexa Developer Console (a simulation environment) before running it locally on an Alexa device.
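To give a flavor of that workflow, the backend of a custom Alexa skill can be written with the ASK SDK for Python and deployed to AWS Lambda. This is a minimal sketch rather than a complete skill, and the custom intent name (“AskAdviceIntent”) is invented; the launch phrase borrows from the prototypes described later.

```python
# Minimal sketch of an Alexa skill backend using the ASK SDK for Python.
# The custom intent name ("AskAdviceIntent") is invented for illustration.
from ask_sdk_core.skill_builder import SkillBuilder
from ask_sdk_core.dispatch_components import AbstractRequestHandler
from ask_sdk_core.utils import is_request_type, is_intent_name

class LaunchHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return is_request_type("LaunchRequest")(handler_input)

    def handle(self, handler_input):
        speech = "Hi, I'm a bot, so nothing is too weird for me. What's on your mind?"
        return handler_input.response_builder.speak(speech).ask(speech).response

class AskAdviceHandler(AbstractRequestHandler):
    def can_handle(self, handler_input):
        return is_intent_name("AskAdviceIntent")(handler_input)

    def handle(self, handler_input):
        return handler_input.response_builder.speak(
            "Here's one idea recommended by your community."
        ).response

sb = SkillBuilder()
sb.add_request_handler(LaunchHandler())
sb.add_request_handler(AskAdviceHandler())

# Entry point for AWS Lambda, where Alexa skill backends commonly run.
lambda_handler = sb.lambda_handler()
```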

For more specific voice assistants, which run as “actions” on Google Assistant (across Android devices), developers can sign up for a testing account with Google and follow the instructions. But let’s return (virtually) to UAL and their Feminist Alexa course for the final stage.


Prototypes 

(Image: Feminist AI prototypes; Credit: UAL Creative Computing Institute)

In the end, the UAL students on the Feminist Alexa course came up with eight prototypes, including Bud, HiFuture, and Egami, which demonstrated empowering, feminist-informed life, career, wellness, and sex education advice.

All the prototypes displayed a confident voice and a non-judgmental tone (e.g., “Nothing is too weird for me. I’m a bot. What’s on your mind?”), clapped back when faced with abuse from users (e.g., “Please be polite. I am not a human but abuse is not acceptable in any way or form”), and set boundaries (e.g., “Good luck with using that language in your job interviews.”).
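Purely as an illustration of that behavior (not the students’ actual implementation), scripted guard responses like these could be keyed to dedicated intents for abusive or boundary-crossing input, so the assistant answers in character rather than ignoring the remark.

```python
# Illustrative sketch only (not the UAL prototypes' code): scripted guard
# responses for abusive or boundary-crossing input, keyed by intent.
GUARD_RESPONSES = {
    "abuse_detected": "Please be polite. I am not a human but abuse is not acceptable in any way or form.",
    "boundary_pushed": "Good luck with using that language in your job interviews.",
    "sensitive_topic": "Nothing is too weird for me. I'm a bot. What's on your mind?",
}

def guard_reply(intent: str, default: str = "Let's keep this respectful.") -> str:
    """Return a scripted, in-character guard response for a flagged intent."""
    return GUARD_RESPONSES.get(intent, default)

print(guard_reply("abuse_detected"))
```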

It was refreshing to hear the various prototypes in action because they were nothing like the de facto AIs I’d heard before. After the Designing a Feminist Alexa program wrapped, UAL decided to support further investigation into progressive AIs, and whether one could be made specifically for trans and/or non-binary people.  


Syb: Queering Voice AI

(Image: Syb; Credit: Feminist Internet)

In 2020, another group of students—the majority of whom identified as trans and/or non-binary—gathered for a week at the Creative Computing Institute to create Syb, a prototype to support trans people, as part of the Queering Voice AI: Trans Centred Design course, under the guidance of course leaders Andrew Mallinson, co-founder of Feminist Internet, and Cami Rincòn, who is now an AI Ethics Researcher at the Alan Turing Institute in London (which PCMag visited in 2019).

“The purpose and design process of the two courses were aligned,” Rincòn told PCMag. “But ours was quite different in that we sought to develop a prototype, based on design requirements elicited through my academic study into the needs and experiences of trans and/or non binary users of VAIs [Voice Activated AIs].”

To this end, the conversational design for Syb was explicitly forward-thinking in its usage of gender-affirming language and tone, as well as in its knowledge base. The use case developed by the students was about connecting trans people to media (films, television, etc.) recommended by their community. 

In 2021, Syb was awarded the inaugural New New Fellowship, which allowed the design team to refine the prototype and bring it to a wider audience. While the Feminist Alexa prototypes are not available to use publicly, Syb has an early prototype you can beta test.


Virtual Assistants in the Future

I’ve recently realized that current AI assistants are designed to be broad solutions for the general public. They are meant to answer questions from collective knowledge databases, such as Wikipedia (which has its own issues of veracity and bias), or to mine digital content (restaurant details, map directions, weather, and news).

I have used Google Assistant to read my schedule to me, and occasionally locate someone in my call list or send a text (but I always have to make sure it heard me correctly). I’d like an AI assistant that draws on my own increasing knowledge stores, so it can grow and learn with me.

Not because I’m specifically worried about losing my mind (clinically or otherwise), but because I want a voice-based AI interface to my life. Something that suggests films I might want to watch, books to read (and then automatically requests to borrow the ebook from the local library), reminds me when birthdays and important dates are coming up, and lets me know what I bought last year to avoid re-gifting issues. 

A real virtual personal assistant: not a generic tool, but an AI that is tailored to me, with a really great voice. Learning more about the work at UAL inspired me to think beyond the usual suspects, toward a future where a myriad of disembodied AIs, truly progressive and beyond gender, could provide a really useful voice-activated interface to our increasingly hybrid lives.
