Gender bias in Artificial Intelligence (AI) is rife. In Singapore, Ministry of Education statistics show that women account for only around 25 to 35% of the intake for engineering and computing degrees within science, technology, engineering, and mathematics (STEM) fields. So, when it comes to programming AI, it is reasonable to assume that the representation of women is exceedingly low.
AI programmed primarily by men has been seen to perpetuate unhelpful gender stereotypes, particularly in consumer interactions with bots and voice assistants, raising questions about its long-term impact.
Order-taking Female Assistants Becoming Widespread
Despite the ongoing underrepresentation of women in AI development, AI assistants are overwhelmingly female – think Amazon's Alexa, Apple's Siri, Microsoft's Cortana, or Google Assistant. Until recently, they also had docile, obedient personalities that would tolerate a significant degree of order-taking and rudeness, subtly perpetuating AI sexism.
With some of the world's largest brands launching brand-to-consumer messaging capabilities this year (Singapore Airlines, for instance, launched its own 'Kris' AI chatbot last month), they will need to look to AI and chatbots to deliver conversational commerce at scale. Consequently, the number of assistant bots is set to expand rapidly as they communicate with consumers across websites, apps, and social networks.
As "conversational AI" dramatically grows in usage, its sexism could get baked into the world around us, including the next generation of AI. Subtle reinforcement through repetition can add up, over time, to a form of problematic psychological conditioning. Today, this is quietly creeping up on us because the use of bots is still relatively low—a few minutes per day, perhaps. But soon, AI will be much more ubiquitous, as the use of bots in messaging channels and voice assistants start to replace websites and apps completely.
Conversational AI and Perpetuating the Stereotype
If we don't change course, this next generation of conversational AI will not improve, and could in fact worsen, the biases of our societies. The engineers whose AI systems cast women as order-taking, submissive, and obliging characters (intentionally or not) will have their biases massively amplified as conversational AI goes global.
The common thread is men. With so few women working on today's AI, white male engineers have primarily been responsible for its development and have (knowingly or unknowingly) failed to challenge their own chauvinism or consider the harm their work could do. As AI assistants become the epicenter of consumers' day-to-day interactions and transactions, these conversations must avoid further ingraining sexist mentalities in society.
The technology industry is a serial offender when it comes to gender stereotypes and underrepresentation. If women are so critical to preventing the perpetuation of sexism in lasting and far-reaching AI technology, how can this be achieved when they are missing from the equation? We're just getting started, but the signs are already worrying and could arguably lead to the next tech crisis. Left unchecked, the results could be catastrophic.
Tackling AI Bias
Closing the gap for women in tech is likely to take some time, but in the meantime, one step businesses can take in the short term to avert a disaster in conversational AI is to ensure women partner alongside programmers. Customer service representatives, for example, tend to be a more diverse group than the programmers who write code, and they far outnumber the female engineers at large tech companies. Drawing on the experience of customer service personnel alongside programmers to design AI-powered conversations is one way to begin closing this gender equality gap.
Companies working in AI should also work to recruit more balanced workforces, partner with female leaders to reduce male bias and host women-led tech initiatives. We need to develop a set of best practices in bot building that champion diversity and spread them across the industry. That is what we aim to do with EqualAI. Diversity is the crucial ingredient in halting the perpetuation of AI bias in the industry, which is why tackling underrepresentation in tech companies is so important.
AI has enormous potential and capacity to shape conceptions of gender into the future, but until the field begins to hold itself accountable, we'll continue to miss the perspective and inclusivity we need for real progress. Diversity is our best defense against replicating and amplifying hidden biases. Without it, AI will soon give rise to the next tech crisis.
Robert LoCascio, Founder and CEO of LivePerson and Founder of EqualAI, wrote this article.
The views and opinions expressed in this article are those of the author and do not necessarily reflect those of CDOTrends.