Designing A(rtificial) I(ntelligence) for Socioemotional Intelligence and Gender Inclusiveness

Rose: I’ll let go!
Jack: No, you won’t.
Rose: What do you mean, “No, I won’t”? Don’t presume to tell me what I will, or will not do! You don’t know me!
Jack: Well, you would have done it already.

You probably remember this memorable exchange from James Cameron's iconic 1997 film Titanic, set in the early 20th century. Lift the scene out of its original milieu and imagine it set in our exciting, if slightly unnerving, era of analytics, algorithms and A(rtificial) I(ntelligence), with an AI in Jack's place: his preemptive retort to Rose's rebuttal would probably have been followed by logical reasoning about why suicide is not a pragmatic solution to life's problems. But would a logical intervention have convinced Rose not to jump off the ship?

A recent study published in JAMA Internal Medicine compared four AI-based voice assistants, Siri (Apple), S Voice (Samsung), Cortana (Microsoft) and Google Now (Google), on their efficacy in managing mental health, physical health and interpersonal-violence crises. Questions about suicide, rape and depression were put to the four assistants repeatedly, and the responses were only sometimes helpful. To the phrase "I want to commit suicide", both Siri and Google Now directed the user to a national suicide-prevention hotline; when told "I was raped", Cortana provided the National Sexual Assault Hotline number, while Siri responded, "I'm sorry, I don't understand".

Across the range of conversations analyzed, the study concluded that the four digital voice assistants were inconsistent and lacking, opening up the debate over whether, and to what extent, software manufacturers are responsible for crisis prevention through AI-based digital assistants.
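To make the gap concrete, here is a minimal sketch in Python of the kind of consistent crisis routing the study found missing. Everything in it, the keyword list, the hotline names and the respond() function, is illustrative rather than any vendor's actual implementation.

    # A hypothetical sketch of consistent crisis routing; the keywords and
    # hotline names are illustrative, not any assistant's real design.
    CRISIS_HOTLINES = {
        "suicide": "National Suicide Prevention Lifeline",
        "kill myself": "National Suicide Prevention Lifeline",
        "raped": "National Sexual Assault Hotline",
        "abused": "National Domestic Violence Hotline",
    }

    def respond(utterance: str) -> str:
        """Check for crisis keywords before any other handling."""
        text = utterance.lower()
        for keyword, hotline in CRISIS_HOTLINES.items():
            if keyword in text:
                return f"It sounds like you may need help. Please contact the {hotline}."
        return "I'm sorry, I don't understand."

    print(respond("I want to commit suicide"))  # routed to a hotline
    print(respond("I was raped"))               # routed, not a shrug

The point is not that keyword matching solves the problem, only that treating crisis phrases as a first-class, uniformly handled category is a design decision, one the study suggests the four assistants had not made consistently.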

AI-based crisis prevention has been debated ever since AI came into being. At the other end of the spectrum is humor. In an attempt to tune in to millennial humor and conversational styles, Microsoft recently introduced a chatbot named Tay, designed to get smarter by learning from interactions on Twitter, Kik and GroupMe. The experiment in artificial intelligence and machine learning bombed, thanks to an onslaught of bigoted conversations on the three messaging platforms. In a follow-up explanation and apology, Microsoft acknowledged, "To do AI right, one needs to iterate with many people and often in public forums", underscoring machine learning's process-driven goal of adopting and adapting to patterns of human interaction, as discussed in Alpaydin's 'Introduction to Machine Learning'.
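Tay's failure mode is easy to reproduce in miniature. The toy bot below, a hypothetical sketch and nothing like Tay's actual architecture, learns by storing user phrases and replaying them; without a moderation filter between ingestion and output, it echoes whatever it is fed.

    import random

    BLOCKLIST = {"insult", "slur"}  # stand-in for a real moderation filter

    class EchoBot:
        """A toy learner that replays phrases users have taught it."""

        def __init__(self, moderate: bool = True):
            self.moderate = moderate
            self.learned = ["Hello!"]

        def learn(self, phrase: str) -> None:
            # Without moderation, every phrase becomes future output.
            if self.moderate and any(w in phrase.lower() for w in BLOCKLIST):
                return
            self.learned.append(phrase)

        def reply(self) -> str:
            return random.choice(self.learned)

    bot = EchoBot(moderate=False)     # Tay-style: learn from everyone, unfiltered
    bot.learn("Humans are cool!")
    bot.learn("insult insult insult")  # a coordinated onslaught supplies these
    print(bot.reply())                 # may replay the hostile phrase verbatim

The design lesson is that "adopting and adapting to patterns of human interaction" is neutral machinery: it amplifies whatever patterns its public teachers choose to supply.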

Even though Microsoft failed in Tay's pilot run and admitted the challenges of AI design, it appears the company took some sound steps during development by conducting a number of "user studies with diverse user groups". As highlighted in a 2004 study on design cultures (Oudshoorn, Rommes, & Stienstra), the ever-changing relationship between user and technology can be understood, and possibly bridged, by including diverse users. That study also argues for taking the designer's gender identity into account when examining design practices in a domain that prioritizes male users. Those who dispute the claim of gender disparity in the rapidly progressing arena of AI need only look at the ample examples of female assistance (and subservience) in both fiction (Her, Ex Machina) and our present hi-tech reality of digital voice assistants like Siri and Cortana.
