#SXSW: Digital copycats and emotionally intelligent machines

In the first dispatch from Austin, Zone's chief creative officer, Dan Harvey, listens to discussions about digital replicas, AI design and biometric surveillance...


Digital Copycats: Escaping Plato's Cave

We're told time and again that we live in an era where differentiated experience is key. So why are so many experiences the same? Facebook copies Snapchat, Uber copies Lyft, Microsoft copies everyone. This copycat syndrome isn't confined to surface details; it extends to whole business models. Karwai Wong and Will Anderson, design strategists at SapientRazorfish, interrogated this phenomenon today.

In their summation, there are three fundamental reasons this is happening. The first is the incestuous use of the same "best in class" case studies. The second is stale ideation techniques that surface the same problems and opportunities over and over. The third is an emphasis on short-term value that limits the scope of design thinking.

The duo proposed two interventions to help. First, new frameworks that show the social consequences of these oft-copied digital products. Second, a call for strategists to "kill their darlings" when their approaches gather rust.

Designing Emotionally Intelligent Machines

At Zone we've recently launched an AI-Voice taskforce, so a lot of my time at SXSW will be dedicated to that topic. Sophie Kleber, executive director of Product & Innovation at Huge, spoke today about the design of AI experiences. Designers have always evoked emotions, such as the friendliness of a bug-eyed sprite, but voice brings that to the fore more than ever before.

Kleber walked through the market for emotion recognition software, expected to reach $36.7bn by 2021. She discussed the rising stars of the facial recognition, voice recognition and biometrics spaces. That tech helps establish a customer's emotional cues, but more is required to understand context and how a system should respond. Kleber suggested designers familiarise themselves with psychology if they want to do their jobs tomorrow.

Her framework had two axes: a customer’s desire for emotion and a brand’s permission to play. When there is neither, the system should tune out. When the former is high but the latter is low, it should react like a machine. When play is possible, the system can react like an extension of the self, or like another human.
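As a rough illustration only (my own shorthand, not Kleber's material), the quadrant logic she described might be sketched like this, with hypothetical names for the axes and responses:

```python
# A minimal sketch of the two-axis framework described above.
# Axis and response names are illustrative shorthand, not Kleber's own terms.

def emotional_response(desire_for_emotion: bool, permission_to_play: bool) -> str:
    """Map the two axes to a response style for a voice/AI system."""
    if not desire_for_emotion and not permission_to_play:
        return "tune out"              # neither axis present: stay quiet
    if desire_for_emotion and not permission_to_play:
        return "react like a machine"  # emotion is wanted, but the brand can't play
    # permission to play is present: respond as an extension of self, or like a human
    return "extension of self / human"

print(emotional_response(desire_for_emotion=True, permission_to_play=True))
```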

Are Biometrics the New Face of Surveillance?

This panel was hosted by Sara Sorcher of The Christian Science Monitor and featured Chris Piehota of the FBI, Brian Brackeen of facial recognition company Kairos, and Cory Doctorow of the Electronic Frontier Foundation. The panel debated whether biometrics make things more secure and convenient, or whether they are simply creepy. When the audience was asked, the answer was: yes, both.

Doctorow, a regular and gifted speaker, spoke eloquently about the risk. The more data the private sector gathers, the cheaper it becomes for the government to surveil. That, in turn, erodes the government's incentive to constrain corporate surveillance. Every time we use a tech product or service, we're paying corporations for these invasions. Governments then raid that data.

Brackeen argued that the private sector wants regulation, but that Washington gridlock makes it impossible. He also raised the spectre of the implicit racism of recent facial recognition tech. Now, though, the tech has learned so much about race that it has become an effective tool for genealogy use cases.

That claim terrified Piehota, who said the government has taken many steps to preserve civil liberties in the face of this tech. He mentioned how his agency couples a constrained version of the tech with a human element. These "super recognisers" have a marginally better success rate than the tech alone. Today, anyway.