2021-July-12


My Fear of Sexist, Racist, Homophobic Robots

BY MAULIE DASS · SR. DIRECTOR, CISCO INNOVATION LABS · UNITED STATES

ORIGINAL ARTWORK BY SARA BALDWIN



“What the heck was this design team thinking?”

How many times have you wondered this, only to realize that your experience of the same product seems totally different from everyone else's?

This happens when product design teams aren't actually thinking about an inclusive user base. Too often, "I, us, and the others" is the default approach, which introduces bias by design.

Here are some everyday examples of products that exclude large groups of people because bias crept in somewhere in the development process.

Traditional School Desks: Designed for right-handed students, leaving left-handers with nowhere to rest their writing arm.
Vehicle Safety Tests: Most crash test dummies are modeled on male bodies, putting women at greater risk of injury.
Automatic Soap Dispensers: Early sensors were calibrated to detect light reflecting off lighter skin tones, so they often failed to recognize darker skin.

Oops. Bias crept into our tech, too

There's a ton of innovation leveraging the phenomenal benefits of artificial intelligence. But even with AI algorithms, the lack of an inclusive user base shows up as bias that threatens the livelihoods of entire communities.

Take Amazon's secret AI recruiting tool, which preferred male candidates over female ones regardless of qualifications. A decade's worth of historical hiring data trained the model to automate what humans already do so well: make biased decisions.

Let’s keep it real: This is an industry issue — Amazon just happened to get caught in the act.

AI will amplify bias at speed and scale

Often a topic of heated debate, PredPol’s algorithm provides information for predictive policing based on location. Though knowing where crime may happen next is valuable, location data — such as zip codes — can also be an unintentional proxy for protected attributes, like race and/or socioeconomic status.

There has been a lot of important work highlighting the impact of bias in AI models, some of which can be seen in the documentary "Coded Bias".

All of this points to a now-or-never moment for the tech industry: mitigate bias, or risk a decline in quality of life for marginalized communities. The question is how.

Inclusive innovation can’t be an afterthought

I'm passionate about inclusivity in innovation and product design. And it's common business sense, because including different types of people means broadening the potential market. (Duh.)

Guiding principles for innovation in our Emerging Technology & Innovation teams include some no-brainer concepts: Observe and listen, gain empathy, fail fast, iterate faster.

Inclusive design practices enable these outcomes.

And I use the word "practice" here because, like yoga or meditation, inclusivity is a practice. While it would be awesome to declare inclusivity in a YAML file and then fire and forget, it just doesn't work like this:

<assuming_inclusion_exists.yaml>
---
inclusion: true
diversity: true
effort: "low to none"

Instead, practice means integrating inclusion with diversity for creativity, embedding aspects of belonging and trust within a design environment, and ensuring that empathy can be conveyed in both story and data.

Our teams constantly discover and refine their practice

Along the way, we have identified some tips:

1. It starts with people

Leverage your communities of interest: Consider formal and informal ways to include various perspectives. For instance, during our healthcare ideation, we invited members of our Caregiver Network to be interviewed so that their needs would be represented accurately.

Sponsor passion, not just skill: We are fortunate to have some very smart people in our organization… but passion and life experience are just as important as skill.

2. Process enables repeatability

Create a methodology for group ideation: Before the brainstorming starts, lay a foundation for embedding inclusivity and ensuring an environment of belonging, at every level and dimension.

Mitigate microaggressions in software: Drive more inclusive language in your software environment, for example by replacing "whitelist/blacklist" with "allowlist/denylist" and "master/slave" with "primary/replica" (a small sketch of one way to automate this follows below).
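As one concrete (and deliberately simple) illustration, here's a sketch of a script that flags non-inclusive terms in source files. The term list and the CI-style exit code are my own assumptions, not a specific Cisco tool:

<inclusive_language_check.py>

import re
import sys

# Illustrative term list; real teams tune this to their own style guide.
TERMS = {
    "whitelist": "allowlist",
    "blacklist": "denylist",
    "master": "primary",
    "slave": "replica",
}

def check_file(path):
    """Print a suggestion for every flagged term and return the count."""
    found = 0
    with open(path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            for term, suggestion in TERMS.items():
                if re.search(rf"\b{term}\b", line, re.IGNORECASE):
                    print(f"{path}:{lineno}: consider '{suggestion}' instead of '{term}'")
                    found += 1
    return found

if __name__ == "__main__":
    total = sum(check_file(path) for path in sys.argv[1:])
    sys.exit(1 if total else 0)  # nonzero exit can fail a CI check

Run something like this in code review or a pre-commit hook, so the nudge happens before the language ships.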

3. Use tools to help you scale

Pick one to get going: This tutorial on fairness in ML was the first one I watched (and re-watched). The general sequence I try to keep in mind is: 1) Identify the protected attributes you want to ensure fairness for; 2) Define a fairness metric; 3) Analyze the bias in your model; 4) Document findings; 5) Document and drive a plan to mitigate.
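To make that sequence concrete, here's a minimal sketch of steps 2 and 3 using one common fairness metric, the demographic parity difference: the gap in positive-prediction rates between groups defined by a protected attribute. The toy data and group labels here are my own illustrative assumptions, not from any particular tool or project:

<fairness_check.py>

import numpy as np

def demographic_parity_difference(y_pred, protected):
    """Absolute gap in positive-prediction rates between two groups.

    y_pred: binary model predictions (1 = favorable outcome, e.g., "hire")
    protected: binary protected-attribute values (0/1 group membership)
    """
    y_pred = np.asarray(y_pred)
    protected = np.asarray(protected)
    rate_group0 = y_pred[protected == 0].mean()
    rate_group1 = y_pred[protected == 1].mean()
    return abs(rate_group0 - rate_group1)

# Toy example (illustrative numbers, not real hiring data):
preds = [1, 1, 0, 1, 0, 0, 1, 0]   # model says "hire" for the 1s
group = [0, 0, 0, 0, 1, 1, 1, 1]   # protected attribute per candidate
print(demographic_parity_difference(preds, group))  # 0.75 - 0.25 = 0.50

A gap near zero suggests the model treats the groups similarly on this metric; a large gap, like the 0.50 here, is a finding to document and a candidate for mitigation (steps 4 and 5).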

Keep exploring and discover more: AI Fairness 360 includes demos and guidebooks. The Algorithmic Justice League can help with audits and inclusive datasets. Your company may have an AI Ethics Committee; if so, check with them for ethical AI policy, templates, guidelines, and more. Cisco has a central AI/ML Trust Strategy that provides tools and processes to drive more inclusive AI.

My biggest tip? You just need to care.

You don’t need to be an expert in either design or AI to mitigate bias in product design. I’m not.

You just need to care. The next step is to take action.


Live it to be it

Inclusivity as a buzzword is having its moment. And it’s about time. But it can’t just be something we say to make us feel good at work.

Inclusive innovation can help mitigate bias. Make allyship and awareness a priority in your code, in your environments, and on your team.

Let's build a bench of people who care, so that when we say "human-centered design," there is someone (you) ensuring it's inclusive for all.

That way, someone else (me) will no longer need to fear robots but will instead see them contributing to a friendly, accepting future, because we won't allow them to be programmed any other way.

Untangling the bias from products begins with awareness and acknowledgment of where it exists. How have you experienced bias in product design? Share in the comments.
