Technology isn’t agnostic. It’s time to recognise that the products we build are reflections of their creators as well as their users – for better or worse. Every design and development decision has the opportunity to empower a fellow human, but also to harm them: to exclude them from interaction or engagement, or to stop them from accomplishing a task. Our algorithms and mockups are not neutral, and they never will be. Designers and developers are inherently problem solvers, but before diving in, we need to understand whether we can really grasp the problem we aim to solve, and how even the smallest decision can have an enormous human impact. Only then can we make our processes and products inclusive, put decision-making power in the right hands, and ensure that our impact is for the better.

What is exclusionary UX?

Simply put, an exclusionary user experience is an interface that excludes users and prevents them from doing something. It is often the result of bias on the part of a decision-maker in the product process – a PM, CEO, designer, or developer. Inclusive design is a buzzword, but the concept is so broad that it’s easy to ignore – after all, what does ‘inclusive’ even mean for a website? An app? A checkout process? To truly understand the concept, and how a UX can be harmful, examples are most helpful:


The #Airbnbwhileblack hashtag surfaced problems with Airbnb’s interface and clientele. A Harvard study confirmed it.

Why does this happen?

Quite often, these experiences are excused by saying that exclusion wasn’t the intention. Of course it wasn’t, but that doesn’t make the consequences or experience any less painful for those affected.

As mentioned, these experiences are often the result of biases – another big buzzword in tech today. We all have them, and accepting that is a valuable first step. Only then can we account for them. Unfortunately, these biases are amplified and manifest in our work due to a range of factors, including but not limited to:

  • Non-diverse founding teams, which in turn hire non-diverse product teams that are unaware of their biases and of their users’ problems, or are aware of them only at a surface level.
  • Attitudes like “move fast and break things” are all well and good, until understanding nuanced problems is cast aside in favour of speed. The focus shifts from shipping a valuable solution to shipping… something.

Non-diverse product teams result in decision-making power falling into the wrong hands: a product loses sight of the problem it aims to solve and becomes a reflection of its creator rather than its beneficiary – Apple’s HealthKit, which launched without period tracking, is one example. Microsoft’s Tay chatbot took after those who taught it, and is a great example of technology reflecting the people who feed it data.

Screenshot of a Tay chatbot tweet which states 'chill im a nice person! i just hate everybody'
Microsoft’s Tay started off with “Hellooooooo world!!!”, but it quickly went sour. The above is one of the tamest examples.

How do we address it?

None of this is to say that we can’t help. We want to solve problems, and have useful skills. However, by better understanding where we fit in a truly inclusive product process, and where those skills are best leveraged, we can both solve those problems and avoid harm.

Step one is understanding that a product doesn’t exist in a vacuum. It will be a reflection of its users, but also of you, the creator! You have biases and you will make mistakes. This doesn’t make you a bad human being; it’s what you do to address the mistake that matters.

Understand who is affected, and who can evaluate the solution

First, foremost, and most difficult, ask yourself: should you build this? Should you be the one who solves this problem? Are you fundamentally affected by it? If you are not, will you truly be able to centre those who are? Why do you want to solve the problem, if you are unaffected?

There are techniques that can help here, such as personas. However, they don’t guarantee understanding. No technique will. The first step is knowing what you don’t know. For example, Val Head is passionate about animation; while writing her article about safer web animation, she asked people affected by vestibular disorders to identify triggering animations – she didn’t identify them herself.
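To make the “safer animation” idea concrete, here is a minimal sketch – not from Val’s article, and with invented element and class names – of how a site might respect a visitor’s reduced-motion preference before running a large zoom effect:

```typescript
// Check the user's OS-level preference before animating.
const prefersReducedMotion =
  window.matchMedia('(prefers-reduced-motion: reduce)').matches;

function startHeroAnimation(hero: HTMLElement): void {
  if (prefersReducedMotion) {
    // Skip large-scale motion that can trigger vestibular symptoms;
    // jump straight to the end state instead.
    hero.classList.add('hero--static');
    return;
  }
  // Otherwise run the full zoom using the Web Animations API.
  hero.animate(
    [{ transform: 'scale(1.15)', opacity: 0 }, { transform: 'scale(1)', opacity: 1 }],
    { duration: 600, easing: 'ease-out', fill: 'forwards' }
  );
}
```

Even then, a check like this is a starting point, not a substitute for asking the people actually affected.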

Understand that empathy has limits

It’s easy to just say that we should empathise with our users, but what about in practice? We all have a history. As much as we’d like to, we don’t come “with clean slates and fresh eyes”.

As a result, sometimes you just won’t be able to bridge that gap. Men simply can’t understand the nuances of period tracking. Folks with a male or female gender identity won’t understand what it’s like to interact with a form that doesn’t accommodate them. The same goes for mixed-race people and racial identity. That’s fine. You don’t have to. It just necessitates a change in how you work, and in the framework you work within. You can still help – offer operational skills (coding, design, product management), but…

Put decision-making power in the hands of those affected

“There is nothing that can help us validate our products more than actually talking to subject matter experts and seeing them use it.”

I’d suggest involving subject matter experts (or those affected by the problem we’re trying to solve) even earlier: asking SMEs to use a product assumes that the product should be built at all. If you cannot empathise, find those who can, and put the decision-making power in their hands. Involve them at every stage – concept, ideation, implementation, QA, and testing – and, most importantly, trust them. If they tell you that something doesn’t work, be prepared to change it. Test with them, but respond to them too: recognise that your understanding does not extend far enough to make certain decisions.

Build with intention

The involvement and centring of the folks affected will slow you down, particularly if they’re not designers or developers on the team. It should. Deal with it. Push back against “move fast and break things,” especially where breaking things can be harmful.

Focus on moving with intention, and set expediency aside. Far too often, we focus simply on shipping “a solution”, not a good or thoughtful one. Moving away from defaults will feel unfamiliar, but it is essential to creating an inclusive UX, underpinned by code written with inclusion in mind. Accounting for the exceptions will not somehow exclude the defaults: ensuring that a non-binary person feels included by a form will not exclude cis, white, heterosexual males; it will just make the whole experience better for everyone.
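As a rough illustration of that last point – a sketch under assumptions, with hypothetical field names rather than any particular product’s schema – a sign-up model can treat gender as optional, self-described text instead of a required binary choice:

```typescript
// Hypothetical sign-up model: gender is optional and self-described,
// so no one is forced to pick from a list that doesn't include them.
interface SignupForm {
  name: string;
  email: string;
  genderSelfDescribed?: string; // only ask at all if the product truly needs it
  pronouns?: string;
}

function buildSignupPayload(form: SignupForm): SignupForm {
  // There is no "expected" set of gender values to validate against,
  // so nothing here can reject someone for how they describe themselves.
  return {
    ...form,
    genderSelfDescribed: form.genderSelfDescribed?.trim() || undefined,
    pronouns: form.pronouns?.trim() || undefined,
  };
}
```

The default case still works exactly as before – anyone who skips those fields loses nothing.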

When you do fail, take responsibility

Mistakes will be made; it’s how we react to them that matters. A great example of this is Nextdoor, whose interface allowed racial profiling. They acknowledged their mistake and moved to address it, creating a new posting process that adds friction – making users stop and think before they include information specific to race. The problem isn’t fixed, but they are taking steps to fix it.

Screenshot of a Nextdoor entry about someone supposedly casing a house
Post on “crime and safety” on Nextdoor before the changes.

The new process on Nextdoor for reporting crime and safety issues in the neighbourhood
With the changes, a user is required to consider what they’re posting about – a crime? Suspicious activity? Something else? Furthermore, if race is part of their description, they must include additional information.
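For illustration only – this is not Nextdoor’s actual code, and the field names and the two-detail threshold are assumptions on my part – the “added friction” might look something like a submission check that refuses a description leaning on race alone:

```typescript
// Hypothetical report model and check – a sketch of "friction", not Nextdoor's implementation.
interface SafetyReport {
  category: 'crime' | 'suspicious-activity' | 'other';
  description: string;
  mentionsRace: boolean;      // set when the author says race is part of the description
  otherDescriptors: string[]; // clothing, height, hair, vehicle, and so on
}

function canSubmit(report: SafetyReport): { ok: boolean; reason?: string } {
  // Deliberate friction: if race is mentioned, require further identifying
  // detail before the post can go live.
  if (report.mentionsRace && report.otherDescriptors.length < 2) {
    return {
      ok: false,
      reason: 'Add at least two other details (clothing, height, etc.) before posting.',
    };
  }
  return { ok: true };
}
```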

What does success look like?

A diverse team

First and foremost, the team in charge of the project has to include those affected by the core problem. They can be involved operationally – as designers, developers, and so on – or as consultants. Either way, the decision-making power is theirs. Take an honest look at your team. Are they the best people to solve the problem at hand and, if not, how can they be empowered? Who can they empower? It may seem counterintuitive, but once we stop thinking about the products and start thinking about the people behind them, the innovation will come – therein lies the magic.

An inclusive interface

When we talk about interfaces, remember that an inclusive interface’s foundation is inclusive code, as well as a product and project management process that makes sure the right people are making the right decisions. In addition to Nextdoor above, there are other examples out there, and a great many resources.

Sometimes it’s difficult to identify concrete examples; quite often, success is inaction: not asking for gender or sex when you don’t need it, for example, or a subtle notification instead of a loud noise (or no notification at all).
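As a small, hedged sketch of that “subtle notification” idea (the element id is invented for the example), an update can be announced through a polite ARIA live region rather than a blocking alert or a sound:

```typescript
// Announce updates quietly: screen readers read the new text when the
// user is idle, and sighted users aren't interrupted by a modal or a noise.
function notifyQuietly(message: string): void {
  let region = document.getElementById('status-region');
  if (!region) {
    region = document.createElement('div');
    region.id = 'status-region';
    region.setAttribute('role', 'status'); // role="status" is implicitly aria-live="polite"
    document.body.appendChild(region);
  }
  region.textContent = message;
}

// Usage: notifyQuietly('Your draft has been saved.');
```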

If videos and talks are more your thing, here’s a great list from UX Mastery.

And if you need some guidelines for your product or team, check these out:

Image montage of the Design for Real Life book, the Government Digital Service's checklist, and the Inclusive Design Patterns book
There’s no shortage of resources out there – you just have to be willing to dig in!

Collaboration is the key

The key to eliminating exclusion from products is to do the counterintuitive: acknowledge our limitations, and allow others to make decisions. This can be difficult for people who pride themselves on their empathy and their ability to solve difficult problems. We want to see technology as agnostic, empowering everyone – but this is only true if those creating it do the same.