If you’ve seen the Netflix film Coded Bias, you’ll know the scene where the black, female engineer puts on a white mask to be recognised by facial analysis software. It’s an eye-opening moment, one of many in a film that shines a light on the bias built into the algorithms, artificial intelligence and machine learning software that permeate our daily interactions.
In another unsettling scene, an algorithm Amazon used to automate its recruitment process at scale is revealed to be sexist. The computer model was trained (by white, male engineers) to screen applicants by referencing patterns in resumes submitted over the previous 10 years, most of which came from men. In essence, the system taught itself that male candidates were preferred, and any reference to 'women', as in 'women's college' or 'women's clubs', marked the candidate as less qualified or less appealing. Amazon eventually edited the algorithm, but the damage was done.
Watching this documentary made me reflect on our approach to much of the work we do at Folk – designing and building products, services and systems with and for the Australian ‘public’ – as well as on inclusion, unconscious bias and diversity in the design industry more broadly.
We recently made the decision to become members of the Diversity Council of Australia (DCA), a not-for-profit peak body who are leading the charge for diversity and inclusion in the workplace. While going through the onboarding process, the team at DCA mentioned that they have very few design consultancies as members and asked what motivated us to get involved.
At Folk, we have a 60/40 percentage split male to female across our design team, and we work for a range of clients from government to not-for-profits and private businesses. Those are our clients – but ultimately our work is about the impact at the individual human level. We design for a very broad range of people, so we need to understand and consider how power and influence operate in relationships between people, as well as how someone’s race, gender, ethnicity, economic status, geography, ability, religion, and sexuality might affect the experience they have with the products, services and systems we design.
Part of the motivation for joining DCA was to help us reflect on our own working practices, to inform our teams about power dynamics, and to consider how we can actively recruit for, and retain, diversity in our organisation. If, as an industry, we are going to come even close to designing products and services that reflect the diversity of the people who interact with them, then we need to get serious about increasing the diversity of our teams and our industry – as well as giving greater voice and influence to those most adversely affected by design decisions.
Amazon has shown us, in the most negative of ways, that if you unthinkingly embed an approach from the outset, it will carry through. Ultimately, our goal in joining DCA is to ensure that we’re embedding the right approaches as early as possible, at both the organisational and project level, to set the right foundations for new thinking, new approaches, new markets, and for new ideas to flourish.