PS: I'd love for you to dig more into the idea of designing for the extremes and perhaps provide an example.
West: If we can design for people at the extremes, we'll find innovation that can be applied to all. We all know the physicist Stephen Hawking… because of his disability, toward the end of his life he could only use his eyes to type. So eye-gaze technology was developed to help him stay productive in the later part of his life. And that same eye-gaze technology is now being deployed in autonomous vehicles. The car is sensing whether your eyes are on the road, or whether you begin to doze off or stop paying attention, and that will trigger certain alerts.
PS: How can organizations prevent biases from being baked into their systems, and how can they also use technology to remove biases that may already exist in their technologies?
West: I have a phrase in my book: "as technology gets more human, humans need to get more human." AI is actually programmed by humans. AI is made up of two foundational elements. One is the data set; the other is the logic, or machine learning. But the programming of the logic is done by an individual. If that individual has a narrow way of thinking, understanding, or perceiving the world, then that narrowness will be programmed into the AI. And that is where unconscious bias can come in.
PS: A really interesting example in your book was around insurance companies, and how over the years it was discovered that there were biases baked into their systems.
West: So the insurance company basically sets a premium based on certain assumptions. And they assume that people with lower incomes are not as good drivers – they somehow made that connection. Therefore, they charge a higher premium, and by doing so they inadvertently "punish" an underserved, already poor population, because those people have to pay a much higher premium for their insurance. This just perpetuates the negative cycle of poverty.
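The mechanism West describes can be made concrete with a small sketch. Everything here is invented for illustration – the function name `quote_premium`, the dollar amounts, and the income-based surcharge are hypothetical, not taken from any real insurer – but it shows how an assumption encoded in pricing logic punishes lower-income customers even when their driving records are identical:

```python
# Hypothetical sketch: how an assumption baked into pricing logic
# becomes a bias. All figures and rules here are invented for
# illustration; they do not describe any real insurer's model.

BASE_PREMIUM = 1000  # annual premium in dollars (invented figure)

def quote_premium(annual_income, at_fault_claims):
    """Return an annual premium quote.

    The income surcharge below encodes the unfounded assumption that
    lower-income drivers are riskier -- the bias described above.
    """
    premium = BASE_PREMIUM + 500 * at_fault_claims  # risk-based component
    if annual_income < 30_000:   # biased rule: income used as a risk proxy
        premium *= 1.5           # 50% surcharge (invented)
    return premium

# Two drivers with identical (clean) driving records:
higher_income_quote = quote_premium(annual_income=90_000, at_fault_claims=0)
lower_income_quote = quote_premium(annual_income=25_000, at_fault_claims=0)
# The lower-income driver pays 1.5x more despite an identical record.
```

Nothing in the actual driving data justifies the different quotes; the disparity comes entirely from the assumption written into the logic, which is exactly where West says unconscious bias enters.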
It is very important that we are deliberate and intentional in thinking about how to maintain the neutral state of a technology. Technology by itself is neutral and can be used productively to get rid of traditional bias. But that will only come about if leaders, especially high-level decision-makers, are conscious and aware, and make sure they have diverse development, design, deployment, and testing teams to test the 360-degree possibilities of a technology like AI, to ensure it stays as fair, as trustworthy, and as transparent as possible.
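One concrete thing a testing team like the one West describes might do is run a simple fairness check on a model's decisions. The sketch below uses the disparate-impact ratio (one group's approval rate divided by another's), with the common "four-fifths" threshold of 0.8; the data, function names, and threshold are illustrative assumptions, not a complete audit methodology:

```python
# Minimal sketch of one fairness check a testing team might run:
# the disparate-impact ratio between two demographic groups.
# Toy data and the 0.8 threshold (the "four-fifths rule") are
# illustrative assumptions, not a full audit.

def approval_rate(decisions):
    """Fraction of positive (1) outcomes in a list of 0/1 decisions."""
    return sum(decisions) / len(decisions)

def disparate_impact(group_a, group_b):
    """Ratio of approval rates; values well below 1.0 flag possible bias."""
    return approval_rate(group_a) / approval_rate(group_b)

# 1 = approved, 0 = denied -- hypothetical outcomes from a model under test
group_a = [1, 0, 1, 0, 0]   # 40% approved
group_b = [1, 1, 1, 1, 0]   # 80% approved

ratio = disparate_impact(group_a, group_b)
if ratio < 0.8:
    print(f"ratio {ratio:.2f} is below 0.8 -- flag for review")
```

A check like this is only one narrow measurement; it catches a disparity after the fact, which is why West stresses having diverse teams involved in design and deployment, not just testing.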
PS: We've talked a lot about the why today, and why authentic inclusion is so important, but can you dive a little bit deeper into how organizations can really take inclusion from simply a core value to really being a core practice?
West: There are two basic tenets. One is a fundamental principle about respecting the fact that each individual can make a difference not in spite of, but because of, their difference. Because we are technologists, it's even more important to have a diversity of people present who understand the usage of the technology and can contribute to its design.
On the other side of authentic inclusion is how to operationalize it. Fundamentally, inclusion cannot be just an HR initiative, imperative, or program. It has to be a corporate-wide, holistic imperative. And it has to be viewed in the business context in order for it to have sustainability. It's what I call the "6 E Framework." You have to have a senior leader, like a CEO or your chairman of the board, embrace the idea of inclusion. But after you embrace it, you need to envision it and be able to articulate it... what is your vision? What is the strategy? And then after you envision it, you need to enact it. Meaning, you need to have policy and governance around it to make sure that it actually can be institutionalized throughout the rest of the organization. Then after you enact it, you really need to enlist. Enlist meaning, enlist people, enlist resources. You've got to put money behind it. Then after you enlist, you need to enable them. You need to have the education. You need to have training. You need to build up competencies, for example, in digital inclusion and accessibility. Last, but not least, is to ensure. Meaning you've got to have measurements. You must have metrics to measure your progression.
That's the way to really make sure that authentic inclusion can be systematically sustained and scalable inside your organization. So going forward, if we really want to build a just society, then this humanness, this human-first thinking has to be first and foremost.