Recently, the creator of Ruby on Rails, David Heinemeier Hansson (DHH), tweeted about how his wife, Jamie, with whom he shares financial accounts, received a credit limit on the newly launched Apple Card that was just 5% of his. Given their combined income and credit history, DHH decided to use his social media platform, and its potential for virality, to shed light on an important issue: algorithmic bias. In the days that followed, Goldman Sachs (the card's issuer) and Apple reversed their initial decision and increased Jamie's limit.
During the process, the response Jamie initially received was that ‘the algorithm’ decides the credit limit of applicants. For most consumers, this might have been an acceptable answer. However, those of us with a technical background and professional experience in the technology industry know better.
Every algorithm is designed by a human or a team of humans. Even those considered artificial-intelligence based are still founded on people designing, developing, launching, and maintaining the technology. While a computer may simply take the inputs and deliver the output repeatedly at scale, the algorithm does not function in isolation. In essence, humans are the masters, not the algorithm.
In an increasingly technology-based world, not only is the algorithm owned by the organization; the ability to create unique algorithms is often the foundation of an organization's competitive advantage.
As we learned more about the situation, it turned out that Jamie had listed her occupation on the application as homemaker. Within the algorithm, this single input apparently led to an assumption of higher credit risk, despite other factors such as the couple's shared bank accounts and Jamie's superior payment history (she has a higher credit score than David).
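To make this concrete, here is a deliberately simplified, hypothetical sketch of how a single categorical field can dominate a credit-limit formula. Every field, weight, and coefficient below is invented for illustration; this is not the actual Apple Card model, whose details have never been published.

```python
# Hypothetical sketch of how one categorical input can dominate a
# credit-limit calculation. All fields and weights here are invented;
# this is NOT the actual Apple Card / Goldman Sachs model.

def credit_limit(income, credit_score, occupation, base_multiplier=0.2):
    """Return a toy credit limit in dollars."""
    # A hard-coded occupation penalty like this bakes the designers'
    # assumptions directly into every decision the system makes.
    occupation_risk = {
        "salaried": 1.0,
        "self-employed": 0.8,
        "homemaker": 0.05,  # a single field outweighs everything else
    }
    risk_factor = occupation_risk.get(occupation, 0.5)
    score_factor = credit_score / 850  # normalize against the max FICO score
    return round(income * base_multiplier * score_factor * risk_factor)

# Two applicants with shared finances; the second has the BETTER score:
print(credit_limit(250_000, 780, "salaried"))   # 45882
print(credit_limit(250_000, 800, "homemaker"))  # 2353, about 5% of the first
```

Even with a higher credit score, the second applicant's limit collapses, because one human-chosen coefficient outweighs everything else. That is exactly why "the algorithm decided" is not a satisfying answer.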
While it is great that Jamie was able to remedy the situation through the power of social media, the reality is that many others who are neither as powerful nor as wealthy have decisions made about them daily by algorithms that many would consider unfair or biased.
As Jamie says herself in a self-published article about the situation, “this is not merely a story about sexism and credit algorithm blackboxes, but about how rich people nearly always get their way.”
So, what can we do about this?
The reality is that we are beginning to decouple technology from the technologists who create it. When solutions first launch, we typically learn the names of the founding team: think of Mark Zuckerberg (Facebook), Bill Gates (Microsoft), and Sergey Brin and Larry Page (Google). As the companies grow and scale, the daily builders of the technology, the engineers, developers, and product managers, become increasingly anonymous. Yet these teams are building hardware and software solutions that influence large aspects of our daily lives.
For example, global positioning system (GPS) mapping technologies like Google Maps and Waze are relied upon by millions of users daily to navigate their commute from home to work and back again. For years, delivery services like UPS and FedEx have routed their drivers to avoid left turns whenever possible, because their data showed that these turns tended to result in more accidents.
If this factor is not part of the GPS algorithms, we could be placing drivers at increased risk of an accident to save a few minutes of travel time. Or, what happens when we re-route drivers down residential streets because the main road has a slowdown? Inevitably, we place the residents of those streets in danger through a sudden, unexpected increase in traffic.
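The trade-off is easy to express in code. Below is a hypothetical sketch of a routing cost function; real services like Google Maps and Waze do not publish their objectives, so the fields and penalty values here are assumptions chosen purely to illustrate the point.

```python
# Hypothetical routing cost function. The segment fields and penalty
# weights are illustrative assumptions, not any real service's logic.

from dataclasses import dataclass

@dataclass
class RoadSegment:
    travel_seconds: float  # estimated driving time
    left_turns: int        # unprotected left turns on this segment
    residential: bool      # cuts through a residential street

def cost(segment, left_turn_penalty=0.0, residential_penalty=0.0):
    """Cost of one segment; a router minimizes the sum over a route."""
    penalty = segment.left_turns * left_turn_penalty
    if segment.residential:
        penalty += residential_penalty
    return segment.travel_seconds + penalty

main_road = RoadSegment(travel_seconds=300, left_turns=0, residential=False)
shortcut  = RoadSegment(travel_seconds=240, left_turns=2, residential=True)

# Optimizing for time alone, the residential shortcut wins:
print(cost(shortcut) < cost(main_road))                    # True
# With safety penalties (values chosen by humans), the main road wins:
print(cost(shortcut, left_turn_penalty=45,
           residential_penalty=60) < cost(main_road))      # False
```

Whether those penalty terms exist at all, and how heavily they are weighted, is a human design decision, not a property of "the algorithm."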
This challenge is magnified when algorithms are developed thousands of miles away from their end users. For example, certain roads in rural areas may be considered unsafe at particular times of day due to wildlife crossings or limited visibility. Those who live in the area are intimately aware of which streets to avoid. But for an out-of-town visitor simply trying to save a few minutes, the algorithm may send them down an unsafe street, which again increases risk.
In this scenario, the application developer could simply reply that it was the algorithm that sent someone down the unsafe street. However, that answer is deeply unsatisfying once we understand that algorithms are built by humans, for humans.
There are likely a number of potential issues one can point out with any algorithm that is heavily used in modern society. This does not mean we should simply eliminate the use of algorithms. Instead, we must be willing to peel back the onion on these new technologies and figure out what their inputs and outputs are. We also need to understand who is building these algorithms and what their expressed goals are.
Criticism is healthy, and it leads to improvement. In fact, David is a passionate technologist who frequently expresses concern about the misuse of technology-based platforms and services. Thankfully, his platform is big enough and his influence large enough that people are listening.
Going forward, we must keep in mind that algorithms are designed by people. So, when there is an issue, we have to look back at the education and experiences of those who built the system and see what opportunities there are for improvement.
Conclusion
As technology platforms and services inevitably scale to larger and larger groups of people, it becomes important to design algorithms that work not just for the many or the few. Instead, we must ensure that our algorithms work for everyone, even if that means increased development costs to better quantify and validate the potential edge cases.