[Slide 1] We Count: Fair Treatment, Disability & Machine Learning
Jutta Treviranus, Inclusive Design Research Centre
Attribution-NonCommercial 4.0 International License

My name is Jutta Treviranus and the title of my contribution to this important discussion is We Count: Fair Treatment, Disability and Machine Learning.
[Slide 2] About you & relevant to you...
• Not just disability and accessibility
• Stress testers and “canaries in the coal mine”
• Key to fixing what is wrong

I want to preface my talk by saying that my topic is not a specialist topic, of concern only to people interested in disability and accessibility.
People with lived experience of disability are the stress testers of our systems, including the web.
They are the canaries in the coal mine for things that will go wrong.
The flip side of this is that considering disability offers the innovative choices that may help us fix the cracks and shortcomings for everyone.
The late Stephen Hawking predicted that we were entering a dangerous time.
This was before COVID-19.
Seasoned prognosticators seemed to agree that this disruption is one of many that will come at an accelerated pace.
The web is host to what remains of our common conversations and deliberation.
Our collective move online during the pandemic means that this responsibility has become weightier.
Our web tools are not neutral.
How we design them influences their uses in subtle and blatant ways.
Combine web tools with the power tools of AI and we have a potent mix.
Machine learning may help us amplify the opportunities but we also amplify the risks.
Machine learning enables us to do what we are doing more efficiently and effectively.
Before we employ our power tools, we need to ask what do we want to amplify and automate?
Where are we accelerating to?
What is the ultimate destination if we go even faster in the direction our current practices are leading us?
I want to tell you about my pivotal moment of alarm.
[Slide 5] Alice and the unexpected...

Back in 2015, I had the opportunity to test machine learning models that guide automated vehicles in intersections.
I tested them with an anomalous scenario: a friend of mine who propels her wheelchair backwards through intersections, effectively but very erratically.
All the models chose to proceed through the intersection and, in effect, run my friend over.
Well, the developers said to come back when their models were more mature and had been exposed to more data about people in wheelchairs in intersections.
[Slide 6] Smarter is not always better...

When I retested the more mature machine learning models that had been exposed to a great deal of data about people in wheelchairs in intersections, they chose to run her over with greater confidence.
The data gave them confidence that people in wheelchairs move forward.
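This dynamic is easy to reproduce in miniature. The sketch below is not the vehicle models I tested, just a toy Beta-posterior estimate, with hypothetical counts, of how often wheelchair users cross in the direction they face: more observations narrow the uncertainty around the majority pattern, so a decision built on it becomes more confident while remaining wrong about the person who propels her chair backwards.

```python
import numpy as np

# Toy Beta-posterior estimate of P(a wheelchair user crosses in the
# direction they face), assuming a hypothetical 99% of observed crossings
# go "forward". More observations narrow the interval around the majority
# pattern: greater confidence, still wrong about the rare exception.
def forward_confidence(n_observations, share_forward=0.99):
    forward = int(round(n_observations * share_forward))
    backward = n_observations - forward
    a, b = 1 + forward, 1 + backward              # Beta(1, 1) prior
    mean = a / (a + b)
    sd = np.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))
    return mean, max(mean - 1.96 * sd, 0.0), min(mean + 1.96 * sd, 1.0)

for n in (50, 5_000, 500_000):
    mean, lo, hi = forward_confidence(n)
    print(f"{n:>7} crossings observed: P(forward) = {mean:.3f}, interval [{lo:.3f}, {hi:.3f}]")
```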
[Slide 7] The power tools of AI...
• Unable to handle diversity & complexity
• Unprepared for the unexpected
• Replicates our own inadequacies
• Automate and amplify them

I realized that the power tools of AI are unable to handle diversity and complexity.
Unprepared for the unexpected, they replicate our own inadequacies and automate and amplify them.
[Slide 8] ...more than about automated vehicles ...same pattern in all automated decisions ...more than about artificial intelligence

And this is about more than automated vehicles.
The same pattern occurs in all automated decisions.
It's about more than artificial intelligence.
[Slide 9] How do we treat small minorities & outliers?

It is about how we treat small minorities and outliers.
I like to illustrate it in this way.
If we were to take all of our preferences and requirements and plot them on a multivariate scatterplot, it would look like a starburst, with about 80% of the needs and requirements clustered in the middle 20% of the space, and the remaining 20% of the needs and requirements scattered across the other 80% of the space.
If you were to look at the dots in the middle, they're very close together, meaning they're very similar.
If you were to look at the dots as you move from the center, they would be further and further apart, meaning more and more different.
As a result of this, and in part because of Pareto's principle and Richard Koch's 80/20 rule, design works for anyone whose needs are in the middle, becomes difficult as you move from the middle, and doesn't work at all if you are out at the outer edge of our human starburst.
In terms of data, predictions are highly accurate if your needs are in the middle, inaccurate as you move from the middle and wrong if your needs are at the edge.
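To make that concrete, here is a minimal synthetic sketch, with made-up data and a plain nearest-neighbour predictor rather than anything from a real system: 80% of the points are packed near the mean, 20% are scattered widely, and prediction error grows as you move away from the centre.

```python
import numpy as np

rng = np.random.default_rng(0)

def starburst(n):
    """80% of people packed near the mean, 20% spread over a much wider edge."""
    core = rng.normal(0.0, 1.0, size=(int(0.8 * n), 2))
    edge = rng.normal(0.0, 6.0, size=(n - len(core), 2))
    return np.vstack([core, edge])

def outcome(X):
    """A hypothetical quantity that varies smoothly with a person's needs."""
    return np.sin(X[:, 0]) + 0.5 * np.cos(X[:, 1])

X_train, X_test = starburst(2000), starburst(500)
y_train, y_test = outcome(X_train), outcome(X_test)

# Nearest-neighbour prediction: average the outcomes of the 15 most similar
# people in the training data.
d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
pred = y_train[np.argsort(d, axis=1)[:, :15]].mean(axis=1)

# Accuracy by distance from the population mean: good in the dense middle,
# increasingly wrong out at the edge of the starburst.
dist = np.linalg.norm(X_test, axis=1)
for label, band in [("middle (dist < 1) ", dist < 1),
                    ("mid-range (1 to 3)", (dist >= 1) & (dist < 3)),
                    ("edge (dist >= 3)  ", dist >= 3)]:
    print(f"{label}: mean abs error = {np.abs(pred - y_test)[band].mean():.2f}")
```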
This pattern affects all of our systems. Our products and services, which thanks to Moore's law keep improving in availability, reliability, functionality and cost, do the opposite if you have marginalized needs. Our systems of research favor the average and ignore minority needs.
Our education system ranks based on the average and requires conformance, our systems of employment attempt to create replaceable workers, and our democracy is driven by majority rule.
This not only hurts the 20% with needs at the edge, it hurts our society as a whole.
Because of this pattern, we have mass production, mass communication, mass marketing and a popularity push, and our innovation suffers.
We have greater conformance and lock-in, and our flexibility, extensibility, resilience and responsiveness all suffer.
[Slide 16] AI Ethics and Fairness
• Data gaps, misrepresentation and lack of representation
• Algorithmic bias
• Statistics and the treatment of minorities and outliers

AI ethics has received a great deal of attention lately because of the discrimination faced by many minority groups.
There are three main sources of this discrimination.
First, discrimination may happen because of data gaps, where people are not represented in data gathering.
They may have no digital traces because of digital exclusion.
Or inaccurate data proxies are used.
Data may also be misinterpreted as noise and eliminated to improve performance.
Secondly, there is algorithmic bias. This can result from human bias that finds its way into algorithms through the labeling, analysis and interpretation that are used, or it can be due to biased training data.
The third and much more fundamental source of discrimination faced by small minorities and outliers, including people with disabilities, is the inherent bias in our statistical methods and how we treat minorities and outliers.
[Slide 17] AI Ethics and social justice strategies
• Comparing treatment of bounded identity groups
• Creating false choices (e.g., trolleys and lifeboats)

The measures we use to assess AI ethics do not cover the discrimination faced by people with disabilities.
The primary tool for detecting bias is to compare the treatment of bounded identity groups with a default.
From the data perspective, there are no common bounded identifiers for people with disabilities.
The only common data characteristic is distance from the mean, far enough that things don't work for you.
Therefore, many people fall through the cracks or are stranded at the edges of these bounded groups, making it even more difficult to achieve fair treatment.
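For concreteness, this is roughly what the standard audit looks like, sketched with made-up groups and rates rather than any particular toolkit. The point is that the check depends entirely on having a bounded group label to compare.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical audit data: a bounded identity group versus the default group,
# and a binary decision (say, selected for an interview).
group = np.array(["default"] * 900 + ["group_b"] * 100)
selected = np.r_[rng.random(900) < 0.5, rng.random(100) < 0.3]

rates = {g: selected[group == g].mean() for g in ("default", "group_b")}
print("selection rates:", rates)
print("demographic-parity gap:", round(rates["default"] - rates["group_b"], 2))

# The comparison only works because a shared "group" label exists. With no
# bounded identifier for disability - only distance from the mean across many
# different features - outliers never form a group this audit can compare,
# so the unfair treatment goes unmeasured.
```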
[Slide 18] Privacy
• Privacy protections don’t work
• Can be re-identified
• Most vulnerable to data abuse and misuse
• Need protections against data abuse and misuse

These issues are compounded by other vulnerabilities.
If you're highly unique, privacy protections don't work.
The primary protection is de-identification at source.
If you have highly unique needs, you can be re-identified.
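A minimal sketch of why de-identification at source is not enough, using hypothetical attributes and rates: once your combination of attributes is rare, that combination is itself a fingerprint.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "de-identified" dataset: names removed, a few coarse
# attributes kept, plus one rare access need.
n = 50_000
records = np.column_stack([
    rng.integers(0, 5, n),         # age band
    rng.integers(0, 10, n),        # region
    rng.integers(0, 3, n),         # preferred language
    rng.random(n) < 0.0005,        # rare access need (e.g. a switch input)
])

# k-anonymity check: how many records share each exact combination of values?
_, inverse, counts = np.unique(records, axis=0,
                               return_inverse=True, return_counts=True)
group_size = counts[inverse.ravel()]

rare = records[:, 3] == 1
print("share of all records that are unique (k = 1):      ",
      round(float((group_size == 1).mean()), 4))
print("share of rare-need records that are unique (k = 1):",
      round(float((group_size[rare] == 1).mean()), 4))
```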
If you have to request special treatment, you barter your privacy for the service.
People with disabilities are also most vulnerable to data abuse and misuse.
Because of this, we need not only privacy protections but protections against data abuse and misuse from bad actors, surveillance capitalism and the state.
[Slide 19] Disability and Data Science
• Unfair treatment as subjects of data-driven decisions based on population data
• Barriers to participation in designing and developing data science
• Barriers to interpreting the outcomes of data science
• Subject to systemic influence or vicious cycles

The entire field of data science is somewhat problematic if you have a disability.
You face unfair treatment as a subject of data-driven decisions based on population data, barriers to participating in designing and developing data science so you can help fix it, and barriers to interpreting the outcomes of data science and therefore benefiting from it.
And you're subject to systemic influence or vicious cycles of discrimination.
[Slide 20] How is this relevant beyond disability?

How is this relevant beyond disability?
[Slide 21] Traditional means of deciding, judging & planning no longer work...

Traditional means of deciding, judging and planning no longer work.
[Slide 22] We live in increasingly complex adaptive systems of systems
• Changing and unstable
• Unpredictable
• Entangled

We live in increasingly complex adaptive systems of systems.
These are changing and unstable, they're unpredictable, they're entangled.
And right now, during this pandemic, we need to navigate out of danger.
But we are in a complex terrain with multiple hills and valleys.
[Slide 24] Stuck on the local optima
• Only values: popularity and profit
• Optimizing toward homogeneity & conformity
• Vicious cycle, creating steeper incline & disparity

And because of the way we do things, we are stuck on a local optimum, rather than reaching the global optimum.
The only values reflected in our systems are popularity and profit, meaning we just keep going up that local hill.
We're optimizing towards homogeneity and conformity when we need diversity.
And these vicious cycles are intensifying disparity.
[Slide 25] The only formula: diversification & collaboration
• Include diverse perspectives
• No best or winning strategy
• Stop doing the same thing over and over again!
• Work together!

The only formula to get us off that local optimum is diversification and collaboration.
We need to include diverse perspectives.
There is no best or winning strategy.
We need to stop doing the same thing over and over again and we need to work together.
[Slide 26] Prone to “Cobra Effects”
• Unintended consequences of over-simplistic “solutions” to complex problems
• Linear thinking that only makes things worse

We are also prone to cobra effects, or the unintended consequences of over-simplistic solutions to complex problems.
We use linear thinking that only makes things worse.
[Slide 27] Is how we are designing technology making things worse?

We may be seeing a cobra effect right now and AI may only intensify this.
[Slide 28] Technology & human adaptability

There's a famous Thomas Friedman graph that compels us toward greater progress to avoid economic collapse.
It shows us that technology's adapting at an exponential rate, while humans are adapting at a linear rate.
[Slide 29] Impeding human adaptability?
• Cushioning from dissonance and diversity
• Recommender systems – “people like you”
• Search engine optimization
• Popularity bubble
• Smart systems that replace human smarts
• Evidence-based decisions that favor the majority and homogeneity

I would say the situation is worse.
My contention is that technology is impeding human adaptability.
We're cushioned from dissonance and diversity by recommender systems that recommend things liked by people like us.
We have search engine optimization.
We have a popularity bubble.
We have smart systems that replace human smarts and we have evidence-based decisions that favor the majority and homogeneity.
[Slide 30] 80/20 rule reconsidered... We have misidentified the modern day ‘vital few’ as the difficult 20%

My recommendation is that we reconsider the 80/20 rule.
It came about because Pareto noticed that 80% of land in Italy was owned by 20% of the population or what he called the vital few.
He told the emperor of the time to ignore the 80% and attend to the vital few.
Koch later turned this into a formula for quick wins, saying ignore the difficult 20% who take 80% of the effort.
[Slide 31] Who are the real vital few?
• If our goal is not greed, quick wins or gaming the system, but...
• Innovation
• Diverse perspectives
• Detecting weak signals
• The “difficult 20%” that occupy 80% of the unexplored terrain

However, who are our current vital few?
If our goal is not greed, quick wins or gaming the system, but innovation, diverse perspectives and detecting weak signals, then the difficult 20%, as Koch calls them, who occupy 80% of the unexplored terrain, are the real vital few.
If we design with and consider the needs of those 20%, we make sure that the 80% in the middle have room to change.
It is out at that outer edge that we find innovation, not in the complacent middle.
It is also where we detect the weak signals that disrupt our lives.
[Slide 34] How will we measure, decide and judge?

But how will we measure, decide and judge if we are not using the statistical average?
[Slide 35] “The true measure of any society can be found in how it treats its most vulnerable members” ~ Mahatma Gandhi

One recommendation comes from Gandhi and others.
“The true measure of any society can be found in how it treats its most vulnerable members”.
This is more than altruism.
It actually has scientific rationale.
[Slide 36] Discover...
• The diverse range
• The edge or periphery

What we need to do in our AI, our research and our data analytics is discover the diverse range, the edge or the periphery.
You may ask what about cost?
[Slide 38] Cost over time and longevity of system...

We've actually found that cost over time and the longevity of the system work better if you plan with the edge, rather than planning only for the center, which becomes brittle and soon reaches end of life.
One of the experiments we've made to level the playing field and address the needs of people with disabilities is something we call our lawnmower of justice.
It's a very early model, but what we've tried to do is remove the top of the Gaussian curve, taking away the advantage you have by being similar to everyone else, so that the learning model needs to attend to all of the edges.
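As a rough illustration of the general idea, and not the actual lawnmower of justice implementation, the sketch below down-weights samples in proportion to how crowded their neighbourhood is, so the dense top of the Gaussian no longer dominates whatever is optimised next.

```python
import numpy as np

rng = np.random.default_rng(0)

# A crowded middle and a scattered edge, as in the human starburst.
X = np.vstack([rng.normal(0.0, 1.0, size=(1600, 2)),
               rng.normal(0.0, 6.0, size=(400, 2))])

# Estimate how crowded each person's neighbourhood is (via the radius of the
# 20th nearest neighbour) and weight each sample inversely to that density.
d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
radius_20 = np.sort(d, axis=1)[:, 20]
weights = radius_20 ** 2                  # ~ inverse local density in 2D
weights /= weights.mean()

# Under these weights the crowded centre no longer dominates: every region of
# the needs space counts roughly equally, so whatever is trained on it has to
# attend to the edges as well.
dist = np.linalg.norm(X, axis=1)
print("unweighted mean distance from centre:", round(float(dist.mean()), 2))
print("weighted mean distance from centre:  ",
      round(float(np.average(dist, weights=weights)), 2))
```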
[Slide 40] Systems that value the edge of our human scatterplot...
• Adapt to change & respond to the unexpected
• Detect risk
• Transfer to new contexts
• Results in greater dynamic resilience & longevity
• Will reduce disparity
• May hold the key to our survival

And what we found is that systems that value the edge of our human scatterplot adapt to change and respond to the unexpected, detect risk, transfer to new contexts, result in greater dynamic resilience and longevity, will reduce disparity and may hold the key to our survival.
[Slide 41] Continuing the Conversation...
• https://idrc.ocadu.ca
• https://wecount.inclusivedesign.ca
• @juttatrevira
• https://medium.com/@jutta.trevira

And I would love to continue the conversation.