Pooja Sachdev

Changing our algorithms


I was watching Netflix last week (a rare evening treat these days) and scrolling through the options presented to me. As with most of us, there is a distinct pattern in what shows up on my screen.



I clearly have a ‘type’ according to Netflix, and it’s pretty narrow, if I’m honest: rom-coms, dramas, documentaries… the usual. Just out of curiosity, I decided to switch profiles and take a peek at my partner’s recommended shows. Wow, was it a whole different world! Vampires, war movies and horror films – the total opposite of my list – nothing I would have chosen myself. It struck me that there is a whole world of films and content completely hidden from my sight, and I’ve let a computer programme make that decision for me.


Now, I know it’s not news that the algorithms working behind the scenes make connections between what I’ve watched and liked before, creating categories and associations to predict what I might be most interested in watching next. It makes sense, and in some ways it is helpful, BUT it is also hugely limiting.
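(For the curious, here is a minimal, hypothetical sketch of the kind of logic such a recommender might use. The titles, genres and scoring below are invented purely for illustration – this is not Netflix’s actual system, which is far more sophisticated.)

```python
# A toy, hypothetical content-based recommender: score unseen titles by how
# much their genres overlap with the genres of titles already watched.
# (Illustrative only - real recommendation systems are far more complex.)

watch_history = {
    "Notting Hill": {"rom-com", "drama"},
    "The Crown": {"drama", "documentary-style"},
}

catalogue = {
    "Love Actually": {"rom-com", "drama"},
    "Blood Red Sky": {"vampire", "horror"},
    "1917": {"war", "drama"},
}

# Count how often each genre appears in the watch history (the "likes").
liked_genres = {}
for genres in watch_history.values():
    for g in genres:
        liked_genres[g] = liked_genres.get(g, 0) + 1

# Score each unseen title by summing the weights of its overlapping genres.
def score(title_genres):
    return sum(liked_genres.get(g, 0) for g in title_genres)

recommendations = sorted(catalogue, key=lambda t: score(catalogue[t]), reverse=True)
print(recommendations)
# Titles that resemble past viewing float to the top;
# anything outside the familiar genres sinks out of sight.
```

Notice the self-reinforcing loop: whatever we have already watched shapes what we are shown next, which in turn shapes what we watch.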


This got me thinking: Isn’t this just what our brains do ‘behind the scenes’ as we move through life and make all our mini decisions every day?


When we are babies, we are like the ‘new account’ on Netflix – with the whole world open to us, everything is new and has potential... As we get older and gain more exposure and experience, our brain begins to create ‘likes’ and ‘dislikes’. We start to create patterns and associations. We form categories and ‘types’ that we are more comfortable with, more drawn to. Then, when we encounter new things or meet new people, our brain does an unconscious ‘sorting’ into familiar vs unfamiliar, a ‘liked’ category or a ‘disliked’ category.



We feel more drawn to the familiar, the ‘liked’ category; we feel ‘default warmth’, attraction and loyalty towards the people and things we know, and we are more likely to give them a chance. On the flip side, we unconsciously ‘screen out’ things that are unfamiliar, or where we may not have had a positive experience or association in the past. We might close ourselves off from those options early on, avoiding them and not always giving them a chance.


From an evolutionary perspective, this mechanism (sometimes called ‘similarity bias’ or ‘affinity bias’) is built into all of us [1]. And there’s a good reason for it: it has helped us to stay ‘safe’ because it enables us to recognise and stick with our own families or tribes, avoiding potentially dangerous ‘other’ tribes or species… However, fast forward to the modern day working world (and social lives) in an increasingly diverse and globalised context: how does this impact the way we might respond to people and environments that are new, and the choices and judgments we make about differences we encounter?



In our workshops on bias and decision-making [2], we talk about how the brain ‘thin-slices’ through vast quantities of data to make ‘snap’ judgements and choices – terms from Malcolm Gladwell’s book ‘Blink’. This process is an evolutionary necessity in many ways: we can’t stand there for hours weighing up the pros and cons of crossing the road or jumping out of the way of a speeding bus! We need to be able to make quick decisions, and even snap judgements – and, according to behavioural economist Daniel Kahneman, we actually do this most of the time, because it’s cognitively easier (he calls this System 1 thinking [3]).



Snap judgments [4] are necessary when we need to make sense of complex issues quickly (e.g. surgeons making a call in the operating theatre), and they can actually be very accurate and even amazing to watch in action – such as the side-kick that looks impossible but lands the ball precisely in the goal with unbelievable finesse! That’s the player making a ‘thin-sliced’ decision in that split second about how and where to kick, instinctively channelling all their years of playing the game and their learned ‘feel’ for what to do in the moment.


However, unless we are experts and highly skilled in the area in which we are making the snap decision (for example: the football player or a professional chess player with years of experience), we are prone to making errors because we are more likely to default to automatic associations or old brain habits, rather than taking all the data and evidence into account. And this is where assumptions, stereotypes and bias creep in and interrupt objective decision-making.


This may not be hugely consequential when we are deciding what to eat for lunch or what to wear, or even where to kick the ball in a friendly game… but when the judgement (whether conscious or unconscious) is about another individual, it can affect how we respond to them, how fairly or kindly we treat them, and what opportunities we afford them.


Beyond its impact on the individual, this effect can also be magnified at an organisational or social level, where majority/minority patterns or power differentials exist.



At Rewire, we call this the ‘Ladder of Impact’ [5], which demonstrates the impact of our biases from a ‘micro’ to a ‘macro’ level. Initially, bias begins as just a thought – an assumption – something that exists only in our minds. But when we encounter someone or something new, our ‘instinctive’ or ‘default’ feelings about them can ‘leak out’ and show up in the form of a verbal or non-verbal response (a smile or nod, sitting next to or away from someone, making more or less eye contact) or even an action (inviting someone for drinks, or not). Left unchecked, this can turn into more overt and explicit exclusion or negative treatment, and over time can lead to the systemic differences in people’s experiences and opportunities that are visible in society today.


So in order to tackle the systemic ‘isms’, we do need to take a few steps back and look at our individual interactions – and how we might be allowing our internal ‘algorithms’ to affect our judgements and responses to others.




As Malcolm Gladwell [6] puts it: “we need patience and humility when it comes to dealing with strangers. We need to understand that we’re not always right… And we have to understand that there is no reason to be in a hurry to jump to a conclusion about someone we don’t know.”


Apparently, we make decisions about other people, even complete strangers, in a matter of seconds [7] – sometimes, all it takes is a handshake or a facial expression. If we can find a way to switch gears from System 1 (impulsive) to System 2 (rational) when we make decisions about others, we can interrupt this automatic process and potentially open ourselves to greater objectivity in our judgments – and also widen our own lens [8] on how we see the world.


Of course, this is easier said than done. We don’t always have the time, the luxury or, frankly, the mental capacity for that level of ‘meta-cognition’ [9]. This is why the discussion on unconscious bias has stalled to some extent – we all know and recognise that we have bias, but it’s incredibly difficult to shift. I wonder whether we should look at it not as ‘wiping away’ the bias, but as becoming more attuned to it, aware of it, and ‘checking’ it when it creeps into our decisions.



So, how can we do this?



Firstly, we can do this by putting ‘process checks’ in place at an organisational level, such as:


  • Removing factors that might limit our objectivity (e.g. so-called ‘blind CVs’ in recruitment)

  • Broadening our ‘range of view’ (e.g. using a broader definition of talent, and looking beyond the ‘usual’ places for recruitment)

  • Ensuring there is a representative range of views in the room for major decisions (e.g. a diverse panel)

  • Appointing specific roles in meetings to challenge group-think (‘devil’s advocate’)

  • Allowing adequate time for decisions and being clear about criteria; rewarding ‘good’ decision-making (rather than just ‘quick’ decision-making)

  • Keeping track of patterns in decisions (e.g. monitoring data on appointments, promotions, etc) and working to understand what’s behind gaps or discrepancies

  • Removing barriers to access (e.g. physical/technological barriers as well as working norms that might inadvertently exclude certain groups)

  • Education and awareness-raising; clarity on expected processes and standards (e.g. leadership frameworks and organisational values/code of conduct – with clear expectations and examples – and policies and processes that support them)

  • Role modelling and visible cues (e.g. corporate language, communications, imagery)




Secondly, there are things we can all do on a personal level too.


This takes a great deal of intentional practice and honest self-reflection. It takes humility and being truly open to changing our minds – something we humans don’t seem to like to do [10]!


Here are a few suggestions:


  • Challenge your first impression. The next time you have a strong reaction to something or someone, stop and ask yourself why. What’s behind this?

  • Look for evidence AND counter-evidence. The next time you make a judgement about someone’s personality or ability, take an extra moment for a thoughtful pause and force yourself to articulate the facts and data that support it. And then proactively LOOK for facts that support the opposite view (so you don’t fall prey to ‘confirmation bias’ [11]).

  • Get a second opinion. Seek out diverse views, seek to understand the other perspective before you make the call.

  • Actively self-educate. Learn (and unlearn) your automatic patterns. Open up what you are exposed to, go somewhere new, meet someone different from you, seek out opportunities to broaden your world. Challenge your likes and dislikes, your ‘types’ and your assumptions.


Both natural and artificial intelligence function by identifying patterns in the data that is “fed” to them. Where the data is biased or limited in some way, so is the outcome – in my old days of studying computer science, we called this RIRO – “rubbish in, rubbish out” – or GIGO [12] for my American colleagues!
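(To make the point concrete, here is a tiny, hypothetical illustration. The “model” below simply averages whatever examples it is given, so a skewed or sparse sample produces a skewed answer – all numbers are invented for the example.)

```python
# A toy illustration of "rubbish in, rubbish out": a trivial "model" that
# just averages its training examples. Feed it a narrow or skewed sample
# and its output inherits that skew. (All numbers are invented.)

def train_average(samples):
    return sum(samples) / len(samples)

# Imagine these are scores the system has "seen" for two groups of candidates,
# but one group was barely sampled at all.
well_sampled_group = [72, 68, 75, 70, 74, 71]   # plenty of data
barely_sampled_group = [55]                      # a single, unlucky data point

print(train_average(well_sampled_group))    # ~71.7 - a reasonable estimate
print(train_average(barely_sampled_group))  # 55.0 - one example becomes "the truth"

# The only way this kind of model improves is by re-balancing its input:
# gathering more, and broader, examples before drawing conclusions.
```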



To mitigate this, we need to re-balance the data we feed to both machines and our brains.


So, re-programme or re-set your algorithms every now and again.


In the end, you may not completely switch from ‘rom-coms’ to ‘horror’, but you might just come across that one film you love that you would not otherwise have given a chance!




 

At Rewire Consulting, we have been researching culture, behaviour and diversity for over two decades.

We are specialists with experience in organisational development, employee engagement, employee experience, employer brand, research, coaching and leadership development.


Our diverse, international team is based across the UK, Europe, the USA and Asia, and we work on both local and global projects.


What drives us is a passion and a commitment to bring out the best in people, teams and organisations – so we can all fulfil our potential and serve our purpose.



We have worked with and in a wide range of organisations over the years, from blooming start-ups to large multi-nationals.



 


[1] Insights From fMRI Studies Into Ingroup Bias: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6174241/

[2] Rewire Consulting – Our experience: https://www.rewireconsulting.com/our-experience

[3] Daniel Kahneman discussing System 1 and System 2 thinking: https://www.youtube.com/watch?v=PirFrDVRBo4

[4] Guardian article about thin-slicing and snap judgments: https://www.theguardian.com/lifeandstyle/2009/mar/07/first-impressions-snap-decisions-impulse

[5] Download a copy of the ‘Ladder of Impact’ here: https://www.rewireconsulting.com/post/free-resources

[6] Malcolm Gladwell on snap judgments about strangers: https://www.cnbc.com/2020/06/25/malcolm-gladwell-why-you-cant-know-someone-through-snap-judgement.html

[7] Seven seconds to make a first impression: https://www.businessinsider.com/only-7-seconds-to-make-first-impression-2013-4?r=US&IR=T

[8] Widen the Screen: https://www.youtube.com/watch?v=PUHop5i8-f4

[9] Metacognition simply means ‘thinking about how we think’: https://en.wikipedia.org/wiki/Metacognition

[10] Adam Grant on changing our minds: https://behavioralscientist.org/your-ideas-are-not-your-identity-adam-grant-on-how-to-get-better-at-changing-your-mind/

[11] Confirmation bias is the tendency to notice and pay more attention to data that supports our existing beliefs: https://en.wikipedia.org/wiki/Confirmation_bias

[12] GIGO = garbage in, garbage out
