Tag Archives: unconscious bias

Ethics and innovation go hand in hand

Last week I attended an “Ethics and Automation” panel run by HMG’s Automation Taskforce, and hosted by Katie Rhodes, Senior Policy & Strategy Advisor; this is part 1 of my thoughts from the session.

The first panellist was Bethan Charnley, Head of Strategic Projects at the Centre for Data Ethics and Innovation (CDEI). The role of this organisation is to advise Government on how to maximise the benefits of data.

Bethan raised that innovation and ethics are often posed as being in tension, but she sees ethics as an enabler of innovation. (Those of us in tech circles, particularly those with an association with BCS, the Chartered Institute for IT, are also familiar with people thinking that professionalism can stifle innovation. But look at some of the incredible, stunning, unique buildings we see erected. Does professionalism in architecture stifle innovation there? Anyway… ).

No, she clearly stated – and I wholeheartedly agree – that “ethics and innovation go hand in hand”.

I would argue that by considering ethics alongside innovation we are far less likely to have the unintended consequences we have seen in examples of AI that has been trialled, and not just trialled but put into production. By considering the ethics we are not prevented from innovating, we just do it better. Innovation is more relevant, more inclusive, more valuable to society.

Furthermore, Bethan challenged us to realise that we have to consider not just the ethics of doing something but the ethics of NOT doing something. I’ll take that one step further: is it ethical if a Government Department does not provide a service to someone who doesn’t know they are entitled to it, when said Department may already have the data that shows they are? (Just a hypothetical question of course.)

Naturally, we can’t have a conversation about ethics in the field of data science without talking about bias in algorithmic decision making. AI could be a way to remove such bias. But if we’re not careful it’s a way to bake that bias in: training with biased data, building bias into algorithms, testing with biased data, and so on. We need to make sure we get insights into every stage of the AI lifecycle.

That’s one of the many reasons why IBM has developed Watson OpenScale. It can trace and explain AI decisions across workflows, and it allows you to intelligently detect and correct bias to improve outcomes.

A good, fun example of this is how we applied AI fairly to pick highlights from Wimbledon. If you think about it, the main courts have the biggest audiences and may generate the loudest roars during rallies and wins. But there may still be a fabulous shot, a unique win, and so on, on one of the higher-numbered courts. Just as in life, where those who shout the loudest are not always the most successful, at Wimbledon you may still have an amazing shot greeted with only a ripple of applause. We wanted to make sure all successes were considered.

I suggested that by considering the ethics we innovate better. By applying fairness to this AI at Wimbledon the result was “a higher-quality selection of sports highlights—and more of them.”

Read that Wimbledon story for yourself: https://www.ibmbigdatahub.com/blog/ai-picks-highlights-wimbledon-fairly-fast

Learn about how KPMG stewards responsible AI with Watson OpenScale: https://mediacenter.ibm.com/media/1_ulgwi98c


Filed under Automation, Government, Public Sector

It’s time for change.

There are many reasons why unconscious bias needs to be discussed and addressed within the IT industry.  One of those is that there will be more STEM jobs in the future, and if fewer women study those subjects and work in that industry now, they are even more likely to be affected.

And there’s plenty of evidence for why gender diversity is so important: companies with greater gender diversity at the board level perform better and have better reputations.

But 70% of women who graduate with a STEM degree do not stay in STEM beyond five years.  One of the reasons for that is that we need to feel like we belong.  I’ve heard folks from Catalyst talk about the importance of belonging.  In fact, they have a rather handy page on why diversity and inclusion matter, with links to relevant research.

And we *all* have a role to play in tackling inequality.  Conscious bias clearly still exists, and is dangerous.  But unconscious bias goes on largely unnoticed, and in a professional environment it comes with less obvious complaint mechanisms than conscious bias does.

Many folks talk about ‘the management team’ undergoing mandatory training for this, but we all need it, and need to refresh it, and constantly keep it in mind.


On 12th February, Talat Yaqoob, Director of Equate Scotland, spoke at a BCSWomen event I ran with the BCS Tayside and Fife branch on this very subject, and she highlighted 3 unconscious biases I thought I’d share.

One is in-group bias, where we favour those in our own image.  We see like for like, and all too regularly assess likeability over competence.  I saw it in practice recently when recruiting new people to a team.  Comments were made about the ability of our new members to gel with us, and to cope with the style of existing members who were rather happy to share their opinions, ideas, worries and concerns.  But rather, the existing members should first have sought out the skills and experience that were lacking, and it was the responsibility of those existing members to ensure that everyone had a say.  There’s something about extroverts being preferred that means we end up without diversity in that characteristic!

The second was confirmation bias, where we hear what we want to hear.  Sadly I didn’t hear an example, but I suspect many women suffer from this one.  Say there is one woman on a team, a board, or in an organisation, and she fails (you know, sometimes it happens!).  Far too many folks conclude that if she fails then all women will fail, and therefore women shouldn’t be on boards, in certain positions, etc., etc.

The one that made me chuckle was unconscious bias bias!  This is where someone thinks that, as they are an open-minded, logical person, they don’t have any unconscious bias!  Or they think that because they have had unconscious bias training they are no longer biased!  But we all have biases.  Talat covers this sort of subject all the time, and she told us a story (that I don’t feel at liberty to share here) of an example of her own unconscious bias too.

Of course, bias is not just about gender either…

Something else to be aware of is macro- and micro-aggressions.  Macro-aggression can be seen in the gender pay gap, the lack of women in STEM, the lack of men in veterinary medicine…

Micro-aggression can be seen as ‘death by a thousand cuts’, where you’re just not happy with an organisation, team, etc., but can’t necessarily put it down to one obvious thing.  Even describing an event as “black tie” can be thought of as a micro-aggression, especially when it’s an event for women; why aren’t we describing what women should wear?

Another common micro-aggression is women not being heard by men.  Take Obama’s administration in 2009, for example.  Two thirds of the top staffers were men, and the women’s voices were just not being heard.  So the women did something called ‘amplification’.  They went to meetings at least in pairs, and every time a woman had a good idea, other women in the room would positively reinforce it, perhaps by stating it was a good idea and saying “tell us more”.

So, what can we all do?  Challenge our own thinking.  All the time.  And slow down decision making.

Do get trained.  More than once.  A training session allows for self-reflection in a space we don’t often give ourselves.

Pick people up on their unconscious bias, point it out.  Don’t be mean, but don’t hold back either.



Filed under womenintech

What does one do with a Barbie?

Today I had the privilege of chairing a panel at the Women of Silicon Roundabout in London.  Our focus was “Closing the Gender Gap”.

We started with a discussion on targets versus quotas.  The panel appeared to agree that quotas – a government requirement that organisations reach a certain number or percentage of women, with repercussions if they do not – can be harmful, putting more pressure on those women who *may* be there as a number rather than on merit, even though it was acknowledged that in some countries quotas appear to have worked.

The panel also agreed that targets, however, can be useful. Fiona Hathorn, MD of Women on Boards UK, stated:

 “What gets measured gets managed, and what gets managed gets done.”

That really resonated with me.  Targets can also make it much clearer where the problems lie in an organisation; for example, if a company can meet a target of 50% female applications and 50% female hires, but not 50% women in the middle management layer, there’s clearly a challenge to be investigated.

But we could talk targets and quotas forever.  So I moved us on.

There was also violent agreement that there is a problem with the pipeline.  There definitely needs to be more done to interest children, at the primary level, in technology.  IT – specifically in the UK – has a poor brand, and it doesn’t help that many people outside the industry struggle to articulate what a career in it could look like, including influencers such as parents.

I certainly don’t envy teachers of computing at schools; how they stay up to date with technological advances, and how they make technology attractive when so much of what they have is out of date, is beyond me.  Susan Bowen of TechUK commented that, to compensate for the lack of a country-wide, government-led change, work is being done in pockets across the UK, largely by volunteers who recognise the need.

Someone asked the panel how we keep children interested in technology assuming that we have caught their attention.  And certainly, Clare Sudbury of LateRooms had commented earlier that when she was younger she had had a fear that her liking of science, of puzzles, of maths, was wrong because she was a girl.  In my own experience I know that I stopped asking for Lego as a present because I didn’t think girls should do that*.  I started asking for Barbies in order to fit in.

I didn’t have a clue what to do with a Barbie.

So, there are quite a few organisations out there to help; organisations such as CodeFirst: Girls, Stemettes and CoderDojo may fit the bill, the latter particularly for younger children.

We also talked about name-blind applications and unconscious bias.  I related the anecdote that Dame Stephanie Shirley used her nickname “Steve” to sign communications with clients because they didn’t respond to a woman.  Toby Mildon, Diversity and Inclusion lead at the BBC, told us more about what they are doing to ensure no unconscious bias screens out suitable applicants early in the hiring process.  He told us of one candidate who had had two applications turned down at the very first stage in other scenarios, but who made it right through as the most qualified candidate at the final stage of recruitment when they applied through the name-blind process.

We talked about many other things too, but lastly I just want to highlight the discussion we had about men.  Gents, we can’t close that gender gap without your support.  Don’t forget, we’re not really taking your places; it’s more that there’s a huge skills gap in the market and women can help fill it.  It’s likely that women will join you, not replace you.  And there’s lots of evidence out there that diverse teams are the most successful.  This is not just a touchy-feely thing about diversity; there’s a real business case behind it.  So, be our mentors, be our sponsors, be our advocates.  Come along to “women” events to see what is on our minds and some of the challenges we’ve faced.  You might be the only man in the room, but do remember that we’re often in the reversed scenario.  I once ran a BCSWomen event to which two men had signed up.  They walked into the room, saw all the women and walked back out again.  If I did that at work when I saw all the men, I’d never get my work done!

Of course, if you’re a dad/uncle/brother help your daughter/niece/sister explore what technology means to them, how it has an impact on their lives.  Don’t let them spend any time feeling bad if they get teased for being different.

Also, if you haven’t yet watched Emma Watson’s HeForShe event at the UN in 2014 do take a look.

I also have my own opinions on how flexible working can help close the gender gap too, but I will save those for another blog, another day.

So, what do you think?  Have you seen the effect of targets?  Have you suffered from the unintended consequences of unconscious bias, especially at the application stage?

And what does one do with a Barbie?

*I’m over that now.  It was a while back, but on my first visit to Hamleys at the age of 22 I bought a Lego Ferrari. For me (just in case there was any confusion there).  I’ll be making sure my nieces and nephew know that they can play with Lego no matter how old they are.


Filed under womenintech