Rosa Emerald Fox

Technology, career and lifestyle blogger


Living with Intelligent Machines

Today I was incredibly fortunate to see Nello Cristianini present on the subject of ‘Living with Intelligent Machines’ at the Government Digital Service. Nello is a lecturer at Bristol University and has been studying the field of Artificial Intelligence for over 25 years.

The talk focused on ethical considerations surrounding data, AI and machine learning.

In this post I will write about some of my key takeaways. Please keep in mind that these are my own understandings and interpretations of the presentation. I have used a mixture of Nello’s examples along with some that I have found myself.

The Business of Data 

‘AI’ and ‘Big Data’ have been thrown around as marketing buzzwords over the last few years in order to sell products. Cisco have gone so far as to create promotional images displaying the quote ‘Data is the new oil’.

Clearly there is money in data. I am not going to search for stats on how many people use Google’s search engine, but it is obviously a lot of people. From what I can gather, Google don’t sell this data, but they do use it to power their own advertising platform, which other companies pay to use. Advertisers trust that their ads will target users who are likely to be interested in whatever they are selling.

Because of the perception that data equals lots of cash, start-ups inevitably spring up claiming to offer AI solutions to various problems. It could be far too easy for someone who doesn’t really know what they are doing to hack together a basic model using some Python libraries, package it up as AI and sell it.

I am starting to learn how to build machine learning models myself, and yes, for the most part I will be learning through hacking things together and seeing what results I get. When you are learning it is so important to do this: to experiment, to see what you can build and to practically explore the theory you read. It is equally important to remember that this is not something that should be deployed. By all means share your code as a learning exercise, but don’t try to sell things that you don’t fully understand. Nello reiterated that quick solutions are worse than no solutions.
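As an illustration of the kind of quick experiment I mean, here is a rough sketch of my own (not one of Nello’s examples), using scikit-learn’s built-in iris toy dataset and an off-the-shelf classifier with default settings. It is exactly the sort of thing that is great for learning, and exactly the sort of thing that shouldn’t be packaged up and sold as an ‘AI solution’.

# A minimal "hacked together" model, purely as a learning exercise.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a toy dataset that ships with scikit-learn
X, y = load_iris(return_X_y=True)

# Hold some data back so we can see how the model does on unseen examples
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Fit an off-the-shelf classifier with default settings
model = RandomForestClassifier(random_state=42)
model.fit(X_train, y_train)

# A single accuracy number on a toy dataset says nothing about fairness,
# robustness or the harm a deployed system could cause
print("Accuracy:", accuracy_score(y_test, model.predict(X_test)))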

Aggregating data and presenting it to users online has disrupted many industries over the last two decades. Websites such as Compare the Market (comparing and buying insurance) and Skyscanner (finding, comparing and booking flights) save us from trawling through lots of different shops or websites to find the options that best suit us.

Airbnb, ASOS and online food shopping mean that we can choose to make purchases without physically needing to go anywhere. Google Maps helps us find where we need to go. Skype allows us to video call our loved ones from across the world.

These online services make our lives more convenient and are free, so we use them. They have succeeded in removing an intermediate layer. We don’t want to go back to paying to speak to our relatives abroad via landline, but we don’t tend to question whether our Skype video calls are being used to train facial recognition systems (Microsoft President Brad Smith wrote a blog post in 2018 calling out the need for public regulation and corporate responsibility for facial recognition technology).

We have got so used to consuming this technology that it’s virtually impossible to go back. People have been unpleasantly surprised by how companies have used their data, which has resulted in backlash. In 2012 a 26-year-old bar manager was refused entry into America because he tweeted “Free this week, for quick gossip/prep before I go and destroy America”. He meant ‘destroy’ in a partying sense, but it was read as a cause for concern and he was sent back to the UK.

Nello showed us an example of Facebook banning the insurance company Admiral from pricing car insurance based on the content of users’ Facebook posts. Admiral claimed that they were experimenting with ways in which young drivers could prove how sensible they were. The idea was that young drivers opted in to sharing their Facebook data and could save up to £350 if they appeared sensible. According to Admiral (…well, actually the BBC News article I read about the project…), being sensible involves “writing in short concrete sentences, using lists, and arranging to meet friends at a set time and place, rather than just ‘tonight’”.

Being assessed based on your online activity throws a spanner in the works of the theory that open and transparent models will eliminate bias. If you act in a certain way to meet the criteria of a model, then you can’t live as your authentic self. Nello gave an example of Uber drivers reporting that they felt they had to take abuse from customers in order to maintain a high rating under the scoring system. We don’t have an ID card scheme in the UK, but for countries that do, could various scores be added to represent people based on their data? Could they be denied opportunities because of a score that may not accurately represent how ‘good’ they are?

Making sense of the digital world

These issues are very complex. GDPR, adopted in 2016 and enforced from May 2018, helps by setting out regulation in EU law on data protection and privacy, but as users we will often just accept the cookies and the unknown consequences.

Listening to Nello has reinforced to me that it is important to check facts and to question proposed ideas. If an article in a magazine claims that a system is biased, look to back this up with academic evidence where possible.

Just because the same piece of code can be applied to make predictions for completely different use cases, it doesn’t mean that it should be. It is important to consider what the harm could be in developing machine learning models. The pair of jeans that ‘follows me around the internet’ after I have viewed it on ASOS inevitably causes less harm than a system that analyses immigrants to determine whether they are lying.

A system could analyse something better than random selection could, even better than a human could, but if it is ethically harmful then it shouldn’t be deployed.

It will be important for me to consider and test the quality of training data. In professional practice, I would assume that to select a dataset it would largely help to be an expert in that subject matter, or to find people who are. There is a popular computer science saying: ‘garbage in, garbage out’. It is also important to check that using the data complies with privacy regulations.
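To make that a little more concrete, here is a rough sketch of the kind of basic sanity checks I have in mind, assuming a hypothetical pandas DataFrame with a ‘label’ column (the names and the tiny example data are my own, purely for illustration). Real quality assessment would also need subject-matter expertise and a privacy review on top of anything like this.

import pandas as pd

def basic_data_checks(df: pd.DataFrame, label_column: str = "label") -> None:
    # Missing values: garbage in, garbage out
    print("Missing values per column:")
    print(df.isna().sum())

    # Duplicate rows can silently inflate apparent model performance
    print("Duplicate rows:", df.duplicated().sum())

    # A heavily imbalanced label column can make accuracy misleading
    print("Label distribution:")
    print(df[label_column].value_counts(normalize=True))

# Tiny made-up dataset, just to show the checks running
df = pd.DataFrame({
    "feature": [1.0, 2.0, None, 4.0],
    "label": ["a", "a", "a", "b"],
})
basic_data_checks(df)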

There isn’t a cookie-cutter solution to fix data ethics. Far from it: for all the positive applications of AI technology, there can be negatives. A possible step towards a solution could be for organisations to have their models reviewed by an internal or external body of experts that would thoroughly investigate the ethical concerns of any AI technology before it was deployed. On the other hand, there is the fear that this could stifle innovation.

From the perspective of this blog, for people like me who are just getting started: technical skills are obviously important, but actively educating yourself about the legality of what you are building, and assessing the harm it could cause, will be vital once you reach the stage of producing deployable applications. I am looking forward to understanding more about data ethics, which I believe will greatly influence how I approach my studies.

Hair by Not Another Academy

On Instagram Stories, I saw that Not Another Academy were looking for models for root bleaches, so I messaged them straight away.

Not Another Academy is the training arm of Not Another Salon on Brick Lane. In my opinion it is the best salon ever – lively, bright, eccentric, fun and they specialise in brightly coloured hair. The salon is owned by girl boss Sophia Hilton, who was running the academy alongside hair expert Norman on the day I attended.

The stylists that worked on my hair, @nicolakristel_hair from Portsmouth and @colourbysambluz from Scotland, were very experienced and talented. They made the day really enjoyable and did a fabulous job.

Before: 

After:

I was probably there for nearly 7 hours, but it went fast and gave me some much needed time to relax and read my Kindle. Despite it taking a while, mostly due to waiting for things to start and for the stylists (there were around 10) to view each other’s consultations, it was definitely worth it.

Putting yourself forward as a hair model is also much cheaper, which really helped me as I was paying towards 3 lots of accommodation in London in September. Looking out for these opportunities can really help if you have high maintenance hair and are on a bit of a budget. Otherwise, the regular Not Another Salon are very transparent about pricing and you really do get what you pay for.

It has been a couple of weeks now and the colour has faded to a nice peachy rose gold. The salon gave me some pink dye which I can mix into my conditioner and leave on my hair to top up the colour when needed, which has turned out to not be too messy (my main concern with coloured hair: destroying the carpet with dye!).

If I do need to go back to having more conservative hair for any reason, then it will eventually fade back to blonde. Some colours such as blue or green tend to stain the hair, but pink is pretty easy to try out for a bit and then change if needed. I am definitely really enjoying it for now anyway.

uncodebar

uncodebar is codebar’s annual unconference. This year it was hosted at Twitter and gathered 86 developers from our codebar community.

At an unconference there is no specified agenda, meaning that speakers are not booked in advance to present. Instead, at the start of the day, people that attend the event pick up the microphone and pitch sessions that they want to run. There is a show of hands to determine which room size the session will require and it is added to a time slot on the schedule on the wall.

The sessions usually take the format of either a talk with Q&A, a hands-on coding workshop, or a group discussion around a set topic.

The schedule that took shape as a result of the pitches:

I took part in a session about running community meet-ups, saw a talk about ‘the art of saying no’, learnt about the highs and lows of @thisisjofrank’s project in which she created a tweet-controlled LED wedding dress (it was AMAZING, find out more in Jo’s post here), saw a thought-provoking talk about software and ethics by @richardwestenra, and finally a talk about coaching software developers.

Honestly, go along to an unconference if you can. You never know quite what you are going to get, but that is part of the fun. In this post about a Civil Service unconference, Claire writes about the value of moving away from having “speakers” and “listeners” as the collective knowledge of the audience is likely to be more than that of any one speaker.

Huge thanks to the codebar organisers that put this together (I can take no credit as I wasn’t involved in organising this… just attended!). They did a brilliant job. I left feeling very proud of the codebar community and look forward to next year.

Break Into Public Speaking

I had the pleasure of watching the final presentations from the ‘Break Into Public Speaking’ workshops that I co-organise at work for other people from minority backgrounds.

All the talks were brilliant and covered a range of subjects, some examples of which were the importance of including LGBTQ participants in user research, travelling alone with a disability, and using agile techniques to help organise your home life.

Check out Lucy’s blog post to find out more about the workshop series.
