6/6: Ethics http://www.bbc.co.uk/programmes/b04p7yg3
If a driverless car has to choose between crashing you into a school bus or a wall, who do you want programming that decision? Aleks Krotoski explores ethics in technology.
Join Aleks as she finds out whether it's even possible for a device to 'behave' in a morally prescribed way, by looking at attempts to make a smartphone 'kosher'. But nothing captures the conundrum quite like driverless cars, and it's the questions they raise that she explores with engineer turned philosopher Jason Millar and robot ethicist Kate Darling.
Professor of law and medicine Sheila MacLean offers a comparison with how codes of medical ethics were developed, before we hear the story of Gus, a 13-year-old whose world was transformed by Siri.
- Looking 'behind the curtain' at the people creating our digital world, and at how their morality feeds into what they build … especially in decisions about the functionality of driverless cars.
- Facebook's social contagion experiment – http://www.forbes.com/sites/gregorymcneal/2014/06/28/facebook-manipulated-user-news-feeds-to-create-emotional-contagion. Scientists tend to see themselves as 'providing' cut-and-dried solutions, and need a better understanding that their work affects social interactions – they are making ethical decisions.
- Ethical positions are themselves difficult decisions, but encoding them into technology is harder still – e.g. an experiment that created automatic traffic tickets found that, although the law seems cut and dried, different numbers of tickets were issued depending on how the programmers interpreted it (see the first sketch after this list).
- In a driverless car, it is the machine that must decide whether the car will hit a baby's pram or a wall.
- What sets our internal moral compass comes from culture, upbringing or a higher power… but it's hard to translate an ethical position into a technical specification.
- A certain number of things are "easy", but creating a kosher phone for the Orthodox Jewish community was not – the developer initially thought it 'pointless but harmless'. 'Kosher' means determined 'fit for purpose' by a rabbi: many apps and much of the connectivity are removed, and the developer sees the result as an 'extreme porn filter', screening out anything that could damage or distract, or that goes against the 613 commandments of the Torah. The longer development continued, the more complicated it became, especially as the rabbis would not engage with the developer directly.
- Do you make the phone shut down automatically on the Sabbath, or do you 'trust the user' in a self-policing community? (See the second sketch after this list.)
- How responsible is the developer for the fact that the phone can no longer be used to contact anyone outside the community?
- Driverless cars have logic programmed in for when a crash is unavoidable. Engineers must understand that this is not just a technical problem but an ethical one. Drivers may not want to trust the developers' choices – but would they want to press 'wall' or 'child' in a settings menu themselves, and does that take it too far? (See the third sketch after this list.)
- How has medicine dealt with some of these questions? Many of its ethical frameworks were developed after the Nuremberg trials, where the Nazis' overriding of moral constraints was exposed – and now it is often the patient who makes the decision, rather than the technologists.
- Gus and the power of Siri: it can give him endless answers to his (endless) questions, though he has to phrase them very clearly. This shows the possibilities of social bonding with machines – extra routes into social interaction (here, for a child with autism). Siri isn't designed to serve this function, though, so we need to be sure it isn't replacing human interaction. The machine isn't the friend – it's the bridge to friends.
- Human/robot interaction – we respond to the lifelike cues machines give us even though we know they are robots, so can designers manipulate those cues to elicit particular behaviour? Companies could exploit that emotional attachment (e.g. via a compulsory upgrade). Kate Darling calls for social scientists, philosophers and others to be brought into these decisions: interdisciplinary input matters, because these are not just technological decisions (as bioethics has shown for years).
- Should ethics boards be required? Google only has one because a company it acquired made it part of the deal – it is difficult to incentivise companies to set them up.
- We don't want to do what much software development does – release it, see if it works, and adapt as it goes wrong. Some issues are too high-stakes for this.
- The kosher phone supports agency rather than draining it, helping its users lead an 'observant lifestyle'.
- We tend to align ourselves with the moral codes, rules and regulations of our societies – until they no longer match our moral compass. The same goes for technology, but because we see technology as nothing more than 'neutral', we don't notice when the ethics embedded in it stand in contrast to our own ethical worldview. Do we hand our choices over to ethics boards at tech companies, or do we force those companies to be more transparent about their decisions?
- If the ethics work smoothly, it's a product people will embrace – you make a better product.
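
To make the automated-ticketing point concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the speed limit, the tolerance, the data): two programmers implement the same speeding law, and the same observations produce different numbers of tickets.

```python
# Hypothetical sketch: the same speeding law, implemented two ways.
# All names, thresholds and data are invented for illustration.

LIMIT_MPH = 30

def tickets_strict(speeds):
    """Programmer A reads the law literally: any speed above the limit."""
    return sum(1 for s in speeds if s > LIMIT_MPH)

def tickets_tolerant(speeds, tolerance=0.10):
    """Programmer B builds in a common 10% enforcement tolerance."""
    return sum(1 for s in speeds if s > LIMIT_MPH * (1 + tolerance))

observed_speeds = [28, 31, 32, 33, 34, 36, 41]

print(tickets_strict(observed_speeds))    # 6 tickets
print(tickets_tolerant(observed_speeds))  # 3 tickets: same law, same data
```

The divergence, not the numbers, is the point: both functions are defensible readings of the same rule, so the interpretive choice is an ethical decision made by a programmer.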
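
The Sabbath question from the kosher-phone bullets similarly reduces to a single flag. A hypothetical sketch, assuming a crude weekday check (a real implementation would follow the religious calendar, since the Sabbath runs from Friday sunset to Saturday nightfall and varies by location):

```python
from datetime import datetime

def is_sabbath(now: datetime) -> bool:
    # Crude stand-in for illustration: checks the calendar weekday only.
    return now.weekday() == 5  # Saturday

def allow_use(now: datetime, enforce_shutdown: bool) -> bool:
    """One flag encodes the ethical choice: lock the phone on the
    Sabbath, or trust the user in a self-policing community."""
    if enforce_shutdown and is_sabbath(now):
        return False  # the phone refuses to operate
    return True       # the user is trusted to observe the rules

# 2024-06-08 is a Saturday: the same moment, two ethical designs.
print(allow_use(datetime(2024, 6, 8, 10, 0), enforce_shutdown=True))   # False
print(allow_use(datetime(2024, 6, 8, 10, 0), enforce_shutdown=False))  # True
```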
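
And the driverless-car bullet's worry about pressing 'wall' or 'child' can be made literal. This hypothetical sketch (all names invented; no real vehicle exposes such a setting) shows why handing the choice to the driver may take it too far: the ethical decision becomes a configuration value read back at crash time.

```python
from enum import Enum

# Hypothetical sketch of the 'wall or child' setting discussed above.

class UnavoidableCrashChoice(Enum):
    WALL = "wall"    # swerve into the wall, putting the occupant at risk
    CHILD = "child"  # hold course, protecting the occupant instead

def crash_target(setting: UnavoidableCrashChoice) -> str:
    # At crash time the controller just reads a preference; the ethical
    # decision was made earlier, by whoever set the menu (or its default).
    return setting.value

print(crash_target(UnavoidableCrashChoice.WALL))  # 'wall'
```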
And see the Tumblr associated with the programme.