Today I was due to speak at St John’s College, Durham, as part of a panel addressing Fake News, but have been unable to make it, so I’m sharing some of my (unfinished) prep here. The reason:
— Dr Bex Lewis (@drbexl) August 31, 2017
I listened to an interesting panel on fake news at Greenbelt on Monday, a few thoughts from there:
- Reuters research puts trust in the media nationally at around 41%, although another survey (Edelman?) said only 24%, so the opening question was: should Britain distrust the media as much as it does?
- Vanessa Baird from New Internationalist: It’s good to use critical judgement in assessing the media, and not surprising that people distrust it after e.g. the phone-hacking and expenses scandals. She feels the Leveson process delivered too little accountability and left too many cosy relationships. Blanket distrust, however, is dangerous, so ‘we should support those who are doing a good job’. Follow the £ and see who is paying for stories. Be particularly aware of native ads, advertorials and sponsored content, and ‘how lies are hugely profitable’.
- Peter Oborne, from The Spectator/Daily Mail, etc., with 30+ years on national newspapers, feels that newspapers hold politicians to account and expose scandals, and that there have always been attacks on the press by the rich and powerful. He doesn’t want state regulation of the press or Hacked Off, and thinks fake news is largely caused by the internet and social media.
- Jack Monroe, who now writes for The Guardian, got into journalism by accident. She says there are many stories going untold, that accuracy and fact-checking are the key duties of a journalist, and that journalists should hold each other to account – triple-sourcing everything. She reads 7 newspapers a day to understand what other angles are being taken – a duty to tell the truth, even if it’s not our own truth – what we write has an impact on real people’s lives.
A lot of discussion about how far it is possible to fact-check, and where the boundary of responsibility between facts and opinions lies. Journalists can’t always meet people face-to-face to check, so they often rely on others on the ground to check stories out, or use NGOs (PO asked whether NGOs have an agenda; VB said they are not used verbatim but are A SOURCE – hence the importance of citizen journalism). It also depends on whether you are doing analysis or breaking news.
The bit I was really interested in – social media – PO said that he thinks social media has been disastrous, not the democratisation first expected, and that it’s undermining the business models of the newspapers, who now can’t afford staff. I couldn’t work out whether there was admiration for Trump in that he can bypass the national newspapers and reach straight out to his supporter base via Twitter. JM was also not a fan of social media – that we exist in an echo chamber, talking to people we like/agree with, sharing stories without checking the source, and reinforcing our existing viewpoints, whilst cherry-picking from a range of sources. VB said there is potential for democratisation, but it’s not really going that way – the echo chamber gives the illusion of diversity, but often reinforces things first published in the tabloid press – opinion pieces often set the tone for what is seen in social media.
I am more in tune with this idea from earlier in the festival, but it does take an active engagement with who you follow, rather than defaulting to an echo chamber:
— Dr Bex Lewis (@drbexl) August 25, 2017
VB raised the question of the responsibility of digital platforms, which pretend not to be publishers but are mechanically responsible. She says they ‘escape scrutiny’, ‘keep us hooked’, and are ‘generous with their support’ in keeping us online. The freedom of the internet, how the platforms are funded, and what they fund all need to be looked at. PO then referred to ‘the social medias’ and ‘the facebooks’, and emphasised that Trump’s tweets are ‘politically meaningful’ and can’t be ignored.
I was asked to prepare the following questions, although hadn’t got quite that far in detail:
- What is the impact of fake news on your sector/businesses? (Education or church? It has affected every sector.)
- Do we have a responsibility to manage fake news? Yes – every one of us has a responsibility to look at what we do.
- Is there a distinction between big scale media/corporate and political use of news/sharing and how ordinary citizens use it? Different scale – big companies have more impact and more resources so need to take more responsibility (including the social media platforms); each of us has a responsibility as each of those small shares, etc. that we do makes a difference (see slide 20 re what makes us share).
- What are the potential implications of a post-truth society in the future? Or are we already in it? It feels like we’re already in it, but how different is it from historical eras (see this Telegraph article)? Media studies, etc. should be back on the menu and become part of PSHE; each of us needs to look at what/how we are sharing, and to call for appropriate regulation, etc. Fundamentally this is about culture, and tech is human-created.
Note, as a historian, I would say that the notion that news has ever been without bias doesn’t work, although digital may have assumed a different scale, and Brexit/Trump have brought ‘fake news’ into the mainstream.
I am struggling a bit with the continual equation of ‘fake news’ and ‘propaganda’, as propaganda (as you’ll see from my PhD) is one of my research areas. Propaganda is officially ‘neutral’ (as is social media); it is how it is used that is important (just as social media has certain affordances/constraints). Propaganda became a negative word in Britain after WW1, when people realised how much truth had been withheld, and how they had been fed ‘fake news’ about ‘the hun’, but in other cases it is about the angle that is taken, which is not always negative, so I want some more thought put into the distinction.
Whenever I’ve been asked for comment in the press, I typically get a call from the producer/journalist wanting to know my angle and checking whether it fits with what they want or what they already have, although they will also (mostly) change tack based on what I might say, if it’s not something they have already considered.
Picked up Post-Truth: How Bullshit Conquered the World by James Ball (Biteback, 2017) at Greenbelt too, though I haven’t read it yet.
I’d been collecting stories on Wakelet (and some academic articles stored in Dropbox although didn’t get to those either). Some of the thoughts I picked up from reading those:
- ‘Fake News’ took off as a term around Trump’s first press conference as President-Elect (January 2017, following his November 2016 election), tied up with Brexit/Trump versus ‘the establishment’/conspiracy theories – all gaining massive media attention.
- 2016 saw a different scale to ‘propaganda’, etc., as social media interactions and algorithms amplified hyperbole around the US election – polarised/polemic/global.
- Distorting truth for emotional persuasion and action. Not always political, often an economic agenda for making £/$ from content sharing.
- The internet has lowered barriers to entry, enabled building trust with audiences, and made law/regulation harder to enforce.
- Telegraph: “… in an atmosphere where you never know what might happen next or what to believe, you’re going to be more receptive to hyperbole and truth distortion.”
- Importance of headlines, especially for online news – Buzzfeed demonstrated how fake stories outperformed ‘real news’. Incendiary stories gain attention, strong headlines get shared/go viral, and even if Facebook, etc. shut down a story, it is easy/cheap to set up another.
- 5 types of fake news: intentionally deceptive; jokes taken at face value; large scale hoaxes taken seriously; slanted reporting of real facts; no established baseline for truth.
- Did Facebook, etc. win the election for Trump? Difficult to measure, but academic research has shown that a fake Facebook article would need to be as persuasive as 36 TV ads to have swung it, so a ‘menace not a game changer’ – but there are real concerns for countries where the far right is growing.
- There are a lot of independent fact-checking organisations, e.g. Snopes, factcheck.org that should be used.
- Avoiding filter bubbles takes work, but: use a diversity of media sources; use search and other sources; seek out alternative perspectives that you disagree with; don’t unfriend/block those you disagree with (except trolls); be sceptical and fact-check. Nesta believes that those more interested in politics are typically more skilled at this and should coach others – online networks can be more diverse than face-to-face – emphasising individual responsibility.
- There is no easy technological ‘quick fix’, but the global tech community can look at methods – it needs human thought. There is a difference between stories with some truth but based on unreliable sources, and deliberately false stories designed to mislead. We need to fact-check (as we should always have done), crowdsource responses via e.g. Snopes, news outlets as a whole need to look at what they do, use AI/machine learning to highlight stories that may need human fact-checking (e.g. Robocheck from Fullfact), and we need reliable open data sources. More use of Creative Commons, GitHub, etc., building on each other’s work rather than privatising everything. Digital literacy and education are key. Look at big data, and consider tech both a cause and a solution.
- Tony Watkins considered some biblical verses that are relevant (and has written other stuff). He talks about social media as a news source and growing distrust of the establishment, and the scale of possibility – anywhere/anyone – Trump was blatantly untrue but resonated. The way that revenue has become so key for newspapers online means virality has become more important than truth/quality. There are fake-news sites/farms set up in (Eastern) Europe which are quick/easy to set up/duplicate/share, etc. Clickbait and headlines are key, and careless/unverified reporting is problematic. 23% of US adults in a Pew survey had shared fake news, knowingly or not. Sharing = social proof, often emotional, a cascade of information that can be difficult to stop. There is a danger of confirmation bias – we reject the thinking that challenges us, and Facebook algorithms, etc. can reinforce that. The church has a prophetic role to hold to biblical truths, and to seek truth in the world.
- Care with automated accounts, bots, etc. Bots can be used harmlessly, but Twitter, etc has a responsibility to look at how their platform is used. If lots of fake accounts/bots push the same story fast, it gives the impression that that story has more support than it actually does. Their current power lies in the fact that few know that they exist.
- Government is looking into it this year, differentiating between echo chambers (own viewpoints) and filter bubbles (automated content), as large numbers are getting their news online, with a lot of reliance on algorithms, although it also highlights human behaviour. Research shows that it’s not that simple and we are still exposed to attitude-changing information, with diverse social networks spread over widespread geographical regions. There is no clear definition of fake news, but the Government wants action to be industry-led, drawing upon regulation and user education.
- Content hosts have a responsibility: verification processes; banning false sites; advertisers controlling where ads appear; removing £ incentives for false information; media literacy for consumers as content distributors.
- Internet/social media amplify the speed/reach of information, and in speed reading can forestall critical engagement. We have moved from a commitment to storytelling to content marketing ‘where distortion is a priority’. Soundbite clips get well shared, rather than full documentaries, but YouTube/Netflix, etc. give access to a wider set of voices which mainstream broadcasters don’t really take up (e.g. re Vaxxed/MMR/autism). In the Lumière brothers’ first film, would factory workers leaving the factory be ‘fake news’, as it was ‘not real’? The conflict between the free press and authoritarian leaders is age-old. Post-truth is not a new thing, but it entered mainstream use in 2016. Broadcasters, etc. can influence the ‘truths’ audiences see through their editorial and commissioning choices.
- Bubbles are not new to social media – we’ve always mixed within certain class, gender, etc. bubbles … the 2-step flow model – key opinion leaders pick up, share and amplify a story – as they have always done. Social media can make the message and how it is created very visible, unlike the press. Novelty, speed and shareability… well-researched and documented material tends to be less shared! Has digital lost some of the clues that the physical form gave in distinguishing fact/fiction? Retraction is not really visible – fake news can just ‘disappear’. Social media can amplify spread – choices by system designers and regulators rather than tech affordances! Complex – but how do we learn to read the news? Why do we not regulate social media? How do you fact-check before sharing?
- The system rewards clicks by volume, so more sensational news wins… programmatic ads alongside bogus content still earn ££, so this needs looking at… fake news is cheap to produce and financially rewarded.
Digiexplorer (not guru), Senior Lecturer in Digital Marketing @ Manchester Metropolitan University. Interested in digital literacy and digital culture in the third sector (especially faith). Author of ‘Raising Children in a Digital Age’, regularly checks hashtag #DigitalParenting.