On Saturday 10th March, Culture Secretary Matt Hancock spoke in The Times(£) about social media being ‘broken’ and how he wanted to fix it before his children went online (13 being the legal age limit for most social media accounts, although WhatsApp is officially 16). This is the segment I was sent to respond to:
He also wants a new code of conduct to ensure that children access only age-appropriate material online. There could be age ratings, similar to film classifications, for websites and different sections on platforms such as YouTube for over-18s, over-15s and over-13s. Parents could then set internet filters at home based on their children’s age. And, most radically of all, he is considering mandatory limits on the amount of time children spend on certain sites. “For an adult I wouldn’t want to restrict the amount of time you are on a platform but for different ages it might be right to have different time cut-offs,” he says. “I think there is a genuine concern about the amount of screen time young people are clocking up and the negative impact it could have on their lives.”
Families, he insists, need more support from the state in navigating the online world. “This technology is new — the smartphone was invented in 2008 — and we are the first generation of parents to have gone through this.” His 11-year-old daughter doesn’t yet have her own mobile phone and he admires schools that “take an absolutist approach” to banning them. Parents should also set an example, he says.
“At home we have a box in the kitchen where we put our phones when we go into the house . . . Part of this is putting laws through and some of it is about how society behaves and grows up. Children must be taught how to manage their online lives, including data protection and dealing with cyberbullying. I am extremely worried about sites that, unintentionally largely, promote anorexia. We are putting resilience in the digital age into the curriculum.”
My slot got cut a little short (and ran 20 minutes late) because Vince Cable’s Brexit comments were breaking, but here it is:
I was expecting to be able to respond to the other comments, so I’m sharing a few more thoughts that I noted in my preparation:
- I like the intent behind it, and definitely think companies and the state have a responsibility, alongside parents/friends/family/schools/churches, to look at how best to enable digital health. But the time-limits idea – especially mandatory limits – doesn’t work for me. Recent research focuses on the quality of what is being done online rather than the amount of time spent, and screen time is also an excellent tool for, e.g., disabled/autistic kids.
- The anti-diet group I work with has a mantra of ‘be your own guru’: what works for you may not work for me, so people put forward suggestions, others try things, and if they work, they do them again and build them into habits. Healthy digital literacy, conversation with your children, spending time with them, and understanding their digital environments makes much more sense than an arbitrary time limit (much like the 100-calorie limit on snacks).
- No one yet seems to have found a way to verify age online, which is problematic for that part of the proposal. I write in my book that parents letting their children join before a platform’s minimum age sends the wrong message about other rules, but equally the platforms can’t throw their hands up and say ‘we say you have to be 13’ while ignoring all the research showing that large numbers of children are on before then…
- On the ‘in a box’ approach at school – I’d rather children were shown how to use phones well, so a mix of classes in which the box comes out and others where phones are permitted. The same goes for home: I have friends who put all their phones on a shelf at home time/dinner time, and everyone’s phone has to be charged downstairs overnight, etc. – but people need to find what works in their particular situation…
I had a chat on Facebook (visible only to friends), in which people contributed a range of perspectives:

- tales of children with Asperger’s, for whom the screen removes the stress of face-to-face interactions;
- when people are ill (at any age), it’s a really helpful tool;
- the possibility that children are able to self-regulate their use;
- social media is used to keep in contact with friends who have been left behind;
- children use the internet quite differently from adults, and we shouldn’t assume it’s all ‘bad’ – remember the things we got yelled at for (putting down a book before dinner was a thing that happened!);
- and one that I loved – a child who, when told to put the phone down, said “can I just please finish helping my friend with a bit of a situation”.

Remember also that digital and social media are much more interactive than TV watching – and can often be very creative!
The black-and-white nature of fixed screen time doesn’t work for me at a philosophical level, let alone a practical one: how do you manage it across multiple devices, what is the impact on privacy, and how many resources would be needed to ‘police’ it – resources that could be better invested in digital literacy courses? There was also a reference in the interview to ‘digital natives’, a term I’m not a big fan of; I much prefer Dave White’s ‘visitors and residents’ theory.
The narrative raised here speaks to fears that have been gathering pace over the past few years, including the stories from early investors in Facebook and others. There is no doubt that the companies need to look at their responsibilities, and that the government should look at where it can step in, but assuming that ‘screen time’ is negative and that it is all the same is unhelpful. There are still hangovers of the notion that the internet is ‘virtual’ and a ‘wild west’ – some bits are – but we can be teaching each other (let alone children) positive behaviours there. A common cry is “I’m addicted to the internet”. I’ve spoken about this several times: addiction is a medical condition with particular traits. There is no doubt that, e.g., games developers build in addictive mechanics that need regulating, but for many people these are simply bad habits that they could experiment with changing – e.g. a friend of mine didn’t pick up her phone until 9am for a month to see what that did – remembering that what works for some won’t necessarily work for everybody!
I gave a talk on resilience a few years ago, seeing resilience as the ability to bounce back after being squashed in some way, and found this quote from The Guardian interesting: “but the concept of resilience now applies more frequently to humans and has evolved to understand “toughness”, less as rigidity and more as elasticity. In other words, resilient humans need not be indestructible brick walls that the wolf can’t blow down, but more like trees who can bend and sway with the wind.”
Another quote, from Canada:
“If we want resilient kids we need to understand what young people’s experiences are online, listen to their concerns, and intervene with their best interests in mind.” – Jane Tallim, Co-Executive Director, MediaSmarts, in an article titled ‘Young Canadians need less surveillance and more mentorship online: national study highlights the important role of adults in kids’ online lives’, which also notes: “Zero-tolerance policies don’t work. Encouraging trust and open dialogue is the best approach, particularly when dealing with mean and cruel online behaviour.”
Another quote from Andrew Tomlinson at the BBC:
“We’re doing this because all the research tells us that children and young people respond best to their peers. Whether they’re under pressure to take part in a dangerous prank, or to victimise someone, or whether they’re an online bully themselves, stories told by other young people are most likely to resonate and to help them cope, or change their behaviour.”
Most of the fears are tied to technological determinism, which assumes that the technology is at fault rather than the surrounding culture: the people around it who decide what is acceptable, what gets made, which algorithms run, etc. We need to look more at the ‘social shaping’ of technology – the affordances and constraints it gives. Social media offers a lot of space for conversation and listening, although, as with offline conversation, people usually rush to post and say their own thing!
The use of, e.g., social media agreements can work at home, school, church, etc. There is an assumption that recognising the positives of social media means that advocates put no thought into it at all, but my book’s subtitle is ‘enjoying the best, avoiding the worst’, as we seek to use wisdom (and that doesn’t necessarily mean ‘digital by default’). As with all of life, life online is not risk free, and what works for one won’t necessarily work for all (e.g. children with unlimited access to chocolate learn that they can take it as they wish – the need to use something ‘bad’ disappears).
We need to see it less as an either/or of online/offline, and more as a blend – how do they all work together? What can each form of communication do well?
How would I advocate for parents?
- Don’t just chuck them in, any more than you would at the pool or park – life is not risk free
- Conversation – get them to show you …
- At an early age, ring-fence what they can see; as they get older, give them more space/responsibility – note that 13 is the minimum age for most social network sites (16 for WhatsApp)
- Age verification doesn’t yet work…
- Online life amplifies offline life – so if you’re already tending towards depression, etc… (correlation/causation) …
Assuming that switching off is always the answer is not particularly helpful (I’m collecting stories about screen time/addiction on Wakelet). We need to look much more at the surrounding culture and what encourages people to behave negatively online, and consider whether the boundaries we put on children’s internet use are helpful (and our own, for that matter). Tech should not be a free-for-all with children staying up all night using it, but penalising those who use it well because of others isn’t helpful! It’s a big debate that has been going on for years and has reached a wider layer of public attention recently – and it’s a subject everyone has an opinion on. We need more research, such as that undertaken at LSE.
At the last Premier Digital Conference I spoke on this topic; slide deck here: