This morning it was announced that Ofcom will have new regulatory powers with regard to online safety. See BBC, Guardian, Financial Times.
I’ll be on Premier Radio around 1.20pm speaking about this. (Added afterwards.)
My thoughts so far:
- Cautiously optimistic (as the details are not yet clear) that this can be part of the mix of making digital a positive place to be.
- Wary about the global nature of technology, so it will be interesting to see how much power one nation has to make this work, but there are already examples from Germany and Australia (largely involving fines/imprisonment). Social networks need to make money, so fines hit the bottom line. It may not be that straightforward though: a 0.5% tariff was suggested last year, but Trump complained that this was a US-focused tax, as most of the platforms are based in the USA.
- We need care to ensure that we do not end up like China, blocking platforms, and there are concerns about ‘backdoor entry’ to access people’s data.
- It makes sense that it’s Ofcom that does this, as it already regulates other media formats (although e.g. the BBC is UK-centric).
- It responds to the government’s online harms document released last year (see my conversation with UCB last April), and ensures that platforms do not have an unregulated free hand or escape responsibility altogether (but how enforceable is this?).
- Looking for more transparency/accountability from platforms, and in the process of regulation itself. There is no magic bullet, so we still need to take our own responsibility for digital literacy, what we share, articulating what we want from platforms, etc.
- Want to know who is setting the rules, and that it’s based on good research data (e.g. the LSE report on children released yesterday found more risk, but no evidence of more harm), and not just responding to media panic (as we’ve had with every new technology).
- AI is already being used by tech companies to help increase their profits, so it’s good that it should also be put to use (if it isn’t already) to increase the benefits of using platforms. The government mentions violence, terrorism (something that HE institutions are concerned about policing), cyber-bullying and child abuse; photo-recognition software is one example (though it has caused problems for e.g. mastectomy photos).
- They are seeking to ensure that content is removed quickly, and to minimise the risk of it appearing at all. If digital spaces are fast to take content down, it becomes pointless to post it (though it can quickly be reposted).
- There are still more details to come in the Spring, so there’s lots still to know. There are queries about VPN workarounds, etc.
- There’s a question of a duty of care; there is more than £ at stake here.
- The online world was never supposed to be a wild-west free-for-all. Current legislation still stands, but as much legislation is national, this can be problematic.
- Query whether this is ‘policing’ or ‘encouraging better practice’.