According to Nic Newman, of the Reuters Institute for the Study of Journalism, around 58% of people surveyed in the UK say they’re concerned about being able to separate what is true from what is not on the internet.
That was one of the interesting facts which came out of the recent Westminster Forum conference on fake news and misinformation. I intend to write more about this in a future issue of the Digital Education newsletter, including a review of Newsguard (see below for a brief heads-up). In the meantime:
The agenda for the conference may be found here: Fake news conference.
If you would like to buy the transcript of the conference — both talks and Q & A sessions — here’s the link for that: Ordering details for the transcript.
In the interests of transparency, I should like to say that I was given a complimentary media pass to attend the conference. That has not affected, and will not affect, my evaluation of it.
I think the Department for Education could do a lot more to help in equipping young people to recognise fake news and other forms of misinformation. I submitted an article outlining my ideas for inclusion in the transcript. That’s been accepted, and I have also included it below.
Just before that though, allow me to draw your attention to Newsguard. This is a service that evaluates websites against 9 criteria, and uses real live people (journalists) to do it. It looks like it could be dead useful in a school setting — you can add it as an extension to your web browser. It uses a traffic light system to tell you whether a site you’re on is legit or not — assuming that Newsguard has examined it of course. I’ll give it a more thorough treatment in Digital Education, so be sure to sign up for that!
And now, here’s my article suggesting how the Department for Education could help.
Fake news: role for the DfE?
Although the issue of fake news is a global concern, the Department for Education could make a difference. How? *
First, government documents, especially those emanating from the DfE, should be written in clear English, because that would make misinterpretation more difficult. For example, the recently published Education Technology strategy might have done better in this respect by avoiding 'empty calorie' phrases such as 'drive the delivery of this strategy', and by setting SMART targets rather than vague ones. (See The New DfE Education Technology Strategy: A Textual Analysis.)
Secondly, in the context of fake news, there is already a healthy emphasis on thinking critically in the National Curriculum and elsewhere, even if 'fake news' is not mentioned specifically. The Programmes of Study in Computing, Citizenship, English and Mathematics each address the need for critical evaluation in their own way. In addition, the Teaching Standards state that teachers must establish a safe environment for pupils, while the recently published National Standards for Essential Digital Skills include the need to be aware of fake reviews online.
The DfE is far better placed than I am to join the dots and pull all of these strands together into a coherent policy on how to spot fake news.
Finally, given that dealing with misinformation is as much an educational issue as a technological one, it would be useful to hear the DfE's views on the matter at conferences such as this.
Don’t forget: more news and commentary about fake news in a forthcoming issue of Digital Education.
* Not included in the submission to the Westminster Forum: I am trying not to think about the DfE’s own contribution to fake news in the form of misleading statistics. See School funding 'exaggerated' by ministers, says watchdog.
If you want an excellent example of the triumph of hope over experience, look no further than the optimism surrounding driverless cars.