If you are willing and able to engage your class in discussions about the ed tech issues of the day, recent events in the USA provide rich pickings.
In summary: Twitter has deleted Trump’s account on the basis of what it regarded as incendiary comments, and Facebook has booted him off too. When many of his followers took to Parler, a Twitter-like platform that appears to police its content less than Twitter does, Google and Apple took the app off their stores, and Amazon, by withdrawing its hosting, in effect deleted Parler’s presence on the web itself.
It seems to me that a good discussion could be had about the following questions.
As a matter of principle, should anyone be “de-platformed” at all?
I think the best way of countering offensive speech is to have it out in the open. Sunlight is the best disinfectant. How can you argue against things you disagree with if you’re not allowed to see or read those things in the first place?
If some people should be de-platformed, shouldn’t the State do it?
Twitter et al are private companies answerable only to their shareholders. I am very queasy about the idea of matters of freedom of speech being decided by the self-appointed arbiters of civic duty who head these companies. What do you or your students think?
Are Twitter et al platforms or publishers?
The social media companies have always argued that they cannot be held responsible for what people say or do on their platforms, because they are not publishers. This position is also enshrined in US law, in Section 230 of the Communications Decency Act. But as soon as you boot someone off your platform for things they’ve said, you are, surely, behaving as a publisher.
What about discrimination?
If these companies are indeed de facto publishers, then mustn’t they be responsible for everything that is published on their platforms? I don’t see how they can be responsible for some things without being responsible for everything.
What about unintended consequences?
I mentioned earlier that I feel queasy about freedom of speech decisions being made by private companies. Lots of people are hailing the de-platforming of Trump with glee. However, a principle has been established. What if a more conservative group of people takes the helm of these companies, or their owners become more conservative as they grow older? Will they then de-platform Democrats, free speech enthusiasts, or anyone else they deem unworthy of being listened to?
Another, more immediate, example: Twitter, Google and Facebook routinely make it hard to find content about the pandemic that doesn’t conform to the World Health Organisation’s views, even when the dissenting views come from epidemiologists and other scientists. It may be laudable to quash fake news and misinformation, but surely there’s a difference between a group of people declaring that the whole pandemic is a hoax and a conspiracy, and a group of scientists who believe that lockdowns may not be the best way of preventing its spread? Is anyone at Twitter, Google or Facebook qualified to determine whether or not these experts in their field are correct?
Another point: the people who were spreading and sharing obnoxious views on Parler have now migrated in droves to Telegram, an encrypted messaging service. As a practical matter, even if you wish to silence people, how does driving them underground help? It doesn’t silence them; it just means the rest of us can’t hear them, which means we have less chance to argue against them. To me it’s the equivalent of sticking our fingers in our ears and singing “La la la”, in the manner of five-year-olds.
Further reading
An article you might find interesting is:
The tech supremacy: Silicon Valley can no longer conceal its power
I hope you have found this article useful, and that you enjoy fruitful discussions with your students.
If you found this article interesting and useful, why not subscribe to my free newsletter, Digital Education? It’s been going since the year 2000, and has news, views and reviews for Computing and ed tech teachers — and useful tips.