
Digital literacy unpacked launch event

On 8th November we were delighted to welcome some of our lovely authors to the Open University Library to celebrate the publication of Digital literacy unpacked.

Following lunch and cake, we were treated to a series of talks from five contributors to the book – Jane Secker, Clare Killen, Josie Fraser, Caroline Tagg and Geoff Walton – covering terminology, coaching, critical digital literacy, and creative use of digital practices in schools. These provided the ideal inspiration for the finale to the day, led by Mark Childs – a splendidly creative depiction of the student digital literacy journey, using Lego, plasticine or drawing.

Some highlights from the day are featured here. Presentations will be shared via this blog as soon as we can make them available.

Meanwhile, here are two videos that give a flavour of some of the topics under discussion:

Facebook, filter bubbles and fake news (Caroline Tagg and Philip Seargeant)

Mio, My Son (created by Danish schoolchildren and shared by Geoff Walton and Mark Childs) – the inspiration for our own versions of the student digital literacy journey.

A future post by Mark Childs will expand on some of the learning from the AMORES project. This found that school pupils’ motivation to engage with literature increased when they retold the story to an audience of their peers using e-artefacts.


Fake news and the need for ‘social’ digital literacy

Over the next few weeks we will be featuring posts from some of the contributors to Digital Literacy Unpacked. The first article is by Philip Seargeant and Caroline Tagg.

The phrase ‘fake news’ has been popularised in recent years as a result of changes in the political landscape, both in the UK and the USA. Lecturer in English Language and Applied Linguistics, Caroline Tagg, and Senior Lecturer in English Language and Applied Linguistics, Philip Seargeant, reflect on the conclusions of the recently published interim report on ‘fake news’ by the Digital, Culture, Media and Sport (DCMS) Committee, and the UK government’s response.

An earlier version of this post originally appeared on the OU News website on 1 August 2018.


On Tuesday 23rd October 2018, the UK government issued its response to the interim report on fake news, published by the Commons Digital, Culture, Media and Sport (DCMS) Committee. Much of the Committee’s report focused on the reform and regulation of electoral practices in the era of social media. However, as language and education specialists, we were pleased that it also recognised the need for ‘a unified approach to digital literacy’, including changes to the school curriculum and a public information campaign. The government’s subsequent response, while skirting many of the recommendations, did highlight its own commitment to ‘ensure that all citizens – not just those in full or part-time education – have the digital literacy skills needed to spot dangers’. In this article we reflect on the findings – and limitations – of this aspect of the Committee’s interim report, focusing specifically on the role that higher education institutions can play in tackling the phenomenon.

Exploitation of personal data for the purposes of propaganda

The DCMS Committee’s enquiry was set up in January 2017 to look at ways of combating the ‘widespread dissemination … and acceptance as fact of stories of uncertain provenance or accuracy’. As the enquiry developed, a particular concern for the Committee became the way that data is used and shared, and its exploitation for the purposes of propaganda. For this reason, much of the report focuses on the influence of technology – and the companies and organisations which work with it.

In an interview we conducted with the select committee’s chair, Damian Collins, he pointed to areas where he felt government was able to act to combat the current situation. These include transparency around online information, data protection laws to ‘check that [companies are] holding data in a way that complies with the law’, and ensuring that social media companies have a legal ‘responsibility to curate that space in a responsible manner’. These issues are now explicitly laid out in the report, with most recommendations involving changes to electoral law and regulation of social media companies.

Why do we share fake news?

But technology is only one part of the equation. It is also important to understand why people share false stories, and the effect this type of misinformation actually has on people’s actions. After all, the spread of misinformation online is related to how people use sites like Facebook – and this is shaped by the fact that Facebook is, first and foremost, a social space.

The report recognises this, citing the evidence we gave to the Committee in January that ‘to many people Facebook was not seen as a news media site, but a “place where they carry out quite complex maintenance and management of their social relationships”’. As our research shows, when people post to Facebook they potentially address a range of different social ties, from close family members to colleagues and acquaintances. Managing these various relationships all at the same time, while not offending or upsetting anyone, can be a tricky process. Because of this, what someone shares or likes is often determined as much by the ties they have with their network as by a strict evaluation of the content’s credibility.

Digital literacy – the fourth pillar of education

For this reason, as we argued in our own evidence to the Committee, any solution to the problem needs to include educational measures alongside technological ones. In line with this, the report rightly recommends that digital literacy become ‘the fourth pillar of education, alongside reading, writing and maths’ in the school curriculum, and that this requires co-ordinated action between DCMS and the Department for Education, funded in part by an educational levy on social media platforms.

The vital role of Higher Education in providing digital literacy skills

This is all very welcome. One limitation, however, is that the measures focus too narrowly on data management and technology’s role in the spread of information. Our research suggests that, along with the current recommendations, education should also include what we call social digital literacies. Alongside traditional digital literacy skills, we need to foster greater critical awareness among the general public of how our social interactions and relationships play an important part in influencing our decisions about what to share or like – and how this in turn can contribute to the circulation and visibility of news in the online environment.

A second limitation of the report is that the educational measures it sets out fail, as yet, to envisage a role for higher education in equipping people with the digital literacy skills necessary for tackling fake news. This can hopefully be reconsidered, not least because, given their outreach programmes and use of online resources, higher education institutions like the Open University are very well placed to reach the government’s key target of equipping all citizens with the necessary skills, not just those within the education system. Furthermore, UK universities not only have a great deal of experience in teaching digital literacy skills; one key area taught in higher education is precisely that set of critical reading and thinking skills that social media users require if they are to learn how to evaluate online news and identify false information, and to appreciate how these might be shaped by their social concerns. For the recommended ‘unified public awareness initiative’ to be successful, it needs to include this critical element, which will enable people to adapt their skills both to changes in technology and to developments in the on-going attempts made to mislead them.