Watch the video to find out what media literacy is and how it can improve your consumption and creation of content.
For tips on how to evaluate different resources such as websites, books, journal articles and more, try the library's Resource Evaluation Tool.
Are you worried about deepfakes? Take a look at Content Credentials, a tool which will tell you where, how, and when content was generated.
Digital messages are designed for different purposes and audiences. The SIFT method (Stop, Investigate the Source, Find Other Coverage, and Trace the Claims) makes evaluating digital content easier. Click on the blue rectangles below for information about each step.
Because we now get so much of our information from websites and social media, we need to know how to navigate the issues caused by algorithm-driven predictive searching. If we know how algorithms work and where they operate, we can gather more authoritative, diverse perspectives and get more of the ‘whole story’ of a situation or event. We can also be more mindful about what we like and share.
An algorithm is a set of instructions used by computers to perform a certain task. It is generally used to automate processes which are repetitive or take a long time. An example might be sorting results from a Google search by their relevance to your search words.
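To make that idea concrete, here is a minimal sketch in Python of what ‘sorting results by relevance’ could look like. The scoring rule (counting how many of your search words appear in each result’s title) is a simplified assumption for illustration only; real search engines use far more sophisticated ranking signals.

```python
# A minimal sketch of a relevance-ranking algorithm.
# Assumption: relevance = how many of the search words appear in a result's title.

def relevance(result_title: str, search_words: list[str]) -> int:
    """Count how many of the search words appear in a result's title."""
    title = result_title.lower()
    return sum(word.lower() in title for word in search_words)

def rank_results(results: list[str], search_words: list[str]) -> list[str]:
    """Sort results so the highest-scoring (most relevant) come first."""
    return sorted(results, key=lambda r: relevance(r, search_words), reverse=True)

if __name__ == "__main__":
    results = [
        "Media literacy for beginners",
        "Cooking with algorithms",
        "How search algorithms rank media",
    ]
    # The result mentioning both search words is ranked first.
    print(rank_results(results, ["media", "algorithms"]))
```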
Many of our daily tasks are governed by algorithms. Beyond the Google search example above, think of streaming apps and shopping sites: how do they know what to recommend to us? Recommendations like these rely on analysing large amounts of data. How does the software gain access to that data?
Believe it or not, we provide a lot of the clues. Every time we decide to watch a particular program, share an online article, or like a social media post, we provide the algorithms behind the apps and sites with valuable data.
When did we decide to give the businesses behind those apps and sites access to our data, and allow them to sell it on to other businesses? We do it when we tick the ‘Terms of Use’ box required by most apps and websites to allow us to use them. By compiling all the data from our decisions, likes and shares, companies can build up a surprisingly accurate picture of who we are, where we live, and what we like. They can use that picture to predict our future actions.
As the algorithms in the sites we use refine their suggestions, we hand back still more data every time we engage with those suggestions. This feedback loop can leave us with a very narrow range of choices, because each click endorses the algorithm’s recommendations and shapes the next round of them.
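The sketch below illustrates that feedback loop under some simplifying assumptions: the small catalogue, the categories and the scoring rule are invented for the example and do not reflect any real platform’s algorithm. Each click adds to a profile, and the recommendations increasingly cluster around whatever the profile already contains.

```python
# A minimal sketch of how a recommendation loop can narrow our choices.
# The catalogue, categories and scoring rule are illustrative assumptions only.

from collections import Counter

CATALOGUE = {
    "news": ["World report", "Local bulletin"],
    "sport": ["Match highlights", "Season preview"],
    "drama": ["Crime serial", "Period piece"],
}

def recommend(profile: Counter, n: int = 2) -> list[str]:
    """Recommend items from the categories the user has engaged with most."""
    if not profile:
        # With no history yet, fall back to one item from every category.
        return [items[0] for items in CATALOGUE.values()]
    top_categories = [category for category, _ in profile.most_common(n)]
    return [item for category in top_categories for item in CATALOGUE[category]]

profile = Counter()
for clicked_category in ["sport", "sport", "news"]:  # every click feeds the profile
    profile[clicked_category] += 1

# The suggestions now cluster around sport and news; drama has disappeared.
print(recommend(profile))
```

Running the sketch with more ‘sport’ clicks makes the suggestions cluster even more tightly around that one category, which is exactly the narrowing of choices described above.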
When we're setting up news feeds, or just scrolling through our social media accounts, we can make sure that our preferences are set according to what we want to see, not what the platform thinks we should see. We can make our own decisions about who and what we listen to, read or watch. If we make more diverse choices in terms of content creators and platforms, our information landscape will be richer and more balanced.
If you are a student or staff member at the University of Newcastle, you might like this very short LinkedIn Learning course, Spotting Misinformation Online.
Unsure about how to access LinkedIn Learning? Go to the LinkedIn Learning page in this guide.
If you become aware of false information being shared on social media, you can correct it in a new post. This is less confrontational, and less embarrassing for the person who posted it, than challenging them directly. Remember that most people don't want to knowingly share false information; they probably posted it quickly without checking it first. Your correction should not be a challenge, just a new post (or a comment attached to the incorrect post) containing accurate, verifiable information.
You can also improve your own practice by stopping to think about what you're sharing or posting. Is it something that you can verify independently, using an authoritative, reliable source of information? If not, maybe it's better not to share until you can find out more.
Stop to check the source of the post: identify whether the account is legitimate or whether it belongs to a bot or troll. If the account is legitimate, check the credentials of the information's author or publisher.
Watch this LinkedIn Learning video to find out more about how to develop media literacy (note that this video is only available to staff and students of the University of Newcastle):
Develop media literacy from Working and Collaborating Online by Garrick Chow
A lot of information on this page is based on content from the following eBook:
Burkhardt, J.M. (2022). Media smart: Lessons, tips and strategies for librarians, classroom instructors and other information professionals. Facet Publishing.