Media Literacy: Keys to Interpreting Media Mess...
One of the first things young children need to learn in order to become media literate is the difference between fact and opinion. Many young children have prior knowledge about a subject they are interested in that may not be entirely accurate. Educators can guide these children toward finding credible sources to back up their beliefs using online content hubs like PebbleGo. With PebbleGo, for example, teachers can give young children a safe place to search credible sources, including pictures, videos, and articles on a variety of topics.
After grade 2, students move on to PebbleGo Next, which provides a natural next step for students in grades 3-5. PebbleGo Next includes articles that are aligned to state and national standards, presented in a familiar yet age-appropriate experience and interface. As students get older and more practiced at asking questions and researching answers, educators can encourage them to compare facts uncovered on PebbleGo Next with other media sources, such as other websites, books, or newspaper articles. Students can compare their findings, and their individual interpretations of those findings, with their peers as well!
FAIR is the national progressive media watchdog group, challenging corporate media bias, spin and misinformation. We work to invigorate the First Amendment by advocating for greater diversity in the press and by scrutinizing media practices that marginalize public interest, minority and dissenting viewpoints. We expose neglected news stories and defend working journalists when they are muzzled. As a progressive group, we believe that structural reform is ultimately needed to break up the dominant media conglomerates, establish independent public broadcasting and promote strong non-profit sources of information.
As our experience of media evolves from mere reception to active participation, learning to think critically about content is not enough. We must learn to act purposefully with these new tools lest they, and the people behind them, act purposefully on us instead. In an era when nothing short of true mastery will do, Julie Smith gives parents and educators clear and simple steps for becoming media literate in the twenty-first century.
This book is for everyone who recognizes the power of the media, and wants to know how that power is used and what we need to do to master media literacy skills. Julie Smith explains why those of us who promote media literacy education really are in it to save the world.
"As a parent, an educator, and as an informed citizen, I found Master the Media to be a fascinating and eye-opening read on the importance of media literacy. Julie Smith expertly unfolds the history of the media around us, while providing tools to help us become leaders of media literacy. A must read for every household!"
Media education should not be an elective within our learning environments; it should be a prerequisite in primary education and a measured goal of literacy as students progress through secondary education. Considering how much influence media has on children and their cultural experiences, laying a foundation of critical judgment about media should become an objective for every educator in the world. This book can help that process within our communities.
"Master the Media is an important book, directed at parents and others who are in positions to influence the media habits of young people. The book takes a very personal approach to the constructive use of the media, offering support and direction, so that individuals can develop a healthy independence from the messages they receive through the media."
We wanted to be careful not to "teach to the test" in preparing students for the performance-based assessment. We needed to strike a balance between teaching the content (e.g., the probability of two independent events) and preparing students for the task (e.g., evaluating the validity of a media resource). We brainstormed six different formative assessments that would need to be in place before students completed the performance task. However, we also acknowledged that this part of our plan would need to be continually reviewed and revised depending on student learning needs.
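For readers less familiar with that content standard, here is a minimal worked example of the probability of two independent events; the specific events and numbers are illustrative choices of ours, not drawn from the unit described above:

\[
P(A \cap B) = P(A)\,P(B), \qquad \text{e.g., } P(\text{heads and a 6}) = \frac{1}{2} \times \frac{1}{6} = \frac{1}{12}.
\]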
Last but certainly not least, Section 230 of the Communications Decency Act will largely immunize social media platforms from most of the potential legal liability discussed in this memo. If a third party posts a digital falsification on an online platform, the platform cannot be held liable for hosting it even if the third party could be, unless hosting the content violates federal criminal or intellectual property law. At the very least, this means that platforms are not legally responsible for user-generated falsifications that would otherwise run afoul of laws concerning the right of publicity, defamation, false light, or intentional infliction of emotional distress (IIED).
The preceding question is actually ambiguous in an important way. On the one hand, we might be asking about the ethical status of some particular piece of synthetic or manipulated media in some particular context. For example, we might be asking a very local question about a specific video that seemingly shows a political candidate saying something that they do not believe. On the other hand, we might be focused on some such media as an instance of a broader type. For example, we might want to know the ethical status of any synthetic or manipulated media that is developed with the intent of disrupting a legitimate election. Platforms (and others) are unlikely to be in a position to judge every case in its full complexity. We thus focus more on the latter question, since it can lead to the development of ethically grounded policies and principles (albeit, with exceptions or edge cases). We present a collection of such principles in the last section.
In many cases, synthetic and manipulated media are produced and directly targeted with the explicit intent of doing harm to some person or group. Many of the most obvious worries about synthetic/manipulated media and the 2020 election (including examples mentioned above) involve specific intent to harm the political prospects or reputation of an individual or political party. Even when the relevant media is not created with that conscious intent, we still must consider key rights and interests of those who are directly impacted or harmed by it:
The impacts of synthetic and manipulated media are not restricted to the political sphere, but extend more generally to social communities. For example, a deepfake that slanders a political candidate could also impact the broader community (social, physical, and so on) in which the target lives. The social ties that are vital to a community can be undermined as a result of an attack. We thus must include the relevant values for social communities:
In contrast with the interests of the broader political community, community interests can be threatened by a single instance of synthetic or manipulated media. A single well-designed attack can have a disproportionate impact on the relationships that hold a community together, particularly in smaller communities, where political leaders often play a large social role.
We do not live in an ideal world. Platforms, consumers, and the targets of synthetic or manipulated media attacks are rarely in a position to perform a full, complete analysis. We might successfully complete these steps for isolated instances, but we have every reason to expect an onslaught of both types of media in advance of the 2020 election. It will not be feasible to approach these evaluations on a wholly case-by-case basis. We should instead look for rough principles or guidelines that are ethically grounded, and that can serve as heuristics to evaluate the ethical status of some synthetic or manipulated media. The principles will almost certainly be wrong for unusual cases, but should provide appropriate guidance for the easy cases.
One key observation is that the majority of interests outlined above would almost always be harmed, rather than advanced, by synthetic/manipulated media intended to influence the 2020 U.S. presidential election. Since many of those interests are prima facie equally weighty, we should (from an ethical perspective) view such media relevant to the 2020 election with significant skepticism.
Of course, some threats to various interests would be minimized if the synthetic or manipulated media were known to be false, so clear labeling of synthetic and manipulated media reduces this burden of proof. For example, a deepfake that falsely shows a candidate endorsing a position might be less ethically problematic if it were clearly and persistently marked as a deepfake. Notice that the heuristic principle here is that labeling reduces, but does not eliminate, this burden. Even labeled synthetic/manipulated media can still significantly impair weighty interests (since people cognitively struggle to remember what exactly is fiction), and so be unethical.
5. As mentioned above, one key function of metaethical frameworks is to provide specific tools and methods to integrate diverse, competing rights and interests. So if we are willing to commit to a particular framework (such as utilitarianism), then we have a specific (though not always easy-to-use) way to determine the ethical status of some synthetic media.
6. We emphasize again that we are focused on the ethical, not legal, status of synthetic/manipulated media. Even if we adopt this heuristic principle, there might not be legal means to restrict the production and promulgation of such media.