Cornell University professor Dr. Gordon Pennycook spoke at a monthly pub night in Saskatoon on Tuesday evening about “misinformation and social media during the pandemic.” Pennycook talked about “why people believe, spread and share misinformation online.”

“One of the largest issues with misinformation is that those few who have the strongest views or, in some cases, may be the most overconfident or whatever, those are the ones who are speaking out the most and the silent majority may not agree, but they aren’t speaking up,” said Pennycook.

Pennycook himself has published numerous stories with MSNBC, The Hill, CBC, and The New York Times. Most of his work focuses on 'misinformation' and 'fake news' when, in fact, he is spreading 'misinformation' himself.

One story for CBC, titled How the COVID-19 crisis exposes widespread climate change hypocrisy, lumps together climate change deniers and people critical of the government’s COVID-19 response as conspiracy theorists.

“We should turn to people who know more than we do — particularly for topics that are complex and important,” wrote Pennycook.

“If you accept this logic, there's no way you should reject the idea that humans are causing climate change. It is indisputable that the most established experts on global climate, those whose job it is to understand our climate and who actively publish primary research on it, are effectively unanimous in their agreement that climate change is happening and that humans are the cause.”

Another story, for The New York Times, entitled The Right Way to Fight Fake News, endorses the view that social media platforms should control the information shared to prevent 'misinformation.'

“One strategy that platforms have used is to provide more information about the news’ source. YouTube has 'information panels' that tell users when content was produced by government-funded organizations, and Facebook has a 'context' option that provides background information for the sources of articles in its News Feed,” wrote Pennycook.

“This sort of tactic makes intuitive sense because well-established mainstream news sources, though far from perfect, have higher editing and reporting standards than, say, obscure websites that produce fabricated content with no author attribution.”

Also joining Pennycook at the event were Dr. Kyle Anderson from USask’s College of Medicine and Dr. Angie Rasmussen from VIDO-Intervac.