{"id":367,"date":"2023-08-23T08:00:00","date_gmt":"2023-08-23T12:00:00","guid":{"rendered":"https:\/\/www.techsequences.org\/podcasts\/?p=367"},"modified":"2023-08-21T18:13:56","modified_gmt":"2023-08-21T22:13:56","slug":"scientists-confronting-the-disinformation-ecosystem","status":"publish","type":"post","link":"https:\/\/www.techsequences.org\/podcasts\/2023\/08\/scientists-confronting-the-disinformation-ecosystem\/","title":{"rendered":"Scientists Confronting the Disinformation Ecosystem"},"content":{"rendered":"\n<p><strong>Guest:<\/strong> Sheldon Himelfarb<\/p>\n\n\n\n<p>With advances in AI and the increasing sophistication of misleading content such as deepfakes, there is growing concern, especially among academics and researchers, about the threat that mis- and disinformation pose not only to the 2024 election cycle and democracy, but also to pressing concerns such as public health and climate change. How can we address a polluted information ecosystem at a time of significant social division and erosion of trust? How can scientific principles and approaches inform policy? 
<\/p>\n\n\n\n<p>Please join us for a conversation with Sheldon Himelfarb, the co-founder and Executive Director of the <a href=\"https:\/\/www.ipie.info\/\">International Panel on the Information Environment (IPIE)<\/a>, an independent global organization of over 300 leading scientists dedicated to providing actionable scientific knowledge on threats to our information landscape.<\/p>\n\n\n\n<p><strong>Hosted by:<\/strong> Alexa Raad and Leslie Daigle.<\/p>\n\n\n\n<p><strong>Further reading:<\/strong><\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><a href=\"https:\/\/www.ipie.info\/about\">IPIE<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/www.nytimes.com\/2023\/05\/24\/business\/researchers-study-misinformation.html\">With Climate Panel as a Beacon, Global Group Takes On Misinformation<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/www.nobelprize.org\/events\/nobel-prize-summit\/2023\">Nobel Prize Summit: Truth, Trust and Hope<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/www.nytimes.com\/2023\/07\/04\/business\/federal-judge-biden-social-media.html\">Federal Judge Limits Biden Officials\u2019 Contacts With Social Media Sites<\/a><\/li>\n\n\n\n<li><a href=\"https:\/\/static1.squarespace.com\/static\/5b6df958f8370af3217d4178\/t\/6011e68dec2c7013d3caf3cb\/1611785871154\/NYU+False+Accusation+report_FINAL.pdf\">False Accusation: The Unfounded Claim that Social Media Companies 
Censor Conservatives (Paul M. Barrett and J. Grant Sims)<\/a><\/li>\n<\/ul>\n<div class=\"powerpress_player\" id=\"powerpress_player_1830\"><audio class=\"wp-audio-shortcode\" id=\"audio-367-1\" preload=\"none\" style=\"width: 100%;\" controls=\"controls\"><source type=\"audio\/mpeg\" src=\"https:\/\/media.blubrry.com\/techsequences\/content.blubrry.com\/techsequences\/20230711-SheldonHimelfarb.mp3?_=1\" \/><a href=\"https:\/\/media.blubrry.com\/techsequences\/content.blubrry.com\/techsequences\/20230711-SheldonHimelfarb.mp3\">https:\/\/media.blubrry.com\/techsequences\/content.blubrry.com\/techsequences\/20230711-SheldonHimelfarb.mp3<\/a><\/audio><\/div><p class=\"powerpress_links powerpress_links_mp3\" style=\"margin-bottom: 1px !important;\">Podcast: <a href=\"https:\/\/media.blubrry.com\/techsequences\/content.blubrry.com\/techsequences\/20230711-SheldonHimelfarb.mp3\" class=\"powerpress_link_pinw\" target=\"_blank\" title=\"Play in new window\" onclick=\"return powerpress_pinw('https:\/\/www.techsequences.org\/podcasts\/?powerpress_pinw=367-podcast');\" rel=\"nofollow\">Play in new window<\/a> | <a href=\"https:\/\/media.blubrry.com\/techsequences\/content.blubrry.com\/techsequences\/20230711-SheldonHimelfarb.mp3\" class=\"powerpress_link_d\" title=\"Download\" rel=\"nofollow\" download=\"20230711-SheldonHimelfarb.mp3\">Download<\/a><\/p><p class=\"powerpress_links powerpress_subscribe_links\">Subscribe: <a href=\"https:\/\/podcasts.apple.com\/us\/podcast\/techsequences\/id1509826111?mt=2&amp;ls=1\" class=\"powerpress_link_subscribe powerpress_link_subscribe_itunes\" target=\"_blank\" title=\"Subscribe on Apple Podcasts\" rel=\"nofollow\">Apple Podcasts<\/a> | <a href=\"https:\/\/open.spotify.com\/show\/6BgXkvatS6UgsTsJVi7BJE?si=N7tlTeOkTlOrg3Ysco2nbw\" class=\"powerpress_link_subscribe powerpress_link_subscribe_spotify\" target=\"_blank\" title=\"Subscribe on Spotify\" rel=\"nofollow\">Spotify<\/a> | <a 
href=\"https:\/\/subscribeonandroid.com\/www.techsequences.org\/podcasts\/feed\/podcast\/\" class=\"powerpress_link_subscribe powerpress_link_subscribe_android\" target=\"_blank\" title=\"Subscribe on Android\" rel=\"nofollow\">Android<\/a> | <a href=\"https:\/\/www.techsequences.org\/podcasts\/feed\/podcast\/\" class=\"powerpress_link_subscribe powerpress_link_subscribe_rss\" target=\"_blank\" title=\"Subscribe via RSS\" rel=\"nofollow\">RSS<\/a><\/p><!--powerpress_player-->","protected":false},"excerpt":{"rendered":"<p>Guest: Sheldon Himelfarb With the advances in AI and increasing sophistication in creating misleading content such as deepfakes, there is growing concern especially amongst academics and researchers about the threat mis- and dis-information pose not only for the 2024 election<\/p>\n","protected":false},"author":2,"featured_media":369,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[31,35],"tags":[],"class_list":["post-367","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-31","category-content"],"jetpack_featured_media_url":"https:\/\/www.techsequences.org\/podcasts\/wp-content\/uploads\/2023\/08\/TS-PodcastHeaders-1.jpg","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/posts\/367","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/comments?post=367"}],"version-history":[{"count":1,"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json
\/wp\/v2\/posts\/367\/revisions"}],"predecessor-version":[{"id":368,"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/posts\/367\/revisions\/368"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/media\/369"}],"wp:attachment":[{"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/media?parent=367"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/categories?post=367"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.techsequences.org\/podcasts\/wp-json\/wp\/v2\/tags?post=367"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}