
Computer Says Maybe

Alix Dunn

Available Episodes (5 of 86)
  • Who Knows? Fact-Finding in a Failing State w/ HRDAG and Data & Society
    Everything is happening so fast. And a lot of it's bad. What can research and science organizations do when issues are complex, fast-moving, and super important?
    More like this: Independent Researchers in a Platform Era w/ Brandi Geurkink
    Building knowledge is more important than ever in times like these. This week we have three guests. Megan Price from the Human Rights Data Analysis Group (HRDAG) shares how statistics and data science can be used to get justice. Janet Haven and Charlton McIlwain from Data & Society explore the role research institutions can play in bridging research knowledge and policy prescription.
    Further reading & resources:
    - HRDAG's involvement in the trial of José Efraín Ríos Montt
    - A profile of Guatemala and a timeline of its conflict — BBC (last updated in 2024)
    - To Protect and Serve? — a study on predictive policing by William Isaac and Kristian Lum
    - An article about the above study — The Appeal
    - HRDAG's stand against tyranny
    - More on Understanding AI — Data & Society's event series with the New York Public Library
    - About Janet Haven, Executive Director of Data & Society
    - About Charlton McIlwain, board president of Data & Society
    - Bias in Computer Systems by Helen Nissenbaum
    - Center for Critical Race and Digital Studies
    If you want to hear more about the history of D&S, the full conversation is up on YouTube (add link when we have).
    Subscribe to our newsletter to get more than just a podcast — we run events and do other work that you will definitely be interested in!
    Post Production by Sarah Myles | Pre Production by Georgia Iacovou
    --------  
    59:17
  • Who Knows? Independent Researchers in a Platform Era w/ Brandi Geurkink
    Imagine doing tech research… but from outside the tech industry? What an idea…
    More like this: Nodestar: Turning Networks into Knowledge w/ Andrew Trask
    So much of tech research happens within the tech industry itself, because it requires data access, funding, and compute. But what the tech industry has in resources, it lacks in independence, scruples, and a public interest imperative. Alix is joined by Brandi Geurkink from the Coalition for Independent Technology Research to discuss her work at a time when platforms have never been so opaque, and funding has never been so sparse.
    Further reading & resources:
    - More about Brandi and the Coalition
    - Understanding Engagement with U.S. (Mis)Information News Sources on Facebook by Laura Edelson & Damon McCoy
    - More on Laura Edelson
    - More on Damon McCoy
    - Jim Jordan bringing in Nigel Farage from the UK to legitimise his attacks on EU tech regulations — Politico
    - Ted Cruz on preventing jawboning & government censorship of social media — Bloomberg
    - Judge dismisses 'vapid' Elon Musk lawsuit against group that cataloged racist content on X — The Guardian
    - See the CCDH's blog post on getting the case thrown out
    - Platforms are blocking independent researchers from investigating deepfakes by Ariella Steinhorn
    Disclosure: This guest is a PR client of our consultancy team. As always, the conversation reflects our genuine interest in their work and ideas.
    Subscribe to our newsletter to get more than just a podcast — we run events and do other work that you will definitely be interested in!
    --------  
    48:15
  • Très Publique: Algorithms in the French Welfare State w/ Soizic Pénicaud
    Governments around the world are using predictive systems to manage engagement with even the most vulnerable. Results are mixed.
    More like this: Algorithmically Cutting Benefits w/ Kevin De Liban
    Luckily, people like Soizic Pénicaud are working to prevent the modern welfare state from becoming a web of punishment for the most marginalised. Soizic has worked on algorithmic transparency both inside and outside of government, and this week she shares her journey from incrementally improving these systems (boring, ineffective, hard) — to escaping the slow pace of government, looking at the bigger picture of algorithmic governance, and asking how it can build better public benefit in France (fun, transformative, and a good challenge).
    Soizic is working to shift political debates about opaque decision-making algorithms to focus on what they're really about: the marginalised communities whose lives are most affected by these systems.
    Further reading & resources:
    - The Observatory of Public Algorithms and their Inventory
    - The ongoing court case against the French welfare agency's risk-scoring algorithm
    - More about Soizic
    - More on the Transparency of Public Algorithms roadmap from Etalab — the task force Soizic was part of
    - La Quadrature du Net
    - France's Digital Inquisition — co-authored by Soizic in collaboration with Lighthouse Reports, 2023
    - AI prototypes for UK welfare system dropped as officials lament 'false starts' — The Guardian, Jan 2025
    - Learning from Cancelled Systems by Data Justice Lab
    - The Fall of an Algorithm: Characterizing the Dynamics Toward Abandonment — by Nari Johnson et al., featured at FAccT 2024
    Subscribe to our newsletter to get more than just a podcast — we host live shows and do other work that you will definitely be interested in!
    --------  
    52:22
  • Straight to Video: From Rodney King to Sora w/ Sam Gregory
    Seeing is believing. Right? But what happens when we lose trust in the recorded media put in front of us?
    More like this: The Toxic Relationship Between AI and Journalism w/ Nic Dawes
    We talked to Sam Gregory, a global expert and leading voice on this issue for the past 20 years, to get his take. We started way back in 1991, when Rodney King was assaulted by four police officers in Los Angeles. Police brutality was (and is) commonplace, but something different happened in this case. Someone used a camcorder and caught it on video. It changed our understanding of the role video could play in accountability. And in the past 30 years, we've gone from using video for evidence and advocacy to AI slop threatening to seismically reshape our shared realities.
    Now apps like Sora provide impersonation-as-entertainment. How did we get here?
    Further reading & resources:
    - More on the riots that followed the Rodney King verdict — NPR
    - More about Sam and WITNESS
    - ObscuraCam — a privacy-preserving camera app from WITNESS and The Guardian Project
    - C2PA: the Coalition for Content Provenance and Authenticity
    - Deepfakes Rapid Response Force by WITNESS
    Subscribe to our newsletter to get more than just a podcast — we run events and do other work that you will definitely be interested in!
    Post Production by Sarah Myles
    --------  
    1:00:15
  • The Toxic Relationship Between AI & Journalism w/ Nic Dawes
    What happens when AI models try to fill the gaping hole in the media landscape where journalists should be?
    More like this: Reanimating Apartheid w/ Nic Dawes
    This week Alix is joined by Nic Dawes, who until very recently ran the non-profit newsroom The City. In this conversation we explore journalism's newfound toxic relationship with AI and big tech: can journalists meaningfully use AI in their work? If a model summarises a few documents, does that add a new layer of efficiency, or inadvertently oversimplify? And what can we learn from big tech positioning itself as a helpful friend to journalism during the Search era?
    Beyond just the accurate relaying of facts, journalistic organisations also represent an entire backlog of valuable training data for AI companies. If you don't have the same resources as the NYT, suing for copyright infringement isn't an option — so what then? Nic says we have to break out of the false binary of 'if you can't beat them, join them!'
    Further reading & resources:
    - Judge allows 'New York Times' copyright case against OpenAI to go forward — NPR
    - Generative AI and news report 2025: How people think about AI's role in journalism and society — Reuters Institute
    - An example of The City's investigative reporting: private equity firms buying up property in the Bronx — 2022
    - The Intimacy Dividend — Shuwei Fang
    - Sam Altman on Twitter, announcing that they've improved ChatGPT to be mindful of mental health effects — "We realize this made it less useful/enjoyable to many users who had no mental health problems, but…"
    Subscribe to our newsletter to get more than just a podcast — we run events and do other work that you will definitely be interested in!
    --------  
    41:47


About Computer Says Maybe

Technology is changing fast. And it's changing our world even faster. Host Alix Dunn interviews visionaries, researchers, and technologists working in the public interest to help you keep up. Step outside the hype and explore the possibilities, problems, and politics of technology. We publish weekly.


