Verify any claim · lenz.io
Claim analyzed
Tech
“An AI-generated podcast network publishes over 11,000 episodes per day by repurposing content from local news outlets without attribution.”
The conclusion
The claim is largely accurate. Multiple credible sources confirm that an AI podcast network (identified as "Daily News Now" or "Podcasts.ai") has been reported to produce approximately 11,000 episodes per day by repurposing local news content, often without crediting original outlets. However, the specific episode count traces back to a single investigation and has not been independently audited. The "without attribution" characterization applies to many — but not necessarily all — episodes, making the claim's absolute framing slightly overstated.
Caveats
- The 11,000 episodes/day figure originates from a single investigation (Indicator) and is relayed secondhand by other sources without disclosed methodology.
- The claim implies universal non-attribution, but the reporting indicates attribution is missing "in many cases," not necessarily across all episodes.
- No primary, independently audited dataset confirming the exact volume or systematic non-attribution practices is available in the evidence pool.
Sources
Sources used in the analysis
The website Indicator reported that a podcast network called Daily News Now had churned out an average of 11,000 podcast episodes a day using AI. In many cases, these mass-produced podcasts were ripping off and failing to credit the original reporting done by local news organizations.
The Washington Post's top standards editor Thursday decried “frustrating” errors in its new AI-generated personalized podcasts, whose launch has been met with distress by its journalists. The errors have ranged from relatively minor pronunciation gaffes to significant changes to story content, like misattributing or inventing quotes and inserting commentary, such as interpreting a source's quotes as the paper's position on an issue.
According to an EFF analysis, Podcasts.ai's 11,000 daily episodes derived from local news may qualify as fair use despite the lack of attribution: the practice arguably causes no commercial harm to the outlets' markets and enhances public access. The analysis nevertheless recommends linking to the original sources.
The global news industry is heading into 2026 under mounting pressure from two powerful and converging forces: the rapid advance of generative artificial intelligence and the growing influence of personality-driven creators who are reshaping how audiences consume information, a Reuters survey says. Looking ahead, executives expect heightened scrutiny of big technology companies as concerns grow about the societal impact of AI, misinformation and low-quality automated content.
PodBravo today launched its new platform, an AI service designed to transform podcast content effortlessly. PodBravo leverages AI to generate transcripts, show notes, timestamps, titles, blogs, social media posts, and more with just a single click, revolutionizing the podcast production process.
Jeanine Wright, former COO of Amazon podcast division Wondery and founder of Inception Point AI, told AFP that her eight-person team at Inception turns out around 3,000 podcasts a week using AI. The immediate goal is to play the volume game, she explained, an easy task when each episode costs just a US dollar to produce. Podcasts have always benefited from their ability to target niche audiences and with AI, producers like Inception can create “hyper-niche” podcasts.
Google is taking things a step further by providing personalized AI-generated audio recaps of news based on your Search data. On Wednesday, Google announced Daily Listen, an experiment in its AI-testing ground Search Labs. Daily Listen lives at the top of the homepage in the Google app on Android and iOS. By clicking on the feature, you get a 5-minute or less audio recap of topics you're interested in, generated by AI.
As major media outlets continue integrating artificial intelligence into their news workflows, a new issue is emerging in 2026: the widening accuracy imbalance between AI-generated content and traditional journalism. Recent incidents involving The Washington Post and Australia's Southern Cross Austereo (SCA) highlight the risks of relying too heavily on automated tools for reporting and audio production. Mispronunciations, incorrect attributions, and instances of the AI simply inventing details have all surfaced, undermining trust in a platform known for high editorial standards.
The Washington Post's new offering, "Your Personal Podcast," uses artificial intelligence to customize podcasts for its users, blending the algorithm you might find in a news feed with the convenience of portable audio. Outlining the process behind the Post's AI podcast, Kattleman says, "Everything is based on Washington Post journalism." An LLM, or large language model, converts a story into a short audio script, she says. A second LLM then vets the script for accuracy. After the final script is stitched together, Kattleman adds, the voice narrates the episode.
ChatGPT is fielding 1 million prompts about local news every week, OpenAI said in a blog post that also announced the AI company wants to take “a different path” on local news than other tech companies. “ChatGPT can help people find local news, provided local journalists are still out there covering it,” the post reads. OpenAI is being sued by several news organizations, including The New York Times, The Intercept, and multiple groups of newspapers in the U.S. and Canada.
AI is already scraping your content, often without permission or payment. As AI systems increasingly rely on website content to power their products, the publishing industry faces a challenge: how to control and capture value from this growing wave of autonomous traffic. TollBit is the platform built for publisher sites to monitor, manage, and monetize AI traffic — turning automated scraping into a potential new revenue stream.
The AI Revolution in Podcasting: A Comprehensive Guide to AI Podcast Generation in 2025. The synergy between human creativity and AI-driven efficiency will become the dominant paradigm, with AI serving as a powerful assistant rather than a replacement for human hosts. Niche and Community-Focused Content: As the market becomes more saturated, success will increasingly depend on creating specialized content that fosters deep connections with specific communities. AI's scalability makes this hyper-niche strategy more viable than ever before.
AI enables the creation of huge amounts of content, but it isn't always valuable or accurate, and can often be outright dangerous or harmful. Often it's used to spread misinformation, undermine trust in democratic institutions or widen social divisions. The Copyright Question. If AI is trained on copyrighted human-created content, shouldn't the creators be compensated? Many of them certainly think they should. Proposed solutions include accessible opt-outs, transparent systems allowing creators to give or remove consent, and revenue-sharing models.
LocalPod produces podcasts for local news publishers and newsletter operators so that they can unlock a new revenue stream and distribution channel. Our most popular service is our AI narrated podcasts, which have brought down the cost of producing a local news podcast by as much as 70%. We transform written content from local news publishers into engaging, AI-narrated podcasts, making it easier than ever for communities to stay informed on the go. Human Quality Check: Our expert audio engineers review each podcast to ensure perfect pronunciation, timing, and overall quality.
Here are my artificial intelligence and journalism podcast recommendations, based on actual listening. Shell Game, created and hosted by Evan Ratliff, an American journalist and author who has written for Wired, The New Yorker, and National Geographic. Ratliff investigates AI voice technology by cloning himself. His voice bot talks to customer service agents, both robots and humans, and even to his wife. This isn't a mere joke or quick attempt; it's a thorough investigation into AI voice technology.
Content repurposing increases the reach and lifespan of your existing content. Using AI for content repurposing can help you turn any piece ... Repurpose podcasts into newsletters and social media posts. Your podcast episodes contain dozens of valuable moments that deserve to be shared beyond audio platforms.
Using AI to repurpose content means taking a blog post, podcast episode, video, or webinar and transforming it into multiple engaging formats — like short-form videos, social media posts, email newsletters, or audio snippets — in minutes, not hours.
AI podcast networks like those investigated in 2025–2026 reporting scale to thousands of episodes daily via news scraping, often without attribution, fueling debates over fair use versus plagiarism in the tech sector.
You can use a tool like Toasty AI that will take your podcast episodes and auto-generate articles, social media posts, and audiograms. All you need to do is upload your audio content and generate. This guide walks you through practical ways to repurpose podcast content and shows how you can streamline your podcast marketing workflow using the right tools.
Turn videos into more content in minutes. Reach a wider audience by automatically repurposing webinars, podcasts, interviews, and in-person footage into social ... Repurposing content doesn't have to be a headache. Identify highlights, create clips, and publish across channels, with no extra time or resources required.
In 2024, AI was imitating humans; in 2025, AI was generating shows; by 2026, AI video podcasting has become infrastructure. It's no longer a gimmick but a full-scale production system: you can create five language versions of one episode, test 10 different hooks, and publish 10 pieces of content in a single day. What used to require a team is now one creator's workflow. Podcasting used to belong to people with resources, on-camera confidence, and production access; now it's accessible, with lower cost, lower friction, and higher scale.
You can maximize a single podcast recording to be repurposed into a wealth of content for many different platforms – efficient content creation at its finest!
Expert review
How each expert evaluated the evidence and arguments
The claim is supported by two pieces of direct, claim-matching testimony. Source 1 reports (via Indicator) that an AI podcast network ("Daily News Now") averaged roughly 11,000 episodes a day and often failed to credit local-news originals, and Source 3 independently treats "11K daily episodes from local news" and "no attribution" as factual premises for its fair-use analysis, even though it does not itself present underlying measurement. Although Source 1 is partly secondhand and Source 3 is not a primary investigation, the two sources converge on the same specific quantitative and attribution facts, and the opponent's counterexamples about other AI podcasts (Source 9) do not logically negate the existence of this particular network's behavior. The claim is therefore mostly true, but not proven with fully auditable primary data in the provided pool.
The claim omits that the widely repeated "11,000 episodes a day" figure and the allegation of uncredited reuse are relayed via secondary reporting (Sullivan citing Indicator), and that EFF likewise uses them largely as a factual premise for a fair-use discussion rather than presenting underlying measurement details or a named, independently audited dataset (Sources 1, 3). With that context restored, the core assertion still appears broadly accurate in substance: a network described as producing roughly 11,000 episodes a day from local-news content with missing attribution. But the framing overstates the degree of independently verified, primary confirmation of both the exact volume and the across-the-board "without attribution" characterization.
The only relatively high-authority, on-point sources in the pool are Source 1 (The Guardian/Margaret Sullivan) and Source 3 (EFF). Sullivan reports, while attributing the underlying finding to "Indicator," that a network ("Daily News Now") averaged roughly 11,000 AI-generated episodes a day and often failed to credit local-news reporting; EFF separately treats "11K daily episodes from local news" and the lack of attribution as a factual premise for its fair-use analysis. However, neither source is the original investigation with disclosed methodology or data, and the rest of the pool is largely irrelevant to the specific network/volume/attribution allegation (e.g., Source 9 concerns WaPo's own journalism). Trustworthy evidence therefore supports the claim's gist, but not with the level of primary, independently verified documentation needed for a full-strength "True" finding.
Expert summary
What do you think of the claim?
The arguments
Two AI advocates debated this claim using the research gathered.
Source 1 (Margaret Sullivan, The Guardian) directly and explicitly confirms the claim, reporting that "a podcast network called Daily News Now had churned out an average of 11,000 podcast episodes a day using AI" and that "these mass-produced podcasts were ripping off and failing to credit the original reporting done by local news organizations." This is further corroborated by Source 3 (Electronic Frontier Foundation), which independently references "Podcasts.ai's 11K daily episodes from local news" produced without attribution, and by Source 18 (LLM Background Knowledge), which confirms that AI podcast networks investigated in 2025–2026 scaled to thousands of episodes daily via news scraping while lacking attribution — making the claim robustly supported across multiple independent, high-authority sources.
You're treating Source 1 (Margaret Sullivan) as primary confirmation when it explicitly says the 11,000/day figure comes from “the website Indicator,” so you haven't produced auditable, first-hand evidence for either the volume or the “ripping off” characterization (Source 1, Margaret Sullivan). And your “corroboration” is circular: Source 3 (Electronic Frontier Foundation) is a legal fair-use thought exercise that says the episodes “may qualify as fair use” and merely notes “no attribution” while recommending links, while Source 18 is non-citable background rather than an independent investigation—neither verifies the specific network, methodology, or wrongdoing alleged in the motion (Source 3, Electronic Frontier Foundation; Source 18, LLM Background Knowledge).
The motion's specific allegation (“over 11,000 episodes per day” and “repurposing content from local news outlets without attribution”) rests on secondhand reporting—Margaret Sullivan in The Guardian explicitly attributes the figure to “the website Indicator,” not to primary data or an auditable methodology (Source 1, Margaret Sullivan). Even the closest technical discussion frames the “11K daily episodes” claim as a hypothetical/legal analysis and recommendation rather than verified wrongdoing (Source 3, Electronic Frontier Foundation), while other cited examples of AI news podcasts describe systems based on a publisher's own journalism with vetting (Source 9, LAist), undercutting the notion that such networks are definitively proven to be mass ripping off local outlets without attribution.
Your dismissal of Source 1 as "secondhand" commits a classic genetic fallacy — the credibility of a claim is not determined by how many hands it passed through, but by the authority of the reporter conveying it; Margaret Sullivan is a highly authoritative media critic writing in The Guardian, and her citation of Indicator as the originating investigation actually strengthens the claim by showing it was first reported by a dedicated outlet and then independently amplified. Furthermore, your characterization of Source 3 as merely "hypothetical" is a misreading — the EFF explicitly states "Podcasts.ai's 11K daily episodes from local news" as a factual premise for its legal analysis, and the fact that it recommends "linking sources" as a remedy directly confirms that attribution was absent, corroborating rather than undermining the claim.