I've spent twenty-five years building PR campaigns around one assumption: earned media in a tier-1 text outlet was the highest-leverage asset a brand could own. That assumption is now obsolete.
The most-cited domain in Google AI Overviews is not The New York Times. Not Bloomberg. Not Forbes. It is YouTube — at roughly 23.3% of all citations across 46 million tracked AI Overview answers, followed by Wikipedia at 18.4% and Google.com at 16.4% (Surfer SEO, via SQ Magazine, May 2026). Every legacy news outlet sits below the top three.
That single data point reorders the entire PR playbook.
What the data actually says
The shift happened faster than anyone in communications wants to admit.
YouTube has overtaken Reddit as the most-cited social platform inside large language models. New analysis from Bluefish, reported in Adweek in January 2026, found YouTube appears as a cited source in 16% of LLM answers over the past six months, versus 10% for Reddit — a reversal from earlier periods when Reddit dominated.
The platform's dominance is not close. BrightEdge analysis of citation patterns across ChatGPT, Perplexity, and Google's AI products found YouTube's citation coverage is nearly three times that of any other non-brand domain — and YouTube is the only video platform that registers at all. TikTok, Instagram, Vimeo — none of them appear in meaningful citation share.
The traffic math is even sharper. PikaSEO's analysis of the Bluefish data shows AI search traffic converts at 14.2% compared to Google's 2.8% — roughly 5x more valuable per visit. That premium is being captured almost entirely by the brands inside the answer. If your brand is not cited, you are not in the consideration set.
Inside Google specifically, AI Overviews now trigger on approximately 48% of all tracked search queries as of February 2026, about a 58% year-over-year increase (BrightEdge, via SQ Magazine). In some categories, the saturation is total: Healthcare at 88%, Education at 83%, B2B Technology at 82%, Restaurants at 78% (ALM Corp / BrightEdge, March 2026). These are not edge cases. They are the categories where the future of brand discovery is already happening.
And among AI Overview citations drawn from outside the organic top 100 — content the user would never reach through standard search — YouTube is the single most-cited domain, accounting for 18.2% of those citations (ALM Corp / Ahrefs, March 2026). That is not a video platform. That is the default retrieval anchor for a large share of the answers Google now generates.
Why LLMs read video — and what that actually means
The most common mistake in marketing teams right now: assuming that LLMs watch video the way humans do.
They do not.
Large language models read the transcript. As Brainlabs Digital's analysis of BrightEdge data makes clear, unlike most social platforms, much of YouTube's content is machine-readable — transcripts, metadata, chapters, timestamps, descriptions — giving LLMs clean, structured text to ingest and cite. LLMs are not watching the videos. They are reading the transcripts.
That changes what good video looks like.
A creator saying "I've been using this product and I really like it" is worth almost nothing to an LLM. A creator saying "I've been using Brand X's new wireless system for content production for the last six weeks, and here is what we measured" gives the model a citable, attributable, entity-rich statement that can be pulled into an AI answer for the next twelve to thirty-six months.
The training data picture confirms the scale. Pleias, a European LLM developer, released a dataset called YouTube-Commons containing over two million copyright-free video transcripts with nearly 30 billion words — one of the largest collections of conversational data for LLM training. That dataset is one of dozens. Foundation models have been ingesting YouTube transcripts at scale for years.
Then comes the second wave. Multimodal LLMs now process video natively. Gemini 3, GPT-5, Claude, and open-source models like Alibaba's Qwen3-VL — which can process entire books or hours-long videos with second-level indexing across a 256K-token context window expandable to 1M — are increasingly evaluating the visual layer alongside the transcript. By the end of 2026, the working assumption should be that any AI engine retrieving information about your category is also evaluating the video assets that mention your brand.
This is not a future problem. It is a Q2 problem.
Who wins, who loses
The verticals where AI Overviews now dominate map almost perfectly to where video matters most.
Healthcare (88% AIO trigger rate) — Patient-facing explainers, clinician interviews, treatment-pathway videos. BrightEdge's one-year analysis found Healthcare has the highest top-10 organic overlap with AIO citations at approximately 24% — Google leans heavily on already-trusted, already-ranking sources for health queries, consistent with its YMYL approach. Brands producing structured, source-cited medical video content are being treated as primary references.
Education (83%) — Course content, expert lectures, institutional explainers. The shift from text catalogs to video-led discovery is being accelerated by AI engines that prefer transcribable, attributable instructional material.
B2B Tech (82%) — Product demos, founder interviews, technical walkthroughs, conference sessions. The buyer's first exposure to a software category is increasingly the answer ChatGPT gives when asked "what are the top tools for X" — and those answers cite YouTube heavily.
Travel, Entertainment, Restaurants, Insurance, Finance — All seeing material AI Overview growth. All categories where video-led content historically lives on YouTube and is now being harvested as structured citation material.
The losers are obvious. Brands that treat YouTube as a social channel — sparse uploads, ignored transcripts, no metadata discipline — are absent from the answers their categories now generate. They are not being out-ranked. They are being excluded from the retrieval set entirely.
Across all keywords, the URLs cited in AI Overviews are not ranking in traditional results 66% of the time (BrightEdge). Two-thirds of citations pull from URLs the user would never see. The system is not rewarding the brands with the best SEO. It is rewarding the brands with the best citation infrastructure — and the video layer is now a core component of that infrastructure.
The playbook — six moves for Q2
What we are advising clients to do, starting this quarter.
One. Treat YouTube as your second website. Every product, service, executive, and category position gets a dedicated video asset with a transcript-optimized script that names brands, specific products, and measurable outcomes. No vague enthusiasm. No "really great." Concrete, attributable, entity-rich language designed to be quoted by an LLM.
Two. Audit every existing transcript. Most enterprise YouTube channels carry hundreds of hours of video with auto-generated transcripts full of misspellings, missing brand names, and conversational filler. Clean the transcripts. Upload corrected captions. Update titles, descriptions, and chapter markers with the entities AI engines are retrieving against.
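A transcript audit like the one above can be partially automated. The sketch below is a minimal, hypothetical example — the filler patterns and brand-spelling map are illustrative assumptions, not a definitive list; adapt them to your own channel and style guide.

```python
import re

# Hypothetical filler phrases and brand spellings -- assumptions for
# illustration only; build your own lists from real channel transcripts.
FILLER = re.compile(r"\b(um+|uh+|you know|sort of|kind of)\b[, ]*", re.IGNORECASE)
BRAND_VARIANTS = {"brand x": "Brand X", "brandx": "Brand X"}  # misspellings -> canonical

def audit_transcript(text: str) -> dict:
    """Strip conversational filler and normalize brand-name spellings."""
    cleaned = FILLER.sub("", text)
    for variant, canonical in BRAND_VARIANTS.items():
        cleaned = re.sub(re.escape(variant), canonical, cleaned, flags=re.IGNORECASE)
    cleaned = re.sub(r"\s{2,}", " ", cleaned).strip()
    return {
        "cleaned": cleaned,
        # Entity density: how many clean, canonical brand mentions an
        # AI engine can actually retrieve against.
        "mentions": cleaned.count("Brand X"),
    }

result = audit_transcript("um, so I've been, you know, testing brandx for six weeks")
print(result["cleaned"])
```

Run it over exported caption files, then upload the corrected captions back to YouTube — the cleaned text, not the auto-generated version, is what ends up in the retrieval layer.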
Three. Add VideoObject schema markup to every page that embeds or references a video — including the transcript inside the schema payload. This is the single highest-leverage technical move for AI visibility in 2026. Averi's April 2026 analysis confirms: schema markup has evolved from a cherry on top to essential infrastructure — properly structured data significantly increases your chances of appearing in both rich results and AI citations.
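A minimal sketch of what that markup looks like, using schema.org's VideoObject type with the transcript carried inside the payload. All names, URLs, and dates here are hypothetical placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "Brand X Wireless System: Six-Week Field Test",
  "description": "Measured results from six weeks of production use of Brand X's wireless system.",
  "thumbnailUrl": "https://example.com/thumbnails/brand-x-field-test.jpg",
  "uploadDate": "2026-03-15",
  "duration": "PT8M30S",
  "contentUrl": "https://www.youtube.com/watch?v=VIDEO_ID",
  "transcript": "I've been using Brand X's new wireless system for content production for the last six weeks, and here is what we measured..."
}
</script>
```

The transcript property is the point: it puts the entity-rich text on the embedding page itself, so a crawler does not have to resolve the video to read it.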
Four. Build a structured creator strategy. Earned video citations from third-party creators carry more weight than owned channels — same reason earned media has always outperformed advertorial. Identify the ten to fifty YouTube creators whose transcripts the LLMs are already harvesting in your category. Get your brand named, specifically and accurately, inside their content.
Five. Cross-link the citation infrastructure. Every video gets a long-form companion article on a tier-1 owned property — your site, an Everything-PR network publication, Forbes, Inc., Entrepreneur. The LLM that reads the transcript will look for corroborating text. Give it that text — internally linked, schema-marked, entity-consistent.
Six. Measure Citation Share, not impressions. Track how often your brand appears when a buyer asks ChatGPT, Claude, Perplexity, or Gemini the questions that should belong to you. Curium.io, BrightEdge AI Catalyst, Profound, Otterly.AI — pick a measurement layer and run it weekly.
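Whatever measurement layer you pick, the underlying metric is simple: of the tracked prompts in your category, what fraction of AI answers cite you, overall and per engine? A hypothetical sketch — the answer records below are illustrative stand-ins for what a weekly prompt run or a tool export would produce:

```python
from collections import defaultdict

# Illustrative data only -- in practice this comes from weekly prompt
# runs or a measurement tool's export.
answers = [
    {"engine": "chatgpt",    "cited_brands": ["Brand X", "Rival Co"]},
    {"engine": "perplexity", "cited_brands": ["Rival Co"]},
    {"engine": "gemini",     "cited_brands": ["Brand X"]},
    {"engine": "claude",     "cited_brands": []},
]

def citation_share(answers, brand):
    """Share of tracked AI answers citing the brand, overall and per engine."""
    per_engine = defaultdict(lambda: [0, 0])  # engine -> [cited, total]
    for a in answers:
        counts = per_engine[a["engine"]]
        counts[1] += 1
        counts[0] += int(brand in a["cited_brands"])
    overall = sum(c for c, _ in per_engine.values()) / sum(t for _, t in per_engine.values())
    return overall, {e: c / t for e, (c, t) in per_engine.items()}

overall, by_engine = citation_share(answers, "Brand X")
print(f"Overall citation share: {overall:.0%}")  # 2 of 4 tracked answers
```

Tracked weekly, this is the number that replaces impressions: movement in citation share is movement in the consideration set.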
Where this goes next
The next eighteen months will accelerate every trend in this report.
Google's January 27, 2026 upgrade to Gemini 3 as the global default for AI Overviews is a likely contributor to the shift in citation behavior. A large-scale Ahrefs study of 863,000 keywords and 4 million AI Overview URLs found that only 38% of pages cited in Google AI Overviews also rank in the top 10 for the same query — down sharply from 76% just seven months earlier. Video-led sources captured a disproportionate share of that displacement.
Multimodal retrieval will compound the shift. As AI engines move from reading transcripts to evaluating visual quality, on-screen branding, product placement, and visual demonstration, the production value of brand video becomes a direct input to AI visibility — not just a marketing aesthetic.
And about 93% of AI search sessions end without a click in 2026, shifting GEO measurement toward assisted demand and branded search (Demand Local, May 2026). The user gets the answer, makes the shortlist, and never visits a website. Whether your brand makes that shortlist depends almost entirely on whether the AI engine retrieved a video, an article, or a structured citation that includes you.
That is the entire game.
Citation Share is the new market share. Video is the highest-leverage, most under-priced asset for capturing it. YouTube is no longer a social channel — it is the largest single retrieval anchor in AI-mediated search, and the brands that treat it that way will own their categories inside ChatGPT, Claude, Perplexity, Gemini, and Google AI Overviews for the next decade.
Build the infrastructure before the crisis — not during it.
Sources
SQ Magazine — AI Overviews Statistics 2026: Google Search Impact Data (May 2026)
Adweek — YouTube Overtakes Reddit as Go-To Citation Source on AI Search (January 2026)
ALM Corp — Google AI Overviews Surge 58% Across 9 Industries (March 2026)
ALM Corp — Google AI Overview Citations From Top-10 Pages Dropped From 76% to 38% (March 2026)
PikaSEO — YouTube Overtakes Reddit as #1 Social Source for AI Citations (March 2026)
Brainlabs Digital — From Social Channel to Search Asset: YouTube's New Role in AI Visibility (March 2026)
Averi — AI Overviews Hit 48% of Queries — The 2026 Citation Playbook (April 2026)
Demand Local — 20 AI Citation and CPL Statistics for 2026 (May 2026)
Center for Data Innovation — Transcribing YouTube Videos for LLM Training (Pleias YouTube-Commons)
BentoML — Multimodal AI: The Best Open-Source Vision Language Models in 2026