When STG took Avid Technology private in November 2023, paying $1.4 billion at $27.05 a share, the company that makes the editing software used on 87 percent of Oscar-winning film and TV productions went quiet. No earnings calls, no investor days, no public roadmap. Two years of silence. Now Avid is back — and the headline is that Google's AI is running inside the edit bay.
The partnership, announced April 16 ahead of the NAB Show in Las Vegas, embeds Google's Gemini AI models and Vertex AI into Avid Media Composer and Content Core, Avid's media-asset management platform. Vertex AI is Google Cloud's managed platform for building and deploying machine-learning models. For the first time, the software that cuts most Oscar-winning films will let editors search footage in plain language, automate the tedious work of tagging archive clips with descriptive metadata, and pull supplementary B-roll autonomously during timeline assembly. Avid also announced a parallel AWS integration at the same trade show, suggesting it is building on multiple cloud providers rather than picking a side.
The timing is not accidental. Avid's trailing twelve-month revenue sits at roughly $410 million, which puts the 2023 buyout at approximately 3.4 times revenue — a multiple that requires either strong growth or an eventual exit through IPO or resale. Going dark for two years and emerging with a Google press release is one way to reset the narrative.
The AI capabilities themselves are technically real. The Gemini integration is a genuine multi-year collaboration with Google's cloud division, not a logo-swap press release. But every specific feature is described either as a first public demonstration at NAB or as something that will ship when ready. No paying customers are named. No production deployment timeline is confirmed. Avid declined to make an editor or product lead available for this story.
There is a skeptic's case worth making: Avid needed a headline, Google needed a marquee customer for its professional creative AI story, and the intersection of those needs produced this announcement. The actual functionality — an AI assistant that searches your archive and fills your timeline — is the kind of thing that works beautifully on a demo stage and falls apart on a real production with inconsistent metadata, mixed frame rates, and decades of accumulated legacy media.
On the creativity question, Ramesh Srinivasan, a professor of information studies at UCLA who researches the intersection of technology and media, offered a direct caution: early research suggests that AI-generated content is flattening creativity, reproducing dominant patterns rather than the diverse, specific approaches that go into writing or editing.
Underneath the announcement is a familiar power dynamic. Avid is not building AI — it is distributing Google's AI inside the professional toolchain. If Media Composer users adopt these workflows at scale, Google gains insight into how post-production actually works at the data level, which matters for a company that also competes in cloud infrastructure and content tools.
Avid did not respond to questions about STG's exit timeline, pricing for AI-enabled tiers of Media Composer, or whether any customer has signed up for the new capabilities. A spokesperson said the company will hold a public demo at NAB this week and declined to comment beyond the prepared announcement. The NAB Show runs through April 22.