Deepdub closes new funding round to double down on AI dubbing for movies, shows, and games

Dubbing, in which recordings in other languages are lip-synced and mixed into a show’s soundtrack, is a growing business. One localization platform, Zoo Digital, saw its revenue jump 73% year over year to $28.6 million as of July 2018. Another, BTI Studios, told Television Business International that dubbing rose from 3% of its revenue in 2010 to 61% in 2019.

According to Verified Market Research, the film dubbing market alone could reach $3.6 billion by 2027, growing at a compound annual rate of 5.6% from 2020. But hurdles are hampering that expansion. On average, it can take an hour of studio recording to produce five minutes of finished narration, and that is before finding an actor who can deliver it with pacing and timing that match the original audio as closely as possible. Dubbing is also expensive: one calculator pegs the price at $75 per minute, even for a simple video.

Artificial intelligence technologies promise to streamline the process by automating various aspects of dubbing. Video and voice-focused companies including Synthesia, Respeecher, Resemble AI, and Papercup have developed AI dubbing tools for shows and movies, as has Deepdub. Reflecting investor interest, Deepdub announced today that it has raised $20 million in a Series A round led by Insight Partners with participation from Booster Ventures, Stardom Ventures, Swift VC and angel investors.

Dubbing with AI

Deepdub, an Israeli startup headquartered in Tel Aviv, was co-founded in 2019 by brothers Nir Krakowski and Ofir Krakowski. The company provides AI-powered dubbing services for film, TV, games, and advertising, built on neural networks that split out and isolate voices in the original tracks and replace them.

Deepdub aims to retain the original actors’ voices across languages: it builds synthetic voices from samples of each actor in one language, then uses that synthesized voice to speak the translated dialogue. The company’s algorithms, which can’t yet dub voices in real time but plug into post-production systems, also attempt to match the lip movements in the source media across the target languages.
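At a high level, that kind of pipeline can be thought of as a handful of stages: separate the dialogue from the music-and-effects track, build a synthetic voice for each actor from a short sample, render the translated lines in that voice at the original timings, and mix the result back over the untouched stems. The Python sketch below is purely illustrative; Deepdub has not published its models or tooling, so every function, file name, and parameter here is a hypothetical placeholder.

```python
# Illustrative sketch of an AI-dubbing pipeline of the kind described above.
# All function bodies are placeholders: the real source-separation, voice-cloning,
# and speech-synthesis models are not public, and the names used here are invented.

from dataclasses import dataclass


@dataclass
class DialogueLine:
    speaker: str        # character/actor identifier
    start_sec: float    # when the line starts in the original mix
    end_sec: float      # when it ends (used to constrain the dub's timing)
    text: str           # transcript in the source language


def separate_dialogue(mix_path: str) -> tuple[str, str]:
    """Split the full mix into a dialogue stem and a music-and-effects (M&E) stem."""
    # Placeholder: a real system would run a source-separation model here.
    return "dialogue_stem.wav", "m_and_e_stem.wav"


def clone_voice(sample_path: str) -> dict:
    """Build a synthetic-voice profile from a short recording of the actor.

    Deepdub claims roughly three minutes of audio is enough for this step.
    """
    return {"reference_audio": sample_path}


def synthesize_line(profile: dict, translated_text: str, max_duration_sec: float) -> str:
    """Render the translated line in the cloned voice, constrained to the original
    line's duration as a crude stand-in for lip-sync matching."""
    # Placeholder: returns a fake clip path instead of real audio.
    return f"clip_{abs(hash((translated_text, max_duration_sec))) % 10_000}.wav"


def dub_scene(mix_path: str,
              lines: list[DialogueLine],
              translations: dict[str, str],
              voice_samples: dict[str, str]) -> list[str]:
    """Produce one synthesized clip per dialogue line in the target language."""
    _dialogue_stem, m_and_e_stem = separate_dialogue(mix_path)
    profiles = {name: clone_voice(path) for name, path in voice_samples.items()}

    clips = []
    for line in lines:
        clips.append(synthesize_line(profiles[line.speaker],
                                     translations[line.text],
                                     line.end_sec - line.start_sec))
    # A real pipeline would now lay these clips back over m_and_e_stem
    # at each line's original start time.
    return clips
```

Even in this toy form, the sketch shows where the hard problems sit: how cleanly the dialogue can be separated, how expressive the cloned voice can be, and how to fit translated lines into the original timing and lip movements.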

“With the growing global demand for local experiences by audiences, there is a growing demand for high-quality dubbing solutions that can scale quickly and deliver a new level of experience. Audiences have more and more options to choose from, and content platforms are looking for ways to differentiate themselves in local markets,” Nir Krakowski told VentureBeat via email. “[Our] AI models enable replication of voice characteristics using no more than three minutes of voice data. The AI-generated voice in the target language can then perform at any expressive level, including crying, shouting, talking with food in the mouth, [and more], even if the original voice data did not include this information.”

Deepdub’s work includes “Every Time I Die,” a 2019 English-language thriller that the startup dubbed into Spanish and Portuguese. It is one of the first films to be fully dubbed using voice clones based on the original cast’s voices.

“[We’re] working with several Hollywood studios on various currently proprietary projects. Since its public launch in December 2020, Deepdub has already signed a multi-series partnership with Topic.com to bring its catalog of foreign-language TV shows to English-speaking audiences,” the Krakowski brothers said. “The funds [from the latest round] will be used to expand the global reach of the company’s sales and delivery teams, bolster the Tel Aviv-based R&D team with additional researchers and developers, and further enhance Deepdub’s web-based, deep learning-powered localization platform.”

Potential pitfalls

According to Statista, 59% of American adults say they would rather watch foreign-language movies dubbed into English than watch the original version with subtitles. Look no further than dubbed series like Squid Game (originally in Korean) and Dark (in German) for proof; an October 2021 poll found that 25% of Americans had watched Squid Game, which remains the most-watched Netflix original show of all time.

Beyond the startups, Nvidia has developed technology that alters video so that an actor’s facial expressions match newly dubbed dialogue. Veritone, for its part, has launched a platform called Marvel.ai that lets content producers generate and license AI-generated voices.

But localization is not as simple as straight translation. As Steven Zeitchik of The Washington Post points out, AI-dubbed content produced without attention to detail could lose its “local flavor”: expressions in one language may not mean the same thing in another. AI dubs also pose ethical questions, such as whether to recreate the voice of a deceased person. And the technology could lead to a future in which millions of viewers are never exposed to a language other than their own.

Also troubling are the ramifications for the actors whose performances are used to generate synthetic voices. In a lawsuit last spring, voice actor Bev Standing alleged that TikTok used her voice for an AI text-to-speech feature without compensating her. The Wall Street Journal reports that more than one company has attempted to reproduce Morgan Freeman’s voice in private demos. And studios are increasingly adding contract provisions that allow them to use synthetic voices in place of performers “when necessary,” such as to polish lines of dialogue during post-production.

Deepdub says it is not in the business of creating “deepfakes” and works with performers to ensure they understand how their voices are being used. We’ve reached out to the company with a detailed list of questions and will update this article once we receive a response.

“The pandemic has forever changed the way people around the world consume content, whether it’s entertainment on one of the many streaming services and platforms or through e-learning platforms, podcasts, and audiobooks. As a result, audiences are increasingly exposed to content that is not in their local language,” Krakowski continued. “[Deepdub] started 2021 with seven employees and grew to over 30 employees at the start of 2022. We plan to double that size by the end of the year.”
