The OBS is producing its video content in ultrahigh definition and high dynamic range, which should boost the level of detail and color in every shot. Content is also being captured in all manner of formats: vertical video for watching clips on phones, 8K video for the highest-definition broadcasts, and 360-degree shots for truly immersive drama.

The OBS says it has more than doubled the number of multi-camera systems it uses to capture multiple angles of the action for super-slow-motion replays. It will also use cinematic camera lenses, which can capture artsier shots, like the enhanced depth-of-field effects you’ve probably seen in movies. The challenge with traditional sports cameras is that the processing time those complex shots require has kept them out of live production. But the OBS is relying on AI and cloud technologies to speed up processing enough to use these shots in its live coverage. Exarchos says its new processes enable shots that were previously impossible to capture and present live, like 360-degree replays that spin the viewer around an athlete while they’re sailing through the air.

“The effect that exists in the Matrix film that you could do in cinema, you can be doing live,” Exarchos says.

The OBS is also recording sound in 5.1.4 audio, with the goal of capturing immersive audio from the venues during events and during sideline interviews with athletes. That, along with things like augmented-reality stations that give people a view of what it’s like on the Olympic grounds, is meant to make those at home feel closer to the games.

“If we repeat the previous—very successful—games, we have failed,” Exarchos says. “Because as in sports, everything is about breaking new ground, breaking new frontiers, and going one step further.”

Tech Proving Grounds

As you’d expect in 2024, artificial intelligence tools will be used extensively during the Olympics.

Broadcasters like Olympic Broadcasting Services and NBC will use AI to pull together highlights, scraping thousands of hours of footage to find key moments, package them up nicely, and deliver them straight to the viewer. Some companies have gone all in on AI offerings; NBC will be using the AI-rendered voice of legendary sportscaster Al Michaels to narrate its highlights packages on Peacock. The team trained its generative AI voice engine on Michaels’ past television broadcasts, and the results sound smooth yet still unmistakably uncanny.

As you watch the games live, AI will be able to conjure up key info in real time and display it on screen: stats about the athletes, probability percentages that they’ll make the shot or beat the clock, and artificially augmented views of what is happening on the ground. The AI incursion extends beyond the games; NBC is incorporating AI into its ad platform, with the goal of better personalizing the ads that play during the breaks.

This exorbitant broadcasting bacchanal is still a training ground for these new technologies. NBC is using the Olympics as the first major test of its Multiview capability and user-customization features, so expect to see them appear more often in regular live sports broadcasts. According to an NBC rep, the company hopes the technology debuting during the Paris Olympics can be deployed to other live sports events, and even non-sports broadcasts like the Macy’s Thanksgiving Day Parade.

Ultimately, Exarchos says, the goal of these technologies is to make people feel more connected to these events and the people participating in them, especially after the last two Olympic Games were marred by pandemic restrictions that limited who could attend.

“We’re going through a phase where people have a huge desire and nostalgia to relive physical experiences, especially with other people,” Exarchos says. “Sports is a big catalyst for that.”
