Meet the Minds Inventing the Future of Video: Josselin Cozanet


In our blog series Meet the Minds Inventing the Future of Video, we've been going behind the scenes to find out more about some of Ateme's brightest minds and what they've been working on. In part seven of the series, we introduce Research and Development Engineer Josselin Cozanet. He spoke to us about the ins and outs of Open Caching.

What is Your Role at Ateme?

I currently work as a research and development engineer in the Research and Innovation team at Ateme. My focus is on video delivery. This includes the innovative use of Content Delivery Network (CDN) technologies to improve the way video is delivered to viewers. 

What Have You Been Working on at Ateme?

I have been working on a variety of subjects, including an Open Caching implementation. I implemented this standard to participate in the testbed effort of the Streaming Video Technology Alliance (SVTA) Open Caching Working Group.

What is Open Caching?

Open Caching is a collaborative effort to create and test a new set of specifications designed to improve video stream delivery. These new specifications allow multiple CDNs to communicate with each other through a set of open APIs. In turn, this allows content publishers to reach more viewers with higher quality and lower latency. It also allows them to enter new markets more quickly. 
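To make the idea concrete, here is a minimal sketch (not the actual SVTA specification) of what that interconnection looks like in practice: an upstream CDN or content publisher talks to a downstream open cache over plain HTTP APIs exchanging JSON, first discovering what the cache can do and then delegating content to it. The base URL, endpoint paths, and field names below are hypothetical placeholders.

```python
import requests

# Hypothetical API endpoint of a downstream open cache run by an ISP.
# The real SVTA Open Caching interfaces define their own paths and schemas;
# this sketch only illustrates the "open HTTP + JSON" interconnection idea.
OPEN_CACHE_API = "https://open-cache.example-isp.net/api"


def discover_capabilities() -> dict:
    """Ask the downstream cache which footprints and protocols it supports."""
    resp = requests.get(f"{OPEN_CACHE_API}/capabilities", timeout=5)
    resp.raise_for_status()
    return resp.json()  # e.g. {"footprints": ["FR", "DE"], "protocols": ["https"]}


def delegate_content(origin_host: str, paths: list[str]) -> dict:
    """Tell the downstream cache which content it may serve on our behalf."""
    payload = {"origin": origin_host, "paths": paths}
    resp = requests.post(f"{OPEN_CACHE_API}/delegations", json=payload, timeout=5)
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    caps = discover_capabilities()
    if "https" in caps.get("protocols", []):
        delegate_content("origin.publisher-cdn.example", ["/vod/", "/live/"])
```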

What Industry Challenges Does Open Caching Address?

Over the last decade, the consumption of online video has grown exponentially. However, this explosion in streaming has put increased pressure on internet service providers (ISPs) and their networks. 

Significant peaks in internet traffic can also have a knock-on effect on both content providers and viewers. Additional stress on networks can reduce the quality of the viewing experience, for instance through increased latency and rebuffering. 

The way to overcome this network pressure is to cache the content at the edge of the network, as close as possible to viewers. To do this, most ISPs have built their own CDNs for their TV services. However, for global OTT service providers, building such a CDN network would be costly. This is due to the number of points of presence required to reach their global audiences. So they would rather partner with CDNs or ISPs to use the caching capabilities these partners have already built.

Consequently, content providers tend to have several interfaces with different CDNs. Historically, this has made interconnecting difficult. It added operational complexity, making it challenging for content providers to deploy their service in a new region that uses different CDNs. While the Internet Engineering Task Force (IETF) has tried to improve the current situation by defining how CDNs interconnect, the results of their endeavor were too broad and inapplicable to a number of use cases, in particular those related to video streaming. 

Recognizing this issue, the SVTA created the Open Caching Working Group to extend the IETF’s standardization of CDN interfaces to cover the specific requirements of video streaming. 

What Have You Achieved in This Field at Ateme?

At Ateme, I have completed an Open Caching implementation allowing other Open Caching-compatible CDNs to connect to Ateme's NEA CDN. This takes the form of a simple gateway which converts the Open Caching requests it receives into NEA CDN-compatible requests. This makes it possible to configure a NEA CDN through any Open Caching-compatible service.
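A minimal sketch of this gateway pattern is shown below. It is not Ateme's actual implementation: it simply accepts an Open Caching-style configuration request over HTTP and translates it into a request to a hypothetical NEA CDN management endpoint. The URLs, paths, and field names on both sides are assumptions for illustration only.

```python
from flask import Flask, request, jsonify
import requests

app = Flask(__name__)

# Hypothetical NEA CDN management endpoint; the real NEA API is not public,
# so the URL and payload shape below are placeholders for illustration only.
NEA_CDN_API = "http://nea-cdn.internal:8080/config"


@app.route("/opencaching/v1/configurations", methods=["POST"])
def translate_configuration():
    """Receive an Open Caching-style configuration and forward it to the CDN."""
    oc_config = request.get_json(force=True)

    # Map the (assumed) Open Caching fields onto the (assumed) NEA CDN fields.
    nea_config = {
        "delivery_service": oc_config.get("service-id"),
        "origin": oc_config.get("source", {}).get("host"),
        "cache_paths": oc_config.get("paths", ["/"]),
    }

    # Push the translated configuration to the CDN and relay its answer.
    nea_resp = requests.post(NEA_CDN_API, json=nea_config, timeout=5)
    return jsonify(nea_resp.json()), nea_resp.status_code


if __name__ == "__main__":
    app.run(port=8443)
```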

I also participated in successful interoperability tests with members of the SVTA as part of the Open Caching Testbed group. Moreover, I continue to follow the latest updates on Open Caching specifications.

What Does Open Caching Change for Viewers?

Nowadays, streaming video is becoming more difficult and costly as demand grows for high-quality, low-latency delivery. Content delivery solutions, including CDNs, improve delivery by storing media content in caches that are geographically closer to the viewers. However, no individual viewer has access to every cache network available, so viewers might not be able to access a specific piece of content from the network that is closest to them. By allowing all the networks to interconnect, Open Caching optimizes access to all media for all viewers. In turn, this allows viewers to access higher-quality content with reduced latency.
