P2P is literally everywhere


We process a huge volume of data every day as we traverse the web protecting our clients' content. That data reveals that many sites are using the older technology of P2P networks to cut the bandwidth required to deliver content. Sometimes this is done directly by the site; more often it is built into one of the several player options that a content distribution platform loads into its pages.

Content is still delivered to the end user from the server infrastructure the site has already built, with so little changed that hybrid solutions are easy to implement. The potential savings are large: the technology greatly increases the effective capacity of an existing network at no extra cost, with only minor alterations needed on the site's side.

All of this happens without the user installing any software, so the experience is transparent and the user may be completely unaware of it. It is made possible by code supporting the player, running inside the browser itself. The content is delivered from the host's servers just as in any other streaming network, while at the same time a connection opens to public torrent nodes, announcing the player's presence to any other users' players. Video on demand is normally a poor fit for torrent delivery, but two factors work in its favour here: demand spikes heavily around specific items, and the files, although held on several different hosts, are byte-for-byte identical thanks to API features many content hosts provide.
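The hybrid flow described above can be sketched as a simple chunk-sourcing policy: try peers first, fall back to the host's servers, and track how much origin bandwidth the swarm saves. This is an illustrative model only; the class and function names are our own invention, not any real player's API.

```javascript
// Hypothetical sketch of a hybrid player's chunk-sourcing logic.
// Because every copy of the file is byte-identical, a chunk fetched
// from a peer is interchangeable with one fetched from the CDN.
class HybridFetcher {
  constructor() {
    this.fromPeers = 0;   // chunks served by other viewers
    this.fromOrigin = 0;  // chunks served by the host's servers
  }

  // peers: Map of chunkId -> chunk data announced by other players
  // origin: function fetching a chunk from the host's CDN
  fetchChunk(chunkId, peers, origin) {
    if (peers.has(chunkId)) {
      this.fromPeers++;
      return peers.get(chunkId);  // byte-identical to the origin copy
    }
    this.fromOrigin++;
    return origin(chunkId);       // fall back to the dedicated servers
  }

  // Fraction of chunks the origin never had to serve
  offloadRatio() {
    const total = this.fromPeers + this.fromOrigin;
    return total === 0 ? 0 : this.fromPeers / total;
  }
}
```

In a real player the peer side runs over WebRTC data channels and the origin side over ordinary HTTP range requests, but the accounting is the same: every chunk found in the swarm is a chunk the host's servers did not pay to deliver.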

The effect? Take a well-known content host which we will not name, but which is current and very active. Traffic-estimation tools put it at around 19 million visits per month. On the assumption that every subdomain with a live server behind it is part of the site's CDN, it runs over 70 servers leased from a handful of dedicated-server providers. Those providers' product listings suggest each server has at least a 1 Gbps connection. Although each link will see contention, the host appears to have picked its products carefully, spreading servers across many locations and different BGP networks, potentially so that its own traffic does not choke its other servers and each one gets the most out of the connection provided.

Allowing 40% headroom for contention and bursts from client buffering, that aggregates to roughly 42 Gbps of usable capacity, sizeable on its own. At 2 Mbps per stream (assume 4 Mbps to give further headroom) it could deliver over 10,000 concurrent HD streams. Add the P2P capability to the player framework and, if 50% of the bandwidth is offloaded to peers (in benchmark tests, savings of up to 70% have reportedly been sustained in real conditions), the CDN's effective capacity suddenly doubles to more than 20,000 concurrent streams. These figures are theoretical and subject to many variables, but the advantage of this sort of hybrid network is very clear: it turns a modest dedicated-server estate into a content delivery network of far greater capacity. And part of that capacity could well be your bandwidth, added to the peer-enabled network.
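The back-of-envelope arithmetic above, using the same assumed inputs (70 servers, 1 Gbps each, 40% headroom, 4 Mbps per stream, 50% peer offload), can be laid out as:

```javascript
// Capacity model built from the estimates in the text; every input
// is an assumption about the unnamed host, not a measured figure.
const servers = 70;        // subdomains with a live server behind them
const linkGbps = 1;        // assumed minimum port speed per server
const headroom = 0.40;     // reserved for contention and buffering bursts

const usableGbps = servers * linkGbps * (1 - headroom);      // 42 Gbps

const streamMbps = 4;      // conservative HD bitrate
const streams = Math.floor((usableGbps * 1000) / streamMbps); // 10,500

const offload = 0.5;       // half of each stream's bytes come from peers
const hybridStreams = Math.floor(streams / (1 - offload));    // 21,000
```

Halving the bytes each viewer pulls from the origin is what doubles the stream count: the same 42 Gbps of server bandwidth now supports twice as many viewers, and at the reported 70% offload the multiplier would be more than three.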