
A Question about P2P Technologies

Peer-to-peer technologies are getting more and more press lately. Verisign and Adobe signed a deal, there is The Venice Project, BitTorrent.com is announcing content deals, and of course there is the ever-present treasure trove of illegal content available online via torrents.

The reason there has been excitement about P2P technologies built around BitTorrent-style technology is simple. It saves bandwidth on file distribution and it creates the opportunity to speed the delivery of files, large or small. If it were able to live up to the hype, the notion is that how multimedia is distributed on the net, and its economics, would change.

I’m not as sure it will as some others are.

The premise of the technology is to break files into pieces and distribute those pieces onto the PCs of end users who have downloaded a BitTorrent-type client. Then, when a user requests the file to be delivered or streamed to them, rather than having to go to a host server, a tracker determines where all the file pieces are and defines how the user reassembles them into a copy of the original on his or her computer, as a file or a stream.
That's the very, very simplistic explanation of how it works.
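
To make that concrete, here is a rough sketch in Python of the piece-and-tracker idea. The piece size, peer names, and function names are made up for illustration; real clients and trackers are far more involved.

```python
# A minimal sketch of the piece/tracker concept described above.
# Piece size, peer names, and structures are illustrative, not from any real client.

PIECE_SIZE = 256 * 1024  # clients typically use fixed-size pieces on this order

def split_into_pieces(data: bytes) -> list[bytes]:
    """Break a file into fixed-size pieces that peers can host independently."""
    return [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]

# Conceptually, the tracker is a directory: piece index -> peers that have it.
tracker = {0: ["peer_a", "peer_c"], 1: ["peer_b"], 2: ["peer_a", "peer_b"]}

def reassemble(pieces_by_index: dict[int, bytes]) -> bytes:
    """Once every piece has been fetched from some peer, concatenate them in order."""
    return b"".join(pieces_by_index[i] for i in sorted(pieces_by_index))
```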

From a business perspective, the important element is that if X number of people request a 1GB file, rather than a host computer having to deliver files consuming X GB of bandwidth, the file is spread among the peers and delivered using their bandwidth and resources, relieving the host of the bandwidth cost and obligation and hopefully speeding the delivery of the content.
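
The back-of-the-envelope math looks something like this. All the numbers here are made up for illustration, and this is the idealized case where peers carry everything after the first copy:

```python
# Hypothetical numbers for illustration only.
file_size_gb = 1.0
requests = 10_000

# Traditional hosting: the host serves every byte itself.
host_only_gb = file_size_gb * requests  # 10,000 GB delivered by the host

# Idealized P2P: the host seeds roughly one full copy; peers upload the rest.
host_p2p_gb = file_size_gb * 1
peer_uploaded_gb = host_only_gb - host_p2p_gb  # shifted onto the peers' connections

print(f"Host saves ~{peer_uploaded_gb:,.0f} GB; the peers contribute it instead")
```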

All good, right?

For people creating content? Absolutely. For the end user, not so much.

P2P technology expects the end user to contribute bandwidth, hard drive storage, and processing power, something that in most cases we all have available to spare. Most BitTorrent clients use a “give to get” algorithm. In other words, the client will deliver content to you only as quickly as the bandwidth you make available to it. So if you make 100k of upload bandwidth available for the file segments hosted on your PC, it will allow you to download at up to 100k (there are other variables involved, but this is the simple way to understand it).
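
Here is a hedged sketch of what a “give to get” policy might look like. Real clients implement this as tit-for-tat choking and it is considerably more sophisticated; the rates, floor, and function name below are invented for illustration:

```python
# Illustrative "give to get" cap: download speed is tied to upload contribution.
# Real BitTorrent clients do tit-for-tat choking; this is a simplification.

def allowed_download_rate(upload_rate_kbps: float, floor_kbps: float = 10.0) -> float:
    """Let a peer download roughly as fast as it uploads, with a small free
    allowance (the floor) so new peers with nothing to share can bootstrap."""
    return max(upload_rate_kbps, floor_kbps)

# A peer offering 100k of upload gets ~100k of download in return.
print(allowed_download_rate(100.0))  # 100.0
# A peer offering nothing is throttled to the floor.
print(allowed_download_rate(0.0))    # 10.0
```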

All of this works very, very well in controlled environments. It also works well in public internet tests when there are a lot of clients fully participating, in other words, when they are offering their bandwidth and are open to seeding all content.

In real-world execution, however, it doesn't happen that way. There are multiple problems with P2P systems that could kill the golden goose.

1. Conflicting clients. There are a ton of clients, with the number growing all the time. Although they work from basically the same source code and protocols, they all install and operate as if they had exclusive access. They want to control the PC so that they are in charge of what resources are available. When multiple clients are installed on a PC, not only does that create confusion among users, it's a “last installed, first in charge” approach. That approach and lack of respect for other clients will lead to user configuration problems. That's not going to work. At some point they get considered malware and the clients get uninstalled.

2. End users don't understand how P2P works, and once they do, they get concerned about giving up bandwidth. Most users don't know how to go in and edit the default settings. So even if they settle on a single client and are happy with the content available on that network or to that client, they aren't going to be happy about their bandwidth being in constant use to save a content provider money.

3. The P2P model of seeding is a HUGE problem for those using wireless broadband with bandwidth constraints or per-bit or per-minute costs. People are going to wake up and find that they owe Verizon, Sprint, whoever, a lot more than they ever thought possible because they installed a client on their laptops (the sketch after this list puts rough numbers on it). That could lead to these networks blocking the protocol.

4. There is a misconception that there are bandwidth savings for the end user. If you want to download a 1GB file, 1GB of data will be delivered to your PC. There are no bandwidth savings on the client side. In fact, the client is charged a bandwidth premium, because after they have received the entire file, they are asked to participate in the peering by delivering parts of the file to other users.

This in turn becomes an issue for service providers, whether DSL, cable, whoever. If quite a few users on a network segment are seeding files, it can slow down the network segment.
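
To put rough numbers on that premium, and on the metered-wireless problem from point 3, here is an illustrative calculation. The seed ratio and per-GB price are assumptions, not anyone's actual plan:

```python
# Illustrative only: total bytes crossing the end user's connection for one
# download, and what the seeding "premium" costs on a hypothetical metered plan.
file_size_gb = 1.0
seed_ratio = 1.0            # many clients encourage uploading ~1x what you download
metered_price_per_gb = 5.0  # assumed per-GB price on a metered wireless plan

downloaded_gb = file_size_gb             # the full file always comes down the wire
uploaded_gb = file_size_gb * seed_ratio  # the premium paid back to the swarm

total_gb = downloaded_gb + uploaded_gb
print(f"Total transfer: {total_gb} GB for a {file_size_gb} GB file")
print(f"Metered cost: ${total_gb * metered_price_per_gb:.2f}")
```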

It's interesting to note that some feel that more than 55 percent of internet bandwidth is consumed by torrents. I don't know what percentage of internet users are using BitTorrent clients to acquire content, but it has to be relatively small. If that percentage doubles, what happens to performance on the net?

If BitTorrent client installation doubles or triples, does the percentage of internet bandwidth used by torrents go from 55 percent to 100 percent? Of course it won't work that way, but the ratio of that 55 percent to the current client base raises some very interesting questions about whether torrents truly do save bandwidth and can speed delivery of content.
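
The naive extrapolation behind that question looks like this. The numbers are assumed for illustration, and the point is exactly the one above: the share can't scale linearly, it saturates, showing up as congestion and a shifting traffic mix instead:

```python
# Illustrative: torrent traffic can't exceed 100% of total traffic, so doubling
# the clients doesn't double the share; it squeezes everything else.
torrent_share = 0.55       # assumed current share of internet bandwidth
other_traffic = 1 - torrent_share

def share_after_growth(factor: float) -> float:
    """If torrent traffic grows by `factor` while other traffic stays flat,
    return the new share of total traffic (not of today's capacity)."""
    grown = torrent_share * factor
    return grown / (grown + other_traffic)

print(f"2x clients -> ~{share_after_growth(2):.0%} of traffic")  # ~71%, not 110%
print(f"3x clients -> ~{share_after_growth(3):.0%} of traffic")  # ~79%
```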

In conclusion, P2P is a product that tests great. In application, however, it has a ton of challenges.
