A distributed messaging system that is often likened to an early peer-to-peer architecture is the USENET network news system, which is in principle a client–server model from the perspective of the user who reads or posts news articles. However, news servers communicate with one another as peers to propagate Usenet news articles over the entire group of network servers. The same consideration applies to SMTP email in the sense that the core email-relaying network of mail transfer agents has a peer-to-peer character, while the periphery of e-mail clients and their direct connections is strictly a client–server relationship. Tim Berners-Lee's vision for the World Wide Web, as evidenced by his WorldWideWeb editor/browser, was close to a peer-to-peer design in that it assumed each user of the web would be an active editor and contributor, creating and linking content to form an interlinked web. This contrasts with the broadcasting-like structure of the web as it has developed over the years.
Advantages and weaknesses
In P2P networks, clients provide resources, which may include bandwidth, storage space, and computing power. This property is one of the major advantages of using P2P networks because it makes the setup and running costs very small for the original content distributor. As nodes arrive and demand on the system increases, the total capacity of the system also increases, and the likelihood of failure decreases. If one peer on the network fails to function properly, the whole network is not compromised or damaged. In contrast, in a typical client–server architecture, clients share only their demands with the system, not their resources. In this case, as more clients join the system, fewer resources are available to serve each client, and if the central server fails, the entire network is taken down. The decentralized nature of P2P networks thus increases robustness by removing the single point of failure inherent in a client–server system.
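A simple arithmetic comparison makes this scaling difference concrete; the sketch below, using assumed illustrative numbers rather than measurements, contrasts the per-client share of a fixed server capacity with the aggregate capacity of a swarm in which every joining peer contributes upload bandwidth:

# Per-client bandwidth as the population grows (illustrative numbers only).
SERVER_CAPACITY = 1_000  # Mbit/s available at one central server
PEER_UPLOAD = 5          # Mbit/s contributed by each peer in a P2P swarm

for n in (10, 100, 1_000):
    per_client = SERVER_CAPACITY / n   # shrinks as clients join
    swarm_total = n * PEER_UPLOAD      # grows as peers join
    print(f"n={n}: client-server {per_client:.1f} Mbit/s per client; "
          f"P2P swarm capacity {swarm_total} Mbit/s total")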
Another important property of peer-to-peer systems is the lack of a system administrator. This leads to a network that is easier and faster to set up and keep running, because a full staff is not required to ensure efficiency and stability. However, decentralized networks introduce new security issues, because they are designed so that each user is responsible for controlling their own data and resources. Peer-to-peer networks, along with almost all network systems, are vulnerable to insecure and unsigned code that may allow remote access to files on a victim's computer or even compromise the entire network. A user may encounter harmful data by downloading a file that was originally uploaded as a virus disguised as an .exe, .mp3, .avi, or any other file type. This type of security issue stems from the lack of an administrator who maintains the list of files being distributed.
Harmful data can also be distributed on P2P networks by modifying files that are already being distributed on the network. This type of security breach arises from the fact that users connect to untrusted sources, as opposed to a maintained server. In the past this happened on the FastTrack network, when the RIAA managed to introduce faked chunks into downloads and downloaded files (mostly MP3 files); files corrupted in this way were often unusable afterwards or even contained malicious code. The RIAA is also known to have uploaded fake music and movies to P2P networks in order to deter illegal file sharing. Consequently, the P2P networks of today have seen an enormous increase in their security and file-verification mechanisms. Modern hashing, chunk verification, and various encryption methods have made most networks resistant to almost any type of attack, even when major parts of the respective network have been replaced by faked or nonfunctional hosts.
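As a minimal sketch of the chunk-verification idea just described (the chunk size, function names, and data here are illustrative assumptions, not any particular network's protocol), a downloader can hash each chunk received from an untrusted peer and discard it if it does not match a trusted list of expected hashes:

import hashlib

CHUNK_SIZE = 256 * 1024  # assumed fixed chunk size (256 KiB)

def chunk_hashes(data: bytes) -> list[str]:
    # Done once by the publisher: split the file and hash every chunk.
    return [hashlib.sha256(data[i:i + CHUNK_SIZE]).hexdigest()
            for i in range(0, len(data), CHUNK_SIZE)]

def verify_chunk(index: int, chunk: bytes, trusted: list[str]) -> bool:
    # Done by each downloader: accept a chunk only if its hash matches.
    return hashlib.sha256(chunk).hexdigest() == trusted[index]

original = b"x" * (3 * CHUNK_SIZE)
trusted = chunk_hashes(original)
assert verify_chunk(0, original[:CHUNK_SIZE], trusted)    # genuine chunk passes
assert not verify_chunk(0, b"y" * CHUNK_SIZE, trusted)    # faked chunk is rejected and re-requested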
There are both advantages and disadvantages in P2P networks related to the topic of data backup, recovery, and availability. In a centralized network, the system administrators are the only forces controlling the availability of files being shared. If the administrators decide to no longer distribute a file, they simply have to remove it from their servers, and it will no longer be available to users. Along with leaving the users powerless in deciding what is distributed throughout the community, this makes the entire system vulnerable to threats and requests from the government and other large forces. For example, YouTube has been pressured by the RIAA, the MPAA, and the entertainment industry to filter out copyrighted content. Because client–server networks are able to monitor and manage content availability, they can offer more stability in the availability of the content they choose to host: a client should not have trouble accessing obscure content that is being shared on a stable centralized network. P2P networks, however, are less reliable for sharing unpopular files, because sharing a file in a P2P network requires that at least one node holding the requested data be online and reachable by the requesting node. This requirement is occasionally hard to meet because users may delete or stop sharing data at any point.
In this sense, the community of users in a P2P network is completely responsible for deciding what content is available. Unpopular files will eventually disappear and become unavailable as more people stop sharing them. Popular files, however, will be widely and easily distributed; indeed, popular files on a P2P network have more stability and availability than files on central networks. In a centralized network, a simple loss of connection between the clients and the server is enough to cause a failure, whereas in a P2P network the connections to every node holding the data must be lost before sharing fails. In a centralized system, the administrators are responsible for all data recovery and backups, while in P2P systems each node requires its own backup system. Because of the lack of central authority in P2P networks, forces such as the RIAA, the MPAA, and the government are unable to delete or stop the sharing of content on P2P systems.
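The replication effect behind this stability can be put in rough numbers (the probabilities below are assumptions for illustration, not measurements): if n independent peers each hold a copy of a file and each is online with probability p, the file is unavailable only when all n are offline at once, so its availability is 1 - (1 - p)^n:

# Probability that at least one of n seeders is online (illustrative model).
def availability(n: int, p: float) -> float:
    return 1.0 - (1.0 - p) ** n

print(availability(1, 0.2))    # unpopular file, one flaky seeder: 0.2
print(availability(20, 0.2))   # popular file, 20 flaky seeders: ~0.99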
Social and economic impact
The concept of P2P is increasingly being extended to describe the relational dynamic at work in distributed networks generally, i.e., not just computer-to-computer but human-to-human. Yochai Benkler has coined the term commons-based peer production to denote collaborative projects such as free and open source software and Wikipedia. Associated with peer production are the concepts of:
- peer governance (referring to the manner in which peer production projects are managed)
- peer property (referring to the new type of licenses which recognize individual authorship but not exclusive property rights, such as the GNU General Public License and the Creative Commons licenses)
- peer distribution (or the manner in which products, particularly peer-produced products, are distributed)
Some researchers have explored the benefits of enabling virtual communities to self-organize and introduce incentives for resource sharing and cooperation, arguing that the social aspect missing from today's P2P systems should be seen both as a goal and as a means for self-organized virtual communities to be built and fostered. Ongoing research efforts for designing effective incentive mechanisms in P2P systems, based on principles from game theory, are beginning to take on a more psychological and information-processing direction.
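One concrete example of such an incentive mechanism, sketched below under simplifying assumptions (the policy is loosely inspired by BitTorrent's tit-for-tat choking and is not a description of any specific deployed system), is to grant upload slots preferentially to the neighbors that have recently contributed the most, so that cooperation is rewarded and free-riding is penalized:

import random

def choose_unchoked(received_bytes: dict[str, int], slots: int = 4) -> set[str]:
    # Reward the top uploaders with all but one of our upload slots...
    ranked = sorted(received_bytes, key=received_bytes.get, reverse=True)
    unchoked = set(ranked[:slots - 1])
    # ...and give one random remaining peer an "optimistic" slot, so that
    # newcomers with no history get a chance to start reciprocating.
    candidates = [p for p in ranked if p not in unchoked]
    if candidates:
        unchoked.add(random.choice(candidates))
    return unchoked

history = {"alice": 900_000, "bob": 650_000, "carol": 10_000, "dave": 0}
print(choose_unchoked(history))   # top uploaders plus one random optimistic pick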
Applications
There are numerous applications of peer-to-peer networks. The most commonly known is content distribution.
Content delivery
- Many file sharing networks, such as gnutella, G2, and the eDonkey network, popularized peer-to-peer technologies. From 2004 on, such networks have formed the largest contributor of network traffic on the Internet.
- Peer-to-peer content delivery networks (P2P-CDN). See: Giraffic, Kontiki, Ignite, RedSwoosh.
- Peer-to-peer content services, e.g. caches for improved performance such as Correli Caches[18]
- Software publication and distribution (Linux distributions, several games) via file sharing networks.
- Streaming media: P2PTV and PDTP. Applications include TVUPlayer, Joost, CoolStreaming, Cybersky-TV, PPLive, LiveStation, Giraffic, and Didiom.
- Spotify uses a peer-to-peer network along with streaming servers to stream music to its desktop music player.
- Peercasting for multicasting streams. See PeerCast, IceShare, FreeCast, Rawflow.
- Pennsylvania State University, MIT, and Simon Fraser University are carrying out a project called LionShare, designed to facilitate file sharing among educational institutions globally.
- Osiris (Serverless Portal System) allows its users to create anonymous and autonomous web portals distributed via a P2P network.
Exchange of physical goods, services, or space
- Peer-to-peer renting web platforms enable people to find and reserve goods, services, or space on the virtual platform, but carry out the actual P2P transaction in the physical world (for example, emailing a local footwear vendor to reserve a pair of slippers you have had your eye on, or contacting a neighbor who has listed their weedwacker for rent).
- Bitcoin is a peer-to-peer-based digital currency.
- Tradepal is a peer-to-peer marketplace where users list, discover, share, and trade unique items with trusted peers.
Networking
- Dalesa, a peer-to-peer web cache for LANs (based on IP multicasting).
- Voice Peering Fabric, a peer-to-peer interconnect system for routing VoIP traffic between organizations using BGP and ENUM technology.
- Open Garden, a connection-sharing application that shares Internet access with other devices over Wi-Fi or Bluetooth.
Science
- In bioinformatics, drug candidate identification. The first such program was begun in 2001 by the Centre for Computational Drug Discovery at the University of Oxford in cooperation with the National Foundation for Cancer Research. There are now several similar programs running under the United Devices Cancer Research Project.
- The sciencenet P2P search engine.
Search
- Distributed search engine, a search engine with no central server.
- YaCy, a free distributed search engine built on peer-to-peer principles.
- FAROO, a peer-to-peer web search engine.
Communications networks
- Skype, one of the most widely used Internet telephony applications, uses P2P technology.
- VoIP (using application-layer protocols such as SIP).
- Instant messaging and online chat.
- Completely decentralized networks of peers: Usenet (1979) and WWIVnet (1987).
General
- Research projects such as the Chord project, the PAST storage utility, P-Grid, and the CoopNet content distribution system (a minimal sketch of Chord-style key lookup follows this list).
- JXTA, for peer applications. See Collanos Workplace (teamwork software) and Sixearch.
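As referenced above, here is a minimal sketch of Chord-style key lookup (the identifier size, node names, and the linear successor scan are simplifying assumptions; real Chord routes through finger tables in O(log n) hops): every node and key is hashed onto a circular identifier space, and each key belongs to the first node whose identifier follows it on the ring:

import hashlib

M = 16           # assumed identifier size in bits (small, for illustration)
RING = 2 ** M

def chord_id(name: str) -> int:
    # Hash a node address or key name onto the circular identifier space.
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big") % RING

def successor(key: str, nodes: list[str]) -> str:
    # The node responsible for a key is the first node clockwise from the
    # key's identifier (linear scan here; Chord uses finger tables instead).
    kid = chord_id(key)
    ring = sorted(nodes, key=chord_id)
    for node in ring:
        if chord_id(node) >= kid:
            return node
    return ring[0]   # wrapped past the largest identifier: back to the start

nodes = ["peer-a:4001", "peer-b:4001", "peer-c:4001", "peer-d:4001"]
print(successor("some-file.iso", nodes))   # the peer that stores or indexes this key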
Miscellaneous
- The U.S. Department of Defense is conducting research on P2P networks as part of its modern network warfare strategy. In May 2003, Anthony Tether, then director of DARPA, testified that the U.S. military uses P2P networks.
- Kato et al.'s studies indicate that over 200 companies have invested approximately US$400 million in P2P networking. Besides file sharing, companies are also interested in distributed computing and content distribution applications.
- Wireless community networks, e.g. Netsukuku.
- An earlier generation of peer-to-peer systems was called "metacomputing" or classed as "middleware". Examples include Legion and Globus.
Historical perspective
Some networks and channels, such as Napster, OpenNAP, and IRC server channels, use a client–server structure for some tasks (e.g., searching) and a P2P structure for others. Networks such as gnutella or Freenet use a P2P structure for nearly all tasks, with the exception of finding peers to connect to when first setting up.
P2P architecture embodies one of the key technical concepts of the Internet, described in the first Internet Request for Comments, RFC 1, "Host Software", dated April 7, 1969. More recently, the concept has achieved recognition among the general public in the context of the absence of central indexing servers in architectures used for exchanging multimedia files.