P2P Workshop Summary
A workshop titled Collaborative Computing in Higher Education: Peer-to-Peer and Beyond took place January 30-31, 2002, at Arizona State University. The workshop assembled experts from academia, industry, and government to share experiences with peer-to-peer (P2P) technologies and suggest future directions for P2P and related technologies in higher education. Here are some of the major themes, along with a very small sample of the ideas put forward:
- P2P as a bandwidth hog. Workshop chair Ana Preston opened the meeting by noting that "university response to P2P has been largely reactive; we want to try to find a way to be more proactive, and beyond that, become part of it." For many in academia, though, the inescapable starting point is that P2P means Napster and similar large-file-sharing technologies, and that these technologies mean major headaches for bandwidth management. While there was broad agreement that P2P provides excellent motivation for universities to upgrade network capacity, there was also an acute awareness that this is not always a workable short-term solution. Joe St. Sauver predicted that "you're going to see Ethernet switches that only deliver 256K to each port, and they'll sell like hotcakes." On the other hand, there was also much interest in engineering better network awareness into filesharing applications to reduce their bandwidth usage.
- P2P and copyright. There was much discussion of the responsibilities of universities with respect to copyright; while there was no consensus on how to approach this, the dominant sentiment appeared to be that universities have a responsibility to inform their students of the law, but not to police them. Bill St. Arnaud pointed out an interesting relationship between the bandwidth-hogging and copyright issues: "Right now we can justify throttling P2P traffic because of copyright issues -- when it becomes legit, we have a problem."
- P2P for education. Though P2P has so far been little developed as a tool for education, there was broad agreement that there is great potential here. Bill St. Arnaud discussed the potential uses of P2P in "eScience", the goal of which he gave as "to allow students and eventually members of the general public to be full participants in scientific discovery and innovation." David Wiley discussed his work on Peers and Learning: A Solution to the "Teacher Bandwidth" Problem. Wiley stresses that asking a real person can never be entirely removed from the information-finding process; since Slashdot-like communities will therefore always be needed, it's important to make them work as well as possible.
- Resource discovery. Finding things in a P2P system gets much harder when you're not dealing with widely replicated content in a single standard format (as with Napster and MP3s) but rather with, for example, expert knowledge of a wide range of subjects. This gives great importance to work on search architectures and metadata. Eytan Adar presented SHOCK (Social Harvesting of Community Knowledge), developed to track down experts at Hewlett-Packard. SHOCK clients index what each user does and create locally-stored profiles of each user's expertise; questions are submitted in XML format, and a server helps direct questioners to the right expert. Internet2 Middleware Initiative chair Ken Klingenstein noted the applicability of much of I2-MI's work to providing the resource-discovery "plumbing" for P2P.
- P2P and other advanced networking technologies. P2P issues are inseparable from issues in other areas of advanced networking. For example, Yoid (Your Own Internet Distribution) does multicast at the application level; QoS technologies offer solutions to the bandwidth management issues posed by P2P; and IPv6 offers support for QoS, as well as providing addresses for the millions upon millions of peers that will be created by technologies like Smart Dust. Steve Wallace observed that "Higher education is one of the last communities where the end-to-end paradigm persists, and it's at risk there as well. There's a push to support IPv6 in higher education: if you want P2P you want end-to-end, and if you want end-to-end, support IPv6."
- Hybrid approaches. Many P2P applications incorporate "superpeers" and servers to perform specialized functions. David Anderson noted that his SETI@home project, often cited as an example of the promise of P2P, is strictly speaking not a P2P system at all, but rather an "inverted client-server" system. Werner Vogels pointed out that there is a huge amount of older research on distributed systems that's ripe for application to P2P.
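The expert-routing idea behind SHOCK, described in the resource-discovery item above, can be sketched roughly as follows. This is an illustrative assumption-laden sketch, not SHOCK's actual implementation: the function names, the bag-of-words profiles, and the overlap scoring are all invented here to show the general shape of "clients build local expertise profiles; a server matches questions to the best profile."

```python
# Illustrative sketch of SHOCK-style expert routing.
# All names and the scoring scheme are assumptions for demonstration,
# not the actual SHOCK design.

def build_profile(documents):
    """Index a user's activity into a set of expertise terms (local profile)."""
    terms = set()
    for doc in documents:
        terms.update(word.lower() for word in doc.split())
    return terms

def route_question(question, profiles):
    """Server-side step: return the user whose profile best overlaps the question."""
    q_terms = set(word.lower() for word in question.split())
    best_user, best_score = None, 0
    for user, profile in profiles.items():
        score = len(q_terms & profile)  # crude overlap score
        if score > best_score:
            best_user, best_score = user, score
    return best_user

# Hypothetical users and activity:
profiles = {
    "alice": build_profile(["inkjet printhead thermal design"]),
    "bob": build_profile(["xml schema validation tools"]),
}
print(route_question("who knows about xml validation tools", profiles))  # bob
```

A real system would of course need richer profiles and privacy controls (SHOCK stores profiles locally on each client precisely so users keep control of them), but the routing decision reduces to this kind of profile matching.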
The workshop participants shared a sense that P2P and related technologies have only just begun to be tapped; Micah Beck suggested that "the challenge for Internet2 is not to support distributed P2P per se, but to support users in whatever they'll think of to do next." Abstracts and presentations from the workshop are available, as are workshop-inspired predictions from Andy Oram, one of the keynote speakers.