Panels

Panels provide an opportunity for participants to interact with a group of experts in an open question-and-answer environment.

  • Venture Capital: Who Wants to be a Billionaire?
  • Is TCP an Adequate Protocol for High Performance Computing Needs?
  • Petaflops Around the Corner: When? How? Is it Meaningful?
  • Convergence at the Extremes: Computational Science Meets Networked Sensors
  • Computational Grids: A Solution Looking for a Problem?
  • Open Source: IP in the Internet Era
  • Megacomputers

Is TCP an Adequate Protocol for High Performance Computing Needs?

TCP/IP is the standard for data transmission in the web era. This fundamental protocol has undergone many changes to continually improve its dynamic range for performance and stability. TCP itself is essential to the stability of today's wide-area networks. However, this protocol isn't for everyone. For example, high-performance cluster builders often dispense with TCP, viewing it as too heavyweight, and tuning TCP for high-performance (e.g., gigabit) WAN connections is very time-consuming.

This panel will address how the high-performance community and the massive consumer Internet-at-large community can converge on a protocol that satisfies all emerging networking needs. Some questions include: where are the fundamental problems, and can they be overcome? What is the current level of performance that can be attained, and how difficult was it to tune the network?
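Much of the tuning burden mentioned above comes from matching TCP's socket buffers to the bandwidth-delay product (BDP) of the path. A minimal sketch of that calculation is below; the gigabit bandwidth and 60 ms round-trip time are illustrative figures, not values from the panel.

```python
import socket

def bdp_bytes(bandwidth_bps, rtt_s):
    """Bandwidth-delay product: bytes that must be in flight to fill the pipe."""
    return int(bandwidth_bps * rtt_s / 8)

# Hypothetical gigabit WAN path with a 60 ms round-trip time.
buf = bdp_bytes(1_000_000_000, 0.060)  # 7,500,000 bytes (~7.2 MB)

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
# Request send/receive buffers at least as large as the BDP; the OS may
# clamp these to its configured maximums, which is part of why WAN tuning
# requires administrator attention at both endpoints.
sock.setsockopt(socket.SOL_SOCKET, socket.SO_SNDBUF, buf)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVBUF, buf)
sock.close()
```

If the buffers are smaller than the BDP, the sender stalls waiting for acknowledgments and the link never fills, which is one concrete reason untuned TCP underperforms on fast long-haul paths.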

Convergence at the Extremes: Computational Science Meets Networked Sensors

The SC conference has ridden the technology waves of the “killer micro” and the “killer network” over the past decade. Two new “killer technologies” are emerging that hold the potential to radically change the nature of computational science. Micro-electromechanical systems (MEMS) make it possible to incorporate tiny sensors into all sorts of computational devices. Low-power CMOS radios make it possible to communicate with minute devices deeply embedded in the physical environment. Not only do these technologies fundamentally change the degree of instrumentation possible in experiments, they provide a unique means of linking computational science and real-time measurement in novel environments. This panel will explore the potential of these emerging technologies, what really is new, and what difference it all makes.

Computational Grids: A Solution Looking for a Problem?

The computational grid, or simply “The Grid” is the latest buzz in computing. Among all the hype are real problems and issues that need addressing before the Grid becomes a ubiquitous and transparent service. Questions that this panel will address include: Will Grid computing “make it” and be successful? What will the Grid really look like? What’s the most important problem still to solve to make it successful? And, how do Grids differ from what’s being done already in distributed computing, clusters, and OS design?

Open Source: IP in the Internet Era

Open source operating systems, tools, and community scientific codes represent a fundamental shift in the ownership and intellectual property of software. Of particular importance is the fact that open source can provide continuity of the high-performance software stack as underlying hardware technology undergoes dramatic changes. However, open source gives up control of critical intellectual property and has become a two-edged sword, igniting passions in all communities that focus on software. This panel will address the following questions: Can the open source paradigm be applied to other disciplines, and should the term become open intellectual property? Can open intellectual property be a component of a profitable business plan? What constitutes truly open source code/intellectual property? And what software needs to be open source to serve the needs of the high-end/scientific computing user community?


Megacomputers

We are nearing a major discontinuity in the evolution of high-performance computing, creating a third era of supercomputing. The first era, which ended with the Cray 1, sought performance by building the fastest single processor possible. The second era, starting with the Illiac IV and continuing to today's tightly coupled computing complexes (PC superclusters, MPPs, SMPs, and DSMs), sought performance through tens to thousands of identical processors. The third era couples very large numbers of heterogeneous computers on networks to create a virtual supercomputer. This third era is now going through an important phase change as numerous academic groups and private companies move from the intranet to the Web itself, harnessing tens of thousands to millions of PCs on the Net. Questions to be addressed by the panel include: What is the software architecture model? Can general-purpose computing machines be created in this manner? What are the classes of applications most likely to move to this new computing fabric? And what new computer science research frontiers are important in this more "biological," effervescent style of architecture?
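The "virtual supercomputer" model described above typically takes the shape of a coordinator handing independent work units to many unreliable, heterogeneous workers and merging their results. A minimal sketch follows; the toy workload and names are illustrative assumptions, and a local thread pool stands in for the wide-area fabric of volunteer PCs.

```python
from concurrent.futures import ThreadPoolExecutor

def work_unit(chunk):
    """Stand-in for a real science kernel: sum of squares over one chunk."""
    return sum(x * x for x in chunk)

def run(chunks):
    # On a real megacomputer the workers are scattered PCs that may join,
    # leave, or fail mid-task; here a local pool merely illustrates the
    # scatter-and-merge structure of the computation.
    with ThreadPoolExecutor() as pool:
        return sum(pool.map(work_unit, chunks))
```

The design hinges on work units being independent, which is why embarrassingly parallel applications are the most natural early candidates for this computing fabric.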