Grid Computing

Introduction

As the Internet ushered humanity into the Information Age, communication and access to computing resources and data became integral to life in the developed world. Scientists are attempting to harness the considerable resources made available through the Internet to offer computing, communication, and data solutions to those who require massive amounts of processing power. One such solution is grid computing, also known as Internet computing, adaptive computing, meta-computing, global computing, and even planetary computing, a name that evokes the much-acclaimed SETI@home project, which relies on Internet-connected computers to Search for Extra-Terrestrial Intelligence (SETI) (Buyya, 2002, 85-89).

History and Definition

Ian Foster, a computer scientist at the University of Chicago, and Carl Kesselman, of the Information Sciences Institute at the University of Southern California, earned world recognition by proposing a new paradigm in distributed computing in the mid-1990s, which they referred to as "grid computing." Grid computing made it possible to use the vast array of new networks, including the Internet, to bring globally dispersed computing resources together (Foster & Kesselman, 1999, 15-29). Grid computing provides computing power in much the same way that a power grid creates a single, reliable, pervasive source of energy by utilizing electricity generated by many suppliers dispersed across many geographical regions. Fran Berman, Geoffrey Fox, and Tony Hey, editors of Grid Computing: Making the Global Infrastructure a Reality, defined grid computing as a system that "integrates networking, communication, computation and information to provide a virtual platform for computation and data management, in the same way that the Internet integrates resources to form a virtual platform for information" (9). In essence, grid computing refers to a set of common standards, protocols, mechanisms, and tools that can be implemented to harness idle computing resources, data resources, specialized scientific instruments, and applications in order to create a coordinated, collaborative virtual supercomputer offering almost limitless processing power and storage space (Foster, Kesselman & Tuecke, 2001, 74-83).
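
The core idea of pooling dispersed resources can be illustrated with a minimal coordinator/worker sketch. This is not any real grid middleware or API; the names work_unit and run_on_grid are illustrative, and threads stand in here for remote machines. A real grid layers resource discovery, security, and scheduling protocols on top of this basic pattern.

```python
# Minimal sketch of the coordinator/worker pattern underlying grid
# computing: independent work units are farmed out to available workers
# and the partial results are combined. Threads stand in for remote
# nodes; the names work_unit and run_on_grid are illustrative only.
from concurrent.futures import ThreadPoolExecutor

def work_unit(chunk):
    """One independent task, e.g. analysing one slice of telescope data."""
    return sum(x * x for x in chunk)

def run_on_grid(data, n_workers=4, chunk_size=1000):
    """Split the data into chunks and process them on the worker pool."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(work_unit, chunks))

print(run_on_grid(range(10_000)))  # same answer as a serial computation
```

Because the work units are independent, the coordinator can tolerate slow or unreliable workers by simply reassigning their chunks, which is what makes volunteer projects such as SETI@home feasible over the open Internet.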

Grid Computing Applications

Although grid computing traces its inception to wide-area distributed supercomputing, today it supports myriad disciplines that require dispersed data resources and high computational power, such as high-energy physics, biophysics, molecular biology, risk analysis and modeling, financial modeling, scenario development, natural disaster modeling, geophysics and astrophysics, weather forecasting, computer simulation, and first-response coordination of emergency services (Berman, Fox & Hey, 2002, 56-61).

One major U.S. grid computing project is the National Science Foundation's $53 million TeraGrid, which connects computing resources and provides one of the largest grids available. The TeraGrid performs calculations at a speed of 13.6 teraflops (13.6 trillion floating-point operations per second), offers over 0.6 petabytes of disk space (a petabyte is a million gigabytes), and has a dedicated network interconnecting all the nodes at 40 gigabits per second. The high-energy physics lab of the European Organization for Nuclear Research (CERN) created a data grid to disseminate the over 10 petabytes of data it expects to generate from the new particle accelerator due to begin operations in ...
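
The figures above mix several SI prefixes; a quick arithmetic check puts them on a common footing (assuming decimal prefixes: tera = 10^12, peta = 10^15, giga = 10^9):

```python
# Back-of-the-envelope conversions for the grid figures quoted above,
# using decimal SI prefixes (tera = 1e12, peta = 1e15, giga = 1e9).
TERA, PETA, GIGA = 1e12, 1e15, 1e9

flops = 13.6 * TERA                  # 13.6 teraflops in operations per second
disk_gb = 0.6 * PETA / GIGA          # 0.6 petabytes expressed in gigabytes
link_mb_per_s = 40 * GIGA / 8 / 1e6  # 40 gigabits/s in megabytes per second

print(f"{flops:.3g} FLOPS, {disk_gb:,.0f} GB of disk, {link_mb_per_s:,.0f} MB/s links")
```

So 0.6 petabytes is roughly 600,000 gigabytes, and the 40-gigabit interconnect moves about 5,000 megabytes per second between nodes.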