An alliance to advance the understanding of collaboratories
Science of Collaboratories
   
   


 
   
 

Name of Collaboratory :

 

Grid Physics Network (GriPhyN)

 
 

 
 

URL :

  http://www.griphyn.org/  
 

Collaboratory Status :

 
In Development  |  Start Date : 2000  |  End Date : 2005  |  Info Last Updated : Sat, Nov 1 2003, 4:01pm PST
 
 

Primary Collaboratory Function :

  Community Infrastructure Development  
 

Secondary Collaboratory Functions :

  Product Development, Distributed Research Center  
 

Domain(s) :

  high energy and nuclear physics, astronomy, computer science  
 

Brief Description of the Collaboratory :

 

GriPhyN (the Grid Physics Network) is a team of experimental physicists and information technology (IT) researchers planning to implement the first Petabyte-scale computational environments for data-intensive science. GriPhyN will deploy computational environments called Petascale Virtual Data Grids (PVDGs) to meet the data-intensive computational needs of the diverse community of international scientists involved in the related research. "Petascale" emphasizes the massive CPU resources (Petaflops) and the enormous datasets (Petabytes) that must be harnessed, while "virtual" refers to the many required data products that may never be physically stored but exist only as specifications for how they can be derived from other data.

GriPhyN is funded through the National Science Foundation as a large Information Technology Research (ITR) project. The group is focused on creating tools for managing "virtual data" - an approach to dealing with data that acknowledges that all data except "raw" data need exist only as a specification for how they can be derived. Strategies for reproducing or regenerating data on the grid are key areas of research for the virtual data community. The key deliverable of the GriPhyN project is the Chimera Virtual Data System, a software package for managing virtual data.
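The virtual-data idea can be sketched in a few lines: derived datasets are recorded as recipes (a transformation plus its inputs) rather than as stored files, and are regenerated on demand by walking the derivation chain back to raw data. This is an illustrative sketch only, not the actual Chimera API; the names `VirtualDataCatalog`, `Derivation`, and `materialize` are hypothetical.

```python
# A minimal sketch of a virtual-data catalog: only raw data is stored;
# derived data exists as a recipe and is regenerated when requested.
# Not the Chimera Virtual Data System's API - all names are illustrative.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Derivation:
    transform: Callable[..., bytes]  # how to produce the data product
    inputs: List[str]                # names of the datasets it depends on


class VirtualDataCatalog:
    def __init__(self) -> None:
        self.raw: Dict[str, bytes] = {}           # materialized raw data
        self.recipes: Dict[str, Derivation] = {}  # derived data as specs

    def add_raw(self, name: str, data: bytes) -> None:
        self.raw[name] = data

    def declare(self, name: str, transform: Callable[..., bytes],
                inputs: List[str]) -> None:
        # Register a derived dataset without computing or storing it.
        self.recipes[name] = Derivation(transform, inputs)

    def materialize(self, name: str) -> bytes:
        # Raw data is returned directly; derived data is regenerated by
        # recursively materializing its inputs and applying the recipe.
        if name in self.raw:
            return self.raw[name]
        d = self.recipes[name]
        return d.transform(*(self.materialize(i) for i in d.inputs))


# Usage: a "calibrated" dataset exists only as a recipe over raw events.
cat = VirtualDataCatalog()
cat.add_raw("raw_events", b"\x01\x02\x03")
cat.declare("calibrated", lambda raw: bytes(b * 2 for b in raw),
            ["raw_events"])
print(cat.materialize("calibrated"))  # regenerated on demand, never stored
```

The design choice the sketch highlights is the one the project description emphasizes: since everything but raw data can be re-derived, a grid can trade storage for computation, regenerating data products near where they are needed.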

The collaboratory team is composed of seven IT research groups and members of four NSF-funded frontier physics experiments: LIGO, the Sloan Digital Sky Survey, and the CMS and ATLAS experiments at the Large Hadron Collider at CERN. GriPhyN will oversee the development of a set of production Data Grids that will allow scientists to extract small signals from enormous backgrounds via computationally demanding analyses of datasets growing from the 100 Terabyte to the 100 Petabyte scale over the next decade. The computing and storage resources required will be distributed, for both technical and strategic reasons, across national centers, regional centers, university computing centers, and individual desktops.

 
 

Access to Instruments :

  No direct access to instruments, though each of the experiments supported by GriPhyN provides some level of remote access to instruments (either directly or mediated by data repositories).  
 

Access to Information Resources :

  The Chimera Virtual Data System is set up as an information resource that mediates access to, and changes to, a dataset. It also ties closely into the information repositories and transport mechanisms provided by the Particle Physics Data Grid; more specifically, it uses the grid infrastructure to access information resources.  
 

Access to People as Resources :

  email, video conferencing (VRVS, H.323), POTS conferencing, face-to-face meetings  
 

Funding Agency or Sponsor :

  National Science Foundation (NSF)  
 
 

Notes on Funding Agencies/Sponsors:
Funding amount: $11.9 million plus $1.6 million in matching funds

 
 
 
Organizations with Funded Participants:
 
Argonne National Laboratory (ANL)
Fermi National Accelerator Laboratory (Fermilab)
Harvard University
Indiana University
Johns Hopkins University
Lawrence Berkeley National Laboratory (LBNL)
Northwestern University
University of California, San Diego (UCSD)
   San Diego Supercomputer Center (SDSC)
Stanford University
   Stanford Linear Accelerator Center (SLAC)
University of California, Berkeley
University of Florida
University of Illinois at Chicago (UIC)
University of Pennsylvania
University of Southern California (USC)
University of Texas at Brownsville
University of Wisconsin-Madison
University of Wisconsin-Milwaukee
 
TOTAL PARTICIPANTS: 82
 

Notes on Participants/Organizations:
82 in development, likely several thousand users of technology

   
     
 
 

Communications Technology Used :

   
 

Technical Capabilities :

  Management of technical resources: Security, Access control/login facilities
Computation: Remote launch of computation, Support for transition between synchronous and asynchronous work, Workflow management
Asynchronous object sharing: Index/metadata, Audit trail of events, Email/attachments
Asynchronous conversation: Email
Synchronous conversation: Audio, Video, Instant messaging/chat
 
  Key Articles :  

Freeman, P. A., Crawford, D. L., Kim, S., & Munoz, J. L. (2005). Cyberinfrastructure for science and engineering: Promises and challenges. Proceedings of the IEEE, 93(3), 682-691.

Hey, T., & Trefethen, A. (2003). e-Science and its implications. Philosophical Transactions of the Royal Society of London Series A: Mathematical, Physical and Engineering Sciences, 361(1809), 1809-1825.

 
 

Project-reported performance data :

   
 
         
    
University of California, Irvine

School of Information, University of Michigan