This month's issue of IEEE Computer includes four articles on system-level science: the integration of diverse sources of knowledge about the constituent parts of a complex system, with the goal of understanding the system's properties as a whole. This being IEEE Computer, the articles focus in particular on the information technology (IT) issues involved in achieving scientific goals:
[S]ystem-level science integrates not only different disciplines but also, typically, software systems, data, computing resources, and people. System-level science is usually a team pursuit. Data comes from different sources, different groups develop component models, team members provide specialized expertise, and the often substantial computing and data resources required for success are themselves diverse and distributed. Thus, system-level science itself requires the creation of yet another sort of system that may combine large numbers of both physical and human components.
The four articles are as follows:
- Scaling System-Level Science: Scientific Exploration and IT Implications, by Carl Kesselman and me, introduces the special issue. We lead off with a quote from Robert Calderbank:
Sometimes through heroism you can make something work. However, understanding why it worked, abstracting it, making it a primitive is the key to getting to the next order of magnitude of scale.
We then discuss the IT issues that must be addressed to increase the scale at which we can tackle system-level science problems.
- From Molecule to Man: Decision Support in Individualized E-Health (PDF) by Peter Sloot and colleagues:
Computer science provides the language needed to study and understand complex multiscale, multiscience systems. ViroLab, a grid-based decision-support system, demonstrates how researchers can now study diseases from the DNA level all the way up to medical responses to treatment.
- Multiscale Modeling: Physiome Project Standards, Tools, and Databases, by Peter Hunter and colleagues:
The Physiome Project's markup languages and associated tools leverage the CellML and FieldML model databases published in peer-reviewed journals. As these tools mature, researchers can check models for conformance to underlying physical laws and use them to develop complex physiological models from separately validated components.
- CASA and LEAD: Adaptive Cyberinfrastructure for Real-Time Multiscale Weather Forecasting (PDF), by Beth Plale and colleagues:
Two closely linked projects aim to dramatically improve storm forecasting speed and accuracy. CASA is creating a distributed, collaborative, adaptive sensor network of low-power, high-resolution radars that respond to user needs. LEAD offers dynamic workflow orchestration and data management in a Web services framework designed to support on-demand, real-time, dynamically adaptive systems.