Keywords
irene, modelling, model testing, soe, software component
Start Date
1-7-2008 12:00 AM
Abstract
As the role of biophysical models in ecological, biological and agronomic areas grows in importance, there is an associated increase in the need for suitable approaches to evaluate the adequacy of model outputs (model testing, often referred to as “validation”). Effective testing techniques are required to assess complex models under a variety of conditions, drawing on a wide range of validation measures, possibly integrated into composite metrics. Both simple and composite metrics are being proposed by the scientific community, continuously broadening the pool of options for model evaluation. However, such new metrics are not available in commonly used statistical packages. At the same time, the large amount of data generally involved in model testing makes the operational use of new metrics a labour-intensive process, even more so when composite metrics are to be used. An extensible and easily reusable library encapsulating such metrics would be an operational way to share the knowledge developed on model testing. The emergence of component-oriented programming in model-based simulation has fostered debate on the reuse of models. There is a substantial consensus that component-based development is indeed an effective and affordable way of creating model applications, provided that components meet, through their architecture, a set of requirements that make them scalable, transparent, robust, easily reusable, and extensible. This paper illustrates the Windows .NET 2.0 component IRENE (Integrated Resources for Evaluating Numerical Estimates) and a first prototype application using it, SOE (Simulation Output Evaluator), to present a concrete application matching the above requirements in the area of model testing.
A Software Component for Model Output Evaluation
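To make the abstract's distinction between simple and composite metrics concrete, the sketch below computes two widely used validation measures (root mean square error and Nash-Sutcliffe modelling efficiency) and folds them into an illustrative composite score. This is a minimal standalone C# example under assumed conventions, not IRENE's actual API; the class, method names, and the equal-weight aggregation scheme are all hypothetical (real composite indicators, such as fuzzy-based ones, are more elaborate).

using System;
using System.Linq;

static class ModelTestSketch  // hypothetical name, not part of IRENE
{
    // Root mean square error between observed and estimated values.
    static double Rmse(double[] obs, double[] est) =>
        Math.Sqrt(obs.Zip(est, (o, e) => (o - e) * (o - e)).Average());

    // Nash-Sutcliffe modelling efficiency: 1 indicates a perfect fit,
    // values below 0 mean the model is worse than the observed mean.
    static double Efficiency(double[] obs, double[] est)
    {
        double mean = obs.Average();
        double num = obs.Zip(est, (o, e) => (o - e) * (o - e)).Sum();
        double den = obs.Sum(o => (o - mean) * (o - mean));
        return 1.0 - num / den;
    }

    static void Main()
    {
        double[] observed  = { 2.1, 3.4, 4.8, 6.0 };
        double[] estimated = { 2.0, 3.6, 4.5, 6.3 };

        double rmse = Rmse(observed, estimated);
        double ef   = Efficiency(observed, estimated);

        // Illustrative composite: equal-weight average of two scores
        // mapped onto [0, 1] (an assumed scheme, shown only to convey
        // the idea of aggregating simple metrics into one indicator).
        double rmseScore = 1.0 / (1.0 + rmse);   // maps [0, inf) to (0, 1]
        double efScore   = Math.Max(0.0, ef);    // clamp negative EF to 0
        double composite = 0.5 * rmseScore + 0.5 * efScore;

        Console.WriteLine($"RMSE={rmse:F3}  EF={ef:F3}  composite={composite:F3}");
    }
}

The point of encapsulating such routines in a reusable component, as the paper argues, is that once a metric (simple or composite) is implemented and tested, any client application such as SOE can apply it to large simulation datasets without re-deriving or re-coding it.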