Title

Going Serverless: Evaluating the Potential of Serverless Computing for Environmental Modeling Application Hosting

Keywords

cloud computing, serverless computing, model services

Start Date

27-6-2018 2:00 PM

End Date

27-6-2018 3:20 PM

Abstract

Recently, serverless computing platforms have emerged that provide automatic web service hosting in the cloud. These platforms are promoted for their ability to host “micro” services for end users while seamlessly integrating key features including 24/7 high availability, fault tolerance, and automatic scaling of resources to meet user demand. Serverless computing environments abstract away the majority of infrastructure management tasks, including VM/container creation and load balancer configuration. A key benefit of serverless computing is free access to cloud computing resources: many platforms provide free access for up to 1,000,000 service requests per month and 400,000 seconds per month of compute time at 1 GB of memory. Additionally, serverless platforms support programming languages common among model developers, including Java, Python, C#, and JavaScript.
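
To put the free-tier figures above in context, the short sketch below works through the arithmetic for a hypothetical model service. The 1 GB memory allocation and 2-second per-run duration are illustrative assumptions, not values reported in this study.

// Back-of-the-envelope estimate of how many model runs fit within a serverless
// free tier of 1,000,000 requests/month and 1 GB of memory for 400,000 s/month.
// The memory allocation and per-run duration below are illustrative assumptions.
public class FreeTierEstimate {
    public static void main(String[] args) {
        final long freeRequestsPerMonth = 1_000_000L;    // request cap of the free tier
        final double freeGbSecondsPerMonth = 400_000.0;  // compute cap: 1 GB * 400,000 s

        final double memoryGb = 1.0;    // assumed memory allocated to the function
        final double runSeconds = 2.0;  // assumed duration of one model run (hypothetical)

        double runsAllowedByCompute = freeGbSecondsPerMonth / (memoryGb * runSeconds); // 200,000
        double freeRunsPerMonth = Math.min(freeRequestsPerMonth, runsAllowedByCompute);

        System.out.printf("Approximate free model runs per month: %.0f%n", freeRunsPerMonth);
    }
}

Under these assumptions the compute allowance, rather than the request cap, would be the binding limit, allowing roughly 200,000 free model runs per month.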

We present results from a proof-of-concept deployment of the Precipitation-Runoff Modeling System (PRMS), a deterministic, distributed-parameter model developed to evaluate the impact of various combinations of precipitation, climate, and land use on stream flow, sediment yields, and general basin hydrology (Leavesley et al., 1983). We deployed a web-service implementation of PRMS, built using the Cloud Services Integration Platform (CSIP) and the Object Modeling System (OMS) 3.0 component-based modeling framework, to the Amazon AWS Lambda serverless computing platform. PRMS consists of approximately 11,000 lines of code and easily fits within the 256 MB maximum code size constraint of AWS Lambda. We compared our serverless deployment to a traditional VM-based deployment on Amazon EC2, contrasting average model execution time, service throughput (requests/minute), and the cloud hosting costs of PRMS on both platforms.
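
For readers unfamiliar with how a model service is exposed through a serverless platform, the minimal Java sketch below shows the general shape of an AWS Lambda request handler wrapping a model run. It is not the CSIP/OMS service code used in this study: runModel() is a hypothetical placeholder for the PRMS invocation, and the handler assumes the standard aws-lambda-java-core library is on the classpath.

import java.util.HashMap;
import java.util.Map;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

// Minimal sketch of a Lambda entry point wrapping a model run. This is not the
// CSIP/OMS service code used in the study; runModel() is a hypothetical
// placeholder for the PRMS invocation.
public class PrmsHandler implements RequestHandler<Map<String, Object>, Map<String, Object>> {

    @Override
    public Map<String, Object> handleRequest(Map<String, Object> request, Context context) {
        long start = System.currentTimeMillis();

        // Hypothetical model invocation; in the study this role is played by the
        // CSIP service wrapping the OMS 3.0 PRMS components.
        Map<String, Object> results = runModel(request);

        Map<String, Object> response = new HashMap<>();
        response.put("results", results);
        response.put("executionMs", System.currentTimeMillis() - start);
        return response;
    }

    private Map<String, Object> runModel(Map<String, Object> parameters) {
        // Placeholder: a real deployment would execute the PRMS simulation here.
        Map<String, Object> out = new HashMap<>();
        out.put("status", "ok");
        out.put("parameterCount", parameters == null ? 0 : parameters.size());
        return out;
    }
}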

Stream and Session

Stream A: Advanced Methods and Approaches in Environmental Computing

A5: Leveraging Cloud computing, Containerization, and Microservices for Environmental Modelling Software Deployment
