Building Containerized Environmental Models Using Continuous Integration with Jenkins and Kubernetes

Keywords

Docker; Kubernetes; Jenkins; Continuous Integration; Distributed Systems

Start Date

27-6-2018 9:00 AM

End Date

27-6-2018 10:20 AM

Abstract

Environmental models typically consume vast amounts of computing resources. To effectively serve a growing community of physical, social, and natural scientists, these models must be able to scale dynamically and horizontally to meet demand. Models also require a vast array of software libraries, runtimes, compilers, and configurations specific to a particular application. Building and maintaining arrays of physical servers, each configured for one specific application, is expensive and inefficient. With the advent of software containers, model developers can isolate an application and all of its software dependencies from the physical server. Kubernetes, a container orchestration tool built by Google, makes it possible to deploy these containers dynamically and seamlessly across a cluster of machines. We introduce key concepts and tools for building distributed modeling systems with containers using Kubernetes, managed with a continuous integration pipeline built in Jenkins. We then build and deploy a suite of comprehensive flow analysis (CFA) models as microservices. Finally, we test the service responsiveness, throughput, and average execution time of various containerized configurations of CFA models against deployments on virtual and bare-metal machines.
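For readers unfamiliar with Kubernetes, the sketch below illustrates the kind of deployment the abstract describes: a containerized model replicated across a cluster as a Kubernetes Deployment, created here with the official Kubernetes Python client. This is a minimal illustrative sketch, not the authors' actual deployment code; the image name (registry.example.com/cfa-model:latest), labels, replica count, port, resource limits, and namespace are placeholder assumptions.

    # Illustrative sketch only: deploy a hypothetical containerized model as a
    # Kubernetes Deployment using the official Python client. Image name, labels,
    # replica count, port, and namespace are placeholder assumptions, not the
    # authors' actual configuration.
    from kubernetes import client, config


    def deploy_model(name="cfa-model",
                     image="registry.example.com/cfa-model:latest",
                     replicas=3, namespace="default"):
        # Load cluster credentials from the local kubeconfig (e.g. ~/.kube/config).
        config.load_kube_config()
        apps = client.AppsV1Api()

        # One container per pod, exposing the model's service port.
        container = client.V1Container(
            name=name,
            image=image,
            ports=[client.V1ContainerPort(container_port=8080)],
            resources=client.V1ResourceRequirements(
                limits={"cpu": "1", "memory": "512Mi"},
            ),
        )

        # Pod template labeled so the Deployment's selector can find its pods.
        template = client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": name}),
            spec=client.V1PodSpec(containers=[container]),
        )

        # The Deployment keeps `replicas` identical pods running across the cluster.
        deployment = client.V1Deployment(
            api_version="apps/v1",
            kind="Deployment",
            metadata=client.V1ObjectMeta(name=name),
            spec=client.V1DeploymentSpec(
                replicas=replicas,
                selector=client.V1LabelSelector(match_labels={"app": name}),
                template=template,
            ),
        )

        apps.create_namespaced_deployment(namespace=namespace, body=deployment)


    if __name__ == "__main__":
        deploy_model()

In a continuous integration pipeline like the one the abstract describes, a Jenkins job would typically build and push the container image and then apply a Deployment equivalent to this one; scaling the model horizontally then amounts to adjusting the replica count.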

Stream and Session

Stream A: Advanced Methods and Approaches in Environmental Computing

Session 5: Leveraging Cloud Computing, Containerization, and Microservices for Environmental Modelling Software Deployment
