3rd International Workshop on Workflow Models, Systems, Services and Applications in the Cloud
To be held in conjunction with the 28th IEEE International Parallel & Distributed Processing Symposium (IPDPS) 2014, Arizona Grand Resort PHOENIX (Arizona) USA, May 19-23, 2014
Cloud computing is gaining tremendous momentum in both academia and industry, and more and more people are migrating their data and applications into the Cloud. We have observed wide adoption of the MapReduce computing model and the open-source Hadoop system for large-scale distributed data processing, as well as a variety of ad hoc mashup techniques that weave together Web applications. However, these are only first steps toward managing complex task and data dependencies in the Cloud; more challenging issues remain, such as large parameter-space exploration, data partitioning and distribution, scheduling and optimization, smart reruns, and provenance tracking associated with workflow execution.
The Cloud needs structured and mature workflow technologies to handle such issues, and vice versa: the Cloud offers unprecedented scalability to workflow systems and could fundamentally change the way we perceive and conduct research and experiments. The scale and complexity of the science and data-analytics problems that can be tackled increase greatly in the Cloud, and the on-demand nature of Cloud resource allocation will also help improve resource utilization and user experience.
While Cloud computing provides a paradigm-shifting, utility-oriented computing model, with datacenter-level resource pools of unprecedented size and on-demand resource provisioning, there are many challenges in bringing Cloud and workflows together. We need high-level languages and computing models for large-scale workflow specification; we need to adapt existing workflow architectures to the Cloud and integrate workflow systems with Cloud infrastructure and resources; and we need to leverage Cloud data storage technologies to efficiently distribute data over large numbers of nodes and exploit data locality during computation. We organize the CloudFlow workshop as a venue for the workflow and Cloud communities to define models and paradigms, present their state-of-the-art work, share their thoughts and experiences, and explore new directions in realizing workflows in the Cloud.
We welcome the submission of original work related to the topics listed below, which include (in the context of Cloud):
• Models and Languages for Large Scale Workflow Specification
• Workflow Architecture and Framework
• Large Scale Workflow Systems
• Service Workflow
• Workflow Composition and Orchestration
• Workflow Migration into the Cloud
• Workflow Scheduling and Optimization
• Cloud Middleware in Support of Workflow
• Virtualized Environment
• Workflow Applications and Case Studies
• Performance and Scalability Analysis
• Peta-Scale Data Processing
• Event Processing and Messaging
• Real-Time Analytics
Authors are invited to submit papers presenting unpublished, original work. Papers should not exceed 10 single-spaced, double-column pages in a 10-point font on 8.5x11-inch pages (IEEE conference style), including figures, tables, and references.
Papers must be submitted in PDF format via the EDAS Conference System (http://edas.info/N15819) by midnight January 13th, 2014, Pacific Time. Proceedings of the workshop will be published in the IEEE Digital Library (indexed by EI) and distributed at the conference. Selected papers may be eligible for additional post-conference publication as journal articles or book chapters. Submission implies the willingness of at least one of the authors to register and present the paper.