
Project: MapReduce with In-Memory Technology

Team: Prof. Dr. Jens Dittrich, Jorge-Arnulfo Quiané-Ruiz

Research institution: Saarland University

Abstract: Current data management needs require applications to run over a large number of computing nodes. In this context, MapReduce is quickly becoming the de facto standard for analyzing very large datasets. One of its key features is that it allows non-expert users to easily perform complex tasks over very large datasets. However, this simplicity comes at a price: MapReduce typically reads intermediate results from and writes them to disk, and hence the performance of MapReduce jobs is usually suboptimal. The database research community has done a great deal of work to improve the performance of MapReduce. Despite all these efforts, MapReduce still does not take advantage of the main memory capacity of computing clusters. In this project, we aim to improve the query responsiveness of big data processing frameworks. To do so, we plan to fully exploit the main memory capacity of computing clusters in a dynamic way.
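For illustration only (not part of the project's code), the sketch below shows a standard Hadoop MapReduce job written against the stock org.apache.hadoop.mapreduce API, using the classic word-count example. It highlights the behavior the abstract refers to: the intermediate key-value pairs emitted by the mapper are buffered in memory only up to a threshold and are then spilled to local disk before the shuffle phase, which is the disk-bound step whose cost this project aims to reduce by exploiting cluster main memory.

{{{#!java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Standard Hadoop word-count job, shown only to illustrate where a stock
// MapReduce execution touches disk: map output is spilled to the local disk
// of each node before being shuffled to the reducers.
public class WordCount {

  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      // Each (word, 1) pair written here is intermediate data: it is buffered
      // in memory up to a configured limit and then spilled to local disk.
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      // Values arrive after the disk-based shuffle and merge of map outputs.
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
}}}

Note that nothing in this job's code asks for disk I/O explicitly; the spills and the shuffle are imposed by the framework's execution model, which is why the project targets the framework layer rather than user programs.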

Last modified on May 3, 2012 11:38:39 AM