Enabling Infrastructures and Middleware for Big-Data Modelling and Simulation
Chair: Prof. Ewa Niewiadomska-Szynkiewicz, Poland
Vice-Chair: Prof. Ioan Salomie, Romania
Since the inception of the Internet, there has been explosive growth in the volume, velocity, and variety of electronic data created every day. Raw data now originates from numerous sources, including mobile devices, sensors, scientific instruments (e.g. the CERN LHC, MR scanners), computer files, the Internet of Things, governmental and open data archives, system software logs, social networks, and commercial datasets. The so-called ‘Big Data’ problem requires the continuous improvement of servers, storage, and the whole network infrastructure in order to enable the efficient analysis and interpretation of data through on-hand data management applications (e.g. agent-based solutions such as the Agent component in Oracle Data Integrator (ODI)). The main challenge in Big Data Modelling and Simulation is to define a complete framework that includes intelligent coordination and communication, data fusion, mapping algorithms, and protocols. Programming abstractions and data manipulation techniques must therefore be designed for (a) the seamless implementation of application solutions with efficient virtualisation of computational resources (communications, storage, and servers); and (b) the effective normalisation and merging of data of dissimilar types into a consistent format (a wide class of data services).
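The normalisation and merging task in (b) can be illustrated with a minimal sketch. The source names, field names, and units below are purely illustrative assumptions, not part of any specific middleware: two records with dissimilar schemas (a binary sensor reading and a textual log entry) are mapped onto one consistent format.

```python
# Minimal sketch of normalising records from dissimilar sources into one
# consistent schema. Field names, units, and sources are illustrative only.
from datetime import datetime, timezone

def normalise(record: dict, source: str) -> dict:
    """Map a source-specific record onto a common schema."""
    if source == "sensor":
        return {
            "id": record["sensor_id"],
            # epoch seconds -> ISO 8601 UTC timestamp
            "timestamp": datetime.fromtimestamp(
                record["ts"], tz=timezone.utc
            ).isoformat(),
            "value": record["reading_mV"] / 1000.0,  # millivolts -> volts
            "unit": "V",
        }
    if source == "log":
        return {
            "id": record["host"],
            "timestamp": record["time"],  # already ISO 8601
            "value": float(record["voltage"]),
            "unit": "V",
        }
    raise ValueError(f"unknown source: {source}")

# Two records of dissimilar type, merged into one consistent collection.
merged = [
    normalise({"sensor_id": "s-17", "ts": 1700000000, "reading_mV": 3300},
              "sensor"),
    normalise({"host": "node-2", "time": "2023-11-14T22:13:20+00:00",
               "voltage": "3.3"}, "log"),
]
```

Once both records share the same schema and units, downstream analysis code needs no knowledge of the original source formats.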
WG1 activities will foster research on:
- A comprehensive survey and taxonomy of existing Big Data system architectures and middleware, including the Artemis Platform for Neonatal IC and systems using RFID-based technologies, smart meters, smart passports, etc., with a particular focus on the modelling frameworks of interest for WG3 and WG4.
- Analysis and design of Big Data system components based on practical use cases and user requirements, particularly those defined in WG3 and WG4.
- Development of novel heterogeneous models, algorithms, and techniques for advanced Big Data exploitation, based on current and emerging multicore system architectures, virtualised servers and data centres, and mobile cloud and multi-cloud systems. This objective is tightly coupled with the WG2 objectives.
- Definition of a test bed (benchmark suite) and a standardised library for heterogeneous parallel processing of Big Data for life, physical, and social science applications.
- Analysis and development of new trends in the evolution of Big Data middleware and architectures.
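The heterogeneous parallel processing targeted by the test bed and library activities above can be sketched, under simplifying assumptions, as a plain data-parallel computation: a dataset is split into chunks, partial results are computed by a pool of workers, and the partials are reduced. The chunking scheme and the analysis step below are purely illustrative.

```python
# Illustrative sketch of data-parallel processing with Python's standard
# multiprocessing pool; the analysis step is a placeholder computation.
from multiprocessing import Pool

def process_chunk(chunk):
    """Placeholder analysis step: sum of squared values in one chunk."""
    return sum(x * x for x in chunk)

def parallel_total(n=10_000, chunk_size=1_000, workers=4):
    """Split the range [0, n) into chunks, map them over a worker pool,
    and reduce the partial results."""
    chunks = [list(range(i, i + chunk_size)) for i in range(0, n, chunk_size)]
    with Pool(processes=workers) as pool:
        partials = pool.map(process_chunk, chunks)
    return sum(partials)

if __name__ == "__main__":
    print(parallel_total())  # sum of squares of 0..9999
```

A benchmark suite in this spirit would replace the placeholder step with representative kernels from life, physical, and social science applications, and run them across the heterogeneous platforms under study.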
Prof. Ewa Niewiadomska-Szynkiewicz graduated from the Faculty of Electronics, Warsaw University of Technology (WUT). She received her Ph.D. in 1995 and her D.Sc. in 2005, both from WUT. Since 1988 she has been with the Institute of Control and Computation Engineering, WUT, where she is currently a professor and head of the Complex Systems Group. She also works with the Research and Academic Computer Network (NASK), where she has been an associate professor since 2006 and head of the Traffic Engineering and Network Simulation Group (R&D division). From 2002 to 2009 she was a member of the scientific council of NASK, serving as its vice-chairman in 2008-2009, and since September 2009 she has been the Research Director at NASK. She has participated in a number of research projects, including three European projects within the TEMPUS programme and EU projects (5th FP and 7th FP), coordinated a number of the groups' activities, and managed the organisation of several national-level conferences. She is an expert of the Polish Accreditation Committee, working for the quality of education in Poland, and a member of the Committee on Automatic Control and Robotics of the Polish Academy of Sciences. For many years she was involved in research on complex systems modelling, control and optimisation, computer simulation, and decision support systems. Her current interests are computer simulation, optimisation and network modelling, ad hoc networks, and high-performance computing. She is the author or co-author of two books on parallel and distributed computing, one textbook for e-learning in clusters and grids, and over 140 journal and conference papers.
Ioan Salomie is Professor of Computer Science at the Technical University of Cluj-Napoca (TUCN), Romania, and Head of the Distributed Systems Research Laboratory of the same university. He graduated from the University “Politehnica” of Bucharest and received his Ph.D. from TUCN in 1994. Professor Salomie and his research team have participated in national and international research projects in the areas of energy-efficient data centres in the context of smart grids and smart cities (with contributions including energy-aware semantic modelling, simulation, optimisation, and green scheduling of tasks for the FP7 projects GAMES and GEYSER) and ambient assisted living (the FP7 AAL projects Diet4Elders and EldersUp!). His research areas include green computing and systems, context awareness and autonomic computing, adaptive complex systems, and intelligent systems. Professor Salomie has held visiting professor positions at Loyola University in Maryland and at the University of Limerick, and is the author or co-author of 7 books, several book chapters, and more than 100 journal and conference papers in his research areas.