Plenary Session 4: Big Data Management

The increasing amount of data made available by the rise of new technologies for gathering, processing and rendering digital resources, together with radical improvements in global communication, is transforming the way research is carried out. Massive datasets can now be assembled and explored in ways that reveal inherent but previously unsuspected relationships. This data-intensive science is a promising new paradigm for discovery and knowledge production.

Scientific disciplines such as particle physics, genomics and astronomy were the first to experience an explosion in their data volumes in the 2000s, giving rise to the term 'big data'; this notion is now permeating all areas of human endeavour.

In the health domain, for instance, new medicines have been discovered by mining databases that describe the properties of drug-like compounds. 'Big data' refers to massive and ever-growing volumes of data, extremely high rates of data production, or other complexities. Processing such data demands high computing power as well as new solutions for storage and management.

According to current estimates, the global volume of data generated by our digital world doubles every two years, raising the question of how to deal with these huge data flows and how to sustainably support the necessary technological and structural developments.
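To give a sense of what such a doubling rate implies, the following is a minimal sketch that projects data volumes forward under the two-year-doubling estimate; the baseline figure `baseline_zb` and the time horizon are illustrative assumptions, not figures reported in this session.

```python
# Minimal sketch: projected global data volume, assuming the volume
# doubles every two years (the estimate cited above). The baseline
# value is a hypothetical assumption chosen only for illustration.

def projected_volume_zb(baseline_zb: float, years: float) -> float:
    """Return the projected volume after `years`, given two-year doubling."""
    return baseline_zb * 2 ** (years / 2)

if __name__ == "__main__":
    baseline_zb = 10.0  # hypothetical starting volume, in zettabytes
    for years in (2, 4, 6, 8, 10):
        volume = projected_volume_zb(baseline_zb, years)
        print(f"after {years:2d} years: {volume:7.1f} ZB")
```

Under these assumptions the volume grows 32-fold within a decade, which illustrates why storage, management and processing infrastructures must scale far faster than linear capacity upgrades allow.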

Big data will be one of the major cross-cutting challenges in research; clear guidelines and policies for the global scientific community and for infrastructure operation have to be developed.