
Big data


Big data refers to a field that deals with data sets too large or complex for traditional data-processing applications. The concept emerged in the 1990s and covers structured, semi-structured, and unstructured data. As the digital universe expands, big data continues to evolve, with data sets ranging from terabytes to zettabytes, and it necessitates new techniques for gaining insights from these massive, diverse collections. Big data is characterized by five Vs: volume, velocity, variety, veracity, and value. Its growth presents challenges in data capture, storage, analysis, privacy[1], quality, and authenticity. Meanwhile, with continuing technological advances, the world's capacity to store information has roughly doubled every 40 months. Big data has applications across sectors such as finance, healthcare, education, media, and insurance, and its use has transformed decision-making processes and approaches in many fields, indicating its immense potential and value.

Definitions of terms
1. privacy. Privacy is a fundamental concept that has evolved throughout history and continues to shape societal discourse. Historically, it grew out of philosophical debates, with figures such as Aristotle and John Locke laying its foundations. Privacy is also tied to legal and ethical questions, as illustrated by landmark Supreme Court rulings and revelations such as Edward Snowden's. Technological advances have both challenged and strengthened privacy, introducing new threats as well as new protective measures. Privacy standards vary from one country and international organization to another. In the digital age, privacy faces new challenges and considerations, such as the use of social media, selfie culture, and geolocation services. The concept encompasses an individual's right to keep personal information secret and free from unauthorized intrusion.
Big data (Wikipedia)

Big data primarily refers to data sets that are too large or complex to be dealt with by traditional data-processing application software. Data with many entries (rows) offer greater statistical power, while data with higher complexity (more attributes or columns) may lead to a higher false discovery rate. Though the term is sometimes used loosely, partly due to the lack of a formal definition, the best interpretation is that it denotes a large body of information that cannot be comprehended when only small samples of it are used.
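The link between many columns and a higher false discovery rate can be illustrated with a small simulation (an illustrative sketch, not from the article): correlating 1,000 purely random attributes with a random target still produces a batch of "significant" hits at the conventional 5% threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
n_rows, n_cols = 50, 1000
X = rng.normal(size=(n_rows, n_cols))   # 1,000 candidate attributes, pure noise
y = rng.normal(size=n_rows)             # target with no real relationship to X

# Pearson correlation of each column with the target
Xc = X - X.mean(axis=0)
yc = y - y.mean()
r = (Xc * yc[:, None]).sum(axis=0) / (
    np.sqrt((Xc**2).sum(axis=0)) * np.sqrt((yc**2).sum())
)

# Two-sided t-test at alpha = 0.05 (critical t for df = 48 is about 2.01)
t = r * np.sqrt((n_rows - 2) / (1 - r**2))
false_discoveries = int((np.abs(t) > 2.01).sum())
print(false_discoveries)  # roughly 5% of the 1,000 columns, despite no real signal
```

More rows raise statistical power; more columns, without correction for multiple comparisons, raise the number of spurious findings.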

Figure: Non-linear growth of digital global information-storage capacity and the waning of analog storage.

Big data analysis challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating, information privacy, and data source. Big data was originally associated with three key concepts: volume, variety, and velocity. Because such data sets are difficult to sample, big data analysis previously allowed only for observations and sampling. A fourth concept, veracity, refers to the quality or insightfulness of the data. Without sufficient investment in expertise for big data veracity, the volume and variety of data can produce costs and risks that exceed an organization's capacity to create and capture value from big data.

Current usage of the term big data tends to refer to the use of predictive analytics, user behavior analytics, or certain other advanced data analytics methods that extract value from big data, and seldom to a particular size of data set. "There is little doubt that the quantities of data now available are indeed large, but that's not the most relevant characteristic of this new data ecosystem." Analysis of data sets can find new correlations to "spot business trends, prevent diseases, combat crime and so on". Scientists, business executives, medical practitioners, advertisers, and governments alike regularly meet difficulties with large data sets in areas including Internet searches, fintech, healthcare analytics, geographic information systems, urban informatics, and business informatics. Scientists encounter limitations in e-Science work, including meteorology, genomics, connectomics, complex physics simulations, biology, and environmental research.

The size and number of available data sets have grown rapidly as data is collected by devices such as mobile devices, cheap and numerous information-sensing Internet of things devices, aerial (remote sensing) equipment, software logs, cameras, microphones, radio-frequency identification (RFID) readers, and wireless sensor networks. The world's technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, every day 2.5 exabytes (2.17×2⁶⁰ bytes) of data were generated. An IDC report predicted that the global data volume would grow exponentially from 4.4 zettabytes to 44 zettabytes between 2013 and 2020, and that by 2025 there would be 163 zettabytes of data. According to IDC, global spending on big data and business analytics (BDA) solutions was estimated to reach $215.7 billion in 2021. A Statista report, meanwhile, forecast that the global big data market would grow to $103 billion by 2027. In 2011, McKinsey & Company reported that if US healthcare were to use big data creatively and effectively to drive efficiency and quality, the sector could create more than $300 billion in value every year. In the developed economies of Europe, government administrators could save more than €100 billion ($149 billion) in operational efficiency improvements alone by using big data, and users of services enabled by personal-location data could capture $600 billion in consumer surplus. One question for large enterprises is determining who should own big-data initiatives that affect the entire organization.
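The two figures for daily data generation above are the same quantity in different units; a quick sanity check (assuming the decimal convention of 1 exabyte = 10^18 bytes) confirms the conversion:

```python
# 2.5 exabytes expressed as a multiple of 2^60 bytes.
exabyte = 10**18          # decimal (SI) definition of an exabyte
daily_bytes = 2.5 * exabyte
multiple_of_2_60 = daily_bytes / 2**60
print(round(multiple_of_2_60, 2))  # 2.17
```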

Relational database management systems and desktop statistical software packages used to visualize data often have difficulty processing and analyzing big data. The processing and analysis of big data may require "massively parallel software running on tens, hundreds, or even thousands of servers". What qualifies as "big data" varies depending on the capabilities of those analyzing it and their tools. Furthermore, expanding capabilities make big data a moving target. "For some organizations, facing hundreds of gigabytes of data for the first time may trigger a need to reconsider data management options. For others, it may take tens or hundreds of terabytes before data size becomes a significant consideration."
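The "massively parallel software" mentioned above typically follows a map-reduce pattern: the data set is split into chunks, each chunk is processed independently, and the partial results are merged. A minimal single-process sketch (illustrative only; systems such as Hadoop or Spark distribute these same steps across many servers):

```python
from collections import Counter
from functools import reduce

# Each string stands in for one chunk of a much larger data set.
docs = ["big data big value", "data velocity", "big volume"]

def map_step(doc):
    # Map: compute a partial word count for one chunk, independently of the others.
    return Counter(doc.split())

def reduce_step(a, b):
    # Reduce: merge partial counts from independent chunks.
    return a + b

counts = reduce(reduce_step, map(map_step, docs))
print(counts["big"])  # 3
```

Because the map step has no shared state, the chunks can be processed on separate machines, which is what lets such systems scale past the limits of a single relational database or desktop package.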
