The idea of what big data emphasizes over random sampling of small data

What big data emphasizes over the random sampling of small data: all of the data.

Big data refers to information so large in volume that it cannot be captured, managed, and processed by mainstream software tools within a reasonable amount of time, then organized to help businesses make better decisions.

In The Age of Big Data, Viktor Mayer-Schönberger and Kenneth Cukier argue that big data means using all the data rather than shortcuts such as random analysis (sampling). IBM proposed the 5V characteristics of big data: Volume, Velocity, Variety, Value, and Veracity.
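The contrast between sampling and using all the data can be illustrated with a small sketch. This is a hypothetical example, not from the book: it estimates a population mean from a random sample (the "small data" shortcut) and then computes the same statistic over every record (the "all data" approach).

```python
import random
import statistics

# Hypothetical illustration: a synthetic population of one million records.
random.seed(42)
population = [random.gauss(100, 15) for _ in range(1_000_000)]

# Small-data approach: estimate the mean from a random sample of 1,000.
sample = random.sample(population, 1_000)
sample_mean = statistics.fmean(sample)

# Big-data approach: compute the exact mean over every record.
full_mean = statistics.fmean(population)

print(f"sample estimate: {sample_mean:.2f}")
print(f"full-data value: {full_mean:.2f}")
```

The sample estimate is close but carries sampling error; computing over all the data removes that error at the cost of processing every record, which is exactly the trade-off big data tooling is built to make affordable.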

The research firm Gartner gives this definition: "big data" is massive, high-growth, and diverse information assets that require new processing models in order to deliver stronger decision-making power, insight discovery, and process optimization.

The McKinsey Global Institute defines big data as a collection of data so large that it greatly exceeds the ability of traditional database software tools to acquire, store, manage, and analyze it, characterized by massive data size, rapid data flow, diverse data types, and low value density.