When was big data introduced?

It was proposed in 1980 by Alvin Toffler.

In 1980, the famous American futurist Alvin Toffler first put forward the concept of big data in his book The Third Wave, extolling it as "the magnificent music of the third wave." Today, big data is widely used by government decision-making departments, industry enterprises, and research institutions, where it creates real value.

Introduction:

The McKinsey Global Institute defines big data as a data collection so large that it far exceeds the acquisition, storage, management, and analysis capabilities of traditional database software tools, and that is characterized by four major features: massive data scale, rapid data flow, diverse data types, and low value density.

In short, big data is any collection of data that exhibits these four characteristics.

The strategic significance of big data technology lies not in possessing huge amounts of data, but in the specialized processing of the meaning those data contain. In other words, if big data is compared to an industry, the key to that industry's profitability is improving the "processing capability" of data and realizing its "added value" through that processing.
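To make the ideas of "low value density" and value creation through "processing" concrete, below is a minimal, hypothetical Python sketch (not from the source): it simulates a stream of 100,000 event records in which only about 1% carry a useful signal, and extracts value by filtering and aggregating those rare records. All names, fields, and numbers are illustrative assumptions.

```python
import random

def event_stream(n):
    """Simulate a low-value-density stream: ~1% of records are
    'purchase' events (the valuable signal); the rest are 'view's."""
    for i in range(n):
        kind = "purchase" if random.random() < 0.01 else "view"
        yield {"id": i, "kind": kind, "amount": round(random.uniform(5, 500), 2)}

def process(stream):
    """The 'processing' step: filter out the rare valuable records
    and aggregate them into a usable result."""
    count, total = 0, 0.0
    for record in stream:
        if record["kind"] == "purchase":
            count += 1
            total += record["amount"]
    return count, total

if __name__ == "__main__":
    purchases, revenue = process(event_stream(100_000))
    print(f"{purchases} purchases found in 100,000 events; revenue ~ {revenue:.2f}")
```

The raw stream by itself is mostly noise; only after the filtering and aggregation step does it yield a figure a decision-maker can act on, which is the sense in which processing, not possession, creates the value.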