Data economy
The world has entered the era of the data economy. Data provides the basic "nutrients" for artificial intelligence, and artificial intelligence helps people extract meaningful information from data and use it to inform their behavior and decisions. This was very evident at the 2021 Amazon Cloud Technology conference: at that event, participants discussed what value and services data can provide, and enterprises of all kinds were doing their utmost to make full use of their own data.
The rising status of the chief data officer and chief analytics officer within enterprises also proves this point. The chief data officer is responsible for overseeing a range of data-related functions to ensure that the organization gets the most out of its most valuable assets. The role's responsibilities include improving data quality, data governance and master data management, as well as formulating information strategy, data science and business analytics.
No-code/low-code platform
Most enterprises are aware of the importance of data and artificial intelligence. However, when they try to "transform" into data-driven enterprises, they may face many problems; for example, it can take nearly eight months to integrate an artificial intelligence model into a commercial application. No-code/low-code platforms came into being to help more people, including non-professionals such as "citizen developers", meet the challenges brought by data and artificial intelligence.
Citizen developers are not professional programmers but ordinary company employees who can build new business applications inside the company for other employees to use. In the future, almost anyone with even a little technical knowledge will be able to develop software, and no-code/low-code tools can actively turn ordinary business users into platform developers.
Edge artificial intelligence
5G, artificial intelligence and network security need to work together to achieve wider penetration. Data from the IoT endpoints of factories and self-driving cars will trigger a data tsunami.
Edge artificial intelligence and federated learning are trying to meet these challenges by training models on local, decentralized data sets without sharing the data or invading privacy. With the rise of extended detection and response (XDR), security information and event management (SIEM), security orchestration, automation and response (SOAR), and intelligent operations (AIOps) platforms, security will play a vital role in how applications and data are distributed and handled.
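To make the federated learning idea concrete, here is a minimal sketch of federated averaging on a toy linear model. The client data and training parameters are entirely hypothetical; the point is only that raw data stays on each client and only model weights are shared.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train a simple linear model on one client's private data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_average(global_w, clients):
    """One federated round: each client trains locally, only weights are shared."""
    local_weights = [local_update(global_w, X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    # Weighted average of client models, proportional to local data size
    return np.average(local_weights, axis=0, weights=sizes)

# Hypothetical edge devices: the raw data never leaves each client
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(4)]
w = np.zeros(3)
for _ in range(10):
    w = federated_average(w, clients)
print("global weights after 10 rounds:", w)
```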
Hyperautomation
Hyperautomation is both a mindset and a collection of technologies. As a mindset, it holds that any business process in an organization that can be automated should be automated; as a collection of technologies, it combines robotic process automation (RPA), artificial intelligence, machine learning and other innovations to help organizations improve operational efficiency and save time.
Hyperautomation enables accelerated growth and business flexibility by quickly identifying, vetting and automatically executing as many processes as possible. Gartner's research shows that the best-performing hyperautomation teams focus on three key priorities: improving work quality, accelerating business processes and enhancing decision agility.
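As a rough illustration of how RPA-style actions and machine learning can combine, the following sketch routes incoming support tickets with a small text classifier and then triggers an automated handler. The ticket categories, training phrases and handler functions are made up for this example.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: ticket text -> business process category
tickets = ["reset my password", "invoice payment overdue", "laptop will not boot",
           "forgot login credentials", "billing amount is wrong", "screen is broken"]
labels = ["access", "finance", "hardware", "access", "finance", "hardware"]

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(tickets, labels)

# RPA-style handlers: each one automates a repetitive business process
handlers = {
    "access": lambda t: f"auto-reset the account referenced in: {t!r}",
    "finance": lambda t: f"route {t!r} to the accounts-payable bot",
    "hardware": lambda t: f"open a repair order for: {t!r}",
}

def automate(ticket: str) -> str:
    """Identify the process, then execute the matching automation."""
    category = model.predict([ticket])[0]
    return handlers[category](ticket)

print(automate("cannot remember my password"))
print(automate("the invoice total looks incorrect"))
```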
Data fabric
The data fabric is the next generation of data management, integrating data from multiple sources such as data warehouses, data lakes, lakehouses and data marts. A data lake is a repository of raw data in various formats. The lakehouse is a newer architecture paradigm in data management that combines the best characteristics of the data warehouse and the data lake: data analysts and data scientists can work on data in the same data store, which also makes the company's data governance more convenient. A data mart is a data cube that serves the needs of a specific department or group of users, is stored multidimensionally, and is built for decision analysis.
A data fabric not only stores data more durably, but also uses artificial intelligence to enable in-place, self-service analysis, classification and management of data. As a flexible, resilient way to integrate data across platforms and business users, a data fabric can simplify an enterprise's data integration infrastructure and create an extensible architecture, reducing the problems that ever-harder integration causes for most data and analytics teams.
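A very small sketch of the integration problem a data fabric addresses: joining extracts from a warehouse table, a data lake and a departmental mart into one logical view, plus a toy governance step. All tables, columns and values below are invented for illustration.

```python
import pandas as pd

# Hypothetical extracts from three sources a data fabric would span
warehouse_orders = pd.DataFrame({
    "order_id": [1, 2, 3],
    "customer_id": ["c1", "c2", "c1"],
    "amount": [120.0, 80.5, 42.0],
})
lake_clicks = pd.DataFrame({          # raw events, as they might sit in a data lake
    "customer_id": ["c1", "c2", "c2"],
    "page": ["home", "pricing", "docs"],
})
mart_regions = pd.DataFrame({         # curated slice from a sales data mart
    "order_id": [1, 2, 3],
    "region": ["EMEA", "APAC", "EMEA"],
})

# The fabric exposes one logical view; here we materialise it with ordinary joins
unified = (warehouse_orders
           .merge(mart_regions, on="order_id", how="left")
           .merge(lake_clicks, on="customer_id", how="left"))

# Toy "self-service" governance step: flag columns that look like identifiers
flagged = [c for c in unified.columns if c.endswith("_id")]
print(unified.head())
print("columns flagged for governance review:", flagged)
```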
Explainable artificial intelligence
DeepMind recently released a new very large language model called Gopher. Gopher has 280 billion parameters, more than the 175 billion of GPT-3, released earlier by OpenAI, but fewer than the 530 billion of the Megatron-Turing model released by NVIDIA and Microsoft. The results show that Megatron-Turing has achieved unprecedented accuracy on a range of natural language tasks, including text prediction, reading comprehension, common-sense reasoning, natural language inference and word sense disambiguation.
However, artificial intelligence still faces challenges in overcoming bias, protecting privacy and gaining trust, which has led to the rise of explainable artificial intelligence (XAI). XAI is a new branch of artificial intelligence that explains the logic behind each decision an AI system makes. XAI can improve the performance of AI models, because its explanations help uncover problems in data and feature behavior; it also supports better decisions around deployment, because its explanations give intermediaries the additional information they need to act wisely and decisively.
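One common XAI technique is permutation feature importance, which measures how much a model's accuracy drops when each input feature is shuffled. The sketch below applies it to a synthetic dataset and a generic classifier; it is illustrative only and not tied to any model mentioned above.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a real decision problem
X, y = make_classification(n_samples=500, n_features=6, n_informative=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Permutation importance: how much does shuffling each feature hurt held-out accuracy?
result = permutation_importance(model, X_test, y_test, n_repeats=20, random_state=0)
for i, score in enumerate(result.importances_mean):
    print(f"feature_{i}: importance {score:.3f}")
```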