Data to transform and democratize


24 May 2022 Tuesday 12:21

To say in the middle of 2022 that we are in the era of big data is to ignore a reality that has been with us for more than half a century. One question is already essential to the growth of any company: what do we do with the data? The answer should not be left solely in the hands of corporations: how data works, why it is collected and what it is used for should also be public knowledge. The idea is not for all of us to become experts in data management, nor for companies to hand over their expertise, but to accept and be fully aware that information is a common good, moving away from the dystopian, Orwellian view that this practice amounts to "spying on us".

Phrases such as data being "the oil of the 21st century" do not help us understand it as a social benefit. There is a false notion that data management commodifies privacy, when in reality it helps us advance as a society and become aware of our needs, our progress and our discoveries. Until big data management emerged, society could only know its own reality through the knowledge of a small group of experts. It is not that those experts have lost relevance; rather, their sources, and therefore the realities they can describe, have expanded as society participates through its own generation of data.

Is this change of roles positive at all? It will be to the extent that this "management" of data is carried out with protocols, tools and methodologies that keep the handling of information within appropriate purposes and the required legal margins. It is therefore not a matter of "interpreting reality" but of understanding it, of reading it without bias. This requires good filtering and cleansing, which respects not only the degree of privacy chosen by whoever transfers their data, but also ensures that the information obtained is fully useful for the purpose for which it is required, without leading to confusion.

The latter is vitally important at a point in history when we can capture data from thousands of sources every second. Just 20 years ago it was estimated that humanity had generated five exabytes (an exabyte is a million terabytes) of information in its entire history up to that point. Just eight years later, in 2011, 1,800 exabytes were generated, and the latest available figures (2021) put production at 79 zettabytes (a zettabyte is 1,000 million terabytes), with a forecast of reaching 180 zettabytes in 2025. How have we reached these volumes in such a short time?
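A quick back-of-the-envelope check in Python, using only the round figures quoted above, shows the scale of the jump:

```python
# Growth-rate check for the figures quoted in the article (round estimates,
# not a new dataset): 5 EB up to ~2003, 1.8 ZB in 2011, 79 ZB in 2021.
exabytes_to_2003 = 5
zettabytes_2011 = 1.8
zettabytes_2021 = 79

eb_per_zb = 1_000  # 1 zettabyte = 1,000 exabytes

growth_2003_2011 = zettabytes_2011 * eb_per_zb / exabytes_to_2003
growth_2011_2021 = zettabytes_2021 / zettabytes_2011

print(f"2003 -> 2011: x{growth_2003_2011:,.0f}")  # -> x360
print(f"2011 -> 2021: x{growth_2011_2021:.0f}")   # -> x44
```

In other words, a single year of 2011 produced 360 times more data than all of history up to 2003, and a decade later annual production had multiplied again by roughly 44.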

Hyperconnectivity is one reason, but so are the development of machine learning, the Internet of Things and artificial intelligence, and changes in our profiles: we are no longer only consumers of data but also generators of it. Data production is no longer a matter for purely technical roles, and its processing and control are of interest to areas ranging from marketing campaigns to the identification of products and services to launch, or even the strategic redirection of the company.

Before big data permeated every process, new products were launched with minimal market research and the belief that they could solve a specific need. It was the market, with its own mechanisms, that judged their success or failure. Now the consumers of a product are themselves involved in its development, often without realizing their own relevance and participation in the process.

From the famous web-browsing cookies to what we share on social networks, what we like, what series we devour on streaming platforms, the purchases we make online or the events we add to our phone's calendar: all this information that we scatter like breadcrumbs in our day-to-day lives is collected and translated into data that draws precise consumer profiles, targeted with the accuracy of knowing exactly what to sell, to whom, how, why and for what purpose: the five W's applied to business growth.

But demonizing the use of data would be like denying the usefulness of the internet just because certain individuals make illegitimate use of it. Big data management is a powerful methodology and set of tools whose uses range from saving resources, by indicating precisely how much time and material a task requires, to improving global health through the worldwide sharing of medical advances. But to get there, we need to understand the importance of big data and democratize it so that it can be managed transparently.

Data must be treated as what it is: another company asset whose interpretation and management must be clean and precise, shedding light rather than chaos. It is essential to be clear about its usefulness, where we will invest the information it provides and what it will solve for us. Managed this way, big data turns what used to be doubt or uncertainty into information that inspires confidence: because any query can be automatically verified against enormous sources of information, something is no longer true because we think so, but because a multitude of fully verified data confirms it.

Big data is neither "using Pegasus" nor some laboratory or engineering artifact. It means knowing the environment thoroughly from the information it generates, with precise tools and methodologies that reduce margins of error to a minimum and offer a safe and legal environment for handling an asset as delicate as personal data.

Thus, those who manage big data do not access the data of "Antonio, resident in Madrid, teacher, 42 years old and consumer of auteur films". Both ironclad regulations such as the RGPD and the LOPD, and the nature of the data-management tools themselves, with their own filters and blocks, let us know Antonio's profile without knowing it is him, unless he consents to it. And it is this last point that matters: the democratization of data and the proper functioning of big data depend on our knowing and being aware of its benefits and its use. Correctly implemented, it lets us prevent fraud, increase security, improve the customer experience, reduce costs, streamline processes and a long etcetera about which Antonio, or rather the middle-aged teacher living in Madrid who loves auteur cinema, need not worry.
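As a rough illustration of the kind of filter described above, here is a minimal pseudonymization sketch in Python. The field names, the secret key and the scheme are hypothetical, chosen for the example; they are not how any particular tool or the RGPD prescribes it:

```python
import hashlib
import hmac

# Hypothetical secret held only by the data controller; without it, the
# pseudonym cannot be linked back to the person (a sketch, not a full scheme).
PEPPER = b"controller-secret"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed, irreversible hash."""
    return hmac.new(PEPPER, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {
    "name": "Antonio",         # direct identifier: removed before analysis
    "city": "Madrid",          # profile attributes: kept for analysis
    "profession": "teacher",
    "age_band": "40-49",
    "interest": "auteur films",
}

# Analysts see the profile and a stable pseudonym, but not the name.
safe_record = {k: v for k, v in record.items() if k != "name"}
safe_record["subject_id"] = pseudonymize(record["name"])
```

The keyed hash keeps the profile usable (the same person always maps to the same `subject_id`) while the identity stays behind the controller's secret.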

The very nature of big data is its own handicap: massive data lets us immerse ourselves in numerous sources of information that verify and clarify any doubt or need, but we also run the risk that the wave of data becomes a tsunami that ends up engulfing us. The main challenge is therefore our capacity to adapt and absorb: to store, process and make visible each piece of data. It is a digital-transformation process that every company should undertake so as not to be left behind in this "race" for quality data.

An appropriate cloud environment; data architecture and engineering that allow processing and validation; business-analytics tools that present data securely, in a processable and understandable way... This ecosystem is known as data governance and, correctly implemented, it ensures that all information is assimilated and reaches exactly who it has to reach, in the most appropriate state: accurate (reporting faithfully), complete (not depending on other data for its validity), consistent (fitting its context), timely (neither obsolete nor stale), unique (singular and adding value) and valid (useful and reliable).
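Those six criteria map naturally onto automated checks. A minimal sketch in Python of how a governance pipeline might validate a record; the field names, dates and thresholds are invented for the example:

```python
from datetime import date

# Hypothetical customer record; fields and thresholds are invented for the sketch.
record = {
    "customer_id": "C-1042",
    "country": "ES",
    "signup_date": date(2022, 3, 1),
    "monthly_spend_eur": 49.90,
}

def is_accurate(r):
    # Accurate: values must be plausible (e.g. no negative spend).
    return r["monthly_spend_eur"] >= 0

def is_complete(r):
    # Complete: every required field is present.
    return all(r.get(f) is not None for f in ("customer_id", "country", "signup_date"))

def is_consistent(r):
    # Consistent: the record fits its context (EUR spend implies a eurozone country here).
    return r["country"] in {"ES", "FR", "DE", "IT", "PT"}

def is_timely(r, today=date(2022, 5, 24)):
    # Timely: the record is less than a year old.
    return (today - r["signup_date"]).days < 365

passed = all(check(record) for check in (is_accurate, is_complete, is_consistent, is_timely))
```

Uniqueness and validity would be checked against the wider dataset (duplicate detection, referential integrity) rather than record by record, which is why they are omitted from this per-record sketch.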


