“Augmented” is one of the most used, searched, and perhaps misused terms in the world of data & analytics today. Everything is expected to be “augmented”, often before it even exists, which contradicts the very term: by definition, you can only augment what you already have. So people speak of augmented data quality when they do not yet practice data quality, of an augmented data catalog when they do not know what a data catalog is and have not implemented one yet, and so on.
Worse still, “augmented” is sometimes used to describe something that is simply part of what a tool, a piece of software, or a process has always been able to do… even when it was not yet called that.
But what does it mean to “augment” a process? What is the difference between augmentation and automation? What are the actual application possibilities? And where should we begin?
It would perhaps be useful to bring some order, discuss the matter in practical terms, and try to understand why innovation should not remain a slogan but deliver value.
The discussion is similar if we talk about #DataFabric. Do we know what it is?
It is interesting to note that the Data Fabric concept covers and summarizes a dozen of the top data management trends identified by Gartner® (cf. Gartner’s 2020 Planning Guide for Data Management). Here is a short excerpt from the definition of Data Fabric that Gartner gives:
The data fabric takes data from a source to a destination in the most optimal manner; it constantly monitors the data pipelines to suggest and eventually take alternative routes if they are faster or less expensive — just like an autonomous car. Initially the data fabric will be semiautonomous where human intervention is required, but eventually the idea is to make the data fabric fully autonomous. Data consumers will browse a data catalog, not knowing where data resides, pick the data they need and the data fabric will provide the data for them in the most optimal way in terms of performance, effort and cost.
Gartner® – Demystifying the Data Fabric – Published 17 September 2020 – By Analysts Jacob Orup Lund
Did you read that right?
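To make the idea of the “most optimal route” a little more concrete, here is a minimal, purely illustrative Python sketch. The route names, metrics, and weighting are invented for the example and are not part of any Gartner definition or specific product; it only shows how a fabric-like component might keep scoring candidate pipelines on cost and latency and pick whichever is currently cheapest or fastest.

```python
from dataclasses import dataclass

@dataclass
class Route:
    """One hypothetical pipeline route from a source to a destination."""
    name: str
    latency_minutes: float  # expected end-to-end delivery time
    cost_per_run: float     # expected cost of executing the pipeline once

def pick_route(routes: list[Route], cost_weight: float = 0.5) -> Route:
    """Return the route with the lowest weighted score of cost and latency.

    A real data fabric would refresh these metrics continuously and
    re-evaluate the choice, as the quote above suggests.
    """
    def score(r: Route) -> float:
        return cost_weight * r.cost_per_run + (1 - cost_weight) * r.latency_minutes
    return min(routes, key=score)

# Illustrative, made-up metrics for two routes to the same destination.
routes = [
    Route("batch_via_staging", latency_minutes=45.0, cost_per_run=2.0),
    Route("streaming_direct", latency_minutes=5.0, cost_per_run=8.0),
]
print(pick_route(routes).name)  # -> streaming_direct with this weighting
```

The point is not the scoring formula, which is deliberately trivial here, but the division of labour it hints at: the data consumer simply asks for the data, and the routing decision is taken behind the scenes.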
The great value of organizations like Gartner, which analyze the market and serve as a bridge between demand and supply, lies in how they capture trends and offer insights and stimuli to market players. Moreover, they encourage innovation by often going beyond what seems reasonable today but may no longer seem so tomorrow.
Our task is to accept the challenge and innovate, so that things that look to the future actually happen today!
About the Author
Having grown up in the world of business consulting, Renato Valera worked for ten years in internationally renowned companies, gaining significant experience in managing complex projects in the fields of organisation, processes, and IT. Since 2005 he has been working at Irion, a company where he is a partner and where he currently coordinates the "Consulting & Solution" area, a structure of over 80 people through which Enterprise Data Management projects and solutions based on the Irion framework are delivered. His education and professional experience have allowed him to develop strong skills in EDM topics (Data Quality Management, Data Governance, Data Integration, Aggregation and Reporting, Business Intelligence...) and a deep knowledge of the specific business needs of the banking & financial services market.
Download the Whitepaper
Find out more! Download the whitepaper by Michele Iurillo and Mauro Tuvo:
What is Adaptive Data Governance?
Topics include:
RegTech for Surveillance
RegTech for Regulatory Reporting
RegTech not only Finance
and much more!