Why Modern Data Governance Needs a Data Doctrine

  • June 26, 2023

Data has become ubiquitous in our daily lives, and the technologies it enables have changed how we interact, conduct business and organize our society. It's projected that by 2025, more than 150 zettabytes of big data will need analysis. As data grows to unprecedented volumes and becomes more critical to businesses, we also need to reinvent how it's managed. Enterprises need a comprehensive data governance framework that spans security, usability and accountability. It should incorporate not only the processes being governed, but also the people using the data and the systems making it possible. It's critical to be able to understand the state of your data at any given point, with codified, robust guidelines on every aspect of data management.

The case for data doctrine

A data doctrine is a set of governance principles that guide how data is architected, created, acquired, distributed, processed and used throughout its entire lifecycle. These principles can then be applied to standardization, harmonization, ownership, classification, responsibility, access, integrity and retention of enterprise data. As it takes a village to raise a child, it takes an entire organization to secure data. Organizations should identify communities that understand the data, where it comes from, where and how it gets moved, who is using it and for what. With a data doctrine and a core understanding of the need for a data community established at the onset, organizations can focus on additional constructs to further the position of data governance.

When sensitive data is at stake, compliance with regulations is critical. Ideally, a data doctrine should make compliance with new regulations and standards swift and efficient. For instance, GDPR and CCPA require that a business provide all personal data held on a specific user within 30 or 45 days, respectively. A business with scattered or haphazardly managed data runs the risk of failing to comply with such access requests, or with the right-to-be-forgotten clause, and incurring expensive fines. Data collection and removal procedures must be applied strictly to comply with existing personal data legislation, and future regulations around the world will increasingly put users in complete control of their data.
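
A minimal sketch of what those two obligations look like in code, assuming hypothetical in-memory stores (`crm`, `analytics`) standing in for an enterprise's real systems:

```python
# Hypothetical stores standing in for real systems (CRM, analytics, backups).
STORES = {
    "crm": {"u42": {"name": "Ada", "email": "ada@example.com"}},
    "analytics": {"u42": {"page_views": 317}},
}

def export_personal_data(user_id):
    """Gather a user's data from every registered store (GDPR/CCPA access request)."""
    return {name: store[user_id] for name, store in STORES.items() if user_id in store}

def erase_personal_data(user_id):
    """Delete a user's data everywhere (right to be forgotten); return stores touched."""
    touched = []
    for name, store in STORES.items():
        if store.pop(user_id, None) is not None:
            touched.append(name)
    return touched
```

The point of the doctrine is that the `STORES` registry is complete: a store that governance doesn't know about is a store the erasure never reaches.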

Protecting the data that is collected, stored and shared via a stable, universal platform also reduces the risks associated with data loss and degradation. Maintaining the quality of data throughout every stage of its lifecycle while keeping it encrypted is critical. This is where technologies like homomorphic encryption and attribute-based encryption (ABE) can play a key role. With attribute-based encryption, policies and attributes move from the application layer down to the level of the data itself, so wherever the data is shifted in the future, it can be stored in encrypted form with its access policy attached.
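
A conceptual sketch of that idea, with the policy traveling inside the data envelope. This is not real ABE: in genuine attribute-based encryption the policy is enforced mathematically by the ciphertext itself, and the keystream below is a toy placeholder, not real cryptography.

```python
import hashlib

def _keystream(key, n):
    # Toy keystream (NOT real cryptography) so the sketch is self-contained.
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt_with_policy(plaintext, key, policy):
    """Bind an attribute policy to the data itself, ABE-style: the envelope
    travels with the ciphertext wherever the data is stored."""
    ks = _keystream(key, len(plaintext))
    return {"policy": policy, "ciphertext": bytes(a ^ b for a, b in zip(plaintext, ks))}

def decrypt_if_authorized(envelope, key, attributes):
    """Release plaintext only if the requester's attributes satisfy the policy."""
    if not envelope["policy"].issubset(attributes):
        return None
    ks = _keystream(key, len(envelope["ciphertext"]))
    return bytes(a ^ b for a, b in zip(envelope["ciphertext"], ks))
```

The design point is that the policy check no longer lives in the application: it moves with the data into any storage system.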

The growing popularity of data fabric

Another concept that's gained popularity recently is the data fabric. A data fabric manages data through a network-based architecture rather than a series of point-to-point connections. By itself, it's not a way to store or manage data; rather, it's a flexible way of architecting the connections between data stores. The data itself stays where it is, and the fabric provides a central platform for that data to be processed, analyzed and used for a business purpose. That processing creates more data, which requires further storage and management, and this is where the data fabric becomes immensely useful: it ties all the data together. Enterprises can effectively leave the data where it is, and the fabric, the platform that allows that data to connect, gives them the power to analyze it and turn it into meaningful information.
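
The "leave the data where it is" pattern can be sketched as a catalog that routes queries to registered sources instead of copying data into a central store. The source names and adapters here are hypothetical illustrations:

```python
# A fabric leaves data in place and routes queries through one logical layer.
class Catalog:
    def __init__(self):
        self.sources = {}

    def register(self, name, query_fn):
        """Attach a data source to the fabric via its query adapter."""
        self.sources[name] = query_fn

    def query(self, name, **params):
        # Data is fetched from where it lives; nothing is copied into the fabric.
        return self.sources[name](**params)

catalog = Catalog()
catalog.register("warehouse", lambda table: f"rows from {table}")
catalog.register("lake", lambda path: f"objects under {path}")
```

Adding a new store is a `register` call, not a new point-to-point pipeline, which is the architectural shift the fabric represents.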

But these connections between data sources inherently raise the question of trust: how do enterprises trust these sources of data? The answer is that enterprise organizations need methods and processes for evaluating the integrity of data. For a data fabric structure to work, the entirety of the data needs to sit within the authority of a system of governance and a data doctrine. Otherwise, trust is lost.
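
One minimal example of such a process, assuming a simple checksum registry maintained by the governance layer (the function names are illustrative):

```python
import hashlib

# Sketch of an integrity registry: governance records a checksum when data
# enters the fabric and verifies it before the data is trusted downstream.
registry = {}

def record(dataset_id, payload):
    """Register a dataset's fingerprint at the point of ingestion."""
    registry[dataset_id] = hashlib.sha256(payload).hexdigest()

def verify(dataset_id, payload):
    """Check that the data a consumer received matches what was ingested."""
    return registry.get(dataset_id) == hashlib.sha256(payload).hexdigest()
```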

Learn more

We’ve taken a vital step in understanding how to bridge the elements of trust and security in a disparate data world. In my next post, I will dive into the applicability of a data doctrine and its inclusion in interoperable data systems.

I'd love to learn more about how you approach data security at your organization. Get in touch and explore our end-to-end cybersecurity services.

Oliver Forbes

Oliver is the Data Security and Network Security Product Leader at NTT DATA. He has more than 15 years' experience delivering cybersecurity solutions by stitching together unique product narratives that solve the biggest challenges. As a cybersecurity thought leader, he's developed cybersecurity strategies and architectures that help companies across multiple industry sectors bolster their most fundamental defenses. He also advocates for an empowered, invested cyber community, speaking on the global stage for RSA and the Cloud Security Alliance and contributing blog articles, whitepapers and other cybersecurity artifacts.
