DCAM Framework – Introduction

Introduction

The Data Management Capability Assessment Model (DCAM®) defines the scope of capabilities required to establish, enable and sustain a mature data management (DM) discipline. It addresses the strategies, organizational structures, technology and operational best practices needed to successfully drive DM. It addresses the tenets of DM based on an understanding of business value combined with the reality of operational implementation.

Overview

The concept of data as a foundational aspect of business operations has arrived. Data is now widely understood as one of the core inputs to the full range of business and organizational processes. An organization that is effective in its use of data is one that implements and manages a data control environment. The data control environment yields a wide range of bottom-line benefits to the organization, including reduced operational costs, automated manual processes, consolidated redundant systems, minimized reconciliation, and enhanced business opportunities. An organization implements a data control environment to ensure trust and confidence in the data it relies on for business processing and decision-making, minimizing the need for manual reconciliation and reliance on data transformation processes.

The data control environment concept ensures the data is precisely defined, described using metadata, aligned with meaning, and managed across the full data lifecycle. The key to first establishing a data control environment, however, is the achievement of unambiguous shared meaning across the organization along with the governance required to ensure precise definitions. Data must be consistently defined by the real thing it represents, such as products, clients, customers, legal entities, transactions, events, and much more. All other processes are built upon this foundation.

The opposite of a data control environment is a fragmented data environment. The fragmentation results in application development that produces ad hoc data models. The fragmentation exacerbates the problem of common terms that have different meanings, common meanings that use different terms, and vague definitions that don't capture critical nuances. For many organizations, this challenge can be debilitating because there are thousands of data elements, delivered by hundreds of internal and external sources, all stored in dozens of unconnected databases. This fragmentation results in a continual challenge of mapping, cross-referencing, and manual reconciliation. The data control environment is dependent on every data attribute being understood, at its atomic level, as a fact that is aligned with specific, durable business meaning without duplication or ambiguity. Managing data as meaning is the key to the alignment of data repositories, harmonization of business glossaries, and ensuring that application data dictionaries are comparable.

Achieving alignment of business meaning, including the process of how terms are created and maintained, can be a daunting task. It is not uncommon to experience resistance from internal business and technology users, particularly when there are multiple existing systems linked to critical business applications. The best strategy for reconciliation in a fragmented environment is to harmonize based on legal, contractual, or business meanings rather than trying to get every system to adopt the same naming convention. Nomenclature reflects the structure of data, and unraveling data structures and data models is expensive and unnecessary. It is better to focus on precisely defining business concepts, documenting transformation processes, and capturing real-world data relationships. Once established, existing systems, glossaries, dictionaries, repositories, etc. can be cross-referenced to the common meaning.
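The cross-referencing strategy described above can be sketched as a simple governed mapping: each system keeps its own nomenclature, and only the link to the shared business meaning is managed. All system names, local terms, and the definition below are illustrative assumptions, not part of DCAM.

```python
# Hypothetical sketch: resolve system-local terms to one shared business
# meaning instead of forcing every system onto a common naming convention.
# System names, terms, and the definition are invented for illustration.

CANONICAL_CONCEPTS = {
    "legal-entity": "An organizational party recognized as having legal standing.",
}

# Each source system keeps its own nomenclature; only this mapping is governed.
CROSS_REFERENCE = {
    ("crm", "account_holder"): "legal-entity",
    ("trading", "counterparty"): "legal-entity",
    ("billing", "customer_org"): "legal-entity",
}

def shared_meaning(system: str, local_term: str) -> str:
    """Resolve a system-local term to its governed business definition."""
    concept = CROSS_REFERENCE[(system, local_term)]
    return CANONICAL_CONCEPTS[concept]
```

Here three differently named fields all resolve to the same governed definition, which is the point: the systems are never renamed, only cross-referenced.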

Data as meaning must be managed along with its defining metadata to ensure consistency and comparability across the organization. Data meaning and metadata management must be understood as the core of your content infrastructure and the baseline for process automation, application integration, and alignment across linked processes. Some common types of metadata include business, operational, technical, descriptive, structural, and administrative.
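A minimal sketch of a metadata record organized around the metadata types just listed; the field names and example values are assumptions for illustration, not a DCAM schema.

```python
from dataclasses import dataclass, field

# Illustrative record covering the common metadata types named above
# (business, operational, technical, descriptive, structural, administrative).
# Field names and values are assumptions, not a DCAM-defined schema.

@dataclass
class MetadataRecord:
    element: str
    business: dict = field(default_factory=dict)        # glossary definition, owner
    operational: dict = field(default_factory=dict)     # lineage, refresh schedule
    technical: dict = field(default_factory=dict)       # type, length, source system
    descriptive: dict = field(default_factory=dict)     # tags, keywords
    structural: dict = field(default_factory=dict)      # schema, parent data domain
    administrative: dict = field(default_factory=dict)  # steward, retention rules

lei = MetadataRecord(
    element="legal_entity_identifier",
    business={"definition": "ISO 17442 identifier for a legal entity"},
    technical={"type": "char(20)"},
)
```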

The implementation and management of a data control environment are governed by data policies, standards, processes, and procedures. These are the essential mechanisms for establishing a sustainable DM initiative and for ensuring compliance with a data control environment in the face of organizational complexity. Managing meaning is the key to effective DM. Meaning is achieved through the adoption of semantic standards. Standards are governed by policy. The policy is established by executive management, supported by data owners and enforced by Internal Audit.

Challenges in Creating a Data Control Environment

Diagram 0.1: Data Control Environment Challenges

Achieving a data control environment requires the organization to overcome the following challenges:

  • Understand the existing legacy data environments, including an inventory of data, point-to-point links, inconsistent definitions, etc.
  • Simplify, organize, and categorize the disparate environment into defined data domains, with clearly identified data elements and documented data flows.
  • Align data elements to unambiguous shared meaning across the organization through the implementation of controls, policy, and governance.
  • Measure and track data to ensure quality and consistency with minimal reconciliation.
  • Align technology to ensure the principles and best practices that have been established are enabled across the organization's technology infrastructure.

It is this journey that must be taken to arrive at a data control environment needed to ensure the highest quality of data is delivered to critical processes throughout the organization.

Many organizations have made the mistake of trying to solve the DM problem by starting with technology. The DCAM, as a best practice, advocates that the starting point is with the business process that defines the requirements for data. Then the business process and data can be automated by technology.

Data Management Operating Levels

The DCAM Framework defines the components and capabilities that are required to achieve a data control environment in the organization. However, the Framework is not prescriptive as to how the capabilities are executed at the various operating levels of the organization. An organization will need to customize its operating model to account for its size, complexity, geography, and culture.

Diagram 0.2: Data Management Operating Levels

When using the DCAM Framework to assess the organization’s capabilities it is most informative to evaluate each level as defined in the DM target operating model of the organization.

Data Management Stakeholders

The DM Stakeholder Accountability Matrix illustrated below is a construct for identifying the stakeholder types across the data ecosystem, their high-level roles, and the relationships between them. Understanding the stakeholder types is an important foundation for applying the DCAM Framework in a DM target operating model for the organization. It also defines the audience that is targeted for using the DCAM as the capability assessment tool for the organization. Within the Matrix, data architecture is the pivotal bridge in the relationship between the business and technology stakeholders.

Diagram 0.3: Data Management Stakeholder Accountability Matrix

Emerging Issues

The practice of DM continues to evolve. New technologies and issues introduced into organizations that impact data or DM create additional requirements for the DM initiative. While the CDO may not be directly accountable for executing these technologies or addressing the issues, they must be actively involved in guiding the organization.

Our mission is to continue to evolve DCAM to stay current and support DM professionals to be successful in their roles within their organizations. Based on feedback from our membership, two topics—the rise of Machine Learning (ML) and Artificial Intelligence (AI) in analytics and the issue of data ethics—have emerged as important considerations for DM decision making. The member workgroup supporting the new version of DCAM included key subject matter experts on these topics. With their input, we have included considerations for these topics into the existing sub-capabilities in DCAM and introduced new capabilities and sub-capabilities where required.

Analytics & Machine Learning

Many organizations are at various stages of developing and implementing more advanced analytics and machine-learning applications within their operations. Although these applications do not fundamentally alter the required structures of good DM, they do necessitate a variety of changes to quite a few of its elements. These changes are set out under the relevant components and capabilities. However, it is useful to provide some background on the analytics and machine-learning functions in an organization so that their requirements will have context.

In broad terms, two main types of data-analytics activities will take place in an organization:

  1. Descriptive / Inferential Statistics (D/IS) – This is the summary of key metrics of an organization. Key metrics include income, throughput, sales, and expenses. D/IS may also be estimates of what these metrics are likely to be if they are not directly observable. These summaries may be presented numerically or graphically, such as in the form of dashboards, and may also be generated periodically or on a live basis. This type of analysis typically forms part of the business intelligence (BI) or management-intelligence (MI) function of an organization.
  2. Machine Learning / Artificial Intelligence – As opposed to descriptive and inferential statistics, which aim to present the salient features of a dataset through a direct summary of the data, this type of analysis employs algorithms to discern useful and valuable patterns from the data with little, or no, human input beyond setting up the learning algorithm. These patterns can be derived from both structured and unstructured data. The patterns uncovered are often more valuable than those discerned by humans for several reasons, including:
    1. These types of algorithms can consider many more data dimensions and their interactions than humans. They can also consider many more examples of a phenomenon simultaneously than a human can; and
    2. Machines do not suffer from many of the cognitive biases that result in poor human pattern-recognition outcomes.

A consequence of the complexity of the patterns recognized by these types of algorithms is that, despite the accuracy of the recognized patterns, the underlying pattern is often too complex to be understood by humans and functions effectively as a black box. The lack of understanding may not always be acceptable, depending on the application of the algorithm.

Both D/IS and ML/AI may be used as the basis for other types of analysis and forecasting activities.
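The distinction between the two classes of analysis can be illustrated with a toy dataset (all figures are invented): a direct statistical summary of the data, versus a pattern derived from the data itself, here reduced to the simplest possible case of a least-squares trend line.

```python
import statistics

# Toy monthly sales figures; purely illustrative assumptions.
monthly_sales = [120.0, 135.0, 128.0, 150.0, 149.0, 161.0]

# 1. Descriptive/inferential statistics: a direct summary of the data.
summary = {
    "mean": statistics.mean(monthly_sales),
    "stdev": statistics.stdev(monthly_sales),
}

# 2. "Learning" in the loosest sense: the pattern (a trend slope) is derived
#    from the data by least squares rather than stated by a human analyst.
#    Real ML/AI involves far richer algorithms; this only marks the contrast.
n = len(monthly_sales)
xs = range(n)
x_bar = statistics.mean(xs)
y_bar = statistics.mean(monthly_sales)
slope = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, monthly_sales)) \
        / sum((x - x_bar) ** 2 for x in xs)
```

The summary reports what the data is; the fitted slope is a pattern extracted from it, which is the essential difference between the two activities described above.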

Regarding the data flow that takes place for analytical activities, both classes of analytical activities go through approximately the same steps:

    1. Raw input data is sourced from reference databases into a single processing workspace;
    2. The raw input data is manipulated and transformed into a form that is necessary for the analytical activities;
    3. The analytical activities take place:
      1. In the case of D/IS, this would be the generation of summary metrics;
      2. In the case of ML/AI, this would be the training of algorithms to recognize the required patterns;
    4. The output of the analysis stage is made available to authorized users in the desired output format:
      1. In the case of D/IS, this would be the output of summary metrics in the form of a numerical or graphical report or dashboard;
      2. In the case of ML/AI, this would be the publication of trained and tested algorithms to those users who are authorized to use them or the publication of the outputs of the trained algorithms as they are applied to current data.
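The four-step data flow above can be sketched as a minimal pipeline; the function names and toy data are assumptions for illustration, with a D/IS-style summary metric standing in for the analysis step.

```python
# Minimal sketch of the four analytics data-flow steps described above.
# Function names and the toy data are illustrative assumptions.

def source(raw_store: dict) -> list:
    """Step 1: pull raw input data into a single processing workspace."""
    return list(raw_store["reference_db"])

def transform(rows: list) -> list:
    """Step 2: reshape raw input into the form the analysis needs."""
    return [float(r) for r in rows if r is not None]

def analyze(values: list) -> float:
    """Step 3: the analytical activity (here, a D/IS-style summary metric)."""
    return sum(values) / len(values)

def publish(metric: float) -> dict:
    """Step 4: expose the output to authorized users in the desired format."""
    return {"metric": round(metric, 2), "format": "dashboard"}

report = publish(analyze(transform(source({"reference_db": [1, None, "2", 3]}))))
```

For an ML/AI flow, step 3 would instead train an algorithm and step 4 would publish the trained model or its outputs, but the shape of the pipeline is the same.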

New technologies such as ML/AI introduce unique considerations for DM. That said, the successful implementation of all the DM capabilities is your best approach to addressing these new requirements. DCAM V2 now incorporates these considerations across the spectrum of capabilities and sub-capabilities as a guide to empowering advanced analytics.

Data Ethics

ML/AI and other technological innovations have increased the sheer volume of data available to organizations and created a greater distance between human judgment and automated decision-making. One of the DM considerations raised by these technological innovations is data ethics. In our discussions with DM experts around the world, the topic of data ethics has emerged as a primary concern, especially in light of the record-breaking fines imposed on organizations that trade in personal data and fail to protect user privacy. The concern is compounded by the rapid pace at which companies are pursuing products and services that leverage the internet of things, including facial recognition security systems and self-driving cars. DCAM V2 addresses this concern proactively to help DM professionals instill an organizational culture aligned with a Code of Ethics that articulates their organizations' values and priorities and provides guidelines for operationalization.

Automated decision making—using algorithms to classify data and make determinations based on these classifications—has the potential to revolutionize DM processes that were previously tedious, expensive, and impossible for humans to conduct. For example, convolutional neural networks—the most commonly applied machine learning technique used in medical imaging—are employed in the detection of tumors in diagnostic radiology. However, the reliability and accuracy of these algorithms depend on many factors, including the quality and volume of the training data and how well the data corpus represents what it is supposed to represent. The best algorithms work because they are trained on accurate yet generalizable data so that new information is not forced into inappropriate classifications.

What this means for data ethics is that the complexity of the algorithms employed by an organization must be supported by heavily annotated training data and a broadly representative, very large data corpus. Furthermore, those responsible for data ethics within the organization must understand how the algorithms classify data, and how this classification may have unintended consequences. In some organizations, CDOs are being assigned accountability for data ethics, while others appoint a Chief Ethics Officer. Regardless of the approach, there are responsibilities and requirements for data and DM inherent in the operations of organizations that prioritize data ethics.

The assumptions that programmers make when contemplating the realm of possible outcomes shape the ability of algorithms to contend with data that varies from their training data. Though advances in machine learning have shone a spotlight on data ethics, this issue is not confined to the realm of artificial intelligence. Efforts to uphold privacy commitments to the people represented by the data are often undermined by simple human error or outdated systems protocols. While compliance with legal requirements for DM is helpful, it represents a low bar for privacy protection. Even the most well-intentioned data professionals working with the cleanest datasets may inadvertently cause harm, especially if they are not aware of how algorithms replicate societal inequities. In other words, "Good data can create bad outcomes."

Similarly, those responsible for governing data ethics in an organization must be prepared to consider how both automated and human decision making may affect organizational stakeholders. The stakeholders may be known entities or unknown populations who may be affected unintentionally by the organization's operations. 

Incorporating data ethics into the DCAM framework is designed to help organizations establish policies and procedures that increase the likelihood that data-driven decisions with potential for such unintended outcomes will be identified and modified accordingly. Of course, it is difficult to anticipate how data classification and use will affect people in dynamic contexts. An organizational culture that embraces data ethics can go a long way toward minimizing vulnerability to data breaches and offsetting the biases inherent in programming assumptions. Every member of the organization who interacts with data and data systems has the potential to contribute to a culture of data ethics.

DCAM: A Framework for Sustainable Data Management

A complex set of DM capabilities is required to achieve a data control environment. The Data Management Capability Assessment Model is a framework for executing a robust, sustainable DM function. It is also an essential tool for ongoing assessment and benchmarking of an organization's DM capabilities.

Diagram 0.4: DCAM Framework

The DCAM Framework consists of seven core components and one optional component. The first component, Data Management Strategy & Business Case, and the second component, Data Management Program & Funding Model, are foundational to the other five core components.

The next four core components of the DCAM framework – Business & Data Architecture, Data & Technology Architecture, Data Quality Management, and Data Governance – are the execution components.

The final core component is the collaboration activity, the Data Control Environment. It is here that the execution components are put into operation by the data producer to bring a defined set of data into control and make it available to data consumers, either in real time or as of a period end.

The seven core Components include 31 Capabilities and a total of 106 Sub-capabilities. The definition and scope of each component are presented below.

The eighth component, Analytics Management, is optional and is relevant where the scope of the DM program or the Chief Data Officer's responsibilities cover the Analytics functions of the organization. This component has seven capabilities and 30 sub-capabilities.

DCAM: The Scope of the Eight Components

1.0 Data Management Strategy & Business Case

The Data Management Strategy (DMS) & Business Case component is a set of capabilities to define, prioritize, organize, fund, and govern DM and how it is embedded into the operations of the organization in alignment with the objectives and priorities of both the enterprise and operating units. The DM business case is the justification for creating and funding a DM initiative. The business case articulates the major data and data-related issues facing an organization or operational unit and describes the expected outcomes and benefits that can be achieved through the implementation of a successful DM initiative.

  • Establish a DMS function within the Office of Data Management (ODM).
  • Work with DM Program Management Office (PMO) to design and implement sustainable business-as-usual processes and tools for the DMS function.
  • Align the DMS with the business strategy, objectives, and priorities, including prioritization of data based on its criticality to the business.
  • Define the rationale and business case for the management of data as an asset through the organization-wide DM initiative.
  • Ensure the DMS is aligned with the organization-wide Enterprise Data Management Principles.
  • Articulate the DM target and current-state. Then, using DCAM as an assessment tool for gap analysis and prioritized gap closure, create a cohesive execution plan.
  • Define the high-level execution roadmap.
  • Define strategy execution risks and mitigations.
  • Define DM performance metrics.
  • Document the DMS with a compelling presentation of the value of an organization-wide DM initiative.
  • Ensure that the DMS governance is integrated into the Data Governance (DG) structure.

2.0 Data Management Program & Funding Model

The Data Management Program (DMP) & Funding Model component is a set of capabilities to manage the Office of Data Management (ODM). This includes organizational structures, resource requirements, and a full range of Program Management Office (PMO) activities such as program management, stakeholder management, funding management, communications, training, and performance measurement. The DM funding model within the DMP is designed to provide the mechanism to ensure the allocation of sufficient capital needed for implementation of the Program. It also defines and describes the methodologies used to measure both the costs and the organization-wide benefits derived from the DM initiative.

  • Establish a DMP function to implement the Program Management Office (PMO) capabilities within the ODM.
  • Facilitate the design and implementation of sustainable business-as-usual DM processes and tools across the components and their capabilities.
  • Establish roles and responsibilities related to the DM capabilities aligned with an organizational structure and execute in an ODM.
  • Define Funding Model, secure and monitor funding, and institute cost and benefits tracking aligned to the Business Case.
  • Establish the DM execution roadmap with supporting project plans to build upon the high-level Data Management Strategy (DMS) roadmap.
  • Engage each stakeholder across the data ecosystem as appropriate to their roles in resource alignment, funding, communications, training, and skill development.
  • Manage the DM initiative by monitoring and socializing DM performance metrics.
  • Ensure that the DMP governance is integrated into Data Governance (DG).

3.0 Business & Data Architecture

The Business and Data Architecture (DA) component is a set of capabilities to ensure integration between the business process requirements and the execution of the DA function. The business architecture function defines the business process. DA defines data models such as taxonomies and ontologies, as well as data domains, metadata, and business-critical data to execute processes across the data control environment. The DA function ensures the control of data content, that the meaning of data is precise and unambiguous and that the use of data is consistent and transparent.

  • Establish a DA function within the Office of Data Management (ODM).
  • Work with DM Program Management Office (PMO) to design and implement sustainable business-as-usual processes and tools for DA, including the required integration with business architecture.
  • Identify and establish data domains, authoritative sources, and provisioning points.
  • Identify and inventory the data to support the business requirements, including all necessary metadata such as a glossary, dictionary, classification, lineage, etc.
  • Define and assign business definitions linked to the data inventory.
  • Ensure that the DA governance is integrated into Data Governance (DG) and aligned to both business and technology governance activities.

4.0 Data & Technology Architecture

The Data & Technology Architecture (TA) component is a set of capabilities to align the architectural requirements of the business, data, and technology across the organization to support the desired business process outcomes. These outcomes include DM processes and the required technology infrastructure.

  • Work with DM Program Management Office (PMO) to design and implement sustainable business-as-usual processes and tools for the integration of business and technology architecture with data architecture.
  • Ensure DM function alignment with business, data, and technology architecture and strategy.
  • Ensure that the DM governance is aligned to both business and technology governance activities.

5.0 Data Quality Management

The Data Quality Management (DQM) component is a set of capabilities to define data profiling, DQ measurement, defect management, root cause analysis, and data remediation. These capabilities allow the organization to execute processes across the data control environment, ensuring that data is fit for its intended purpose.

  • Establish a DQM function within the Office of Data Management (ODM).
  • Work with data management (DM) Program Management Office (PMO) to design and implement sustainable business-as-usual processes and tools for DQM.
  • Execute DQM processes against business-critical data. DQM processes include profiling & grading, measurement, defect management, root-cause fix, and remediation.
  • Establish DQ metrics and reporting routines.
  • Ensure that the DQM governance is integrated into Data Governance (DG).
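As a rough illustration of the profiling-and-grading capability listed above, the following sketch checks the completeness and validity of one business-critical field. The records, the 20-character alphanumeric validity rule, and the pass threshold are invented assumptions, not DCAM-prescribed rules.

```python
# Hedged sketch of a DQM profiling-and-grading pass over one field.
# Records, the validity rule, and the threshold are illustrative assumptions.

records = [
    {"lei": "5493001KJTIIGC8Y1R12"},
    {"lei": None},                     # completeness defect
    {"lei": "BAD-ID"},                 # validity defect (not 20 alphanumerics)
    {"lei": "549300MLUDYVRQOOXS22"},
]

def profile(field: str, rows: list) -> dict:
    """Profile one field: measure completeness and rule-based validity."""
    present = [r[field] for r in rows if r[field] is not None]
    valid = [v for v in present if len(v) == 20 and v.isalnum()]
    return {
        "completeness": len(present) / len(rows),
        "validity": len(valid) / len(rows),
    }

def grade(metrics: dict, threshold: float = 0.9) -> str:
    """Grade the profile: any dimension below threshold flags a defect."""
    return "pass" if min(metrics.values()) >= threshold else "defect"

metrics = profile("lei", records)
```

In a full DQM cycle, a "defect" grade would then feed defect management, root-cause analysis, and remediation as described above.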

6.0 Data Governance

The Data Governance (DG) component is a set of capabilities to codify the structure, lines of authority, roles & responsibilities, escalation protocol, policy & standards, compliance, and routines to execute processes across the data control environment. The component ensures authoritative decision making at all levels of the organization.

  • Establish a data governance function within the Office of Data Management (ODM).
  • Work with DM Program Management Office (PMO) to design and implement sustainable business-as-usual processes and tools for data governance.
  • Define clear roles, responsibilities, and accountabilities for DM resources, including those mandated by DM policy.
  • Define and operate the data governance structure with clear lines of authority, responsibility for decision making, engaged stakeholders, adequate oversight, issue escalation paths, and tracking of remediation activity.
  • Develop and oversee adherence to comprehensive and achievable DM policies, standards, and procedures, including leading the response to audits.
  • Ensure the data governance function is aligned with other relevant control function policies, procedures, standards, and governance requirements from information security, privacy, technology architecture, etc.

7.0 Data Control Environment Scope

The Data Control Environment (DCE) component is a set of capabilities that together form the fully operational data control environment. Data operations, supply-chain management, cross-control function alignment, and collaborative technology architecture must operate cohesively to ensure the objectives of the DM initiative are realized across the organization.

  • Work with DM Program Management Office (PMO) to design and implement sustainable business-as-usual processes and routines to enable a successful data control environment.
  • Bring together the DM components as a coherent, end-to-end data ecosystem.
  • Follow current DM best practices by routinely reviewing and auditing the capabilities and their processes.
  • Ensure all facets of DM for business-critical data such as data lifecycle, end-to-end data lineage, and data aggregations are fully operational.
  • Ensure DM is aligned with other control function policies, procedures, standards, and governance.

8.0 Analytics Management Scope

The Analytics Management (AM) component is a set of capabilities used to structure and manage the Analytics activities of an organization. The capabilities align AM with DM in support of business priorities. They address the culture, skills, platform, and governance required to enable the organization to obtain business value from analytics.

Scope
  • Define the Analytics strategy.
  • Establish the AM function.
  • Ensure that analytics activities are driven by the Business Strategy and supported by the DM strategy.
  • Ensure clear accountability for the analytics created and for their uses throughout the organization.
  • Work with DM to align Analytics with Data Architecture (DA) and Data Quality Management (DQM).
  • Establish an analytics platform that provides flexibility and controls to meet the needs of the different stakeholder roles in the Analytics operating model.
  • Apply effective governance over the data analysis lifecycle. Governance includes tollgates for model reviews, testing, approvals, documentation, release plans, and regular review processes.
  • Ensure that Analytics follows established guidelines for privacy, data ethics, model bias, and model explainability requirements and constraints.
  • Manage the cultural change and education activities required to support the Analytics strategy.

DCAM Use Cases

DCAM has multiple uses within an organization.

  • As a well-defined control framework
  • As an assessment tool
  • As an industry benchmark

DCAM as a Framework

When an organization adopts the standard DCAM framework, it introduces a consistent way of understanding and describing DM. DCAM is a framework of the capabilities required for a comprehensive DM initiative presented as a best practice paradigm. DCAM helps to accelerate the development of the DM initiative and make it operational. The DCAM Framework:

  • Provides a common and measurable DM framework
  • Establishes common language for DM
  • Translates industry expertise into operational standards
  • Documents DM capability requirements
  • Proposes evidence-based artifacts

DCAM as an Assessment Tool

Using DCAM effectively as an assessment tool requires defining the assessment objectives and strategy, planning the assessment management, and adequately training the participants to establish a base understanding of the DCAM Framework.

The assessment results translate the practice of DM into a quantifiable science. The benefits afforded to an organization by the assessment outcomes include:

  • Baseline measurement of the DM capabilities in the organization compared to an industry standard
  • Quantifiable measurement of the progress the organization has made to operationalize the required DM capabilities
  • Identification of DM capability gaps to inform a prioritized roadmap for future development aligned to the organization’s business requirements for data and DM
  • Focused attention to the funding requirements of the DM initiative
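The gap-identification benefit above can be illustrated with a small sketch that compares hypothetical current-state scores against targets and orders the components by gap size. The component scores are invented for the example; a real assessment would cover all components, capabilities, and sub-capabilities.

```python
# Illustrative gap analysis from a DCAM-style assessment. The scores below
# are invented assumptions; component names follow the DCAM core components.

current = {
    "Data Management Strategy": 3,
    "Data Governance": 2,
    "Data Quality Management": 4,
}
target = {
    "Data Management Strategy": 5,
    "Data Governance": 5,
    "Data Quality Management": 5,
}

# Gap per component, then a prioritized roadmap with the biggest gap first.
gaps = {name: target[name] - score for name, score in current.items()}
roadmap = sorted(gaps, key=gaps.get, reverse=True)
```

The ordered roadmap is the kind of prioritized, business-aligned output the assessment benefits above describe.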

DCAM Scoring Guide

The scoring guide used throughout DCAM is as follows. It is designed to assess the phase of capability attainment; it is not an assessment of the maturity or scope to which an organization has applied its capabilities. By design, the scale has an even number of points to force a conscious decision by the rater and avoid the tendency to select the midpoint of the scoring scale.

SCORE – CATEGORY – DESCRIPTION – CHARACTERISTICS

1 – Not Initiated – Not Performed – Ad hoc activities performed by heroes

2 – Conceptual – Initial Planning Stages – Issues being debated; whiteboard sessions

3 – Developmental – Engagement Underway – Key functional stakeholders identified; workstreams defined; meetings underway; participation growing; policies, roles, and operating procedures being established; project/annual funding

4 – Defined – Defined and Verified – Business users active; LOB management with P&L responsibility engaged; requirements verified; responsibilities defined and assigned; policy and standards exist; routines in place; lineage underway; critical data elements (CDEs) identified; adherence tracked; multi-year/sustainable funding

5 – Achieved – Adopted and Enforced – Executive management sanctioned; proactive business engagement; responsibilities coordinated; policy and standards implemented; lineage verified; data harmonized across repositories; adherence audited; strategic/investment funding

6 – Enhanced – Integrated – Fully embedded in the operational culture of the organization with the goal of continuous improvement

Table 0.5: DCAM Scoring Guide
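The six-point scale in Table 0.5 maps naturally to a simple lookup structure. The sketch below is illustrative only and is not part of DCAM itself:

```python
# The six DCAM score categories from Table 0.5, as a lookup table:
# score -> (category, description).
SCORE_CATEGORIES = {
    1: ("Not Initiated", "Not Performed"),
    2: ("Conceptual", "Initial Planning Stages"),
    3: ("Developmental", "Engagement Underway"),
    4: ("Defined", "Defined and Verified"),
    5: ("Achieved", "Adopted and Enforced"),
    6: ("Enhanced", "Integrated"),
}

def describe_score(score: int) -> str:
    """Return 'Category (Description)' for a 1-6 rating."""
    if score not in SCORE_CATEGORIES:
        # The even-numbered scale has no midpoint by design.
        raise ValueError("DCAM scores range from 1 to 6.")
    category, description = SCORE_CATEGORIES[score]
    return f"{category} ({description})"

print(describe_score(4))  # Defined (Defined and Verified)
```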

DCAM as an Industry Benchmark

The EDM Council conducted a DM industry benchmark study in 2015, 2017, and 2020. The benchmark is based on the capabilities defined in DCAM and thus can be used for comparative analysis by organizations conducting a DCAM assessment.

The industry benchmark is not restricted to organizations that are using DCAM. Input from a broader range of DM industry practitioners affords an enhanced perspective on the state of the DM industry.

DCAM Release Notes Summary

DCAM v2.2 – November 2020

The DCAM v2.2 release is considered a minor release in that it includes an enhancement to the DCAM Framework but does not structurally change the scoring of DCAM.

The objectives of the DCAM v2.2 refresh include the following:

  • Addition of Component 8.0: Analytics Management
    • This additional component is considered an optional component in relation to the seven core components of DCAM.
    • An assessment conducted with all eight Components is now considered a DCAM+ Assessment, and the overall DCAM+ score would be inclusive of the 8.0 Analytics Management score, while the overall DCAM score would continue to be calculated based on the seven core components.
  • Introduction of Core Artifacts required to support each Component.
  • Enhancement to the content to support clarity, usability, and consistency in language, format, and presentation style.

For more information about how Component 8.0: Analytics Management is scored or included in a DCAM Assessment, please see the DCAM+ Scoring Guidelines FAQ.
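The relationship between the overall DCAM and DCAM+ scores described above can be sketched as follows. The component scores and the simple averaging shown here are assumptions for illustration only; the DCAM+ Scoring Guidelines FAQ defines the actual calculation.

```python
# Hypothetical component-level scores on the 1-6 scale.
core_scores = {  # the seven core components
    "1.0 Data Management Strategy & Business Case": 4,
    "2.0 Data Management Program & Funding Model": 3,
    "3.0 Business & Data Architecture": 4,
    "4.0 Data & Technology Architecture": 3,
    "5.0 Data Quality Management": 4,
    "6.0 Data Governance": 5,
    "7.0 Data Control Environment": 3,
}
analytics_score = {"8.0 Analytics Management": 2}  # optional component

def average(scores: dict) -> float:
    """Unweighted mean of component scores (an assumed aggregation)."""
    return sum(scores.values()) / len(scores)

# Overall DCAM score: seven core components only.
dcam_score = average(core_scores)
# Overall DCAM+ score: all eight components, including 8.0.
dcam_plus_score = average({**core_scores, **analytics_score})
```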

DCAM v1.3 Mapping to v2.2 – a model is available that presents the alignment of the Components, Capabilities, and Sub-capabilities in DCAM version 1.3 to version 2.2.

DCAM v2.1 – August 2019

The DCAM v2 release is considered a major release in that it includes structural change. The intent is that structural changes to DCAM will occur no more often than every 24 months, aligning with the two-year cycle for refreshing the industry benchmark data. Between major releases, minor releases may occur no more often than every six months. A minor release could include changes such as language clarification, glossary alignment, enhancements to the Component Introductions, and references to related EDMC best practice materials. Basic grammar and spelling mistakes will be fixed as they are uncovered.

The EDM Council will always support one version back from the current major version (v2.x). In other words, v1.3 will be supported until v3.1 is released.

The objectives of the DCAM v2.1 refresh include the following:

  • Changes to the framework components to better present a logical flow of the capabilities
  • Introduction of new concepts that have requirements for data and DM
  • Enhancement to the content to support clarity, usability, and consistency in language, format and presentation style

Changes to Framework Components

1.0 Data Management Strategy & Business Case

  • Changed to a strategy capability that aggregates the concepts of the other six components into a summary strategy
  • Introduced the concept that a data management strategy comprises a data content strategy, a data use strategy, and a data management deployment strategy
  • Integrated the Business Case with the Data Management Strategy – previously a standalone component paired with the Funding Model

2.0 Data Management Program & Funding Model

  • Integrated the Data Management Program with the Funding Model – previously a standalone component paired with the Business Case

3.0 Business & Data Architecture

  • Clarified that all data management starts with the business process requirements for data; maintained the full Data Architecture scope

4.0 Data & Technology Architecture

  • Clarified the accountability for Technology to meet the business process and data requirements

5.0 Data Quality Management

  • Clarified a single data quality process that supports all data types

6.0 Data Governance

  • Aggregated all governance leveraged by the other components into a single view of governance

7.0 Data Control Environment

  • Clarified that the data control environment is the operational execution that brings together the data management components as a coherent, end-to-end data ecosystem
  • Formalized data risk as a sub-capability

Introduction of New Concepts

Machine Learning and Artificial Intelligence – the introduction of challenges and opportunities using innovative analytics tools and methodologies like Machine Learning (ML) and Artificial Intelligence (AI)

Data Ethics – advanced analytics exposes an organization to increased risk from the misuse of data; data use and outcome is not simply a legal question but also an ethical question

Data Ethics/ML/AI Concepts Mapped to DCAM v2.1 – a model is available that aligns the concepts of ML/AI and Data Ethics and their data and data management requirements to related sub-capabilities of DCAM.

Business Process Design – emphasizes the role of the business process in defining requirements for data and DM and in troubleshooting root causes and fixes; introduces the requirement for standard end-to-end DCAM DM processes

Enhancements to the Content

Standard Language – improved the consistent use of language and the readability

DCAM Business Glossary – applied the discipline of aligning terminology to the definitions in the DCAM Business Glossary

Rating Guidance Standard Language – established consistent language to describe the six assessment scores of a sub-capability to simplify the respondent scoring process

The rating guidance standard language is intended as the basis for understanding how to score each sub-capability. However, each sub-capability has its own bespoke rating guidance, specific to that sub-capability's context; it blends the standard language with specific information to help you score the sub-capability.

How to use this Document – Anatomy of DCAM

DCAM is organized into seven core components and one optional component. Each component is introduced with a definition of what it is, why it is important, and how it relates to the overall data management process. These definitions are written for business and operational executives to demystify the data management process. The core components are organized into 31 capabilities and 106 sub-capabilities; the optional component is organized into seven capabilities and 30 sub-capabilities. The capabilities and sub-capabilities are the essence of the DCAM Framework. They define the goals of data management at a practical level and establish the operational requirements needed for sustainable data management. Finally, each sub-capability has an associated set of measurement criteria, which are used in an assessment of your data management journey.

  • Component – a group of capabilities that together deliver a foundational tenet of data management; used as a reference tool by the data practitioners who are accountable for executing the tenet
    • Introduction – high-level context for the component; used as a background for developing an understanding of the component by data practitioners
      • Definition – formal description of the component; used to support common data management understanding and language
      • Scope – a set of statements to establish the guardrails for what is included in the component; used to understand and communicate reasonable boundaries
      • Value Proposition – a set of statements to identify the business value of delivering the data management component; used to inform the varied business cases for developing the DM initiative
      • Overview – more detailed context and accounting at a practical level to establish an understanding of the operational execution required for sustainable data management; used as a guide by the respective data practitioners
      • Process, Tools & Constructs – things that are required to support the execution of the component; used for reference and to link to supporting best practice material when available
      • Core Questions – high-level but probing inquiries; used to direct exploration of the data management component
  • Capability – a group of sub-capabilities that together execute tasks and achieve the stated objectives; used as a reference tool by the data practitioners who are accountable for the execution
    • Description – brief aggregate explanation of what is included in the sub-capabilities required to achieve the capability; used in the assessment process to inform the respondent of the scope of what they are rating
  • Sub-Capability – more granular activities required to achieve the capability; used as a reference tool by the data practitioners who are accountable for the execution
    • Description – a brief explanation of what is included in the sub-capability and used in the assessment process to inform the respondent of the scope of what they are rating
    • Objective – identified goals or desired outcomes from executing the sub-capability; used as a basis for defining requirements for the data management process design
    • Advice – more detailed, practical insight on best practices for executing the sub-capability, with an audit review perspective; used by the data practitioner
    • Questions – inquiries to direct interrogation of the capability/sub-capability current-state; used by the data practitioner to inform a perspective of the assessment scoring
    • Artifacts – required deliverables or evidence of adherence; used for assessment and audit reference and to link to supporting best practice material when available
    • Rating Guidance – insight for defining an assessment score; used when completing an assessment survey
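The Component → Capability → Sub-Capability hierarchy described above can be modeled as a simple nested structure. The class design and the example names ("Data Quality Assurance", "Data profiling") below are illustrative assumptions, not part of DCAM:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SubCapability:
    name: str
    description: str = ""
    objective: str = ""
    rating: Optional[int] = None  # 1-6 DCAM score, once assessed

@dataclass
class Capability:
    name: str
    sub_capabilities: List[SubCapability] = field(default_factory=list)

@dataclass
class Component:
    name: str
    capabilities: List[Capability] = field(default_factory=list)
    core: bool = True  # False only for the optional 8.0 Analytics Management

# Example: a fragment of the hierarchy with hypothetical lower-level names.
dq = Component(
    "5.0 Data Quality Management",
    capabilities=[
        Capability(
            "Data Quality Assurance",  # hypothetical capability name
            sub_capabilities=[SubCapability("Data profiling")],  # hypothetical
        )
    ],
)
```

A full model of DCAM would hold the seven core components (31 capabilities, 106 sub-capabilities) plus the optional component (seven capabilities, 30 sub-capabilities), with measurement criteria attached at the sub-capability level.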

DCAM Business Glossary

The EDM Council has developed a DCAM Business Glossary, which contains ~130 data management term names and definitions. DCAM v2.1 has applied these terms consistently across the document. Where there are terms defined in the glossary, the word or phrase is italicized and underlined in the text.

DCAM Business Glossary – all words or phrases throughout the document that are italicized and underlined are contained in the DCAM Business Glossary, available via the link above.

Acronym Glossary

AI – Artificial Intelligence
AM – Analytics Management (DCAM Component)
BA – Business Architecture (DCAM Component)
BI – Business Intelligence
CDE – Critical Data Element
COO – Chief Operating Officer
DA – Data Architecture (DCAM Component)
DaaS – Data as a Service
DBMS – Database Management System
DCE – Data Control Environment (DCAM Component)
DG – Data Governance (DCAM Component)
DM – Data Management
DMP – Data Management Program (DCAM Component)
DMS – Data Management Strategy (DCAM Component)
DQ – Data Quality
DQM – Data Quality Management (DCAM Component)
DSA – Data Sharing Agreement
ETL – Extract, Transform, Load
KPIs – Key Performance Indicators
KRIs – Key Risk Indicators
ML – Machine Learning
MRA – Matter Requiring Attention
NLP – Natural Language Processing
ODM – Office of Data Management
PMO – Program Management Office
QA – Quality Assurance
QC – Quality Control
RACI – Responsible, Accountable, Consulted and Informed
RCA – Root-cause Analysis
ROI – Return on Investment
SDLC – Software Development Lifecycle
SLA – Service Level Agreement
STP – Straight Through Processing
TCO – Total Cost of Operation
