Stibo Systems - The Master Data Management Company

A Complete Master Data Management Glossary

Brian Cluster | March 14, 2023 | 51 minute read


The language of master data management (MDM) is packed with definitions, acronyms and abbreviations used by analysts, data scientists, CDOs and IT managers. This dictionary offers simple explanations of the most common MDM terms and helps you navigate the MDM landscape.

 

Master data management definitions: A

ADM - Application Data Management

ADM is the management and governance of the application data required to operate a specific business application, such as your ERP, CRM or supply chain management (SCM) system. Your ERP system’s application data is data associated with products and customers. This may be shared with your CRM, but in general, application data is shared with only a few other applications and not widely across the enterprise.

ADM performs a similar role to MDM, but on a much smaller scale as it only enables the management of data used by a single application. In most cases, the ADM governance capabilities are incorporated into the applications, but there are specific ADM vendors that provide a full set of capabilities.

See also: Master Data


AI - Artificial Intelligence

In the MDM universe, AI is relevant in two ways. First, AI is an integral part of MDM itself: the platform uses AI and the underlying machine learning technology to classify data objects, such as products. AI searches for certain identifiers to find patterns in fragments of information, e.g., a few attributes or a picture of a product, and makes sure the product is classified correctly in the product taxonomy. This capability is used in product onboarding. If an object does not match the classification criteria, it can be sent to clerical review, which in turn helps train the AI. Second, AI also relies on MDM when used in external applications: AI needs to know which identifiers to search for, and these identifiers are described and governed in the MDM data model. MDM ensures the results of AI processing are trustworthy.

See also: Generative AI, Machine Learning, Auto-Classification


API - Application Programming Interface

An API is a defined interface, built into most applications and operating systems, that allows one piece of software to interact with other software. In master data management, not all functions can necessarily be handled in the MDM platform itself. For instance, you may want to deliver product data from your MDM to the ecommerce system or validate customer data using a third-party service. The API makes sure your request is delivered and returns the response.

The use of APIs is a core capability of the MDM enterprise platform. As the single source of truth, the MDM needs to connect with a wide range of business systems across the enterprise.
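As a sketch of how such an integration might look, the snippet below models a tiny MDM API client that fetches product master data for a consuming system. The `MdmClient` class, endpoint path and field names are illustrative assumptions, not a real vendor API, and the HTTP transport is stubbed so the example is self-contained.

```python
from urllib.parse import urljoin

class MdmClient:
    """Minimal sketch of an MDM API client. The endpoint path and
    payload shape are illustrative assumptions, not a real vendor API."""

    def __init__(self, base_url, transport):
        self.base_url = base_url
        self.transport = transport  # callable(method, url) -> dict

    def get_product(self, product_id):
        # Deliver product master data to a consuming system, e.g., ecommerce
        url = urljoin(self.base_url, f"/products/{product_id}")
        return self.transport("GET", url)

# Stubbed transport so the sketch runs without a live MDM server
def fake_transport(method, url):
    return {"id": url.rsplit("/", 1)[-1], "name": "Trail Runner", "color": "blue"}

client = MdmClient("https://mdm.example.com", fake_transport)
product = client.get_product("SKU-1001")  # consumed by, e.g., the ecommerce system
```

In a real integration, the transport would be an authenticated HTTPS call and the response would feed the consuming system directly.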

See also: Connector, DaaS, Data Integration


Architecture

An MDM solution is not just something you buy, then start to use. It needs to be fitted into your specific enterprise setup and integrated with the overall enterprise architecture and infrastructure, which is why MDM architecture is required as one of the first steps in an MDM process. Your MDM strategy should include a map of your data flows and processes, which systems are delivering source data, and which rely on receiving accurate data to operate. This map will show where the MDM fits into the enterprise architecture.

See also: Implementation Styles


Asset Data

Enterprise assets can be physical (equipment, buildings, vehicles, infrastructure) or non-physical (data, images, intellectual property). In either case, an asset is an object owned by a company. Assets have data, such as specifications, location, bill of materials and a unique ID. This data can be leveraged in different ways and needs to be managed, for instance for preventative maintenance. An MDM system can describe how assets relate, perform and are configured, essentially holding the digital twin of an asset. Reliable information on your assets can answer important questions like: What is the current condition of the assets supporting this business process? Who is using them? Where are they located? Please note this important distinction: asset data is your assets’ master data, whereas data that is generated by your assets is called sensor data, time series data or IoT data. Master data has low volatility, whereas IoT data is constantly changing and growing. Asset data represents what you know with certainty about your assets.

See also: DAM, Digital Twin


Attributes

In MDM, an attribute is a specification or characteristic that helps define an entity. For instance, a product can have several attributes, such as color, material, size, ingredients or components. Typical customer attributes are name, address, email and phone number. MDM supports the management and governance of attributes to provide accurate product descriptions, identify customers and create golden records. Having accurate and rich attributes is hugely important for the customer experience as well as for the data exchange with partners and authorities.

See also: Data Entity, Data Object, Data Governance, Golden Record 


Augmented MDM

Augmented master data management applies machine learning (ML) and artificial intelligence (AI) to the management of master data. Leveraging AI techniques can enhance data management capabilities and complement human intelligence to optimize learning and decision making. Augmented MDM enables companies to refine master data so that business operations run more efficiently and the business can transform to drive growth.

See also: AI


Auto-Classification

Auto-classification, also called computer-assisted or machine-supervised classification, is a method of automatically categorizing items into predefined categories, classes and industry standards using machine learning algorithms. It can be used as part of a product information management (PIM) system to automatically categorize and organize product information into product categories and attributes, for example when products are onboarded with a minimal set of data, such as brand name and item number. Auto-classification speeds up supplier item onboarding and leads to a more correct up-front classification of items, improving data quality through correct use of item record templates. Predictions are associated with a confidence range, and if the confidence of a prediction falls below a set threshold, the product is sent to clerical review for manual classification, which in turn can be used to re-train the classification algorithm.
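The confidence-threshold routing described above can be sketched as follows. The function name, threshold value and category scores are hypothetical; a real system would get the scores from a trained classifier.

```python
def route_classification(predictions, threshold=0.80):
    """Route a product to the top predicted category, or to clerical
    review when no prediction is confident enough.
    `predictions` maps category name -> confidence score (0..1)."""
    category, confidence = max(predictions.items(), key=lambda kv: kv[1])
    if confidence >= threshold:
        return {"status": "classified", "category": category}
    # Low confidence: a data steward classifies the item manually; that
    # decision can later be used to re-train the classification model.
    return {"status": "clerical_review", "best_guess": category}

confident = route_classification({"Footwear": 0.93, "Apparel": 0.05})
# With a sparse onboarding record, the model may be unsure:
unsure = route_classification({"Footwear": 0.41, "Apparel": 0.38})
```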


Master data management definitions: B

BI - Business Intelligence

Business intelligence is a type of analytics. It analyzes the data that is generated and used by a company in order to find opportunities for optimization and cost cutting. It entails strategies and technologies that help organizations understand their operations, customers, financials, product performance and many other key business measurements. Master data management supports BI efforts by feeding the BI solution with clean and accurate master data.

See also: Data Analytics, Data Governance

Big Data

Big data is characterized by the three Vs: Volume (a lot of data), Velocity (data created at high speed) and Variety (data comes in many forms and ranges). Big data does not only exist on the internet but also within individual companies and organizations that need to capitalize on that data.

Finding patterns and inferring correlations in large data sets often constitutes goals of big data projects. The purpose of using big data technologies is to capture the data and turn it into actionable insights. The information gathered from big data analytics can be linked to your master data and thereby provide new levels of insights.

See also: Data Analytics


BOM - Bill of Materials

In manufacturing, a bill of materials is a list of the parts or components that are required to build a product. A BOM is similar to a formula or recipe, but whereas a recipe contains both the ingredients and the manufacturing process, the BOM only contains the needed materials and items. Manufacturers rely on different digital BOM management solutions to keep track of part numbers, unit of measure, cost and descriptions.
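A BOM naturally forms a tree of parts, and quantities and costs can be rolled up through its levels. The sketch below assumes a simple part structure with unit cost, quantity and subcomponents; the part numbers and costs are made up.

```python
from dataclasses import dataclass, field

@dataclass
class BomItem:
    part_number: str
    unit_cost: float            # cost of this part on its own
    quantity: int = 1
    components: list = field(default_factory=list)

    def total_cost(self) -> float:
        # Roll the cost up through all levels of the bill of materials
        return self.quantity * (
            self.unit_cost + sum(c.total_cost() for c in self.components)
        )

wheel = BomItem("WHL-01", unit_cost=5.0, quantity=2,
                components=[BomItem("TIRE-01", unit_cost=3.0)])
frame = BomItem("FRM-01", unit_cost=20.0)
scooter = BomItem("SCT-01", unit_cost=2.0, components=[wheel, frame])
```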

See also: Attributes

Business Rules

Business rules are the conditions or actions that modify data and data processes according to your data policies. You can define numerous business rules in your MDM to determine how your data is organized, categorized, enriched, managed and approved. Business rules are typically used in workflows, e.g., for the validation of data in connection with import or publishing. As such, business rules are a vital tool for your data governance and for the execution of your data strategy as they ensure the data quality and the outcome you want to achieve.
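A minimal sketch of business rules applied as validation checks before a record is approved for publishing. The specific rules and field names are illustrative; real MDM platforms typically express such rules through configuration rather than code.

```python
def validate_product(record, rules):
    """Apply business rules to a record in a workflow step;
    returns the list of rule violations (empty means approved)."""
    return [message for check, message in rules if not check(record)]

# Illustrative rules; an MDM platform would let you configure these per workflow
rules = [
    (lambda r: bool(r.get("name")), "name is required"),
    (lambda r: r.get("price", 0) > 0, "price must be positive"),
    (lambda r: len(r.get("description", "")) >= 20, "description too short"),
]

errors = validate_product(
    {"name": "Trail Runner", "price": 89.95,
     "description": "Lightweight trail running shoe."},
    rules,
)
```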

See also: Workflow


Master data management definitions: C

CDI - Customer Data Integration

CDI is the process of combining customer information acquired from internal and external sources to generate a consolidated customer view, also known as the golden record or a customer 360° view. Integration of customer data is the main purpose of a Customer MDM solution that acquires and manages customer data in order to share it with all business systems that need accurate customer data, such as the CRM or CDP.
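A highly simplified survivorship rule for building the consolidated view might look like the sketch below: for each attribute, keep the first non-empty value in source-priority order. The source systems, priority order and attribute values are assumptions for illustration.

```python
def consolidate(records):
    """Merge customer records from several systems into one golden
    record. Survivorship rule: for each attribute, keep the first
    non-empty value in source-priority order (a deliberate
    simplification of real matching and merging logic)."""
    golden = {}
    for record in records:              # records are ordered by source priority
        for attribute, value in record.items():
            if value and attribute not in golden:
                golden[attribute] = value
    return golden

crm = {"name": "Jane Doe", "email": "jane@example.com", "phone": ""}
erp = {"name": "J. Doe", "phone": "+1 555 0100", "vat": "US123"}
golden = consolidate([crm, erp])   # CRM outranks ERP in this example
```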

See also: Customer MDM


CDP - Customer Data Platform

A customer data platform is a marketing system that unifies a company’s customer data from marketing and other channels to optimize the timing and targeting of messages and offers. A Customer MDM platform can support a CDP by ensuring the customer data that is consumed by the CDP is updated and accurate and by uniquely identifying customers. The CDP can manage large quantities of customer data while the MDM ensures the quality of that data. By linking the CDP data to other master data, such as product and supplier data, the MDM can maximize the potential of the data.

See also: CRM, Customer MDM


Centralized Style

The centralized implementation style refers to a system architecture in which the MDM system uses data governance algorithms to cleanse and enhance master data and then pushes data back to its respective source systems. This way, data is always accurate and complete where it is being used. The centralized style also allows you to create master data, making your MDM the system of origin of information. This means you can centralize data creation functions for supplier, customer and product domains in a distributed environment by creating the entity in MDM and enriching it through external data sources or internal enrichment.

See also: Implementation styles, Coexistence style, Registry style, Consolidation style


Change Management

Change management is the preparation and education of individuals, teams and organizations in making organizational change. Change management is not specific to implementing a master data management solution; it is, however, a necessity in any MDM implementation if you want to maximize the ROI. Implementing MDM is not a technology fix but just as much about changing processes and mindsets. As MDM aims to break down data silos, it will inevitably raise questions about data ownership and departmental accountabilities.


Cloud MDM

A hosted cloud MDM solution runs on third-party servers. This means organizations don’t need to install, configure, maintain and host the hardware and software; this is outsourced and offered as a subscription service. Cloud MDM holds many operational advantages, such as elastic scalability, automated backup and around-the-clock monitoring. Most companies choose a hosted cloud MDM solution over an on-premises solution or choose to migrate their solution to the cloud.

See also: SaaS


Coexistence Style

The coexistence implementation style refers to a system architecture in which your master data is stored in the central MDM system and updated in its source systems. This way, data is maintained in source systems and then synchronized with the MDM hub, so data can coexist harmoniously and still offer a single version of the truth. The benefit of this architecture is that master data can be maintained without re-engineering existing business systems.

See also: Implementation styles, Centralized style, Registry style, Consolidation style


Composable Commerce

Composable commerce is an innovative approach to digital commerce characterized by a modular architecture consisting of independent components. These components can be flexibly assembled to tailor-fit a business's unique requirements. Building upon the concept of headless commerce, which involves decoupling the customer-facing technology layer from backend systems, composable commerce takes this separation to the next level. It prioritizes heightened control and adaptability to curate an enhanced digital experience for consumers.

See also: Headless Commerce

Connector

Connectors are application-independent programs that transfer data or messages between business systems, external sources and applications, for example connecting your MDM platform to your ERP, analytics, data validation sources or marketplaces. Connectors are a vital architectural component for centralizing data and automating data exchange.

See also: API


Consolidation Style

The consolidation style refers to a system architecture in which master data is consolidated from multiple sources in the master data management hub to create a single version of truth, also known as the golden record.

See also: Implementation styles, Centralized style, Registry style, Coexistence style


Contextual Master Data

Contextual (or situational) master data management refers to the management of changeable master data as opposed to traditional, more static, master data. As products and services get more complex and personalized, so does the data, making its management equally complex. Dynamic, contextual MDM takes into consideration that the master data required to support some real-world scenarios changes. For example, certain personalized products can only be created and modeled in conjunction with specific customer information.


CRM - Customer Relationship Management

CRM is a system that can help businesses manage customer and business relationships and the data and information associated with them. For smaller businesses a CRM system can be enough to manage the complexity of customer data, but in many cases organizations have several CRM systems used to various degrees and with various purposes. For instance, the sales and marketing organization will often use one system, the financial department another, and perhaps procurement a third. MDM can provide the critical link between these systems. It does not replace CRM systems but supports and optimizes the use of them.

See also: ERP, Customer MDM


Customer MDM

Customer master data management is the governance of customer data to ensure unique identification of one customer as distinct from another. The aim is to get one single and accurate set of data on each of your business customers, the so-called 360° customer view, across systems and locations in order to get a better understanding of your customers. Customer MDM is indispensable for all systems that need high-quality customer data to perform, such as CRM or CDP. Customer MDM is also vital for compliance with data privacy regulations.

See also: CRM, CDP, GDPR


Master data management definitions: D

DaaS - Data as a Service

Data as a Service is a cloud-based data distribution service focused on the real-time delivery of data at scale to high-volume data-consuming applications. It is an always-on service that allows internal applications and customer-facing channels to pull in data whenever and wherever in the world it is needed. As part of MDM, DaaS delivers a near real-time version of master data through a configurable API and serverless architecture. This removes the need to maintain several API services as well as the need to create multiple copies of data for each application.

See also: SaaS


DAM - Digital Asset Management

Digital asset management (DAM) is the repository and management of digital assets, such as images, videos, digital files and their metadata. Important DAM capabilities include metadata management, version control, classification, linking, search and filter, workflow management and user-friendly asset import. Many businesses have a stand-alone DAM solution. When digital assets are needed for ecommerce, retail or distribution, the DAM needs to be integrated with the product information management system in order to not delay processes and cause bad user experiences. Master data management (MDM) supports digital asset management by connecting digital assets to appropriate master data records. DAM can be an integrated function in MDM, making it easy to accurately associate digital assets with individual products.


Data Analytics

Data analytics is the discovery of meaningful patterns in data. Businesses use analytics to gain insight and thereby optimize processes and business strategies. MDM can support analytical efforts by connecting data, upgrading the quality of data across the organization and providing organized master data as the basis of the analysis. Furthermore, MDM data can be shared with business intelligence solutions to help provide a common corporate data framework and hierarchies for business data analysis and fuel future developments in AI.


Data Augmentation

Data augmentation is a technique used in data science and machine learning to increase the size and diversity of a data set by creating new samples from existing data through modifications or transformations. The goal of data augmentation is to enhance the quality and robustness of the data and improve the accuracy and generalization of machine learning models. Master data augmentation refers to the process of enriching or expanding the existing master data of an organization by integrating new or external data sources. This can involve adding new attributes or fields to the existing master data, as well as updating or enriching the existing attributes with additional information. The goal is to improve the accuracy, completeness and relevance of the master data to enable better decision-making and execution of business processes. By incorporating new data sources and expanding the scope of the master data, organizations can gain a more comprehensive view of their operations and assets.

See also: Augmented MDM


Data Catalog

A data catalog is a tool that provides an organized collection of metadata, descriptions and information about the data assets available within an organization. A data asset can refer to any type of structured or unstructured data that is generated, collected or maintained by an organization. The purpose of a data catalog is to make it easier for users to discover, understand and access data assets. The catalog typically provides a searchable and browsable interface that allows users to find and explore data assets based on various attributes, such as name, type, format, owner, and usage. Master data management (MDM) and data catalogs are complementary technologies that can work together to improve data management practices. MDM provides a centralized repository for managing the most important data assets, such as customer and product data, while a data catalog provides an interface for discovering and accessing data assets across the organization.

Data Cleansing

Data cleansing is the process of identifying, removing and correcting inaccurate data records, for example by deduplicating data. One type of error addressed during data cleansing is validity: each piece of data aligned to a specific attribute should conform to the rules of that attribute. Consistency is another critical aspect affected by poor data cleansing practices. As brand ranges grow, product and brand descriptions for attributes should remain the same or evolve together for brand consistency. Cleansing data is an integral, basic process of master data management, as it eliminates useless data and enhances the overall quality and reliability of information within the company. By addressing both validity and consistency issues during data cleansing, organizations can improve their ability to make data-driven decisions and deliver great customer experiences.
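A minimal sketch of cleansing by normalization and deduplication: records are standardized, then matched on a normalized email address. The matching rule and record shapes are assumptions for illustration; real matching typically uses fuzzier, multi-attribute rules.

```python
def normalize(record):
    """Standardize formatting so equivalent records compare equal."""
    return {
        "name": " ".join(record["name"].split()).title(),  # collapse whitespace
        "email": record["email"].strip().lower(),
    }

def deduplicate(records):
    # Match duplicates on the normalized email; keep the first occurrence
    seen, clean = set(), []
    for record in map(normalize, records):
        if record["email"] not in seen:
            seen.add(record["email"])
            clean.append(record)
    return clean

raw = [
    {"name": "jane  doe", "email": "Jane@Example.com "},
    {"name": "Jane Doe",  "email": "jane@example.com"},
]
cleansed = deduplicate(raw)   # the two records collapse into one
```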

See also: Deduplication

Data Democratization

Democratizing data means making data available to all layers of an organization and its business ecosystem, across departmental boundaries. Hence, the opposite of data democratization is to store data in silos managed and controlled by a single department without visibility for others. The upside of data democracy is that it removes barriers for the talent in your organization by providing the right access to data, enabling timely and informed decision making. Data democratization is only possible if data is transparent. That includes its quality and sources, how it is shared and used, and who is accountable for the data quality as well as for the interpretation of the data.

See also: Data Silo, Data Transparency


Data Domain

In master data management, a data domain refers to a specific set or category of data that share common characteristics, such as data type, purpose, source, format, or usage. Data domains are often used to organize and categorize data assets within an organization, and they play a critical role in data governance and data quality management. For example, the customer data domain includes data such as name, address, email, phone number and demographic information. Another example could be product data, including attributes such as size, color, function, technical specifications and item number. Different domains can be governed in conjunction and thus provide new valuable insights, such as where and how a product is used. By defining data domains and their attributes, organizations can establish a common understanding of their data and ensure consistency, accuracy, and completeness across different systems and applications. This, in turn, can help organizations make better data-driven decisions, improve operational efficiency, and comply with regulatory requirements.

See also: Multidomain, Data governance, MDM, PIM, Zones of Insight


Data Enrichment

Data enrichment refers to the process of enhancing, refining or improving existing data sets by adding new information, attributes or context to them. The goal of data enrichment is to increase the value, relevance and accuracy of the data for analysis, decision-making and improving customer experiences. Product data enrichment is particularly important for providing good customer experiences and preventing product returns. Customer data enrichment can help companies gain a more comprehensive view of their customers. Data enrichment can involve various techniques and sources, such as:

  • Data augmentation: adding new data points or fields to existing data sets, such as demographic, geographic, behavioral or transactional data.
  • Data integration: combining data from multiple sources or systems to create a unified view of the data.


Data Fabric

Data fabric is a data architecture that connects and enhances your operating systems to provide these systems with clean, real-time data at scale. It’s an integrated layer on top of systems that these systems can subscribe to. It is designed to provide a unified view of data that can be accessed and used by different applications and users within an organization. Data fabric improves collaboration through a common shared data language and democratization of data. A multidomain master data management platform can facilitate a data fabric through integration of data sources and built-in data governance. However, data fabric extends beyond the capabilities of a single technology. It typically includes a variety of technologies and components, including data virtualization engines, data integration tools, metadata management systems and data governance policies. These components work together to provide a comprehensive view of data that can be accessed and used by different stakeholders. One of the key benefits of a data fabric is that it enables organizations to access and analyze data in real-time, regardless of where the data is located or how it is stored.

Data Governance

Data governance is a collection of practices and processes aiming to create and maintain a healthy organizational data framework. It is a set of policies, procedures and guidelines that are put in place to ensure that data is accurate, consistent and compliant with regulations and standards. Data governance can include creating policies and processes around version control, approvals, etc., to maintain the accuracy and accountability of the organizational information. Data governance is as such not a technical discipline but a discipline to ensure data is fit for purpose. Data governance includes a variety of activities, such as data quality management, data security, data privacy, data classification, data lifecycle management and data stewardship. These activities are designed to ensure that data is properly managed and used throughout its lifecycle, from creation to deletion. Data governance is typically led by a data governance team or committee that is responsible for creating and enforcing data policies and standards. This team may include representatives from various departments within an organization, such as IT, legal, compliance and business operations. Data governance can be supported by master data management capabilities that are configurable to execute data policies.


Data Governance Framework

A data governance framework is a collection of processes, policies, standards and metrics that ensure effective and efficient use of information. It includes the organizational structure, roles and responsibilities, and the tools and technologies used to manage and protect data.

See also: Data Policy


Data Hierarchy

In the context of master data management (MDM), data hierarchy refers to the organization and classification of data elements based on their level of importance and their relationships to each other. At the top of the data hierarchy are the master data domains. These are the core data entities, such as customers, products, suppliers and employees. Beneath the master data domain, e.g., product, are the different product types and classes. Under these you will find the stock keeping units (SKU) and their attributes. At the bottom of the data hierarchy is the reference data. Reference data includes categories like product classifications, industry codes and currency codes. Master data management helps organize data into logical hierarchies which is essential to become a data-driven organization.
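The hierarchy described above, from master data domain down through product classes to SKUs and their attributes, can be sketched as a nested structure. All domain, class and SKU names below are made up.

```python
hierarchy = {
    "Products": {                        # master data domain
        "Footwear": {                    # product class
            "SKU-1001": {"color": "blue", "size": "42"},   # SKU with attributes
            "SKU-1002": {"color": "red",  "size": "41"},
        },
        "Apparel": {
            "SKU-2001": {"color": "black", "size": "M"},
        },
    }
}

def skus_in_class(tree, domain, product_class):
    """Walk the hierarchy from domain through class down to the SKU level."""
    return sorted(tree[domain][product_class])

footwear_skus = skus_in_class(hierarchy, "Products", "Footwear")
```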

See also: Data Modeling, Data Domain, Reference Data

Data Hub

A data hub, or an enterprise data hub (EDH), is a database which is populated with data from one or more sources and from which data is taken to one or more destinations. A master data management system is an example of a data hub, and therefore sometimes goes under the name master data management hub.


Data Integration

Data integration is the process of combining data from different sources into a unified view. It involves transforming and consolidating data from various sources such as databases, cloud-based applications and APIs into a single, coherent dataset. One of the biggest advantages of a master data management solution is its ability to integrate with various systems and link all of the data held in each of them to each other. A system integrator will often be brought on board to provide the implementation services.

See also: API


Data Lake

A data lake is a central place to store your data, usually in its raw form without changing it. The idea of the data lake is to provide a place for the unaltered data in its native format until it’s needed. Data lakes allow organizations to store both structured and unstructured data in a single location, such as sensor data, social media data, weblogs and more. The data can be stored in different formats. Data lakes are highly scalable, allowing organizations to store and process vast amounts of data as needed, without worrying about storage or processing limitations. Certain business disciplines, such as advanced analytics, depend on detailed source data. However, since the data in a data lake has not been curated and derives from many different sources, this introduces an element of risk when the ambition is data-driven decision making. Data that is stored in a data lake is not reconciled and validated and may therefore contain duplicate or inaccurate information. Applying master data management can improve the accuracy and help identify relationships and thus enhance the transparency.

See also: Data Warehouse

Data Lineage

Data lineage refers to the detailed history of the data's life cycle, including its origins, transformations, and movements over time. It helps in understanding the data flow from its source to its final destination, providing transparency and aiding in data quality management, regulatory compliance, and troubleshooting.
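One simple way to picture lineage is to carry a list of transformation steps alongside the data itself, so every value can be traced back to its origin. The record shape and step names below are illustrative, not a standard lineage format.

```python
def apply_step(record, step_name, transform):
    """Apply a transformation to the data and append a lineage entry
    recording what was done, without mutating the input record."""
    data = transform(dict(record["data"]))
    return {"data": data, "lineage": record["lineage"] + [step_name]}

raw = {"data": {"price": "89,95"}, "lineage": ["imported from ERP"]}
clean = apply_step(
    raw,
    "normalized decimal separator",
    lambda d: {**d, "price": float(d["price"].replace(",", "."))},
)
# clean now carries both the transformed value and its full history
```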

Data Maintenance

In order for any data management investment to continue delivering value, you need to maintain every aspect of a data record, including hierarchy, structure, validations, approvals and versioning, as well as master data attributes, descriptions, documentation and other related data components. Maintenance is often enabled by automated workflows, pushing out notifications to, e.g., data stewards when there’s a need for a manual action. Maintenance is an important and ongoing process of any MDM implementation.

Data Mesh

Data mesh is an approach to data architecture that emphasizes the decentralization and democratization of data ownership and governance. It aims to address some of the common challenges of traditional centralized data architectures, such as slow time-to-market, siloed data and lack of agility. Traditional data storage systems can continue to exist but instead of a centralized data team that owns and governs all data, data mesh advocates for data ownership to be distributed among cross-functional teams that produce and consume data. This federated system helps eliminate many operational bottlenecks. The key idea behind data mesh is to treat data as a product and apply product thinking principles to data management. This implies clear product ownership, product roadmaps and product metrics. Data products should be designed to meet the specific needs of consumers, and should be evaluated based on their value to the business. While data mesh and master data management have different approaches to data management, they can complement each other in a hybrid data management approach. For example, domain-specific teams in a data mesh architecture can use MDM to ensure that their data products are consistent with enterprise-wide master data standards. In this way, data mesh and MDM can work together to ensure that data is accurate, consistent, and trustworthy across the enterprise.

See also: MDM

Data Modeling

Data modeling is the process of creating a conceptual or logical representation of data objects, their relationships and rules. It is used to define the structure, constraints and organization of data in a way that enables efficient data management and analysis, as well as supports your business model. Data modeling is an essential component of master data management (MDM) as it provides a principle for organizing and managing master data. In MDM, data modeling is used to define the attributes, relationships and hierarchies of master data entities such as customers, products and locations. It also helps in identifying and resolving data conflicts and duplicates by establishing unique identifiers and rules for data matching and merging. Data modeling in master data management is a process in the beginning of an MDM implementation where you accurately map and define the relationship between the core enterprise entities, for instance your products and their attributes. Based on that you create the optimal master data model that best fits your organizational setup.
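As an illustration, a conceptual model of a few master data entities and their relationships can be sketched in code. The entity names and attributes below are hypothetical, not a prescribed MDM schema:

```python
from dataclasses import dataclass, field

@dataclass
class Supplier:
    supplier_id: str  # unique identifier, the basis for matching and merging
    name: str

@dataclass
class Product:
    product_id: str  # unique identifier
    name: str
    attributes: dict = field(default_factory=dict)    # e.g., color, size
    supplier_ids: list = field(default_factory=list)  # relationship to suppliers

# One product related to one supplier through an ID reference
acme = Supplier("SUP-001", "Acme Components")
widget = Product("PRD-100", "Widget", {"color": "blue"}, [acme.supplier_id])
```

Unique identifiers and explicit relationships like these are what later enable matching, merging and deduplication of records across systems.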

Data Monetization

Data monetization is generally understood as the “process of using data to obtain quantifiable economic benefit” (Gartner Glossary, Data Monetization). There are three typical ways to achieve this:

  • Using data to make better decisions and thus enhance your business performance
  • Sharing data with business partners for mutual benefit
  • Selling data as a product or service

Whichever way you choose, data monetization becomes more profitable if you can provide context to the data and make it insightful. The more insight your data provides, the more valuable it is. Multidomain master data management can help add that context and thereby increase the value of your data.

Data Object

The data object is what you aim to enrich and identify uniquely using master data management (MDM). A data object has a unique ID and a number of attributes that it may share with other data objects; for example, two customer records may share the same address. Products, suppliers, customers and locations are examples of data object types in a master data management context. Data objects are organized in data models that can contain relations between them. A multidomain MDM solution can show relations across data object types.

See also: Data Modeling

Data Onboarding

Data onboarding is the process of transferring data from an external source into your system, e.g., product data from suppliers or content service providers to your PIM system, or from an offline database to an online environment. The onboarding process can be handled by data integration via an API, a data onboarding tool or by importing XML files. The speed of data onboarding can be crucial for reducing your time to market. Sophisticated onboarding processes can help maintain data quality by flagging duplicates or incomplete data. If your onboarding tool is supported by AI, it may suffice to onboard fragments of information, such as an item number or GTIN and a brand name.

Data Policy

Your data policy is a set of rules and guidelines for your data management that ensures data quality and keeps data processes aligned with business goals. It defines data ownership and stewardship as well as how data is stored and shared. A thorough data policy should be in place prior to any system implementation to ensure accountability and that data is clean and fit for purpose.

Data Pool

A data pool is a repository of data where trading partners can maintain and exchange information about products in a standard format. Suppliers can, for instance, export data to a data pool that cooperating retailers can then import to their ecommerce system. Data pools, such as 1WorldSync and GS1, enable business partners to synchronize data and reduce their time to market. An MDM system that supports various data pool formats can facilitate the data exchange and minimize the manual work needed.

See also: GS1

Data Quality

Data quality, or just DQ, refers to the overall accuracy, completeness, consistency, timeliness and relevance of data. The quality of data can have a significant impact on the accuracy and effectiveness of decision-making processes, as well as on the success of business operations that rely on data.

The quality of data is therefore of particular interest for data analysts and business intelligence. The five dimensions break down as follows:

  • Accuracy: the extent to which data reflects reality, ensuring that the data is correct and free from errors.
  • Completeness: the extent to which all necessary data is available, with no missing values.
  • Consistency: the extent to which data is uniform and follows predefined standards, so it can be compared or analyzed.
  • Timeliness: the extent to which data is available when it is needed and reflects the current state of affairs.
  • Relevance: the extent to which data is useful and applicable to a specific task or purpose.

100% data accuracy is in most cases neither attainable nor desirable. Most organizations aim for data quality that is 'fit for purpose'. A core job of master data management is to ensure the quality of master data to enable qualified decision-making.
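As a minimal sketch of how one of these dimensions can be quantified, completeness is often measured as the share of required fields that are actually filled. The field names and threshold here are invented for illustration:

```python
def completeness(record, required_fields):
    """Share of required fields that are present and non-empty."""
    filled = sum(1 for f in required_fields if record.get(f) not in (None, ""))
    return filled / len(required_fields)

customer = {"name": "Jane Doe", "email": "", "phone": "555-0100"}
score = completeness(customer, ["name", "email", "phone"])  # 2 of 3 fields filled
```

A data steward might flag any record scoring below a chosen 'fit for purpose' threshold for manual review.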

Data Silo

The term data silo describes when crucial data or information — such as master data — is stored and managed separately and in isolation by individuals, departments, regions or systems. The siloed approach results in data not being easily accessible or shared with other parts of the organization. In other words, data silos are a barrier that prevents the free flow of information within an organization. Data silos are often created unintentionally as different departments adopt different software systems or databases to manage their data. These silos can make it difficult for other departments to access important information, which can lead to redundant efforts, inconsistencies and inaccurate or incomplete data analysis. Master data management can help mitigate the negative impact of data silos through integration with business applications, such as ERPs and CRMs, thus providing a single source of truth.

Data Swamp

A data swamp occurs where a large volume of data is collected and stored without proper organization, management or governance. It is a data repository that is poorly managed and lacks structure, making it difficult or impossible to use for analysis or decision-making. The cause of a data swamp is when organizations collect data without a clear understanding of how it will be used or how it fits into the organization's broader goals and objectives. The consequences of a data swamp can include wasted resources, reduced productivity and even regulatory issues as data that is poorly managed may be subject to compliance violations or data breaches. To prevent a data swamp, organizations must adopt a data management strategy that includes proper data governance, metadata management and data quality controls. The capabilities of master data management support streamlining of data to make it trustworthy and actionable.

See also: Data Lake

Data Synchronization

Synchronization of master data is the process of ensuring that data is consistent and up-to-date across multiple systems or devices. Master data management ensures that all users have access to the most recent and accurate master data. Data synchronization is crucial in situations where multiple users or systems need to access and modify the same data, such as in a collaborative work environment or a mobile application that accesses data from a central server.

Data Syndication

Syndicating data is important for manufacturers and brand owners in order to share accurate, channel-optimized product data and content with retailers, distributors, data pools and marketplaces. Data syndication entails mapping, transforming and validating data. A master data management solution can automate the syndication process using built-in support for industrial classification standards or an integrated data syndication tool that is flexible to adapt to the retailer requirements as they change over time.

Data Transparency

Data transparency refers to the end-to-end insight into your most valuable data. It involves breaking down silos and barriers that obstruct the visibility, clarity and flow of trusted data. Data transparency includes knowledge about your data's completeness, where it comes from, who can access it, who is accountable for it, etc. Having data transparency enables you to comply with data regulations and standards and make better decisions based on insight. This insight is particularly important in order to meet the growing sustainability demands.

Data Warehouse

A data warehouse is a large, centralized repository of data that is used for reporting, analysis, and business intelligence. It is designed to store and manage data from various sources in a structured way, making it easier to access and analyze. The purpose of a data warehouse is to provide a single source of accurate, consistent and integrated data that can be used to make informed business decisions. Data warehouses are typically designed to support complex queries and analysis, rather than simple transaction processing. They are often used by businesses to consolidate data from disparate sources, such as transactional databases, customer relationship management systems and other sources, into a single location that can be easily accessed and analyzed. A master data management solution enhances the functionality of data warehouses by feeding trusted data into the database and by uniquely identifying entities.

See also: Data Lake

Deduplication

Deduplication of data entities is the process of identifying and removing duplicate or redundant data from a dataset. This process is important because duplicate data can slow down processing times, lead to inconsistencies in analysis or reporting and cause bad customer experiences. Deduplication can be done manually, by comparing and eliminating duplicates, or it can be automated using software tools. These tools use algorithms to identify duplicates based on various criteria, such as exact or partial matching, fuzzy matching or machine learning.

Master data management (MDM) can help with deduplication by providing a centralized system for managing and maintaining a single, authoritative source of master data. Through automated processes of matching and linking, MDM establishes a master record (i.e., golden record) for each entity and ensures that data is consistent, accurate, and up-to-date across all systems and applications. MDM includes tools for data profiling, cleansing, and matching, which can help identify and eliminate duplicates. For example, MDM can use algorithms to compare and match data from different sources, identifying duplicates based on various criteria such as name, address, phone number or other identifying attributes. Deduplication can be done for product records as well. Through deduplication, MDM enhances the functionality of business systems, such as CRM, customer support and ecommerce.

MDM can also help prevent the creation of duplicate data in the first place. By enforcing standard data entry rules and using data validation techniques, MDM can ensure that new data entered into the system is accurate and consistent with existing data. Deduplication improves data quality, reduces storage costs and increases efficiency.
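A minimal sketch of automated duplicate detection, using simple string similarity from the Python standard library. The records and the 0.85 threshold are invented for illustration; production MDM matching engines apply far richer rules:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Fuzzy string similarity between 0.0 and 1.0, case-insensitive."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

records = [
    {"id": 1, "name": "Jon Smith",  "city": "Aarhus"},
    {"id": 2, "name": "John Smith", "city": "Aarhus"},
    {"id": 3, "name": "Mary Jones", "city": "Atlanta"},
]

# Pairs whose names are highly similar and whose cities match exactly
# become duplicate candidates for review or automatic merging.
candidates = [
    (r1["id"], r2["id"])
    for i, r1 in enumerate(records)
    for r2 in records[i + 1:]
    if similarity(r1["name"], r2["name"]) > 0.85 and r1["city"] == r2["city"]
]
```

Here "Jon Smith" and "John Smith" in the same city would surface as one candidate pair, while "Mary Jones" would not.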

See also: Golden Record, Data Cleansing

Digital Catalog

A digital catalog is an electronic version of a product catalog or service catalog. It is a database or repository of information about products or services, including descriptions, specifications, prices, images, and other details. A digital catalog can be used by businesses to showcase and promote their products or services online, making them accessible to a global audience. Customers can browse the catalog, search for specific products or services, view images and descriptions and make purchases directly through the website. Digital catalogs are often used in ecommerce, and other industries that rely heavily on digital marketing and sales. They can be easily updated and maintained, making it possible to add new products or services, update prices and specifications and make other changes as needed. Master data management for product information (PIM) provides a single source of truth for product data, which can be used to populate a variety of channels, such as digital catalogs, ecommerce sites, marketplaces and other sales and marketing channels. Thus, master data management is the necessary foundation for a well-functioning digital catalog.

See also: PIM, Data Syndication

Data Marketplace

A data marketplace is an online platform where data providers can sell or share data with data consumers. It facilitates the exchange of data between organizations, helping them access a broader range of data sources for analytics and decision-making.

Data Masking

Data masking is a technique used to protect sensitive data by replacing it with realistic but false data. It is commonly used to secure data in non-production environments and ensure privacy while preserving the data's usability for testing and development.
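Two simple masking rules can be sketched like this. The rules are illustrative only; dedicated masking tools offer format-preserving and deterministic variants:

```python
import re

def mask_email(email):
    """Keep the first character and the domain; mask the rest of the local part."""
    local, _, domain = email.partition("@")
    return local[0] + "*" * (len(local) - 1) + "@" + domain

def mask_digits(text):
    """Replace every digit with 'X', preserving layout and punctuation."""
    return re.sub(r"\d", "X", text)

masked_email = mask_email("jane.doe@example.com")  # 'j*******@example.com'
masked_phone = mask_digits("+45 12 34 56 78")      # '+XX XX XX XX XX'
```

The masked values remain realistic enough for testing while no longer exposing the original personal data.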

See also: Synthetic Data

Digital Product Passport

The digital product passport is a sustainability requirement proposed by the European Union under the Ecodesign for Sustainable Products Regulation (ESPR) to “make sustainable products the norm […] and ensure sustainable growth.” The passport creates a digital record of data relating to all aspects of a product's lifecycle in order to capture the environmental impact of the supply chain, which accumulates from multiple suppliers and their actions, as well as aspects of the product's end of life, such as disposal, recyclability, recovery, refurbishment, remanufacturing, predictive maintenance and reuse. Master data management can support the creation of digital product passports by managing sustainability data from multiple domains, including suppliers, locations and products.

Data Profiling

Data profiling is the process of examining the data available in an existing data source and collecting statistics and information about that data. It helps in assessing the quality of data, identifying anomalies, and understanding data distributions, patterns, and potential relationships.
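A basic column profile (row count, fill rate, distinct values and most common values) can be sketched as follows; the sample data is made up:

```python
from collections import Counter

def profile(rows, column):
    """Basic profile of one column: fill rate, cardinality and top values."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    return {
        "count": len(values),
        "fill_rate": len(non_null) / len(values),
        "distinct": len(set(non_null)),
        "top": Counter(non_null).most_common(3),
    }

rows = [{"country": "DK"}, {"country": "DK"}, {"country": ""}, {"country": "US"}]
stats = profile(rows, "country")
```

Statistics like these reveal anomalies (an unexpectedly low fill rate, a surprising number of distinct values) before data is migrated or analyzed.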

Digital Transformation

Digital transformation is the process of using digital technologies to fundamentally change how businesses operate, interact with customers and create value. It involves the integration of digital technologies into all aspects of a business, from operations and customer experience to products and services. The goal of digital transformation is to improve business performance, increase efficiency and create new revenue streams. Digital transformation is driven by consumer demands, forcing organizations to change their business strategy and thinking in order to deliver excellent customer experiences, or by increasing demands for compliance or the general deluge of data generated by IoT and various platforms. Hence, digital transformation is a necessity but also a competitive parameter as it has major impact on efficiency and workflows.

Master data management (MDM) can play a crucial role in enabling digital transformation as the backbone architecture that ensures data is trustworthy and fit for purpose. By providing a centralized system for managing and maintaining a single, authoritative source of data, and by establishing a trusted master record for each entity, MDM ensures that data is consistent, accurate and up-to-date across all systems and applications. Thus, MDM is a key enabler of digital transformation, as it provides the foundation for data-driven decision-making, process automation and customer engagement. With a centralized MDM system, businesses can ensure that all stakeholders have timely and reliable access to the data they need for their transformation. MDM can also improve operational efficiency by automating manual processes and reducing the time and resources required for data management. By streamlining data workflows and ensuring data quality, MDM helps businesses improve productivity, reduce errors and enhance collaboration.

Digital Twin

A digital twin is a virtual representation, or data replication, of an entity such as an asset, product, person or process, developed to support business objectives. The digital twin of a person is also referred to as a customer 360° view. For manufacturers, a digital twin can visually conceptualize real-world manufacturing processes. The digital twin represents the transition from the physical product to the virtual product, helping companies ensure that what they are producing is what they actually intend to produce. Master data management supports the building of digital twins by providing a repository for rich and accurate data about assets.

Direct to Consumer (D2C, DTC)

D2C selling relates to manufacturers and brands that in addition to selling through retailers and dealers establish their own retail sales channels, such as ecommerce or brick-and-mortar shops. While the D2C approach can be profitable and good for brand building, it also challenges brands to manage their product data to make it fit for great consumer experiences and omnichannel purposes.

D-U-N-S

A Data Universal Numbering System number is a unique nine-digit identification number assigned to businesses and organizations by Dun & Bradstreet (D&B), a provider of commercial data, analytics, and insights. The D-U-N-S number is commonly used by lenders, suppliers and business credit reporting agencies to establish a company's creditworthiness and financial stability. The D-U-N-S number is also used by various organizations and government entities to identify and verify a business's existence and location. Master data management (MDM) can source the D-U-N-S number via integration to enrich a company's party data with unique identifiers.

master data management definition - E

EAM - Enterprise Asset Management

Enterprise Asset Management (EAM) is the process of managing an organization's physical assets throughout their lifecycle, from acquisition to disposal. EAM is typically used by organizations with large and complex physical assets, such as manufacturing plants, transportation companies, and utility providers. EAM software is used to track and manage assets such as machinery, equipment, vehicles, and buildings. It provides a centralized database for asset information, including maintenance schedules, warranties and repair histories. The software can also generate reports and analytics to help organizations optimize their asset utilization, reduce maintenance costs, and extend asset lifespan.

Master data management (MDM) can help describe how assets perform and relate functionally. MDM is inherently designed to cross silos of systems, business units and processes in order to unite information in a way that supports decision making. MDM is therefore positioned as a technology that can help unify and standardize views of asset information to help reduce the number of processes that an organization must work with.

See also: Asset Data

ECLASS

ECLASS is the global industry standard for the classification and description of products and services across countries and multiple languages. The classification system is maintained by the ECLASS e.V. industry association. ECLASS enables the digital exchange of product master data. Every product and service is classified in a four-level hierarchy and identified by an eight-digit code. ECLASS is used in product information management (PIM), ecommerce and master data management systems.

EDW

See Data Warehouse

Enterprise Data

Enterprise data refers to all the data that is created, processed and used within an organization. This includes data from various sources, such as internal systems, external partners, customers and other stakeholders. Enterprise data can take many forms, such as structured data (e.g., databases, spreadsheets), unstructured data (e.g., emails, documents), semi-structured data (e.g., XML files) and multimedia data (e.g., images, videos).

By implementing a master data management strategy, organizations can improve the accuracy, consistency and completeness of their master data across different systems and business units. This can help create context for all enterprise data, improving analytics and decision making.

ERP - Enterprise Resource Planning

Enterprise resource planning (ERP) is a type of business management software that integrates and manages an organization's core business processes in real-time. ERP systems are designed to provide a unified view of an organization's operations and data. ERP systems typically include modules for various business functions such as finance, accounting, inventory management, human resources, procurement and customer relationship management. These modules are integrated into a single system, which enables data sharing and visibility across different departments and functions. Businesses can have several ERP systems.

A master data management solution can enhance the functionality of the ERP by ensuring that the data from each of the data domains used by the ERP is accurate, up-to-date and synchronized across the multiple ERP instances. Although ERP systems also manage master data, they do not have the same comprehensive governance capabilities as MDM systems.

ETIM

ETIM is an industry classification system that stands for "European Technical Information Model." It is a standardized classification system for technical product data in the electrical, plumbing, heating, ventilation and air conditioning (HVAC) industries. The ETIM system provides a framework for classifying technical information about products in a standardized format that can be easily shared among manufacturers, distributors, retailers and end-users. The ETIM classification system is based on a hierarchical structure, with each level providing more detail about the product. ETIM also includes a standardized set of product attributes and values that can be used to describe the features of a product in a consistent way. This allows different manufacturers to use the same terminology to describe their products, making it easier for customers to compare and choose products.

Via integration or built-in ETIM compliance, a master data management platform can further enrich product information and facilitate data sharing between manufacturers and distributors/retailers.

ETL - Extract, Transform and Load

ETL (Extract, Transform, Load) is a data integration process used to move data from multiple sources, transform it into a consistent format, and load it into a target database or data warehouse. The three stages of ETL are:

Extract: Data is extracted from various sources such as databases, flat files, APIs or web services.

Transform: The extracted data is cleaned, normalized and transformed into a consistent format to ensure that it is accurate, complete and usable.

Load: The transformed data is loaded into a target database or data repository where it can be analyzed and used for various purposes, such as business intelligence reporting, analytics or master data management.
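The three stages can be sketched end to end with standard-library tools. The CSV sample and table name are invented; real pipelines use dedicated ETL or integration platforms:

```python
import csv
import io
import sqlite3

# Extract: read rows from a CSV source (here an in-memory sample)
source = io.StringIO("sku,price\nA1, 9.99 \nB2,19.50\n")
rows = list(csv.DictReader(source))

# Transform: normalize into a consistent format (trimmed, typed values)
cleaned = [(r["sku"].strip().upper(), float(r["price"])) for r in rows]

# Load: write into a target database for analysis and reporting
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE product (sku TEXT PRIMARY KEY, price REAL)")
db.executemany("INSERT INTO product VALUES (?, ?)", cleaned)
db.commit()
```

The same extract-transform-load shape applies whether the source is a flat file, an API or another database, and whether the target is a warehouse or an MDM hub.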

master data management definition - G

GDPR

The General Data Protection Regulation (GDPR) is a binding regulation created by the European Commission. The regulation, which came into effect on the 25th of May 2018, has replaced former European Union data protection directives and diverse national laws. The GDPR was introduced in order to strengthen the citizens' right to data protection, including the right to be forgotten, the right to access your own personal data, the right to data portability, to rectification and to object. Affected businesses have to meet several requirements in relation to how they collect and use the personal data of EU citizens – whether or not the company itself is European.

Master data management supports compliance with the GDPR by consolidating party data, providing transparency into data processing and by supporting consent management.

Generative AI

Generative AI refers to artificial intelligence techniques that are capable of generating new data or content based on patterns learned from existing data. It can be used in tasks such as language generation, image synthesis and code generation. Retailers can use generative AI to create compelling and unique descriptions for products. This is particularly useful for ecommerce websites or businesses with large product catalogs, where writing descriptions for each product can be a time-consuming and tedious task. With generative AI, a machine learning model can be trained on a dataset of existing product descriptions, allowing it to learn the patterns and structures commonly used in these descriptions. In order to get a satisfactory outcome of generative AI, it is therefore important that the underlying product data is governed and trustworthy.

Golden Record

The golden record is the consolidated master data record of a customer, supplier or product based on the most trusted information. It is also referred to as a single view: one unified, trusted version of data that captures all the necessary information about a customer or a product. It is very common for organizations to have several records of the same object sitting in the same or in different systems, and some of these records might be inaccurate or incomplete. Through data governance capabilities, such as matching, linking and merging, master data management (MDM) is capable of unifying this information into one trusted version.
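A simple survivorship rule (take each attribute from the most trusted source that has a value) can be sketched like this; the source systems and trust order are hypothetical:

```python
def merge_golden(records, trust_order):
    """Build a golden record: for each attribute, keep the value from the
    most trusted source that has a non-empty value (simple survivorship)."""
    fields = {f for r in records for f in r if f != "source"}
    ranked = sorted(records, key=lambda r: trust_order.index(r["source"]))
    golden = {}
    for f in fields:
        for r in ranked:
            if r.get(f):
                golden[f] = r[f]
                break
    return golden

crm  = {"source": "CRM",  "name": "Jane Doe", "email": "", "phone": "555-0100"}
shop = {"source": "Shop", "name": "J. Doe", "email": "jane@example.com", "phone": ""}
golden = merge_golden([crm, shop], trust_order=["CRM", "Shop"])
```

The CRM's name and phone survive, while the missing email is filled in from the less trusted shop record, yielding one complete, unified view.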

GRI

Global Reporting Initiative (GRI) is a sustainability standard that covers more than 140 topics, including biodiversity, tax, waste, emissions, diversity, equality, and health and safety. To comply with GRI reporting, companies can use master data management to ensure data transparency and to manage and share sustainability data.

GS1

GS1 is a global standards organization that develops and maintains a range of standards for the identification, capture and sharing of product and supply chain information. It is used by companies in a wide range of industries, including retail, healthcare, food service and manufacturing. The GS1 system enables companies to use a common language to identify products, locations, and assets, and to share data with trading partners in a consistent and standardized way. The GS1 standard includes a range of standardized identifiers, barcodes, and electronic data interchange (EDI) messages that enable companies to identify and track products as they move through the supply chain.

Some of the key components of the GS1 system include: Global Trade Item Number (GTIN), a unique identifier for products that is used to identify and track products at the item level; Global Location Number (GLN), a unique identifier for physical locations such as warehouses, retail stores, and manufacturing facilities; Global Data Synchronization Network (GDSN), a system that enables companies to share product information with trading partners in a standardized format, ensuring that everyone has access to the same information; and Electronic Product Code (EPC), a unique identifier for individual items that is encoded in an RFID tag, enabling companies to track items at the individual unit level.
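As a small worked example of these identifiers, the GS1 check digit at the end of a GTIN is computed by weighting the body digits 3, 1, 3, 1, ... from the right. The example number below is made up, but the algorithm follows the published GS1 rule:

```python
def gtin_check_digit(body):
    """Check digit for a GTIN body (all digits except the final check digit).
    Starting from the rightmost body digit, weights alternate 3, 1, 3, 1, ..."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10

def is_valid_gtin(gtin):
    """Validate a full GTIN (8, 12, 13 or 14 digits) against its check digit."""
    return gtin.isdigit() and gtin[-1] == str(gtin_check_digit(gtin[:-1]))

valid = is_valid_gtin("5701234567899")  # True: check digit 9 matches the body
```

A check like this lets an MDM or PIM system reject mistyped GTINs at the point of data entry or onboarding.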

A master data management solution will support and integrate the GS1 standards across industries.

See also: Data Pool

master data management definition - H

Headless Commerce

Headless commerce separates the customer-facing front-end technology from the business systems on the back-end to allow for increased flexibility. In other words, the back-end of the commerce solution is separated from the direct consumer experience. Headless commerce allows for a more focused approach to the customer experience and leverages integrations to the back-end to improve the ability to provide services. Information flows between the back-end and front-end via application programming interfaces (APIs). A product information management (PIM) solution provides the foundational, enriched product data that serves as the single source of truth and may also include DAM capabilities for sharing operational, marketing and rich product data across channels worldwide.

See also: Composable Commerce

master data management definition - I

Identity Resolution

Identity resolution is a data management process related to customer master data management, the data governance discipline that enables organizations to link customer data across disparate systems and sources to form a single, accurate and complete view of the customer. Identity resolution specifically involves gathering different data sets and identifying non-obvious relationships to link a customer's traces and accounts to their unique identity. Customers often have various digital identities that need to be consolidated into one customer view. By successfully linking all these identities, identity resolution enables customer-centric marketing.

See also: Customer MDM, Golden Record

Implementation Style

When you implement a master data management solution you can choose between four different methodologies, or styles, according to your business needs: (1) Registry, (2) Transaction/Centralized, (3) Consolidation or (4) Coexistence. The choice depends on the business situation and the organization's technology, people and aspirations. Implementation styles may also differ depending on the type of master data.

IoT - Internet of Things

The Internet of Things is the network of physical devices embedded with connectivity technology, which enables these devices to connect and exchange data. Devices include sensors, cameras, consumption meters and wearables. IoT can help companies maintain assets and equipment. A master data management solution supports IoT initiatives by providing a 360-degree view of the connected assets. Asset data management is important for providing context for IoT data, also known as sensor data or time series data.

master data management definition - L

Legacy Systems

A legacy system is an applied technology that has become outdated and been superseded by newer systems but is still in use for a variety of reasons. It is no longer for sale on the software marketplace and is not being updated or supported by its vendor. A master data management system can help in the retirement of many legacy systems by capturing and cleansing existing master data. Using the MDM approach means that a business can continue operations during a critical migration process and, furthermore, increase efficiency and data governance.

Location data

Data about locations. Locations, such as stores, warehouses, factories and other real estate, are assets that need accurate descriptions. Master data management can provide additional value by adding location data to the mix of master data, resulting in a multidomain solution. Product information can greatly benefit from being enriched with location data to provide insight into where a certain product is offered or from where it originates. Restaurants, hotels and venues can use location data management to advertise their location-based services. By enriching store location data with local points of interest information and localized consumer demographics, retailers can offer relevant products and services and update customers with the latest store information.

More information:

master data management definition - M

Machine Learning

Machine learning (ML) is a subfield of artificial intelligence that focuses on enabling machines to learn from data without being explicitly programmed. It is a type of algorithm that uses statistical models to recognize patterns in data, and then uses those patterns to make predictions or decisions. ML creates a data model based on sample data in order to allow more data to be fed into the system for continuous improvement. ML is being employed in master data management solutions to aid in reducing repetitive tasks such as classification, image metadata tagging and image deduplication.
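To make the deduplication use case concrete, here is a minimal sketch using string similarity from Python's standard library. This is not how a production MDM engine works (those use trained models and far richer features); it only illustrates the idea of flagging likely duplicate records for review. All names and the threshold are illustrative assumptions.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Ratio in [0, 1] of how alike two strings are (case-insensitive)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_duplicates(records, threshold=0.85):
    """Return index pairs whose values are similar enough to review as duplicates."""
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            if similarity(records[i], records[j]) >= threshold:
                pairs.append((i, j))
    return pairs

products = ["Acme Widget 500ml", "ACME widget 500 ml", "Bolt M8 steel"]
dupes = find_duplicates(products)   # flags the first two records as a likely pair
```

In a real solution, pairs like these would be auto-merged above a high confidence score and routed to clerical review below it, and the review outcomes would feed back into model training.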

Master Data

Master data is essential to the operation of a business or organization. It includes key business information, categorized in master data domains such as customer and product data, as well as other critical assets, including suppliers, locations and equipment. Master data is typically defined by a set of attributes describing the asset. Contrary to other types of data, e.g., transactional data or real-time data, master data is characterized by low volatility. Master data is often managed in many different systems according to domain, i.e., product master data is managed in a PIM system, and customer master data in a CRM. However, consolidating and governing master data in a single source can enhance insight and efficiency for departments and applications that rely on having clean master data. Effective management of master data is essential for ensuring data quality, consistency, and accuracy throughout the organization, and is key to supporting business processes and decision-making.

More information:

Master Data Management

Master data management (MDM) is the core process of acquiring, managing, governing, enriching and sharing master data according to the data policy and business rules of your company. The efficient management of master data in a central repository gives you a single authoritative view of information and eliminates costly inefficiencies caused by data silos. Master data management is necessary to support your business initiatives and objectives through identification, linking and syndication of information and content across products, customers, locations, suppliers, digital assets and more. Master data can be managed using spreadsheets or various business systems, such as ERP and CRM. However, to automate the process of managing master data and to ensure the uniformity of master data, many organizations use a dedicated MDM solution that is capable of connecting data silos and providing clean master data to the business systems that need it to operate efficiently.

More information:

Matching and linking

Matching and linking are two key processes in the creation of golden data records. Golden records are a single, accurate and comprehensive view of an entity, such as a customer or a product, that is created by combining and reconciling data from multiple sources. Matching involves identifying and grouping together records that refer to the same entity, based on a set of matching rules or criteria. These rules may include fields such as name, address and phone number, which are used to compare and match duplicate records from the same or different sources.

Linking refers to consolidating or merging the matched records into a single golden record. This involves resolving conflicts and inconsistencies between the records, and combining the information from each record into a single, comprehensive view of the entity. When all source records are linked to the golden record, the merging function selects a survivor and non-survivors. The golden record is based only on the survivor. The non-survivors are deleted from the system. Linking may also imply enriching the data by adding additional information from external sources.

The creation of golden records through matching and linking is an iterative process that requires ongoing monitoring and refinement to ensure accuracy and completeness. This process is typically supported by master data management (MDM) software, which provides tools and workflows for managing the creation and maintenance of golden records.
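As a rough illustration of survivorship, the sketch below assumes a simple "most recently updated record wins" rule, with empty fields back-filled from the non-surviving matched records. Real MDM tools apply configurable, per-field survivorship rules; the field names and rule here are illustrative assumptions.

```python
def merge_to_golden(matched_records):
    """Merge matched source records into one golden record.

    Survivor = the most recently updated record; non-survivors only
    contribute values for fields the survivor left empty.
    """
    ordered = sorted(matched_records, key=lambda r: r["updated"], reverse=True)
    golden = dict(ordered[0])           # survivor: the newest record
    for record in ordered[1:]:          # non-survivors back-fill the gaps
        for field, value in record.items():
            if not golden.get(field) and value:
                golden[field] = value
    return golden

sources = [
    {"name": "A. Smith", "email": "", "phone": "555-0101", "updated": "2023-01-10"},
    {"name": "Alice Smith", "email": "a.smith@example.com", "phone": "", "updated": "2023-03-01"},
]
golden = merge_to_golden(sources)
# survivor fields come from the 2023-03-01 record; the phone number is
# back-filled from the older, non-surviving record
```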

See also: Golden Record

More information:

Metadata Management

The management of data about data. Metadata refers to data that describes other data. It is information that provides context, meaning and structure to data. Metadata can be used to facilitate the discovery, management and use of data, and is essential for ensuring that data is accurate, consistent and interoperable. There are many different types of metadata. Examples of metadata include the author and date of creation of a document, the file format and size, the data type and structure and the keywords and tags used to describe the content. Metadata can be stored and managed separately from the data itself, and can be used to facilitate search, discovery, and analysis of data.

While metadata management and master data management systems intersect, they provide two different frameworks for solving data problems such as data quality and data governance.

Multidomain

A multidomain master data management solution manages the data of several enterprise domains, such as product and supplier, customer and product, or any other combination of more than one domain. A multidomain MDM solution provides unified governance across all domains. Multidomain MDM allows you to establish relationships between data of different domain origins and bring governance across these relationships. The combination and common governance of different master data domains can provide new insights into business-critical questions, such as where certain products are sold, how they are used, and which suppliers are delivering the same product. These insights can help reduce costs and open up new streams of revenue.

See also: Data Domain, Zones of Insight

More information:

Multiple Domain

A multiple-domain master data management solution manages master data from different domains in one application but has separate governance for each domain. Unlike multidomain solutions, this type of solution may require significant coding or configuration to enable true management of multiple domains.

More information:

master data management definition - N

Natural Language Processing

NLP is a field of artificial intelligence (AI) that focuses on enabling machines to understand and interpret human language. It involves developing algorithms and models that can analyze and generate natural language, such as text or speech, and perform language translation, sentiment analysis, speech recognition and more. NLP has many applications in fields such as customer service, healthcare, finance and education. For example, it can be used to develop chatbots and virtual assistants that can interact with customers in natural language. Master data management (MDM) is important for NLP because NLP algorithms rely on accurate and consistent data to generate meaningful insights and results. NLP applications typically require access to large volumes of high-quality data, and MDM provides a framework for managing and governing this data to ensure that it is accurate, complete and consistent across different systems and applications.

master data management definition - O

Omnichannel

Omnichannel is a marketing strategy that relies on a single source pushing product data out to separate business systems (data silos), where the data is transformed and cleaned individually for each channel. The focus is on providing customers with a consistent experience across all channels, including in-store, online, mobile and social media, and on allowing customers to switch between channels and touchpoints. In an omnichannel approach, each channel is managed separately, and the customer experience is optimized for each individual channel.

More information:

master data management definition - P

Party Data

In relation to master data management, party data is the overarching data domain referring to individuals and organizations, typically customers and suppliers, but also employees or college students. A party can also be a relation, such as an attorney or a family member of a customer, and party data is then data referring to these parties. Party data management can be part of an MDM setup, and these relations can be organized using hierarchy management. Party data can also be defined by its source. First-party data is your own data, second-party data is someone else’s first-party data handed over to you, while third-party data is collected by someone with no relation to you.

More information:

PII - Personally Identifiable Information

PII refers to any data that can be used to identify a specific individual. This can include a person's name, date of birth, social security number, driver's license number, passport number, email address, phone number or physical address. PII can be either sensitive or non-sensitive, depending on the context in which it is used. Sensitive PII includes information that, if disclosed, could result in harm or embarrassment to an individual, such as their financial information, medical records or personal relationships. Non-sensitive PII includes information that, by itself, is not considered harmful, but when combined with other information, could be used to identify an individual. Laws and regulations protecting data privacy, such as the GDPR, require organizations to take appropriate measures to safeguard PII and prevent unauthorized access or disclosure. Failure to protect PII can result in legal and financial consequences, as well as damage to an organization's reputation. Customer, or party, data management can help organizations manage legal compliance with PII regulations, e.g., by applying data governance rules to data processing and by managing consent and maintaining a single source of truth.
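One small, practical safeguard is masking obvious PII before data leaves a governed system. The sketch below is illustrative only: it catches simple email addresses and long digit runs with regular expressions, which is nowhere near a complete PII strategy, and the patterns are assumptions for the example.

```python
import re

# Illustrative masking rules: a simple email pattern and runs of six or
# more digits (e.g., ID numbers). Real PII handling needs much more.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
DIGITS = re.compile(r"\b\d{6,}\b")

def mask_pii(text: str) -> str:
    """Replace detected PII fragments with placeholder tokens."""
    text = EMAIL.sub("[EMAIL]", text)
    return DIGITS.sub("[NUMBER]", text)

masked = mask_pii("Contact jane.doe@example.com, SSN 123456789")
```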

More information:

PIM - Product Information Management

A PIM system is a software solution that is designed to manage and centralize product information, including product descriptions, attributes, images, videos and other related data. PIM systems are commonly used by ecommerce businesses, retailers and manufacturers to manage large volumes of product information across multiple channels, such as websites, mobile apps and marketplaces. By using a PIM system, organizations can ensure that product information is accurate, up-to-date and consistent across all channels, which can improve customer experience and increase sales.

Some of the key features of a PIM system may include:

  • Centralized product information storage and management
  • Workflow management for product data creation, review and approval
  • Product data enrichment capabilities, such as the ability to add metadata, attributes and digital assets
  • Data governance and data quality management tools to ensure accuracy and consistency of data
  • Integration with other systems, such as ecommerce platforms, content management systems and ERP systems
  • Multi-language and multi-currency support
  • Analytics and reporting capabilities to track product performance and user engagement

Many companies use a product master data management solution to build their PIM system because of the product MDM's enhanced capabilities, such as management of large data volumes and a wide range of integration options.

More information:

Platform

A software platform is a comprehensive technology that provides a foundation for developing, deploying and managing software applications. A software platform typically includes an operating system, middleware, development tools and other software components that are designed to work together to support the development and execution of software applications. Software platforms can be used to develop a wide range of applications, from simple desktop applications to complex enterprise systems and cloud-based services. Software platforms include, for example, operating systems (e.g., Windows, macOS, Linux and Android), web platforms (e.g., WordPress, Drupal, HubSpot and Sitecore), cloud platforms (e.g., Amazon Web Services, Microsoft Azure, and Google Cloud), and enterprise platforms (e.g., SAP and Oracle).

More information:

PLM - Product Lifecycle Management

PLM is a process that helps manage the entire lifecycle of a product from ideation, design and manufacturing to service and disposal. PLM is a cross-functional approach to managing product data and information across departments, including engineering, manufacturing, supply chain management, sales and marketing. The goal of PLM is to provide a centralized platform to manage product-related information and enable companies to optimize product development and improve collaboration between teams.

Master data management (MDM) can support PLM by providing accurate and consistent product data and a framework for data governance and data stewardship, which helps organizations ensure that product data is managed and controlled according to established policies and procedures. This can help organizations meet regulatory compliance requirements and reduce the risk of data breaches and other security incidents. MDM also provides integration capabilities that enable PLM systems to connect with other systems and applications, such as ERP and supply chain management systems. This can help organizations streamline their product development and manufacturing processes by providing a single source of truth for product data and information.

Product Data

Product data encompasses all attributes, relationships and records of a certain product. It includes descriptions, specifications, packaging data, units of measure, digital assets and much more. Product data is often managed in a PIM or Product MDM where you can maintain and share product data via integrations to other internal and external sources.

More information:

PXM - Product Experience Management

PXM is a process that focuses on creating and managing the end-to-end experience of a product across all channels and touchpoints, including online and offline. PXM aims to deliver consistent, high-quality product experiences that meet the needs and expectations of customers and stakeholders. PXM goes beyond traditional product information management (PIM) and product lifecycle management (PLM) systems by including a wide range of activities, such as product content creation, optimization, localization, enrichment and distribution, as well as analytics and reporting. PXM helps organizations create and deliver rich, engaging and informative product content that resonates with customers and drives conversions. The goal of PXM is to provide customers and stakeholders with a seamless and personalized product experience across online marketplaces, ecommerce websites, social media, mobile apps and physical stores. PXM systems integrate with other systems and applications to streamline workflows and improve the efficiency of product content creation and distribution.

master data management definition - R

Reference Data

Data that defines values relevant to cross-functional organizational transactions. Reference data refers to fixed and unchanging values or information that is used as a standard or point of reference for various purposes. It typically includes data elements that serve as constants or benchmarks, such as codes, classifications, standards or other predefined values. Reference data is commonly used in information systems and databases to provide context or categorization for other data elements. It helps ensure consistency, accuracy and interoperability across different systems or applications by providing a standardized framework for organizing and interpreting data. Examples of reference data include country codes, currency codes, industry classifications, product catalogs or any other type of data that remains relatively static and widely used for comparison or identification.
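The country-code example can be sketched in a few lines: a reference data table used to validate and standardize incoming values. The three codes shown are real ISO 3166-1 alpha-2 codes; the function name and the tiny table are assumptions for the example.

```python
# A tiny reference data lookup table (ISO 3166-1 alpha-2 country codes).
COUNTRY_CODES = {"US": "United States", "DE": "Germany", "DK": "Denmark"}

def validate_country(code: str) -> str:
    """Standardize an incoming country code and reject unknown values."""
    code = code.strip().upper()
    if code not in COUNTRY_CODES:
        raise ValueError(f"Unknown country code: {code}")
    return code

standardized = validate_country(" dk ")   # messy input normalized to "DK"
```

Checking every incoming record against such shared tables is what keeps "DK", "dk" and "Denmark" from becoming three different countries in downstream systems.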

More information:

Registry Style

A method of implementing a master data management solution where MDM is used as a referenced, read-only source of mastered data for external systems. The Registry style is mainly used to spot duplicates by running cleansing and matching algorithms on data from your various source systems. It assigns unique global identifiers to matched records to help identify a single version of the truth.

More information:

master data management definition - S

SaaS - Software as a Service

A cloud computing model where software applications are provided and hosted by a third-party provider over the internet. Instead of installing and maintaining software on individual computers or servers, users access and use the software via a web browser or a client. SaaS eliminates the need for users to handle software installation, infrastructure management and maintenance tasks, as these responsibilities are taken care of by the service provider. Users typically pay a subscription fee to access the software, often based on usage or billed on a periodic basis. SaaS offers benefits such as scalability, accessibility from anywhere with an internet connection, automatic updates, and the ability to share and collaborate on data easily.

More information:

SASB - Sustainability Accounting Standards Board

An independent nonprofit organization that develops and maintains sustainability accounting standards for publicly traded companies in the United States. SASB standards are industry-specific and focus on the disclosure of material environmental, social and governance (ESG) factors that are relevant to a company's financial performance. These standards provide a framework for companies to report on ESG issues that are most likely to impact their financial condition, operating performance, and risk profile. The SASB standards cover a wide range of industries and sectors, including financials, healthcare, technology, transportation and more. They address topics such as greenhouse gas emissions, employee health and safety, supply chain management, data privacy, and diversity and inclusion.

Master data management supports SASB reporting by collecting sustainability compliance data from heterogeneous sources, for instance suppliers, in order to comply with the sustainability assessments and disclosures your business is measured against.

More information:

SCM - Supply Chain Management

The coordinated activities involved in the planning, sourcing, production and distribution of goods and services from suppliers to end customers. It encompasses the entire lifecycle of a product or service, from the procurement of raw materials to the delivery of the final product. Master data is essential for effective supply chain management as the foundational and critical data that provides a comprehensive view of the products, customers, suppliers and other key entities within a supply chain. It includes information such as product descriptions, specifications and supplier details.

More information:

SKU - Stock Keeping Unit

A unique identifier or code used to track and manage individual inventory items within a company's product catalog. It is a distinct code assigned to each specific product variant, such as size, color or packaging, allowing for precise identification and differentiation of products. By assigning unique SKUs to each product variant, businesses can easily identify, classify and locate products within their inventory, streamline operations, and improve accuracy in stock management. SKUs are commonly used in retail and inventory management systems to facilitate efficient inventory control, order fulfillment, and tracking of sales and stock levels. Each SKU typically corresponds to a specific product configuration or variation, enabling businesses to manage and monitor their inventory with granularity.
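SKU formats vary by company, but the idea of encoding variant attributes into a structured identifier can be sketched briefly. The "CATEGORY-STYLE-COLOR-SIZE" scheme below is a hypothetical convention invented for the example, not a standard.

```python
# Hypothetical SKU scheme: CATEGORY-STYLE-COLOR-SIZE (illustrative only).
def make_sku(category: str, style: str, color: str, size: str) -> str:
    """Compose a SKU so each product variant gets a distinct identifier."""
    return "-".join(part.upper() for part in (category, style, color, size))

def parse_sku(sku: str) -> dict:
    """Split a SKU back into its variant attributes."""
    category, style, color, size = sku.split("-")
    return {"category": category, "style": style, "color": color, "size": size}

sku = make_sku("tsh", "1001", "blk", "m")   # "TSH-1001-BLK-M"
fields = parse_sku(sku)
```

Two variants of the same style, say black medium versus black large, then differ only in the final segment, which is what lets inventory systems track them with granularity.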

Stack

Your tech stack is the combination of software tools, programming languages, frameworks, libraries and other technologies that are used together to build and operate your software architecture or system. It represents the layers of technology components that are employed to develop, deploy and maintain a software solution. A typical tech stack is composed of several layers, including presentation, application and data layers.

Stewardship

Data stewardship is the management and oversight of an organization's data assets to ensure their quality, integrity and compliance with relevant policies and regulations. It involves the responsibility for safeguarding data throughout its lifecycle, from creation or acquisition to archival or deletion. Data stewards are individuals or teams tasked with ensuring that data is properly managed and protected. They act as custodians of the data and are accountable for its accuracy, completeness and appropriate use. Data stewardship involves a range of activities, including data governance, data quality management and data security.

More information:

Supplier Data

Data about suppliers. This is one of the most central master data domains for which an MDM solution can be beneficial. A supplier MDM solution can provide a 360° view of your supplier ecosystem, enabling visibility into relationships between suppliers, parent companies, sub-suppliers and subsidiaries. Managing supplier data also provides valuable insight into contracts, certifications, accreditations and performance metrics to ensure that your suppliers are compliant with your policies.

More information:

Supplier Portal

Supplier portals, also known as vendor portals, help companies establish product data guidelines and standardization to facilitate the onboarding of product information through supplier self-service access to their product information management (PIM) or master data management (MDM) platform. Supplier portals make it easier for vendors to upload new information and update existing information, and they reduce friction so the recipient can spend more time on higher-value activities.

More information:

Sustainability Data

Sustainability data provides insights into your organization's performance in terms of sustainable practices and helps identify areas for improvement. It includes information and metrics that measure and evaluate the environmental, social and economic impact of the organization's activities or products. It is also known as ESG data (Environmental, Social, Governance). Sustainability data can cover various aspects, including data related to resource consumption, energy usage, greenhouse gas emissions, waste generation, water usage and ecological footprint. It may also refer to an organization's social impact and include data concerning labor practices, human rights, employee well-being, diversity and inclusion, community engagement and social contributions. Sustainability data is typically collected through comprehensive reporting frameworks, such as the Global Reporting Initiative (GRI) or the Sustainability Accounting Standards Board (SASB). These frameworks provide guidelines for organizations to disclose their sustainability-related information in a standardized and transparent manner.

A master data management solution can support sustainability data management and help organizations comply with sustainability reporting requirements by collecting and systematizing data at all stages of the supply chain. The achieved transparency can help you make demands on your suppliers or replace them with new suppliers that comply with sustainability standards. With data on production conditions, supply chain, logistics, etc., stored and consolidated in a single master data source, you can make trusted and data-based reporting on progress and fulfilment of your sustainability efforts.

More information:

Synthetic Data

Synthetic data is created algorithmically to fill gaps in real-world data or to substitute for it altogether. It mimics the characteristics and patterns of real-world data without containing any personally identifiable information (PII). It is created using statistical algorithms and machine learning techniques to replicate the statistical properties and relationships found in actual data. Synthetic data serves as a privacy-preserving alternative for organizations to share or analyze data without the risk of exposing sensitive information. It enables the development and testing of models, algorithms and applications while maintaining data privacy and security. Synthetic data has applications in various fields, including data analysis, machine learning and cyber security. Master data management can improve the pertinence and explainability of synthetic data by implementing a process to ensure the curation of the synthetic information is representative and insightful.
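A toy version of the idea: fit a simple distribution to a real sample, then draw synthetic values from it, so aggregate statistics are preserved while no actual customer values are reused. The normal distribution, the order-amount scenario and the sample values are all assumptions for the example; real synthetic data generation uses far richer models.

```python
import random
import statistics

# Hypothetical "real" order amounts we want to mimic but not reuse.
real_amounts = [19.99, 24.50, 22.10, 18.75, 25.40, 21.30]
mu = statistics.mean(real_amounts)
sigma = statistics.stdev(real_amounts)

# Draw synthetic amounts from a normal distribution fitted to the sample.
random.seed(42)  # fixed seed so the sketch is reproducible
synthetic = [round(random.gauss(mu, sigma), 2) for _ in range(1000)]
# The synthetic sample's mean and spread track the real data closely,
# but none of the values are traceable to an actual order.
```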

More information:

master data management definition - T

TecDoc

TecDoc is a reference data standard for automotive parts and vehicles that is used to organize product data in the automotive industry. It provides basic descriptions of vehicles, vehicle components and product classification (i.e., standardized product information and criteria that can be described in the article) in a globally recognized, standardized format available in over 40 languages. Companies in the automotive industry use master data management solutions that can utilize TecDoc and other industry standard information to encourage accurate and collaborative sharing of critical automotive business data.

More information:

master data management definition - U

Unified Commerce

Unified commerce is a business strategy that leverages consistent data from a single platform to ensure that the entire organization is acting on the same consistent data, including, for example, sales representatives and business partners. With a unified-commerce strategy, every business function can tap into the same trusted source of information. This can positively impact the supply chain by facilitating more precise inventory management and forecasting. Unified commerce takes a holistic approach by integrating all channels and touchpoints into one seamless experience, including backend operations. Master data management can provide a centralized repository for all the core data elements that are needed to support different channels and touchpoints, such as ensuring that product information, promotions, customer data and other critical data are consistent and accurate across all channels. This can help to prevent issues like conflicting pricing or product information that can cause confusion and frustration for customers.

UNSDG

The United Nations Sustainable Development Goals are a set of 17 goals established by the United Nations in 2015 as part of the 2030 Agenda for Sustainable Development. The SDGs cover a broad range of social, economic and environmental objectives, including eradicating poverty, promoting gender equality, ensuring clean water and sanitation and combating climate change. The SDGs provide a framework for global action and cooperation to achieve a more sustainable and inclusive future. Organizations that need to report against some of the SDGs can use a master data management solution to provide visibility into business data enrichment completion and ensure compliance.

See also: Sustainability Data

More information:

UNSPSC

The United Nations Standard Products and Services Code (UNSPSC) is a coding system for goods and services used globally. It enables buyers and sellers to describe goods and services in a common way without referring to any suppliers' custom or privately created catalogue codes and descriptions. The UNSPSC framework adopts a four- or five-level classification, using the labels Segment, Family, Class, Commodity and Business Function, to accurately classify items across industries.
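The hierarchy is easy to see in the code format itself: a standard UNSPSC code packs the four main levels into eight digits, two per level. The sketch below splits a code into its levels; the example code is illustrative and the function name is an assumption.

```python
# UNSPSC packs four levels into eight digits, two digits per level:
# Segment (2), Family (4), Class (6), Commodity (8).
def parse_unspsc(code: str) -> dict:
    """Split an 8-digit UNSPSC code into its hierarchical levels."""
    if len(code) != 8 or not code.isdigit():
        raise ValueError(f"Expected an 8-digit UNSPSC code, got {code!r}")
    return {
        "segment": code[:2],
        "family": code[:4],
        "class": code[:6],
        "commodity": code,
    }

levels = parse_unspsc("43211508")   # example code, for illustration
```

Because each level is a prefix of the next, systems can roll spend or catalog data up from commodity to class, family or segment by simply truncating the code.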

More information:

master data management definition - W

Workflow Automation

Workflow automation in master data management helps to automate and streamline repetitive, manual tasks and processes. It involves designing, executing and managing a series of automated actions that follow a predefined sequence of steps or rules. Workflow automation aims to increase efficiency, reduce errors and enhance productivity by replacing manual interventions with automated actions. It typically involves the integration of various software applications and systems, allowing data and information to flow seamlessly across different stages of a process. For example, workflow automation is used to extract information from suppliers or data pools to populate a PIM system or data catalogues with product data. Rules ensure that the data is compliant with the receiver's data requirements.
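The rule-checking step mentioned above can be sketched as a minimal pipeline: each rule inspects an incoming product record, and the workflow only promotes records that pass every rule. The field names, rules and thresholds are all illustrative assumptions.

```python
# Illustrative onboarding rules: (message, check) pairs applied to each record.
RULES = [
    ("name is required", lambda r: bool(r.get("name"))),
    ("price must be positive", lambda r: r.get("price", 0) > 0),
    ("description under 500 chars", lambda r: len(r.get("description", "")) < 500),
]

def validate(record: dict) -> list:
    """Return the messages of all rules the record fails; empty means it passes."""
    return [message for message, check in RULES if not check(record)]

errors = validate({"name": "Widget", "price": 0, "description": "Steel widget"})
# a non-empty error list routes the record back to the supplier or a steward;
# an empty list lets the workflow promote it automatically
```

In a real MDM workflow, the failing records would be routed to a review queue rather than rejected outright, and the rules would be configured by data stewards instead of hard-coded.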

master data management definition - Z

Zone of Insight

Multidomain master data management (MDM) provides unified governance across all master data domains. In the intersections between domains, organizations can get access to new insights depending on the context of the master data. Common domains include customer, supplier, product, asset and location. Managing several master data domains in conjunction can add additional business value to the solution, such as end-consumer analytics (customer + product domains), localized, personalized offerings (customer + product + location domains) or understanding how a supplier is performing in terms of onboarding new products and identifying ways to improve (supplier + product domains). Zones of insight are a result of the ability of multidomain MDM to break down data silos and unify data in a single view using common data governance rules. Managing master data domains separately, e.g., product data in a PIM system and customer data in a CRM, would not allow for the data to be merged in the same insightful way.

More information:

Want this glossary to-go? Download it and keep it as your go-to master data management dictionary »



Brian Cluster is Stibo Systems’ Industry Strategy Director for the consumer packaged goods and retail industries. He has more than 25 years of experience collaborating on strategy, delivering analytics and developing business plans and digital transformation. At Stibo Systems, Brian is putting his varied industry expertise to good use, providing direction and strategy for field teams and helping to drive customer value for master data management solutions. He is a frequent contributor to The Consumer Goods Forum, and his articles have been published in Consumer Goods Technology, Multichannel Merchant, Total Retail, Footwear News, and Center for Data Innovation, among others.


10/25/22

Most Common ISO Standards in the Manufacturing Industry

10/18/22

How to Get Started with Master Data Management: 5 Steps to Consider

10/17/22

What is Supply Chain Analytics and Why It's Important

10/12/22

What is Data Quality and Why It's Important

10/12/22

A Data Monetization Strategy - Get More Value from Your Master Data

10/11/22

An Introductory Guide: What is Data Intelligence?

10/1/22

Revolutionizing Manufacturing: 5 Must-Have SaaS Systems for Success

9/15/22

An Introductory Guide to Supplier Compliance

9/7/22

What is Application Data Management and How Does It Differ From MDM?

8/29/22

Digital Transformation in the Manufacturing Industry

8/25/22

Master Data Management Framework: Get Set for Success

8/17/22

Discover the Value of Your Data: Master Data Management KPIs & Metrics

8/15/22

Supplier Self-Service: Everything You Need to Know

6/15/22

Omnichannel vs. Multichannel: What’s the Difference?

6/14/22

Digital Transformation in the CPG Industry

6/14/22

Create a Culture of Data Transparency - Begin with a Solid Foundation

6/10/22

The 5 Biggest Retail Trends for 2023-2025

5/31/22

What is a Location Intelligence?

5/31/22

Omnichannel Customer Experience: The Ultimate Guide

5/30/22

Location Analytics – All You Need to Know

5/26/22

Omnichannel Commerce: Creating a Seamless Shopping Experience

5/24/22

Top 4 Data Management Trends in the Insurance Industry

5/11/22

What is Supply Chain Visibility and Why It's Important

5/1/22

6 Features of an Effective Master Data Management Solution

4/30/22

What is Digital Asset Management?

4/23/22

The Ultimate Guide to Data Transparency

4/21/22

How Manufacturers Can Shift to Product-as-a-Service Offerings

4/20/22

How to Check Your Enterprise Data Foundation

4/16/22

An Introductory Guide to Manufacturing Compliance

4/14/22

Multidomain MDM vs. Multiple Domain MDM

3/31/22

Making Master Data Accessible: What is Data as a Service (DaaS)?

3/29/22

How to Build a Successful Data Governance Strategy

3/23/22

What is Unified Commerce? Key Advantages & Best Practices

3/22/22

How to Choose the Right Data Quality Tool?

3/22/22

What is a data domain? Meaning & examples

3/21/22

6 Best Practices for Data Governance

3/17/22

5 Advantages of a Master Data Management System

3/16/22

A Unified Customer View: What Is It and Why You Need It

3/9/22

Supply Chain Challenges in the CPG Industry

2/24/22

Data Migration to SAP S/4HANA ERP - The Fast and Safe Approach with MDM

2/17/22

The Best Data Governance Tools You Need to Know About

2/17/22

Top 5 Most Common Data Quality Issues

2/14/22

What Is Synthetic Data and Why It Needs Master Data Management

2/10/22

What is Cloud Master Data Management?

2/8/22

How to Implement Data Governance

2/7/22

Build vs. Buy Master Data Management Software

1/28/22

Why is Data Governance Important?

1/27/22

Five Reasons Your Data Governance Initiative Could Fail

1/24/22

How to Turn Your Data Silos Into Zones of Insight

1/21/22

How to Improve Supplier Experience Management

1/16/22

​​How to Improve Supplier Onboarding

1/16/22

How to Enable a Single Source of Truth with Master Data Management

1/13/22

What is a Data Quality Framework?

1/11/22

How to Measure the ROI of Master Data Management

1/11/22

What is Manufacturing-as-a-Service (MaaS)?

1/7/22

The Ultimate Guide to Building a Data Governance Framework

1/4/22

Introducing the Master Data Management Maturity Model

1/3/22

Master Data Management Tools - and Why You Need Them

12/20/21

The Dynamic Duo of Data Security and Data Governance

12/20/21

How to Choose the Right Supplier Management Solution

12/20/21

How Data Transparency Enables Sustainable Retailing

12/6/21

What is Supplier Performance Management?

12/1/21

What is Party Data? All You Need to Know About Party Data Management

11/28/21

What is Data Compliance? An Introductory Guide

11/18/21

How to Create a Marketing Center of Excellence

11/14/21

The Complete Guide: How to Get a 360° Customer View

11/7/21

How Location Data Adds Value to Master Data Projects

10/29/21

How Marketers Should Prepare for the 2023 Holiday Shopping Season

10/26/21

What is Supplier Lifecycle Management?

10/19/21

What is a Data Mesh? A Simple Introduction

10/15/21

How to Build a Master Data Management Strategy

9/26/21

10 Signs You Need a Master Data Management Platform

9/2/21

What Vendor Data Is and Why It Matters to Manufacturers

8/31/21

3 Reasons High-Quality Supplier Data Can Benefit Any Organization

8/25/21

4 Trends in the Automotive Industry

8/11/21

What is Reference Data and Reference Data Management?

8/9/21

What Obstacles Are Impacting the Global Retail Recovery?

8/2/21

GDPR as a Catalyst for Effective Data Governance

7/25/21

All You Need to Know About Supplier Information Management

7/21/21

5 Tips for Driving a Centralized Data Management Strategy

7/3/21

Data Governance and Data Protection, a Match Made in Heaven?

6/29/21

Welcome to the Decade of Transparency

5/26/21

How to Become a Customer-Obsessed Brand

5/12/21

How to Create a Master Data Management Roadmap in Five Steps

4/27/21

What is a Data Catalog? Definition and Benefits

4/13/21

How to Improve the Retail Customer Experience with Data Management

4/8/21

How to Improve Your Data Management

3/31/21

How to Choose the Right Master Data Management Solution

3/29/21

Business Intelligence and Analytics: What's the Difference?

3/25/21

Spending too much on Big Data? Try Small Data and MDM

3/24/21

What is a Data Lake? Everything You Need to Know

3/21/21

How to Extract More Value from Your Data

3/17/21

Are you making decisions based on bad HCO/HCP information?

2/24/21

Why Master Data Cleansing is Important to CPG Brands

1/20/21

CRM 2.0 – It All Starts With Master Data Management

12/19/20

5 Trends in Telecom that Rely on Transparency of Master Data

12/15/20

10 Data Management Trends in Financial Services

11/19/20

Seasonal Marketing Campaigns: What Is It and Why Is It Important?

11/8/20

What Is a Data Fabric and Why Do You Need It?

10/29/20

Transparent Product Information in Pharmaceutical Manufacturing

10/14/20

How to Improve Back-End Systems Using Master Data Management

9/19/20

8 Benefits of Transparent Product Information for Medical Devices

9/1/20

How Retailers Can Increase Online Sales in 2023

8/23/20

Master Data Management (MDM) & Big Data

8/14/20

Key Benefits of Knowing Your Customers

8/9/20

Women in Master Data: Kelly Amavisca, Ferguson

8/5/20

Customer Data in Corporate Banking Reveal New Opportunities

7/21/20

How to Analyze Customer Data With Customer Master Data Management

7/21/20

How to Improve Your 2023 Black Friday Sales in 5 Steps

7/18/20

4 Ways Product Information Management (PIM) Improves the Customer Experience

7/18/20

How to Estimate the ROI of Your Customer Data

7/1/20

Women in Master Data: Rebecca Chamberlain, M&S

6/24/20

How to Personalise Insurance Solutions with MDM

6/17/20

How to Democratize Your Data

6/3/20

How to Get Buy-In for a Master Data Management Solution

5/25/20

How CPG Brands Manage the Impact of Covid-19 in a Post-Pandemic World

5/18/20

5 Steps to Improve Your Data Syndication

5/7/20

Marketing Data Quality: Why Is It Important and How to Get Started

3/26/20

Panic Buying: Navigating Long-term Implications and Uncertainty

3/24/20

Women in Master Data: Ditte Brix, IMPACT

2/20/20

Get More Value From Your CRM With Customer Master Data Management

2/17/20

Women in Master Data: Nagashree Devadas, Stibo Systems

2/4/20

How to Create Direct-to-Consumer (D2C) Success for CPG Brands

1/3/20

Women in Master Data: Anna Schéle, Ahlsell

10/25/19

Women in Master Data: Morgan Lawrence, Infoverity

9/26/19

Women in Master Data: Sara Friberg, Acando (Part of CGI)

9/13/19

Improving Product Setup Processes Enhances Superior Experiences

8/21/19

How to Improve Your Product's Time to Market With PDX Syndication

7/18/19

8 Tips For Pricing Automation In The Aftermarket

6/1/19

How to Drive Innovation With Master Data Management

3/15/19

Discover PDX Syndication to Launch New Products with Speed

2/27/19

How to Benefit from Product Data Management

2/20/19

What is a Product Backlog and How to Avoid It

2/13/19

How to Get Rid of Customer Duplicates

2/7/19

4 Types of IT Systems That Should Be Sunsetted

1/3/19

How to Use Customer Data Modeling

11/15/18

How to Reduce Time-to-Market with Master Data Management

10/28/18

How to Start Taking Advantage of Your Data

9/12/18

6 Signs You Have a Potential GDPR Problem

8/16/18

GDPR: The DOs and DON’Ts of Personal Data

6/13/18

How Master Data Management Supports Data Security

6/7/18

Frequently Asked Questions (FAQ) About the GDPR

5/30/18

Understanding the Role of a Chief Data Officer

4/26/18

3 Steps: How to Plan, Execute and Evaluate Any IoT Initiative

2/20/18

How to Benefit From Customer-Centric Data Management

9/7/17

3 Ways to Faster Innovation with Multidomain Master Data Management

6/7/17

Product Information Management Trends to Consider

5/25/17

4 Major GDPR Challenges and How to Solve Them

5/12/17

How to Prepare for GDPR in Five Steps

2/21/17

How Data Can Help Fight Counterfeit Pharmaceuticals

1/24/17

Create the Best Customer Experience with a Customer Data Platform

1/11/17