Application integration platforms enable independently designed applications, apps and services to work together. Key capabilities of application integration technologies include:
• Communication functionality that reliably moves messages/data among endpoints.
• Support for fundamental web and web services standards.
• Functionality that dynamically binds consumer and provider endpoints.
• Message validation, mapping, transformation and enrichment.
• Orchestration.
• Support for multiple interaction patterns, content-based routing and typed messages.
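For illustration, the sketch below shows two of the listed capabilities, content-based routing and message enrichment, in a few lines of Python. The endpoint names and message shape are hypothetical, not drawn from any particular product.

```python
# Minimal sketch of content-based routing plus an enrichment step,
# two of the capabilities listed above. Endpoints and message fields
# are invented for the example.
from typing import Callable

# Registry mapping a predicate over message content to a consumer endpoint.
ROUTES: list[tuple[Callable[[dict], bool], str]] = [
    (lambda msg: msg.get("type") == "order" and msg.get("amount", 0) > 1000,
     "high_value_orders"),
    (lambda msg: msg.get("type") == "order", "standard_orders"),
    (lambda msg: True, "dead_letter"),  # fallback route always matches
]

def enrich(msg: dict) -> dict:
    """Enrichment: add a derived field before delivery."""
    return {**msg, "priority": "high" if msg.get("amount", 0) > 1000 else "normal"}

def route(msg: dict) -> str:
    """Return the first endpoint whose predicate matches the message."""
    for predicate, endpoint in ROUTES:
        if predicate(msg):
            return endpoint
    raise RuntimeError("unreachable: fallback route always matches")

message = enrich({"type": "order", "id": 42, "amount": 2500})
print(route(message), message)   # high_value_orders {'type': 'order', ...}
```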
Gartner defines augmented data quality (ADQ) solutions as a set of capabilities for an enhanced data quality experience, aimed at improving insight discovery, next-best-action suggestions and process automation by leveraging AI/machine learning (ML) features, graph analysis and metadata analytics. Each of these technologies can work independently or cooperatively to create network effects that increase automation and effectiveness across a broad range of data quality use cases. These purpose-built solutions include a range of functions, such as profiling and monitoring; data transformation; rule discovery and creation; matching, linking and merging; active metadata support; data remediation; and role-based usability.
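As a concrete illustration of the profiling and monitoring function, the sketch below computes per-column completeness and cardinality, the kind of raw signal an ADQ tool might feed into rule discovery. The records and column names are invented for the example.

```python
# Minimal profiling sketch: per-column completeness (share of non-null
# values) and distinct count. Illustrative data only.
records = [
    {"email": "a@example.com", "country": "US"},
    {"email": None,            "country": "US"},
    {"email": "b@example.com", "country": "DE"},
]

def profile(rows: list[dict]) -> dict:
    columns = {key for row in rows for key in row}
    report = {}
    for col in columns:
        values = [row.get(col) for row in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {
            "completeness": len(non_null) / len(values),  # 1.0 = no nulls
            "distinct": len(set(non_null)),               # cardinality
        }
    return report

print(profile(records))
# e.g. {'email': {'completeness': 0.666..., 'distinct': 2}, 'country': {...}}
```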
B2B gateway software (BGS) is integration middleware that supports information exchange between your organization and its ecosystem of trading partners, applications and endpoints. BGS consolidates and centralizes data and process integration and interoperability between a company's internal applications and external endpoints, such as business partners, SaaS applications or ecosystems. The BGS market is a composite market that includes pure-play BGS solutions and BGS that is embedded in or combined with other IT solutions (for example, ESB suites that support BGS features as services connected to the ESB suite, integration brokerage services, e-invoicing software and networks, application platform suites, electronic data interchange [EDI] translators, and managed file transfer [MFT] technology).
Customer data platforms (CDPs) are software applications that support customer experience use cases by unifying a company’s customer data from marketing, sales, service, commerce and other sources. CDPs unify customer data to coordinate profiles across cross-functional systems, create segments and/or audience targets, optimize offers and/or decisions, and inform analysis, while distributing insights that trigger other experiences.
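One way to picture the unification step is deterministic identity stitching on a shared key. The sketch below merges records from hypothetical marketing, commerce and service sources into one profile keyed on email; all source and field names are invented.

```python
# Sketch of profile unification: stitch records from different source
# systems into one profile keyed on a shared identifier (email here).
from collections import defaultdict

events = [
    {"source": "marketing", "email": "a@example.com", "segment": "newsletter"},
    {"source": "commerce",  "email": "a@example.com", "lifetime_value": 420.0},
    {"source": "service",   "email": "b@example.com", "open_tickets": 1},
]

profiles: dict[str, dict] = defaultdict(dict)
for event in events:
    key = event["email"]                       # deterministic match key
    attrs = {k: v for k, v in event.items() if k not in ("source", "email")}
    profiles[key].update(attrs)                # last write wins per attribute

print(dict(profiles))
# {'a@example.com': {'segment': 'newsletter', 'lifetime_value': 420.0}, ...}
```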
Customer Relationship Management (CRM) refers to products and services that enable organizations to manage, analyze, and enhance customer interactions. This category includes markets that support functions such as customer engagement, service delivery, experience personalization, and operational efficiency, all aimed at improving customer satisfaction, loyalty, and business outcomes.
The market for data integration tools consists of stand-alone software products that enable organizations to combine data from multiple sources and perform tasks related to data access, transformation, enrichment and delivery. They enable use cases such as data engineering, delivering modern data architectures, self-service data integration, operational data integration and supporting AI projects. Data management leaders procure data integration tools for their teams, including data engineers and data architects, or for other users, such as business analysts or data scientists. These products are primarily consumed as SaaS or deployed on-premises, in public or private cloud, or in hybrid configurations.
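A minimal sketch of the access, transformation, enrichment and delivery tasks named above, written as plain Python functions. The CSV source and the exchange-rate lookup are stand-ins for real systems, not any vendor's API.

```python
# Compact access -> transform -> enrich -> deliver pipeline sketch.
import csv, io, json

RAW = "id,amount\n1,100\n2,250\n"      # stands in for a source system
FX = {"USD_EUR": 0.9}                  # stands in for an enrichment service

def extract(text: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[dict]:
    return [{"id": int(r["id"]), "amount": float(r["amount"])} for r in rows]

def enrich(rows: list[dict]) -> list[dict]:
    return [{**r, "amount_eur": r["amount"] * FX["USD_EUR"]} for r in rows]

def deliver(rows: list[dict]) -> str:
    return json.dumps(rows)            # e.g., write to a target system

print(deliver(enrich(transform(extract(RAW)))))
```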
Gartner defines data management platforms (DMPs) as integrated, dynamic data environments for managing enterprise data with operational simplicity. DMPs bring different data management capabilities into a single platform, enabling technical and business users to efficiently manage data for operational, analytical and AI use cases. DMPs use shared metadata to automate data management activities, paving the way for more advanced data ecosystems. A DMP is a commercial solution from a single vendor for managing general-purpose data for an organization, unlike a customer data platform.
Data masking (DM) is based on the premise that sensitive data can be transformed into less sensitive but still useful data. This is necessary to satisfy application testing use cases that require representative and coherent data, as well as analytics that involve the use of aggregate data for scoring, model building and statistical reporting. The market for data protection, DM included, continues to evolve with technologies designed to redact, anonymize, pseudonymize or otherwise deidentify data in order to protect it against confidentiality or privacy risk.
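The sketch below illustrates two common deidentification techniques from this space: redaction of a direct identifier and deterministic pseudonymization via a keyed hash, so the same input always maps to the same token and joins across tables still work. The salt handling and field names are illustrative, not a vetted production scheme.

```python
# Sketch of redaction plus deterministic pseudonymization.
import hashlib

SALT = b"rotate-me"   # assumption: in practice a managed secret, not hardcoded

def pseudonymize(value: str) -> str:
    """Keyed hash: consistent token for the same input, hard to reverse."""
    return hashlib.sha256(SALT + value.encode()).hexdigest()[:12]

def mask_record(record: dict) -> dict:
    return {
        "customer_id": pseudonymize(record["customer_id"]),  # joinable token
        "name": "REDACTED",                                  # direct redaction
        "order_total": record["order_total"],                # still useful
    }

print(mask_record({"customer_id": "C-1001", "name": "Ada", "order_total": 99.5}))
```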
Data preparation is an iterative and agile process for finding, combining, cleaning, transforming and sharing curated datasets for various data and analytics use cases, including analytics/business intelligence (BI), data science/machine learning (ML) and self-service data integration. Data preparation tools promise faster time to delivery of integrated and curated data by allowing business users, including analysts, citizen integrators, data engineers and citizen data scientists, to integrate internal and external datasets for their use cases. Furthermore, they allow users to identify anomalies and patterns, and to review and improve the data quality of their findings in a repeatable fashion. Some tools embed ML algorithms that augment and, in some cases, completely automate certain repetitive and mundane data preparation tasks. Reduced time to delivery of data and insight is at the heart of this market.
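As a small example of a repeatable preparation step, the sketch below standardizes values, removes duplicates and flags a simple statistical outlier. The threshold and fields are illustrative.

```python
# Sketch of one preparation pass: standardize, dedupe, flag outliers.
from statistics import mean, stdev

rows = [
    {"city": " berlin ", "revenue": 100.0},
    {"city": "Berlin",   "revenue": 100.0},   # duplicate after cleaning
    {"city": "Munich",   "revenue": 120.0},
    {"city": "Hamburg",  "revenue": 5000.0},  # candidate anomaly
]

def clean(row: dict) -> tuple:
    return (row["city"].strip().title(), row["revenue"])

deduped = sorted(set(clean(r) for r in rows))
values = [rev for _, rev in deduped]
mu, sigma = mean(values), stdev(values)

for city, rev in deduped:
    flag = "ANOMALY" if sigma and abs(rev - mu) > sigma else "ok"
    print(city, rev, flag)   # Hamburg is flagged; the others pass
```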
Data virtualization technology is based on the execution of distributed data management processing, primarily for queries, against multiple heterogeneous data sources, and federation of query results into virtual views. This is followed by the consumption of these virtual views by applications, query/reporting tools, message-oriented middleware or other data management infrastructure components. Data virtualization can be used to create virtualized and integrated views of data in-memory, rather than executing data movement and physically storing integrated views in a target data structure. It provides a layer of abstraction above the physical implementation of data, to simplify querying logic.
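A virtual view can be pictured as a lazy join computed at read time across live sources rather than a materialized table. In the sketch below, two in-memory stores stand in for heterogeneous sources; the shapes and names are hypothetical.

```python
# Sketch of a virtual view: a join federated across two heterogeneous
# "sources", computed on demand instead of materialized in a target.
CRM_SOURCE = {"C1": {"name": "Acme"}, "C2": {"name": "Globex"}}   # key-value
ORDER_SOURCE = [{"customer_id": "C1", "total": 250.0},
                {"customer_id": "C1", "total": 80.0},
                {"customer_id": "C2", "total": 40.0}]             # relational-ish

def virtual_customer_orders():
    """Generator = the abstraction layer: callers query it like a view."""
    for order in ORDER_SOURCE:
        customer = CRM_SOURCE[order["customer_id"]]
        yield {"customer": customer["name"], "total": order["total"]}

# Consumption: the "query" runs at read time against the live sources.
print([row for row in virtual_customer_orders() if row["total"] > 100])
```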
Data and Analytics refers to products and services that enable organizations to collect, integrate, analyze, and act on data to drive informed decision-making and business outcomes. This category includes markets that focus on empowering enterprises to manage data pipelines, ensure data quality and governance, and extract insights through advanced analytics and machine learning across structured and unstructured data environments.
A data and analytics (D&A) governance platform is a set of integrated business and technology capabilities that help business leaders and users develop and manage a diverse set of governance policies and enforce those policies across business and data management systems. These platforms differ from data management tools in that data management focuses on policy execution, whereas D&A governance platforms are used primarily by business roles, not only or even specifically IT roles, for policy management. D&A leaders who are investing in operationalizing and automating the work of D&A governance should evaluate this market. That work primarily comprises policy setting and policy enforcement, in collaboration with data management (policy execution). Use cases span numerous governance policy categories and multiple business scenarios and asset types (data, KPIs, analytics models). The intersection of use case/business scenario, policy category and assets to be governed then identifies the required technology capability. Capabilities may share similar names across policy categories yet mean different things, or be used differently by different governance personas. For example, data classification in a data security implementation is quite different from data classification for creating trust models, which is based on lineage and curation.
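To make the policy management/policy enforcement split concrete, the sketch below represents one hypothetical policy as data that a business role could manage, plus an enforcement hook that flags non-conforming assets. All field names are invented for the example.

```python
# Sketch: a governance policy as managed data, with an enforcement check.
POLICY = {
    "name": "pii-in-nonprod",
    "applies_to": {"environment": "nonprod", "classification": "PII"},
    "requires": "masked",
}

ASSETS = [
    {"table": "customers",  "environment": "nonprod",
     "classification": "PII", "state": "masked"},
    {"table": "orders_raw", "environment": "nonprod",
     "classification": "PII", "state": "clear"},   # violation
]

def violations(policy: dict, assets: list[dict]) -> list[str]:
    scope = policy["applies_to"].items()
    return [a["table"] for a in assets
            if all(a.get(k) == v for k, v in scope)
            and a.get("state") != policy["requires"]]

print(violations(POLICY, ASSETS))   # ['orders_raw']
```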
The market for event stream processing (ESP) platforms consists of software subsystems that perform real-time computation on streaming event data. They execute calculations on unbounded input data continuously as it arrives, enabling immediate responses to current situations and/or storing results in files, object stores or other databases for later use. Examples of input data include clickstreams; copies of business transactions or database updates; social media posts; market data feeds; images; and sensor data from physical assets, such as mobile devices, machines and vehicles.
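A minimal sketch of continuous computation on an unbounded stream: a tumbling window that emits an aggregate as events arrive. The event shape and window size are illustrative.

```python
# Tumbling-window aggregation over a (conceptually unbounded) stream.
def tumbling_window_sum(events, window_seconds=60):
    """events: iterable of (timestamp_seconds, value), assumed time-ordered."""
    window_start, total = None, 0.0
    for ts, value in events:
        bucket = ts - (ts % window_seconds)
        if window_start is None:
            window_start = bucket
        if bucket != window_start:           # window closed: emit and reset
            yield (window_start, total)
            window_start, total = bucket, 0.0
        total += value
    if window_start is not None:
        yield (window_start, total)          # flush the last open window

stream = [(0, 1.0), (30, 2.0), (65, 5.0), (70, 1.0), (130, 9.0)]
print(list(tumbling_window_sum(stream)))
# [(0, 3.0), (60, 6.0), (120, 9.0)]
```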
Gartner defines integration platform as a service (iPaaS) as a vendor-managed cloud service that enables end users to implement integrations between applications, services and data sources, both internal and external to their organization. iPaaS enables end users of the platform to integrate a variety of internal and external applications, services and data sources for at least one of the three main patterns of integration technology use: data consistency, multistep process and composite services. These integration use cases are most commonly implemented via intuitive low-code or no-code developer environments, though some vendors provide more complex developer tooling.
Master data management (MDM) is a technology-enabled business discipline where business and IT organizations work together for the uniformity, accuracy, stewardship, semantic consistency and accountability of enterprises’ shared master data assets. Organizations use MDM solutions as part of an MDM strategy, which should be part of a wider enterprise information management (EIM) strategy. An MDM strategy potentially encompasses management of multiple master data domains (e.g., customer, citizen, product, “thing,” asset, person/party, supplier, location, and financial master data domains). Data and analytics (D&A) leaders procure MDM tools for data engineers or less-technical users, such as data stewards.
Master data management (MDM) of customer data solutions are software products that:
• Support the global identification, linking and synchronization of customer information across heterogeneous data sources through semantic reconciliation of master data.
• Create and manage a central, persisted system of record or index of record for customer master data.
• Enable the delivery of a single, trusted customer view to all stakeholders, to support various business initiatives.
• Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
• Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.
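The identification and linking capability can be as simple as a deterministic match on a normalized key, as in the sketch below, which links records from two hypothetical source systems. Real MDM products layer probabilistic and fuzzy matching on top of this.

```python
# Sketch of deterministic matching/linking across two source systems.
def match_key(record: dict) -> tuple:
    """Normalize the identifying attribute into a comparable key."""
    return (record["email"].strip().lower(),)

CRM = [{"id": "crm-1", "email": "Ada@Example.com", "name": "Ada Lovelace"}]
ERP = [{"id": "erp-9", "email": "ada@example.com ", "credit_limit": 5000}]

links = [(a["id"], b["id"]) for a in CRM for b in ERP
         if match_key(a) == match_key(b)]
print(links)   # [('crm-1', 'erp-9')] -> candidates for one golden record
```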
Master data management (MDM) of product data solutions are software products that:
• Support the global identification, linking and synchronization of product data across heterogeneous data sources through semantic reconciliation of master data.
• Create and manage a central, persisted system of record or index of record for product master data.
• Enable the delivery of a single, trusted product view to all stakeholders, to support various business initiatives.
• Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
• Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.
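For the merging step, survivorship rules decide which linked record wins each attribute. The sketch below ranks hypothetical sources by trust and lets more trusted values overwrite less trusted ones; the sources and ranking are invented.

```python
# Sketch of survivorship: build a golden product record from linked
# records by a per-source trust ranking (higher = more trusted).
TRUST = {"plm": 3, "erp": 2, "supplier_feed": 1}

linked = [
    {"source": "erp",           "sku": "A-1", "weight_kg": 1.2},
    {"source": "plm",           "sku": "A-1", "weight_kg": 1.25},
    {"source": "supplier_feed", "sku": "A-1", "color": "red"},
]

def golden_record(records: list[dict]) -> dict:
    merged: dict = {}
    for rec in sorted(records, key=lambda r: TRUST[r["source"]]):
        merged.update({k: v for k, v in rec.items() if k != "source"})
    return merged   # later (more trusted) sources overwrite earlier ones

print(golden_record(linked))
# {'color': 'red', 'sku': 'A-1', 'weight_kg': 1.25}
```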
Gartner defines metadata management solutions as applications that enable the collection, analysis and orchestration of metadata related to organizational data assets. These solutions provide workflow and operational support to make data easy to find, use and manage. They do this by collating metadata in any form from within their own application and from third-party systems, and by providing the ability to search, analyze and make decisions on the collated results. They also provide transparent cross-referencing across all related metadata, and derive insights from data (such as usage patterns and performance) through analysis of metadata to support a wide range of data-driven initiatives.
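Metadata collection can be pictured as harvesting catalog information from a source into a searchable registry. The sketch below uses SQLite's built-in catalog as a stand-in for whatever system is being harvested.

```python
# Sketch of metadata harvesting into a simple searchable registry.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")

registry = []   # the collated metadata store
for (table,) in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'"):
    # PRAGMA table_info rows: (cid, name, type, notnull, default, pk)
    for _, col, col_type, *_ in conn.execute(f"PRAGMA table_info({table})"):
        registry.append({"table": table, "column": col, "type": col_type})

# Search over collated results, as a catalog UI would.
print([m for m in registry if "email" in m["column"]])
# [{'table': 'customers', 'column': 'email', 'type': 'TEXT'}]
```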
Gartner defines the product information management (PIM) market as the packaged solutions that enable product, commerce and marketing teams to create and maintain an approved shareable version of rich product content. PIM makes a single, trusted source of product information available for multichannel commerce and data exchange. PIM solutions now support complex use cases, including product data syndication (PDS), product experience management (PXM), product information effectiveness analytics, digital shelf analytics and product data contextualization. They lay the foundation for delivering personalization, product discovery and digital experience platforms (DXPs). PIM is available as hosted cloud-native, SaaS, private cloud and on-premises solutions.
To reduce both infrastructure costs and manual workloads in postmodern ERP projects, SAP application leaders and SAP Basis operations leaders should evaluate specialized software tools for automating the regular refresh of their SAP ERP test data. SAP selective test data management tools perform selective copying of SAP test data, but they vary in their approach to data selection, scrambling and performance optimization. There are two user constituencies for these tools: (1) Basis operations teams, which need repetitive data copy operations that are as automated as possible; and (2) SAP application teams, which need ad hoc copying of application data objects. Some of the tools also enable Basis operations teams to produce a 'shell system': an identical copy of a complete production system, but without the transaction data, which is useful for testing in many projects.
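The sketch below is not SAP-specific, but it illustrates what selective copying with scrambling amounts to: slice production data by an organizational key, scramble the sensitive fields, and load only that slice into the test system. Table, key and field names are hypothetical.

```python
# Generic sketch of a selective copy with scrambling (not an SAP API).
import hashlib

PROD = [
    {"company_code": "1000", "vendor": "ACME GmbH", "iban": "DE44500105175407324931"},
    {"company_code": "2000", "vendor": "Globex",    "iban": "DE89370400440532013000"},
]

def scramble(value: str) -> str:
    return hashlib.sha256(value.encode()).hexdigest()[:16]

def selective_copy(rows, company_code):
    """Copy only one company code's data, scrambling the sensitive field."""
    return [{**r, "iban": scramble(r["iban"])}
            for r in rows if r["company_code"] == company_code]

print(selective_copy(PROD, "1000"))   # one row, IBAN scrambled
```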
The structured data archiving (SDA) and application retirement market comprises technology solutions that manage the life cycle of application-generated data and address corporate and regulatory compliance requirements. Application-generated data includes databases and related unstructured data. SDA solutions focus on improving the storage efficiency of data generated by on-premises and cloud-based applications, and on orchestrating the retirement of legacy application data and infrastructure. The SDA market includes solutions that can be deployed on-premises or on private and public infrastructure, and includes managed services offerings such as SaaS or PaaS.
Test Data Management (TDM) is the process of provisioning data for development and testing in preproduction environments. It ensures efficient, high-quality datasets while safeguarding data privacy and sensitive corporate information to meet compliance and security requirements. Modern TDM solutions leverage synthetic data generation, alongside data subsetting and masking techniques, to provide realistic yet secure test data. These solutions are widely used by software developers, QA engineers, data analysts, and IT security teams to optimize testing, maintain regulatory compliance, and enhance application reliability.
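As an illustration of the synthetic data generation approach, the sketch below fabricates realistic-shaped customer records, seeded so test runs are reproducible. The field names and value pools are invented for the example.

```python
# Sketch of synthetic test data: fabricated but realistic-shaped records.
import random

random.seed(42)   # reproducible datasets across test runs
FIRST = ["Ada", "Grace", "Alan", "Edsger"]
LAST = ["Lovelace", "Hopper", "Turing", "Dijkstra"]

def synthetic_customer(i: int) -> dict:
    first, last = random.choice(FIRST), random.choice(LAST)
    return {
        "id": f"TEST-{i:04d}",
        "name": f"{first} {last}",
        "email": f"{first}.{last}@test.invalid".lower(),  # reserved test TLD
        "credit_limit": random.randrange(1_000, 50_000, 500),
    }

for row in (synthetic_customer(i) for i in range(3)):
    print(row)
```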