Gartner defines analytics and business intelligence (ABI) platforms as those that enable organizations to model, analyze and visualize data to support informed decision making and value creation. These platforms facilitate the preparation of data and the creation of interactive dashboards, reports and visualizations to uncover patterns, predict trends and optimize operations. By doing so, they empower users to collaborate and effectively communicate the dimensions and measures that drive their organization. The platforms may also include the ability to create, modify or enrich a semantic model, including business rules. Analytics and business intelligence platforms integrate data from multiple sources, such as databases, spreadsheets, cloud services and external data feeds, to provide a unified view of data, breaking down silos and transforming raw data into meaningful insights. They also allow users to clean, transform and prepare data for analysis, and to create data models that define relationships between different data entities.
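As a concrete illustration, the sketch below shows in plain Python the kind of semantic model such a platform might maintain: dimensions, measures and a business rule evaluated over a small in-memory dataset. All table and field names here are hypothetical.

```python
# Minimal sketch of a semantic model: dimensions, measures, and a business
# rule, evaluated over an in-memory dataset. All names are hypothetical.
from collections import defaultdict

sales = [
    {"region": "EMEA", "product": "A", "revenue": 1200.0, "cost": 700.0},
    {"region": "EMEA", "product": "B", "revenue": 800.0,  "cost": 500.0},
    {"region": "APAC", "product": "A", "revenue": 950.0,  "cost": 600.0},
]

DIMENSIONS = {"region", "product"}
MEASURES = {
    "revenue": lambda rows: sum(r["revenue"] for r in rows),
    # Business rule encoded in the semantic layer: margin = revenue - cost.
    "margin":  lambda rows: sum(r["revenue"] - r["cost"] for r in rows),
}

def aggregate(rows, dimension, measure):
    """Group rows by a dimension and apply a named measure to each group."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[dimension]].append(row)
    return {key: MEASURES[measure](group) for key, group in groups.items()}

print(aggregate(sales, "region", "margin"))
# {'EMEA': 800.0, 'APAC': 350.0}
```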
Application Development refers to products and services that support the design, creation, deployment, and maintenance of software applications across web, mobile, desktop, and cloud environments. This category includes markets that help organizations build scalable, secure, and user-centric applications while adopting agile methodologies, automation, modern development practices, and continuous integration and delivery.
Artificial Intelligence (AI) refers to products and services that enable machines to perform tasks typically requiring human intelligence, such as learning, reasoning, problem-solving, perception, and language understanding. This category includes markets that focus on helping organizations build, deploy, and scale intelligent systems, applying AI across industries through technologies such as machine learning, natural language processing, computer vision, and generative AI.
Gartner defines augmented data quality (ADQ) solutions as a set of capabilities that enhance the data quality experience, aimed at improving insight discovery, next-best-action suggestions and process automation by leveraging AI/machine learning (ML) features, graph analysis and metadata analytics. Each of these technologies can work independently or cooperatively to create network effects that increase automation and effectiveness across a broad range of data quality use cases. These purpose-built solutions include a range of functions such as profiling and monitoring; data transformation; rule discovery and creation; matching, linking and merging; active metadata support; data remediation; and role-based usability.
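To make two of these functions concrete, here is a minimal Python sketch of column profiling and a rule check; the records, field names and the [0, 120] age rule are invented for illustration.

```python
# Illustrative sketch of two ADQ-style capabilities: column profiling and a
# simple rule check. Field names and thresholds are hypothetical.
records = [
    {"id": 1, "email": "a@example.com", "age": 34},
    {"id": 2, "email": None,            "age": 29},
    {"id": 3, "email": "c@example",     "age": 178},  # suspect value
]

def profile(rows, field):
    """Basic profiling: completeness and distinct-value count for one field."""
    values = [r[field] for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "completeness": len(non_null) / len(values),
        "distinct": len(set(non_null)),
    }

# A "discovered" rule a tool might propose after scanning the data:
# plausible ages fall in [0, 120], so values outside are anomalies.
def violations(rows):
    return [r["id"] for r in rows if not 0 <= r["age"] <= 120]

print(profile(records, "email"))  # {'completeness': 0.666..., 'distinct': 2}
print(violations(records))        # [3]
```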
The market for data integration tools consists of stand-alone software products that enable organizations to combine data from multiple sources and perform tasks related to data access, transformation, enrichment and delivery. They enable use cases such as data engineering, delivering modern data architectures, self-service data integration, operational data integration and supporting AI projects. Data management leaders procure data integration tools for their teams, including data engineers and data architects, or for other users, such as business analysts or data scientists. These products are primarily consumed as SaaS or deployed on-premises, in public or private cloud, or in hybrid configurations.
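The sketch below illustrates the basic access-transform-enrich-deliver flow in miniature, with in-memory lists standing in for real source systems and a warehouse target; all names are hypothetical.

```python
# A toy extract-transform-deliver flow of the kind these tools automate at
# scale. The two "sources" are in-memory stand-ins for real systems.
crm_source = [{"cust_id": 7, "name": "acme corp"}]
billing_source = [{"cust_id": 7, "balance_cents": 125000}]

def extract(source):
    # Real tools would connect to databases, files, or APIs here.
    return list(source)

def transform(crm_rows, billing_rows):
    balances = {r["cust_id"]: r["balance_cents"] for r in billing_rows}
    for row in crm_rows:
        yield {
            "cust_id": row["cust_id"],
            "name": row["name"].title(),                       # standardize
            "balance": balances.get(row["cust_id"], 0) / 100,  # enrich
        }

def deliver(rows, target):
    target.extend(rows)  # stand-in for loading a warehouse table

warehouse = []
deliver(transform(extract(crm_source), extract(billing_source)), warehouse)
print(warehouse)  # [{'cust_id': 7, 'name': 'Acme Corp', 'balance': 1250.0}]
```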
Data preparation is an iterative and agile process for finding, combining, cleaning, transforming and sharing curated datasets for various data and analytics use cases, including analytics/business intelligence (BI), data science/machine learning (ML) and self-service data integration. Data preparation tools promise faster time to delivery of integrated and curated data by allowing business users, including analysts, citizen integrators, data engineers and citizen data scientists, to integrate internal and external datasets for their use cases. Furthermore, they allow users to identify anomalies and patterns, and to review and improve the data quality of their findings in a repeatable fashion. Some tools embed ML algorithms that augment and, in some cases, completely automate certain repeatable and mundane data preparation tasks. Reduced time to delivery of data and insight is at the heart of this market.
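A minimal sketch of such a repeatable preparation recipe, assuming invented field names and rules: each step is an ordered, re-runnable function, so the same recipe can be applied whenever the source data refreshes.

```python
# Sketch of a repeatable preparation recipe: an ordered list of steps that
# can be re-applied whenever the source data is refreshed.
def drop_blank_names(rows):
    return [r for r in rows if r.get("name")]

def standardize_country(rows):
    aliases = {"usa": "US", "u.s.": "US", "united states": "US"}
    for r in rows:
        r["country"] = aliases.get(r["country"].strip().lower(), r["country"])
    return rows

def flag_outliers(rows):
    # Mark, rather than delete, suspicious values for human review.
    for r in rows:
        r["suspect"] = not (0 < r["order_total"] < 100_000)
    return rows

RECIPE = [drop_blank_names, standardize_country, flag_outliers]

def prepare(rows):
    for step in RECIPE:
        rows = step(rows)
    return rows

raw = [{"name": "Ada", "country": "USA", "order_total": 250},
       {"name": "",    "country": "US",  "order_total": 90}]
print(prepare(raw))
# [{'name': 'Ada', 'country': 'US', 'order_total': 250, 'suspect': False}]
```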
Gartner defines a data science and machine learning platform as an integrated set of code-based libraries and low-code tooling. These platforms support both independent work by data scientists and collaboration with their business and IT counterparts, with automation and AI assistance through all stages of the data science life cycle, including business understanding, data access and preparation, model creation and sharing of insights. They also support engineering workflows, including the creation of data, feature, deployment and testing pipelines. The platforms are provided via a desktop client or browser with supporting compute instances, or as a fully managed cloud offering.
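As an illustration of the life cycle stages named above, here is a minimal scikit-learn sketch covering preparation, model creation and evaluation as one reproducible pipeline; the synthetic dataset stands in for real business data.

```python
# Minimal sketch of the life cycle stages a DSML platform wraps in tooling:
# data access, preparation, model creation, and evaluation. Requires
# scikit-learn; the synthetic data stands in for a real business dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Preparation and model creation captured as one reproducible pipeline,
# the unit a platform would version, deploy, and monitor.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
```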
Data and Analytics refers to products and services that enable organizations to collect, integrate, analyze, and act on data to drive informed decision-making and business outcomes. This category includes markets that focus on empowering enterprises to manage data pipelines, ensure data quality and governance, and extract insights through advanced analytics and machine learning across structured and unstructured data environments.
A data and analytics (D&A) governance platform is a set of integrated business and technology capabilities that help business leaders and users develop and manage a diverse set of governance policies and enforce those policies across business and data management systems. These platforms are distinct from data management tools in that data management focuses on policy execution, whereas D&A governance platforms are used primarily by business roles, not only or even specifically IT roles, for policy management. D&A leaders who are investing in operationalizing and automating the work of D&A governance should evaluate this market. The work of D&A governance primarily comprises policy setting and policy enforcement, carried out in collaboration with data management (policy execution). Use cases span numerous governance policy categories and multiple business scenarios and asset types (data, KPIs, analytics models). The intersection of use cases/business scenarios, policy categories and assets to be governed is then used to identify the required technology capability. These capabilities may share similar names across policy categories but may not mean the same thing, or may be used differently by different governance personas. For example, data classification in a data security implementation is quite different from data classification for creating trust models, which is based on lineage and curation.
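The policy management/enforcement split can be sketched in a few lines of Python: business-defined policies are declared as data and evaluated against governed assets. The policy names and asset fields below are hypothetical.

```python
# Sketch of the policy-setting/policy-enforcement split described above:
# business users declare policies; systems evaluate assets against them.
policies = [
    {"name": "pii-needs-owner",
     "applies": lambda a: "pii" in a["classifications"],
     "check":   lambda a: a.get("owner") is not None},
    {"name": "kpi-needs-definition",
     "applies": lambda a: a["type"] == "kpi",
     "check":   lambda a: bool(a.get("definition"))},
]

assets = [
    {"name": "customers", "type": "dataset", "classifications": ["pii"],
     "owner": None},
    {"name": "churn_rate", "type": "kpi", "classifications": [],
     "definition": "lost customers / total customers"},
]

def enforce(assets, policies):
    """Return (asset, policy) pairs that violate a policy that applies."""
    return [(a["name"], p["name"])
            for a in assets for p in policies
            if p["applies"](a) and not p["check"](a)]

print(enforce(assets, policies))  # [('customers', 'pii-needs-owner')]
```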
A digital integration hub (DIH) is an architectural pattern that centralizes data from various sources to provide a scalable, real-time layer for modern digital applications; it is especially beneficial for enterprises looking to digitize their sales processes. It aggregates data from multiple systems of record into a low-latency, high-performance data store (the data management layer), which is then accessed by sales force automation (SFA), sales enablement and other tools via APIs or events. It also provides a central layer of abstraction that decouples applications from underlying systems, making it easier to integrate and manage new data sources and applications without disrupting existing systems. A DIH provides sales teams with rich and responsive access to massive data sources, limits the fees paid to API providers and helps enable 24/7 operations, enhancing customer experience through self-service, digital commerce and loyalty.
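Architecturally, the pattern reduces to this: change events from systems of record keep a low-latency store current, and the API layer reads only from that store. The toy sketch below simulates all components in memory.

```python
# Architectural sketch of a DIH: events from systems of record keep a
# low-latency store current, and the API layer reads only from that store,
# decoupling consumers from the back ends. All components are simulated.
data_layer = {}  # stand-in for a high-performance store (cache, NoSQL, etc.)

def on_source_event(system, key, record):
    """Ingest a change event from a system of record into the data layer."""
    merged = data_layer.get(key, {})
    merged.update(record)
    merged["_sources"] = sorted(set(merged.get("_sources", [])) | {system})
    data_layer[key] = merged

def api_get_customer(key):
    """API consumers never touch the systems of record directly."""
    return data_layer.get(key)

on_source_event("crm",     "cust-7", {"name": "Acme Corp"})
on_source_event("billing", "cust-7", {"balance": 1250.0})
print(api_get_customer("cust-7"))
# {'name': 'Acme Corp', '_sources': ['billing', 'crm'], 'balance': 1250.0}
```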
Population health management (PHM) is the approach used to achieve measurable improvements in the health outcomes of a population. In the broadest definition, healthcare provider PHM platforms cover the set of IT capabilities and related services that enable provider organizations to manage patient populations and achieve specific quality, cost and experience goals. The products included in this market provide the crucial capabilities for identifying patients at high risk of poor health, presenting a care plan to address that risk, visualizing the gaps between a patient's current care and the care plan, engaging patients in their health, and tracking the outcomes of care.
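Two of these capabilities, risk identification and care-gap detection, can be sketched as follows; the clinical rules and thresholds are purely illustrative, not real guidelines.

```python
# Sketch of two PHM capabilities: risk identification and care-gap
# detection against a care plan. Clinical rules are illustrative only.
CARE_PLAN = {"diabetes": {"a1c_test", "eye_exam", "foot_exam"}}

patients = [
    {"id": "p1", "conditions": ["diabetes"], "risk_score": 0.82,
     "completed": {"a1c_test"}},
    {"id": "p2", "conditions": [], "risk_score": 0.10, "completed": set()},
]

def high_risk(patients, threshold=0.7):
    return [p["id"] for p in patients if p["risk_score"] >= threshold]

def care_gaps(patient):
    gaps = set()
    for condition in patient["conditions"]:
        gaps |= CARE_PLAN.get(condition, set()) - patient["completed"]
    return gaps

print(high_risk(patients))               # ['p1']
print(sorted(care_gaps(patients[0])))    # ['eye_exam', 'foot_exam']
```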
Gartner is defining a new class of capabilities focused on value-based performance management analytics. This class complements population health analytics but offers deeper capabilities to model, forecast and monitor the performance of risk-bearing and value-based contracts, and to intersect the critical cost and quality variables.
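A toy settlement calculation shows that cost/quality intersection in such a contract: shared savings are earned only if a quality gate is met. All contract terms and figures below are invented.

```python
# Toy calculation of the cost/quality intersection in a value-based
# contract: shared savings are earned only if a quality gate is met.
contract = {
    "budget_per_member": 5000.0,   # agreed annual target cost
    "members": 1000,
    "quality_gate": 0.80,          # minimum composite quality score
    "shared_savings_rate": 0.50,   # provider share of savings
}

def settle(actual_cost_total, quality_score, c=contract):
    budget = c["budget_per_member"] * c["members"]
    savings = budget - actual_cost_total
    earned = savings > 0 and quality_score >= c["quality_gate"]
    return {
        "savings": savings,
        "provider_payout": savings * c["shared_savings_rate"] if earned else 0.0,
    }

print(settle(actual_cost_total=4_700_000, quality_score=0.85))
# {'savings': 300000.0, 'provider_payout': 150000.0}
```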
Gartner defines integration platform as a service (iPaaS) as a vendor-managed cloud service that enables end users to implement integrations between applications, services and data sources, both internal and external to their organization, for at least one of the three main patterns of integration technology use: data consistency, multistep process and composite services. These integration use cases are most commonly implemented via intuitive low-code or no-code developer environments, though some vendors provide more complex developer tooling.
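The "multistep process" pattern can be sketched as a flow of connector steps; the stub functions below stand in for the managed connectors an iPaaS would provide, and all system names are hypothetical.

```python
# Sketch of a "multistep process" integration of the kind built in an
# iPaaS flow designer: trigger, enrichment, and two delivery steps. The
# connector functions are local stubs standing in for managed connectors.
def crm_lookup(customer_id):
    return {"customer_id": customer_id, "segment": "enterprise"}

def erp_create_invoice(order):        # stand-in for an ERP connector
    print(f"ERP invoice for order {order['order_id']}")

def notify_sales(order):              # stand-in for a chat/email connector
    print(f"notify: {order['segment']} order {order['order_id']}")

def order_flow(event):
    """One integration flow: each step is a node on the iPaaS canvas."""
    order = dict(event)                              # 1. trigger payload
    order.update(crm_lookup(order["customer_id"]))   # 2. enrich
    erp_create_invoice(order)                        # 3. deliver to ERP
    notify_sales(order)                              # 4. fan out notification

order_flow({"order_id": "o-42", "customer_id": "c-7", "total": 999.0})
```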
Master data management (MDM) is a technology-enabled business discipline in which business and IT organizations work together to ensure the uniformity, accuracy, stewardship, semantic consistency and accountability of the enterprise's shared master data assets. Organizations use MDM solutions as part of an MDM strategy, which should in turn be part of a wider enterprise information management (EIM) strategy. An MDM strategy potentially encompasses management of multiple master data domains (e.g., customer, citizen, product, "thing," asset, person/party, supplier, location and financial master data). Data and analytics (D&A) leaders procure MDM tools for data engineers or for less technical users, such as data stewards.
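The core mechanics (matching records across sources and merging them into a trusted "golden record" under survivorship rules) can be sketched as follows; the normalization and trust rules are hypothetical simplifications of what real MDM tools configure.

```python
# Sketch of MDM-style matching and survivorship: two source records are
# linked on a normalized key and merged into a "golden record", preferring
# the more trusted source per attribute. Trust rules are hypothetical.
crm   = {"name": "ACME Corp.",       "phone": None,       "source": "crm"}
erp   = {"name": "Acme Corporation", "phone": "555-0100", "source": "erp"}
TRUST = {"name": "erp", "phone": "crm"}  # preferred source per attribute

def match_key(record):
    """Crude semantic reconciliation: normalize the name for linking."""
    return record["name"].lower().replace(".", "").replace("corporation", "corp")

def merge(records):
    golden = {}
    for attr, preferred in TRUST.items():
        ranked = sorted(records, key=lambda r: r["source"] != preferred)
        golden[attr] = next((r[attr] for r in ranked if r[attr] is not None), None)
    return golden

assert match_key(crm) == match_key(erp)   # records link to one entity
print(merge([crm, erp]))  # {'name': 'Acme Corporation', 'phone': '555-0100'}
```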
Master data management (MDM) of customer data solutions are software products that:
- Support the global identification, linking and synchronization of customer information across heterogeneous data sources through semantic reconciliation of master data.
- Create and manage a central, persisted system of record or index of record for customer master data.
- Enable the delivery of a single, trusted customer view to all stakeholders, to support various business initiatives.
- Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
- Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.
Master data management (MDM) of product data solutions are software products that:
- Support the global identification, linking and synchronization of product data across heterogeneous data sources through semantic reconciliation of master data.
- Create and manage a central, persisted system of record or index of record for product master data.
- Enable the delivery of a single, trusted product view to all stakeholders, to support various business initiatives.
- Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
- Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.
Gartner defines metadata management solutions as applications that enable the collection, analysis and orchestration of metadata related to organizational data assets. These solutions provide workflow and operational support to make data easy to find, use and manage. They do this by collating metadata in any form from within their own application and from third-party systems, and by providing the ability to search, analyze and make decisions on the collated results. They also provide transparent cross-referencing across all related metadata, and derive insights about data (such as usage patterns and performance) through analysis of metadata to support a wide range of data-driven initiatives.
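A minimal sketch of collection and search, with in-memory "tables" standing in for the databases, BI tools and pipelines a real solution would harvest:

```python
# Sketch of metadata collection and search: harvest column metadata from
# in-memory "tables", then query the collated catalog.
tables = {
    "orders":    [{"order_id": 1, "cust_email": "a@example.com"}],
    "customers": [{"cust_id": 7,  "cust_email": "a@example.com"}],
}

def harvest(tables):
    """Collect technical metadata: table, column, and inferred type."""
    catalog = []
    for table, rows in tables.items():
        for column, value in rows[0].items():
            catalog.append({"table": table, "column": column,
                            "type": type(value).__name__})
    return catalog

def search(catalog, term):
    return [m for m in catalog if term in m["column"]]

catalog = harvest(tables)
print(search(catalog, "email"))
# [{'table': 'orders', 'column': 'cust_email', 'type': 'str'},
#  {'table': 'customers', 'column': 'cust_email', 'type': 'str'}]
```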
To reduce both infrastructure costs and manual workloads in postmodern ERP projects, SAP application leaders and SAP Basis operations leaders should evaluate specialized software tools for automating the regular refresh of their SAP ERP test data. SAP selective test data management tools perform selective copying of SAP test data, but they vary in their approach to data selection, scrambling and performance optimization. There are two user constituencies for these tools: (1) Basis operations teams, which require repetitive data copy operations that are as automated as possible, and (2) SAP application teams, which need to select application data objects for ad hoc data copying. Some of the tools also enable Basis operations teams to produce a 'shell system': an identical copy of a complete production system, but without the transaction data. This is very useful for testing purposes in many projects.
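As a toy illustration of selective copying with scrambling (not how any specific SAP tool works internally), the sketch below copies only a recent slice of "production" data and deterministically masks sensitive fields on the way into the test system; all field names are invented.

```python
# Toy illustration of selective copy with scrambling: take only a recent
# slice of "production" data and mask sensitive fields on the way into the
# test system. Field names are invented; real tools work on SAP objects.
import hashlib

production = [
    {"doc": 1, "year": 2023, "customer": "Acme",
     "iban": "DE89370400440532013000"},
    {"doc": 2, "year": 2025, "customer": "Umbrella",
     "iban": "FR1420041010050500013M02606"},
]

def scramble(value):
    """Deterministic masking so related records stay consistent."""
    return hashlib.sha256(value.encode()).hexdigest()[:10]

def selective_copy(rows, min_year):
    for row in rows:
        if row["year"] >= min_year:        # selection criterion
            copy = dict(row)
            copy["customer"] = scramble(copy["customer"])
            copy["iban"] = scramble(copy["iban"])
            yield copy

test_system = list(selective_copy(production, min_year=2024))
print(test_system)  # only doc 2, with masked customer and IBAN
```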