Active metadata management is a set of capabilities that enables continuous access and processing of metadata to support ongoing analysis across a spectrum of maturity levels, use cases and vendor solutions. Active metadata outputs range from design recommendations based on execution results, through reports of runtime steps, to indicators of the business outcomes achieved. The recommendations resulting from these analytics are issued as design inputs to humans or as system-level instructions that are expected to elicit a response.
Application integration platforms enable independently designed applications, apps and services to work together. Key capabilities of application integration technologies include:
• Communication functionality that reliably moves messages/data among endpoints.
• Support for fundamental web and web services standards.
• Functionality that dynamically binds consumer and provider endpoints.
• Message validation, mapping, transformation and enrichment.
• Orchestration.
• Support for multiple interaction patterns, content-based routing and typed messages.
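As a rough illustration of two of the capabilities listed above, content-based routing and message enrichment, the sketch below wires a couple of hypothetical endpoints together; the message schema, endpoint names and default enrichment value are assumptions made for the example, not features of any particular platform.

```python
# A minimal sketch of content-based routing plus validation and enrichment.
# Endpoint names and message fields are illustrative assumptions.
import json
from typing import Callable

# Registry mapping message types to consumer endpoints (illustrative).
ROUTES: dict[str, Callable[[dict], None]] = {}

def endpoint(message_type: str):
    """Register a handler as the consumer endpoint for a message type."""
    def register(handler: Callable[[dict], None]):
        ROUTES[message_type] = handler
        return handler
    return register

@endpoint("order.created")
def order_service(message: dict) -> None:
    print("order service received:", message)

@endpoint("invoice.issued")
def billing_service(message: dict) -> None:
    print("billing service received:", message)

def integrate(raw: str) -> None:
    """Validate, enrich and route an inbound message by its content."""
    message = json.loads(raw)                 # validation/parsing
    if "type" not in message:
        raise ValueError("message has no routable type")
    message.setdefault("source", "b2c-web")   # enrichment (assumed default)
    ROUTES[message["type"]](message)          # content-based routing

integrate('{"type": "order.created", "order_id": 42}')
```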
Gartner defines augmented data quality (ADQ) solutions as a set of capabilities for an enhanced data quality experience aimed at improving insight discovery, next-best-action suggestions and process automation by leveraging AI/machine learning (ML) features, graph analysis and metadata analytics. Each of these technologies can work independently or cooperatively to create network effects that increase automation and effectiveness across a broad range of data quality use cases. These purpose-built solutions include a range of functions such as profiling and monitoring; data transformation; rule discovery and creation; matching, linking and merging; active metadata support; data remediation; and role-based usability. These packaged solutions help implement and support the practice of data quality assurance, mostly embedded as part of a broader data and analytics (D&A) strategy. Existing and emerging use cases include:
1. Analytics, artificial intelligence and machine learning development
2. Data engineering
3. D&A governance
4. Master data management
5. Operational/transactional data quality
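The profiling and rule-discovery functions mentioned above can be pictured with a small, hedged sketch: the dataset, thresholds and suggested rules below are illustrative assumptions, not how any ADQ product implements these features.

```python
# A minimal sketch of column profiling and heuristic rule discovery.
from collections import Counter

rows = [
    {"id": 1, "email": "a@example.com", "country": "US"},
    {"id": 2, "email": None,            "country": "US"},
    {"id": 3, "email": "c@example.com", "country": "us"},
]

def profile(rows, column):
    """Return basic quality metrics for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "null_rate": 1 - len(non_null) / len(values),
        "distinct": len(set(non_null)),
        "top_values": Counter(non_null).most_common(3),
    }

def suggest_rules(rows, column, null_threshold=0.0):
    """Suggest candidate data quality rules from the profile (heuristic)."""
    p = profile(rows, column)
    rules = []
    if p["null_rate"] > null_threshold:
        rules.append(f"{column} should not be null")
    if p["distinct"] != len({str(v).lower() for v, _ in p["top_values"]}):
        rules.append(f"{column} values should use a consistent case")
    return rules

for col in ("email", "country"):
    print(col, profile(rows, col), suggest_rules(rows, col))
```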
B2B gateway software (BGS) is integration middleware that supports information exchange between your organization and its ecosystem of trading partners, applications and endpoints. BGS consolidates and centralizes data and process integration and interoperability between a company's internal applications and external endpoints, such as business partners, SaaS applications or ecosystems. The BGS market is a composite market that includes pure-play BGS solutions and BGS embedded in or combined with other IT solutions (for example, ESB suites that support BGS features as services connected to the suite, integration brokerage services, e-invoicing software and networks, application platform suites, electronic data interchange [EDI] translators, and managed file transfer [MFT] technology).
Gartner defines customer data platforms (CDPs) as software applications that support marketing and customer experience use cases by unifying a company’s customer data from marketing and other channels. CDPs optimize the timing and targeting of messages, offers and customer engagement activities, and enable the analysis of individual-level customer behavior over time. The purpose of a CDP is to centralize data collection and unify customer data from disparate sources into profiles. CDPs enable marketers to create and manage segments and push those segments to priority channels without requiring coding or use of advanced querying techniques. While CDPs originated to serve marketing use cases, interest from data management roles, IT and other customer-facing roles (e.g. sales, service and support) is on the rise. Digital marketing leaders have long used a variety of systems to design, orchestrate and measure multichannel campaigns. While many of those systems also manage customer-level data and audiences for targeting, they do so in a way that makes both data governance and orchestration across channels (and across competitive vendor solutions) a challenge. CDPs aim to address that challenge by collecting and unifying disparate customer data in a centralized location accessible to marketers. The CDP is not a substitute for an enterprise’s master data management, but it can ensure that customer profile data, transactional events and analytic attributes are available to marketing when needed for real-time interactions.
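A minimal sketch of the unification step a CDP performs might look like the following; matching purely on a shared email key and last-write-wins merging are simplifying assumptions, since real CDPs use richer identity resolution and survivorship logic.

```python
# A minimal sketch of unifying records from several channels into one profile.
# The source data and the email match key are illustrative assumptions.
from collections import defaultdict

web_events = [{"email": "pat@example.com", "page": "/pricing"}]
crm_contacts = [{"email": "pat@example.com", "name": "Pat Lee", "tier": "gold"}]
support_tickets = [{"email": "pat@example.com", "open_tickets": 1}]

profiles: dict[str, dict] = defaultdict(dict)

def ingest(source_name: str, records: list[dict]) -> None:
    """Fold one source's records into the unified profile store."""
    for record in records:
        key = record["email"].lower()              # shared identifier (assumed)
        profile = profiles[key]
        profile.setdefault("sources", set()).add(source_name)
        for field, value in record.items():
            if field != "email":
                profile[field] = value             # last-write-wins merge (simplistic)

for name, records in [("web", web_events), ("crm", crm_contacts), ("support", support_tickets)]:
    ingest(name, records)

print(profiles["pat@example.com"])
```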
The data integration tools market comprises stand-alone software products that allow organizations to combine data from multiple sources, including performing tasks related to data access, transformation, enrichment and delivery. Data integration tools enable use cases such as data engineering, operational data integration, delivering modern data architectures, and enabling less-technical data integration. Data integration tools are procured by data and analytics (D&A) leaders and their teams for use by data engineers or less-technical users, such as business analysts or data scientists. These products are consumed as SaaS or deployed on-premises, in public or private cloud, or in hybrid configurations.
Data masking (DM) is based on the premise that sensitive data can be transformed into less sensitive but still useful data. This is necessary to satisfy application testing use cases that require representative and coherent data, as well as analytics use cases that involve aggregate data for scoring, model building and statistical reporting. The market for data protection, DM included, continues to evolve with technologies designed to redact, anonymize, pseudonymize or otherwise deidentify data in order to protect it against confidentiality or privacy risk.
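The premise that sensitive values can be transformed into less sensitive but still useful ones can be illustrated with deterministic pseudonymization, sketched below under the assumption of a per-environment salt and hypothetical field names; consistent tokens keep joins and aggregates usable on the masked copy.

```python
# A minimal pseudonymization sketch: sensitive values become stable tokens,
# so the masked data stays coherent for testing and aggregate analytics.
import hashlib

SALT = "rotate-me-per-environment"  # assumed secret, illustration only

def pseudonymize(value: str, prefix: str) -> str:
    """Replace a sensitive value with a stable, non-reversible token."""
    digest = hashlib.sha256((SALT + value).encode()).hexdigest()[:10]
    return f"{prefix}_{digest}"

production_rows = [
    {"customer": "Pat Lee", "email": "pat@example.com", "amount": 120.0},
    {"customer": "Pat Lee", "email": "pat@example.com", "amount": 80.0},
]

masked_rows = [
    {
        "customer": pseudonymize(r["customer"], "cust"),
        "email": pseudonymize(r["email"], "mail"),
        "amount": r["amount"],          # non-sensitive measure kept as-is
    }
    for r in production_rows
]

print(masked_rows)  # the same customer maps to the same token in both rows
```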
Data preparation is an iterative and agile process for finding, combining, cleaning, transforming and sharing curated datasets for various data and analytics use cases, including analytics/business intelligence (BI), data science/machine learning (ML) and self-service data integration. Data preparation tools promise faster time to delivery of integrated and curated data by allowing business users, including analysts, citizen integrators, data engineers and citizen data scientists, to integrate internal and external datasets for their use cases. Furthermore, they allow users to identify anomalies and patterns and to improve and review the data quality of their findings in a repeatable fashion. Some tools embed ML algorithms that augment and, in some cases, completely automate certain repeatable and mundane data preparation tasks. Reduced time to delivery of data and insight is at the heart of this market.
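A small, hedged example of a repeatable prepare step (combine, clean, transform) follows; pandas and the column names are assumptions chosen for illustration rather than anything implied by the market definition.

```python
# A minimal combine/clean/transform sketch using pandas (assumed library).
import pandas as pd

internal = pd.DataFrame({"customer_id": [1, 2, 2], "revenue": [100.0, None, 250.0]})
external = pd.DataFrame({"customer_id": [1, 2], "segment": ["smb", "enterprise"]})

def prepare(internal: pd.DataFrame, external: pd.DataFrame) -> pd.DataFrame:
    """Combine internal and external data, then clean and enrich it."""
    combined = internal.drop_duplicates(subset=["customer_id"], keep="last")
    combined = combined.merge(external, on="customer_id", how="left")   # combine
    combined["revenue"] = combined["revenue"].fillna(0.0)               # clean
    combined["revenue_band"] = pd.cut(                                  # transform
        combined["revenue"], bins=[0, 150, float("inf")],
        labels=["low", "high"], include_lowest=True,
    )
    return combined

print(prepare(internal, external))
```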
Data virtualization technology is based on the execution of distributed data management processing, primarily for queries, against multiple heterogeneous data sources, and federation of query results into virtual views. This is followed by the consumption of these virtual views by applications, query/reporting tools, message-oriented middleware or other data management infrastructure components. Data virtualization can be used to create virtualized and integrated views of data in-memory, rather than executing data movement and physically storing integrated views in a target data structure. It provides a layer of abstraction above the physical implementation of data, to simplify querying logic.
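The query-time federation described above can be sketched roughly as follows; the in-memory SQLite source, the API-style source and the join key are all assumptions for illustration. The point is that the virtual view is resolved lazily and no integrated copy is persisted.

```python
# A minimal sketch of a virtual view federated across two heterogeneous sources.
import sqlite3

# Source 1: a relational store (in-memory SQLite standing in for an RDBMS).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer_id INTEGER, total REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 120.0), (2, 75.0)])

# Source 2: an API-style source returning dictionaries.
def crm_api():
    return [{"customer_id": 1, "name": "Pat"}, {"customer_id": 2, "name": "Sam"}]

def customer_orders_view():
    """Resolve the virtual view lazily: query both sources, join in memory."""
    names = {row["customer_id"]: row["name"] for row in crm_api()}
    cursor = db.execute("SELECT customer_id, SUM(total) FROM orders GROUP BY customer_id")
    return [
        {"customer": names.get(customer_id), "order_total": total}
        for customer_id, total in cursor
    ]

# Consumers query the abstraction; no integrated copy is stored anywhere.
print(customer_orders_view())
```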
A D&A governance platform is a set of integrated business capabilities that helps business leaders and users evaluate and implement a diverse set of governance policies and monitor and enforce those policies across their organizations’ business systems. These platforms differ from data management and discrete governance tools in that those tools focus on policy execution, whereas these platforms are used primarily by business roles, not only or even specifically by IT roles.
Test data management for DevOps is the process of providing DevOps teams with test data to evaluate the performance and functionality of applications. This process typically includes copying production data, anonymization or masking and, sometimes, virtualization. In some cases, specialized techniques, such as synthetic data generation, are appropriate. It also applies data masking techniques to protect sensitive data, including PII, PHI, PCI data and other corporate confidential information, from fraud and unauthorized access while preserving contextual meaning.
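Of the specialized techniques named above, synthetic data generation is perhaps the easiest to sketch; the field names, value pools and the reserved .invalid email domain below are illustrative assumptions, not a prescription.

```python
# A minimal synthetic test data sketch: fabricated records that mimic the
# shape of production data without containing real PII.
import random
import string

random.seed(7)  # deterministic test fixtures

def synthetic_customer(i: int) -> dict:
    """Generate one synthetic customer record with realistic-looking fields."""
    name = "".join(random.choices(string.ascii_lowercase, k=6)).title()
    return {
        "customer_id": 10_000 + i,
        "name": name,
        "email": f"{name.lower()}@test.invalid",   # reserved TLD, never a real address
        "card_last4": f"{random.randint(0, 9999):04d}",
        "balance": round(random.uniform(0, 5_000), 2),
    }

test_fixture = [synthetic_customer(i) for i in range(3)]
for row in test_fixture:
    print(row)
```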
The market for event stream processing (ESP) platforms consists of software subsystems that perform real-time computation on streaming event data. They execute calculations on unbounded input data continuously as it arrives, enabling immediate responses to current situations and/or storing results in files, object stores or other databases for later use. Examples of input data include clickstreams; copies of business transactions or database updates; social media posts; market data feeds; images; and sensor data from physical assets, such as mobile devices, machines and vehicles.
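A rough sketch of the continuous, per-event computation described above is shown below; the sensor feed, window size and rolling-mean aggregate are assumptions for illustration only.

```python
# A minimal sketch of continuous computation over an unbounded stream:
# every arriving event immediately produces an updated sliding-window result.
from collections import deque
from typing import Iterable, Iterator

def sensor_feed() -> Iterator[float]:
    """Stand-in for an unbounded stream of sensor readings."""
    for reading in [20.1, 20.4, 25.9, 26.3, 21.0]:
        yield reading

def sliding_average(events: Iterable[float], window: int = 3) -> Iterator[float]:
    """Emit the rolling mean after every incoming event."""
    recent: deque[float] = deque(maxlen=window)
    for event in events:
        recent.append(event)
        yield sum(recent) / len(recent)     # immediate result per event

for i, value in enumerate(sliding_average(sensor_feed()), start=1):
    print(f"after event {i}: rolling mean = {value:.2f}")
```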
Gartner defines integration platform as a service (iPaaS) as a vendor-managed cloud service that enables end users to implement integrations between a variety of applications, services and data sources, both internal and external to their organization. iPaaS enables end users of the platform to integrate a variety of internal and external applications, services and data sources for at least one of the three main uses of integration technology:
• Data consistency: The ability to monitor for or be notified by applications, services and data sources about changes, and to propagate those changes to the appropriate applications and data destinations (for example, “synchronize customer data” or “ingest into data lake”).
• Multistep process: The ability to implement multistep processes between applications, services and data sources (for example, to “onboard employee” or “process insurance claim”).
• Composite service: The ability to create composite services exposed as APIs or events and composed from existing applications, services and data sources (for example, to create a “credit check” service or a “generate fraud score” service).
These integration processes, data pipelines, workflows, automations and composite services are most commonly created via intuitive low-code or no-code developer environments, though some vendors provide more-complex developer tooling.
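The data consistency use listed above can be sketched as a simple change-propagation flow; the source and destination stores and field names are assumptions, and a real iPaaS would express this flow in a low-code designer rather than hand-written code.

```python
# A minimal "synchronize customer data" sketch: detect changes in a source
# application and propagate them to a destination.
source_crm = {"42": {"name": "Pat Lee", "email": "pat@example.com"}}
destination_datalake: dict[str, dict] = {}

def sync_customers(source: dict, destination: dict) -> list[str]:
    """Propagate new or changed customer records from source to destination."""
    changed = []
    for customer_id, record in source.items():
        if destination.get(customer_id) != record:
            destination[customer_id] = dict(record)   # upsert the change
            changed.append(customer_id)
    return changed

print("synced:", sync_customers(source_crm, destination_datalake))
print("already consistent:", sync_customers(source_crm, destination_datalake))
```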
Master data management (MDM) is a technology-enabled business discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, governance, semantic consistency and accountability of an enterprise’s official shared master data assets. Master data is the consistent and uniform set of identifiers and attributes that uniquely describes the core entities of the enterprise and is used across multiple business processes.
Master data management (MDM) of customer data solutions are software products that:
• Support the global identification, linking and synchronization of customer information across heterogeneous data sources through semantic reconciliation of master data.
• Create and manage a central, persisted system of record or index of record for customer master data.
• Enable the delivery of a single, trusted customer view to all stakeholders, to support various business initiatives.
• Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
• Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.
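The identification, linking and merging capabilities listed above can be pictured with the sketch below; the email-based match rule and the "newest non-null value wins" survivorship rule are illustrative assumptions, not a standard MDM algorithm.

```python
# A minimal sketch of linking customer records from two systems and merging
# them into a single golden record.
from datetime import date

records = [
    {"source": "erp", "name": "Patricia Lee", "email": "Pat@Example.com ",
     "phone": None, "updated": date(2024, 1, 5)},
    {"source": "crm", "name": "Pat Lee", "email": "pat@example.com",
     "phone": "+1-555-0100", "updated": date(2024, 6, 1)},
]

def match_key(record: dict) -> str:
    """Semantic reconciliation stand-in: normalize the identifying attribute."""
    return record["email"].strip().lower()

def merge(group: list[dict]) -> dict:
    """Build a golden record: the newest non-null value survives per attribute."""
    golden: dict = {"linked_sources": sorted(r["source"] for r in group)}
    for field in ("name", "email", "phone"):
        candidates = [r for r in group if r[field] is not None]
        newest = max(candidates, key=lambda r: r["updated"])
        value = newest[field]
        golden[field] = value.strip() if isinstance(value, str) else value
    return golden

groups: dict[str, list[dict]] = {}
for record in records:
    groups.setdefault(match_key(record), []).append(record)

for key, group in groups.items():
    print(key, "->", merge(group))
```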
Master data management (MDM) of product data solutions are software products that:
• Support the global identification, linking and synchronization of product data across heterogeneous data sources through semantic reconciliation of master data.
• Create and manage a central, persisted system of record or index of record for product master data.
• Enable the delivery of a single, trusted product view to all stakeholders, to support various business initiatives.
• Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
• Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.
To reduce both infrastructure costs and manual workloads in postmodern ERP projects, SAP application leaders and SAP Basis operations leaders should evaluate specialized software tools for automating the regular refresh of their SAP ERP test data. SAP selective test data management tools perform selective copying of SAP test data, but they vary in their approach to data selection, scrambling and performance optimization. There are two user constituencies for these tools: (1) Basis operations teams, which require repetitive data copy operations that are as automated as possible, and (2) SAP application teams, which require ad hoc copying of application data objects. Some of the tools also enable Basis operations teams to produce a 'shell system,' which is an identical copy of a complete production system without the transaction data; this is useful for testing purposes in many projects.
The structured data archiving (SDA) and application retirement market comprises an array of technology solutions that manage the life cycle of application-generated data and accommodate corporate and regulatory compliance requirements. Application-generated data includes databases and related unstructured data. SDA solutions focus on improving the storage efficiency of data generated by on-premises and cloud-based applications and on orchestrating the retirement of legacy application data and its infrastructure. The SDA market includes solutions that can be deployed on-premises or on private and public infrastructure, and it includes managed services offerings such as SaaS or PaaS.
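At its core, the archiving move described above relocates aged rows from an active table to a retained, queryable archive; the sketch below assumes SQLite, hypothetical table names and a two-year retention threshold purely for illustration.

```python
# A minimal sketch of structured data archiving: copy aged rows to an archive
# table in one transaction, then remove them from the active table.
import sqlite3
from datetime import date, timedelta

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER, order_date TEXT, total REAL)")
db.execute("CREATE TABLE orders_archive (id INTEGER, order_date TEXT, total REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?, ?)",
               [(1, "2019-03-01", 50.0), (2, str(date.today()), 75.0)])

def archive_orders(cutoff: str) -> int:
    """Move aged records out of the active application table."""
    with db:  # single transaction: copy, then delete
        (to_move,) = db.execute(
            "SELECT COUNT(*) FROM orders WHERE order_date < ?", (cutoff,)).fetchone()
        db.execute(
            "INSERT INTO orders_archive SELECT * FROM orders WHERE order_date < ?",
            (cutoff,))
        db.execute("DELETE FROM orders WHERE order_date < ?", (cutoff,))
    return to_move

cutoff = str(date.today() - timedelta(days=2 * 365))   # assumed retention policy
print("rows archived:", archive_orders(cutoff))
print("active rows left:", db.execute("SELECT COUNT(*) FROM orders").fetchone()[0])
```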