Active metadata management is a set of capabilities that enables continuous access and processing of metadata in support of ongoing analysis across a diverse spectrum of maturity levels, use cases and vendor solutions. Active metadata outputs range from design recommendations based on execution results, to reports of runtime steps, to indicators of business outcomes achieved. The recommendations resulting from those analytics are issued either as design inputs to humans or as system-level instructions that are expected to elicit a response.
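To make that feedback loop concrete, here is a minimal sketch, assuming a hypothetical TableStats record harvested from query logs and a recommend function that turns runtime metadata into a design recommendation; none of this reflects any particular vendor's API.

```python
# Minimal, illustrative active-metadata feedback loop.
# TableStats and recommend() are hypothetical names for this sketch.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TableStats:
    """Runtime metadata harvested from query execution logs (hypothetical)."""
    table: str
    scans_per_day: int
    avg_rows_scanned: int
    avg_rows_returned: int

def recommend(stats: TableStats) -> Optional[str]:
    """Turn observed runtime metadata into a design recommendation."""
    selectivity = stats.avg_rows_returned / max(stats.avg_rows_scanned, 1)
    if stats.scans_per_day > 100 and selectivity < 0.01:
        # Frequently scanned but highly selective: suggest a physical redesign.
        return f"Consider indexing or partitioning '{stats.table}'."
    return None

stats = TableStats("orders", scans_per_day=500,
                   avg_rows_scanned=2_000_000, avg_rows_returned=4_000)
print(recommend(stats))  # -> Consider indexing or partitioning 'orders'.
```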
Analytics and business intelligence (ABI) platforms — enabled by IT and augmented by AI — empower users to model, analyze and share data, and so enable organizations to understand their data. For example, what are the dimensions of their data, such as product, customer, time and geography? People need to be able to ask questions about their data (e.g., which customers are likely to churn? Which salespeople are not reaching their quotas?). They need to be able to create measures from their data, such as on-time delivery, accidents in the workplace, and customer or employee satisfaction. Organizations need to blend modeled and nonmodeled data to create new data pipelines that can be explored to find anomalies and other insights. ABI platforms make all of this possible.
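As a hedged illustration of dimensions and measures, the following sketch computes an on-time delivery rate sliced by a product dimension using pandas; the dataset and column names are invented for the example.

```python
# Illustrative only: computing a measure (on-time delivery rate) across
# a dimension (product) the way an ABI platform models data.
import pandas as pd

deliveries = pd.DataFrame({
    "product": ["A", "A", "B", "B", "B"],
    "region":  ["EU", "US", "EU", "EU", "US"],
    "on_time": [True, False, True, True, False],
})

# Measure: share of on-time deliveries, sliced by the product dimension.
measure = deliveries.groupby("product")["on_time"].mean().rename("on_time_rate")
print(measure)
```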
Gartner defines augmented data quality (ADQ) solutions as a set of capabilities for enhanced data quality experience aimed at improving insight discovery, next-best-action suggestions and process automation by leveraging AI/machine learning (ML) features, graph analysis and metadata analytics. Each of these technologies can work independently, or cooperatively, to create network effects that can be used to increase automation and effectiveness across a broad range of data quality use cases. These purpose-built solutions include a range of functions such as profiling and monitoring; data transformation; rule discovery and creation; matching, linking and merging; active metadata support; data remediation and role-based usability. These packaged solutions help implement and support the practice of data quality assurance, mostly embedded as part of a broader data and analytics (D&A) strategy. Various existing and upcoming use cases include:
1. Analytics, artificial intelligence and machine learning development
2. Data engineering
3. D&A governance
4. Master data management
5. Operational/transactional data quality
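The functions below are a minimal illustration of two of those ADQ capabilities, profiling and rule discovery; the function names, the regular-expression rule and the confidence threshold are all assumptions made for the sketch, not any vendor's interface.

```python
# Toy column profiling and format-rule discovery in the spirit of ADQ tooling.
import re

def profile_column(values):
    """Compute simple quality metrics for one column."""
    total = len(values)
    nulls = sum(1 for v in values if v is None or v == "")
    distinct = len(set(values) - {None, ""})
    return {"rows": total, "null_rate": nulls / total, "distinct": distinct}

def discover_format_rule(values, pattern=r"^[A-Z]{2}-\d{4}$", threshold=0.8):
    """Propose a format rule if most non-null values already match it."""
    non_null = [v for v in values if v]
    hits = sum(1 for v in non_null if re.match(pattern, v))
    return pattern if non_null and hits / len(non_null) >= threshold else None

order_ids = ["DE-1001", "DE-1002", "US-2001", "bad-id", None, "FR-3001"]
print(profile_column(order_ids))        # one null, five distinct values
print(discover_format_rule(order_ids))  # rule holds for 4 of 5 non-null values
```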
Gartner defines data integration as the discipline comprising the architectural patterns, methodologies and tools that allow organizations to achieve consistent access and delivery of data across a wide spectrum of data sources and data types to meet the data consumption requirements of business applications and end users. Data integration tools enable organizations to access, integrate, transform, process and move data that spans various endpoints and across any infrastructure to support their data integration use cases. The market for data integration tools includes vendors that offer a stand-alone software product (or products) to enable the construction and implementation of data access and data delivery infrastructure for a variety of data integration use cases.
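As an illustration of the access-transform-deliver pattern these tools implement, here is a toy pipeline using only the Python standard library; the source data, the currency-normalization step and the target table are invented for the sketch.

```python
# Toy extract-transform-load flow across two "endpoints": a CSV source
# and a SQLite destination. All schema and data are invented.
import csv
import io
import sqlite3

SOURCE_CSV = "id,amount,currency\n1,10.50,EUR\n2,8.00,USD\n"
FX_TO_EUR = {"EUR": 1.0, "USD": 0.92}  # assumed static rates for the demo

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Normalize all amounts to a single currency before delivery.
    return [(int(r["id"]), float(r["amount"]) * FX_TO_EUR[r["currency"]])
            for r in rows]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS payments (id INTEGER, amount_eur REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
print(conn.execute("SELECT * FROM payments").fetchall())
```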
Data preparation is an iterative and agile process for finding, combining, cleaning, transforming and sharing curated datasets for various data and analytics use cases including analytics/business intelligence (BI), data science/machine learning (ML) and self-service data integration. Data preparation tools promise faster time to delivery of integrated and curated data by allowing business users including analysts, citizen integrators, data engineers and citizen data scientists to integrate internal and external datasets for their use cases. Furthermore, they allow users to identify anomalies and patterns and improve and review the data quality of their findings in a repeatable fashion. Some tools embed ML algorithms that augment and, in some cases, completely automate certain repeatable and mundane data preparation tasks. Reduced time to delivery of data and insight is at the heart of this market.
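The following sketch shows three typical preparation steps (standardization, deduplication and anomaly flagging) on an invented dataset using pandas; it is illustrative only, not a representation of any specific tool.

```python
# Illustrative data preparation: clean, dedupe, and flag anomalies.
import pandas as pd

raw = pd.DataFrame({
    "customer": ["Acme ", "acme", "Globex", "Globex", "Initech"],
    "revenue":  [1200.0, 1200.0, 950.0, 950.0, -50.0],
})

prepared = (
    raw.assign(customer=raw["customer"].str.strip().str.title())  # standardize
       .drop_duplicates()                                         # dedupe
)
# Flag a simple anomaly: revenue should not be negative.
prepared["anomaly"] = prepared["revenue"] < 0
print(prepared)
```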
Gartner defines a data science and machine learning platform as an integrated set of code-based libraries and low-code tooling that support the independent use by, and collaboration between, data scientists and their business and IT counterparts through all stages of the data science life cycle. These stages include business understanding, data access and preparation, experimentation and model creation, and sharing of insights. They also support machine learning engineering workflows including creation of data, feature, deployment and testing pipelines. The platforms are provided via desktop client or browser with supporting compute instances and/or as a fully managed cloud offering. Data science and machine learning (DSML) platforms are designed to allow a broad range of users to develop and apply a comprehensive set of predictive and prescriptive analytical techniques. Leveraging data from distributed sources, cutting-edge user experience, and native machine learning and generative AI (GenAI) capabilities, these platforms help to augment and automate decision making across an enterprise. They provide a range of proprietary and open-source tools to enable data scientists and domain experts to find patterns in data that can be used to forecast financial metrics, understand customer behavior, predict supply and demand, and many other use cases. Models can be built on all types of data, including tabular, images, video and text for applications that require computer vision or natural language processing.
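As a small, hedged example of the experimentation and model creation stage, the sketch below trains a churn-style classifier with scikit-learn on synthetic data; the features and labels are generated purely for illustration.

```python
# Compact sketch of the model-building stage of the data science life cycle.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))  # e.g., tenure, usage, support tickets (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.2f}")
```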
A D&A governance platform is a set of integrated business capabilities that helps business leaders and users evaluate and implement a diverse set of governance policies and monitor and enforce those policies across their organizations’ business systems. These platforms differ from data management and discrete governance tools in that those tools focus on policy execution, whereas these platforms are used primarily by business roles, not only or even specifically IT roles.
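A minimal sketch of what policy definition and monitoring could look like appears below; the Policy structure, the roles and the rules are assumptions made for illustration, not any platform's schema.

```python
# Toy governance policies owned by business roles, checked against a record.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Policy:
    name: str
    owner: str                     # a business role, not an IT role
    check: Callable[[dict], bool]  # returns True when the record complies

policies = [
    Policy("retention", "records_manager",
           check=lambda r: r.get("age_days", 0) <= 365),
    Policy("residency", "privacy_officer",
           check=lambda r: r.get("region") == "EU"),
]

record = {"age_days": 400, "region": "EU"}
violations = [p.name for p in policies if not p.check(record)]
print(violations)  # -> ['retention']
```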
A digital integration hub (DIH) is an architectural pattern that centralizes data from various sources to provide a scalable, real-time layer for modern digital applications; it is especially beneficial for enterprises looking to digitize their sales processes. It aggregates data from multiple systems of record into a low-latency, high-performance data store (the data management layer), which is then accessed by sales force automation (SFA), sales enablement and other tools via APIs or events. It also provides a central layer of abstraction that decouples applications from underlying systems, making it easier to integrate and manage new data sources and applications without disrupting existing systems. A DIH provides sales teams with rich and responsive access to massive data sources, limits the fees paid to API providers, and helps enable 24/7 operations, enhancing the customer experience through self-service, digital commerce and loyalty.
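The toy example below illustrates the pattern: a hub store is refreshed from stubbed systems of record, and the application reads only from the hub; all names and data are invented for the sketch.

```python
# Toy digital integration hub: backends feed a low-latency store,
# and application reads never touch the backends directly.

# Pretend systems of record (normally reached via slow or metered APIs).
CRM    = {"cust-1": {"name": "Acme"}}
ORDERS = {"cust-1": [{"id": "o-9", "total": 120.0}]}

hub_store = {}  # the high-performance data management layer

def refresh(customer_id):
    """Aggregate data from the systems of record into the hub store."""
    hub_store[customer_id] = {
        **CRM[customer_id],
        "orders": ORDERS.get(customer_id, []),
    }

def get_customer_view(customer_id):
    """Application-facing read: served entirely from the hub store."""
    return hub_store[customer_id]

refresh("cust-1")
print(get_customer_view("cust-1"))
```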
Providers included in this market offer capabilities that improve the deployment and operation of Hadoop environments. Vendors offer unique capabilities across areas such as performance optimization, flexible and efficient infrastructure consumption, backup and disaster recovery, and workload monitoring to help I&O leaders meet internal business SLAs. They typically support (and are often certified resellers for) multiple commercial Hadoop distributions.
Population health management (PHM) is the approach used to achieve measurable improvements in the health outcomes of a population. In the broadest definition, healthcare provider population health management platforms cover the set of IT capabilities and related services that enable provider organizations to manage populations of patients and achieve specific quality, cost and experience goals. The products included in this market provide the crucial capabilities for identifying patients at high risk of poor health, presenting a care plan to address that risk, visualizing the gaps between a patient's current care and the care plan, engaging patients in their health, and tracking the outcomes of care.
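As a hedged illustration of care-gap identification, the sketch below flags patients who are overdue for a service defined in a care plan; the rule, fields and dates are invented for the example.

```python
# Toy care-gap check: compare each patient against an assumed care plan.
from datetime import date

CARE_PLAN = {"a1c_test": 180}  # service required at least every 180 days (assumed)

patients = [
    {"id": "p1", "last_a1c_test": date(2024, 1, 10)},
    {"id": "p2", "last_a1c_test": date(2024, 11, 2)},
]

def care_gaps(patient, today=date(2024, 12, 1)):
    """Return the services for which the patient is overdue."""
    gaps = []
    for service, max_days in CARE_PLAN.items():
        last = patient.get(f"last_{service}")
        if last is None or (today - last).days > max_days:
            gaps.append(service)
    return gaps

for p in patients:
    print(p["id"], care_gaps(p))  # p1 is overdue; p2 is within the window
```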
Gartner is defining a new class of capabilities focused on value-based performance management analytics. These capabilities complement population health analytics, but go deeper in their ability to model, forecast and monitor the performance of risk-bearing and value-based contracts, and to intersect the critical cost and quality variables.
Gartner defines integration platform as a service (iPaaS) as a vendor-managed cloud service that enables end users to implement integrations between a variety of applications, services and data sources, both internal and external to their organization. iPaaS enables end users of the platform to integrate a variety of internal and external applications, services and data sources for at least one of the three main uses of integration technology:
Data consistency: The ability to monitor for or be notified by applications, services and data sources about changes, and to propagate those changes to the appropriate applications and data destinations (for example, “synchronize customer data” or “ingest into data lake”).
Multistep process: The ability to implement multistep processes between applications, services and data sources (for example, to “onboard employee” or “process insurance claim”).
Composite service: The ability to create composite services exposed as APIs or events and composed from existing applications, services and data sources (for example, to create a “credit check” service or to create a “generate fraud score” service).
These integration processes, data pipelines, workflows, automations and composite services are most commonly created via intuitive low-code or no-code developer environments, though some vendors provide more-complex developer tooling.
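To illustrate just the composite-service pattern, the sketch below composes a "credit check" from two stubbed sources, much as an iPaaS flow might expose one via an API; the service names and the approval rule are assumptions made for the example.

```python
# Toy composite service: one callable composed from two underlying sources.
def fetch_credit_bureau_score(customer_id):      # stub for an external service
    return 710

def fetch_internal_payment_history(customer_id):  # stub for an internal app
    return {"late_payments": 1}

def credit_check(customer_id):
    """Composite 'credit check' service built from existing sources."""
    score = fetch_credit_bureau_score(customer_id)
    history = fetch_internal_payment_history(customer_id)
    approved = score >= 650 and history["late_payments"] < 3
    return {"customer": customer_id, "score": score, "approved": approved}

print(credit_check("cust-42"))
```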
MDM is a technology-enabled business discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, governance, semantic consistency and accountability of an enterprise’s official shared master data assets. Master data is the consistent and uniform set of identifiers and attributes that uniquely describe the core entities of the enterprise and are used across multiple business processes.
Master data management (MDM) of customer data solutions are software products that:
- Support the global identification, linking and synchronization of customer information across heterogeneous data sources through semantic reconciliation of master data.
- Create and manage a central, persisted system of record or index of record for customer master data.
- Enable the delivery of a single, trusted customer view to all stakeholders, to support various business initiatives.
- Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
- Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.
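The toy code below illustrates the matching-and-linking idea at its simplest: records from two sources are normalized and linked into a central index of record; the matching rule is deliberately naive and all data is invented.

```python
# Toy matching and linking of customer records from heterogeneous sources.
def normalize(name, email):
    """Crude semantic reconciliation: case- and whitespace-insensitive key."""
    return (name.strip().lower(), email.strip().lower())

crm = [{"id": "crm-1", "name": "Jane Doe ", "email": "JANE@X.COM"}]
erp = [{"id": "erp-7", "name": "jane doe",  "email": "jane@x.com"}]

index = {}  # central index of record: match key -> linked source ids
for source in (crm, erp):
    for rec in source:
        key = normalize(rec["name"], rec["email"])
        index.setdefault(key, []).append(rec["id"])

print(index)  # {('jane doe', 'jane@x.com'): ['crm-1', 'erp-7']}
```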
Master data management (MDM) of product data solutions are software products that:
- Support the global identification, linking and synchronization of product data across heterogeneous data sources through semantic reconciliation of master data.
- Create and manage a central, persisted system of record or index of record for product master data.
- Enable the delivery of a single, trusted product view to all stakeholders, to support various business initiatives.
- Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
- Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.
To reduce both infrastructure costs and manual workloads in postmodern ERP projects, SAP application leaders and SAP Basis operations leaders should evaluate specialized software tools for automating the regular refresh of their SAP ERP test data. SAP selective test data management tools perform selective copying of SAP test data, but they vary in their approach to data selection, scrambling and performance optimization. There are two user constituencies for these tools: (1) Basis operations teams, who require repetitive data copy operations that must be as automated as possible, and (2) SAP application teams, who need ad hoc copying of selected application data objects. Some of the tools also enable Basis operations teams to produce a 'shell system': an identical copy of a complete production system, but without the transaction data. This is very useful for testing purposes in many projects.