Gartner defines the application programming interface (API) management market as the market for software to manage, govern and secure APIs. Organizations use APIs to modernize their architectures; APIs provide access to systems, services, partner ecosystems and data. API management software enables organizations, regardless of their size, region or industry, to plan, deploy, secure, operate, version and retire APIs.
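As a hedged illustration of the lifecycle concerns above (securing, versioning and retiring APIs), the Python sketch below models a toy gateway-style policy check. The keys, version names and status codes are invented for the example and do not reflect any particular vendor's product.

```python
# Minimal sketch of gateway-style API management concerns: securing
# (API-key check), version routing and retirement. All names here are
# illustrative assumptions, not any vendor's actual API.

VALID_KEYS = {"key-123"}            # hypothetical issued keys
RETIRED_VERSIONS = {"v1"}           # versions taken out of service

def handle_request(path: str, api_key: str) -> tuple[int, str]:
    """Route a request such as '/v2/orders' through basic policies."""
    if api_key not in VALID_KEYS:
        return 401, "invalid API key"          # security policy
    version = path.split("/")[1]
    if version in RETIRED_VERSIONS:
        return 410, f"{version} is retired"    # lifecycle policy
    return 200, f"forwarded {path} to the {version} backend"

print(handle_request("/v2/orders", "key-123"))  # (200, ...)
print(handle_request("/v1/orders", "key-123"))  # (410, ...)
```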
Application integration platforms enable independently designed applications, apps and services to work together. Key capabilities of application integration technologies include:
• Communication functionality that reliably moves messages/data among endpoints.
• Support for fundamental web and web services standards.
• Functionality that dynamically binds consumer and provider endpoints.
• Message validation, mapping, transformation and enrichment.
• Orchestration.
• Support for multiple interaction patterns, content-based routing and typed messages.
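To make a few of these capabilities concrete, here is a minimal, vendor-neutral Python sketch of message validation, enrichment and content-based routing. The message shape and endpoint names are assumptions made for illustration only.

```python
# Illustrative sketch (not any specific product) of three capabilities
# listed above: message validation, enrichment and content-based routing.

def validate(msg: dict) -> bool:
    # Validation: required fields must be present.
    return {"type", "payload"}.issubset(msg)

def enrich(msg: dict) -> dict:
    # Enrichment: add data the consumer endpoint expects.
    return {**msg, "currency": "USD"}

def route(msg: dict) -> str:
    # Content-based routing: the destination depends on message content.
    if not validate(msg):
        return "dead-letter"
    return {"order": "erp-endpoint", "invoice": "finance-endpoint"}.get(
        msg["type"], "default-endpoint")

msg = enrich({"type": "order", "payload": {"id": 7, "amount": 120}})
print(route(msg))   # erp-endpoint
```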
Gartner defines augmented data quality (ADQ) solutions as a set of capabilities for an enhanced data quality experience aimed at improving insight discovery, next-best-action suggestions and process automation by leveraging AI/machine learning (ML) features, graph analysis and metadata analytics. Each of these technologies can work independently, or cooperatively, to create network effects that can be used to increase automation and effectiveness across a broad range of data quality use cases. These purpose-built solutions include a range of functions such as profiling and monitoring; data transformation; rule discovery and creation; matching, linking and merging; active metadata support; data remediation and role-based usability. These packaged solutions help implement and support the practice of data quality assurance, mostly embedded as part of a broader data and analytics (D&A) strategy. Various existing and upcoming use cases include:
1. Analytics, artificial intelligence and machine learning development
2. Data engineering
3. D&A governance
4. Master data management
5. Operational/transactional data quality
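The sketch below illustrates, in plain Python, two of the listed functions in a deliberately simplified, rule-based form (real ADQ products lean on ML, graph analysis and metadata analytics): attribute profiling and candidate matching of near-duplicate records. The sample records and the 0.6 similarity threshold are arbitrary assumptions.

```python
# Hedged sketch of two data quality functions: profiling (completeness
# per attribute) and matching of near-duplicate records.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Acme Corp", "email": "info@acme.com"},
    {"id": 2, "name": "ACME Corporation", "email": "info@acme.com"},
    {"id": 3, "name": "Globex", "email": None},
]

# Profiling: a simple completeness metric per attribute.
for field in ("name", "email"):
    filled = sum(1 for r in records if r[field])
    print(f"{field}: {filled}/{len(records)} populated")

# Matching: flag candidate duplicates by equal non-null email or
# name similarity (crude stand-in for ML-driven rule discovery).
def similar(a, b, threshold=0.6):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

for i, a in enumerate(records):
    for b in records[i + 1:]:
        if (a["email"] and a["email"] == b["email"]) or similar(a["name"], b["name"]):
            print("candidate match:", a["id"], b["id"])
```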
Gartner defines business process automation (BPA) tools as software that automates business processes by enabling orchestration and choreography of diverse sets of actors (humans, systems and bots) involved in the execution of the process. BPA tools provide an environment for developing and running applications that incorporate process models (and optionally other business, decision and data models), enabling digitization of business operations.
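A minimal sketch of the orchestration idea, assuming an invented claims process: each step in the process model is tagged with the kind of actor (human, system or bot) that performs it, and a toy engine dispatches the steps in order.

```python
# A minimal sketch of a BPA-style process model. Step names and actor
# assignments are invented for illustration.

PROCESS = [
    ("submit_claim",   "human"),
    ("extract_fields", "bot"),     # e.g., a document-processing bot
    ("score_risk",     "system"),
    ("approve_claim",  "human"),
]

def run(process):
    for step, actor in process:
        # A real BPA engine would dispatch work items, wait on human
        # tasks and invoke systems/bots; here we just log the handoff.
        print(f"dispatching '{step}' to {actor}")

run(PROCESS)
```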
Gartner defines business processes as the coordination of the behavior of people, systems and things to produce specific business outcomes. 'Things' in this context refers to devices that are part of the Internet of Things (IoT). A BPM platform minimally includes: a graphical business process and/or rule modeling capability, a process registry/repository to handle the modeling metadata, a process execution engine and a state management engine or rule engine (or both). The three types of BPM platforms — basic BPM platforms, business process management suites (BPMSs), and intelligent business process management suites (iBPMSs) — can help solution architects and business outcome owners accelerate application development, transform business processes, and digitalize business processes to exploit business moments by providing capabilities that manage different aspects of the business process life cycle.
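The execution-engine and state-management capabilities named above can be sketched as a small state machine. The process model, states and events below are illustrative assumptions, not any product's modeling notation.

```python
# Minimal sketch of a process execution engine: the process model is a
# state-transition table, and the engine tracks per-instance state.

MODEL = {                      # current state -> event -> next state
    "new":       {"review": "in_review"},
    "in_review": {"approve": "approved", "reject": "rejected"},
}

class ProcessInstance:
    def __init__(self):
        self.state = "new"     # state management: per-instance status

    def fire(self, event: str):
        try:
            self.state = MODEL[self.state][event]
        except KeyError:
            raise ValueError(f"'{event}' not allowed in state '{self.state}'")

p = ProcessInstance()
p.fire("review")
p.fire("approve")
print(p.state)                 # approved
```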
Gartner defines data integration as the discipline comprising the architectural patterns, methodologies and tools that allow organizations to achieve consistent access and delivery of data across a wide spectrum of data sources and data types to meet the data consumption requirements of business applications and end users. Data integration tools enable organizations to access, integrate, transform, process and move data that spans various endpoints and across any infrastructure to support their data integration use cases. The market for data integration tools includes vendors that offer a stand-alone software product (or products) to enable the construction and implementation of data access and data delivery infrastructure for a variety of data integration use cases.
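As a sketch of the access/transform/move pattern such tools implement, the toy extract-transform-load pipeline below uses in-memory lists as stand-ins for real source and target endpoints.

```python
# Toy extract-transform-load pipeline: access data at a source,
# normalize it in flight, and deliver it to a target.

source = [{"id": "1", "amount": "19.90"}, {"id": "2", "amount": "5.00"}]
target = []

def extract(endpoint):                 # access data at a source endpoint
    yield from endpoint

def transform(rows):                   # normalize types en route
    for row in rows:
        yield {"id": int(row["id"]), "amount": float(row["amount"])}

def load(rows, endpoint):              # deliver to the target endpoint
    endpoint.extend(rows)

load(transform(extract(source)), target)
print(target)
```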
Data virtualization technology is based on the execution of distributed data management processing, primarily for queries, against multiple heterogeneous data sources, and federation of query results into virtual views. This is followed by the consumption of these virtual views by applications, query/reporting tools, message-oriented middleware or other data management infrastructure components. Data virtualization can be used to create virtualized and integrated views of data in-memory, rather than executing data movement and physically storing integrated views in a target data structure. It provides a layer of abstraction above the physical implementation of data, to simplify querying logic.
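A minimal sketch of this idea, assuming two in-memory stand-ins for heterogeneous sources: the query is answered by federating both result sets into a virtual view at request time, with no data movement into a materialized target.

```python
# Sketch of data virtualization: federate results from two sources
# into an in-memory virtual view at query time, rather than physically
# storing an integrated copy. Source names and fields are invented.

crm = [{"cust": 1, "name": "Ada"}, {"cust": 2, "name": "Lin"}]       # source A
billing = [{"cust": 1, "total": 250.0}, {"cust": 2, "total": 99.0}]  # source B

def virtual_customer_view():
    """Join both sources on demand; nothing is materialized."""
    totals = {r["cust"]: r["total"] for r in billing}
    for row in crm:
        yield {"name": row["name"], "total": totals.get(row["cust"])}

print(list(virtual_customer_view()))
```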
Enterprise business process analysis (EBPA) is a comprehensive approach to business and process modeling, aimed at transforming and improving business performance with an emphasis on cross-viewpoint (strategy, analysis, architecture, automation) and cross-functional analysis to support strategic and operational decisions.
Event brokering is a role played by middleware in facilitating event-driven application architecture. The minimum capability required to play the role of event broker is publish-subscribe (pub-sub) messaging. Any middleware product that supports pub-sub, including message-oriented middleware (MOM) and enterprise service buses (ESBs), can play the role of an event broker and can be referred to as a basic 'event broker' when so deployed. Middleware products that additionally offer special support for event-centric use cases (for example, a persistent event ledger for analysis and event sourcing, or programmable extensibility for custom filtering and analysis) are 'advanced' event brokers.
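The baseline pub-sub capability can be sketched in a few lines of Python. The toy ledger list gestures at the persistent event ledger an 'advanced' broker would provide; none of this is modeled on a specific product.

```python
# Minimal pub-sub sketch: subscribers register per topic, and every
# published event fans out to all of them. The ledger is a toy stand-in
# for an advanced broker's persistent event log.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)
        self.ledger = []                      # toy persistent event log

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, event):
        self.ledger.append((topic, event))    # enables later analysis/sourcing
        for handler in self.subscribers[topic]:
            handler(event)

b = Broker()
b.subscribe("orders", lambda e: print("shipping saw", e))
b.subscribe("orders", lambda e: print("billing saw", e))
b.publish("orders", {"id": 42})
```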
The market for event stream processing (ESP) platforms consists of software subsystems that perform real-time computation on streaming event data. They execute calculations on unbounded input data continuously as it arrives, enabling immediate responses to current situations and/or storing results in files, object stores or other databases for later use. Examples of input data include clickstreams; copies of business transactions or database updates; social media posts; market data feeds; images; and sensor data from physical assets, such as mobile devices, machines and vehicles.
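A sketch of continuous computation on an unbounded stream, assuming invented event timestamps and values: each arriving event updates a per-window aggregate immediately and can trigger an immediate response, rather than waiting for the data to be complete.

```python
# Sketch of streaming computation: per-window running aggregates,
# updated event by event. Window size, threshold and the sample feed
# are arbitrary assumptions.
from collections import defaultdict

WINDOW = 60  # seconds
windows = defaultdict(lambda: {"count": 0, "sum": 0.0})

def on_event(ts: float, value: float):
    """Called once per arriving event; responds immediately."""
    key = int(ts // WINDOW)               # which window the event falls in
    w = windows[key]
    w["count"] += 1
    w["sum"] += value
    if value > 1000:                      # immediate response to a situation
        print(f"alert: large value {value} at t={ts}")

for ts, v in [(3, 10.0), (45, 2000.0), (70, 5.0)]:  # stand-in data feed
    on_event(ts, v)
print(dict(windows))
```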
In-memory data grids (IMDGs) provide a lightweight, distributed, scale-out in-memory object store: the data grid. Multiple applications can concurrently perform transactional and/or analytical operations in the low-latency data grid, thus minimizing access to high-latency, hard-disk-drive-based or solid-state-drive-based data storage. IMDGs maintain data grid durability across physical or virtual servers via replication, partitioning and on-disk persistence. Objects in the data grid are uniquely identified through a primary key, but can also be retrieved via other attributes. The most typical use of IMDGs is for web-scale transaction processing applications. However, adoption for analytics, often in combination with Apache Spark and Hadoop or stream analytics platforms, is growing fast; examples include fraud detection, risk management, operations monitoring, dynamic pricing and real-time recommendation management.
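The sketch below illustrates the two access paths described above on a toy, single-process "grid": placement and lookup by primary-key hash across partitions, plus retrieval by another attribute via a scan. Replication, on-disk persistence and real distribution are deliberately omitted.

```python
# Toy sketch of IMDG data placement: objects land on a partition by
# primary-key hash and can also be found by other attributes via a
# scan. Real grids add replication, persistence and secondary indexes.

PARTITIONS = 4
grid = [dict() for _ in range(PARTITIONS)]   # one dict per partition

def put(key, obj):
    grid[hash(key) % PARTITIONS][key] = obj

def get(key):                                # primary-key access
    return grid[hash(key) % PARTITIONS].get(key)

def query(attr, value):                      # retrieval by other attributes
    return [o for p in grid for o in p.values() if o.get(attr) == value]

put("cust:1", {"name": "Ada", "tier": "gold"})
put("cust:2", {"name": "Lin", "tier": "gold"})
print(get("cust:1"))
print(query("tier", "gold"))
```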
Gartner defines integration platform as a service (iPaaS) as a vendor-managed cloud service that enables end users to implement integrations between a variety of applications, services and data sources, both internal and external to their organization. iPaaS enables end users of the platform to integrate a variety of internal and external applications, services and data sources for at least one of the three main uses of integration technology:
• Data consistency: The ability to monitor for or be notified by applications, services and data sources about changes, and to propagate those changes to the appropriate applications and data destinations (for example, “synchronize customer data” or “ingest into data lake”).
• Multistep process: The ability to implement multistep processes between applications, services and data sources (for example, to “onboard employee” or “process insurance claim”).
• Composite service: The ability to create composite services exposed as APIs or events and composed from existing applications, services and data sources (for example, to create a “credit check” service or to create a “generate fraud score” service).
These integration processes, data pipelines, workflows, automations and composite services are most commonly created via intuitive low-code or no-code developer environments, though some vendors provide more-complex developer tooling.
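As an illustration of the first use (data consistency), the sketch below propagates a change event from an assumed source system to an assumed destination; both connectors are in-memory stand-ins, and the change format is invented for the example.

```python
# Sketch of the 'data consistency' pattern: watch a source for changes
# and propagate them to the appropriate destination.

crm_changes = [{"op": "update", "id": 7, "email": "new@example.com"}]
warehouse = {7: {"email": "old@example.com"}}

def sync(changes, destination):
    for change in changes:                    # notified of source changes
        if change["op"] == "update":
            destination.setdefault(change["id"], {}).update(
                {"email": change["email"]})   # propagate to destination

sync(crm_changes, warehouse)
print(warehouse)                              # customer data synchronized
```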
Master data management (MDM) is a technology-enabled business discipline in which business and IT work together to ensure the uniformity, accuracy, stewardship, governance, semantic consistency and accountability of an enterprise’s official shared master data assets. Master data is the consistent and uniform set of identifiers and attributes that uniquely describe the core entities of the enterprise and are used across multiple business processes.
Master data management (MDM) of customer data solutions are software products that:
• Support the global identification, linking and synchronization of customer information across heterogeneous data sources through semantic reconciliation of master data.
• Create and manage a central, persisted system of record or index of record for customer master data.
• Enable the delivery of a single, trusted customer view to all stakeholders, to support various business initiatives.
• Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
• Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.
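The linking-and-consolidation idea behind the first three bullets (and equally behind MDM of product data, below) can be sketched as follows. The sample records, the tax_id match key and the first-non-null survivorship rule are all assumptions made for illustration.

```python
# Hedged sketch of linking and consolidation: records for the same
# customer from two systems are matched on a shared identifier and
# merged into one trusted view.

sources = [
    {"system": "crm",  "tax_id": "X1", "name": "Ada Lovelace", "phone": None},
    {"system": "shop", "tax_id": "X1", "name": "A. Lovelace",  "phone": "555-1"},
]

def golden_record(records):
    merged = {}
    for rec in records:                   # crude survivorship rule:
        for field, value in rec.items():  # first non-null value wins
            if field != "system" and value and field not in merged:
                merged[field] = value
    return merged

print(golden_record([r for r in sources if r["tax_id"] == "X1"]))
```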
Master data management (MDM) of product data solutions are software products that:
• Support the global identification, linking and synchronization of product data across heterogeneous data sources through semantic reconciliation of master data.
• Create and manage a central, persisted system of record or index of record for product master data.
• Enable the delivery of a single, trusted product view to all stakeholders, to support various business initiatives.
• Support ongoing master data stewardship and governance requirements through workflow-based monitoring and corrective-action techniques.
• Are agnostic to the business application landscape in which they reside; that is, they do not assume or depend on the presence of any particular business application(s) to function.