3D printing (3DP) workflow software enables organizations to manage the estimating, pricing, scheduling, capacity planning, costing, order entry, job tracking and billing of 3D-printed items. It is an essential enabler of additive manufacturing, providing tools and capabilities needed to manage, optimize, and scale the process from start to finish. It provides centralized control of multiple 3D printers with real-time monitoring of print status. It also tracks material inventory, usage, and expiration dates while offering support for different types of printing materials with specific handling requirements. The software enables management to understand the true costs and capacity of installations, address bottlenecks, and improve customer service, thus increasing revenue and customer satisfaction. Typical users include manufacturers, product designers and engineers, educational institutions and healthcare providers.
Gartner defines AI-Augmented Code Modernization Tools as software solutions that use specialized AI agents, generative AI, and deterministic analysis to accelerate the transformation of legacy systems. These tools automate and enhance a broad spectrum of modernization activities, including deep code and architecture analysis, software documentation, dependency mapping, risk assessment, migration planning, and refactoring. By supporting end-to-end modernization workflows, they significantly expedite the adoption of modern software architectures.
Gartner defines an advanced distribution management system (ADMS) as the real-time operations support system of an electricity distribution network. Utilities use ADMSs to monitor, control and operate physical field assets (owned by the utility) to provide reliable electric service to customers. The ADMS estimates the state of the distribution network to optimize asset utilization and minimize losses. An ADMS can manage and guide outage restoration activities by identifying the fault location, isolating and restoring affected equipment, and returning the system to its desired operating state.
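The fault-handling sequence described above (locate the fault, isolate the affected equipment, restore service) can be sketched as a simple operation on a radial feeder. This is a minimal illustration, not any vendor's ADMS logic; the segment and switch names are invented for the sketch.

```python
# Minimal FLISR-style sketch: a radial feeder modeled as an ordered list of
# segments, each bounded by switches. On a fault, open the switches adjacent
# to the faulted segment and report which segments remain energized from the
# source. All names are illustrative, not from any real ADMS.

def isolate_fault(segments, faulted):
    """Return (open_switches, energized_segments) after isolating `faulted`."""
    idx = segments.index(faulted)
    # Switches are named after the pair of segments they sit between.
    open_switches = []
    if idx > 0:
        open_switches.append(f"SW:{segments[idx - 1]}-{faulted}")
    if idx < len(segments) - 1:
        open_switches.append(f"SW:{faulted}-{segments[idx + 1]}")
    # Upstream segments keep their source feed; downstream segments would
    # need a tie switch to an alternate feeder for restoration (not modeled).
    energized = segments[:idx]
    return open_switches, energized

feeder = ["S1", "S2", "S3", "S4"]
switches, live = isolate_fault(feeder, "S3")
```

In a real ADMS the fault location comes from protection relays and fault indicators, and the restoration step re-energizes downstream segments via tie switches; this sketch covers only the isolation decision.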
Gartner defines analytics and business intelligence (ABI) platforms as those that enable organizations to model, analyze and visualize data to support informed decision making and value creation. These platforms facilitate the preparation of data and the creation of interactive dashboards, reports and visualizations to uncover patterns, predict trends and optimize operations. By doing so, they empower users to collaborate and effectively communicate the dimensions and measures that drive their organization. The platforms may also optionally include the ability to create, modify or enrich a semantic model, including business rules. Analytics and business intelligence platforms integrate data from multiple sources, such as databases, spreadsheets, cloud services and external data feeds, to provide a unified view of data, breaking down silos and transforming raw data into meaningful insights. They also allow users to clean, transform and prepare data for analysis, in addition to creating data models that define relationships between different data entities.
Application Composition Platforms (ACPs) facilitate the rapid creation, integration, and deployment of applications using modular, reusable components or microservices. By leveraging low-code or no-code development paradigms, these platforms enable both technical and non-technical users to participate in application development. ACPs support integration with various systems and services, ensuring seamless data flow and interoperability. These platforms enable IT professionals, software developers, and business leaders to accelerate time-to-market, reduce development costs, and adapt swiftly to evolving business requirements, thereby enhancing overall productivity and innovation.
The application development life cycle management (ADLM) tool market focuses on the planning and governance activities of the software development life cycle (SDLC). ADLM products focus on the 'development' portion of an application's life. Key elements of an ADLM solution include software requirements definition and management; software change and configuration management; software project planning, with a current focus on agile planning; work item management; and quality management, including defect management. Other key capabilities include reporting, workflow, integration with version management, support for wikis and collaboration, and strong facilities for integration with other ADLM tools.
Gartner defines business orchestration and automation technologies (BOAT) as a consolidated software platform that delivers enterprise process automation by enabling capabilities including orchestration of business processes, enterprise connectivity, low code development and agentic automation. A BOAT platform includes a cross section of certain capabilities from different markets such as business process automation (BPA), low-code application platforms (LCAP), integration platform as a service (iPaaS), intelligent document processing (IDP), robotic process automation (RPA), collaborative workflow management and document management. However, this list is not necessarily all-encompassing.
Computer-aided design (CAD) software is used by designers, engineers, architects, and drafters across several industries to create two-dimensional and three-dimensional models. These 2D and 3D models can be used to explore design ideas, visualize concepts and simulate the physical behavior of a design in the real world. The software provides built-in templates such as flowcharts, mind maps, wireframes, network diagrams, and org charts for creating detailed, high-quality design models. The software also allows instant changes to models, enabling collaborative work among team members.
Data center infrastructure management (DCIM) tools monitor, measure, manage and/or control data center resources and energy consumption of both IT-related equipment (such as servers, storage and network switches) and facilities infrastructure components (such as power distribution units and computer room air conditioners). They are data-center-specific (they are designed for data center use), rather than general building management system tools, and are used to optimize data center power, cooling and physical space. Solutions do not have to be sensor-based, but they do have to be designed to accommodate real-time power and temperature/environmental monitoring. They must also support resource management, which Gartner defines as going beyond typical IT asset management to include the location and interrelationships between assets.
Data preparation is an iterative and agile process for finding, combining, cleaning, transforming and sharing curated datasets for various data and analytics use cases including analytics/business intelligence (BI), data science/machine learning (ML) and self-service data integration. Data preparation tools promise faster time to delivery of integrated and curated data by allowing business users including analysts, citizen integrators, data engineers and citizen data scientists to integrate internal and external datasets for their use cases. Furthermore, they allow users to identify anomalies and patterns and improve and review the data quality of their findings in a repeatable fashion. Some tools embed ML algorithms that augment and, in some cases, completely automate certain repeatable and mundane data preparation tasks. Reduced time to delivery of data and insight is at the heart of this market.
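The find/combine/clean/transform loop described above can be sketched in a few lines. This is a minimal illustration using only the Python standard library; the field names and the 2-standard-deviation anomaly rule are assumptions made for the sketch, not features of any particular tool.

```python
import statistics

# Illustrative data preparation pass: combine an internal and an external
# extract, drop incomplete rows, standardize a field, and flag outliers.
# Field names and the 2-sigma anomaly rule are invented for illustration.

def prepare(internal_rows, external_rows):
    combined = internal_rows + external_rows
    # Clean: drop rows missing the measure; transform: normalize region codes.
    cleaned = [
        {**r, "region": r["region"].strip().upper()}
        for r in combined
        if r.get("revenue") is not None
    ]
    # Flag anomalies: revenue more than 2 standard deviations from the mean.
    values = [r["revenue"] for r in cleaned]
    mean, sd = statistics.mean(values), statistics.pstdev(values)
    for r in cleaned:
        r["anomaly"] = sd > 0 and abs(r["revenue"] - mean) > 2 * sd
    return cleaned

rows = prepare(
    [{"region": " emea ", "revenue": 100}, {"region": "APAC", "revenue": 110}],
    [{"region": "amer", "revenue": 105}, {"region": "emea", "revenue": None}],
)
```

A data preparation tool wraps steps like these in a visual, repeatable recipe; the augmented-ML variants mentioned above would suggest or apply such transformations automatically.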
Gartner defines a data science and machine learning platform as an integrated set of code-based libraries and low-code tooling. These platforms support the independent use and collaboration among data scientists and their business and IT counterparts, with automation and AI assistance through all stages of the data science life cycle, including business understanding, data access and preparation, model creation and sharing of insights. They also support engineering workflows, including the creation of data, feature, deployment and testing pipelines. The platforms are provided via desktop client or browser with supporting compute instances or as a fully managed cloud offering.
Data and Analytics refers to products and services that enable organizations to collect, integrate, analyze, and act on data to drive informed decision-making and business outcomes. This category includes markets that focus on empowering enterprises to manage data pipelines, ensure data quality and governance, and extract insights through advanced analytics and machine learning across structured and unstructured data environments.
Gartner defines enterprise agile planning (EAP) tools as products that enable organizations to scale their agile practices to support a holistic enterprise view. These tools act as a hub for defining, planning, managing and deploying work. They also serve as an information hub for the disparate islands of metrics from the full life cycle. Just as agile is an evolution of development methodologies, EAP tools are an evolution of project-/team-centric tools. They support a business-outcome-driven approach to managing the full life cycle of agile product delivery at scale.
Gartner defines enterprise low-code application platforms (LCAPs) as software platforms for the accelerated development and maintenance of applications, using model-driven development tools, generative AI and prebuilt component catalogs for the entire application’s technology stack. Enterprise LCAP features include support for the collaborative development of all application components; runtime environments for high performance, availability and scalability of applications; and application deployment and monitoring with detailed usage insights. Enterprise LCAP platforms feature governance controls and insights, self-service capabilities, APIs for integration with external DevOps tooling, success management with exhaustive technical documentation, training programs and a comprehensive global partner network. Enterprise LCAPs provide the foundation for developing a wide range of applications and application components with distributed data architectures, including complex multimodal front ends, business workflows, agentic AI and integration capabilities. The enterprise LCAP market is closely related to the citizen application development platform (CADP) market, as they both aim to address similar use cases. However, they are distinctively different in terms of the target audience and complexity of the applications built on the platform.
The global industrial Internet of Things (IIoT) platform integrates with industrial OT assets and other industrial data sources across asset-intensive enterprises to aggregate, curate and deliver contextualized insights that enable intelligent applications and dashboards through an edge-to-cloud architecture. The global IIoT platform market exists because of the core capabilities of integrated middleware software that support a multivendor marketplace of intelligent applications to facilitate and automate asset management decision making. IIoT platforms also provide operational visibility and control for plants, infrastructure and equipment. Common use cases are augmentation of industrial automation, remote operations, sustainability and energy management, global scalability, IT/operational technology (OT) convergence, and product servitization of industrial products.
Integration means making independently designed applications and data work well together. IoT integration means making the mix of new IoT devices, IoT data, IoT platforms and IoT applications — combined with IT assets (business applications, legacy data, mobile, and SaaS) — work well together in the context of implementing end-to-end IoT business solutions. The IoT integration market is defined as the set of IoT integration capabilities that IoT project implementers need to successfully integrate end-to-end IoT business solutions.
Lifecycle Cost Management Software is a specialized tool designed to assist organizations in managing the total cost of ownership of their assets throughout their entire lifecycle. This type of solution is crucial for financial analysts, cost engineers and businesses that rely heavily on physical assets, such as manufacturing plants, infrastructure, or large equipment, as it helps them plan, analyze, and optimize costs from acquisition through operation and maintenance to eventual disposal. It often includes features such as performing cost analysis to forecast future expenses, generating insights into asset performance and lifecycle trends, as well as allowing users to model different scenarios and simulate outcomes. This holistic approach allows organizations to make informed decisions about asset management, ensuring that they maximize value while minimizing costs.
Manufacturing process management (MPM) and model-based manufacturing (MbM) bridge the gap between the virtual design realm and the physical product/process manufacturing realm as part of an organized software architecture. These technologies are not only applied within the four walls of a plant or a corporation's multiple manufacturing sites. They can be applied holistically, with workflow to manage multiple recipe variants and labeling change/requirements, and/or handle certificates of compliance (CoCs) and certificates of analysis (CoAs) from suppliers.
Manufacturing refers to products and services that support the design, production, monitoring, and optimization of industrial and discrete manufacturing operations. This category includes markets that focus on production execution, asset and environmental management, and advanced manufacturing services, enabling organizations to enhance efficiency, quality, sustainability, and innovation across the production lifecycle.
Gartner defines manufacturing execution systems as a specialist class of production-oriented software that manages, monitors and synchronizes the execution of real-time physical processes involved in transforming raw materials into intermediate and/or finished goods. These systems coordinate the execution of work orders with production scheduling and enterprise-level systems like ERP, product life cycle management and quality management systems. MES applications also provide feedback on process performance, and support component and material-level traceability, genealogy and integration with process history, where required.
Gartner defines metadata management solutions as applications to enable the collection, analysis and orchestration of metadata related to organizational data assets. These solutions enable workflow and operational support to make data easy to find, use and manage. They do this by collating metadata in any form from within their own application and third-party systems, and providing the ability to search, analyze and make decisions on the collated results. They also provide transparent cross-referencing over all related metadata, and derive insights from data (such as usage patterns and performance) through analysis of metadata to support a wide range of data-driven initiatives.
Meter data management systems (MDMSs) are used by utility companies in the electricity, gas, water and thermal sectors to support metered commodity measurement at customer premises. This data is used for billing, customer service, consumption management (forecast and demand), operations (outages and losses) and finance. Other buyers of MDMS products include energy service companies for audits and efficiency projects; energy management providers for consumption reporting; outsourced or local energy providers for production planning and engineering; and commercial and industrial users for accounting and finance. Additionally, municipalities and government agencies use MDMS products for public works and regulatory compliance. MDMSs are essential IT components of advanced metering infrastructure that facilitate the meter-to-cash process by collecting and managing consumption data for utility services such as electricity, water, gas and thermal energy. An MDMS collects meter data; applies validation, estimation and error corrections; and stores and processes the data into billing determinants based on the customer’s product before sending them to a billing engine. MDMSs can support various analytics use cases, including consumption profiles, trends, alerts, revenue protection and basic meter asset management. They can be delivered as on-premises or cloud-based solutions, enabling real-time data access and analytics that enhance operational efficiency and customer engagement.
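The collect/validate/estimate/aggregate flow described above (commonly called VEE, for validation, estimation and editing) can be sketched as follows. The interval layout and the linear-interpolation estimation rule are illustrative assumptions, not any product's actual algorithm.

```python
# Sketch of an MDMS-style VEE pass: fill missing interval reads by linear
# interpolation between valid neighbors, then aggregate the intervals into
# a simple billing determinant (total consumption). The interpolation rule
# and interval layout are assumptions made for illustration only.

def vee(intervals):
    """intervals: list of kWh reads, None where the meter missed a read.

    Assumes gaps are interior; real systems handle edge gaps with
    consumption profiles and flag every estimated read for auditing.
    """
    filled = list(intervals)
    for i, v in enumerate(filled):
        if v is None:
            prev = next(x for x in reversed(filled[:i]) if x is not None)
            nxt = next(x for x in filled[i + 1:] if x is not None)
            filled[i] = (prev + nxt) / 2  # estimated read
    return filled

def billing_determinant(intervals):
    # A minimal billing determinant: total kWh over the billing period.
    return sum(vee(intervals))

reads = [1.0, 1.2, None, 1.6]       # one missed interval read
total = billing_determinant(reads)  # estimated read of 1.4 fills the gap
```

The output of this stage is what the definition above calls a billing determinant; in practice the MDMS would compute several (peak demand, time-of-use buckets) before handing off to the billing engine.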
Gartner defines multienterprise collaboration networks (MCNs) as solutions that support a community of trading partners of any tier and type that need to coordinate and execute on business processes that extend across multiple enterprises. Gartner considers cloud-based MCNs to be a key technology for organizations of any industry, geography, size and maturity, implemented to coordinate, automate, orchestrate and transform an organization’s extended supply chain within the overall business ecosystem.
An MXDP is an opinionated, integrated set of front-end development tools and “backend for frontend” (BFF) capabilities. It enables a distributed, scalable development approach (in terms of both teams and architecture) to build fit-for-purpose apps across digital touchpoints and interaction modalities. At minimum, an MXDP must support cross-platform development and building of both custom iOS and Android app binaries, responsive web apps, and at least one of the following: PWAs, chatbots, voice apps, wearables and Internet of Things (IoT) apps, and augmented-reality (AR) and mixed-reality (MR) apps.
Gartner defines the product life cycle management (PLM) software market in discrete manufacturing industries as a philosophy, process and discipline. It is supported by software to manage product data and related processes throughout their entire life cycle, from concept through recycling/retirement. It applies to products that are assembled or constructed.
Gartner defines the market for quality management system (QMS/eQMS) software as stand-alone digital solutions with embedded emerging technological capabilities that enable organizations to systematically manage, monitor and improve the quality of their products, processes and services. These software solutions help organizations meet customer quality expectations and maintain compliance with international standards as well as industry-specific regulations. QMS software is designed to help organizations reduce waste, lower costs, house quality documents, assess risk, track performance, ensure compliance and improve processes. The QMS/eQMS market serves a broad spectrum of industries, including manufacturing, life sciences, automotive, aerospace and defense, electronics, food and beverage, chemicals, consumer packaged goods, services, and more. These solutions are built to help organizations of all sizes, from single facilities to complex, global operations, establish a consistent approach to managing and enhancing the quality of products, processes and services throughout the value chain.
Supervisory control and data acquisition (SCADA) software is essential for industries such as manufacturing, energy and utilities, and transportation to control processes, collect and monitor real-time data, and communicate system issues. The software communicates with devices such as programmable logic controllers (PLCs) and remote terminal units (RTUs) to interact with industrial equipment and processes. SCADA software can be run virtually, allowing an operator to supervise industrial processes even from a distant location. It provides real-time data insights through a human-machine interface (HMI) to maximize efficiency, reduce overhead costs, and streamline operations. The software also warns the operator of hazardous conditions such as blocked processes and failing systems.
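The warning behavior described above can be sketched as a threshold check over tag values polled from PLCs or RTUs. The tag names and limits here are invented for illustration; real SCADA systems add alarm priorities, deadbands, and acknowledgment workflows.

```python
# Sketch of SCADA-style alarm evaluation: tag values polled from PLCs/RTUs
# are checked against configured high/low limits, and out-of-range tags
# raise alarms for the operator. Tag names and limits are illustrative.

ALARM_LIMITS = {
    "boiler_temp_C":     {"low": 60.0,  "high": 95.0},
    "line_pressure_kPa": {"low": 180.0, "high": 240.0},
}

def evaluate_alarms(tag_values):
    alarms = []
    for tag, value in tag_values.items():
        limits = ALARM_LIMITS.get(tag)
        if limits is None:
            continue  # unconfigured tag: no alarm rule applies
        if value > limits["high"]:
            alarms.append((tag, "HIGH", value))
        elif value < limits["low"]:
            alarms.append((tag, "LOW", value))
    return alarms

polled = {"boiler_temp_C": 98.5, "line_pressure_kPa": 200.0}
active = evaluate_alarms(polled)  # boiler temperature exceeds its high limit
```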
Synthetic sensors are virtualized, software-based models that derive and analyze information indirectly. The information can be integrated from multiple real physical sensors on the same device, or on any other device, such as a smartphone. In industry, this is also known as a virtual sensor or virtual sensing. These systems do not rely on direct hardware measurements but instead infer complex insights like user behavior, emotional states, or environmental conditions by analyzing combined sensor inputs. Their core strengths lie in their ability to deliver real-time, context-aware insights, simulate expensive or impractical sensors, and operate flexibly across diverse environments. Their benefits include cost-effectiveness, improved accuracy, and the ability to operate in environments where physical sensors are limited or unavailable. Synthetic sensors are widely used across industries such as healthcare (for monitoring patient well-being), consumer electronics (in smartphones and wearables), automotive (for driver behavior analysis), and smart homes (for occupancy and mood detection).
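The inference step described above can be sketched as fusing two physical inputs into one virtual reading: here, a hypothetical occupancy estimate combining a CO2 reading and a motion-event count, with no direct occupancy hardware. The fusion rule and thresholds are invented for the sketch, not drawn from any real product.

```python
# Sketch of a synthetic (virtual) sensor: infer room occupancy from two
# physical inputs -- a CO2 reading and a motion-event count -- instead of a
# dedicated occupancy sensor. The rule and thresholds are illustrative only.

def occupancy_estimate(co2_ppm, motion_events_per_min):
    # CO2 above a ~450 ppm outdoor-air baseline rises with people present;
    # motion events corroborate that the space is actively in use.
    co2_signal = max(0.0, (co2_ppm - 450.0) / 300.0)
    motion_signal = min(1.0, motion_events_per_min / 5.0)
    if co2_signal < 0.1 and motion_signal == 0.0:
        return "vacant"
    if motion_signal > 0.2 or co2_signal > 0.5:
        return "occupied"
    return "possibly occupied"

state = occupancy_estimate(co2_ppm=820.0, motion_events_per_min=3)
```

Production virtual sensors typically replace hand-tuned rules like these with a model trained against ground-truth data, but the pattern is the same: several cheap physical signals stand in for one expensive or impractical sensor.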