Data analytics insights in EPC projects

Article overview

From descriptive to cognitive dashboards

Engineering, Procurement, and Construction (EPC) projects generate vast amounts of data during design and execution. In the Front-End Engineering Design (FEED) and detailed engineering phases, teams handle hundreds of drawings, specifications, schedules, and reports. Data analytics turns this raw data into meaningful insights that support better, data-driven decisions. Well-managed project data can significantly improve outcomes – a McKinsey study noted that effective use of data can reduce project costs by up to 20% and increase productivity by up to 30%. Most organizations, however, remain stuck at the descriptive stage of analytics, relying on basic dashboards that show what has happened – such as how many documents were issued – without deeper analysis. Moving beyond description to predictive and prescriptive analytics offers a competitive edge.

This article explains the five levels of analytics – descriptive, diagnostic, predictive, prescriptive, and cognitive – in an EPC context.

It focuses on how these analytics apply in FEED and detailed design, with examples across disciplines (mechanical, civil, electrical, process, instrumentation, HSE, piping, structural, construction, project controls, quality, supply chain, etc.). The goal is to show how organizations can evolve from simple reports to intelligent dashboards that support decisions, reduce rework, and improve first-time-right performance. 

Descriptive analytics: “What happened?”

Descriptive analytics is the simplest level, focusing on what has already happened on a project. It uses historical and current data to report facts and trends without interpreting causes. In EPC projects, descriptive dashboards are common for tracking progress and status. For example, during detailed engineering, a mechanical or civil team’s dashboard may display the number of drawings or deliverables completed this week versus the plan. Project controls teams use descriptive reports to show overall schedule progress or cost expenditure to date. Procurement departments list materials or equipment delivered so far. These data-driven reports provide a single source of truth for the project status, ensuring everyone sees consistent information. Descriptive analytics helps teams answer questions like, “How many piping isometrics have been issued?” or “What is the overall engineering completion percentage?”. 

Descriptive insights are valuable for transparency and monitoring. For instance, a dashboard can highlight that 90% of instrumentation diagrams are finished or that 10 RFIs (Requests for Information) were closed last month. However, descriptive analytics only signals what is happening or has happened – it does not explain why metrics are ahead or behind, and it cannot predict future issues. Most routine project reports (e.g. weekly progress reports) fall in this category. While crucial as a foundation, descriptive analytics is limited to retrospective reporting. EPC teams benefit from it for status updates, but they need to advance to the next levels to understand causes and anticipate what’s next. 
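At its core, descriptive analytics is simple aggregation over a deliverable register. As a minimal sketch in Python – the register fields, document numbers, and status values below are all hypothetical, not from any real system:

```python
# Hypothetical deliverable register; fields and status values are illustrative.
deliverables = [
    {"discipline": "piping", "doc": "ISO-001", "status": "issued"},
    {"discipline": "piping", "doc": "ISO-002", "status": "in_progress"},
    {"discipline": "instrumentation", "doc": "LD-101", "status": "issued"},
    {"discipline": "instrumentation", "doc": "LD-102", "status": "issued"},
]

def completion_pct(items):
    """Share of deliverables issued -- a purely descriptive 'what happened' metric."""
    issued = sum(1 for d in items if d["status"] == "issued")
    return 100.0 * issued / len(items)

overall = completion_pct(deliverables)
piping = completion_pct([d for d in deliverables if d["discipline"] == "piping"])
print(f"Overall completion: {overall:.0f}%")  # 75%
print(f"Piping completion: {piping:.0f}%")    # 50%
```

A real dashboard would pull these counts from a document management system and break them down per discipline and per week, but the metric itself is no more than this kind of count-and-divide.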

Diagnostic analytics: “Why did it happen?” 

Diagnostic analytics digs into data to find out why something happened. It goes a step beyond describing what happened by uncovering underlying causes, patterns, or anomalies. In practice, this involves techniques like data drill-downs, correlations, and root cause analysis. For EPC professionals, diagnostic dashboards help identify reasons for delays, errors, or any deviations in project performance. 

For example, suppose a set of piping isometric drawings is lagging behind schedule. A diagnostic analysis might reveal that the root cause was a late deliverable from a vendor – perhaps a critical valve specification arrived two weeks late, causing a chain reaction of design delays. Similarly, a quality manager might use diagnostic data to discover why a high number of welding defects occurred on a project: analysis could show they mostly happened with a particular material batch or under a specific subcontractor’s work, indicating a supply issue or training gap. In the HSE (Health, Safety, Environment) domain, if incident rates spiked in one month, diagnostic dashboards can correlate those incidents with factors like crew overtime hours or weather conditions to explain the spike. 

These insights answer questions like “What caused the schedule slippage in civil works?” or “Why did we have more change orders in electrical design?” By measuring performance against baselines and comparing different data sets, teams can pinpoint inefficiencies or problem areas. For instance, multidisciplinary design teams can see if rework on structural drawings was due to late process data or incorrect initial calculations. Diagnostic analytics provides deeper context, helping project teams address the real issues rather than just the symptoms. It is still mostly backward-looking (reactive), but it greatly supports decision-making by ensuring that corrective actions target the right problems. Companies with mature project controls processes rely on diagnostic metrics to continuously improve their workflows, instead of only presenting raw figures. 
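The welding-defect drill-down described above can be sketched as a simple frequency analysis. This is a toy example with invented batch and subcontractor names – a real diagnostic workflow would also test whether the concentration is statistically significant rather than coincidental:

```python
from collections import Counter

# Hypothetical weld-defect log; all batch and subcontractor names are illustrative.
defects = [
    {"weld": "W-01", "material_batch": "B-7", "subcontractor": "Sub-A"},
    {"weld": "W-02", "material_batch": "B-7", "subcontractor": "Sub-B"},
    {"weld": "W-03", "material_batch": "B-7", "subcontractor": "Sub-A"},
    {"weld": "W-04", "material_batch": "B-2", "subcontractor": "Sub-A"},
]

def top_factor(records, factor):
    """Most frequent value of a factor among defect records -- a first drill-down step."""
    counts = Counter(r[factor] for r in records)
    value, n = counts.most_common(1)[0]
    return value, n / len(records)

batch, share = top_factor(defects, "material_batch")
print(f"{share:.0%} of defects involve material batch {batch}")  # 75% ... batch B-7
```

Repeating the same drill-down over other factors (subcontractor, weather, weld procedure) narrows the search for a root cause before anyone jumps to conclusions.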

Predictive analytics: “What will happen?” 

Moving into a forward-looking mode, predictive analytics uses historical and current data to forecast future outcomes. In other words, it helps project teams anticipate what could happen next so they can prepare. Predictive models analyse patterns (using trends, statistical models, or machine learning) to estimate the likelihood of various scenarios. This is especially valuable in FEED and detailed engineering, where early warnings can mitigate delays or cost overruns. 

In an EPC project context, predictive analytics might forecast schedule slippages, cost increases, or risks before they materialize. For example, during FEED, engineers can employ predictive models to estimate the final project cost and duration based on the design parameters and historical data from similar past projects. If the model indicates a high chance that the initial design choices will lead to cost overruns, the team can adjust the scope or budget early. In detailed design, a prediction model can warn that, given the current progress rate, the piping discipline will miss its deliverable target next month. It could also forecast which documents are likely to be delayed – for instance, if critical equipment datasheets from suppliers are running late, the model might predict a downstream delay in related instrumentation and electrical drawings. 

Real-world examples illustrate the power of predictive project analytics. One EPC project manager noted that predictive analytics continuously analysing project data can flag deviations from the plan early, allowing the team to act before small issues snowball into big delays. For instance, in a large construction project, analytics could forecast delays due to supply chain disruptions (such as late steel shipments) and prompt the team to find alternate suppliers or adjust the sequence. Predictive tools in construction also help anticipate delays based on factors like weather, resource availability, and past performance data. A forecasting model might show, for example, that if heavy rain is expected next week, the earthworks will likely be pushed out by several days – enabling proactive rescheduling of tasks. 

By answering “What might happen if current trends continue?”, predictive dashboards transform project management from reactive firefighting to proactive risk management. They effectively reduce guesswork in decision-making by providing data-backed projections. Instead of relying on gut feeling, a project controls engineer can use a predictive dashboard to see that “if we continue at this productivity, we will finish 10% behind schedule” and then communicate the need for corrective action. Overall, predictive analytics helps EPC teams minimize non-value-adding time by addressing potential problems in advance. This level of insight marks a significant step toward optimization and better outcomes, as teams can avoid or lessen the impact of future issues rather than simply report on them after the fact. 
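The simplest form of such a projection is extrapolating the progress trend to the 100% mark. The sketch below fits a least-squares line to hypothetical weekly progress data – real predictive models would use richer inputs (resource levels, supplier status, historical analogues), but the “at this rate, when do we finish?” logic is the same:

```python
# Hypothetical weekly cumulative progress (% complete); a naive linear-trend forecast.
weeks = [1, 2, 3, 4]
progress = [8.0, 15.0, 23.0, 30.0]

def forecast_finish_week(weeks, progress):
    """Least-squares slope of progress vs. time, extrapolated to 100% complete."""
    n = len(weeks)
    mean_w = sum(weeks) / n
    mean_p = sum(progress) / n
    slope = sum((w - mean_w) * (p - mean_p) for w, p in zip(weeks, progress)) / \
            sum((w - mean_w) ** 2 for w in weeks)
    intercept = mean_p - slope * mean_w
    return (100.0 - intercept) / slope  # week at which the trend line reaches 100%

finish = forecast_finish_week(weeks, progress)
print(f"Projected completion: week {finish:.1f}")
# If the baseline says week 12, the gap is the early warning to act on.
```

Comparing the projected finish week against the baseline finish week turns a progress curve into an early-warning signal rather than a retrospective report.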

Prescriptive analytics: “What should we do next?” 

While predictive analytics tells what is likely to happen, prescriptive analytics goes further to recommend actions for optimal results. It answers the question, “Given what we know (data, analysis, predictions), what should we do to reach our targets?”. Prescriptive analytics uses a combination of data, business rules, and often mathematical optimization models to propose specific decisions or action plans. Essentially, it builds on the insights from descriptive (what happened), diagnostic (why it happened), and predictive (what might happen) analytics to determine the best course of action. 

In practice, a prescriptive dashboard in an EPC setting might function like a smart advisor to the project manager and discipline leads. For example, if a predictive model forecasts that the project will be delayed by four weeks due to late piping isometrics, the prescriptive system could analyse various alternatives and suggest an optimal plan to recover time. It may recommend specific actions such as: “Increase the piping design team by two engineers for the next month” or “Fast-track the procurement of pipe supports using an alternate supplier”. These recommendations are not just guesses – they result from the system evaluating different scenarios (e.g. adding resources, working overtime, resequencing tasks) and identifying which option would minimize the delay with least cost impact. In this way, prescriptive analytics directly supports decision-makers by providing a clear path forward, not merely data. 

There are two modes in which prescriptive analytics often operates: decision support and decision automation. In decision-support mode, the analytics tool offers one or more recommended options, but a human decision-maker evaluates and chooses the final action. For instance, a prescriptive dashboard might present the project engineer with three options to address a predicted engineering delay (option A: add staff, option B: reduce scope in a non-critical area, option C: accept a later completion date with a penalty). The dashboard would show the projected outcome of each option (e.g. “Option A will likely recover 2 weeks at a cost of €50,000”). This helps the team weigh trade-offs and select an informed solution.

In decision-automation mode, by contrast, the system itself identifies the single best action (or automatically triggers it) based on predefined criteria. According to Gartner’s definition, prescriptive analytics “outputs a decision rather than a report or estimate”. This means the tool might automatically reschedule certain tasks or send an order request to a backup vendor when it detects an upcoming shortfall, without waiting for manual intervention. In a sense, the system acts like an autopilot for routine decisions – for example, automatically reordering materials when inventory drops below a threshold to prevent schedule delays. 
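Option evaluation of this kind reduces to comparing each option’s total impact under explicit assumptions. The toy sketch below illustrates the mechanics; every figure in it (recovery weeks, costs, the assumed cost of a week of residual delay) is hypothetical and would come from the project’s own estimating data in practice:

```python
# Hypothetical recovery options for a predicted 4-week delay; all figures illustrative.
options = [
    {"name": "A: add two piping engineers",   "weeks_recovered": 2.0, "cost_eur": 50_000},
    {"name": "B: descope non-critical area",  "weeks_recovered": 1.0, "cost_eur": 10_000},
    {"name": "C: accept late finish, penalty","weeks_recovered": 0.0, "cost_eur": 80_000},
]

DELAY_COST_PER_WEEK = 30_000   # assumed cost of each week of residual delay
PREDICTED_DELAY_WEEKS = 4.0

def total_impact(opt):
    """Direct cost plus the cost of the delay remaining after the option is applied."""
    residual = max(0.0, PREDICTED_DELAY_WEEKS - opt["weeks_recovered"])
    return opt["cost_eur"] + residual * DELAY_COST_PER_WEEK

best = min(options, key=total_impact)
for opt in options:
    print(f"{opt['name']}: total impact EUR {total_impact(opt):,.0f}")
print(f"Recommended: {best['name']}")
```

In decision-support mode the dashboard would show all three rows and let the engineer choose; in decision-automation mode it would act on the minimum-impact option directly. Real prescriptive engines replace this one-line cost model with constrained optimization over schedule, resources, and risk.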

Crucially, prescriptive analytics in dashboards is enabled by embedding business logic and rules into the data analysis. By codifying best practices and project constraints into the system, the dashboard can check for compliance and suggest actions accordingly. This kind of automated guidance greatly reduces errors and rework. For instance, an engineering dashboard can automatically flag any design that doesn’t meet the project’s specifications (say, a pump specified with the wrong voltage) and prescribe a corrective action (assign an electrical engineer to revise the specification). Catching such issues early prevents costly rework later in procurement or construction. The result is a higher first-time-right rate – more deliverables are correct and approved on the first submission, rather than coming back with comments. As one industry expert observed, automating routine project management tasks (like schedule updates and consistency checks) ensures fewer errors and frees up time for the team to focus on critical decisions. By providing an intelligent to-do list of actions, prescriptive dashboards optimize the workflow for project managers and discipline leads. 
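Such embedded rule checks are straightforward to express in code. Continuing the pump-voltage example from above – the tag numbers, the 400 V requirement, and the field names are all invented for illustration:

```python
# Hypothetical spec-compliance rule: flag designs violating a project specification
# and attach a prescribed corrective action to each violation.
PROJECT_SPEC = {"motor_voltage_v": 400}  # assumed project requirement

equipment = [
    {"tag": "P-101", "type": "pump", "motor_voltage_v": 400},
    {"tag": "P-102", "type": "pump", "motor_voltage_v": 690},
]

def check_compliance(items, spec):
    """Return flagged items, each with a prescribed corrective action."""
    actions = []
    for item in items:
        if item["motor_voltage_v"] != spec["motor_voltage_v"]:
            actions.append({
                "tag": item["tag"],
                "issue": f"voltage {item['motor_voltage_v']} V != spec "
                         f"{spec['motor_voltage_v']} V",
                "action": "assign electrical engineer to revise the specification",
            })
    return actions

for a in check_compliance(equipment, PROJECT_SPEC):
    print(a["tag"], "-", a["issue"], "->", a["action"])
```

The value lies less in the check itself than in running it automatically on every data change, so a non-compliant specification is caught the day it is entered rather than at procurement.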

Adopting prescriptive analytics requires a certain level of organizational data maturity and mindset change. It involves trust in data-driven recommendations and often new tools or processes. Not many EPC firms are consistently operating at this level yet, but interest is growing as the benefits become clear. Gartner estimates that predictive and prescriptive analytics together account for 40% of new investments in business intelligence and analytics software, reflecting a trend toward more advanced decision-support systems. In EPC projects, as schedules tighten and margins thin, the ability to not only predict problems but also get guidance on solving them is extremely valuable. Prescriptive analytics helps teams make the jump from “knowing what might happen” to deciding the best next step, supported by data. 

Cognitive analytics: “How do we improve?”

The final level in analytics maturity is cognitive analytics, sometimes referred to as AI-driven or self-learning analytics. Cognitive analytics uses advanced technologies – such as artificial intelligence, machine learning, and semantic analysis – to learn from data patterns and human feedback. Unlike other analytics that follow predefined queries or models, cognitive systems can adapt and improve over time as they absorb more data. In essence, a cognitive dashboard gets “smarter” with each project, enabling continuous improvement in insights and recommendations. 

In an EPC context, cognitive analytics might manifest as an intelligent project assistant or an enterprise dashboard that analyses all project data (structured and unstructured) to yield high-level strategic insights. Cognitive dashboards aim to identify emerging patterns or problems in advance – even those that project teams might not explicitly look for. For example, a cognitive system could sift through thousands of past project documents, emails, and incident reports to learn that certain combinations of factors (like a particular supplier and a certain type of design change) tend to lead to cost overruns. It can then alert project executives in new projects when it detects a similar pattern forming, effectively predicting future problems before they fully manifest. This goes beyond traditional predictive analytics by handling far more data types and finding subtle correlations that humans might miss. 

A key benefit of cognitive analytics is continuous improvement: because these systems use machine learning, the more data they process, the more accurate and refined their models become. For instance, if a cognitive dashboard initially recommends actions that turn out suboptimal, it will learn from that outcome. Over time and with more training data (e.g., results of past recommendations, project success metrics), the system adjusts its algorithms to make better decisions in the future. One can imagine a cognitive system in an EPC firm that after several projects has learned the optimal design review process to minimize errors, or the optimal timing to order equipment to avoid site delays. It would then continuously suggest improvements to project workflows, design standards, or procurement strategies based on what it has learned. This creates a feedback loop: each project’s data is not just an outcome to report, but becomes input to learn and optimize the next project. 
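The feedback loop can be illustrated, in drastically simplified form, as a system that tracks how often each recommended action actually succeeded and shifts its future recommendations accordingly. The action names and outcome histories below are invented; real cognitive systems learn over far richer feature spaces, but the learn-from-outcomes loop is the essential idea:

```python
# Toy feedback loop: track the observed success of each recommended action
# (1 = project outcome improved, 0 = it did not) and prefer the best record so far.
history = {
    "early_design_freeze": [1, 1, 0, 1],  # hypothetical outcomes from past projects
    "extra_review_cycle":  [0, 1, 0],
}

def success_rate(outcomes):
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

def recommend(history):
    """Pick the action with the highest observed success rate."""
    return max(history, key=lambda action: success_rate(history[action]))

print(recommend(history))  # currently favours 'early_design_freeze' (3/4 vs 1/3)

# Feedback: each new project's outcome is appended, shifting future recommendations.
history["extra_review_cycle"].append(1)
```

Each appended outcome is exactly the “input to learn and optimize the next project” described above – the recommendation is never frozen, it keeps updating as evidence accumulates.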

Cognitive analytics often incorporates elements like natural language processing (for reading reports or contracts), computer vision (for analysing photos from the construction site), and advanced simulation (digital twins of facilities). These capabilities enable a cognitive dashboard to provide richer insights. For example, an executive could query the system in plain language: “What risk factors should I worry about on this project based on past projects?” and the AI assistant could answer with data-backed insights (perhaps noting “Similar past projects had change order spikes due to late scope changes; consider freezing the design earlier or adding contingency.”). By bridging human expertise with machine intelligence, cognitive analytics blurs the line between decision support tools and autonomous reasoning systems. 

For EPC organizations, cognitive analytics is the frontier that promises long-term, enterprise-wide improvement. It helps institutionalize lessons learned: instead of each project team relearning mistakes, the AI system accumulates knowledge project by project. It can then proactively suggest ways to eliminate waste and improve efficiency – for example, highlighting that a certain design workflow consistently yields fewer errors (higher first-time-right) so it should be adopted as a best practice across all projects. Over multiple project cycles, these continuous improvements driven by cognitive insights can lead to significant gains in productivity, safety, and quality. In summary, cognitive analytics-enabled dashboards act as self-improving partners to EPC decision-makers, enabling a truly data-driven continuous improvement cycle in project execution.