
You don’t need to spend big to manage a mega-project

The term mega-project (roughly £1 billion and up) has earned an unenviable reputation for cost overruns, delays and transparency issues. Prime examples such as HS2, Berlin Brandenburg Airport and the Sydney Opera House, with their massive delays and soaring costs, highlight how rushing the fundamentals exacerbates the inherent complexity of these already massive projects. Contrast that with a project like the Walney Extension offshore windfarm, the largest in the world when completed, delivered on time and on budget through meticulous planning and coordination; the difference couldn't be greater.

The challenges facing mega-projects are numerous, but one of the core elements of successful delivery is accurate data and reporting. Given the enormous scale of the infrastructure and energy projects needed for the green transition, relying on outdated methods of manual data entry and opaque reporting is unsustainable. The traditional diet of repetitive, manually assembled PowerPoint presentations (we've all been there), disconnected spreadsheets and vague reporting does not stand up to the scrutiny of data-savvy stakeholders or the pressures of the timelines and budgets inherent in these energy mega-projects.

Thankfully you don’t need to buy expensive software to solve this and odds are you already have the tools. Do it right and the answer to common mega-project management challenges is remarkably simple.

Too fast, too soon

The urgency that often comes with large-scale initiatives frequently prompts stakeholders to rush through planning and development phases, mistakenly believing that speed will save time and money. It doesn’t.

Early-stage mega-projects commonly rely on manual data collection: think spreadsheets, PowerPoint presentations and isolated reports compiled independently by various consultants and contractors. This fragmented approach not only creates a reporting cottage industry churning out update after update, but is typically riddled with optimism bias and prevents stakeholders from seeing what is actually happening, causing significant gaps in transparency and accountability.

Sometimes this stems from teams fudging reports to present a rosier picture than reality. Managers often end up relying on what they're told because they simply don't have the data to hand. More commonly, though, it's that no one, whether teams or management, truly understands the full picture: they're making decisions on outdated information, operating in silos and firefighting so relentlessly they can't take a step back. By the time someone spots a problem, it's no longer about prevention; it's damage limitation.

It’s easier now

Ten years ago, talk of systems integration or data pipelines would have left most project managers scratching their heads. Since then, platforms such as Power BI and now Fabric have grown in capability and accessibility. They're game changers for project delivery, real-time decision-making and accountability, and they enhance a project's ability to simulate scenarios and validate designs.

With the vast amount of free training material online, anyone with a couple of days and access to a genAI tool (ChatGPT, Le Chat, etc.) can start developing live dashboards. Instead of trundling through bloated reporting packs manually compiled by overstretched teams, managers can now get near real-time visibility of their project's or programme's budget, schedule and risk in a single streamlined location. Less reactive 'status chasing', more proactive course correction, allowing project teams to address small issues before they balloon.

The simple clarity these tools can give you by connecting data sets from systems such as Primavera P6, Oracle/SAP and ArcGIS means you're no longer stitching together conflicting reports; you get a single source of truth. That visibility isn't just for programme managers either. Delivery teams, suppliers and even non-technical stakeholders can be given access to live, shareable, secure information.
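As a rough illustration of what that single source of truth looks like in practice, here's a minimal sketch in Python that joins a schedule export to a cost export on a shared work breakdown structure (WBS) code. The column names and figures are hypothetical stand-ins for exports from tools like Primavera P6 or an ERP system, not any real project data:

```python
import pandas as pd

# Hypothetical schedule export (e.g. from a planning tool)
schedule = pd.DataFrame({
    "wbs_code": ["1.1", "1.2", "2.1"],
    "activity": ["Site prep", "Foundations", "Turbine install"],
    "pct_complete": [100, 60, 10],
})

# Hypothetical cost export (e.g. from an ERP system)
costs = pd.DataFrame({
    "wbs_code": ["1.1", "1.2", "2.1"],
    "budget": [2_000_000, 5_000_000, 12_000_000],
    "actual_cost": [1_900_000, 3_600_000, 1_500_000],
})

# The shared WBS code is the join key: this is exactly why consistent
# naming conventions matter. An outer join with indicator=True flags
# any rows that fail to match instead of silently dropping them.
merged = schedule.merge(costs, on="wbs_code", how="outer", indicator=True)
unmatched = merged[merged["_merge"] != "both"]

print(merged[["wbs_code", "activity", "pct_complete", "budget", "actual_cost"]])
print(f"Unmatched rows: {len(unmatched)}")
```

A dashboard tool like Power BI does the same join behind the scenes; the point is that one consistent key across systems turns two conflicting reports into one reconcilable view.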

Having high-quality, integrated data also opens the door to practices like Building Information Modelling (BIM) and digital twins (virtual models of physical assets). By creating these, project teams can run detailed simulations, test various scenarios and validate their designs before construction begins. This reduces risk, enhances resource allocation and accelerates decisions.

Used effectively with the right data foundations, these easily accessible tools and systems can establish transparency from the outset, driving greater accountability, reducing disputes and enhancing credibility with stakeholders.

Set intention from the start

As much as the tech has improved, process, people and timing still matter. Rushing straight to a flashy dashboard or predictive AI without laying the right behavioural and data foundations is counterproductive. This is where taking the time during the development of a project to plot the way forward is key. Some of the fundamental first steps include:

  • Clear data conventions and standards: Set clear naming conventions, work breakdown structures and expected formats from day one. This ensures data is consistent regardless of where it comes from and can be merged, analysed and tracked across suppliers, phases and platforms.
  • Data governance and ownership: Define who owns the data, who can enter or change it, and when it's needed by. Without this you risk unauthorised changes and unclear accountability, especially across the sprawling supply chains typical of mega-projects.
  • Prioritise integration: There are plenty of great specialist systems out there, but if your finance data can't talk to your scheduling data, you end up with data silos and manual workarounds (we've all seen people try to fudge the data). Choose systems that can share data easily via APIs or connectors, and plan your integration workflows from the start.
  • Set your performance and reporting KPIs from the start: Define what success looks like (cost performance, schedule adherence, risk exposure, etc.) and work out what data you'll need to track it. Realising halfway through that you want to monitor CPI but didn't plan for a cost-loaded programme can be an expensive and time-consuming fix. Getting everyone aligned on performance expectations from the start is key.
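To make the CPI point concrete, here's a tiny worked example using the standard earned value formulas (CPI = earned value / actual cost, SPI = earned value / planned value). The figures are purely illustrative; the calculation only works if the programme is cost-loaded, which is exactly why it needs planning up front:

```python
# Illustrative earned value calculation (figures are made up, not
# taken from any real project).
budget_at_completion = 10_000_000   # total budgeted cost (BAC)
pct_complete = 0.40                 # physical progress to date
planned_pct = 0.50                  # progress planned by this date
actual_cost = 5_000_000             # actual cost incurred (AC)

# Earned value: the budgeted worth of the work actually done.
earned_value = budget_at_completion * pct_complete
# Planned value: the budgeted worth of the work scheduled by now.
planned_value = budget_at_completion * planned_pct

cpi = earned_value / actual_cost     # < 1 means over cost
spi = earned_value / planned_value   # < 1 means behind schedule

print(f"CPI: {cpi:.2f}, SPI: {spi:.2f}")
```

Here both indices come out at 0.80: every pound spent is earning 80p of planned work, and the project has done 80% of the work it should have by now. Neither number can be produced at all without cost-loaded, consistently coded data.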

Lay these straightforward data foundations early and you unlock accurate cost modelling, agile scheduling, and meaningful risk mitigation. It creates the conditions for multiple teams and stakeholders to work off the same, near real-time planning, finance and design data, transforming decision-making along the way.

So what?

We no longer have to rely on disjointed spreadsheets and manually curated reports to navigate mega-project complexity. Modern data platforms and integration tools have matured into powerful, easily accessible solutions that your company likely already uses. Implemented with intent, they can provide the visibility, accountability and agility that mega-projects so badly need, without spending millions on fancy kit.

The future of these projects doesn’t have to be late, over budget, or chaotic. With the right data foundations in place, mega-projects can leverage the vast swathe of accessible tools and systems to help break free of their historical baggage and deliver with credibility and confidence.