Events and Conferences
Three Ways Data is Reshaping Asset Management: Our Takeaways from AMPEAK 2026
Last week, the AssetFuture team joined hundreds of asset management professionals at AMPEAK 2026 in Perth. It was a packed few days of presentations, conversations and peer-reviewed research, and we were proud to contribute three sessions of our own. Here's what our team presented, and why it matters for anyone managing assets at scale.
UnitingCare Queensland: What happens when you go from complexity to clarity
UnitingCare Queensland manages over $3 billion in assets and employs the second-largest workforce in the state. When they began their asset management journey, they were starting from scratch.
Key takeaway: UCQ's experience shows that a structured, data-first approach to asset management can deliver real, measurable savings while shifting an organisation from reactive to proactive management.
Using the GFMAM evaluation framework, UCQ assessed their maturity across governance, people, process, data and technology. The gaps were clear: they needed stronger data structures and better analytics to support evidence-based decisions.
That led to the creation of the Asset Condition and Performance (ACAP) program. ACAP introduced a unified data dictionary and taxonomy, an industry-aligned condition grading framework, and a repeatable training package designed to build internal capability across their Facilities Management teams.
Going deeper
Partnering with AssetFuture, UCQ brought their ACAP datasets into a digital platform that enabled scenario-based planning, risk-weighted capital allocation and long-term financial forecasting. For the first time, they had clear visibility over renewal demand, investment priorities and service outcomes across a 10-year horizon.
The results speak for themselves. UCQ achieved approximately 20% cost savings through their prioritisation framework, with decision-making efficiency improving by more than 30%. Beyond the numbers, they've driven a genuine cultural shift, embedding ISO 55000-aligned practices and moving from a 70/30 reactive-to-proactive maintenance split toward a target of 40/60. Reactive work is trending down as proactive management takes hold.
For organisations with large, complex portfolios, UCQ's journey is a strong proof point. You don't need perfect data to start. You need a structured approach, the right tools and a commitment to building capability from within.
A probabilistic approach to asset degradation modelling
Most asset management systems take a deterministic approach to predicting when assets will fail. They rely on the manufacturer's design life, and when that clock runs out, the asset is flagged for replacement. It's straightforward, but it doesn't reflect how assets actually behave in the real world.
Key takeaway: By shifting from deterministic to probabilistic degradation modelling, organisations can forecast replacement costs more accurately and allocate capital more efficiently, replacing what's likely to fail rather than everything that's "due."
Hanbit Cho, a researcher at AssetFuture, presented a peer-reviewed paper proposing an alternative approach. Instead of treating design life as a hard deadline, the model uses a Weibull Cumulative Distribution Function to calculate the likelihood of an asset reaching end-of-life based on its actual condition at replacement.
That likelihood is then used to calculate an Expected Value for each asset, quantifying the cost associated with the probability of failure. An adjustable risk threshold determines which assets are prioritised for replacement.
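The mechanics described above can be sketched in a few lines. This is an illustrative sketch, not AssetFuture's implementation: the `Asset` fields, the shape and scale parameters, and the `prioritise` helper are all assumptions chosen for demonstration, and a production model would fit those parameters to historical condition data.

```python
import math
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    age_years: float
    replacement_cost: float

def weibull_cdf(t: float, shape: float, scale: float) -> float:
    """Probability that an asset has reached end-of-life by age t."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def expected_value(asset: Asset, shape: float, scale: float) -> float:
    """Failure probability weighted by replacement cost."""
    return weibull_cdf(asset.age_years, shape, scale) * asset.replacement_cost

def prioritise(assets, shape, scale, risk_threshold=0.5):
    """Flag only assets whose failure probability exceeds the threshold."""
    return [a for a in assets
            if weibull_cdf(a.age_years, shape, scale) >= risk_threshold]
```

Raising or lowering `risk_threshold` is what drives the cost differences discussed below: a lower threshold pulls in many assets that are "due" but unlikely to fail, so the flagged replacement cost grows much faster than the actual failure risk.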
Going deeper
The model was tested across 27 distinct asset types using historical condition assessment data. Four cost scenarios were compared, ranging from expected replacement cost with a risk threshold through to full replacement cost without one.
The results were clear. The expected replacement cost with a 50% risk threshold was the most economical scenario, because the cost is directly proportional to the number of assets expected to fail, not the total portfolio value. When the risk threshold dropped to 25%, the estimated cost jumped by 110%, driven by the sharp, non-linear acceleration in failure probability as assets enter their high-risk phase.
What makes this approach practical is that it works across a wide range of everyday asset types, from lights and taps to carpet and power outlets. It doesn't require bespoke deterioration models for each material type. And by forecasting future condition ratings in yearly increments, the model can project replacement costs annually, giving organisations a data-driven foundation for capital planning.
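The annual projection idea can be sketched as follows: the expected spend in each future year is the increment in failure probability over that year, weighted by replacement cost, summed across the portfolio. Again, this is a hedged sketch under assumed Weibull parameters, not the published model.

```python
import math

def weibull_cdf(t: float, shape: float, scale: float) -> float:
    """Probability that an asset has reached end-of-life by age t."""
    return 1.0 - math.exp(-((t / scale) ** shape))

def annual_expected_costs(assets, shape, scale, years=10):
    """Expected replacement spend per year over the planning horizon.

    assets: iterable of (current_age_years, replacement_cost) pairs.
    Returns a list of length `years`, one expected cost per year.
    """
    costs = []
    for y in range(1, years + 1):
        total = 0.0
        for age, cost in assets:
            # Probability the asset fails during year y specifically
            p_before = weibull_cdf(age + y - 1, shape, scale)
            p_after = weibull_cdf(age + y, shape, scale)
            total += (p_after - p_before) * cost
        costs.append(total)
    return costs
```

Because the Weibull hazard accelerates as assets age, the per-year increments grow sharply once the portfolio enters its high-risk phase, which is the non-linear behaviour behind the 110% cost jump at the lower threshold.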
Reducing the cost of creating an asset register through data extrapolation
Building a comprehensive asset register is one of the most expensive steps in setting up an asset management program. Traditional methods require a full on-site audit, which is accurate but resource-intensive and time-consuming.
Key takeaway: Data extrapolation can achieve comparable accuracy to a full assessment while costing 53% less and finishing four weeks sooner.
Wing-Tim Choi, a researcher at AssetFuture, presented a peer-reviewed paper outlining an alternative methodology. Rather than auditing every room and every asset, the approach involves assessing a percentage of rooms per area type, determining the most common assets and their average condition, then calculating relative quantities per square metre of floor area. Those ratios are applied to the remaining rooms to generate the full register.
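The extrapolation step can be sketched as computing an asset density (quantity per m²) for each area type from the sampled rooms, then applying those densities to the full portfolio's floor area. This is a minimal sketch under assumed data structures, not the paper's methodology in full; it omits the condition-grading side entirely.

```python
from collections import defaultdict

def extrapolate_register(sampled_rooms, all_rooms):
    """Estimate portfolio-wide asset quantities from a sampled subset.

    sampled_rooms: list of {"area_type": str, "floor_m2": float,
                            "assets": {asset_name: quantity}}
    all_rooms:     list of {"area_type": str, "floor_m2": float}
                   covering the whole portfolio.
    Returns {asset_name: estimated_total_quantity}.
    """
    # Asset density (quantity per m2) per area type, from the sample
    counts = defaultdict(lambda: defaultdict(float))
    sampled_area = defaultdict(float)
    for room in sampled_rooms:
        sampled_area[room["area_type"]] += room["floor_m2"]
        for name, qty in room["assets"].items():
            counts[room["area_type"]][name] += qty

    # Total floor area per area type across the whole portfolio
    total_area = defaultdict(float)
    for room in all_rooms:
        total_area[room["area_type"]] += room["floor_m2"]

    # Apply per-m2 densities to the full floor area
    register = defaultdict(float)
    for area_type, assets in counts.items():
        for name, qty in assets.items():
            density = qty / sampled_area[area_type]
            register[name] += density * total_area[area_type]
    return dict(register)
```

Grouping densities by area type is what makes the method workable across mixed portfolios: an office floor and a plant room have very different asset mixes, so a single portfolio-wide ratio would blur them together.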
Going deeper
The methodology was validated against two completed projects in the residential and tertiary education sectors. At each level of assessment completion (from 10% through to 90%), the extrapolated data was compared against the actual full-assessment results across three variables: item composition weighted by replacement cost, quantity difference per item per area type, and total replacement cost difference.
The results showed a clear linear relationship between assessment completion and accuracy. At 50% completion, residential projects achieved 78% accuracy and tertiary education projects reached 69%. The variance between the two comes down to asset diversity. Education facilities tend to have a wider spread of asset types per area, which means the extrapolation model needs more data points to capture that complexity.
For organisations looking to establish an asset register without the cost of a full comprehensive assessment, this approach offers a practical middle ground. It provides a reliable cost estimate for capital and operational planning, while significantly reducing the time and resources needed to get there.
What ties it all together
All three presentations reflect the same underlying principle: better data leads to better decisions. Whether it's UCQ transforming their entire asset management culture, probabilistic models replacing rigid design-life assumptions, or extrapolation making asset registers more accessible, the thread is consistent.
At AssetFuture, we believe the future of asset management is data-driven, and we're committed to building the research and the tools that make that future practical.