AI Interest Is Rising, But Data Foundations Still Hold Back Value in Mining and Civil Geoscience
Seequent, the Bentley Subsurface company, has released its 7th Geoprofessionals Data Management Report, and the message is clear: teams are dealing with complex, multisource datasets across many software platforms while spending too much time on routine administration. The global survey of 1,000+ geoprofessionals highlights ongoing issues with data quality, integration across tools, and incomplete or unmanaged historical data.
According to the report, geoprofessionals spend over a quarter of their time on data management. As Seequent Chief Customer Officer Angela Harvey put it, teams want to use data for competitive advantage, but limited frameworks mean time goes to wrangling data instead of interpreting results. Many organisations still lack a centralised single source of truth.
AI Momentum Is Real, but It Needs Better Data
Across industries, 51% of organisations are using or considering AI, up from 30% two years ago. "Data is the most valuable asset of any organisation," Harvey said, adding that the real opportunity now is building the foundations that let AI deliver efficiency and more sustainable outcomes. The appetite is there; the plumbing isn't.
Mining: High Priority, But Frameworks Lag
In mining, 80% of geoprofessionals rate data management as being of high or critical importance. They spend almost a third of their time on it, yet only 39% of mining organisations have a defined data management framework.
"In mining, data isn't just a byproduct of operations but the core asset that drives every decision, from exploration to reclamation," said Dr Janina Elliott, Segment Director, Mining, at Seequent. The next challenge is clear: extract more value from current and historical data as AI and automation become more important.
What Management Should Do Next
Your job is to reduce time spent on busywork, improve decision quality, and make the process repeatable. That starts with clear ownership, consistent standards, and a plan to clean and connect high-value datasets.
- Establish a single source of truth: Centralise core geodata in a governed platform. Define domains, owners, retention rules, and change control.
- Formalise a framework: Adopt a proven model (for example, DAMA-DMBOK) to set standards for metadata, quality checks, access, and lineage.
- Fix historical data: Inventory what exists, prioritise the datasets that drive key decisions, clean them, and document assumptions so the context isn't lost again.
- Standardise integrations: Consolidate overlapping tools where possible. Use APIs and ETL/ELT pipelines to cut manual handoffs and spreadsheet sprawl.
- Measure and incentivise: Track time spent on data administration, data quality scores, and rework rates. Tie improvements to team goals.
- Pilot AI where data is strong: Start with narrow use cases (core logging, drillhole QA/QC, geotech classification). Keep humans in the loop and require interpretable outputs. A minimal QA/QC sketch follows this list.
- Upskill the team: Ensure managers and specialists understand data governance and AI basics.
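To make the QA/QC pilot concrete, below is a minimal sketch of automated interval checks on a drillhole table, written in Python with pandas. The column names (hole_id, from_m, to_m, au_ppm) and the assay range are illustrative assumptions rather than any standard schema; the point is that a handful of codified rules catches the errors that otherwise surface late, during interpretation.

```python
# Minimal drillhole interval QA/QC sketch (pandas).
# hole_id / from_m / to_m / au_ppm are assumed, illustrative column names;
# adapt the rules and thresholds to your own logging schema.
import pandas as pd

def qaqc_intervals(df: pd.DataFrame) -> pd.DataFrame:
    """Return the rows of a from/to drillhole table that fail basic checks."""
    df = df.sort_values(["hole_id", "from_m"]).copy()

    # Rule 1: interval length must be positive.
    df["bad_length"] = df["to_m"] <= df["from_m"]

    # Rule 2: an interval must not overlap the previous interval in the same hole.
    prev_to = df.groupby("hole_id")["to_m"].shift()
    df["overlap"] = prev_to.notna() & (df["from_m"] < prev_to)

    # Rule 3: assay values must be present and within a plausible range
    # (0-1000 ppm Au here is a placeholder threshold, not a recommendation).
    df["bad_assay"] = df["au_ppm"].isna() | (df["au_ppm"] < 0) | (df["au_ppm"] > 1000)

    return df[df[["bad_length", "overlap", "bad_assay"]].any(axis=1)]

intervals = pd.DataFrame({
    "hole_id": ["DH001", "DH001", "DH001", "DH002"],
    "from_m":  [0.0, 1.5, 1.0, 0.0],
    "to_m":    [1.5, 1.0, 3.0, 2.0],   # one row has to_m < from_m
    "au_ppm":  [0.2, 0.4, None, 1.1],  # one row is missing its assay
})
print(qaqc_intervals(intervals))
```

Checks like these are cheap to run on every load, which is exactly what makes them a good first AI-adjacent pilot: the same governed, rule-checked table becomes the training and evaluation data.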
A 90-Day Plan That Actually Moves the Needle
- Baseline: quantify time spent on data tasks and pinpoint the top three pain points.
- Select two high-impact datasets and define standards for schema, metadata, and quality checks.
- Stand up a central repository and automate one pipeline from source to consumption (see the pipeline sketch after this list).
- Run a small AI pilot on a well-governed dataset with clear success criteria.
- Review results, document what worked, and scale with an agreed playbook.
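As a sketch of the "automate one pipeline" step, here is a minimal source-to-repository load with a quality gate and basic lineage, again in Python. The file name, table name, and required columns are assumptions for illustration, not a prescription for any particular tool; the shape is what matters: validate on the way in, and record where the data came from.

```python
# Minimal pipeline sketch: CSV export -> quality gate -> central repository.
# SOURCE_CSV, REPO_DB, and the required column set are illustrative placeholders;
# a production pipeline would add logging, scheduling, and schema migration.
import sqlite3
import pandas as pd

SOURCE_CSV = "collars_export.csv"   # assumed export from a logging tool
REPO_DB = "geodata_repo.sqlite"     # stand-in for the central repository

def run_pipeline() -> None:
    df = pd.read_csv(SOURCE_CSV)

    # Quality gate: refuse the load if mandatory fields are missing.
    required = {"hole_id", "easting", "northing", "elevation"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"Source schema is missing columns: {sorted(missing)}")

    # Record lineage alongside the data so consumers can trace it back.
    df["source_file"] = SOURCE_CSV
    df["loaded_at"] = pd.Timestamp.now(tz="UTC").isoformat()

    with sqlite3.connect(REPO_DB) as conn:
        df.to_sql("collars", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    run_pipeline()
```

Even at this scale, the two habits the 90-day plan asks for are visible: the schema check is the defined standard, and the lineage columns are what make the later review and scale-up auditable.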
The Payoff
Less time on data chores. Faster cycles from exploration to design. Better auditability and handovers across projects. And when AI is applied, it has clean, connected data to work with, so the results actually help your people make better calls.
The signal from the report is simple: commit to core data practices now, and the value from AI follows. Delay the basics, and you'll keep paying the busywork tax.