Nathan Jones

Founder, Calon; 28 years in data across FMCG, industrial, and retail

linkedin.com/in/nathanjonesdata

Why We Built Backbone

You've done the smile. You know the one.

A senior leader comes back from a conference, or finishes a podcast, or reads a thread on LinkedIn, and they've heard something. About agents that write pipelines. About Claude Code. About how AI is going to do the whole thing end to end, and by the way, why isn't your team already doing this?

So you smile. You say you'll look into it. And then you go back to your desk, back to the warehouse that the last team built and the last team understood, back to the three definitions of revenue that have coexisted in production for longer than anyone wants to admit, back to the data model that lives on a wall somewhere and hasn't been true for eighteen months.

The pressure to do more with AI is real. The foundations that would make AI useful don't exist yet. And nobody in that conference room is going to help you build them.

This is the situation for data leaders running analytics at scaling consumer goods companies. They're not failing because they lack talent or ambition. They're failing because the foundations underneath their data platform were never properly built, and every new tool, model, or initiative they layer on top makes the problem harder to fix.


The industry has been selling a lie. Not a malicious one, just an optimistic one, the kind that feels true when you're pitching it.

The lie goes like this: give your data the right semantic layer and a good chatbot, and the business will finally be able to answer its own questions. Build a great dashboard. Hook it up to a language model. Done.

But a semantic layer sitting on top of unresolved data is just a faster way to get the wrong answer. And in a consumer goods business with multiple ERP systems, acquired brands with different charts of accounts, and a finance team that still reconciles in Excel every month, “unresolved data” is not the exception. It's the baseline.

I've worked with global data teams for twenty-eight years. In a large consulting firm and outside of one. Across FMCG, industrial, retail. Every single build, even the well-resourced ones, started from scratch. Not because the problems were new. Because no one had built reusable structure for the parts of the problem that don't change.

Here's what doesn't change: the shape of an order-to-cash process is roughly the same across every consumer goods company. The entities in a product master are roughly the same. The way you integrate sales data with finance data, the logic that resolves naming inconsistencies between systems, the layering structure that keeps transformation logic auditable and testable. This has all been solved before. Dan Linstedt solved most of the integration problem with Data Vault in the early 2000s. The medallion architecture has been refined across thousands of implementations.

And yet every team is reinventing it. Every project is 95% custom build for problems that are not actually custom. Because there was no platform that encoded what works (the opinionated, repeatable structure that senior data architects carry in their heads after a decade of doing this) and turned it into something you could actually use.

That was always a problem. AI just made it urgent.


Backbone is what we built in response to all of it.

It starts where data work should always start: with the business. What decisions does this company need to make? What does profitable growth actually mean for this team, which markets, which SKUs, which channels? What are the jobs that finance, commercial, and operations need data to do? We call this the ontology layer: a structured definition of what the business is trying to understand, before a single line of SQL gets written.

From that foundation, Backbone generates the data warehouse. Not by writing code you then have to maintain. By encoding the architectural decisions that don't change into a metadata-driven engine. The data vault foundation. The integrated views across systems. The data products that make sense of your commercial, operations, and finance data together, not separately. And the semantic layer and MCP server that sit on top, so that the AI tools your business wants to use have something clean and well-documented to work with.
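To make "metadata-driven engine" concrete, here is a deliberately tiny sketch of the idea, with entirely hypothetical names (the hub spec and `generate_hub_ddl` are illustrative, not Backbone's actual interface): the entity is defined once as metadata, and the repetitive Data Vault structure is generated from it rather than written by hand.

```python
# Hypothetical sketch of metadata-driven generation. The spec format and
# function below are illustrative only; they are not Backbone's real API.

HUB_SPEC = {
    "name": "customer",
    "business_key": "customer_number",
    "sources": ["erp_emea", "erp_apac"],  # systems this entity resolves across
}

def generate_hub_ddl(spec: dict) -> str:
    """Emit the DDL for a Data Vault hub from an entity's metadata."""
    return (
        f"CREATE TABLE IF NOT EXISTS hub_{spec['name']} (\n"
        f"    hub_{spec['name']}_key VARCHAR NOT NULL,\n"
        f"    {spec['business_key']} VARCHAR NOT NULL,\n"
        f"    record_source VARCHAR NOT NULL,\n"
        f"    load_ts TIMESTAMP NOT NULL\n"
        f");"
    )

print(generate_hub_ddl(HUB_SPEC))
```

The point of the sketch: when the structure is generated from metadata, changing the model means changing the spec, not hunting through hand-written SQL across a dozen pipelines.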

Your data stays in your Snowflake. Your pipeline descriptions and code live in your Git. We're not building a black box, we're building a backbone. Structure you own, that you can extend, that your engineers understand, and that your analysts can actually trust.


We are opinionated. Let us be clear about that.

We believe your data model is not as unique as you think. The entities are the same. What's genuinely yours are the specifics: the dimensions that matter to your brand, the rules behind your margin metrics, the logic that makes your promotional data make sense. That's what we capture. That's what we generate around. But the structure underneath is not a competitive advantage. Treating it like one just means you rebuild it badly every time.

We believe data models are not static artifacts. They're living definitions of how a business understands itself. When your metrics change, and they will because your business changes, your data model needs to change with them. We built Backbone to be updated, versioned, and extended. Not to be printed, framed, and forgotten on a wall.

We believe the industry has been over-investing in the visible part of the stack and under-investing in what sits beneath it. Dashboards are easy to explain. Governance, lineage, data quality, and architectural standards are not. But dashboards built on bad foundations are not a product; they're a liability. The metric that can't be trusted is worse than no metric at all.

We believe AI will not replace the need for a clean data model, it will make the need more urgent. An LLM needs context. It needs definitions. It needs to know what “revenue” means in your business, which source system to trust when two disagree, and what the grain of your customer table actually is. Backbone generates that context automatically, keeps it current, and surfaces it through an MCP server that your AI tools can use. That's not a feature. That's the point.

We believe coding is not going away, but writing code by hand is. The future of data engineering is metadata-driven generation, LLM-assisted validation, and human oversight of decisions, not human generation of boilerplate. We built Backbone to be on the right side of that shift.


“Getting the numbers right is a non-negotiable; we need it to make the decisions that drive profitable growth. I can't put a number on it.”

— CEO, 600-person global consumer products group

Clean, honest data is non-negotiable. Every CEO of a scaling consumer goods business knows this, even if they can't put a number on it. The decisions that drive profitable growth depend on numbers that can be trusted when the pressure is highest: which markets to double down on, which SKUs to cut, which channel mix to optimise.

Most data platforms weren't built to provide that. They were built incrementally, under time pressure, by teams that have since moved on, using approaches that made sense in isolation but never held together as a whole.

Backbone is our answer to that. Not a chatbot on top of the mess. A foundation underneath it, opinionated, repeatable, and built for the moment the CEO walks back into the room and asks again.

That moment is coming. We think you should be ready for it.