In Part 1 of this series, we cracked open the AI landscape using the analogy of vehicles. We looked at the engines that power AI, the different sizes and types of engines built for different jobs, and the vehicles that put those engines into your hands. The takeaway: AI isn’t one thing. It’s a whole industry, with engine makers, vehicle builders, and bolt-on accessories all targeting different drivers.
That mental model is the easy part. The hard part starts when a business actually decides to run a fleet.
What does it cost to keep these vehicles on the road? Whose data are they carrying, and who else can see it? How do they connect to each other so the whole organisation gets the benefit, rather than just the team that bought the tool?
This is where most AI projects either pay off or quietly fall over. Picking the right vehicle gets the headlines. Building the supply chain around it is what actually delivers value.
Let’s keep going.
Every vehicle needs fuel. In AI, that fuel is tokens: chunks of text the model reads and writes. Every question you ask, every document you upload, and every word the AI generates back burns tokens. Long inputs cost more than short ones. Verbose answers cost more than brief ones.
The car analogy holds beautifully here. Bigger, more powerful engines drink more fuel, and they drink premium. Smaller, efficient engines sip the cheap stuff. Heavy cargo (long documents, big datasets) burns more fuel than a quick errand (a one-line question).

The price gap is bigger than most people realise. A query to a top-tier “thinking” engine like Claude Opus or GPT-5.5 Pro can easily cost ten to twenty times more per token than the same query to a lightweight model like Nano, Haiku or Gemini Flash. Run that across thousands of queries a day and the difference between the right engine and the wrong one is the difference between a manageable bill and a runaway one.
Two things drive costs up faster than people expect. The first is long inputs: uploading a 100-page PDF and asking “summarise this” costs more than most users realise, because the model reads every page before writing a word. The second is chatty back-and-forth: every follow-up question replays the whole conversation history, like reloading the same cargo for every trip.
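To make the second point concrete, here is a rough sketch of why long conversations get expensive. The price is a made-up placeholder, not any vendor’s real rate; the point is the shape of the growth, not the dollar figure.

```python
# Rough illustration of how chat history inflates token usage.
# The price below is a hypothetical placeholder, not a real rate.

PRICE_PER_1K_INPUT_TOKENS = 0.01  # hypothetical premium-model rate

def tokens_for_conversation(turn_lengths):
    """Each new turn resends the entire history, so input tokens
    grow roughly quadratically with the number of turns."""
    total = 0
    history = 0
    for turn in turn_lengths:
        history += turn   # the history grows by each new message
        total += history  # the whole history is sent every turn
    return total

# Ten follow-up questions of ~200 tokens each:
total = tokens_for_conversation([200] * 10)
print(total)  # 11000 tokens billed, not 2000
print(round(total * PRICE_PER_1K_INPUT_TOKENS / 1000, 2))
```

Ten short follow-ups don’t cost ten short questions’ worth of tokens; they cost five and a half times that, because the cargo gets reloaded for every trip.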
Most business users won’t see this directly. You’ll pay a flat monthly subscription for ChatGPT, Claude, Gemini or Copilot, and the fuel costs are bundled in. But behind the scenes, that’s exactly what’s happening. It’s also why the same provider charges very different prices for their top-tier and lightweight models, and why “unlimited” plans usually have quiet caps on the premium engines once you push them hard enough.
The takeaway for business isn’t “use the cheapest engine”. It’s “run a fleet”. Use the V12 for the hard reasoning jobs that actually need it. Use the V6 diesel workhorse for everyday work. Use the hatchback for high-volume jobs where speed and cost matter more than depth. The smartest organisations don’t pick a single AI. They match the engine to the journey.
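In practice, matching the engine to the journey is often just a routing rule. The sketch below is illustrative only: the model names and thresholds are invented stand-ins, not a recommendation for any particular product.

```python
# A minimal sketch of "running a fleet": route each job to a model tier
# based on what it actually needs. Names and thresholds are illustrative.

def pick_model(task_type: str, input_tokens: int) -> str:
    if task_type in {"deep-analysis", "legal-review"}:
        return "premium-reasoning-model"  # the V12
    if input_tokens > 50_000:
        return "large-context-model"      # the heavy hauler
    return "fast-cheap-model"             # the hatchback

print(pick_model("faq-answer", 300))       # fast-cheap-model
print(pick_model("legal-review", 12_000))  # premium-reasoning-model
```

A dozen lines of routing logic like this, sitting in front of the fleet, is often the single biggest lever on the monthly bill.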
If the engine moves the vehicle and the fuel keeps it running, what is it actually carrying? Your data. Documents, emails, sales records, customer histories, internal knowledge, meeting notes. Every bit of context you feed the AI.

This is the part most businesses underestimate. AI doesn’t just carry cargo; it transforms it. It sorts emails into priorities. It compresses a 200-page contract into a one-page brief. It pulls insights out of customer feedback. It bundles raw data into recommendations.
In effect, AI is less a smart tool and more a logistics system for your data. The value isn’t the vehicle, it’s how efficiently you move, transform, and deliver the cargo.
And not all cargo is the same. Clean, structured cargo (spreadsheets, CRM records) sits neatly on a pallet, perfect for tools like Copilot for Excel. Messy unstructured cargo (PDFs, emails, meeting notes) is the mixed-box delivery van work, better suited to ChatGPT or Claude. Heavy or oversized cargo (long videos, huge datasets, full codebases) needs a specialist rig with a serious engine. Using that rig for a quick errand, like taking a road train to pick the kids up from school, works, but it’s wasteful, expensive, and not as efficient as the right tool for the job.
For any business handling sensitive information, this is the question that matters most: how protected is the cargo?
Enterprise vehicles like Microsoft 365 Copilot for Business, ChatGPT Enterprise, and Claude for Work are built like armoured trucks. Strong governance, data ringfenced inside your own environment, audit trails, compliance baked in. They’re slower to get moving, but the cargo is well protected.
The consumer versions of those same products (free tiers and standard paid tiers like ChatGPT Plus or Claude Pro) are a different vehicle entirely. More like a ute with no canopy: fast, capable, brilliant for most jobs, but you’d think twice before loading the tray with anything truly sensitive. Same brand, very different vehicle.

This is also where most businesses quietly get into trouble. An employee expenses $20 a month for ChatGPT Plus, starts pasting in customer data and internal documents, and the company has no visibility, no contract, and no protection. The cargo is moving on a personal vehicle the business doesn’t even know exists.
Three questions are worth asking before you load anything onto an AI: where does the data go, who else can see it, and what agreement protects it?
This is also where open-source engines become genuinely interesting. Running a Llama or DeepSeek model in your own environment is the equivalent of keeping your trucks in your own depot. More work to maintain, but the cargo never leaves the yard.
The right answer for most businesses isn’t one vehicle or one approach. It’s a fleet, with the right vehicle assigned to the right cargo and the right job. Which raises a bigger question: how do you want to assemble that fleet?
Some businesses buy everything from one manufacturer. They pick Microsoft, or Google, or Anthropic, and run their entire AI capability inside that ecosystem. The advantage is consistency: one vendor, one contract, one support line, one governance model. The disadvantage is lock-in: you get whatever that manufacturer is best (and worst) at.
Others build their own fleet using a platform layer. Microsoft Foundry, Amazon Bedrock, and Google Vertex AI all let businesses pick and choose engines from multiple providers (Claude, GPT, Llama, Mistral) and assemble their own custom vehicles on top. More flexibility, more work to manage.
Others again pick best-in-class for each job. Claude for writing, Perplexity for research, GitHub Copilot for code, Harvey for legal, Agentforce for sales. The advantage is excellence in each lane. The disadvantage is complexity: more contracts, more logins, more integrations to maintain.
There’s no single right answer. Whichever fleet model you choose, the bigger question is how the cargo flows through it.
Real cargo rarely makes a single trip in a single vehicle. It moves through a logistics network: collected, sorted, transformed, delivered.
AI works the same way, and not just within a single team. Across a modern business, AI is now embedded into nearly every functional pillar, from sales and marketing to finance, HR, legal, and engineering, with specialist tools running each one.
Each of these is its own vehicle, with its own engine, carrying its own cargo, optimised for its own job. None of them does everything. All of them do something specific better than a general-purpose tool would.
But here’s where most businesses get stuck. A real logistics network isn’t just trucks. It’s trucks plus warehouses. The cargo can’t only live inside the vehicle that picked it up, because then nothing else in the business can see it. The Salesforce truck ends up with a shed full of customer data that the SAP truck can’t read. The HR truck has a shed full of people data that the finance truck can’t access. Each AI ends up optimising its own corner of the business with one eye closed.

The fix is a central warehouse. One place where the cargo lives, where every vehicle drops off what it’s carrying and picks up what it needs. In practice this is what platforms like Amazon SageMaker Lakehouse, Snowflake, Databricks, Microsoft Fabric, and Salesforce Data Cloud are trying to be: the shared depot underneath the operational fleet, where data is stored once, indexed properly, and made available to whichever AI needs to reason over it.
This is what unlocks the real value of AI across an organisation. An AI that can only see sales data makes sales recommendations. An AI that can see sales data, support history, payment patterns, and contract terms together makes business recommendations. The warehouse is what turns a collection of clever tools into a coherent intelligence.
The businesses winning with AI right now aren’t the ones who picked the “best” tool. They’re the ones who designed how data flows through their organisation. Which vehicle picks it up. Where it gets stored. Where it gets transformed. Where it ends up delivered.
The implication is that “choosing an AI” is the wrong frame. The right frame is designing your AI logistics network: which vehicles handle which legs of the journey, where the warehouse sits, and how the cargo moves between them.
Even with the right vehicle, the right cargo, and a well-stocked warehouse, none of it moves if the roads are broken.
The roads are the connective tissue of your business. The APIs that let one system talk to another. The integrations that let your CRM update your finance system. The workflow tools that pass cargo from one vehicle to the next without a human in the middle. The business processes that decide who picks up what, when, and where it goes.
Most businesses underestimate how much of their AI success depends on this layer. The vehicles get the attention. The roads quietly determine whether anything actually gets delivered. A great AI tool plugged into a business with poor integrations is like a Ferrari in a paddock. Beautiful, fast, capable, but going nowhere useful.
Three things tend to separate good AI roads from bad ones.
The first is integration. Can your AI tools actually talk to the systems where your data lives? If your sales AI can’t read your support tickets, your finance AI can’t see your contracts, and your HR AI can’t access your performance data, then each one is operating without the full picture. The cargo is stuck in the depot.
The second is automation. Once two systems can talk, can the cargo move between them without a human carrying it across? Modern automation platforms (Zapier, Make, Microsoft Power Automate, n8n) and agentic frameworks are increasingly the road network that connects AI vehicles to each other and to the rest of the business.
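Stripped of the platform branding, automation is just cargo moving between systems without a human carrying it. The sketch below uses plain dictionaries as stand-ins for the two systems; in practice these would be API calls wired up by a platform like Zapier, n8n, or Power Automate.

```python
# Sketch of a "road" between two systems: a new support ticket is
# transformed and delivered to the CRM automatically. The systems here
# are stand-in dictionaries, not real product APIs.

support_system = {"ticket-42": {"customer": "Acme", "issue": "billing error"}}
crm_system = {}  # the destination for customer activity

def sync_ticket(ticket_id: str) -> None:
    ticket = support_system[ticket_id]  # pick up the cargo
    crm_system[ticket["customer"]] = {  # transform and deliver it
        "last_issue": ticket["issue"],
        "source": "support",
    }

sync_ticket("ticket-42")
print(crm_system["Acme"]["last_issue"])  # billing error
```

The logic is trivial; the value is that it runs on every ticket, every time, with no human in the middle.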
The third is process. Even with great integrations and automation, the business still needs to know who is responsible for what. Which decisions does the AI make on its own? Which ones get escalated? When does a human pick up the cargo from the AI? These aren’t technical questions. They’re operational ones, and they’re the difference between AI that helps the business and AI that creates new bottlenecks.
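Those escalation decisions can usually be written down as a rule, which is itself a useful exercise. The thresholds below are purely illustrative, and the right ones are a business decision, not a technical one.

```python
# Sketch of a human-in-the-loop rule: the AI acts alone only when the
# stakes are low and its confidence is high. Thresholds are illustrative.

def route_decision(value_at_stake: float, confidence: float) -> str:
    if value_at_stake < 1_000 and confidence >= 0.9:
        return "ai-decides"        # low stakes, high confidence
    return "escalate-to-human"     # a person picks up the cargo

print(route_decision(200, 0.95))     # ai-decides
print(route_decision(50_000, 0.99))  # escalate-to-human
```

If a team can’t fill in those two thresholds, that’s a sign the process question hasn’t been answered yet, regardless of how good the tooling is.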
The businesses that get this right treat their roads as seriously as their vehicles. They invest in integration. They build automation. They redesign processes around what AI can now do, rather than bolting AI onto processes designed for humans.
The lesson is that you can’t just buy a faster car and expect to get there sooner. You also have to fix the roads.
So far, we’ve worked through the mechanics of running AI as a business capability. The engines that power it. The vehicles that put it in your hands. The fuel that keeps it moving. The cargo it carries. The warehouse where data lives. The roads that connect it all together.
If you take only one thing from this part of the series, it’s that the businesses winning with AI aren’t the ones with the most powerful tool. They’re the ones who designed how all of these layers work together.
But there’s still one element we haven’t talked about, and it’s the most important of all: the human behind the wheel. Even the most expensive fleet on the smoothest roads is wasted if the drivers don’t know what they’re doing.
In Part 3, we’ll look at the driver. We’ll also look at what happens when the cars start driving themselves, because agentic AI is changing the rules of the road faster than most businesses can keep up. And we’ll close with the road rules: the governance, ethics, and cultural questions that decide whether your fleet helps your business or quietly puts it at risk.
That’s where the strategic conversation actually lives.
Stay tuned.
If you need help with your AI strategy, implementing an AI Agent, or your AI governance policy, contact our chief for a brief chat.


