The quiet AI stack: how a 12-person team replaced four tools with one workflow.
A small ops team called us last February with a very specific kind of tired. They had a Zapier account with 38 active Zaps, a Make scenario nobody remembered building, a GPT wrapper one of the founders had spun up in a weekend, and a spreadsheet that quietly tied them all together. Every morning the ops lead opened four tabs before coffee to check whether anything had broken overnight. Something almost always had.
They did not need more AI. They needed less.
What the stack looked like before
The team ran a B2B service business — twelve people, about six hundred customers, mid four-figure MRR. The core loop was simple on paper: a customer signs up, paperwork gets generated, a human checks it, money moves, the customer gets onboarded. The automation layer had grown in rings around that loop, one quick fix at a time.
- Zapier handled the HubSpot-to-Stripe glue and a handful of Slack notifications.
- Make ran a lead-enrichment flow nobody had touched in ten months.
- A custom GPT-powered script summarised inbound support emails and dropped drafts into a shared inbox.
- A Google Sheet acted as the source of truth because nobody trusted the CRM anymore.
On a good day it worked. On a bad day — which averaged twice a week — the ops lead spent an hour diffing the sheet against HubSpot, restarting a Zap, and copy-pasting into Slack.
Why the quick fixes kept failing
Every automation the team had built was locally correct. Each Zap did exactly what it said. The problem was that none of them knew about each other. If the GPT script summarised a support email and suggested a refund, Zapier had no idea that a conversation had happened. If a deal closed in HubSpot, Make didn't update the enrichment record. The connective tissue was the ops lead, quietly doing the work the automations were supposed to do.
"I'm not the bottleneck. I'm the glue. And I'd like to not be the glue on Sunday nights."
That sentence, more or less, is how every engagement we take on starts. The team isn't short on tools. They're short on one system that remembers what just happened.
What we actually built
We did not replace HubSpot, Stripe, or Slack. We replaced the automation layer. One small Node service, deployed on their own infrastructure, sat in the middle and subscribed to the four events that actually mattered to the business: lead created, deal won, subscription changed, ticket opened. Everything else was derivative.
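The shape of that service is simple enough to sketch. The event names below match the four the post lists; everything else (type names, the router class, the handler signatures) is illustrative, not the client's actual schema:

```typescript
// A minimal sketch of the central service's event router.
// Only the four event kinds are from the engagement; the rest is assumed.
type BusinessEvent =
  | { kind: "lead.created"; leadId: string }
  | { kind: "deal.won"; dealId: string }
  | { kind: "subscription.changed"; customerId: string }
  | { kind: "ticket.opened"; ticketId: string };

type Handler = (event: BusinessEvent) => void;

class EventRouter {
  private handlers = new Map<string, Handler[]>();

  on(kind: BusinessEvent["kind"], handler: Handler): void {
    const list = this.handlers.get(kind) ?? [];
    list.push(handler);
    this.handlers.set(kind, list);
  }

  // Every inbound webhook lands here. Returning the handler count lets you
  // alert on events that nothing consumed, instead of dropping them silently.
  dispatch(event: BusinessEvent): number {
    const list = this.handlers.get(event.kind) ?? [];
    list.forEach((h) => h(event));
    return list.length;
  }
}

const router = new EventRouter();
router.on("deal.won", (e) => console.log("onboarding kicked off:", e));
router.dispatch({ kind: "deal.won", dealId: "d_123" });
```

Because every downstream action hangs off one of these four events, "the system that remembers what just happened" is just this router plus a log of what it dispatched.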
Inside that service, we ran an AI step only where a human had previously been skimming text and making a judgement call. Support email → structured summary and proposed action. Contract PDF → extracted renewal terms. Inbound web form → intent classification plus a draft reply. Everything ran on top of a permissioned view of the CRM, so the model never saw fields it didn't need.
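The "permissioned view" is worth making concrete, because it is the cheapest privacy control in the whole build. A hedged sketch, with field names invented for illustration: the model never receives the raw CRM record, only an allow-listed projection of it.

```typescript
// Hypothetical CRM record shape — field names are illustrative.
interface CrmContact {
  email: string;
  name: string;
  plan: string;
  internalNotes: string; // never shown to the model
  paymentToken: string;  // never shown to the model
}

// The allow-list is the permission. Adding a field to a prompt means
// adding it here, which makes the decision visible in code review.
const MODEL_VISIBLE_FIELDS = ["email", "name", "plan"] as const;

function permissionedView(contact: CrmContact): Record<string, string> {
  const view: Record<string, string> = {};
  for (const f of MODEL_VISIBLE_FIELDS) view[f] = contact[f];
  return view;
}
```

The AI step then builds its prompt from `permissionedView(contact)` rather than `contact`, so a leaked prompt can only ever leak what the allow-list already granted.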
Two design rules carried the whole thing.
- One source of truth. HubSpot. Every other system reads from it or writes through the service. The Google Sheet was archived, not deleted — we kept it read-only as a grief-counselling tool for the first two weeks.
- Humans sign off on anything that costs money or tone. The AI drafts, proposes, extracts. A person clicks send. That one rule is the difference between a system that gets trusted and a system that gets turned off in week three.
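The second rule can be enforced structurally rather than by convention. A minimal sketch, assuming a proposal queue of our own invention: the AI step can only ever enqueue, and nothing executes without an explicit human approval recorded first.

```typescript
// Sketch of the "humans sign off on anything that costs money or tone" rule.
// Action names and the queue API are assumptions, not the client's code.
interface Proposal {
  id: string;
  action: "send_reply" | "issue_refund";
  payload: Record<string, string>;
  approved: boolean;
}

class ApprovalQueue {
  private proposals = new Map<string, Proposal>();

  // The AI step calls this and nothing else. Proposals start unapproved.
  propose(p: Omit<Proposal, "approved">): void {
    this.proposals.set(p.id, { ...p, approved: false });
  }

  // A person clicking "send" in the review UI lands here.
  approve(id: string): void {
    const p = this.proposals.get(id);
    if (p) p.approved = true;
  }

  // Execution refuses anything unapproved — the rule lives in code,
  // not in a process doc.
  execute(id: string, run: (p: Proposal) => void): boolean {
    const p = this.proposals.get(id);
    if (!p || !p.approved) return false;
    run(p);
    return true;
  }
}
```

Because `execute` is the only path to Stripe or the customer's inbox, "the AI drafts, a person clicks send" is a property of the system, not a habit the team has to maintain.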
What we cut
Of the 38 Zaps, 31 were deprecated. Six were folded into the new service. One — a simple "new customer joined" Slack ping — we kept in Zapier because it didn't need to know about anything else, and rewriting it for the sake of tidiness was the kind of over-engineering that makes clients quietly regret hiring you.
The Make scenario and the weekend GPT script are gone. So is the spreadsheet.
What actually changed for the team
Six weeks in, the ops lead sent a message we keep pinned: "I had a weekend." That's the outcome we optimise for. Not zapless purity, not a full model fine-tune, not a dashboard with seventeen KPIs. A quiet system that handles what it should handle, escalates what it shouldn't, and lets the person who used to be the glue go do higher-leverage work instead.
The wins that showed up on paper were smaller than the founders expected and more boring than the pitch decks would have you believe. Monthly close moved from three days to half a day. Inbound response time dropped from 14 hours to under 2. Nobody opened four tabs before coffee anymore. That, in our experience, is what a real AI workflow looks like when it's working: you stop noticing it.
Three things we'd do again, every time
- Start from the loop, not the tool. Draw the customer's journey through the business on one page before touching a single integration. Automations exist to serve the loop, not the other way round.
- Put AI behind the human, not in front of them. Draft, extract, classify, score. Let the person press the button. This is not about trust in the model — it's about trust from the customer on the other end.
- Own the automation layer. One small service in a repo you can read beats four no-code tools glued with a prayer. It also means you can hand it to another engineer and not apologise for the state of it.
If you're reading this and you recognise the four-tabs-before-coffee feeling, we'd be happy to look at your stack. A scoping call is free, it runs 30 minutes, and we'll tell you honestly if you need us or just need to turn three Zaps off.
— Written by the Bulsu Labs team · Talk to us