Here's what I see happen in accounts that rely on Smart Bidding.
The first thing they do is lower the conversion threshold. The algorithm needs roughly 30 conversions a month to function. Most B2B accounts don't hit that with real conversions. So they compromise. They add whitepaper downloads. They add gated content views. They add demo request form views. Anything to feed the machine.
Suddenly they hit 30 conversions a month. The algorithm feels fed. It starts optimizing. And the reporting gets poisoned.
Because now your primary conversion metric isn't "real qualified opportunity" anymore. It's "things that trigger pixels, weighted toward cheap early-stage garbage." The algorithm optimizes against that. Spend goes up. The cheap stuff gets cheaper. Your reporting dashboard tells you the campaign is thriving.
Your sales team tells you it's a dumpster fire.
The data corruption cycle
Let me trace through how this actually works, because it's a specific failure mode that most people don't understand.
Smart Bidding algorithms need signal. They need to see conversions so they can learn. The more conversions, the better they learn. Most B2B accounts can't generate 30 pure, revenue-qualified conversions a month. So they make a choice: either disable the algorithm or pollute the signal.
They poison the signal.
They add a micro-conversion. An eBook download. This gets tracked as a conversion, same as an actual sales-qualified lead. The algorithm doesn't know the difference. It just sees that eBook downloads convert and cost $8 apiece. Meanwhile, real SQLs are costing $500 because they're rare.
The algorithm, which is trying to hit a target CPA of $100, learns really fast. It can make eBook downloads happen all day at $8. It does. The account hits the target CPA. The dashboard lights up green. And you're spending $50K a month on garbage leads.
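The arithmetic is worth making concrete. Here's a toy sketch using the numbers from the example above ($8 eBook downloads, $500 SQLs, $100 target CPA); the before/after mix shift is illustrative, not real account data:

```python
# Toy illustration of blended-CPA distortion when cheap micro-conversions
# and rare SQLs share one conversion column. Dollar figures mirror the
# example in the text; the conversion-mix shift is hypothetical.

def blended_cpa(ebooks: int, sqls: int) -> float:
    """Total spend divided by total counted conversions."""
    spend = ebooks * 8 + sqls * 500
    return spend / (ebooks + sqls)

# Before: a conversion mix that still contains real SQLs.
before = blended_cpa(ebooks=20, sqls=10)   # (20*8 + 10*500) / 30, ~$172

# After: the algorithm shifts budget toward what's cheap to produce.
after = blended_cpa(ebooks=95, sqls=2)     # (95*8 + 2*500) / 97, ~$18

print(f"before: ${before:.0f} blended CPA on 10 SQLs")
print(f"after:  ${after:.0f} blended CPA on 2 SQLs")
# Blended CPA sails under the $100 target while SQL volume collapses.
# The dashboard goes green precisely as the pipeline dries up.
```

The point of the sketch: nothing in the blended number tells you whether you bought pipeline or pixels.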
This isn't a flaw in the algorithm. The algorithm is working perfectly. It's optimizing exactly as designed. The problem is that the signal you're feeding it is corrupted.
And once you corrupt the signal, it's almost impossible to uncorrupt it. The algorithm has learned that cheap, fast conversions are good. Even when you eventually fix the data, that bias remains. The machine has internalized "cheap junk good, expensive deals bad."
What this does to your actual metrics
Your reporting metrics become completely disconnected from business reality.
You're sitting in a board meeting with a dashboard that says:
- CPA down 30%
- Conversion rate up 25%
- Conversion volume up 20%
- ROI trending up
Your sales team is simultaneously telling you:
- Lead quality is terrible
- Close rate is down 40%
- Sales cycle is longer
- Team morale is in the gutter
Both things are true. The dashboard is measuring conversions. The business is measuring revenue. They're completely misaligned.
And because the algorithm is optimizing against the dashboard metrics, not the business metrics, it's getting worse every day. It's learning that the strategy is working when the strategy is actually destroying value.
The worst part is that this all happened because you were trying to keep the machine fed. You made a compromise that seemed reasonable (add more conversion types to hit the algorithmic minimum) and it cascaded into complete misalignment.
The counter: "Just use better conversion tracking"
The argument is that if you track the right conversions, Smart Bidding works great.
It's true. If you track only SQLs, and only SQLs show up in the conversion column, Smart Bidding will optimize for SQLs. The problem is that most B2B accounts don't have the infrastructure to do this cleanly.
Real SQL conversion tracking requires a custom pipeline. Salesforce stage data flowing back to Google Ads via API. Delayed lookback windows. Automated stage mapping. It's not a checkbox. It's a project that takes 4-6 weeks to set up properly.
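The core of that pipeline is the stage-mapping step: only CRM stages you actually value get turned into upload rows. Here's a hypothetical sketch of that filter; the field names, stage map, and record shape are assumptions, not a real Salesforce schema, and the actual Google Ads upload happens downstream of this:

```python
# Hypothetical stage-mapping step in an offline conversion pipeline.
# CRM opportunity stages map to named conversion actions, and anything
# earlier than SQL is deliberately dropped so it never feeds the bidder.

# Only these stages feed the algorithm; pre-SQL noise is ignored.
STAGE_TO_CONVERSION_ACTION = {
    "SQL": "sales_qualified_lead",
    "Closed Won": "closed_won",
}

def to_upload_rows(opportunities: list[dict]) -> list[dict]:
    """Turn CRM records into offline-conversion rows keyed by click ID."""
    rows = []
    for opp in opportunities:
        action = STAGE_TO_CONVERSION_ACTION.get(opp["stage"])
        if action is None or not opp.get("gclid"):
            continue  # early-stage noise, or no ad click to attribute
        rows.append({
            "gclid": opp["gclid"],              # captured at form submit
            "conversion_action": action,
            "conversion_time": opp["stage_changed_at"],
            "value": opp.get("amount", 0.0),
        })
    return rows

opps = [
    {"stage": "MQL", "gclid": "abc", "stage_changed_at": "2024-05-01 10:00"},
    {"stage": "SQL", "gclid": "def", "stage_changed_at": "2024-05-09 14:30",
     "amount": 40000.0},
]
print(to_upload_rows(opps))  # only the SQL row survives the filter
```

The 4-6 weeks goes into everything around this function: capturing click IDs reliably, handling the delayed lookback, and keeping the stage map in sync with how sales actually works.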
Most accounts don't do this. So they compromise. They track SQLs, but they also track MQLs, and they also track whitepaper downloads because "why not, it's on the pixel already." Now the conversion column is a mishmash. The algorithm is confused. You're back in the poison situation.
The only way out is to build the real infrastructure. Which most agencies won't do because it costs them time and money. So they don't. They enable Smart Bidding on polluted data and hope for the best.
What actually works
Stop using Smart Bidding on low-volume B2B accounts.
Go back to manual CPC bidding or use rule-based automation. Write rules like "if conversion rate on this keyword is above X, bid up 10%." Simple, transparent, auditable. It doesn't scale like an algorithm. But it scales enough for most B2B.
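A minimal version of the rule quoted above might look like this. The thresholds, the minimum-click guard, and the data shape are illustrative assumptions, not a recommended config:

```python
# Sketch of a transparent, auditable bid rule: "if conversion rate on
# this keyword is above X, bid up 10%." A symmetric bid-down branch and
# a minimum-clicks guard are added; all numbers are assumptions.

def adjust_bid(current_bid: float, clicks: int, conversions: int,
               cvr_threshold: float = 0.05, min_clicks: int = 100) -> float:
    """Return a new max CPC for one keyword under a simple rule."""
    if clicks < min_clicks:
        return current_bid                    # too little data; don't touch it
    cvr = conversions / clicks
    if cvr > cvr_threshold:
        return round(current_bid * 1.10, 2)   # bid up 10%
    if cvr < cvr_threshold / 2:
        return round(current_bid * 0.90, 2)   # bid down 10%
    return current_bid

print(adjust_bid(2.00, clicks=250, conversions=20))  # 8% CVR, bids up to 2.2
print(adjust_bid(2.00, clicks=40, conversions=5))    # too few clicks, stays 2.0
```

Every decision this rule makes can be explained to a client in one sentence. That's the trade you're making against the algorithm: less sophistication, total auditability.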
Or, if you're committed to algorithmic bidding, don't compromise on conversion tracking. Wire up the real infrastructure. Get Salesforce stage data flowing back. Make sure your conversion column is clean. Only then enable Smart Bidding.
But don't do the middle thing. Don't enable an algorithm that needs signal, realize you don't have enough signal, and then feed it junk to make it work. That's how you end up with a dashboard that looks great and a business that's falling apart.
The algorithm will optimize exactly against what you tell it to optimize against. If you feed it noise, it will get really good at noise. And your business will suffer.
Keep your signal clean. Keep your conversion tracking honest. And if that means using less fancy bidding logic, so be it. A transparent, simple bidding rule that works beats a sophisticated algorithm optimizing against garbage data every single time.
Alex Langton
Senior B2B paid media manager · ~$650K/mo industrial spend
12+ years running B2B Google Ads accounts in industrial, manufacturing, and B2B e-commerce. Builds Langton Tools because generic PPC SaaS was never designed for the multi-MCC, complex-pacing, B2B-vocabulary reality of the accounts that actually drive industrial revenue.