Your offline conversion imports are failing silently and you probably don't know it.
The standard setup looks fine in the interface. The upload history shows green. Conversions appear in reporting. Everything seems connected.
Dig into the data and you'll find that 20-40% of your actual closed deals are not being attributed. The GCLIDs (Google click IDs) expired. The upload format had field issues. The timestamps were wrong. The data was rejected at the API level without any visible error in the Google Ads UI.
Smart Bidding is optimizing on an incomplete dataset. It performs only as well as the data it receives, and that data is degraded.
How GCLIDs actually work in practice
When a user clicks your ad, Google appends a GCLID to the URL. That GCLID needs to be captured by your form, stored in your CRM with the lead record, and then uploaded back to Google when that lead closes.
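To make the first leg concrete: a minimal sketch of pulling the parameter off a landing URL. In production this capture normally happens in client-side JavaScript that writes the value into a hidden form field; the Python below just shows the parsing, with a made-up GCLID value.

```python
from urllib.parse import urlparse, parse_qs

def extract_gclid(landing_url: str) -> str | None:
    """Return the gclid query parameter from a landing URL, if present."""
    params = parse_qs(urlparse(landing_url).query)
    # parse_qs returns lists of values; take the first one if it exists
    return params.get("gclid", [None])[0]

# The GCLID is an opaque token Google generates per click (value made up here)
url = "https://example.com/demo-request?utm_source=google&gclid=Cj0KCQjw_example"
print(extract_gclid(url))  # -> Cj0KCQjw_example
```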
This works cleanly in theory. In production B2B:
Step 1 failure: The GCLID parameter gets stripped by corporate proxies, CDNs, or browser extensions before it reaches the form. Up to 15% of GCLIDs never make it into the form field.
Step 2 failure: The GCLID makes it into the form but gets stored incorrectly in the CRM. Field mapping issues, special character encoding, truncation on import. Another 5-10% data loss.
Step 3 failure: The GCLID expires before the deal closes. A GCLID can typically only be uploaded within 90 days of the click, Google's standard window for click conversions. If your sales cycle is 180 days, 30-50% of your closed deals have expired GCLIDs that can't be uploaded.
Step 4 failure: The offline conversion upload itself has data quality issues. Wrong date format, missing required fields, conversion action name mismatch. Google rejects the record silently.
After all four steps, you might be successfully attributing 50-60% of your actual closed pipeline. The rest is invisible to the algorithm.
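The compounding is easy to underestimate. Here's the back-of-envelope with illustrative mid-range survival rates taken from the loss figures above; your actual rates will differ:

```python
# Survival rate per step, using mid-range values from the failure modes above
survival = {
    "captured_in_form":  0.90,  # up to 15% stripped before reaching the form
    "stored_in_crm":     0.93,  # 5-10% lost to mapping/encoding/truncation
    "gclid_still_valid": 0.65,  # 30-50% expired on long sales cycles
    "upload_accepted":   0.95,  # silent rejections at the API level
}

attributed = 1.0
for step, rate in survival.items():
    attributed *= rate
    print(f"after {step}: {attributed:.0%}")
# Lands around 52%, squarely inside the 50-60% range above
```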
The solution architecture
Manual uploads from a spreadsheet are not sufficient for anything above $20K/month in spend. The data quality issues require automated cleaning and validation before each upload.
The architecture that works (code sketches for the key steps follow the list):
Capture GCLID + UTM data in hidden form fields. Both. UTM as fallback when GCLID fails.
Store in a staging table in BigQuery, not just in Salesforce. Salesforce can corrupt or overwrite data. BigQuery maintains a clean historical record.
Daily automated job that queries Salesforce for new closed-won opportunities (or whatever stage you're tracking). Pulls opportunity amount, close date, associated contact email, company domain.
Matching logic in BigQuery: match Salesforce records to GCLIDs by contact email → company domain, in that priority order, with the click required to fall within the 180 days before close.
Validation step: check that each GCLID is still within Google's upload window, that amounts are in the correct format, that conversion action names match exactly.
Upload to the Google Ads offline conversions API. Log the results. Alert on rejections.
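A sketch of the daily Salesforce pull using the simple-salesforce library. The object and field names (Opportunity, Amount, CloseDate, Account.Website, the 'Closed Won' stage) follow Salesforce defaults and will vary by org; the contact email usually hangs off OpportunityContactRole and is omitted here for brevity.

```python
from simple_salesforce import Salesforce

# Credentials belong in a secret manager; placeholders here
sf = Salesforce(username="user@example.com", password="...",
                security_token="...")

# Yesterday's closed-won opportunities, with the fields the matching step
# needs: amount, close date, and the account's website for the domain fallback
soql = """
    SELECT Id, Amount, CloseDate, Account.Website
    FROM Opportunity
    WHERE StageName = 'Closed Won' AND CloseDate = YESTERDAY
"""
for opp in sf.query_all(soql)["records"]:
    print(opp["Id"], opp["Amount"], opp["CloseDate"],
          opp["Account"]["Website"] if opp["Account"] else None)
```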
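The matching step as one BigQuery query, run from the daily job with the google-cloud-bigquery client. Table and column names (click_log, sf_opps, and so on) are hypothetical; the part that matters is the priority order: email match first, company-domain match as fallback, both constrained to the 180 days before close.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical staging tables: click_log(gclid, email, email_domain, click_ts)
# and sf_opps(opp_id, contact_email, company_domain, amount, close_ts).
# One best click per opportunity: email beats domain, and ties within a
# tier go to the most recent click.
sql = """
WITH candidates AS (
  SELECT
    o.opp_id, o.amount, o.close_ts, c.gclid, c.click_ts,
    CASE
      WHEN c.email = o.contact_email THEN 1         -- strongest signal
      WHEN c.email_domain = o.company_domain THEN 2 -- fallback
    END AS priority
  FROM `project.dataset.sf_opps` AS o
  JOIN `project.dataset.click_log` AS c
    ON c.email = o.contact_email
    OR c.email_domain = o.company_domain
  WHERE c.click_ts BETWEEN TIMESTAMP_SUB(o.close_ts, INTERVAL 180 DAY)
                       AND o.close_ts
)
SELECT * EXCEPT (rn)
FROM (
  SELECT *, ROW_NUMBER() OVER (
    PARTITION BY opp_id ORDER BY priority, click_ts DESC
  ) AS rn
  FROM candidates
)
WHERE rn = 1
"""
matches = list(client.query(sql).result())
```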
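The validation pass, sketched. The 90-day window and the 'yyyy-mm-dd hh:mm:ss+hh:mm' timestamp format reflect Google's click-conversion upload requirements; the record shape and the action-name set are assumptions.

```python
from datetime import datetime, timedelta, timezone

VALID_ACTIONS = {"Closed Won - Offline"}  # must match Google Ads exactly
GCLID_WINDOW = timedelta(days=90)         # Google's click-conversion window

def validate(record: dict) -> list[str]:
    """Return a list of problems; empty means safe to upload."""
    problems = []
    if datetime.now(timezone.utc) - record["click_ts"] > GCLID_WINDOW:
        problems.append(f"gclid expired ({record['click_ts']:%Y-%m-%d})")
    if record["conversion_action"] not in VALID_ACTIONS:
        problems.append(f"unknown action {record['conversion_action']!r}")
    try:
        if float(record["amount"]) <= 0:
            problems.append(f"non-positive amount {record['amount']!r}")
    except (TypeError, ValueError):
        problems.append(f"bad amount {record['amount']!r}")
    try:
        # Google expects 'yyyy-mm-dd hh:mm:ss+hh:mm' for conversion time
        datetime.strptime(record["conversion_time"], "%Y-%m-%d %H:%M:%S%z")
    except (TypeError, ValueError):
        problems.append(f"bad timestamp {record['conversion_time']!r}")
    return problems
```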
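And the upload itself, via the official google-ads Python client with partial failure enabled, which is what turns silent rejections into visible, loggable ones. Customer ID, action IDs, and the config path are placeholders; treat this as a sketch, not a drop-in.

```python
import logging
from google.ads.googleads.client import GoogleAdsClient

logging.basicConfig(level=logging.INFO)

client = GoogleAdsClient.load_from_storage("google-ads.yaml")  # placeholder path
CUSTOMER_ID = "1234567890"  # placeholder

def upload_conversions(records: list[dict]) -> None:
    service = client.get_service("ConversionUploadService")
    request = client.get_type("UploadClickConversionsRequest")
    request.customer_id = CUSTOMER_ID
    request.partial_failure = True  # return per-row errors, don't fail the batch
    for r in records:
        conv = client.get_type("ClickConversion")
        conv.gclid = r["gclid"]
        conv.conversion_action = (
            f"customers/{CUSTOMER_ID}/conversionActions/{r['action_id']}"
        )
        conv.conversion_date_time = r["conversion_time"]  # 'yyyy-mm-dd hh:mm:ss+hh:mm'
        conv.conversion_value = float(r["amount"])
        conv.currency_code = "USD"
        request.conversions.append(conv)

    response = service.upload_click_conversions(request=request)
    # This is where the "silent" failures become visible: log and alert
    if response.partial_failure_error.code != 0:
        logging.error("rejected rows: %s", response.partial_failure_error.message)
    accepted = sum(1 for res in response.results if res.gclid)
    logging.info("accepted %d of %d conversions", accepted, len(records))
```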
With this setup, you recover 15-20% of previously lost attribution just from the domain-matching fallback on expired GCLIDs. Your Smart Bidding model gets better data. Performance improves over 60-90 days.
What this costs
About $200-300/month in BigQuery compute for typical B2B account volumes. Development time to build: 3-4 weeks.
The ROI calculation: if you're recovering attribution on 15% more closed deals, your average deal is $40K, and better bidding closes 2-3 more deals per quarter, that's $320K-480K a year in recovered revenue against roughly $2,400-3,600 a year in compute plus a one-time build. The math is obvious.
Set it up once. Run it forever. Check the logs monthly for new failure modes.
The baseline of "spreadsheet upload once a month" isn't attribution. It's a simulation of attribution that leaves real performance on the table.
Alex Langton
Senior B2B paid media manager · ~$650K/mo industrial spend
12+ years running B2B Google Ads accounts in industrial, manufacturing, and B2B e-commerce. Builds Langton Tools because generic PPC SaaS was never designed for the multi-MCC, complex-pacing, B2B-vocabulary reality of the accounts that actually drive industrial revenue.