Marketing Technology Manager

Post-Merger Global Marketing Automation and Web Consolidation

Standardizing AEM + Pardot + Salesforce across 8 regions

Company: Enterprise healthcare diagnostics company, post-acquisition. Eight regions operating independently on separate web properties, automation platforms, and data practices. Platforms: Adobe Experience Manager (AEM), Salesforce Pardot, Salesforce CRM, Azure DevOps.
Team: Cross-functional matrix of regional marketers across NA, EMEA, LATAM, APAC, Japan, and China, plus a Salesforce admin partner, AEM developers, and external QA resources. Twelve months end-to-end.
Problem: Post-merger fragmentation. No shared CMS workflow, no consistent consent model, no unified lead scoring, no standard QA process. Manual everything. Sales didn't trust the leads. Compliance exposure across the EU and APAC. Campaign launches took weeks.
What I did: Designed and led the global consolidation of marketing automation and web infrastructure across 8 regions: consent architecture, template standardization, lead scoring redesign, AEM-to-Pardot integration, QA governance, and wave-based regional rollout.
My role: I owned the architecture decisions, led the cross-regional rollout, designed the lead scoring model, built the integration logic, defined QA governance, standardized the consent and preference center approach, and aligned stakeholders on a single global model when regions pushed for exceptions.

Fragmentation was the baseline state. Not by design, just by accumulation. Two legacy organizations had been merged, but nobody had merged the marketing infrastructure. Every region had built its own workarounds — different templates, different consent implementations, different scoring models, different everything. The technical debt was real, but the organizational debt was worse: eight regional teams, each convinced their local approach was the right one.

  • Multiple web properties with no shared CMS workflow
  • Region-specific consent language and opt-in implementations with no consistency
  • Three or more scoring variations across regions, no shared MQL definition
  • Manual CSV uploads, manual campaign enrollment, manual everything
  • QA was informal. Release risk was higher than it should have been.
  • Reporting was hard to trust because the underlying data was not clean.

The consequences were concrete: sales did not trust the leads, compliance exposure was real across the EU and APAC, and campaign launches took weeks because every region was maintaining its own workarounds.

Getting eight regional teams to adopt a single global model was the hardest part of this project. Every region had a legitimate reason their market was different. The approach was not to dismiss those differences but to build a system flexible enough to handle regional compliance requirements within a shared architecture — and then prove it worked in NA before asking anyone else to adopt it.

01 Define one global consent model. Single opt-in field, centralized suppression logic, preference center with dynamic language to handle regional compliance requirements including GDPR without rebuilding for every market.
Global preference center with dynamic language rendering. One opt-in field replacing multiple fragmented implementations across all regions.
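To make the single-field model concrete, the suppression decision can be sketched as one function driven by per-region rules. Everything below is hypothetical (field names, region codes, and rules are illustrative); the production consent model lived in Pardot and Salesforce fields, not in application code.

```python
# Illustrative sketch: one global opt-in field plus centralized
# suppression logic. Region-specific compliance lives in data,
# not in duplicated per-region code. All names are hypothetical.

REGION_RULES = {
    "EU":   {"requires_explicit_opt_in": True},   # GDPR markets
    "APAC": {"requires_explicit_opt_in": True},
    "NA":   {"requires_explicit_opt_in": False},  # opt-out model permitted
}

# Unknown regions fall back to the strictest interpretation.
DEFAULT_RULES = {"requires_explicit_opt_in": True}

def is_mailable(contact):
    """Single suppression decision applied identically in every region."""
    if contact.get("global_opt_out"):
        return False
    rules = REGION_RULES.get(contact.get("region"), DEFAULT_RULES)
    if rules["requires_explicit_opt_in"]:
        return contact.get("global_opt_in") is True
    return True
```

The point of the design is that every send path calls the same gate, so a new market means a new row in the rules table rather than another fork of the logic.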
02 Consolidate templates. Went from 18+ regional email templates to 6 global masters. Standardized components. Eliminated the variance that was inflating launch timelines.
03 Rebuild scoring. Partnered with the Salesforce admin to redesign lead scoring from scratch. Aligned on a single MQL definition, calibrated thresholds by intent signal, and got sales involved early so they would actually trust what came through.
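A threshold model with one shared MQL definition can be sketched as follows. The signal names, weights, and threshold here are invented for illustration; the real calibration was done with the Salesforce admin and sales input, as described above.

```python
# Hypothetical sketch of a weighted, threshold-based scoring model
# with a single global MQL definition. Weights and threshold are
# illustrative, not the calibrated production values.

SIGNAL_WEIGHTS = {
    "demo_request": 40,        # high-intent signal
    "pricing_page_view": 15,
    "webinar_attended": 10,
    "whitepaper_download": 5,
    "email_click": 3,          # low-intent signal
}

MQL_THRESHOLD = 50  # one definition shared by all regions

def score(activities):
    """Sum the weights of a contact's recorded activities."""
    return sum(SIGNAL_WEIGHTS.get(a, 0) for a in activities)

def is_mql(activities):
    """Single MQL test, so marketing and sales argue about the
    threshold once, globally, instead of per region."""
    return score(activities) >= MQL_THRESHOLD
```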
04 Re-architect the AEM to Pardot integration. Automated form submission to campaign enrollment. Eliminated manual list handling. Built enrollment logic that was testable and auditable.
Engagement Studio automation map: A/B test segmentation, reminder logic, attended/missed branching, and post-event follow-up sequences.
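The "testable and auditable" enrollment logic can be sketched as an explicit form-to-campaign mapping with an audit trail. The real integration ran through AEM forms and Pardot; the form IDs, campaign names, and function below are hypothetical.

```python
# Illustrative sketch: form submissions map to campaign enrollment
# through a declared table, and every decision (including failures)
# is recorded. Names are hypothetical, not the production config.

FORM_TO_CAMPAIGN = {
    "webinar-signup": "engagement-studio-webinar",
    "contact-sales": "sales-fast-track",
}

def enroll(submission, audit_log):
    """Return the campaign a submission enrolls into, or None.

    Unmapped forms are logged rather than silently dropped, which is
    what makes the pipeline auditable after a launch.
    """
    campaign = FORM_TO_CAMPAIGN.get(submission["form_id"])
    if campaign is None:
        audit_log.append(("unmapped_form", submission["form_id"]))
        return None
    audit_log.append(("enrolled", submission["email"], campaign))
    return campaign
```

Because the mapping is data, a test suite can assert every form has a destination before go-live, replacing the manual list handling the old process relied on.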
05 Stand up QA governance. Defined entry criteria, test coverage expectations, and go-live validation checkpoints. Brought in external QA resources and built an operating model around structured releases.
Azure DevOps release tracking across regional rollout waves. Work items tracked through New, Approved, Doing, and Done states.
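One way to picture the entry criteria is as data rather than tribal knowledge, so every wave is validated the same way. The criteria names below are invented for illustration; the actual checklist lived in the release governance docs and Azure DevOps.

```python
# Hypothetical sketch: go-live entry criteria as a shared list,
# checked identically for every regional wave. Criteria names
# are illustrative.

ENTRY_CRITERIA = [
    "consent_fields_mapped",
    "templates_rendered_in_locale",
    "scoring_rules_deployed",
    "uat_signed_off",
]

def go_live_blockers(wave_status):
    """Return unmet criteria; an empty list means the wave may launch."""
    return [c for c in ENTRY_CRITERIA if not wave_status.get(c)]
```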
06 Wave-based rollout. Started with NA as the pilot. Documented what worked. Rolled to EMEA, LATAM, APAC, Japan, and China in sequenced waves with go-live checklists and cross-time-zone handoffs built in.
Brazil regional site (Portuguese): consistent AEM framework applied across markets. Same nav structure, local language, local compliance, shared system.

The rollout followed a deliberate wave structure. NA went first as the pilot — not because it was easiest, but because it was the most controlled environment to validate the architecture before asking other regions to adopt it.

Each subsequent wave — EMEA, LATAM, APAC, Japan, China — had a documented go-live checklist, cross-time-zone handoff protocols, and entry criteria that had to be met before launch. Regional teams were involved in UAT for their wave, which built ownership and surfaced locale-specific issues early.

This was a twelve-month program with regular late-night calls across time zones. The coordination overhead was real. What made it work was that every wave benefited from the documented learnings of the previous one — the playbook got better with each region.

62% Campaign deployment time reduction
17% Increase in MQL to SQL conversion
96% CRM-marketing field alignment (from 72%)
Campaign deployment time: reduced 62% (3–4 weeks → 4–7 days)
MQL-to-SQL conversion: increased 17%
CRM–marketing field alignment: 72% → 96%
Duplicate lead records: reduced 38%
Post-launch defects: reduced 33%
Rollback incidents: reduced 40%
On-time regional launches: 95%
Regions on unified architecture: 8

Email engagement improved as a downstream effect of better targeting, cleaner consent data, and template consistency. Not a creative story — a data and architecture story.

Email open rate: 19.5% → 23.4%
Click-through rate (CTR): 2.6% → 4.5%
Unsubscribe rate: 1.9% → 0.5%
List growth: -5.2% → +7.1%

Consent and suppression logic was standardized across all 8 regions. Lead scoring had a single shared MQL definition calibrated with sales input. Campaign launches that previously required weeks of regional coordination could be executed in under a week. Sales started trusting the leads because the data was clean and the routing was reliable.

The system is stable and adopted, but it's not done. The next maturity curve is self-service: giving regional marketers a reporting layer so they can validate their own data without routing through central ops. Progressive profiling in the preference center would improve list quality over time without adding form friction. Lead scoring needs a formal quarterly recalibration cycle tied to actual sales outcomes — set-it-and-forget-it doesn't work. And QA governance should become a shared playbook that regional teams can own, reducing the dependency on central coordination for standard launches.

This project shows what it takes to consolidate marketing infrastructure at enterprise scale after an acquisition — not just technically, but organizationally. I designed a global architecture that was flexible enough to handle eight regions' compliance and market requirements within a single system. I built the operating model for QA, release governance, and cross-regional rollout. I got eight teams who were comfortable with their local workarounds to adopt a unified platform — not by mandating it, but by proving it worked in the pilot and documenting everything well enough that adoption was the obvious choice. The result was a system where campaign execution, data quality, consent management, and lead routing all work together — and where regional teams can operate within it independently.