Built a churn risk model in Einstein Model Builder, trained on behavioral signals from Data Cloud, and operationalized through Marketing Cloud journeys with a retention playbook attached to each risk tier. The model doesn't just predict; it triggers action.
At Quanata, we had a churn problem we couldn't see.
Customers would leave, and we'd find out when they didn't renew. By then, it was too late. The decision to leave had been made weeks or months earlier—we just didn't know it.
We had data. Lots of it. App engagement, driving scores, support tickets, billing history, quote activity. But it lived in silos, and nobody had connected it to churn in a predictive way.
The question: could we identify customers likely to churn *before* they made the decision, early enough to intervene?
Most churn models I'd seen in enterprise settings shared a common failure mode: they predicted but didn't act.
Data science team builds a model. Model lives in a notebook. Score gets exported to a CSV. Marketing team gets the CSV a week later. By the time anyone acts, the moment has passed.
I needed a model that:

- scored continuously as behavior changed, not in monthly batches
- triggered action automatically, with no manual exports or handoffs
- supported tiered responses, so intervention intensity matched risk

And it had to work within our existing Salesforce + Marketing Cloud stack.
## Data Foundation (Data Cloud)
First, I had to unify the signals. Data Cloud became the foundation, pulling together:

- app engagement events
- driving scores
- support ticket history
- billing history
- quote activity
Each signal got normalized into a unified customer profile with a 12-month lookback.
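As a rough illustration (the actual unification is declarative Data Cloud configuration, not Python), the lookback logic amounts to something like this; table and field names are hypothetical:

```python
from datetime import date, timedelta
from collections import defaultdict

LOOKBACK_DAYS = 365  # 12-month lookback window

def build_profiles(events, as_of):
    """Collapse raw signal events into one profile per customer.

    events: iterable of (customer_id, signal, event_date, value) tuples.
    Only events inside the lookback window contribute to the profile.
    """
    cutoff = as_of - timedelta(days=LOOKBACK_DAYS)
    profiles = defaultdict(lambda: defaultdict(float))
    for customer_id, signal, event_date, value in events:
        if event_date >= cutoff:  # drop anything older than 12 months
            profiles[customer_id][signal] += value
    return {cid: dict(signals) for cid, signals in profiles.items()}
```

The key property is that every signal, regardless of source system, ends up keyed by the same customer identity over the same window.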
## Feature Engineering
The raw signals weren't enough. I engineered features that captured *change*:

- engagement trend: recent activity relative to each customer's own baseline
- support ticket velocity: rising ticket volume, not just ticket count
- shopping signals: quote activity suggesting the customer is comparison shopping
The insight: it's not the absolute values that predict churn, it's the *trajectory*. A customer with moderate engagement who's declining is higher risk than a customer with low engagement who's stable.
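That trajectory idea can be sketched as a simple trend feature; the 3-month windows here are illustrative, not the feature definition we shipped:

```python
def trend_feature(monthly_values):
    """Relative change of the last 3 months vs. the prior 3.

    Negative values mean the customer is declining against their own
    baseline. (Illustrative windows; not the actual feature spec.)
    """
    recent = sum(monthly_values[-3:]) / 3
    prior = sum(monthly_values[-6:-3]) / 3
    if prior == 0:
        return 0.0
    return (recent - prior) / prior

# Moderate-but-declining scores as riskier than low-but-stable:
# trend_feature([10, 10, 10, 8, 6, 4]) -> -0.4  (declining)
# trend_feature([2, 2, 2, 2, 2, 2])    ->  0.0  (stable)
```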
## Model Training (Einstein Model Builder)
Einstein Model Builder let me train directly on our Data Cloud unified profiles. Key decisions:

- train on the unified profiles rather than a separate feature store, so scoring runs on the same live data
- lean on the 12-month trajectory features rather than absolute values
- tune the classification threshold for precision at the high-risk tier, since that tier drives the most intensive outreach
The model achieved 0.93 AUC on the holdout set. More importantly, the precision-recall tradeoff at our chosen threshold gave us 78% precision at the high-risk tier.
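The tradeoff involved can be illustrated with a small helper; the real tuning happened inside Einstein Model Builder, so this is only a sketch of the idea:

```python
def precision_at_threshold(scored, threshold):
    """Precision among customers flagged at or above a threshold.

    scored: list of (score, churned) pairs, churned in {0, 1}.
    Sweeping thresholds with a helper like this shows the tradeoff:
    raise the threshold and precision climbs while fewer customers
    get flagged.
    """
    flagged = [churned for score, churned in scored if score >= threshold]
    if not flagged:
        return 0.0
    return sum(flagged) / len(flagged)
```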
## Operationalization
Here's where most models die. Not this one.
Einstein scores update automatically as behavior changes. Those scores write back to the unified profile in Data Cloud. Data Cloud syncs to Marketing Cloud.
I built three journey branches:

- **High risk**: intensive retention outreach
- **Medium risk**: lighter-touch nurture messaging
- **Low risk**: the standard journey, with continued monitoring
The journeys run continuously. No manual exports. No weekly batch processes. When a customer's risk tier changes, their journey changes within hours.
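The score-to-tier routing is simple by design. A minimal sketch, with hypothetical cutoffs (the real boundaries came from the precision-recall analysis on the holdout set):

```python
# Hypothetical tier boundaries, for illustration only.
HIGH_RISK = 0.7
MEDIUM_RISK = 0.4

def risk_tier(score):
    """Map a churn score to the journey branch it should enter."""
    if score >= HIGH_RISK:
        return "high"    # intensive retention outreach
    if score >= MEDIUM_RISK:
        return "medium"  # lighter-touch nurture
    return "low"         # standard journey, keep monitoring
```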
**0.93 AUC**
The model performs exceptionally well at distinguishing churners from non-churners. But AUC alone doesn't matter—what matters is whether we can act on it.
**78% Precision at High-Risk Tier**
When we flag someone as high-risk, we're right 78% of the time. That's high enough to justify intensive intervention without wasting resources on false positives.
**Real-Time Scoring**
Scores update as behavior changes. A customer who suddenly stops opening the app sees their risk score increase within 24 hours, not the next monthly batch.
**Automated Action**
Zero manual intervention required to move customers into retention journeys. The system watches, scores, and acts.
**Behavioral features over static attributes**
Demographics barely moved the needle. What predicted churn was *behavior change*—declining engagement, increasing support tickets, shopping signals.
**Tight feedback loop**
Because scores update in real-time and journeys trigger automatically, we could measure intervention effectiveness and iterate quickly.
**Tiered response**
Not every at-risk customer needs the same intervention. Tiering let us allocate resources appropriately—intensive outreach for high-risk, lighter touch for medium.
**More sophisticated journey logic**
The current journeys are effective but relatively simple. With more time, I'd build branching logic that adapts based on intervention response.
**Churn reason classification**
The model predicts *if* someone will churn, not *why*. A secondary model classifying likely churn reason would enable more targeted interventions.
**A/B testing infrastructure**
We measured overall retention improvement but didn't have clean A/B tests of specific interventions. Building that infrastructure would accelerate learning.
The hard part of predictive modeling isn't the model. It's the operationalization.
A model that lives in a notebook is a science project. A model that triggers automated action is a system. The difference is everything.
We didn't build the most sophisticated churn model in the world. We built one that ships.