Joshua Vogel

Customer Success leader · AI builder · Sydney, AU · Open to work

joshua.ad.vogel@gmail.com · LinkedIn · The CS PRESS

Senior CS operator. Most recent book: $10.5M ARR enterprise portfolio at 147% NRR, with 100% logo retention and $934K of upsell pipeline generated in two quarters. Across 10+ years I've built Customer Success programs from scratch, run $25M ARR books at 120%+ NRR, and led customer save plays recognized by industry panels, most recently with the 2025 Creative Customer Success Leader Award.

What's different about the past 18 months: I've built an AI-augmented operating model for Customer Success that returns hours of capacity per week to high-value customer engagement. Patterns I built for myself have been picked up across the broader Customer Success organization. The result is a working template for what AI-native Customer Success looks like in practice. Operator judgment in front, agentic infrastructure underneath. Open to bringing this operating model to the next team.

10+ years CS leadership
$33M+ ARR managed across career
147% current net retention
1,200+ customers served

The thesis

Reactive Customer Success vs proactive Customer Success.

Most CS organizations operate reactively: an inbox plus a Salesforce login. The teams hitting 130%+ NRR aren't doing more of the same. They're operating with a fundamentally different model. Here's what changes.

Reactive baseline

Where most CS functions live today.

Proactive operating model

The work I do, scaled across teams I've led.

Reactive: Inbox-driven engagement. Reply when prompted; manage what's loud, miss what's quiet.
Proactive: Scheduled cadence and signal-driven outreach. Quiet accounts surface as engagement gaps before they become churn risks.

Reactive: QBR data extracted manually the day before each meeting. The whole prep day burns on spreadsheets.
Proactive: Always-on telemetry feeding executive-ready charts. QBR cycle compressed 75%. Prep time goes to narrative and strategy, not data wrangling.

Reactive: Single CSM thread to the customer. One relationship, one perspective, one point of failure.
Proactive: Account pod model (CSM + AE + SE + RAM). Twelve-stakeholder customer mapped to a four-function vendor pod with a single shared roadmap.

Reactive: Renewal scramble in the last 90 days. A sponsor change in month nine derails the conversation.
Proactive: Renewal posture from day one. Stakeholder map maintained; sponsor changes caught early; the expansion conversation always on the table.

Reactive: CSM workflow ad hoc per person. Knowledge stays in heads; one departure resets the team.
Proactive: Codified workflows as reusable systems. Account memory, voice profile, governance rules. New CSMs onboard onto an operating standard, not into a fog.

Reactive: AI used as a side tool. ChatGPT for drafts. Tools added with no measurement.
Proactive: AI as production infrastructure. Cost-tiered model routing. Value reported quarterly. Capacity returned to high-leverage human work.
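The cost-tiered model routing named above reduces to a small policy function. A minimal TypeScript sketch, with invented tier names, task kinds, and routing rules standing in for the production configuration:

```typescript
// Hypothetical cost-tiered router. The tiers, task kinds, and rules below
// are illustrative assumptions, not the production setup.
type Tier = "fast" | "standard" | "frontier";

interface Task {
  kind: "triage" | "draft" | "analysis";
  customerFacing: boolean;
}

function routeModel(task: Task): Tier {
  if (task.customerFacing) return "frontier"; // customer-facing work gets the strongest model
  if (task.kind === "triage") return "fast";  // mechanical triage goes to the cheapest tier
  return "standard";                          // internal drafting and analysis sit in the middle
}
```

The point of the policy is that cost discipline is written down once, not re-decided per task.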

Selected work

What I've built and what it produced

Five systems I've built and run in production against an enterprise portfolio. Each started as a recurring problem in my own CSM workflow. Each produces measurable capacity, consistency, or coverage that wouldn't exist otherwise, and travels with me to the next team.

The Customer Success operating surface I run on

Hours per day of context-switching eliminated. Faster response to at-risk signals. Better preparation for executive customer conversations.

A single web surface that aggregates everything a senior CSM needs to make daily decisions: today's customer meetings with prep brief, escalations needing my attention, customer signals from the past 24 hours, my prioritized daily plan, and recap drafts after meetings.

Before this existed, the same context lived across 8 different tools and tabs. The cost of switching between them was the dominant tax on the role. Compressing that into one surface is the single biggest productivity unlock I've shipped.

Stack Cloudflare Workers · D1 · Pages · Custom sync-daemon · Daily production use
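The aggregation idea behind the single surface is simple to state in code: normalize items from every source into one prioritized list. A minimal sketch, assuming invented source names and an urgency scale that are not the dashboard's real schema:

```typescript
// Sketch only: source names, fields, and the urgency scale are assumptions.
interface Signal {
  source: "calendar" | "escalations" | "telemetry";
  account: string;
  summary: string;
  urgency: number; // 1 (low) .. 5 (critical)
}

// Collapse heterogeneous inputs into one ordered daily plan.
function dailyPlan(signals: Signal[]): string[] {
  return [...signals]
    .sort((a, b) => b.urgency - a.urgency) // most urgent first
    .map((s) => `[${s.source}] ${s.account}: ${s.summary}`);
}
```

Everything else on the surface is presentation; the leverage is in the normalization.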

A library of repeatable Customer Success workflows

The same job, done in a fraction of the time, every time. Consistent customer experience regardless of CSM workload. Best-practice execution baked into the workflow itself, not relying on memory.

Fifty-plus self-contained workflows that handle the recurring jobs CS practitioners do manually: account research before customer meetings, follow-up emails after meetings, churn-risk scoring, escalation tracking, QBR data preparation, executive briefing decks, customer reactivation outreach.

CS practitioners spend an enormous share of their time on knowable, repeatable work. Capturing that work as reusable systems returns capacity to the part of the job AI can't do: the human relationship work that actually drives renewal and expansion.

Stack OpenCode + Anthropic Claude · Model Context Protocol (MCP) · Versioned, evaluated, self-improving over time
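One of those recurring jobs, churn-risk scoring, reduces to a weighted sum over account signals. A sketch under assumed weights and signal names (the production skill's actual inputs aren't shown here):

```typescript
// Illustrative churn-risk score: every signal, weight, and threshold below
// is invented for the sketch.
interface AccountSignals {
  daysSinceContact: number;
  openEscalations: number;
  usageTrendPct: number; // negative = declining usage
  daysToRenewal: number;
}

function churnRisk(s: AccountSignals): number {
  let score = 0;
  if (s.daysSinceContact > 21) score += 3;   // quiet account
  score += Math.min(s.openEscalations, 3);   // cap escalation weight
  if (s.usageTrendPct < -10) score += 2;     // meaningful usage decline
  if (s.daysToRenewal < 90) score += 2;      // inside the renewal window
  return Math.min(score, 10);                // clamp to a 0..10 scale
}
```

Codifying the rubric is what makes the score consistent across CSMs and auditable after the fact.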

Patterns adopted across the broader CS organization

Tools I built for myself, picked up by peers across the global CS team where I work. The same time-saving and consistency benefits I get, scaled.

Multiple workflows from my personal library (a customer usage analytics skill, a QBR report generator, a repository bootstrap tool, a doc-drift audit skill) sanitized and contributed to my employer's central Customer Success tooling library. Reviewed via the same merge-request process used by engineering teams.

Internal AI tooling at scale is valuable when one practitioner's improvements compound across an organization, not when each person reinvents the wheel. Showing I can lead this kind of upstream contribution while still hitting the portfolio number is the differentiated profile.

Scope Maintainer-level access on the shared library · Active peer collaboration on a unified hub

An operating standard for AI in customer-facing work

AI assistance that's reliable enough to run against real customer accounts without erosion of trust, voice, or accuracy. Mistakes that would have damaged customer relationships, codified as guardrails before they happen.

A written-down, version-controlled set of rules and protocols defining how AI systems handle customer-facing work safely. Covers voice and tone discipline, when to escalate to a human, what to never automate, how to handle customer data, how to attribute work honestly, and how to surface uncertainty rather than fake confidence.

The biggest risk in AI-augmented customer-facing work isn't model capability. It's drift between AI behavior and the standards a senior operator would apply. This standard closes that gap operationally, not philosophically. It's the kind of governance work most organizations will need within the next 12 months and almost none have written down today.

Format ~3,000 lines of git-managed governance · Reviewed and tightened on a fortnightly autonomous audit cadence

An always-on layer that surfaces what matters

Important customer signals get caught early. The morning plan reflects current reality, not yesterday's. Nothing important falls through the cracks because it was buried in a notification stream.

Forty-plus scheduled jobs running before, during, and after my working day to surface relevant signals: meeting prep before customer calls, engagement scans across the book of business, escalation triage, end-of-day wrap-ups, fortnightly audits of what I might have missed.

CSM workflows are fundamentally driven by external signal. Operating reactively means missing signals. This layer makes the signals visible before I have to ask, which is what separates senior IC operating from junior IC reaction.

Reliability Auth-aware · Idempotent · Fail-safe · Production-grade
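Idempotency is the load-bearing reliability claim for a 40-job layer: a re-run must never double-alert. A minimal sketch of the pattern, with an in-memory set standing in for the real state backend and the 21-day threshold mirroring the engagement-gap rule above:

```typescript
// Assumed state store: the real jobs would persist this, not hold it in memory.
const alerted = new Set<string>();

// Scan the book for quiet accounts; safe to re-run for the same date.
function scanEngagementGaps(daysQuiet: Map<string, number>, runDate: string): string[] {
  const alerts: string[] = [];
  for (const [account, days] of daysQuiet) {
    const key = `${runDate}:${account}`;
    if (days > 21 && !alerted.has(key)) {
      alerted.add(key); // record the alert so a re-run is a no-op
      alerts.push(account);
    }
  }
  return alerts;
}
```

Keying on date plus account is the whole trick: the job can crash mid-run and be restarted without duplicate noise.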

Customer Success is easy.
People are hard.
Software is just the part nobody trips over.

Joshua Vogel · The CS PRESS · 2026

Flagship evidence

What the work actually looks like

Three artefacts modeled on real customer engagements, rendered through my qbr-charts skill. The customer profile is a synthetic analog of an enterprise financial services account in my book; data shapes, peak values, and visual treatment are unchanged from production.

Selected outputs in detail

The systems behind the artefacts

Five systems that produce the work above. Each started as a recurring problem in my own CSM workflow. Each is in daily production use. All examples use synthetic customer data; actual customer information is never shown publicly.

CSM Hub Dashboard · Daily view

The single surface I open every morning

Today's customer meetings · 3
09:30 · Acme Industries · QBR · 6 attendees · PREP READY
11:00 · Globex Logistics · Weekly sync · RECAP DUE
14:30 · Helios Energy · Renewal kickoff · AT-RISK

Portfolio signals · last 24h

Engagement gaps · 3 accounts > 21 days quiet
Hot escalations · 2 unresolved · INC-1142, INC-1156
Renewals next 90 days · 7 accounts · $4.2M ARR
Pipeline activity · +$180K added (Q4 cross-sell)

Morning brief · 09:45 AEST · Daily output

How the day starts, before I do

FRIDAY · 2026-05-08 · 09:45 AEST · 7 customer signals · 3 meetings · 2 escalations

Top of mind today

  • Helios Energy renewal kickoff at 14:30. $1.2M annual contract; sponsor moved last quarter, new contact has 2 weeks of context. Brief draft prepped. Risk score: 6/10.
  • Acme Industries QBR at 09:30. Multi-product expansion ready to discuss. Bot Management adoption up 40% QoQ; ready to position Application Security upsell.
  • Globex Logistics weekly sync at 11:00. Recap from last week shows 3 unresolved action items. Following up on integration timeline.

Engagement gaps · 3 accounts

  • Polaris Financial: 24 days since last meaningful contact. Last 3 emails one-way (mine). Suggest re-engagement outreach.
  • Nimbus Software: 22 days quiet. Renewal in 73 days. Worth a check-in.
  • Atlas Retail: 21 days quiet. No meetings booked. Possible sponsor change; verify on LinkedIn.

Hot escalations · 2 unresolved

  • INC-1142 (Globex): 5 days unresolved. Ticket sitting on Engineering. Last update 48h ago. Worth nudging in today's weekly sync.
  • INC-1156 (Acme): 2 days unresolved. New escalation; product team aware. No action needed yet.

CS Skills Library · 54 production skills

The library I built one workflow at a time

Each skill is a self-contained, versioned, evaluated workflow that handles a recurring CSM job. Highlighted skills have been contributed upstream to the broader CS organization.

Each skill is git-versioned and follows a maturity progression: draft → tested → trial → crystallized.

QBR pipeline · End-to-end deck automation · Daily production use

The biggest single capacity unlock I've shipped

What used to take 4-6 hours of manual chart work per QBR now compresses by 75%. Across a 44-account enterprise portfolio that's 150-180 hours returned per quarter, or roughly four full working weeks of senior CSM time redirected from spreadsheet wrangling to strategic preparation, executive relationship work, and team coaching.

Most CSMs lose the day before a QBR to data extraction, copy-paste, chart formatting, and slide assembly. The QBR pipeline turns that day into a 30-minute run: telemetry pulled, charts rendered, slides assembled, deck exported as PDF and PPTX. The opener slide ("Business Priorities") proves I understand the business before showing any usage data. The chart slides translate that data into executive-ready visual framing. The CSM's job becomes the only thing the customer actually values: framing the story, refining the narrative, anticipating the executive question.

The dollar math is real. At a fully-loaded senior-CSM cost, the time recovered is six-figure annual capacity per CSM, and it compounds across a team. But the bigger impact is qualitative. CSMs who aren't burning the day before a QBR show up to the meeting prepared to think, not just present. That's the difference between a tactical reporting cadence and a strategic partnership.

The three rendered artefacts (Business Priorities opener, WAF chart, Bot Management chart) live above in the flagship section so the evidence is findable in the first 60 seconds, not buried in this section's detail.

Stack Node.js + Puppeteer (headless Chrome) · Custom HTML/CSS chart templates · JSON-driven data model · Synthesized from telemetry + public filings + internal research · 14-slide deck output, PDF + PPTX exported
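The templating step at the heart of the pipeline can be sketched with nothing but string interpolation: JSON chart data in, an HTML fragment out, ready for headless Chrome to render. Field names and markup below are illustrative, not the production templates:

```typescript
// Assumed shape of the JSON-driven data model; the real templates are
// custom HTML/CSS, not this sketch.
interface ChartData {
  title: string;
  series: { label: string; value: number }[];
}

// Render one bar chart as an HTML fragment, bars scaled to the max value.
function renderBarChart(data: ChartData): string {
  const max = Math.max(...data.series.map((p) => p.value));
  const bars = data.series
    .map((p) => `<div class="bar" style="width:${Math.round((p.value / max) * 100)}%">${p.label}: ${p.value}</div>`)
    .join("\n");
  return `<section><h2>${data.title}</h2>\n${bars}\n</section>`;
}
```

In the real pipeline this fragment is what Puppeteer screenshots into slide-ready charts; the data model stays the single source of truth.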

Agentic Codex · ~3,000 lines of governance

How AI handles customer-facing work, written down

A version-controlled rulebook that AI assistants read at every customer-facing touchpoint. Excerpts from the table of contents:

01
CRM is the source of truth for the account team. No inferences. Helping doesn't transfer ownership.
02
Customer-facing email output format. Plain text in fenced code blocks; structured by ALL CAPS section headers and bullet items; no markdown rendering tricks.
03
Date-day verification, non-negotiable. Every date in customer output must be machine-verified against `cal`, never inferred.
04
Time-of-day verification. Run `date` before stating elapsed time, current time, or remaining time. Hallucinated time is the most common LLM failure mode.
05
Tool-first context retrieval. When the answer exists in a tool, grab it. Don't ask the user for what an MCP can answer in seconds.
06
Declaration discipline. Never claim "done" without verifying user-level success, not just file-state success.
07
Reproduce-first debugging. Before opening any source file in response to a UI bug, reproduce the symptom with DevTools open.
08
Sub-agent delegation gates. 5 mechanical gates fire BEFORE inline tool calls; main session reserved for judgment-heavy work.
12 more rules covering security, attribution, voice and tone, customer-artefact destination, sensitive file handling, and process-state-vs-file-state debugging.
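Rule 03 is mechanical enough to show directly: never state a weekday without machine-verifying it. A sketch with a hypothetical helper of my own naming, where JavaScript's Date stands in for the `cal` check the codex actually mandates:

```typescript
// Hypothetical helper illustrating rule 03; the codex specifies `cal`,
// not this function.
function verifyWeekday(isoDate: string, claimedDay: string): boolean {
  const days = ["Sunday", "Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday"];
  const actual = days[new Date(`${isoDate}T00:00:00Z`).getUTCDay()]; // UTC avoids timezone drift
  return actual === claimedDay;
}
```

Any customer-facing draft that states a date runs through a check like this before it ships.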

System architecture

How the pieces fit together

SIGNAL: Customer signals (Email · Calendar · Slack) · CRM systems (Salesforce) · Escalation streams (Jira · PagerDuty) · Schedules + cadence (launchd · cron)
ORCHESTRATION: OpenCode + Anthropic Claude (MCP) · 54 production skills · Multi-agent orchestration · Agentic codex governance · 42 scheduled jobs · Self-healing maturity progression
INFRASTRUCTURE: Cloudflare Workers (sync-daemon · skill-bridge) · Cloudflare D1 (source of truth) · Cloudflare Pages (dashboard frontend)
SURFACE: CSM Hub Dashboard · Daily decisions, single surface

User → Signal collection → Agentic orchestration → Cloudflare infrastructure → Operating surface


Career snapshot

Where I've worked

2025 — Now
Cloudflare · Customer Success Manager (Enterprise) · Sydney, AU
$10.5M ARR @ 147% NRR · 44 enterprise accounts · $934K Q3-Q4 pipeline · 100% logo retention · QBR cycle compressed 75% via AI tooling I built
2023 — 2025
aboutGOLF · Director of Customer Success and Support · US (Remote)
Built CS from scratch · 1,200 accounts · 98% retention · 44% churn reactivation ($400K) · $2.5M upsell ARR · 1-10 health scoring system in Salesforce · Cart-to-Curb e-commerce automation
2019 — 2023
WithYouWithMe · Head of Enterprise Account Management · Sydney, AU
$25M ARR @ 120% NRR · Promoted twice in 3.5 years · Grew Accenture from $1.3M → $5M ARR in 90 days · Led an 8-person CSM team across global expansion (UK + Canada launches)
2012 — 2019
U.S. Navy · IT Infrastructure Project Manager · Naples, Italy
140+ infrastructure projects across EMEA · Navy Achievement Medal · DISA Facility Control Office of the Year (2017) · The technical foundation underneath every CS role since



Track record at scale

What the work added up to

The numbers behind the roles. What I built scaled to thousands of accounts. What I led scaled to dozens of teammates. Both axes matter for senior CS work.

Net Revenue Retention across roles · Senior CS outcomes, 2019–2026 · SaaS benchmark ~110%
  • WithYouWithMe (2019–2023): 120% NRR · $25M ARR · 30 accounts
  • aboutGOLF (2023–2025): 108% NRR · $5.4M · 1,200 accounts
  • Cloudflare (2025–present): 147% NRR · $10.5M · 44 accounts

Three roles. Three different stages, segments, and product categories. NRR consistently above the SaaS benchmark of ~110%. Most recent reading: top-decile.

Customer-facing scale

What 1:Many digital success looks like in practice. The cohort numbers behind the headline retention metrics.

1,200 Customer accounts

Active book at aboutGOLF (residential, SMB, mid-market) running on a single CS function with four people.

98% Logo retention

Year-over-year subscription retention across the full 1,200-account cohort. Two points below stretch target on commercial churn.

44% Churn reactivation

$400K of churned ARR recovered through cross-functional Happy Path playbooks and re-onboarding programs. Industry-panel award winner.

67% CSM capacity scale

From 150 to 250+ accounts per CSM via a "1:Many" digital success model: monthly Open Office Hours, Town Halls, automated touchpoints.

Leadership scope & mentorship

The team-leading half of the job. Senior CS isn't a single-thread contributor role; it's a force multiplier across the people you work with.

12 Direct team led

CS + Support team at aboutGOLF when the two functions merged under one Director. CSAT up 10%, response times down 83%.

8 Enterprise CSM team

Built and ran the enterprise CSM team at WithYouWithMe across UK, Canada, and Australia expansion phases.

100+ Mentees & coachees

Practitioners across career stages worked through the TORCHED mentorship framework I created. Plus Catalyst Growth Coaching.

2 Promotions in 3.5 yrs

Started as IC enterprise CSM at WithYouWithMe; left as Head of Enterprise Account Management. Promotion arc that maps directly to senior IC + leadership readiness.


Voice and recognition

Writing, speaking, awards


What's next

Where I'm pointing this work

The pattern I've built is portable. The next chapter is bringing this operating model to a team that wants AI-native CS at scale, not as a science experiment. Three problems I want to work on. First, the IC-to-leader transition for AI-fluent CSMs: what does career growth look like when the scope changes from accounts to systems? Second, the metrics gap between AI-augmented capacity and traditional CS reporting: what do we measure when 75% of QBR prep time has been compressed? Third, the cross-functional shape of post-AI CS: where do the lines move between CS, SE, and FDE when tools are agentic? Open to conversations on any of these.


Get in touch

Talk to me

I'm actively exploring next-chapter opportunities in senior Customer Success leadership, AI-native CS, and roles that bridge customer relationship work with AI infrastructure. The fastest path to a conversation is a direct email.

For AI-engineering hybrid roles, the AI & Automation Portfolio Companion ↗ goes deeper on the daily systems, the operating model, and concrete outcomes (151 dashboard commits, 54 skills, 42 scheduled automations). One page, evidence-heavy.

joshua.ad.vogel@gmail.com  ·  LinkedIn  ·  Subscribe to The CS PRESS

Sydney-based with Australian Permanent Residency. Open to senior IC and leadership conversations in Customer Success and AI-native CS / Solutions Engineering globally, on-site in Sydney, hybrid, or fully remote.