The n8n Automation Playbook: 10+ Workflows in Production

I used to be a snob about low-code automation. "Real engineers write code," I told myself, and spent weeks building custom microservices for jobs that could have been done in n8n in an afternoon.

Then I joined Offplan, inherited a mess of cron jobs and Python scripts, and rebuilt the whole thing in n8n. Six months later, I had 10+ production workflows running reliably, and I was a convert. Here is the playbook.

Why n8n Won Me Over

Three things flipped my opinion:

  1. Visual flow debugging: you can see exactly where data got stuck. No more tailing logs and guessing.
  2. Built-in credentials vault: every team member can use API keys without ever seeing them in plaintext.
  3. Code nodes when you need them: n8n is not a no-code trap. Any node can drop to a JavaScript or Python function when the visual stuff runs out of rope.

That third one is the key. n8n is not "drag and drop instead of coding." It is "drag and drop until the abstraction breaks, then code."

Workflow 1: WhatsApp OTP Sessioning

Problem: users needed to verify their phone numbers during property inquiry without downloading an app. Solution: a Twilio WhatsApp webhook hooked into n8n that generates a 6-digit code, stores it in Redis with a 5-minute TTL, and verifies on submission.

The clever part was the session state machine. Users sometimes take 4 minutes to come back. The workflow uses a wait node with a webhook trigger that fires when the user submits the code, with a fallback timeout path.

Workflow 2: Google Drive to Vector DB Ingestion

Problem: the marketing team drops new property documents into a Google Drive folder. The RAG system needs them indexed within 10 minutes.

Workflow:

Trigger: Google Drive (new file in folder)
  -> Download file
  -> Extract text (PDF, DOCX, images via OCR)
  -> Chunk into 800-token passages
  -> Embed via OpenAI embeddings API
  -> Upsert to Weaviate
  -> Slack notification: "Indexed N new docs"

I added retries on each step with exponential backoff, plus a deduplication check using a content hash so the same doc does not get re-indexed when someone renames it.

Workflow 3: Multi-Channel MCP-Style Orchestration

For Metamorphic, I built an orchestration workflow that routes incoming requests across WhatsApp, email, the website chat, and Instagram DMs into a single unified intake queue. Each channel has its own adapter, but they all feed the same downstream processing:

  • Intent classification via Claude
  • Lead scoring based on historical conversion data
  • Auto-routing to the right team member in Pipedrive
  • Fallback to human review for edge cases
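
The adapter layer boils down to mapping each channel's payload into one intake shape so the downstream steps see a single schema. A sketch under assumed field names (the real payload shapes differ per provider):

```javascript
// Normalize each channel's payload into one unified intake record.
// Field names here are illustrative assumptions, not the production schema.
function normalizeInbound(channel, payload) {
  switch (channel) {
    case 'whatsapp':
      return { channel, from: payload.From, text: payload.Body, receivedAt: Date.now() };
    case 'email':
      return { channel, from: payload.sender, text: payload.plainText, receivedAt: Date.now() };
    case 'webchat':
      return { channel, from: payload.sessionId, text: payload.message, receivedAt: Date.now() };
    case 'instagram':
      return { channel, from: payload.igUserId, text: payload.text, receivedAt: Date.now() };
    default:
      // Unknown channels fail loudly instead of silently dropping leads.
      throw new Error(`unknown channel: ${channel}`);
  }
}
```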

Workflow 4: AI Tutor Content Flow

This one was fun. Users can ask a property question in natural language. The workflow:

  1. Captures the question via webhook
  2. Runs it through a retrieval step against the knowledge base
  3. Generates an answer with citations
  4. Passes the answer through a second LLM for fact-checking against the retrieved sources
  5. If fact-check fails, escalates to a human reviewer on Slack
  6. Logs everything to Supabase for the eval harness
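
The branch in steps 4 and 5 is a simple gate on the fact-checker's verdict. A sketch, assuming the second LLM is prompted to return a `{ supported, unsupportedClaims }` object; that shape is my illustration, not the actual prompt contract:

```javascript
// Gate the generated answer on the fact-check verdict. Supported answers
// flow to the user; unsupported ones escalate to Slack with the reasons.
function routeAnswer(factCheck, answer) {
  if (factCheck.supported) {
    return { action: 'publish', answer };
  }
  return {
    action: 'escalate',
    slackMessage: `Fact-check failed: ${factCheck.unsupportedClaims.join('; ')}`,
    answer,
  };
}
```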

Workflow 5: Content QA and Refresh

Our 300k+ SEO pages need periodic refreshes as listings change. A scheduled workflow runs every 6 hours:

  • Fetches listings updated in the last 6 hours from the DB
  • Re-generates the relevant page content via an LLM call
  • Pushes updates to Botble CMS
  • Notifies search engines via IndexNow
  • Updates the sitemap

Workflow 6: Scheduled Health Checks

Every 5 minutes, a workflow pings all critical endpoints, checks TLS certificates, verifies CDN cache hit rates, and runs a synthetic RAG query. If anything fails the health criteria, it pages me via Telegram and Slack simultaneously.
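
The "fails the health criteria" decision is just threshold checks over a snapshot gathered by the earlier nodes. A sketch with illustrative thresholds (the production values are tuned differently):

```javascript
// Evaluate one endpoint's health snapshot against paging criteria.
// Thresholds are illustrative assumptions, not the production values.
function evaluateHealth({ httpStatus, tlsDaysLeft, cdnHitRate, ragLatencyMs }) {
  const failures = [];
  if (httpStatus !== 200) failures.push(`http ${httpStatus}`);
  if (tlsDaysLeft < 14) failures.push(`tls expires in ${tlsDaysLeft}d`);
  if (cdnHitRate < 0.8) failures.push(`cdn hit rate ${cdnHitRate}`);
  if (ragLatencyMs > 5000) failures.push(`rag slow: ${ragLatencyMs}ms`);
  return { healthy: failures.length === 0, failures };
}
```

When `failures` is non-empty, the workflow fans out to the Telegram and Slack nodes in parallel with the failure list as the message body.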

Workflows 7 to 10: The Supporting Cast

  • Daily analytics digest: pulls metrics from GA4, Mixpanel, and our DB, posts a formatted summary to Slack at 9am
  • Lead enrichment: when a new lead hits Pipedrive, fetches LinkedIn info, company data, and enriches the deal record
  • Invoice automation: generates, sends, and tracks invoices through the full accounts receivable cycle
  • Competitor monitoring: scrapes selected competitor sites weekly, diffs the content, flags significant changes

The Patterns That Matter

1. Idempotency Keys Everywhere

Every external API call carries an idempotency key. Retries are safe. Failed workflows can be replayed without creating duplicates.

2. Webhook Signatures

Incoming webhooks from Twilio, Stripe, and Pipedrive all get their HMAC signatures verified before any work is done. This is non-negotiable in production.

3. Audit Trails

Every workflow logs its inputs and outputs to a central Postgres audit table. When something goes wrong at 3am, I can replay the exact sequence of events.

4. Human In The Loop

High-intent events (big leads, possible fraud, edge cases in AI classification) get routed to Slack with approve or reject buttons. The workflow pauses until someone clicks.

5. Back Pressure

Rate-limited APIs (OpenAI, Twilio) are wrapped in a queue node with a token bucket. The workflow slows down gracefully instead of hitting 429 errors.
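
A minimal token bucket looks like this; the queue node drains items only while `tryRemove()` returns true. The injectable clock is my addition for testability, not part of any n8n API:

```javascript
// Minimal token bucket: refills at `ratePerSec`, bursts up to `capacity`.
// The clock is injectable so the refill logic can be tested deterministically.
class TokenBucket {
  constructor(ratePerSec, capacity, now = Date.now) {
    this.ratePerSec = ratePerSec;
    this.capacity = capacity;
    this.tokens = capacity;
    this.now = now;
    this.last = now();
  }

  // Returns true and consumes a token if one is available; false otherwise.
  tryRemove() {
    const t = this.now();
    const elapsedSec = (t - this.last) / 1000;
    this.tokens = Math.min(this.capacity, this.tokens + elapsedSec * this.ratePerSec);
    this.last = t;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}
```

When `tryRemove()` returns false, the workflow waits and retries rather than firing the request, so the provider never sees a burst it has to 429.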

6. Secrets Management

n8n's built-in credentials system is good, but I go further: all production credentials live in AWS Secrets Manager and get rotated monthly. n8n reads them via a custom credential type.

7. Environment Separation

Dev, staging, and production are three separate n8n instances. Workflows are exported as JSON and moved via a CI pipeline, not clicked through the UI.

What I Would Do Differently

If I were starting from scratch today, I would:

  • Use n8n's code nodes for orchestration only, not heavy computation. Push heavy work to dedicated microservices.
  • Start with version control for workflow JSON from day one. Retrofitting it later is a pain.
  • Build a shared "common steps" library earlier. I kept rewriting the same retry logic in 10 different workflows.
  • Use sub-workflows aggressively for any logic that appears in more than one place.

n8n is not a toy. For glue work, webhook handling, scheduled jobs, and API orchestration, it is the most productive tool I have ever used. Stop reinventing these wheels in Python.

If you want to see any of these workflows in more detail, ping me. I am considering open-sourcing a cleaned-up template set.