Lead-research automation pipeline
For a small B2B agency: scrapes a target list, enriches each lead with public data, scores fit with GPT-4, and writes a personalised cold email — all unattended, with error alerts in Slack.
A small B2B agency was burning ~14 hours a week on lead research — copy/pasting names, hunting for emails, drafting cold outreach. I rebuilt the entire flow as a self-running n8n workflow.
The pipeline scrapes a target list (Apify + custom nodes), enriches each lead with public data (LinkedIn, company sites, news), scores company-fit with GPT-4 against the agency's ICP, and finally drafts a personalised cold email tailored to each lead's recent activity.
Everything runs on a schedule, writes to Google Sheets, and pings Slack on error — so the team only sees the pipeline when it actually needs them.
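The per-lead flow above can be sketched as a simple loop: each stage (enrich, score, draft) can fail independently, and failures are collected rather than halting the batch, so one bad lead never blocks the run. This is an illustrative sketch, not the actual n8n node code; function and field names are placeholders.

```javascript
// Illustrative sketch of the per-lead pipeline loop.
// Stage and field names are hypothetical, not the real n8n node names.
async function runPipeline(leads, stages) {
  const results = [];
  const errors = [];
  for (const lead of leads) {
    try {
      let record = { ...lead };
      for (const stage of stages) {
        record = await stage(record); // enrich → score → draft
      }
      results.push(record);
    } catch (err) {
      // Collect instead of throwing: one bad lead never halts the batch.
      errors.push({ lead: lead.name, stage: err.stage || "unknown", message: err.message });
    }
  }
  return { results, errors }; // `errors` feeds the Slack alert step
}
```

Collecting errors per lead is what makes "zero touch" viable: the batch always finishes, and the team reviews failures from the alert rather than restarting the run.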
Target-list scraping
Resilient scraping via Apify + custom n8n nodes — handles rate limits and bot checks without manual intervention.
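The usual way to make scraper calls resilient to rate limits is retries with exponential backoff and jitter. A minimal sketch, assuming a stand-in `fn` for the scraper call (the real workflow calls Apify actors; that API call is not reproduced here):

```javascript
// Exponential backoff with "equal jitter": spreads retries out so a
// rate-limited endpoint has time to recover. Parameters are illustrative.
function backoffDelayMs(attempt, baseMs = 1000, capMs = 30000) {
  const exp = Math.min(capMs, baseMs * 2 ** attempt);
  return exp / 2 + Math.random() * (exp / 2); // between exp/2 and exp
}

// Wraps any async call (e.g. a scraper request) in capped retries.
async function withRetries(fn, maxAttempts = 5) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt === maxAttempts - 1) throw err; // out of attempts
      await new Promise((r) => setTimeout(r, backoffDelayMs(attempt)));
    }
  }
}
```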
AI fit-scoring
GPT-4 scores each lead against the agency's ICP rubric and tags it Hot / Warm / Skip.
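The scoring step boils down to asking the model for a structured score and mapping it onto the Hot / Warm / Skip tags. A sketch of that mapping, with illustrative thresholds (the real rubric lives in the prompt, not shown here):

```javascript
// Map a numeric fit score (0–100) to a tag. The 70/40 thresholds are
// illustrative assumptions, not the agency's actual cut-offs.
function tagLead(score) {
  if (typeof score !== "number" || Number.isNaN(score)) return "Skip"; // unparseable output
  if (score >= 70) return "Hot";
  if (score >= 40) return "Warm";
  return "Skip";
}

// The model is asked to reply with strict JSON, e.g. {"score": 82, "reason": "..."};
// parsing defensively means a malformed reply degrades to "Skip" instead of crashing.
function parseScore(raw) {
  try {
    return Number(JSON.parse(raw).score);
  } catch {
    return NaN;
  }
}
```

Failing closed to "Skip" on bad model output is deliberate: a mis-tagged Hot lead costs a wasted email, but a crashed run costs the whole batch.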
Personalised cold-email drafts
GPT-4 writes a tailored opener referencing the lead's recent posts or company news — ready for review.
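Personalisation comes from assembling the prompt out of the enriched lead record, falling back from a recent post to company news when one is missing. A sketch with hypothetical field names (the production template is longer):

```javascript
// Build the email-drafting prompt from an enriched lead record.
// Field names (recentPost, companyNews, etc.) are illustrative.
function buildEmailPrompt(lead) {
  const hook = lead.recentPost
    ? `their recent post: "${lead.recentPost}"` // prefer the lead's own activity
    : `recent company news: "${lead.companyNews || "none found"}"`;
  return [
    `Write a 3-sentence cold-email opener to ${lead.name}, ${lead.role} at ${lead.company}.`,
    `Reference ${hook}.`,
    `Tone: specific and low-pressure. No generic flattery. End with a question.`,
  ].join("\n");
}
```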
Slack error alerts
Failures trigger a structured Slack alert so a tiny team can keep the system reliable without monitoring it.
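A structured alert here means a Slack Block Kit payload posted to an incoming webhook, not a bare text message. A minimal sketch, assuming hypothetical error fields (Block Kit's `header`/`section` block shapes are Slack's real format):

```javascript
// Build a Slack Block Kit payload for a pipeline failure.
// The error object's fields (stage, lead, message) are illustrative.
function buildSlackAlert(error) {
  return {
    blocks: [
      {
        type: "header",
        text: { type: "plain_text", text: `Pipeline failure: ${error.stage}` },
      },
      {
        type: "section",
        fields: [
          { type: "mrkdwn", text: `*Lead:*\n${error.lead}` },
          { type: "mrkdwn", text: `*Error:*\n${error.message}` },
        ],
      },
    ],
  };
}
// Posted via the workflow's Slack webhook, roughly:
// await fetch(SLACK_WEBHOOK_URL, { method: "POST", body: JSON.stringify(payload) });
```

Structured fields (stage, lead, message) let the team triage from the notification alone, which is what keeps a tiny team from having to monitor the pipeline.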
- ↓ 14 hours / week of manual lead research time
- ↑ 3× more outbound volume with the same team
- Zero-touch operation — runs while the team sleeps