Built workflow processing 150 patient forms monthly. Worked great in testing. Week 2 in production, client calls: “Stopped at form 43 again.”
Checked logs. Workflow crashed. Memory overflow. Same spot every time. Document 43.
THE PROBLEM
n8n loaded ALL 150 documents into memory at workflow start. By document 43, memory maxed out. Workflow crashed. Lost all progress.
Testing with 10 documents worked fine. Production with 150 revealed the limit.
WHAT WAS HAPPENING
Gmail Trigger → pulled 150 emails with PDFs → loaded ALL into memory → started processing → crash at document 43 when memory filled
My laptop: 8GB RAM. Enough for testing. n8n cloud instance: 2GB RAM. Not enough for 150 simultaneous PDFs.
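The failure mode boils down to this, sketched outside n8n. `fetchPdf` and `extractData` are hypothetical stand-ins for the real nodes, not n8n APIs:

```javascript
// Hypothetical stand-ins: fetchPdf returns a file buffer, extractData parses it.
const fetchPdf = async (id) => Buffer.from(`pdf-${id}`);
const extractData = (buf) => buf.toString().toUpperCase();

// Anti-pattern: fetch every attachment before processing any of them.
// Peak memory = ALL 150 files at once.
async function processAllAtOnce(ids) {
  const buffers = await Promise.all(ids.map(fetchPdf));
  return buffers.map(extractData);
}

// Fix: fetch, process, release one document at a time.
// Peak memory = ONE file, no matter how many documents arrive.
async function processOneAtATime(ids) {
  const results = [];
  for (const id of ids) {
    let buffer = await fetchPdf(id);
    results.push(extractData(buffer));
    buffer = null; // drop the reference so GC can reclaim it before the next fetch
  }
  return results;
}
```

Both return the same results. Only the peak memory differs, and only under production volume.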
THE FIX
Process documents ONE AT A TIME. Clear memory between each. Use Loop node instead of processing all simultaneously.
THE NEW WORKFLOW
1. GMAIL TRIGGER: Get emails with PDFs
2. SPLIT IN BATCHES: Process 1 at a time
3. LOOP NODE: For each document…
– Parse document
– Extract data
– Post to system
– Clear variables
4. WAIT NODE: 2 seconds between documents
5. ERROR TRIGGER: Alert on failures
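Outside n8n, steps 2 through 5 reduce to one loop. The four handler names are hypothetical stand-ins for the workflow's nodes, not real n8n APIs:

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// handlers: { parseDocument, postToSystem, saveProgress, alertOnFailure }
// All four are hypothetical stand-ins for the nodes in the workflow above.
async function runWorkflow(documents, handlers, delayMs = 2000) {
  for (let i = 0; i < documents.length; i++) {
    try {
      const data = handlers.parseDocument(documents[i]); // parse + extract
      await handlers.postToSystem(data);                 // post to system
      await handlers.saveProgress(i);                    // progress survives a crash
    } catch (err) {
      await handlers.alertOnFailure(i, err);             // Error Trigger equivalent
      throw err;                                         // stop here; resume from i later
    }
    await sleep(delayMs);                                // Wait node: 2 s between documents
  }
}
```

Saving progress inside the loop is what makes resume-from-failure possible: a crash at document 43 costs you one document, not 42.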
KEY CONFIGURATION
Split In Batches node: Batch size = 1
This forces sequential processing. Memory clears after each document.
Wait node: a 2-second delay prevents API rate limits and gives memory time to clear.
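In the exported workflow JSON, the batch setting looks roughly like this (field names recalled from n8n's export format; verify against your n8n version):

```json
{
  "name": "Split In Batches",
  "type": "n8n-nodes-base.splitInBatches",
  "parameters": {
    "batchSize": 1
  }
}
```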
THE RESULTS
BEFORE:
– Crashed at document 43 consistently
– Lost all progress on crash
– Had to manually reprocess from start
– Client frustrated with unreliability
AFTER:
– Processes all 150+ documents successfully
– Progress saved after each document
– If crash occurs, resume from failure point
– Zero memory issues
PERFORMANCE:
Processing time increased: 150 documents now take 8 minutes (the old version was on pace for 3, but never finished)
But 8 minutes reliable beats 3 minutes that crashes at document 43
CLIENT REACTION
“I don’t care if it takes 8 minutes. I care that it FINISHES. Previous developer couldn’t figure this out in 2 months.”
Charged an extra $400 for the fix. Worth it to them.
THE LESSON
Test with production volume, not sample size. 10 documents is easy. 150 documents reveals memory limits.
Always use Split In Batches for document processing. Sequential processing prevents memory crashes.
