# How I Built a Client Reporting System That Runs Itself (With n8n)
Client reports were taking 2-3 hours every Friday. After one weekend building an n8n automation, they now take 10 minutes to review and send. Here's the workflow, the mistakes I made, and the parts that surprised me.
Every Friday afternoon I'd sit down and spend 2-3 hours writing client reports. Pull the time logs. Open the project tracker. Count the completed tasks. Write a summary of what happened this week. Format it into an email. Send it.
I was doing this for six clients. Multiply that out: somewhere between two and three hours every single Friday, doing the same ritual, copying the same data from the same places into the same email structure.
I knew I should automate it. I kept saying "I'll do it this weekend." I said that for four months.
Then a client asked a question about last week's report that I couldn't answer quickly, and I realized I wasn't even building a record of this work anywhere. I was writing reports and then the data was gone.
That was the motivation I needed.
## What I Wanted the System to Do
I mapped it out before touching n8n. The system needed to:
- Run automatically every Friday at 4 PM
- Pull completed tasks from ClickUp for each client project
- Pull time tracking from Toggl for each client
- Merge the data and calculate totals (hours, tasks, estimated vs actual)
- Generate a clean, readable HTML email — not a wall of text
- Send to each client's email address
- Archive a copy to Notion so I have a historical record
- Take 10 minutes of my time maximum (to review and optionally add a personal note)
The last point was important. I didn't want to fully automate something that goes to clients without any human eyes on it. I wanted automation to do the 80% work, and me to do the 20% that requires judgment.
## The Data Architecture First
Before I built any workflow, I created a "Clients" database in Notion with these fields:
- Client name
- Email address
- ClickUp project ID
- Toggl project ID
- Status (active/paused/completed)
- Report frequency (weekly/biweekly)
This database became the source of truth. The workflow reads from it to know which clients to report on, where to pull their data, and where to send the report. If I add a new client, I add them to this database and the automation picks them up automatically.
This step — designing the data before writing any workflow — saved me significant time. I'd made the mistake before of hardcoding client details directly into workflows, then needing to update five places when a client changed their email.
## Building the Workflow

### Stage 1: The Trigger and Client Loop
Schedule Trigger: Friday, 4:00 PM
Immediately after the trigger, I fetch my active clients from Notion:
Notion node:
Resource: Database
Operation: Get Many
Database ID: [my clients database]
Filters: Status = "active"
This returns an array of client records. n8n automatically processes each one individually through the subsequent nodes — I don't need an explicit loop for this basic pattern.
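To make that fan-out concrete, here's a sketch of the items the Notion node might emit. The property names are assumptions about my database schema, not anything n8n mandates:

```javascript
// Hypothetical shape of the items the Notion "Get Many" node emits:
// one item per client record, each wrapped in { json: ... }.
const clients = [
  { json: { name: "Client A", email: "a@client.com", clickupProjectId: "abc", togglProjectId: "101" } },
  { json: { name: "Client B", email: "b@client.com", clickupProjectId: "def", togglProjectId: "202" } },
];

// A downstream expression like {{ $json.clickupProjectId }} resolves once
// per item, roughly equivalent to:
const projectIds = clients.map(item => item.json.clickupProjectId);
```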
### Stage 2: Fetching Data in Parallel
For each client, I need data from both ClickUp and Toggl. Instead of fetching them sequentially (slow), I use n8n's parallel branches:
The workflow splits into two simultaneous paths:
- Path A: ClickUp node → fetch tasks completed this week for this client's project
- Path B: Toggl node → fetch time entries this week for this client's project
Both paths run at the same time. A Merge node (set to "Wait for Both") waits until both complete, then combines the results.
The ClickUp query:
Resource: Task
Operation: Get All
Project ID: {{ $json.clickupProjectId }}
Filters:
- date_updated_gt: [7 days ago in Unix timestamp]
- status: complete
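For the `date_updated_gt` value, one way to compute "7 days ago" (ClickUp's date filters expect Unix timestamps in milliseconds, which is worth double-checking against the API docs for your endpoint):

```javascript
// "7 days ago" as a Unix millisecond timestamp. In an n8n expression field
// this could be written as {{ Date.now() - 7 * 24 * 60 * 60 * 1000 }}.
const MS_PER_DAY = 24 * 60 * 60 * 1000;
const sevenDaysAgoMs = Date.now() - 7 * MS_PER_DAY;
```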
The Toggl query:
HTTP Request (Toggl API):
GET https://api.track.toggl.com/api/v9/me/time_entries
Parameters:
start_date: [Monday of current week]
end_date: [today]
Headers:
Authorization: Basic [base64 encoded api_token:api_token]
Then filter by project ID in a Code node.
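A minimal sketch of that filter, assuming entries arrive as an array of Toggl time-entry objects (the example entries below are illustrative, not real API output):

```javascript
// Code-node filter sketch: keep only entries belonging to this client's
// Toggl project. togglProjectId is stored as text in Notion, while Toggl
// returns project_id as a number, hence the parseInt.
function filterByProject(entries, togglProjectId) {
  const projectId = parseInt(togglProjectId, 10);
  return entries.filter(e => e.project_id === projectId);
}

// Illustrative entries (durations in seconds, as Toggl stores them):
const entries = [
  { project_id: 101, duration: 3600 },
  { project_id: 202, duration: 1800 },
];
const mine = filterByProject(entries, "101"); // keeps only the first entry
```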
### Stage 3: Calculating the Summary
After merging, a Code node does the calculations:
// Read the merged inputs. Node names here match the workflow's node titles.
const client = $node["Get Client"].json;
const tasks = $node["ClickUp Tasks"].json;
const timeEntries = $node["Toggl Entries"].json;
// Helper: Monday of the current week (weeks treated as starting on Monday)
function getMonday() {
  const d = new Date();
  const day = d.getDay(); // 0 = Sunday, 1 = Monday, ...
  d.setDate(d.getDate() + (day === 0 ? -6 : 1 - day));
  return d;
}
// Filter time entries for this specific client
const clientEntries = timeEntries.filter(e =>
  e.project_id === parseInt(client.togglProjectId, 10)
);
// Calculate totals
const totalHours = clientEntries.reduce((sum, e) => {
  return sum + (e.duration / 3600); // Toggl stores duration in seconds
}, 0);
const completedTasks = tasks.length;
// Group tasks by category (using ClickUp task list name)
const byCategory = {};
tasks.forEach(task => {
  const category = task.list?.name || 'General';
  if (!byCategory[category]) byCategory[category] = [];
  byCategory[category].push(task.name);
});
return [{
  json: {
    clientName: client.name,
    clientEmail: client.email,
    weekStart: getMonday().toLocaleDateString('en-US', { month: 'long', day: 'numeric' }),
    weekEnd: new Date().toLocaleDateString('en-US', { month: 'long', day: 'numeric', year: 'numeric' }),
    totalHours: totalHours.toFixed(1),
    completedTasks,
    tasksByCategory: byCategory
  }
}];
### Stage 4: Generating the Email HTML
Another Code node builds the HTML email. I spent real time on this — the report needed to look professional, not like a plaintext dump.
const data = $json;
const categoryRows = Object.entries(data.tasksByCategory)
.map(([category, tasks]) => `
<tr>
<td style="padding: 12px; font-weight: 600; color: #374151;
border-bottom: 1px solid #e5e7eb;">${category}</td>
<td style="padding: 12px; color: #6b7280; border-bottom: 1px solid #e5e7eb;">
${tasks.map(t => `• ${t}`).join('<br>')}
</td>
</tr>
`).join('');
const html = `
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
</head>
<body style="font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif;
max-width: 600px; margin: 0 auto; padding: 20px; color: #111827;">
<div style="border-left: 4px solid #3b82f6; padding-left: 16px; margin-bottom: 24px;">
<p style="margin: 0; font-size: 13px; color: #6b7280; text-transform: uppercase;
letter-spacing: 0.05em;">Weekly Report</p>
<h1 style="margin: 4px 0 0; font-size: 22px; font-weight: 700;">${data.clientName}</h1>
<p style="margin: 4px 0 0; color: #6b7280;">${data.weekStart} – ${data.weekEnd}</p>
</div>
<div style="display: flex; gap: 16px; margin-bottom: 24px;">
<div style="flex: 1; background: #f9fafb; border-radius: 8px; padding: 16px;">
<p style="margin: 0; font-size: 13px; color: #6b7280;">Hours this week</p>
<p style="margin: 4px 0 0; font-size: 28px; font-weight: 700; color: #111827;">
${data.totalHours}h
</p>
</div>
<div style="flex: 1; background: #f9fafb; border-radius: 8px; padding: 16px;">
<p style="margin: 0; font-size: 13px; color: #6b7280;">Tasks completed</p>
<p style="margin: 4px 0 0; font-size: 28px; font-weight: 700; color: #111827;">
${data.completedTasks}
</p>
</div>
</div>
<h2 style="font-size: 16px; font-weight: 600; margin-bottom: 12px;">Work completed</h2>
<table style="width: 100%; border-collapse: collapse; margin-bottom: 24px;">
${categoryRows}
</table>
<p style="font-size: 13px; color: #9ca3af; border-top: 1px solid #f3f4f6;
padding-top: 16px;">
This report was generated automatically.
Reply to this email with any questions.
</p>
</body>
</html>
`;
return [{ json: { ...data, emailHtml: html } }];
### Stage 5: The Human Review Step (The Key Design Decision)
Here's where I diverged from full automation. Instead of sending the email directly, the workflow creates a Gmail draft.
Gmail node:
Operation: Create Draft
To: {{ $json.clientEmail }}
Subject: Weekly Update – {{ $json.weekStart }} to {{ $json.weekEnd }}
HTML Content: {{ $json.emailHtml }}
Every Friday at 4 PM, my inbox has six draft emails ready. I open each one, read it for 60-90 seconds, sometimes add a personal note at the top ("Had a productive week — the new feature is looking solid"), and hit send.
The whole process takes about 10 minutes instead of 2-3 hours.
Why not auto-send? Client communication is the one area where I want human judgment. The automation handles data and formatting. I handle tone and relationship. That division of labor feels right.
### Stage 6: Archive to Notion
After the draft is created, a Notion node creates a page in my "Weekly Reports" database:
- Title: "[Client Name] — Week of [date]"
- Properties: client, date, hours, tasks completed
- Body: the formatted task list
Now I have a permanent searchable record of every report I've ever sent.
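The archive page title follows a fixed pattern, built from the same summary object as the email. A trivial sketch:

```javascript
// Sketch: composing the archive page title from the Stage 3 summary fields.
function reportTitle(clientName, weekStart) {
  return `${clientName} — Week of ${weekStart}`;
}

const title = reportTitle("Client A", "March 3");
```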
## What I Built Wrong the First Time
I made one significant structural mistake in the first version: I hardcoded the client list inside the workflow itself, as a static JSON array.
// First version (wrong)
const clients = [
{ name: "Client A", email: "a@client.com", clickupId: "xxx", togglId: "yyy" },
{ name: "Client B", email: "b@client.com", clickupId: "aaa", togglId: "bbb" },
// ...
];
This worked fine until I needed to pause reporting for one client while they were on holiday. I had to edit the workflow directly, which felt fragile. What if I introduced a bug while editing?
Moving the client list to Notion was an hour of rework that was absolutely worth it. Now managing clients means updating a database, not editing a workflow.
## The Unexpected Side Effect
I didn't anticipate this, but having structured weekly report archives changed how I manage client relationships.
When a client questions the scope of work three months in, I can pull up every weekly report and show exactly what was delivered each week. When a client wants to renew a contract, I can calculate the actual hours logged vs. estimated and have a data-backed conversation.
The automation didn't just save me time writing reports. It gave me institutional memory about my work that I never had before.
## Time Comparison

| Task | Before | After |
|---|---|---|
| Pulling ClickUp data | 15 min × 6 clients = 90 min | Automated |
| Pulling Toggl data | 10 min × 6 clients = 60 min | Automated |
| Writing summaries | 15 min × 6 clients = 90 min | Automated |
| Formatting emails | 10 min × 6 clients = 60 min | Automated |
| Personal review + send | — | 10 min total |
| Total Friday time | 5 hours | 10 minutes |
Building the automation took one weekend — call it 10 hours. At 5 hours saved per week, it paid for itself in two weeks.
## Frequently Asked Questions

### Can n8n send formatted HTML emails automatically?
Yes. n8n's Gmail and SMTP nodes support HTML content. Build your HTML template in a Code node using JavaScript template literals, then pass the result to the email node's "HTML" field. Test by sending to yourself first before using it live.
### How do I pull data from multiple sources and combine it in n8n?
Use the Merge node with "Combine" mode to join datasets on a common key. For combining without a shared key, use the Code node to manually merge arrays. Pull data from each source in parallel using separate branches before the Merge node.
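As a sketch of the manual approach, a key-based join in a Code node might look like this (the field names are illustrative, not from any real workflow):

```javascript
// Join two arrays on a shared key, e.g. combining task counts and hours
// per client when the Merge node's built-in modes don't fit.
function joinByKey(left, right, key) {
  const index = new Map(right.map(r => [r[key], r]));
  return left.map(l => ({ ...l, ...(index.get(l[key]) ?? {}) }));
}

const tasks = [{ clientId: 1, tasksDone: 12 }];
const hours = [{ clientId: 1, hours: 8.5 }];
const combined = joinByKey(tasks, hours, "clientId");
// combined[0] → { clientId: 1, tasksDone: 12, hours: 8.5 }
```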
### How do I send different emails to different clients from one workflow?
Store client data in Notion or Google Sheets. Fetch all active clients at the start. n8n automatically processes each client record through subsequent nodes individually, so one workflow handles all clients without extra loop logic.
### What is the best way to schedule n8n workflows for specific days?

Use the Schedule Trigger. For "every Friday at 4 PM", select "Weeks", tick "Friday", and set the hour to 16. For more complex schedules, switch to cron expression mode (`0 16 * * 5` fires at 16:00 every Friday). Set the workflow's timezone explicitly so the trigger doesn't fire at the wrong local time.