“My biggest time saver at this company.”

Stagwell Global replaced a 2-hour daily cash reporting process with an automated pipeline built on Airtable, n8n, and Postgres. The new system reduced manual reporting time, improved transaction deduplication, and gave the treasury team a faster, more reliable way to review daily cash activity.

The problem

Stagwell’s treasury team was producing daily cash reports through a manual Excel process that had become slow, fragile, and hard to scale.

Each day involved downloading bank files, cleaning them, deduplicating transactions, mapping accounts to partners, and assembling a report for leadership. The process worked, but only because someone on the team was spending a large part of their day pushing it through.

“I would say that this daily cash plus takes two hours a day. So then, that's 40 hours a month spent on this.” — Joey

There were also technical issues baked into the workflow.

Some bank files had no stable transaction IDs, which made deduplication difficult. Excel sometimes corrupted account numbers into scientific notation. Airtable alone could not comfortably hold the growing transaction history. And previous consultants had already said the dataset was too large to manage properly in Airtable.

The problem was not just reporting. It was the entire operational chain behind the report.

The approach

We started by mapping the reporting workflow from raw bank CSV to final executive summary.

The aim was not to recreate the Excel file in another tool. The aim was to build a system that could ingest raw files reliably, store history at scale, and generate reports without introducing new manual cleanup work.

That is the same principle behind “you don’t need more tools, you need better systems.”

The final setup used Airtable as the operational front end, n8n as the automation layer, and Postgres as the long-term transaction database. Daily operators still worked through a simple interface, but the heavy lifting moved out of spreadsheets and into a pipeline that could handle larger files, historical data, and more complex deduplication logic.

What we built

CSV ingestion that non-technical users could run

The treasury team needed a workflow that was simple enough to run every day without opening grid views or debugging scripts.

So we built an Airtable upload interface where users could submit raw CSV files, trigger processing, and see upload status and errors in one place.

Behind the scenes, n8n handled file processing, deduplication, and database updates. That helped bypass Airtable’s file size and script limits while keeping the user experience simple.

Deduplication that could survive messy bank data

This was one of the hardest parts of the build.

Some files did not include reliable transaction IDs, so we could not rely on a clean primary key. Instead, the system built a composite, sequence-aware deduplication key from transaction date, amount, account, and row order within the file.

It also had to defend against duplicate imports if a user accidentally ran overlapping files or uploaded the same set twice.

“If you accidentally run it twice, it doesn't create duplicates. That was a big hassle because if you have duplicates in the system, all of your data will be off.” — Vikas
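A minimal sketch of the idea, assuming each parsed row carries `date`, `amount`, and `account` fields (the field names and helper functions here are illustrative, not Stagwell's actual schema). Rows that look identical get a sequence suffix based on their order in the file, so two legitimate same-day, same-amount transactions stay distinct, while re-running the same file produces the same keys and inserts nothing:

```python
import hashlib
from collections import defaultdict

def dedup_keys(rows):
    """Build a composite, sequence-aware dedup key for bank rows
    that lack a stable transaction ID. Deterministic as long as
    the file preserves row order between runs."""
    seen = defaultdict(int)  # occurrence counter per composite tuple
    keys = []
    for row in rows:
        base = (row["date"], row["amount"], row["account"])
        seq = seen[base]     # 0 for first occurrence, 1 for second...
        seen[base] += 1
        raw = f"{row['date']}|{row['amount']}|{row['account']}|{seq}"
        keys.append(hashlib.sha256(raw.encode()).hexdigest())
    return keys

def import_rows(rows, existing_keys):
    """Insert only rows whose key is not already stored;
    re-running the same file is a no-op."""
    new = []
    for row, key in zip(rows, dedup_keys(rows)):
        if key not in existing_keys:
            existing_keys.add(key)
            new.append({**row, "dedup_key": key})
    return new
```

In the real pipeline the `existing_keys` check would be a unique constraint or upsert in Postgres rather than an in-memory set, but the idempotency property is the same.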

We also had to work around Excel-induced corruption of account numbers. That led to a strict raw-CSV-only process for certain sources, especially TD Bank, where scientific notation could silently break account mapping.
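The defense is twofold: parse every field as a string so long account numbers are never coerced to numbers, and reject any file where a value already looks like scientific notation, which is a telltale sign it was opened and re-saved in Excel upstream. A sketch, using a hypothetical `Account Number` column name:

```python
import csv
import io
import re

# e.g. "8.6401E+15" - the shape Excel leaves behind
SCI_NOTATION = re.compile(r"^\d(\.\d+)?[Ee]\+\d+$")

def load_raw_csv(text):
    """Parse a bank CSV keeping every field as a string, and reject
    files where an account number was mangled into scientific
    notation before it reached the pipeline."""
    rows = list(csv.DictReader(io.StringIO(text)))
    for i, row in enumerate(rows, start=2):  # header is line 1
        acct = row.get("Account Number", "")
        if SCI_NOTATION.match(acct):
            raise ValueError(
                f"line {i}: account '{acct}' looks Excel-corrupted; "
                "re-export the raw CSV from the bank portal"
            )
    return rows
```

Failing loudly at upload time, with the offending line number, is what keeps a corrupted account number from silently mapping to the wrong partner downstream.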

Account and partner mapping

The report depended on transactions being assigned correctly to entities, partners, and account classes.

We created master tables for entities, partners, and accounts, along with logic to surface unmapped accounts automatically. That made it easier for the team to catch issues early instead of discovering them later through reconciliation discrepancies.

As the system matured, the reporting logic moved from raw account-level views into more stable partner-level summaries that were easier to maintain and more useful for daily reporting.
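The unmapped-account check itself is a simple set difference between the accounts seen in incoming transactions and the accounts master table. A sketch with illustrative field names:

```python
def unmapped_accounts(transactions, account_master):
    """Surface account numbers present in transactions but missing
    from the accounts master table, so mapping gaps show up at
    import time instead of during reconciliation."""
    known = {a["account_number"] for a in account_master}
    seen = {t["account"] for t in transactions}
    return sorted(seen - known)
```

Running this on every import and surfacing the result in the upload interface is what turns a silent reconciliation discrepancy into a visible, same-day fix.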

Daily cash activity reporting

Once ingestion and mapping were reliable, the reporting layer could become much simpler.

The system generated:

  • US cash activity summaries
  • partner-level cash flow breakdowns
  • partner cash balances
  • skip logic for non-operational transactions like sweeps and payroll concentration activity
  • HTML report output for easier review and sharing

The report was not just a data dump. It reflected treasury logic about what was operationally material and what should be excluded or grouped differently.

That mix of structure and judgment is part of why “build automation systems that don’t break at 2 AM” matters so much in finance operations. The workflow has to be reliable, but it also has to reflect how the team actually interprets the numbers.

Historical transaction dashboard

The original reporting workflow was focused on the daily deliverable, but the project also needed historical visibility.

We built a searchable transaction dashboard and imported historical data so the team could review past movements without digging through old files. Because Airtable record limits would eventually become a problem, the transaction history moved to Postgres while Airtable remained the control layer.

This hybrid structure gave the team the speed of a proper database with the usability of a no-code front end.

The outcome

The new workflow reduced daily reporting time dramatically.

What had been taking roughly two hours each day moved much closer to a review and exception handling process rather than a manual assembly exercise.

“This Airtable report was like 10 minutes-ish. Oh, it’s a great improvement.” — Josh

The project also reduced risk in a few important ways:

  • duplicate imports were controlled
  • large files no longer broke the workflow
  • account mapping issues became visible earlier
  • historical data could be stored without running into Airtable limits
  • operators could run the process through a simple interface instead of working directly in fragile spreadsheets

Just as important, the team now had a foundation that could support additional reporting modules, bank sources, and future automation work without rebuilding the system from scratch.

That is also why this was not just an “automation project.” It was an operations redesign. The reporting got faster, but the bigger gain came from reworking the underlying process end to end, much like the approach described in “a practical approach to no-code for your business.”

Start with a blueprint

If your reporting workflow still depends on spreadsheets, manual cleanup, and people remembering the right sequence of steps, the first fix is usually not another tool. It is a better system design.

Book a blueprint session
