How I Turned a 30min Daily Task into a 5min Breeze (Saving 100+ Hours a Year)
Published on June 04, 2025
About this project
It was 2018, Excel was king, and if you could create a pivot table, you were a prodigy.
If you knew macros? You were a god! Fun times!
😵 We had this sales reporting beast:
A 12-page PDF generated every single day from a 20MB Excel file, bloated with pivot tables and legacy logic.
The job? Send out the daily performance report for the entire company.
The manual routine looked like this:
- Download 5 ERP reports in .xlsx format
- Clean up the import template in Excel
- Paste everything into the right places
- Update 30+ pivot tables
- Double-check every sheet
- Manually export the PDF (via Ctrl+Click on sheets)
- Send it over via email
⏱️ Average time: 30 minutes/day
📆 Annual effort: 126 hours
But here’s the thing: only one person could do it.
And if they were out? Chaos.
Fast forward to 2025: the entire process is fully automated, and while reviewing old notes I decided to revisit the project.
It took 3 months of mapping the steps, testing scenarios, and refactoring into a scalable process.
By the end of 2021, we launched the new version:
🔁 Same data
🎁 A better looking report
🎉 But 90% less effort
Here’s the new flow:
- Upload raw files to a Google Drive folder
- Run a Google Colab script (pandas, gspread, numpy)
- Data flows into 2 Google Sheets, auto-formatted
- Linked Google Slides update charts and visuals
- Click “Update linked objects” → Export PDF → Done.
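The Colab step above can be sketched roughly like this: stack the raw ERP exports with pandas, clean them the way the old import template did, and aggregate them like the pivot tables used to. File names, column names, and the `consolidate` helper are illustrative assumptions, not the real report schema; in the actual pipeline gspread then pushes the result into the two Google Sheets.

```python
# Hypothetical sketch of the Colab consolidation step.
# Column names ("Region", "Amount") are assumptions for illustration.
import pandas as pd

def consolidate(frames: list[pd.DataFrame]) -> pd.DataFrame:
    """Stack the raw ERP exports and normalize them into one table."""
    df = pd.concat(frames, ignore_index=True)
    # Normalize column names the way the cleanup template would.
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    # Drop fully empty rows that Excel exports often contain.
    df = df.dropna(how="all")
    # Coerce the amount column to numeric, treating bad cells as 0.
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce").fillna(0)
    # Aggregate per region, mirroring what the pivot tables used to do.
    return df.groupby("region", as_index=False)["amount"].sum()

if __name__ == "__main__":
    # Two fake "ERP exports" standing in for the five daily .xlsx files.
    raw = [
        pd.DataFrame({"Region ": ["North", "South"], "Amount": [100, "n/a"]}),
        pd.DataFrame({"Region ": ["North"], "Amount": [50]}),
    ]
    print(consolidate(raw))
    # From here, gspread (after authenticating) would write the table to
    # the Google Sheets that feed the linked Slides charts.
```

In practice the aggregation replaces the 30+ pivot-table refreshes in one deterministic step, which is where most of the error-proofing comes from.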
⏱️ New time: 5 minutes/day
📆 Annual effort: 21 hours
💥 That’s a 105-hour/year time save.
🧠 No more human errors.
🔁 Easily replicable and documented.
🤖 Zero dependency on Excel wizardry or muscle-memory workflows.
I’m still amazed we were spending 6x more time for the exact same result.
Optimization isn’t always about new technology; it’s about smarter flows.

Have you ever turned a beastly manual process into a 5-minute miracle?
#automation #datascience #processengineering #nocode #exceltoai #colab #productivity #growth