Building a Scalable Analytics Feature for My URL Shortener
"If you can't measure it, you can't improve it."
I've been deep in the code for my URL shortener project ZapLink, and I just wrapped up a super exciting session building out a new feature: analytics.
The tool was already good at its main job of shortening links, but I knew it wouldn't be complete until users could actually see how many people were clicking their links.
1. First, Secure the Data with User Accounts
My first thought was simple: just count the clicks.
But that immediately led to a bigger question: who should get to see the stats?
I definitely didn't want the analytics for a link to be public.
The answer was clear: ZapLink needed user accounts.
Since I'm already using Supabase for my PostgreSQL database, their built-in Auth was a no-brainer.
It fits perfectly. I mapped out a full system for users to:
- Sign up
- Log in
- Have their links tied directly to their account
Getting this right was crucial to make sure all the analytics data would be private and secure.
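Here's roughly what that wiring looks like with supabase-js. To be clear, the table and column names (`links`, `slug`, `target_url`, `user_id`) are placeholders I'm using for illustration, not necessarily the final schema, and the env var names are assumptions too:

```ts
import { createClient } from '@supabase/supabase-js';

// Assumed env var names, for illustration only.
const supabase = createClient(
  process.env.SUPABASE_URL!,
  process.env.SUPABASE_ANON_KEY!
);

// Sign up a new user; Supabase handles hashing and sessions for me.
export async function signUp(email: string, password: string) {
  const { data, error } = await supabase.auth.signUp({ email, password });
  if (error) throw error;
  return data.user;
}

// Log an existing user in.
export async function logIn(email: string, password: string) {
  const { data, error } = await supabase.auth.signInWithPassword({ email, password });
  if (error) throw error;
  return data.session;
}

// Create a short link tied to the logged-in user's id, so every
// analytics query can be scoped to the link's owner.
export async function createLink(slug: string, targetUrl: string) {
  const { data: { user } } = await supabase.auth.getUser();
  if (!user) throw new Error('Not logged in');
  const { error } = await supabase
    .from('links') // hypothetical table name
    .insert({ slug, target_url: targetUrl, user_id: user.id });
  if (error) throw error;
}
```

The nice part of leaning on Supabase here is that I can also turn on row-level security for the table, so users can only ever read rows where `user_id` matches their own id.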
2. The Big Challenge: What if a Link Goes Viral?
This was the fun part.
My initial idea was to just add a new row to the database for every single click.
But what happens if a link gets thousands of clicks every minute?
- My database would be working overtime
- The whole app would slow down
- Even the most important function, the URL redirect itself, would start to lag
And no one wants to wait for a link to load.
3. The Solution: A "Batch and Flush" Architecture
Here’s the two-step plan I came up with.
Step 1: The Super-Fast Tally Counter (Redis)
I'm already using Redis for caching because it's lightning-fast.
Now, when a link gets clicked, I don’t touch the main database.
I simply tell Redis to add +1 to a counter for that specific link.
This is practically instant.
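In code, the hot path is tiny. This sketch uses ioredis and a `clicks:<slug>` key scheme that I'm making up here for illustration:

```ts
import Redis from 'ioredis';

const redis = new Redis(process.env.REDIS_URL!); // assumed env var

// On every redirect, bump a per-link counter in Redis and move on.
// INCR is O(1) and creates the key on first use, so no setup is needed.
export async function recordClick(slug: string): Promise<void> {
  await redis.incr(`clicks:${slug}`); // hypothetical key scheme
}
```

You could even fire this off without awaiting it, so the redirect never waits on the counter at all.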
Step 2: The Little Helper Robot (My "Cron Job" Lambda)
I built a separate little worker on AWS Lambda with only one job (sketched after this list):
- Every 12 hours, it wakes up (real-time analytics for paid users will come in a future update)
- Grabs all new click counts from Redis
- Saves them to PostgreSQL in one quick batch
- Resets the counters in Redis
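Here's a rough sketch of that worker. It assumes the same made-up `clicks:*` key scheme from above, a `total_clicks` column on the links table (my invention for this example), Redis 6.2+ for GETDEL, and an EventBridge schedule triggering the handler:

```ts
import Redis from 'ioredis';
import { Pool } from 'pg';

const redis = new Redis(process.env.REDIS_URL!);
const pool = new Pool({ connectionString: process.env.DATABASE_URL });

// Invoked on a schedule (e.g. an EventBridge rule every 12 hours).
export async function handler(): Promise<void> {
  let cursor = '0';
  do {
    // SCAN in pages so we never block Redis with one giant KEYS call.
    const [next, keys] = await redis.scan(
      cursor, 'MATCH', 'clicks:*', 'COUNT', 100
    );
    cursor = next;
    for (const key of keys) {
      // GETDEL reads and resets the counter in one atomic step, so
      // clicks arriving mid-flush just start a fresh counter.
      const count = await redis.getdel(key);
      if (!count) continue;
      const slug = key.slice('clicks:'.length);
      // One small write per link instead of one write per click.
      await pool.query(
        'UPDATE links SET total_clicks = total_clicks + $1 WHERE slug = $2',
        [Number(count), slug]
      );
    }
  } while (cursor !== '0');
}
```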
Result:
The main database gets a gentle nudge twice a day instead of being bombarded.
The links stay fast, and I still get accurate analytics.
So, What I Got Done Today
It was a busy one! I managed to:
- Design the database tables for user links and their analytics (rough shape below)
- Implement a full user login and signup system on the frontend
- Build the backend worker function that handles all the analytics data
- Sketch out a cool design for the user dashboard and the analytics page
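For the curious, here's the rough shape of the main table as a row type. The column names are illustrative guesses that match the sketches above, not the final schema:

```ts
// Illustrative row shape only; the real schema may differ.
interface LinkRow {
  id: string;            // uuid primary key
  user_id: string;       // owner, references Supabase auth.users
  slug: string;          // the short code in the URL
  target_url: string;    // where the redirect points
  total_clicks: number;  // running total, bumped by the batch worker
  created_at: string;
}
```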
What's Next?
The backend foundation is all set up and ready to go.
The only thing left is to make it look great.
Tomorrow’s plan:
- Style the dashboard
- Style the analytics page
- Make the new login/signup buttons match the rest of ZapLink
Thanks for reading! It's been an awesome ride so far. Stay tuned for the next update!
