> Deciding Where Next
Last year I watched a great interview of Naval Ravikant by Chris Williamson, "44 Harsh Truths About The Game of Life."
In that interview, Naval pointed out something obvious yet strangely overlooked. People spend so much time pondering where to study and what career to pursue, but barely spend time thinking about where they want to live. Yet where you live determines everything: air/water/food quality, safety, healthcare, and the community around you. Most people simply end up living wherever they happened to go to school or find a job.
That really resonated with me. I had spent a decade in Hong Kong, mostly because it was the city where I finished university and naturally started working. This time, I wanted to be more intentional. If I were to choose where to spend the next decade of my life, where would that be?
The more I thought about it, the more Australia ticked the boxes. So I booked a trip and spent a month trying to live like a local. By the end it was obvious: this was where I wanted to be.
The next question was: how?
Luckily, I discovered I was in my last year of eligibility for the Working Holiday Visa. It’s not a long-term solution, but it’s a door. And it’s better than nothing.
The bigger challenge was getting back into the job market. It had been five years since I last interviewed for a role. I looked at different resume services and interview coaching platforms, but thought: why not just build a customized tool for my own needs? Thankfully, this time I had AI to help me.
> Job Dashboard Architecture
The goal was to build a personal, single-user web app that brings the entire job search workflow into one place. From job discovery to AI-powered deep-dive analysis and a tailored, export-ready resume. No more toggling between job boards, spreadsheets, and multiple versions of the same CV.
>> Job Scraper
I use n8n to automate job aggregation from Seek and LinkedIn, the two biggest job boards in Australia. The scraping itself is handled through Apify with the Seek Job Scraper and LinkedIn Jobs Scraper APIs.
To prevent duplicates across platforms, I generate a dedupe_key using [job_title, company_name, location]. All job data gets stored in Supabase.
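The dedupe key can be sketched as a simple normalize-and-join over those three fields, so the same job scraped from both Seek and LinkedIn collides on insert. This is an illustrative sketch, not the exact implementation; the function name and normalization rules are assumptions.

```typescript
// Hypothetical dedupe key: lowercase, trim, and collapse whitespace in each
// field, then join with a separator that won't appear in the data.
function dedupeKey(jobTitle: string, companyName: string, location: string): string {
  const norm = (s: string) => s.toLowerCase().trim().replace(/\s+/g, " ");
  return [jobTitle, companyName, location].map(norm).join("|");
}
```

With a unique constraint on this column in Supabase, a second insert of the same job simply fails, which is enough to keep the table clean.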
Once new jobs are added to the database, a second n8n workflow triggers automatically. Each posting goes through an AI scoring pass using ChatGPT, which assigns a score from 0 to 10 based on how well it matches my profile. The evaluation considers industry match, role relevance, skills overlap, and experience level.
This quickly surfaces the jobs most worth looking into.
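One way to make that scoring pass reproducible is to ask the model for per-criterion sub-scores and combine them with a weighted average. The interface below mirrors the four criteria from the post; the weights and rounding are illustrative assumptions, not the actual workflow.

```typescript
// Hypothetical structured output requested from ChatGPT: one 0-10 sub-score
// per evaluation criterion.
interface JobScore {
  industryMatch: number;   // 0-10
  roleRelevance: number;   // 0-10
  skillsOverlap: number;   // 0-10
  experienceLevel: number; // 0-10
}

// Weighted average to a single 0-10 match score (weights are assumptions).
function overallScore(s: JobScore): number {
  const weighted =
    0.25 * s.industryMatch +
    0.35 * s.roleRelevance +
    0.25 * s.skillsOverlap +
    0.15 * s.experienceLevel;
  return Math.round(weighted * 10) / 10; // one decimal place
}
```

Asking for sub-scores rather than a single number also makes the AI's reasoning easier to audit when a score looks off.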
>> Job Dashboard
Once the data pipeline was working, the next challenge was designing a clean interface to actually use it.
For early inspiration, I experimented with Figma Make.
I knew this was going to be a big project and that I would need proper tooling worth paying a subscription for. So I decided to try Cursor as my development environment.
I used Cursor Directory to create a .cursorrules file and used ChatGPT to generate several foundational markdown files (project.md, database-schema.md, and n8n-workflows.md). These served as context for Cursor’s planning mode, which helped generate a structured implementation plan. I spent a lot of time iterating in ‘Plan’ mode before generating any code.
Key dashboard features include:
- Grid/Table view of all tracked jobs
- Pipeline status bar (New → Interested → Applied → Interviewing → Offered/Rejected/Archived) that doubles as a click-to-filter
- Multi-select industry filter, keyword search, AI score range filter, and source filter
- Detailed job view showing: full job description, AI scoring analysis, personal notes, direct job posting and application links
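The pipeline stages map naturally onto a TypeScript union type, with the click-to-filter behaviour as a one-line filter over the job list. The type and field names below are a sketch of how this could look, not the dashboard's actual code.

```typescript
// Pipeline stages from the status bar, as a union type (sketch).
type JobStatus =
  | "New" | "Interested" | "Applied"
  | "Interviewing" | "Offered" | "Rejected" | "Archived";

// Minimal job shape for illustration; real rows carry more columns.
interface Job {
  id: string;
  title: string;
  status: JobStatus;
  aiScore: number; // the 0-10 match score from the scoring workflow
}

// Clicking a stage in the status bar narrows the grid to that stage.
function filterByStatus(jobs: Job[], status: JobStatus): Job[] {
  return jobs.filter((j) => j.status === status);
}
```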
One unexpected problem was industry classification. Job listings used widely inconsistent labels.
To solve this, I created an industry_mapping_rules table in Supabase that uses regex patterns to map keyword variations into a smaller set of standardized industry groups.
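The rule lookup can be sketched as a first-match-wins scan over the table, with a fallback bucket for anything unmatched. The row shape, example patterns, and the "Other" fallback are assumptions for illustration.

```typescript
// Hypothetical row shape for the industry_mapping_rules table: a regex
// pattern plus the standardized group it maps to.
interface MappingRule {
  pattern: string;       // regex matched against the raw job-board label
  industryGroup: string; // standardized group name
}

// First matching rule wins; unmatched labels fall back to "Other".
function mapIndustry(rawLabel: string, rules: MappingRule[]): string {
  for (const rule of rules) {
    if (new RegExp(rule.pattern, "i").test(rawLabel)) return rule.industryGroup;
  }
  return "Other";
}

// Example rules (illustrative only, not the actual table contents):
const exampleRules: MappingRule[] = [
  { pattern: "information|software|tech", industryGroup: "Technology" },
  { pattern: "bank|finan|fintech", industryGroup: "Financial Services" },
];
```

Keeping the patterns in a database table rather than in code means new label variations can be absorbed by inserting a row, with no redeploy.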
>> AI Analysis
I didn’t want this to simply be a job aggregator. The real goal was to use AI to help me think more clearly about each opportunity - research companies, understand what they’re really looking for, identify how to position myself, and tailor my resume to each role.
While exploring different tools, I came across autobiographer.io. What stood out was its philosophy: instead of merely helping people “game the ATS”, it focuses on helping people find roles that actually fit them. I liked that approach and borrowed inspiration from it.
The AI Analysis has three main capabilities: Company & Job Analysis, Game Plan, and Tailored Resume.
1. Company & Job Analysis
Using Tavily API, the system conducts two parallel web searches: one for recent company news/financials and one for Glassdoor culture signals. Claude (Sonnet 4.6) synthesizes the results into a structured analysis covering: surface requirements, team dynamics, company intel, culture signals, red and green flags, work style fit, and a profile of who would thrive there.
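The two searches are independent, so they can run concurrently before the results go to Claude. The sketch below assumes Tavily's documented REST search endpoint; the query wording and helper names are illustrative.

```typescript
// Pure helper so the two research angles are explicit: company
// news/financials, and Glassdoor culture signals (query wording assumed).
function buildQueries(company: string): [string, string] {
  return [
    `${company} recent news financial results`,
    `${company} Glassdoor reviews culture`,
  ];
}

// Run both Tavily searches in parallel, then hand the combined results to
// Claude for synthesis (the synthesis step is omitted here).
async function researchCompany(company: string, apiKey: string) {
  const search = async (query: string) => {
    const res = await fetch("https://api.tavily.com/search", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ api_key: apiKey, query, max_results: 5 }),
    });
    if (!res.ok) throw new Error(`Tavily search failed: ${res.status}`);
    return res.json();
  };
  const [newsQuery, cultureQuery] = buildQueries(company);
  const [newsAndFinancials, cultureSignals] = await Promise.all([
    search(newsQuery),
    search(cultureQuery),
  ]);
  return { newsAndFinancials, cultureSignals };
}
```

Running the searches with `Promise.all` roughly halves the wall-clock time of the research step compared to doing them sequentially.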
2. Game Plan
Claude uses the Company & Job Analysis along with my master resume to map out: what’s working for me, what I’m up against, how to position my background, what my opening move should be, and where my real differentiators lie.
3. Tailored Resume
Finally, with all that context in hand, Claude generates a structured draft: a ~70-word profile summary, the most relevant experiences selected from each of my previous roles, an ATS-friendly list of skills derived from those experiences, and a list of suggested skills that are highlighted in the JD but missing from my experience. I can make edits directly on the page and export using the master template stored in the database.
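The draft's structure can be sketched as a typed object, plus a small guard that keeps the profile summary near the 70-word target before it is merged into the DOCX template. Both the shape and the guard are illustrative assumptions, not the app's actual schema.

```typescript
// Hypothetical shape of the structured draft Claude returns.
interface TailoredResume {
  profileSummary: string;                            // ~70-word summary
  experiences: { role: string; bullets: string[] }[]; // selected per past role
  atsSkills: string[];        // ATS-friendly skills derived from experiences
  suggestedSkills: string[];  // in the JD but missing from my background
}

// Guard: truncate an over-long summary to the word budget before it is
// filled into the master template (illustrative, not from the post).
function clampSummary(summary: string, maxWords = 70): string {
  const words = summary.trim().split(/\s+/);
  return words.length <= maxWords
    ? summary.trim()
    : words.slice(0, maxWords).join(" ") + "…";
}
```

A typed draft like this is also what makes in-page editing straightforward: each field maps to one editable region, and the export step just feeds the object into the template.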
>> Company Watchlist
This is a list of companies I keep an eye on for new job openings. Each company card has a “Check for new jobs” button that fires a POST to an n8n webhook, triggering a scrape of the company’s website and LinkedIn page. Any new roles discovered are written directly back into the Supabase database.
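The button handler reduces to a single POST against the webhook. The URL and payload shape below are hypothetical; n8n itself does the scraping and writes results to Supabase, so the client only needs to fire the trigger.

```typescript
// Hypothetical n8n webhook URL for the company-scrape workflow.
const WEBHOOK_URL = "https://my-n8n.example.com/webhook/company-scrape";

// Pure helper so the (assumed) payload shape is explicit and testable.
function buildScrapePayload(companyId: string, companyName: string): string {
  return JSON.stringify({ companyId, companyName });
}

// Fired by the "Check for new jobs" button on a company card.
async function checkForNewJobs(companyId: string, companyName: string): Promise<void> {
  const res = await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildScrapePayload(companyId, companyName),
  });
  if (!res.ok) throw new Error(`Webhook trigger failed: ${res.status}`);
}
```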
>> Tech Stack
- Next.js - Full-stack framework
- Tailwind CSS + Shadcn UI + Radix UI - Styling
- n8n - Automation (job scraping + AI scoring)
- Supabase - Database
- Vercel AI SDK + Claude Sonnet 4.6 - AI analysis
- Tavily API - Web research
- docxtemplater + PizZip - DOCX resume generation
- Tiptap - Rich text editing
- Cursor - Vibe coding tool
>> Takeaways
I built this job dashboard partly to improve my job search, but mostly to challenge myself to build a more complex AI-powered product. It took about a month from ideation to completion. The biggest challenges were designing prompts that produced useful AI analysis and iterating on the tailored resume UI until it actually felt good to use.
Even with all this automation, I don’t believe spray-and-pray is how you land a job. AI is a powerful tool, but don’t let it do all the work for you. You still need to read the job description carefully, refine your narrative, and add a human touch before sending any application out. Otherwise, it can be a shortcut to the AI slop bucket.
Now that the tools are built, it’s time to put them to work. Wish me luck!
Serendipitously, I also received a recent issue of Lenny’s Newsletter on How to use AI for your next job interview. I’ll try out his Claude Code workflow to prep for interviews.