Case Study

Job Radar: From Side Project Framing to Operating System Framing

Job Radar was not built as a novelty bot. It was designed as a live monitoring and decision-support system: structured ingestion, controlled state, human approvals, deterministic checks, and reliable notifications for time-sensitive job discovery work.

Executive Summary

The problem was not "can a bot send messages?" The real problem was building a system that could monitor multiple hiring sources, avoid duplicate noise, cope with changing source behavior, preserve operator trust, and keep the state sane across repeated runs. The architecture was shaped around reliability, not novelty.

ATS Monitoring · SQLite State · Deduplication · Rate-Limit Handling · Telegram Delivery · Human-in-the-loop

What Was Built

  • A Telegram-first monitoring workflow across a 27-source watchlist, combining Greenhouse, Lever, Ashby, and direct careers-page discovery.
  • A SQLite-backed state model for jobs, scans, triage outcomes, and operator feedback.
  • Deduplication logic to reduce repeat noise and keep alerts tied to real change.
  • Guardrails for scan locking, short writes, and retry-friendly behavior under concurrent activity.
  • Scheduled scans plus on-demand Telegram `/probe` checks for a single URL, company page, or domain when manual verification is needed.
  • Approval gates before tailoring and export, including resume and cover-letter generation from fixed templates and operator parameters.
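The SQLite-backed state model above can be sketched as a minimal schema. Every table and column name here is an assumption for illustration, not the project's actual layout:

```python
import sqlite3

# Hypothetical schema sketch for the state model described above.
# Table and column names are assumptions, not the project's real layout.
SCHEMA = """
CREATE TABLE IF NOT EXISTS jobs (
    job_key    TEXT PRIMARY KEY,    -- stable dedup key (e.g. hash of source + URL)
    source     TEXT NOT NULL,       -- 'greenhouse', 'lever', 'ashby', 'careers-page'
    title      TEXT NOT NULL,
    url        TEXT NOT NULL,
    first_seen TEXT NOT NULL,       -- ISO-8601 timestamps stored as text
    last_seen  TEXT NOT NULL,
    triage     TEXT DEFAULT 'new'   -- operator outcome: new / approved / rejected
);
CREATE TABLE IF NOT EXISTS scans (
    scan_id     INTEGER PRIMARY KEY AUTOINCREMENT,
    started_at  TEXT NOT NULL,
    finished_at TEXT,
    status      TEXT DEFAULT 'running'
);
"""

def open_state(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the state database and ensure tables exist."""
    conn = sqlite3.connect(path)
    conn.executescript(SCHEMA)
    return conn
```

Keeping jobs, scans, and triage outcomes in one file-backed store is what makes repeated runs idempotent: every scan consults the same lifecycle record instead of rediscovering the world.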

Operational Realities

  • Search latency matters: a delayed alert can mean a missed application window.
  • Duplicate alerts destroy trust fast and make operators ignore the system.
  • Source changes and partial failures are normal, so the system has to degrade safely.
  • State quality matters more than clever prompts because downstream decisions depend on it.

High-Level Architecture

This is the operating shape at a glance. The goal is clear boundaries, explicit ownership, and no guessing about where validation happens.

1. Source Polling

The default 15-minute timer polls the 27-source watchlist, while Telegram can trigger manual `/probe` scans against a single URL or domain.
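A drift-resistant version of that timer could look like the sketch below. The function name and the injected `sleep` are illustrative assumptions, not the project's actual code:

```python
import time

POLL_INTERVAL_S = 15 * 60  # the default 15-minute cadence

def run_scans(scan_once, runs, interval=POLL_INTERVAL_S, sleep=time.sleep):
    """Run `scan_once` on a fixed wall-clock cadence.

    Scheduling against monotonic anchors, rather than sleeping a fixed
    interval after each scan finishes, keeps a slow scan from drifting
    the whole schedule. `sleep` is injected so the loop is testable.
    """
    next_run = time.monotonic()
    for _ in range(runs):
        scan_once()
        next_run += interval
        sleep(max(0.0, next_run - time.monotonic()))
```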

2. Ingress + Normalization

Records are normalized into a consistent shape before they are trusted by the rest of the pipeline.
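As a sketch of what that normalization boundary might look like: each source gets an adapter that maps its payload into one trusted shape. The per-source field names below follow the public Greenhouse and Lever job-board payloads, but treat them as assumptions:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class JobRecord:
    """The single record shape the rest of the pipeline is allowed to trust."""
    source: str
    title: str
    url: str
    location: str

def normalize(source: str, raw: dict) -> JobRecord:
    # Per-source adapters; field names here are illustrative assumptions.
    if source == "greenhouse":
        return JobRecord(source, raw["title"].strip(), raw["absolute_url"],
                         raw.get("location", {}).get("name", ""))
    if source == "lever":
        return JobRecord(source, raw["text"].strip(), raw["hostedUrl"],
                         raw.get("categories", {}).get("location", ""))
    raise ValueError(f"no adapter for source: {source}")
```

Failing loudly on an unknown source is deliberate: an unrecognized payload should surface as an error, not flow onward as half-formed state.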

3. State + Deduplication

SQLite tracks seen jobs, scan reports, and triage state so repeated detections do not become repeat noise.
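One minimal way to get that behavior is to derive a stable key from the job's identity fields and let a primary-key constraint reject repeats. The key scheme below is an assumed one, not the project's actual logic:

```python
import hashlib
import sqlite3

def job_key(source: str, url: str, title: str) -> str:
    """Stable dedup key from canonicalized identity fields (assumed scheme)."""
    raw = f"{source}\n{url.rstrip('/').lower()}\n{title.strip().lower()}"
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()[:16]

def record_if_new(conn: sqlite3.Connection, source: str, url: str, title: str) -> bool:
    """True only the first time a job is seen.

    Repeat detections hit INSERT OR IGNORE and become no-ops, so repeated
    scans never turn into repeat alerts.
    """
    cur = conn.execute(
        "INSERT OR IGNORE INTO seen_jobs (job_key, source, url, title) "
        "VALUES (?, ?, ?, ?)",
        (job_key(source, url, title), source, url, title),
    )
    conn.commit()
    return cur.rowcount == 1
```

Canonicalizing before hashing matters: a trailing slash or stray whitespace in a re-crawled listing should not look like a new job.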

4. Ranking + Human Review

Potential matches are ranked and then reviewed at explicit checkpoints before any resume, cover-letter, tailoring, or export step happens.
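Those checkpoints can be enforced as an explicit transition table, so no stage can be skipped on the way to export. The state names below are assumptions for illustration:

```python
# Hypothetical triage state machine; state names are assumptions.
ALLOWED_TRANSITIONS = {
    "new":      {"ranked"},
    "ranked":   {"approved", "rejected"},
    "approved": {"tailored"},             # tailoring only after human approval
    "tailored": {"exported"},
}

def advance(state: str, target: str) -> str:
    """Move a job forward only along an allowed edge; anything else is a bug."""
    if target not in ALLOWED_TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition: {state} -> {target}")
    return target
```

Encoding the approval gate in data rather than scattered `if` statements makes the human-in-the-loop contract auditable in one place.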

5. Notification + Follow-Up

Approved results are sent through Telegram, with status visibility and outcome feedback feeding later runs.
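A retry-aware delivery wrapper might look like the sketch below. The 429 `retry_after` handling follows the Telegram Bot API's documented rate-limit hint, but the transport is injected and the response shapes are assumptions:

```python
import time

def send_with_retry(post, payload, retries=3, sleep=time.sleep):
    """Deliver one notification, honoring rate-limit backoff.

    `post(payload)` is an injected transport returning (status_code, body).
    On HTTP 429 the Telegram Bot API includes a parameters.retry_after
    hint, which is honored before retrying. Injecting the transport keeps
    the retry policy testable without a network.
    """
    for attempt in range(retries + 1):
        status, body = post(payload)
        if status == 200:
            return body
        if status == 429 and attempt < retries:
            sleep(body.get("parameters", {}).get("retry_after", 1))
            continue
        raise RuntimeError(f"delivery failed with HTTP {status}")
```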

Proof That This Is a System, Not a Script

  • State Ownership: SQLite was used as a deliberate state boundary for lifecycle tracking, not just as a quick local cache.
  • Concurrency Discipline: Scan locking, short write patterns, and retry-friendly behavior were used to reduce contention and duplicate work.
  • Delivery Trust: Deduplication and explicit triage reduce alert fatigue, which is critical in any notification-driven system.
  • Operator Control: Human approval remains part of the system contract, so automation can accelerate work without silently drifting.
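The scan-locking discipline can be sketched with SQLite's own write lock: `BEGIN IMMEDIATE` claims the lock at transaction start, so a second scanner fails fast instead of colliding mid-scan. A sketch under assumptions, not the project's actual code:

```python
import sqlite3

def try_acquire_scan_lock(conn: sqlite3.Connection) -> bool:
    """Claim the single-writer scan lock, or report that it is taken.

    BEGIN IMMEDIATE acquires SQLite's write lock up front; with a zero
    busy-timeout a competing scanner gets 'database is locked' immediately
    instead of blocking. Commit (or rollback) releases the lock, which
    naturally enforces a short-write pattern.
    """
    try:
        conn.execute("BEGIN IMMEDIATE")
        return True
    except sqlite3.OperationalError:
        return False
```

Connections intended to fail fast would be opened with `timeout=0`; the default busy-timeout would make a contending scanner block for several seconds instead.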

Operational Constraints Addressed

Runtime Today, Scale Later

Today a single desktop machine is the active node, so the host must stay powered on for scheduled scans and Telegram interaction. The architecture is already shaped so it can move to a dedicated always-on device later, where multiple agents can run in parallel without changing the core operating model.

How To Talk About It

The strongest framing is architectural, not hobbyist. This is the concise version I would use in recruiter-facing contexts:

Architected a Telegram-first ATS monitoring and decision-support system using SQLite for state management, lifecycle tracking, and deduplication across a 27-source watchlist spanning Greenhouse, Lever, Ashby, and direct careers-page discovery. Designed around scheduled plus manual Telegram scans, retry-aware polling, human approval checkpoints, and template-driven resume/cover-letter generation, so the workflow reduced search latency without becoming noisy or brittle.
