ChiboTrack is a multi-service project for saving job postings from any browser tab, extracting structured job info, and viewing saved jobs in a dashboard.
- `apps/frontend`: React + TypeScript + Vite UI.
- `apps/backend`: Spring Boot REST API + JPA persistence.
- `apps/scraper`: Flask service that extracts title/company from raw HTML.
- `apps/extension`: Chrome extension that captures page HTML and sends it to the backend.
- `docker-compose.yml`: Local full-stack orchestration (Postgres + backend + frontend + scraper).
- Frontend: React 19, TypeScript, Vite, Tailwind CSS
- Backend: Java 17, Spring Boot 3, Spring Data JPA
- Scraper: Python 3.11, Flask, BeautifulSoup, extruct
- Database: PostgreSQL 15
- DevOps: Docker Compose
- You click **Save Job** in the Chrome extension.
- The extension content script grabs `document.documentElement.outerHTML`.
- The extension sends the HTML + URL + date to `POST /jobs/cleanandsave` on the backend.
- The backend forwards the HTML to the scraper's `POST /cleaner` for metadata extraction.
- The backend saves the parsed result to Postgres.
- The frontend fetches jobs from `GET /jobs`.
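The scraper step is where raw HTML becomes structured data. The real service uses BeautifulSoup and extruct; the stdlib-only sketch below just illustrates the idea by pulling the page `<title>` and an `og:site_name` meta tag as stand-ins for the job title and company (all names here are illustrative, not the scraper's actual code):

```python
from html.parser import HTMLParser


class TitleExtractor(HTMLParser):
    """Sketch of the scraper's job: pull title/company hints out of raw HTML."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.company = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("property") == "og:site_name":
                self.company = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


def extract_job_info(html: str) -> dict:
    # Feed the raw page through the parser and return the two fields.
    p = TitleExtractor()
    p.feed(html)
    return {"title": p.title.strip(), "company": p.company}


sample = """<html><head>
<title>Backend Engineer - Acme</title>
<meta property="og:site_name" content="Acme Careers">
</head><body>...</body></html>"""
print(extract_job_info(sample))
# -> {'title': 'Backend Engineer - Acme', 'company': 'Acme Careers'}
```

The production scraper can do much better (JSON-LD via extruct, site-specific selectors); this only shows the shape of the transformation.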
- Node.js 22+ and npm
- Java 17+
- Python 3.11+
- PostgreSQL (if not using Docker)
- Docker + Docker Compose (optional, recommended for easiest setup)
Backend reads DB settings from the root `.env.properties` file.

- Copy the example env file:

  ```bash
  cp .env.example .env.properties
  ```

- Update the `.env.properties` values:

  ```
  DB_URL=your db url
  DB_USERNAME=your_username
  DB_PASSWORD=your_password
  ```

From repo root:
```bash
docker compose up --build
```

Services:

- Frontend: `http://localhost:5173`
- Backend: `http://localhost:8080`
- Scraper: `http://localhost:5001/cleaner`
- Postgres: `localhost:5432` (user `postgres`, password `password`, database `dbtracker_db`)
Stop:

```bash
docker compose down
```

To run the stack without Docker, run each app in a separate terminal.
Create or use a database that matches your `DB_URL` in `.env.properties`.
Scraper:

```bash
cd apps/scraper
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
python app.py
```

Backend:

```bash
cd apps/backend
./mvnw spring-boot:run
```

Windows:

```bash
mvnw.cmd spring-boot:run
```

Frontend:

```bash
cd apps/frontend
npm install
npm run dev
```

Base URL: `http://localhost:8080`
- `GET /jobs`: Return all saved jobs.
- `POST /jobs/cleanandsave`: Extract and save a job from raw HTML.
- `PUT /jobs/{id}`: Update job fields (`title`, `company`, `url`) for an existing record.
- `DELETE /jobs/{id}`: Delete a job by ID.
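`PUT /jobs/{id}` only touches `title`, `company`, and `url`. A sketch of that field whitelist in plain Python (the `Job` shape and the partial-update behavior are assumptions for illustration, not the backend's actual JPA code):

```python
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class Job:
    # Field names mirror the API; only title/company/url are updatable.
    id: int
    title: str
    company: str
    url: str
    dateSaved: str


def apply_update(job: Job, updates: dict) -> Job:
    """Apply a PUT-style update, ignoring any field outside the whitelist."""
    allowed = {"title", "company", "url"}
    filtered = {k: v for k, v in updates.items() if k in allowed}
    return replace(job, **filtered)


job = Job(1, "Backend Engineer", "Acme",
          "https://company.com/jobs/123", "2026-03-01")
updated = apply_update(job, {"title": "Senior Backend Engineer", "id": 999})
print(updated.title, updated.id)  # -> Senior Backend Engineer 1 (id ignored)
```

The whitelist keeps immutable fields like `id` and `dateSaved` from being overwritten by a client payload.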
Example `POST /jobs/cleanandsave` body:

```json
{
  "html": "<html>...</html>",
  "url": "https://company.com/jobs/123",
  "dateSaved": "2026-03-01"
}
```

- Open `chrome://extensions`.
- Enable **Developer mode**.
- Click **Load unpacked**.
- Select `apps/extension`.
- Open a job page and click the extension's **Save Job** button.
Notes:

- The extension currently posts to `http://localhost:8080/jobs/cleanandsave`.
- The backend must be running for the save to work.
- Main saved-job flow is wired end-to-end (extension -> backend -> scraper -> database -> frontend list).
- Some frontend routes/pages are still placeholders ("Under Construction").
Internship_Tracker/
├── apps/
│ ├── backend/
│ ├── frontend/
│ ├── scraper/
│ └── extension/
├── docker-compose.yml
├── .env.example
└── .env.properties
- Backend DB connection errors:
  - verify `.env.properties` exists at the repo root and the credentials are correct.
  - verify Postgres is running and the database in `DB_URL` exists.
- Extension save fails:
  - confirm the backend is running on `localhost:8080`.
  - confirm the extension is loaded and active on the current tab.
- Frontend shows no jobs:
  - confirm the backend `GET /jobs` returns data.
  - confirm CORS/network calls are not blocked in the browser devtools.
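For the "is it running?" checks above, a small stdlib sketch can probe all three service ports in one go (`is_port_open` is a hypothetical helper, not part of this repo):

```python
import socket


def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something is accepting TCP connections at host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Ports taken from the services list earlier in this README.
for name, port in [("backend", 8080), ("scraper", 5001), ("postgres", 5432)]:
    status = "up" if is_port_open("127.0.0.1", port) else "down"
    print(f"{name:9s} ({port}): {status}")
```

A port being open only proves a process is listening; if a service is "up" but saves still fail, check that service's logs next.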