ChiboTrack

ChiboTrack is a multi-service project for saving job postings from any browser tab, extracting structured job info, and viewing saved jobs in a dashboard.

What Is In This Repo

  • apps/frontend: React + TypeScript + Vite UI.
  • apps/backend: Spring Boot REST API + JPA persistence.
  • apps/scraper: Flask service that extracts title/company from raw HTML.
  • apps/extension: Chrome extension that captures page HTML and sends it to the backend.
  • docker-compose.yml: Local full-stack orchestration (Postgres + backend + frontend + scraper).

Tech Stack

  • Frontend: React 19, TypeScript, Vite, Tailwind CSS
  • Backend: Java 17, Spring Boot 3, Spring Data JPA
  • Scraper: Python 3.11, Flask, BeautifulSoup, extruct
  • Database: PostgreSQL 15
  • DevOps: Docker Compose

Architecture (Request Flow)

  1. You click Save Job in the Chrome extension.
  2. Extension content script grabs document.documentElement.outerHTML.
  3. Extension sends HTML + URL + date to POST /jobs/cleanandsave on the backend.
  4. Backend forwards HTML to scraper POST /cleaner for metadata extraction.
  5. Backend saves parsed result to Postgres.
  6. Frontend fetches jobs from GET /jobs.
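As a hedged illustration, the six steps above can be simulated in-process in Python; the function names, the extraction logic, and the in-memory list are stand-ins for the real extension, Spring Boot backend, Flask scraper, and Postgres, not the project's actual code:

```python
# Hypothetical in-process sketch of the ChiboTrack request flow.
# All names here are illustrative stand-ins, not the real services.
import json
import re

def extract_metadata(html: str) -> dict:
    """Stand-in for the scraper's POST /cleaner: pull a title out of raw HTML."""
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return {"title": match.group(1).strip() if match else None}

def clean_and_save(payload: dict, db: list) -> dict:
    """Stand-in for the backend's POST /jobs/cleanandsave."""
    meta = extract_metadata(payload["html"])  # step 4: forward HTML to scraper
    job = {"id": len(db) + 1,
           "title": meta["title"],
           "url": payload["url"],
           "dateSaved": payload["dateSaved"]}
    db.append(job)                            # step 5: persist the parsed result
    return job

# Steps 2-6 end to end, with an in-memory list standing in for Postgres:
db: list = []
payload = {"html": "<html><title>Backend Engineer</title></html>",
           "url": "https://company.com/jobs/123",
           "dateSaved": "2026-03-01"}
saved = clean_and_save(payload, db)           # step 3: extension POSTs payload
print(json.dumps(db))                         # step 6: what GET /jobs would show
```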

Prerequisites

  • Node.js 22+ and npm
  • Java 17+
  • Python 3.11+
  • PostgreSQL (if not using Docker)
  • Docker + Docker Compose (optional, recommended for easiest setup)

Environment Setup

Backend reads DB settings from root .env.properties.

  1. Copy the example env file:

cp .env.example .env.properties

  2. Update the .env.properties values:

DB_URL=your db url
DB_USERNAME=your_username
DB_PASSWORD=your_password
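For a local Postgres matching the Docker Compose defaults below (database tracker_db, user postgres), the values might look like the following; the JDBC URL format is an assumption about how the backend consumes DB_URL:

```properties
DB_URL=jdbc:postgresql://localhost:5432/tracker_db
DB_USERNAME=postgres
DB_PASSWORD=password
```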

Quick Start (Docker Compose)

From repo root:

docker compose up --build

Services:

  • Frontend: http://localhost:5173
  • Backend: http://localhost:8080
  • Scraper: http://localhost:5001/cleaner
  • Postgres: localhost:5432 (user postgres, password password, database tracker_db)

Stop:

docker compose down

Local Development (Without Docker)

Run each app in separate terminals.

1) Start Postgres

Create/use a database that matches your DB_URL in .env.properties.

2) Start Scraper

cd apps/scraper
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
python app.py
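The scraper extracts metadata with BeautifulSoup and extruct. As a rough stdlib-only illustration of the kind of structured data extruct can pull from a page, the sketch below reads an embedded JSON-LD JobPosting block; the real service's logic may differ:

```python
# Stdlib-only sketch of JSON-LD JobPosting extraction; the real scraper
# uses BeautifulSoup and extruct and may handle many more page shapes.
import json
import re

def extract_jobposting(html: str) -> dict:
    """Pull title/company from an embedded JSON-LD JobPosting block."""
    for block in re.findall(
            r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
            html, re.DOTALL):
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            continue  # skip malformed JSON-LD blocks
        if data.get("@type") == "JobPosting":
            return {"title": data.get("title"),
                    "company": data.get("hiringOrganization", {}).get("name")}
    return {"title": None, "company": None}

page = """<html><head>
<script type="application/ld+json">
{"@type": "JobPosting", "title": "Backend Engineer",
 "hiringOrganization": {"name": "Acme"}}
</script></head></html>"""
print(extract_jobposting(page))
```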

3) Start Backend

cd apps/backend
./mvnw spring-boot:run

Windows:

mvnw.cmd spring-boot:run

4) Start Frontend

cd apps/frontend
npm install
npm run dev

API Endpoints

Base URL: http://localhost:8080

  • GET /jobs: Return all saved jobs.
  • POST /jobs/cleanandsave: Extract and save a job from raw HTML.
  • PUT /jobs/{id}: Update job fields (title, company, url) for an existing record.
  • DELETE /jobs/{id}: Delete a job by ID.

Example POST /jobs/cleanandsave body:

{
  "html": "<html>...</html>",
  "url": "https://company.com/jobs/123",
  "dateSaved": "2026-03-01"
}
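The endpoints above can be exercised from Python with only the standard library. This is a hedged sketch: it assumes the local Docker Compose setup with the backend on localhost:8080, and the helper names and response shapes are illustrative, not guaranteed by the API:

```python
# Hedged stdlib client sketch for the ChiboTrack backend API.
# Assumes the backend from the Docker Compose setup on localhost:8080.
import json
import urllib.request

BASE_URL = "http://localhost:8080"

def build_save_payload(html: str, url: str, date_saved: str) -> dict:
    """Body shape for POST /jobs/cleanandsave, per the example above."""
    return {"html": html, "url": url, "dateSaved": date_saved}

def post_clean_and_save(payload: dict) -> dict:
    """POST the payload to /jobs/cleanandsave and return the parsed response."""
    req = urllib.request.Request(
        f"{BASE_URL}/jobs/cleanandsave",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def list_jobs() -> list:
    """GET /jobs: return all saved jobs."""
    with urllib.request.urlopen(f"{BASE_URL}/jobs") as resp:
        return json.loads(resp.read())

payload = build_save_payload("<html>...</html>",
                             "https://company.com/jobs/123",
                             "2026-03-01")
```

With the backend running, calling `post_clean_and_save(payload)` and then `list_jobs()` should show the newly saved record.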

Chrome Extension Setup

  1. Open chrome://extensions.
  2. Enable Developer mode.
  3. Click Load unpacked.
  4. Select apps/extension.
  5. Open a job page and click the extension's Save Job button.

Notes:

  • Extension currently posts to http://localhost:8080/jobs/cleanandsave.
  • Backend must be running for save to work.

Current Project Status

  • Main saved-job flow is wired end-to-end (extension -> backend -> scraper -> database -> frontend list).
  • Some frontend routes/pages are still placeholders (Under Construction).

Repo Layout

Internship_Tracker/
├── apps/
│   ├── backend/
│   ├── frontend/
│   ├── scraper/
│   └── extension/
├── docker-compose.yml
├── .env.example
└── .env.properties

Troubleshooting

  • Backend DB connection errors:
    • Verify .env.properties exists at the repo root and the credentials are correct.
    • Verify Postgres is running and the database in DB_URL exists.
  • Extension save fails:
    • Confirm the backend is running on localhost:8080.
    • Confirm the extension is loaded and active on the current tab.
  • Frontend shows no jobs:
    • Confirm the backend GET /jobs returns data.
    • Confirm CORS/network calls are not blocked in the browser devtools.
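A quick way to run the backend checks above is a small connectivity probe; this is a sketch using only the standard library, and the default URL assumes the local setup from this README:

```python
# Connectivity probe for the troubleshooting checklist above.
# Default URL assumes the local backend from this README.
import json
import urllib.request

def check_backend(base_url: str = "http://localhost:8080") -> tuple[bool, str]:
    """Return (ok, detail) for GET /jobs so failures are easy to report."""
    try:
        with urllib.request.urlopen(f"{base_url}/jobs", timeout=3) as resp:
            jobs = json.loads(resp.read())
            return True, f"backend up, {len(jobs)} job(s) saved"
    except json.JSONDecodeError:
        return False, "backend responded, but /jobs did not return JSON"
    except OSError as exc:  # connection refused, DNS failure, timeout, ...
        return False, f"backend unreachable: {exc}"
```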
