| Agent | Persona | Role |
|---|---|---|
| ⚡ Yoruichi | Speed Daemon | P2P Coordinator |
| 🗡️ Retsu | Code Samurai | Inference Engine |
| 🌊 Nelliel | Data Guardian | State Manager |
| 🐝 Soifon | Security Ninja | Auth Handler |
| 🌸 Yachiru | UX Specialist | Stream Router |
| ❄️ Rukia | Code Freezer | Debugger Agent |
| 🔬 Nemu | Lab Architect | Research Agent |
| ✨ Orihime | Bug Healer | Refactor Agent |
| 🎯 Makima | Flow Controller | Orchestrator |
"The fastest AI-powered code generation platform, now with peer-to-peer collaboration"
Groq Coder is a production-grade acceleration engine that leverages the Groq LPU™ (Language Processing Unit) for blazing-fast inference. Unlike GPU-based solutions that batch requests, it streams tokens in real time, letting developers iterate at the speed of thought.
+ ⚡ Sub-50ms inference latency via Groq LPU
+ 🔗 Real-time P2P collaboration without central servers
+ 🎯 800+ tokens/second streaming
+ 🔐 End-to-end encrypted peer connections
+ 🌐 Decentralized code sharing network

```mermaid
flowchart TB
    subgraph "🌐 Public Network"
        direction TB
        STUN[("🔄 STUN Server<br/>NAT Traversal")]
        TURN[("🔀 TURN Relay<br/>Fallback Route")]
    end
    subgraph "👤 Peer A - Creator"
        direction TB
        UA[("🖥️ Browser Client")]
        WA["📡 WebRTC Agent"]
        LA["💾 Local State<br/>IndexedDB"]
    end
    subgraph "👥 Peer B - Collaborator"
        direction TB
        UB[("🖥️ Browser Client")]
        WB["📡 WebRTC Agent"]
        LB["💾 Local State<br/>IndexedDB"]
    end
    subgraph "⚡ Signaling Layer"
        direction LR
        SIG["🔔 WebSocket<br/>Signaling Server"]
    end
    UA --> WA
    UB --> WB
    WA <-->|"📋 SDP Offer/Answer"| SIG
    WB <-->|"📋 SDP Offer/Answer"| SIG
    WA <-->|"🧊 ICE Candidates"| STUN
    WB <-->|"🧊 ICE Candidates"| STUN
    WA <-.->|"🔀 Relay (if needed)"| TURN
    WB <-.->|"🔀 Relay (if needed)"| TURN
    WA <==>|"🔐 Encrypted DataChannel<br/>Code + Cursor + State"| WB
    WA --> LA
    WB --> LB
    style UA fill:#ff6b00,stroke:#fff,color:#fff
    style UB fill:#00d4aa,stroke:#fff,color:#fff
    style SIG fill:#8b5cf6,stroke:#fff,color:#fff
    style STUN fill:#3b82f6,stroke:#fff,color:#fff
    style TURN fill:#ec4899,stroke:#fff,color:#fff
```
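The signaling server in the diagram is a thin relay: it forwards SDP offers/answers and ICE candidates to peers by id and never sees DataChannel traffic. A minimal in-memory sketch of that routing logic (the `SignalMessage` shape and class names are illustrative, not this repo's actual API):

```typescript
// Hypothetical signaling relay: forwards messages between peers by id.
type SignalMessage = {
  type: "offer" | "answer" | "ice";
  from: string;
  to: string;
  payload: string; // SDP or ICE candidate — opaque to the relay
};

class SignalingRelay {
  private peers = new Map<string, (msg: SignalMessage) => void>();

  // Each connected peer registers a delivery callback (a WebSocket send
  // in a real server).
  register(id: string, deliver: (msg: SignalMessage) => void): void {
    this.peers.set(id, deliver);
  }

  // Route a message to its target; the relay never inspects the payload.
  route(msg: SignalMessage): boolean {
    const deliver = this.peers.get(msg.to);
    if (!deliver) return false;
    deliver(msg);
    return true;
  }
}
```

Once both peers have exchanged offers and answers through the relay, all further traffic flows over the direct encrypted DataChannel, so the server stays out of the data path.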
```
┌─────────────────────────────────────────────────────────────────────┐
│                    PEER-TO-PEER HANDSHAKE SEQUENCE                  │
├─────────────────────────────────────────────────────────────────────┤
│                                                                     │
│   PEER A                    SIGNALING                    PEER B     │
│     │                           │                           │       │
│     │──── Create Offer ───────>│                           │       │
│     │                           │──── Forward Offer ───────>│       │
│     │                           │                           │       │
│     │                           │<───── Create Answer ──────│       │
│     │<──── Forward Answer ─────│                           │       │
│     │                           │                           │       │
│     │<═════════ ICE Candidates Exchange ═════════════════>│       │
│     │                           │                           │       │
│     │◀════════════════ DTLS Handshake ═════════════════▶ │       │
│     │                           │                           │       │
│     │◀══════════ ENCRYPTED DATA CHANNEL ═══════════════▶ │       │
│     │                                                       │       │
│   ┌─┴─┐                                                   ┌─┴─┐     │
│   │ A │◀═══════════ LIVE COLLABORATION ═════════════════▶│ B │     │
│   └───┘                                                   └───┘     │
│                                                                     │
└─────────────────────────────────────────────────────────────────────┘
```
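The sequence above imposes a strict ordering: no answer before an offer, no DataChannel before DTLS completes. A toy state machine capturing that ordering (illustrative only — the real browser stack enforces this inside WebRTC):

```typescript
// Phases of the handshake, in the order the sequence diagram shows them.
const ORDER = ["offer", "answer", "ice", "dtls", "open"] as const;
type Step = (typeof ORDER)[number];

class Handshake {
  private next = 0;

  // Advance only if `step` is the expected next phase; reject out-of-order
  // events (e.g. an answer arriving before any offer was sent).
  advance(step: Step): boolean {
    if (ORDER[this.next] !== step) return false;
    this.next++;
    return true;
  }

  get connected(): boolean {
    return this.next === ORDER.length;
  }
}
```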
| Channel | Purpose | Priority |
|---|---|---|
| `code-sync` | Real-time code delta sync | 🔴 Critical |
| `cursor-pos` | Cursor position broadcast | 🟡 High |
| `ai-stream` | AI response streaming | 🔴 Critical |
| `presence` | User presence/status | 🟢 Normal |
| `files` | Large file transfer | 🔵 Low |
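One way the priority column above can be applied is to order the outbound queue so critical channels flush first when bandwidth is constrained. A sketch under that assumption (the tier numbers and function name are illustrative):

```typescript
// Map each channel to a numeric tier derived from the priority table above
// (lower number = flushed sooner).
const PRIORITY: Record<string, number> = {
  "code-sync": 0,  // Critical
  "ai-stream": 0,  // Critical
  "cursor-pos": 1, // High
  "presence": 2,   // Normal
  "files": 3,      // Low
};

interface Outgoing {
  channel: string;
  data: string;
}

// Return the queue in flush order; the stable sort preserves arrival
// order within the same priority tier.
function flushOrder(queue: Outgoing[]): Outgoing[] {
  return [...queue].sort(
    (a, b) => (PRIORITY[a.channel] ?? 9) - (PRIORITY[b.channel] ?? 9)
  );
}
```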
```mermaid
graph TB
    subgraph "🎨 Client Layer"
        direction TB
        UI["🖼️ React UI<br/>Next.js 15"]
        Editor["📝 Monaco Editor<br/>Code Workspace"]
        Preview["👁️ Live Preview<br/>Sandboxed iFrame"]
        P2P["🔗 P2P Module<br/>WebRTC"]
    end
    subgraph "🌐 Edge Layer"
        direction TB
        CDN["☁️ Vercel Edge<br/>Global CDN"]
        MW["🛡️ Middleware<br/>Rate Limiting"]
        Auth["🔐 NextAuth.js<br/>OAuth/JWT"]
    end
    subgraph "⚡ Inference Engine"
        direction TB
        Groq["🧠 Groq LPU<br/>Primary Engine"]
        Cerebras["🔄 Cerebras<br/>Fallback"]
        DeepSeek["💭 DeepSeek R1<br/>Reasoning Model"]
    end
    subgraph "💾 Data Layer"
        direction TB
        Mongo["🗄️ MongoDB Atlas<br/>Project Storage"]
        Redis["⚡ Upstash Redis<br/>Session Cache"]
        Vector["🔍 Vector DB<br/>Embeddings"]
    end
    subgraph "📡 Real-time Layer"
        direction TB
        WS["🔌 WebSocket<br/>Live Updates"]
        SSE["📊 SSE<br/>AI Streaming"]
        Signal["🔔 Signaling<br/>P2P Coordination"]
    end
    UI --> Editor
    UI --> Preview
    UI --> P2P
    Editor -->|"HTTP/2"| CDN
    CDN --> MW
    MW --> Auth
    Auth --> Groq
    Auth --> Cerebras
    Groq --> DeepSeek
    Auth --> Mongo
    Auth --> Redis
    Mongo --> Vector
    Editor --> WS
    Groq --> SSE
    P2P --> Signal
    style Groq fill:#ff6b00,stroke:#fff,color:#fff
    style P2P fill:#00d4aa,stroke:#fff,color:#fff
    style UI fill:#8b5cf6,stroke:#fff,color:#fff
```
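The diagram marks Groq as the primary engine and Cerebras as fallback. A minimal sketch of that pattern — race the primary call against a timeout and fall back if it rejects or stalls (the `Infer` signature is illustrative, not either provider's SDK):

```typescript
// Any inference backend, reduced to prompt-in / text-out for illustration.
type Infer = (prompt: string) => Promise<string>;

// Try the primary engine; if it rejects or exceeds `timeoutMs`,
// retry once on the fallback engine.
async function withFallback(
  prompt: string,
  primary: Infer,
  fallback: Infer,
  timeoutMs = 2000
): Promise<string> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error("primary timed out")), timeoutMs);
  });
  try {
    return await Promise.race([primary(prompt), timeout]);
  } catch {
    return await fallback(prompt);
  } finally {
    // Clear the timer so a won race doesn't leave a pending rejection.
    if (timer !== undefined) clearTimeout(timer);
  }
}
```

A production version would likely add retry budgets and surface which engine served the request, but the race-then-fallback core is the same shape.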
```
┌───────────────────────────────────────────────────────────────────────────┐
│                       REQUEST PROCESSING PIPELINE                         │
└───────────────────────────────────────────────────────────────────────────┘
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║  USER INPUT   ║────▶║  EDGE RUNTIME ║────▶║  AUTH GUARD   ║
  ║ "Build form"  ║     ║ validate req  ║     ║ check session ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
                                                      │
                                                      ▼
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║ RATE LIMITER  ║◀────║ CONTEXT BUILD ║◀────║ PROMPT ENGINE ║
  ║ Upstash Redis ║     ║ inject meta   ║     ║ system prompt ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
          │
          ▼
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║   GROQ LPU    ║────▶║    STREAM     ║────▶║   SSE PUSH    ║
  ║  800 tok/sec  ║     ║  transformer  ║     ║  to client    ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
                                                      │
                                                      ▼
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║  ASYNC WRITE  ║◀────║  P2P FANOUT   ║◀────║    RENDER     ║
  ║   MongoDB     ║     ║ sync to peers ║     ║ live preview  ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
```
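The rate-limiter stage above is backed by Upstash Redis; its core logic can be sketched as a fixed-window counter (shown in-memory here for illustration — the limit values and class name are assumptions, not this repo's configuration):

```typescript
// Fixed-window rate limiter: allow up to `limit` requests per key
// within each `windowMs` window.
class WindowLimiter {
  private counts = new Map<string, { windowStart: number; count: number }>();

  constructor(private limit: number, private windowMs: number) {}

  // Returns true if the request is allowed; `now` is injectable for testing.
  allow(key: string, now: number = Date.now()): boolean {
    const entry = this.counts.get(key);
    // No entry yet, or the previous window has expired: start a new window.
    if (!entry || now - entry.windowStart >= this.windowMs) {
      this.counts.set(key, { windowStart: now, count: 1 });
      return true;
    }
    if (entry.count >= this.limit) return false;
    entry.count++;
    return true;
  }
}
```

In production the counter lives in Redis (e.g. an atomic increment with expiry) so that all edge instances share the same window state.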
| Layer | Technology | Purpose | Latency Target |
|---|---|---|---|
| Client | Next.js 15 + React 19 | Server Components, Streaming | < 100ms FCP |
| Auth | NextAuth.js + JWT | OAuth, Session Management | < 50ms |
| Inference | Groq LPU | Primary AI Engine | < 50ms TTFB |
| Fallback | Cerebras WSE | Secondary Inference | < 150ms TTFB |
| Cache | Upstash Redis | Rate Limiting, Sessions | < 10ms |
| Database | MongoDB Atlas | Project Persistence | < 100ms |
| P2P | WebRTC + DataChannels | Real-time Collaboration | < 30ms RTT |
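The low TTFB targets above depend on rendering tokens as soon as they arrive on the SSE stream. A minimal parser for OpenAI-style SSE frames (the `data: ...` / `[DONE]` convention is an assumption about the wire format, not confirmed by this repo):

```typescript
// Parse a buffered SSE payload into token chunks. Each event arrives as
// "data: <chunk>\n\n"; a "data: [DONE]" sentinel marks end of stream.
function parseSSE(buffer: string): { chunks: string[]; done: boolean } {
  const chunks: string[] = [];
  let done = false;
  for (const line of buffer.split("\n")) {
    if (!line.startsWith("data: ")) continue; // skip blanks and comments
    const payload = line.slice("data: ".length);
    if (payload === "[DONE]") {
      done = true;
      break;
    }
    chunks.push(payload);
  }
  return { chunks, done };
}
```

A real client would parse incrementally as network chunks arrive rather than buffering the whole response, but the framing rules are the same.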
"Intelligence shouldn't be gated behind paywalls"
In an era of $20/month AI subscriptions, Groq Coder stands for accessibility:
| Belief | Our Commitment |
|---|---|
| 🌍 Access is a Right | Every developer deserves SOTA tooling |
| 👥 Community > Corporation | Features come from users, not roadmaps |
| 🔍 Transparency is Trust | See every prompt, every decision |
This project demonstrates:
- 🏗️ Full-Stack Mastery: MongoDB → GraphQL → React → Edge Runtime
- ⚡ Performance Obsession: Sub-50ms latency is a feature, not a goal
- 🤖 AI Integration: LLM context windows, streaming, graceful degradation
- 🔗 P2P Expertise: WebRTC, STUN/TURN, encrypted DataChannels
- 🎨 Product Sense: Onboarding, galleries, social features
Built by a single determined engineer to prove that high-performance AI apps are achievable.
Prerequisites:

- Node.js >= 18.0.0
- MongoDB Atlas account
- Groq API key

```bash
git clone https://github.com/ixchio/GroqCoder.git
cd GroqCoder
npm install
```

```bash
cp .env.example .env.local
```

```env
# Required
MONGODB_URI=mongodb+srv://...
GROQ_API_KEY=gsk_...

# OAuth (optional)
GITHUB_CLIENT_ID=...
GITHUB_CLIENT_SECRET=...

# P2P Signaling
NEXT_PUBLIC_SIGNALING_URL=wss://...
```

```bash
npm run dev
```

Visit http://localhost:3000 and start building!
```
┌─────────────────────────────────────────────────────────────────┐
│                       BENCHMARK RESULTS                         │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  Inference Latency (TTFB)   ████████░░░░░░░░░░░░  42ms          │
│  Token Generation Rate      ████████████████████  823/s         │
│  P2P Connection Setup       ██████░░░░░░░░░░░░░░  285ms         │
│  DataChannel RTT            █████░░░░░░░░░░░░░░░  24ms          │
│  State Sync Latency         ███░░░░░░░░░░░░░░░░░  12ms          │
│  Cold Start (Vercel Edge)   ████████░░░░░░░░░░░░  180ms         │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```
We welcome contributions! Check out our Contributing Guide for details.
```bash
git checkout -b feature/amazing-feature
git commit -m 'Add amazing feature'
git push origin feature/amazing-feature
```