Skip to content

Groq Coder is a decentralized AI coding platform built with Next.js and Groq LPU, delivering ultra-low-latency code generation, live streaming responses, and secure peer-to-peer collaboration.

Notifications You must be signed in to change notification settings

ixchio/GroqCoder

Repository files navigation

Groq Coder Llama

⚡ GROQ CODER ⚡

🔥 Your Imagination. Compiled. Instantly. 🔥

Powered By Groq Next.js 15 TypeScript P2P Enabled MIT License

🌐 Live Demo🚀 Quick Start🔗 P2P Flow🏗️ System Design


🎭 Meet The Squad


⚡ Yoruichi
Speed Daemon
P2P Coordinator

🗡️ Retsu
Code Samurai
Inference Engine

🌊 Nelliel
Data Guardian
State Manager

🐝 Soifon
Security Ninja
Auth Handler

🌸 Yachiru
UX Specialist
Stream Router

⚔️ AI Agents Squad


❄️ Rukia
Code Freezer
Debugger Agent

🔬 Nemu
Lab Architect
Research Agent

✨ Orihime
Bug Healer
Refactor Agent

🎯 Makima
Flow Controller
Orchestrator

🚀 What is Groq Coder?

"The fastest AI-powered code generation platform, now with peer-to-peer collaboration"

Groq Coder is a production-grade acceleration engine that leverages Groq LPU™ (Language Processing Unit) for blazing-fast inference. Unlike GPU-based solutions that batch requests, we stream logic in real-time—enabling developers to iterate at the speed of thought.

✨ Key Highlights

+ ⚡ Sub-50ms inference latency via Groq LPU
+ 🔗 Real-time P2P collaboration without central servers
+ 🎯 800+ tokens/second streaming
+ 🔐 End-to-end encrypted peer connections
+ 🌐 Decentralized code sharing network

🔗 Peer-to-Peer Architecture

The Decentralized Code Network

flowchart TB
    subgraph "🌐 Public Network"
        direction TB
        STUN[("🔄 STUN Server<br/>NAT Traversal")]
        TURN[("🔀 TURN Relay<br/>Fallback Route")]
    end

    subgraph "👤 Peer A - Creator"
        direction TB
        UA[("🖥️ Browser Client")]
        WA["📡 WebRTC Agent"]
        LA["💾 Local State<br/>IndexedDB"]
    end

    subgraph "👥 Peer B - Collaborator"
        direction TB
        UB[("🖥️ Browser Client")]
        WB["📡 WebRTC Agent"]
        LB["💾 Local State<br/>IndexedDB"]
    end

    subgraph "⚡ Signaling Layer"
        direction LR
        SIG["🔔 WebSocket<br/>Signaling Server"]
    end

    UA --> WA
    UB --> WB
    
    WA <-->|"📋 SDP Offer/Answer"| SIG
    WB <-->|"📋 SDP Offer/Answer"| SIG
    
    WA <-->|"🧊 ICE Candidates"| STUN
    WB <-->|"🧊 ICE Candidates"| STUN
    
    WA <-.->|"🔀 Relay (if needed)"| TURN
    WB <-.->|"🔀 Relay (if needed)"| TURN
    
    WA <===>|"🔐 Encrypted DataChannel<br/>Code + Cursor + State"| WB
    
    WA --> LA
    WB --> LB

    style UA fill:#ff6b00,stroke:#fff,color:#fff
    style UB fill:#00d4aa,stroke:#fff,color:#fff
    style SIG fill:#8b5cf6,stroke:#fff,color:#fff
    style STUN fill:#3b82f6,stroke:#fff,color:#fff
    style TURN fill:#ec4899,stroke:#fff,color:#fff
Loading

🔄 P2P Connection Flow

┌─────────────────────────────────────────────────────────────────────┐
│                    PEER-TO-PEER HANDSHAKE SEQUENCE                  │
├─────────────────────────────────────────────────────────────────────┤
│                                                                     │
│   PEER A                    SIGNALING                    PEER B     │
│     │                          │                           │        │
│     │──── Create Offer ───────>│                           │        │
│     │                          │──── Forward Offer ───────>│        │
│     │                          │                           │        │
│     │                          │<───── Create Answer ──────│        │
│     │<──── Forward Answer ─────│                           │        │
│     │                          │                           │        │
│     │<═════════ ICE Candidates Exchange ═════════════════>│        │
│     │                          │                           │        │
│     │◀════════════════ DTLS Handshake ═════════════════▶ │        │
│     │                          │                           │        │
│     │◀══════════ ENCRYPTED DATA CHANNEL ═══════════════▶ │        │
│     │                                                      │        │
│   ┌─┴─┐                                                  ┌─┴─┐      │
│   │ A │◀═══════════ LIVE COLLABORATION ═════════════════▶│ B │      │
│   └───┘                                                  └───┘      │
│                                                                     │
└─────────────────────────────────────────────────────────────────────┘

📡 Data Channel Protocol

Channel Purpose Priority
code-sync Real-time code delta sync 🔴 Critical
cursor-pos Cursor position broadcast 🟡 High
ai-stream AI response streaming 🔴 Critical
presence User presence/status 🟢 Normal
files Large file transfer 🔵 Low

🏗️ System Design

Complete System Architecture

graph TB
    subgraph "🎨 Client Layer"
        direction TB
        UI["🖼️ React UI<br/>Next.js 15"]
        Editor["📝 Monaco Editor<br/>Code Workspace"]
        Preview["👁️ Live Preview<br/>Sandboxed iFrame"]
        P2P["🔗 P2P Module<br/>WebRTC"]
    end

    subgraph "🌐 Edge Layer"
        direction TB
        CDN["☁️ Vercel Edge<br/>Global CDN"]
        MW["🛡️ Middleware<br/>Rate Limiting"]
        Auth["🔐 NextAuth.js<br/>OAuth/JWT"]
    end

    subgraph "⚡ Inference Engine"
        direction TB
        Groq["🧠 Groq LPU<br/>Primary Engine"]
        Cerebras["🔄 Cerebras<br/>Fallback"]
        DeepSeek["💭 DeepSeek R1<br/>Reasoning Model"]
    end

    subgraph "💾 Data Layer"
        direction TB
        Mongo["🗄️ MongoDB Atlas<br/>Project Storage"]
        Redis["⚡ Upstash Redis<br/>Session Cache"]
        Vector["🔍 Vector DB<br/>Embeddings"]
    end

    subgraph "📡 Real-time Layer"
        direction TB
        WS["🔌 WebSocket<br/>Live Updates"]
        SSE["📊 SSE<br/>AI Streaming"]
        Signal["🔔 Signaling<br/>P2P Coordination"]
    end

    UI --> Editor
    UI --> Preview
    UI --> P2P
    
    Editor -->|"HTTP/2"| CDN
    CDN --> MW
    MW --> Auth
    
    Auth --> Groq
    Auth --> Cerebras
    Groq --> DeepSeek
    
    Auth --> Mongo
    Auth --> Redis
    Mongo --> Vector
    
    Editor --> WS
    Groq --> SSE
    P2P --> Signal

    style Groq fill:#ff6b00,stroke:#fff,color:#fff
    style P2P fill:#00d4aa,stroke:#fff,color:#fff
    style UI fill:#8b5cf6,stroke:#fff,color:#fff
Loading

🧬 Request Lifecycle

┌───────────────────────────────────────────────────────────────────────────┐
│                         REQUEST PROCESSING PIPELINE                        │
└───────────────────────────────────────────────────────────────────────────┘

  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║   USER INPUT  ║────▶║  EDGE RUNTIME ║────▶║  AUTH GUARD   ║
  ║  "Build form" ║     ║  validate req ║     ║ check session ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
                                                      │
                                                      ▼
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║ RATE LIMITER  ║◀────║ CONTEXT BUILD ║◀────║ PROMPT ENGINE ║
  ║  Upstash Redis║     ║ inject meta   ║     ║ system prompt ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
          │
          ▼
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╗
  ║   GROQ LPU    ║────▶║    STREAM     ║────▶║   SSE PUSH    ║
  ║  800 tok/sec  ║     ║  transformer  ║     ║  to client    ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝
                                                      │
                                                      ▼
  ╔═══════════════╗     ╔═══════════════╗     ╔═══════════════╝
  ║  ASYNC WRITE  ║◀────║  P2P FANOUT   ║◀────║    RENDER     ║
  ║   MongoDB     ║     ║ sync to peers ║     ║  live preview ║
  ╚═══════════════╝     ╚═══════════════╝     ╚═══════════════╝

🎯 Component Matrix

Layer Technology Purpose Latency Target
Client Next.js 15 + React 19 Server Components, Streaming < 100ms FCP
Auth NextAuth.js + JWT OAuth, Session Management < 50ms
Inference Groq LPU Primary AI Engine < 50ms TTFB
Fallback Cerebras WSE Secondary Inference < 150ms TTFB
Cache Upstash Redis Rate Limiting, Sessions < 10ms
Database MongoDB Atlas Project Persistence < 100ms
P2P WebRTC + DataChannels Real-time Collaboration < 30ms RTT

🔓 Why Open Source?

"Intelligence shouldn't be gated behind paywalls"

In an era of $20/month AI subscriptions, Groq Coder stands for accessibility:

Belief Our Commitment
🌍 Access is a Right Every developer deserves SOTA tooling
👥 Community > Corporation Features come from users, not roadmaps
🔍 Transparency is Trust See every prompt, every decision

💼 For Recruiters

This project demonstrates:

  • 🏗️ Full-Stack Mastery: MongoDB → GraphQL → React → Edge Runtime
  • ⚡ Performance Obsession: Sub-50ms latency is a feature, not a goal
  • 🤖 AI Integration: LLM context windows, streaming, graceful degradation
  • 🔗 P2P Expertise: WebRTC, STUN/TURN, encrypted DataChannels
  • 🎨 Product Sense: Onboarding, galleries, social features

Built by a single determined engineer to prove that high-performance AI apps are achievable.


🛠️ Quick Start

Prerequisites

Node.js >= 18.0.0
MongoDB Atlas Account
Groq API Key

1. Clone & Install

git clone https://github.com/ixchio/GroqCoder.git
cd GroqCoder
npm install

2. Configure Environment

cp .env.example .env.local
# Required
MONGODB_URI=mongodb+srv://...
GROQ_API_KEY=gsk_...

# OAuth (optional)
GITHUB_CLIENT_ID=...
GITHUB_CLIENT_SECRET=...

# P2P Signaling
NEXT_PUBLIC_SIGNALING_URL=wss://...

3. Launch 🚀

npm run dev

Visit http://localhost:3000 and start building!


📊 Performance Metrics

┌─────────────────────────────────────────────────────────────────┐
│                     BENCHMARK RESULTS                           │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  Inference Latency (TTFB)         ████████░░░░░░░░░░░░  42ms   │
│  Token Generation Rate            ████████████████████  823/s   │
│  P2P Connection Setup             ██████░░░░░░░░░░░░░░  285ms  │
│  DataChannel RTT                  █████░░░░░░░░░░░░░░░  24ms   │
│  State Sync Latency               ███░░░░░░░░░░░░░░░░░  12ms   │
│  Cold Start (Vercel Edge)         ████████░░░░░░░░░░░░  180ms  │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘

🤝 Contributing

We welcome contributions! Check out our Contributing Guide for details.

git checkout -b feature/amazing-feature
git commit -m 'Add amazing feature'
git push origin feature/amazing-feature

🌟 Star this repo if you found it useful!


Built with ❤️ and ⚡ by a 10x Engineer

"The future of coding is decentralized, fast, and free."

About

Groq Coder is a decentralized AI coding platform built with Next.js and Groq LPU, delivering ultra-low-latency code generation, live streaming responses, and secure peer-to-peer collaboration.

Topics

Resources

Stars

Watchers

Forks

Releases

No releases published

Packages

No packages published

Contributors 2

  •  
  •  

Languages