- In backend REST API development, to view resources we need to create a LIST and a VIEW API for each resource, i.e. 4 × 2 = 8 APIs for 4 resources, and every resource we add means more manually hard-coded endpoints.
- Inside all those APIs we are just changing the SQL queries to fetch data from different tables.
- For any kind of aggregate operation we need to write those functions manually as well.
LLM comes into the picture
- With AI LLM models we can now solve that problem by generating dynamic SQL queries on the fly, executing them against the DB, and getting back the desired data. For example:
a. Profile information for the student named John, expecting class info along with its subjects and marks if they exist
b. Class marks report for the student named John
c. Get all students of the subject Physics
Create a prompt API where we will ask a question and it will query the DB and return the response as JSON.
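The prompt API flow above can be sketched as follows. This is a minimal illustration, not the project's actual code: the schema text, `build_prompt`, and `ask` are hypothetical names, and the LLM is passed in as any callable so no API key is needed.

```python
# Minimal sketch of the prompt API flow (assumed names, not the repo's code).

# Schema summary given to the LLM so it can write correct SQL.
SCHEMA = """
student(first_name, last_name, gender, dob, age, class_id)
class(name, status)
subject(name, class_id, description)
marks(name, class_id, student_id, subject_id, marks, totalMarks)
"""

def build_prompt(question: str) -> str:
    # The LLM sees the schema plus the user's question and must
    # answer with a single SQL query.
    return (
        "You are a SQL generator for a Postgres database.\n"
        f"Schema:\n{SCHEMA}\n"
        f"Question: {question}\n"
        "Respond with one SQL query only."
    )

def ask(question: str, llm) -> str:
    """llm is any callable mapping a prompt string to SQL text."""
    sql = llm(build_prompt(question))
    # In the real API this SQL would be executed against the DB
    # and the resulting rows returned as JSON.
    return sql
```

In the real endpoint, `llm` would be the Grok client call and the returned SQL would be executed and serialized to JSON before responding.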
Tables and columns:
- Student: first_name, last_name, gender, dob, age, class_id
- Class: name, status
- Subject: name, class_id, description
- Marks: name, class_id, student_id, subject_id, marks, totalMarks
- Student belongs to one Class
- One Class has multiple Subjects
- One Class has multiple Students
- One Student has marks for every subject
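The tables and relationships above can be sketched as SQL DDL. Column types and primary keys are assumptions, and SQLite is used here only to keep the example self-contained (the project itself targets Postgres).

```python
# Illustrative DDL for the student/class/subject/marks schema.
# Types and keys are assumed; the project's real schema may differ.
import sqlite3

DDL = """
CREATE TABLE class (
    id INTEGER PRIMARY KEY,
    name TEXT,
    status TEXT
);
CREATE TABLE student (
    id INTEGER PRIMARY KEY,
    first_name TEXT,
    last_name TEXT,
    gender TEXT,
    dob TEXT,
    age INTEGER,
    class_id INTEGER REFERENCES class(id)  -- a student belongs to one class
);
CREATE TABLE subject (
    id INTEGER PRIMARY KEY,
    name TEXT,
    class_id INTEGER REFERENCES class(id), -- one class has multiple subjects
    description TEXT
);
CREATE TABLE marks (
    id INTEGER PRIMARY KEY,
    name TEXT,
    class_id INTEGER REFERENCES class(id),
    student_id INTEGER REFERENCES student(id), -- one student has marks
    subject_id INTEGER REFERENCES subject(id), -- for every subject
    marks INTEGER,
    totalMarks INTEGER
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)
tables = {row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")}
```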
- Create a Python virtual environment: `python -m venv venv`
- Activate it: `source venv/bin/activate`
- Install uv: `pip install uv`
- Install Jupyter Notebook: `uv pip install jupyter`
- Install dependencies: `uv pip install -r requirements.txt`
- Install the VSCode Jupyter Notebook extension
- Copy `.env.example` to `.env`. Enter the database creds and the API key for Grok
- Create a database named `gen_ai_db` in the Postgres DB
- If you have Docker, use the `docker/docker-compose.yaml` file to spin up a Postgres instance
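For reference, a minimal compose file for such a Postgres instance might look like the sketch below. The image tag, credentials, and port are assumptions; the repo's actual `docker/docker-compose.yaml` is authoritative.

```yaml
# Illustrative only -- see docker/docker-compose.yaml in the repo.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_DB: gen_ai_db      # database name from the setup steps
      POSTGRES_USER: postgres     # assumed credentials; match your .env
      POSTGRES_PASSWORD: postgres
    ports:
      - "5432:5432"
```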