
Large Language Model in one week (Trained on Ascendance of a Bookworm)

Ascendance of a Bookworm LLM

Making an LLM for the Hack Club YSWS program.

I'm using 33 volumes of Ascendance of a Bookworm as training data :D

Usage

python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
# Kaggle serves model downloads as an archive; save it to a file and unpack it
curl -L -o model.tar.gz https://www.kaggle.com/api/v1/models/geminn/ascendance-of-a-bookworm-llm/pyTorch/default/1/download
tar -xzf model.tar.gz
fastapi run main.py

Now visit http://localhost:8000/

Disclaimer

This is purely an educational project. If any of the publishers isn't happy with it, just email me and I'll sort it out.
