Based on the freeCodeCamp.org course taught by Elliot.

- `Bigram model.ipynb` implements a bigram language model in PyTorch that learns from and generates character-level text. The model uses an embedding table to represent each character as a learnable vector and generates new text by repeatedly predicting the next character in the sequence (see the sketch after this list).

- `chatbot.py` contains a PyTorch implementation of a GPT-style language model (see `gpt-v1.ipynb`) that generates text from input prompts. The model is built on the transformer architecture, stacking causal self-attention and feed-forward layers to process text (a sketch of one such block also follows below).
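
Here is a minimal, illustrative sketch of the bigram idea: the embedding table is sized `vocab_size × vocab_size`, so looking up a character directly yields the logits for the next character. Class names, variable names, and `vocab_size=65` are assumptions for the example, not necessarily those used in the notebook.

```python
import torch
import torch.nn as nn
from torch.nn import functional as F

class BigramLanguageModel(nn.Module):
    def __init__(self, vocab_size):
        super().__init__()
        # Row i of the table holds the logits over the next character, given character i.
        self.token_embedding_table = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        # idx: (batch, time) tensor of character indices
        logits = self.token_embedding_table(idx)            # (batch, time, vocab_size)
        if targets is None:
            return logits, None
        B, T, C = logits.shape
        loss = F.cross_entropy(logits.view(B * T, C), targets.view(B * T))
        return logits, loss

    @torch.no_grad()
    def generate(self, idx, max_new_tokens):
        # Sample one character at a time, feeding each prediction back into the context.
        for _ in range(max_new_tokens):
            logits, _ = self(idx)
            probs = F.softmax(logits[:, -1, :], dim=-1)      # distribution over the next character
            idx_next = torch.multinomial(probs, num_samples=1)
            idx = torch.cat((idx, idx_next), dim=1)
        return idx

# Example: generate 100 characters starting from index 0 (an untrained model produces random text).
model = BigramLanguageModel(vocab_size=65)
print(model.generate(torch.zeros((1, 1), dtype=torch.long), max_new_tokens=100))
```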
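
And a rough sketch of the transformer building blocks mentioned for `chatbot.py`: one causal self-attention head plus the position-wise feed-forward layer. Hyperparameter names (`n_embd`, `head_size`, `block_size`, `dropout`) are assumptions for illustration and may differ from the script.

```python
import torch
import torch.nn as nn
from torch.nn import functional as F

class Head(nn.Module):
    """One head of causal (masked) self-attention."""
    def __init__(self, n_embd, head_size, block_size, dropout=0.1):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # Lower-triangular mask so each position can only attend to earlier positions.
        self.register_buffer("tril", torch.tril(torch.ones(block_size, block_size)))
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):
        B, T, C = x.shape
        k, q, v = self.key(x), self.query(x), self.value(x)
        wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5        # scaled dot-product scores
        wei = wei.masked_fill(self.tril[:T, :T] == 0, float("-inf"))  # causal mask
        wei = self.dropout(F.softmax(wei, dim=-1))
        return wei @ v                                             # (B, T, head_size)

class FeedForward(nn.Module):
    """Position-wise MLP applied after attention."""
    def __init__(self, n_embd, dropout=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_embd, 4 * n_embd),
            nn.ReLU(),
            nn.Linear(4 * n_embd, n_embd),
            nn.Dropout(dropout),
        )

    def forward(self, x):
        return self.net(x)
```

In a full GPT-style model, several such heads are typically concatenated into multi-head attention and combined with the feed-forward layer inside residual blocks with layer normalization.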