Conversation

@joennlae joennlae commented Jan 5, 2024

Hello 😄

First of all, this is a really cool collection of links 💯 I also wanted to say that I am a big fan of your finetunes on Hugging Face 😄 I am particularly interested in drllama and drmistral, as I am exploring a similar direction.

This PR is a shameless self-plug.

I propose adding Tensorli, a minimalistic implementation of a trainable GPT transformer using only NumPy.

The implementation includes:

  • Automatic differentiation
  • Tensorli object (PyTorch like)
  • Simple NN layers: Linearli, Embeddingli, MultiheadAttentionli, LayerNorm
  • Optimizers: Adamli

That is all that is "needed" to train and execute a GPT-like transformer model.
"...and everything else is just efficiency." ~ Andrej Karpathy¹
¹: YouTube – micrograd
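To give a flavor of what a NumPy-only autodiff core like this involves, here is a hypothetical sketch in the spirit of micrograd. The `Tensor` class and its method names are my own illustration, not the actual Tensorli API; it shows reverse-mode differentiation through a small graph of NumPy operations.

```python
import numpy as np

class Tensor:
    """Tiny reverse-mode autodiff tensor (illustrative sketch, not Tensorli's API)."""

    def __init__(self, data, _children=()):
        self.data = np.asarray(data, dtype=np.float64)
        self.grad = np.zeros_like(self.data)
        self._backward = lambda: None   # set by the op that produced this tensor
        self._prev = set(_children)     # parents in the computation graph

    def __add__(self, other):
        other = other if isinstance(other, Tensor) else Tensor(other)
        out = Tensor(self.data + other.data, (self, other))
        def _backward():
            # gradient of a sum flows through unchanged to both operands
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def matmul(self, other):
        out = Tensor(self.data @ other.data, (self, other))
        def _backward():
            # standard matmul gradients: dL/dA = dL/dC @ B^T, dL/dB = A^T @ dL/dC
            self.grad += out.grad @ other.data.T
            other.grad += self.data.T @ out.grad
        out._backward = _backward
        return out

    def sum(self):
        out = Tensor(self.data.sum(), (self,))
        def _backward():
            # broadcast the scalar gradient back to every element
            self.grad += np.ones_like(self.data) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological sort, then apply chain rule in reverse order
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = np.ones_like(self.data)
        for node in reversed(topo):
            node._backward()
```

With only these few ops one can already differentiate a linear layer: `x.matmul(w).sum().backward()` fills `x.grad` and `w.grad`, which is the mechanism the layers and the Adam-style optimizer build on.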

I originally implemented it for myself to understand transformers better, then realized it could help others too, so I open-sourced it.

Have a good one and cheers 😄

