wordplay ๐ŸŽฎ ๐Ÿ’ฌ

February 2, 2024

A set of simple, scalable and highly configurable tools for working¹ with LLMs.

Background

What started as some simple modifications to Andrej Karpathy's nanoGPT has now grown into the wordplay project.

Figure 1: nanoGPT, transformed. (a) nanoGPT; (b) wordplay.

While nanoGPT is a great project and an excellent resource, it is, by design, very minimal² and limited in its flexibility.

Working through the code, I found myself making minor changes here and there to test new ideas and run variations on different experiments. These changes eventually built up to the point where my {goals, scope, code} for the project had diverged significantly from the original vision.

As a result, I figured it made more sense to move things to a new project, wordplay.

I’ve prioritized adding functionality that I have found to be useful or interesting, but am absolutely open to input or suggestions for improvement.

Different aspects of this project have been motivated by some of my recent work on LLMs (Song et al. 2023).

Completed

In Progress

Install

Grab-n-Go

The easiest way to get the most recent version is to:

python3 -m pip install "git+https://github.com/saforem2/wordplay.git"
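If the install succeeds, a quick sanity check (my own suggestion, assuming the package's import name matches its install name, wordplay) is:

python3 -m pip show wordplay                             # print the installed version and location
python3 -c 'import wordplay; print(wordplay.__name__)'   # confirm the package imports cleanly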
Development

If youโ€™d like to work with the project and run / change things yourself, Iโ€™d recommend installing from a local (editable) clone of this repository:

git clone "https://github.com/saforem2/wordplay"
cd wordplay
mkdir -p venv
python3 -m venv venv --system-site-packages
source venv/bin/activate
python3 -m pip install -e .
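To verify the editable install (again, assuming the import name is wordplay), check that Python resolves the package to your local clone rather than to site-packages:

# should print a path inside the cloned wordplay/ directory;
# edits to the source are then picked up without reinstalling
python3 -c 'import wordplay; print(wordplay.__file__)'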

Last Updated: 02/02/2024 @ 21:53:52


References

Song, Shuaiwen Leon, Bonnie Kruft, Minjia Zhang, Conglong Li, Shiyang Chen, Chengming Zhang, Masahiro Tanaka, et al. 2023. โ€œDeepSpeed4Science Initiative: Enabling Large-Scale Scientific Discovery Through Sophisticated AI System Technologies.โ€ https://arxiv.org/abs/2310.04610.

Footnotes

  1. {
      "training",
      "fine-tuning",
      "benchmarking",
      "parallelizing",
      "distributing",
      "measuring",
      "..."
    }

    large models at scale.

  2. nano, even 😂

Citation

BibTeX citation:
@online{foreman2024,
  author = {Foreman, Sam},
  title = {`Wordplay` ๐ŸŽฎ ๐Ÿ’ฌ},
  date = {2024-02-02},
  url = {https://saforem2.github.io/wordplay},
  langid = {en}
}
For attribution, please cite this work as:
Foreman, Sam. 2024. โ€œ`Wordplay` ๐ŸŽฎ ๐Ÿ’ฌ.โ€ February 2, 2024. https://saforem2.github.io/wordplay.