Few people understand Large Language Models (LLMs) as thoroughly as Andrej Karpathy, and fortunately for all of us, he channels that knowledge into useful open-source projects. His latest creation, nanochat, is described as “the best ChatGPT $100 can buy.”
So, what exactly is nanochat? It is a minimal, hackable software project driven by a single speedrun.sh script that builds a simple ChatGPT clone from scratch, complete with a web interface. The entire codebase consists of about 8,000 lines of clean, readable code with minimal dependencies, making every component of the process accessible and easy to modify.
The $100 cost refers to the compute required to train the model, which takes approximately four hours on a single node of eight NVIDIA H100 GPUs. The end product is a 1.9-billion-parameter micro-model, trained on roughly 38 billion tokens from an open dataset.
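As a quick sanity check on the headline figure, the price tag follows directly from the node size and the wall-clock time. The per-GPU rental rate below is an assumption for illustration; the article only states the hardware and duration:

```python
# Back-of-envelope check of the "$100" training cost.
# ASSUMPTION: ~$3 per H100 GPU-hour is a typical on-demand rental rate;
# the article itself gives only the node size and the training time.
gpus = 8      # one node with eight H100s
hours = 4     # approximate wall-clock training time
rate = 3.0    # assumed cost in $ per GPU-hour

cost = gpus * hours * rate
print(f"estimated cost: ${cost:.0f}")  # prints: estimated cost: $96
```

At these assumed rates the run lands just under $100, which is consistent with the project's "best ChatGPT $100 can buy" framing.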
As Andrej describes in his announcement on X, this model is a “little ChatGPT clone you can sort of talk to, and which can write stories/poems, answer simple questions.” A detailed walk-through of the entire process makes it as straightforward as possible for users to get started.
While $100 won’t buy you a model that rivals modern commercial offerings, scaling up the process brings significant improvements. For example, a $1,000 version (detailed here) is far more coherent and capable, able to solve simple math and coding problems as well as take multiple-choice tests.
Nanochat is an exciting step toward accessible, customizable AI language models, lowering the barrier for enthusiasts and developers who want to experiment with ChatGPT-like technologies on a budget.
https://hackaday.com/2025/10/20/nanochat-lets-you-build-your-own-hackable-llm/