A series of transformer-based language models modeled on the GPT-3 architecture. The aim of the project is to train these models and release them as open source, so that everyone can use them publicly, for free.
There are currently two repos under development – GPT-Neo and GPT-NeoX. Check them out!
The team is already running experiments with things like alternative architectures and attention types, so you can expect a few models to be released before Neo itself is ready!
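
Once checkpoints are released, using them should work like any other causal language model. Here is a minimal sketch assuming the weights are published on the Hugging Face Hub under an ID like `EleutherAI/gpt-neo-1.3B` (illustrative; check the repos for the actual releases):

```python
# Minimal sketch: loading a released checkpoint with Hugging Face transformers.
# The model ID below is an assumption about where/how the weights are published.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-neo-1.3B"  # assumed/illustrative checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "EleutherAI is building open source language models"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```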