Releasing v1 of GPT-JT powered by open-source AI


“With a new decentralized training algorithm, we fine-tuned GPT-J (6B) on 3.53 billion tokens, resulting in GPT-JT (6B), a model that outperforms many 100B+ parameter models on classification benchmarks.”
Read more at TOGETHER…
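
For readers who want to try the model, here is a minimal sketch of prompting GPT-JT for a classification task through the Hugging Face transformers library. The checkpoint id `togethercomputer/GPT-JT-6B-v1`, the example prompt, and the generation settings are assumptions for illustration, not details taken from the announcement.

```python
# Minimal sketch: classification-style prompting with GPT-JT.
# Assumes the checkpoint is published as "togethercomputer/GPT-JT-6B-v1";
# verify the repo id on Hugging Face before running.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "togethercomputer/GPT-JT-6B-v1"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# GPT-JT is pitched at classification benchmarks, so frame the task
# as a short instruction followed by the text to classify.
prompt = (
    "Classify the sentiment of the sentence as positive or negative.\n"
    "Sentence: The fine-tuning run finished ahead of schedule.\n"
    "Sentiment:"
)
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=3)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:],
    skip_special_tokens=True,
))
```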