LLM Reference

About

GPT-JT is a series of large language models derived from a fine-tuned version of EleutherAI's GPT-J 6B model. These models were trained with a decentralized algorithm that remains efficient even over networks with relatively slow interconnects, making better use of diverse hardware resources. The training process integrates several open-source methodologies and datasets, including Google Research's UL2 training objective, Chain-of-Thought prompting, BigScience's Public Pool of Prompts (P3), and AllenAI's Natural Instructions (NI). As a result, GPT-JT models perform strongly on classification benchmarks and outperform some models with significantly larger parameter counts. Importantly, the models are released as open source, inviting community participation in further improvements.
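Because the weights are openly released, the model can be loaded locally for zero-shot classification. Below is a minimal sketch using the Hugging Face transformers library; the model ID togethercomputer/GPT-JT-6B-v1 and the sentiment prompt are assumptions for illustration, not an official usage recipe.

    from transformers import AutoTokenizer, AutoModelForCausalLM

    # Assumed Hugging Face model ID for the open-source GPT-JT release.
    model_id = "togethercomputer/GPT-JT-6B-v1"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # GPT-JT's instruction tuning (P3/NI) makes zero-shot classification prompts effective.
    prompt = (
        "Classify the sentiment of the sentence as positive or negative.\n"
        "Sentence: The movie was a delight from start to finish.\n"
        "Sentiment:"
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=5, do_sample=False)
    # Decode only the newly generated tokens, skipping the echoed prompt.
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))

Greedy decoding (do_sample=False) is used here because classification-style prompts expect a short deterministic label rather than sampled text.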


Details

Researcher: Together.ai
Models: 3