LLM Reference

Palmyra Large

About

Palmyra Large is a 20-billion-parameter causal, decoder-only language model from Writer, built on a transformer decoder architecture similar to GPT-3. It performs well on nuanced tasks such as sentiment classification and summarization. Trained primarily on English text, it is fast at inference but can produce factually inaccurate output and may reflect biases present in its training data. While the smaller Palmyra Small and Palmyra Base models have been open-sourced on Hugging Face, Palmyra Large is not currently actively maintained or supported.
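Since the smaller Palmyra checkpoints are open-sourced on Hugging Face, they can be loaded with the `transformers` library. A minimal sketch follows; the repo id `Writer/palmyra-base` is assumed from Writer's Hugging Face organization, and the prompt is purely illustrative.

```python
# Hypothetical sketch: text generation with an open-sourced Palmyra
# checkpoint via the Hugging Face transformers library.
# Assumption: the repo id "Writer/palmyra-base" matches Writer's
# published checkpoint on the Hub.

MODEL_ID = "Writer/palmyra-base"

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    # Imported lazily: the checkpoint download is large, and this keeps
    # the sketch importable without the transformers dependency.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

    # Tokenize the prompt and sample a causal continuation.
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize in one sentence: Palmyra is a family of "
                   "decoder-only language models built by Writer."))
```

The same pattern applies to `Writer/palmyra-small` by swapping the repo id; the 20B Palmyra Large itself is not distributed this way.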

Capabilities

Vision, Multimodal, Reasoning, Function Calling, Tool Use, Structured Outputs, Code Execution

Specifications

Family: Palmyra
Released: 2023-03-02
Parameters: 20B
Architecture: Decoder-only
Specialization: General
Training: Fine-tuning

Created by

Writer, a company providing AI-powered writing assistance and optimization.

San Francisco, California, United States
Founded 2020