LLM Reference

Palmyra Med 20B

About

Palmyra-Med-20b is a 20-billion-parameter large language model developed by Writer and tailored for medical applications. It is a causal decoder-only model, predicting text token by token, and has been fine-tuned on a custom-curated dataset blending PubMedQA and MedQA data to improve performance in medical dialogue and question-answering tasks. Its capabilities extend to answering complex medical questions, engaging in dialogue, and potentially aiding clinical decision-making and research.

Palmyra-Med-20b is deprecated and no longer maintained; Writer advises against using it in production and recommends newer models instead. The model primarily operates in English, may exhibit biases inherent in its training data, and requires a single 40GB A100 GPU for operation.
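As a rough illustration of how a causal decoder-only model like this might be loaded and queried, here is a minimal sketch using the Hugging Face transformers library. The repository ID "Writer/palmyra-med-20b" and the question/answer prompt template are assumptions not confirmed by this page, and since the model is deprecated, this is for illustration only.

```python
def build_prompt(question: str) -> str:
    """Format a medical question in an assumed instruction-style template."""
    return f"Question: {question}\nAnswer:"


if __name__ == "__main__":
    # transformers is imported here because loading the full 20B model
    # requires roughly a single 40GB A100 GPU, per the description above.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "Writer/palmyra-med-20b"  # assumed Hugging Face repo ID
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    prompt = build_prompt("What are the common symptoms of hypertension?")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    # Causal decoding: the model predicts the continuation token by token.
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The heavyweight model loading is kept behind the `__main__` guard so the prompt-formatting helper can be reused or tested without GPU resources.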

Capabilities

Multimodal
Function Calling
Tool Use
JSON Mode

Specifications

Released: 2023-08-02
Architecture: Decoder Only
Specialization: General