LLM Reference
GCP Vertex AI

Vicuna 13B 16K on GCP Vertex AI

Vicuna · LMSYS Org

Deployment: Serverless

Capabilities

Vision · Multimodal · Reasoning · Function Calling · Tool Use · JSON Mode · Code Execution

About Vicuna 13B 16K

The Vicuna 13B v1.5 16K model by LMSYS is a conversational AI built on the transformer architecture and fine-tuned from Llama 2. It has 13 billion parameters and a 16K-token context window, making it suitable for tasks such as chatbots and content generation. Fine-tuned on roughly 125,000 conversations collected from ShareGPT, it produces coherent text and engaging dialogue, but it may struggle with domain-specific knowledge and nuanced context, and it can reflect biases present in its training data.

Get Started
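A minimal sketch of calling a Vicuna 13B 16K deployment on Vertex AI. The endpoint ID, project name, and the instance schema (`"prompt"`, `"max_tokens"`) are assumptions and depend on the serving container your deployment uses; the prompt-formatting helper follows the Vicuna v1.5 conversation template.

```python
# Vicuna v1.5 system preamble (from the LMSYS conversation template).
SYSTEM = ("A chat between a curious user and an artificial intelligence "
          "assistant. The assistant gives helpful, detailed, and polite "
          "answers to the user's questions.")

def vicuna_prompt(turns):
    """Format (user, assistant) turns with the Vicuna v1.5 template.

    Pass None as the assistant reply for the final turn to leave the
    prompt open for the model to complete.
    """
    parts = [SYSTEM]
    for user, assistant in turns:
        parts.append(f"USER: {user}")
        if assistant is not None:
            parts.append(f"ASSISTANT: {assistant}</s>")
        else:
            parts.append("ASSISTANT:")
    return " ".join(parts)

# Calling the deployed endpoint (requires google-cloud-aiplatform and
# GCP credentials; the project, region, and endpoint ID below are
# hypothetical placeholders):
#
#   from google.cloud import aiplatform
#   aiplatform.init(project="my-project", location="us-central1")
#   endpoint = aiplatform.Endpoint(
#       "projects/my-project/locations/us-central1/endpoints/123456789")
#   response = endpoint.predict(instances=[{
#       "prompt": vicuna_prompt([("Hello!", None)]),
#       "max_tokens": 256,
#   }])
#   print(response.predictions[0])
```

Keeping the prompt-formatting step separate from the network call makes it easy to verify the exact string sent to the model, which matters because Vicuna's output quality depends on matching its fine-tuning template.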

Model Specs

Released: 2023-10-23
Parameters: 13B
Context: 16K
Architecture: Decoder-only

Related Models on GCP Vertex AI

Provider

GCP Vertex AI

Google Cloud Platform (GCP)

All models on GCP Vertex AI