LLM Reference

Reka Flash

About

Reka Flash is a 21-billion-parameter multimodal large language model developed by Reka AI. It can process and reason over text, image, video, and audio inputs, and its 128K-token context window lets it handle long inputs efficiently. The model supports 32 languages and is built on a multimodal transformer architecture. Reka reports state-of-the-art performance for its compute class, frequently surpassing larger models, though independent validation of these claims is limited. The model is available via Reka's API and chat interface. Training used a mix of public and proprietary datasets with an emphasis on code and STEM content, followed by instruction tuning and reinforcement learning from human feedback.

Capabilities

Multimodal, Function Calling, Tool Use, JSON Mode
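As a sketch of how a JSON-mode chat request for this model might be composed, the snippet below builds a request payload. The field names (`messages`, `response_format`, etc.) are illustrative assumptions for the sketch, not the documented Reka API schema:

```python
import json

# Illustrative request payload for a JSON-mode chat completion.
# Field names are assumptions, not the official Reka API schema.
payload = {
    "model": "reka-flash-20240226",  # API version listed below
    "messages": [
        {"role": "user",
         "content": "List three primary colors as a JSON array."}
    ],
    # Hypothetical switch that would constrain output to valid JSON
    "response_format": {"type": "json_object"},
}

body = json.dumps(payload)
print(body)
```

The serialized `body` would then be sent as the HTTP request body to the provider's chat endpoint, with authentication headers as required by that provider.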

Providers (2)

Provider          Input (per 1M)  Output (per 1M)  Type
Snowflake Cortex  $0.90           $0.90            Serverless
Reka Platform     -               -                Serverless
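Given the Snowflake Cortex rates above ($0.90 per 1M tokens for both input and output), a request's cost can be estimated with a small helper; this is a minimal sketch, not a provider billing tool:

```python
def cost_usd(input_tokens: int, output_tokens: int,
             input_rate: float = 0.90, output_rate: float = 0.90) -> float:
    """Estimate USD cost from token counts and per-1M-token rates.

    Default rates match the Snowflake Cortex pricing listed above.
    """
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# Example: filling the full 128K context and receiving a 1K-token reply.
print(round(cost_usd(128_000, 1_000), 4))  # -> 0.1161
```

At these rates, even a maximally long 128K-token prompt costs only about twelve cents, which illustrates why per-1M pricing is quoted rather than per-request pricing.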

API Versions

reka-flash-20240226

Specifications

Family            Reka
Released          2024-02-12
Parameters        21B
Context           128K
Architecture      Decoder-only
Knowledge cutoff  2023-11
Specialization    General