Pantheon-RP-Pure-1.6.2-22B-Small

Gryphe

Introduction

The Pantheon-RP-Pure-1.6.2-22B-Small model is part of Gryphe's Pantheon series, designed to enhance roleplay experiences by introducing a variety of personas that are invoked through simple activation phrases. This "Pure" variant focuses on roleplay and omits the story-writing and GPT-4 datasets used in other Pantheon releases.

Architecture

This model is based on the Mistral-Small architecture and is fine-tuned with various datasets, including diverse entries from the SlimOrca Sonnet dataset and the Sonnet 3.5 Pantheon Persona dataset. It is designed to handle both Markdown and novel-style inputs.

Training

Because Mistral Small is already instruct-tuned, the model required an unusual finetuning process. Its training data retains character names, and general-purpose inputs (coding, summarization, and other assistant-style queries) are routed through the Lyra the Assistant persona alongside the roleplay-specific data.
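As a rough illustration of how persona routing works in practice, a conversation might simply address a persona by name to activate it. The exact phrasing below is hypothetical; consult the model card on Hugging Face for the canonical activation phrases:

```
User: Lyra, can you summarize the last three messages of our story?
Assistant (as Lyra): Of course! So far, your party has...
```

Roleplay personas in the Pantheon series are activated the same way, by mentioning the persona's name or trigger phrase in the prompt.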

Guide: Running Locally

  1. Setup Environment: Ensure Python and necessary libraries are installed. Consider using a virtual environment.
  2. Download Model: Retrieve the model from the Hugging Face repository.
  3. Load Pre-trained Model: Utilize Hugging Face's transformers library to load the model.
  4. Configure Parameters: Adjust parameters such as temperature and repetition penalty for optimal performance.
  5. Run Inference: Use the prompt format to interact with the model.
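The steps above can be sketched with the Hugging Face transformers library. The repository id below is an assumption inferred from the model name (verify it on the Hub), and the sampling values are starting points rather than recommended settings. Running a 22B model requires substantial GPU memory or quantization:

```python
# Minimal sketch of steps 2-5, assuming the repo id below exists on the
# Hugging Face Hub. Requires the `transformers` and `torch` packages and
# a GPU with enough memory for a 22B model (or a quantized variant).

MODEL_ID = "Gryphe/Pantheon-RP-Pure-1.6.2-22b-Small"  # assumed repo id

# Step 4: sampling parameters -- illustrative starting points, tune to taste.
GENERATION_KWARGS = {
    "max_new_tokens": 256,
    "temperature": 0.7,
    "repetition_penalty": 1.05,
    "do_sample": True,
}

def chat(user_message: str) -> str:
    """Steps 2-3 and 5: download/load the model, then run inference."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    # Use the tokenizer's built-in chat template for the prompt format.
    inputs = tokenizer.apply_chat_template(
        [{"role": "user", "content": user_message}],
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    outputs = model.generate(inputs, **GENERATION_KWARGS)
    # Decode only the newly generated tokens.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

# Example call (downloads the full model weights on first run):
# print(chat("Lyra, summarize the plot of our story so far."))
```

Loading the heavy model is deferred into the `chat` function so the parameters can be adjusted before any weights are downloaded.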

Cloud GPUs: A 22B model generally needs a high-memory GPU; if local hardware falls short, consider cloud-based GPUs from providers like AWS, Google Cloud, or Azure.

License

The model is licensed under the Mistral Research License (MRL), version 0.1. More details can be found in the license document.
