MFANN-Llama3.1-Abliterated-Slerp-TIES

netcat420

Introduction

MFANN-Llama3.1-Abliterated-Slerp-TIES is a text generation model developed by netcat420. It is built with the transformers and mergekit libraries, specializes in general text generation tasks, and has been evaluated across multiple benchmark datasets.

Architecture

The model is a TIES merge of several base models:

  • netcat420/MFANN-llama3.1-abliterated-v2
  • netcat420/MFANN-llama3.1-abliterated-SLERP-v3.2
  • mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated

The merged model retains the Llama 3.1 8B architecture and is loaded through the transformers library for text generation. It also includes experimental settings such as SATANN mode for advanced cyber operations.
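The TIES method in the model's name combines several fine-tuned checkpoints with a shared base by trimming, sign-electing, and averaging their task vectors. The sketch below illustrates the idea on toy NumPy arrays standing in for real weight tensors; the function name and `density` value are illustrative, and mergekit's actual implementation differs in detail:

```python
import numpy as np

def ties_merge(base, finetuned, density=0.5):
    """Sketch of TIES merging: trim low-magnitude task vectors,
    elect a per-parameter sign, then average the agreeing deltas."""
    deltas = [ft - base for ft in finetuned]

    # 1. Trim: keep only the top-`density` fraction of each delta by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.size))
        threshold = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= threshold, d, 0.0))

    # 2. Elect: pick the dominant sign per parameter across trimmed deltas.
    elected = np.sign(np.sum(trimmed, axis=0))

    # 3. Disjoint merge: average only the deltas agreeing with the elected sign.
    agree = [np.where(np.sign(t) == elected, t, 0.0) for t in trimmed]
    counts = np.maximum(sum((a != 0).astype(float) for a in agree), 1.0)
    return base + sum(agree) / counts
```

In the real merge, each of the base checkpoints listed above contributes one task vector relative to the shared Llama 3.1 base, and this procedure runs per weight tensor.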

Evaluation

The model has been evaluated using various datasets with different shot configurations:

  • IFEval (0-shot): strict accuracy 42.93
  • BBH (3-shot): normalized accuracy 27.6
  • MATH Lvl 5 (4-shot): exact match 5.97
  • GPQA (0-shot): normalized accuracy 5.59
  • MuSR (0-shot): normalized accuracy 4.59
  • MMLU-PRO (5-shot): accuracy 28.13

These results are available on the Open LLM Leaderboard.
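For a single summary number, the six scores can be aggregated as a plain mean (an assumption for illustration; the Open LLM Leaderboard applies its own normalization when averaging):

```python
# Benchmark scores from the model card (Open LLM Leaderboard).
scores = {
    "IFEval (0-shot)": 42.93,
    "BBH (3-shot)": 27.6,
    "MATH Lvl 5 (4-shot)": 5.97,
    "GPQA (0-shot)": 5.59,
    "MuSR (0-shot)": 4.59,
    "MMLU-PRO (5-shot)": 28.13,
}

# Plain mean; the leaderboard's own averaging may differ.
average = sum(scores.values()) / len(scores)
print(f"Average score: {average:.2f}")
```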

Guide: Running Locally

To run the model locally, follow these steps:

  1. Install Prerequisites: Ensure Python and the Hugging Face Transformers library are installed.
  2. Clone the Repository: Download the model from Hugging Face using git clone.
  3. Load the Model: Use the Transformers library to load the model with the appropriate configuration.
  4. Inference: Run text generation tasks using the loaded model.

For optimal performance, it is recommended to use cloud GPUs such as NVIDIA A100 or V100.
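The steps above can be sketched with the transformers library as follows. The repository id is assumed from the model card, and `device_map="auto"` additionally requires the accelerate package; calling `generate()` downloads the full weights on first use:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id assumed from the model card; adjust if the Hub name differs.
MODEL_ID = "netcat420/MFANN-Llama3.1-Abliterated-Slerp-TIES"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model and return a completion for `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # use the checkpoint's native precision
        device_map="auto",    # place layers on available GPU(s)
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

A call such as `generate("Explain model merging in one sentence.")` then runs inference end to end.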

License

The model's license permits both research and commercial use. Users should review the specific terms on the model's Hugging Face page.
