Llama 3.2 1B Instruct MLXTuned
Introduction
mlx-community/Llama-3.2-1B-Instruct-MLXTuned is a fine-tuned text generation model available on Hugging Face. It is a transformer-based model optimized for generating human-like text in eight languages. The model is compatible with the MLX library and is built upon the meta-llama/Llama-3.2-1B-Instruct model.
Architecture
The model architecture is based on LLaMA (Large Language Model Meta AI), whose base weights were originally released for PyTorch. It is designed for efficient text generation, with a focus on conversational and instruction-following outputs. The model is tuned and packaged for MLX, Apple's machine-learning framework for Apple silicon, which provides tooling for converting and running language models.
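To make the conversion step concrete, the sketch below shows how a Hugging Face checkpoint is typically converted into MLX format with mlx-lm. It is an illustrative sketch only: the output directory name and the quantization setting are assumptions, not a record of how this particular repository was produced, and the convert arguments may differ slightly between mlx-lm versions.

    # Illustrative sketch: converting the base checkpoint to MLX format with mlx-lm.
    # The output path and quantization choice are placeholders, not how this repo was built.
    from mlx_lm import convert

    convert(
        hf_path="meta-llama/Llama-3.2-1B-Instruct",   # base model on Hugging Face
        mlx_path="llama-3.2-1b-instruct-mlx",         # example output directory
        quantize=False,                               # set True to emit a quantized variant
    )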
Training
This model, originally developed by Meta, was further fine-tuned by the MLX community to improve its performance on specific text generation tasks. Fine-tuning adjusts the model's weights on additional data so that its outputs are better suited to more specialized language tasks or datasets.
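The model card does not state how this fine-tuning was carried out. Purely as an illustration of the kind of LoRA fine-tuning run that mlx-lm exposes through its command-line entry point, the sketch below launches a training job; the dataset path, iteration count, and flags are assumptions and do not describe how this checkpoint was actually produced.

    # Hypothetical sketch of a LoRA fine-tuning run via mlx-lm's CLI entry point.
    # Dataset path and iteration count are placeholders; flags may vary by mlx-lm version.
    import subprocess

    subprocess.run(
        [
            "python", "-m", "mlx_lm.lora",
            "--model", "meta-llama/Llama-3.2-1B-Instruct",
            "--train",
            "--data", "path/to/dataset",  # directory containing train/valid JSONL files
            "--iters", "600",
        ],
        check=True,
    )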
Guide: Running Locally
To run the Llama-3.2-1B-Instruct-MLXTuned model locally, follow these steps:
- Install the Required Libraries:

  pip install mlx-lm
- Load the Model:

  from mlx_lm import load, generate

  model, tokenizer = load("mlx-community/Llama-3.2-1B-Instruct-MLXTuned")
- Generate Text:

  prompt = "hello"

  if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
      messages = [{"role": "user", "content": prompt}]
      prompt = tokenizer.apply_chat_template(
          messages, tokenize=False, add_generation_prompt=True
      )

  response = generate(model, tokenizer, prompt=prompt, verbose=True)
- Consider Cloud GPUs: For better performance, consider using cloud-based GPU services like AWS, Google Cloud, or Azure to handle the computational demands of the model.
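For convenience, the steps above can be combined into a single script. The snippet below is a minimal sketch assuming a recent mlx-lm release; the prompt and the max_tokens value are arbitrary examples, and the keyword arguments accepted by generate may vary slightly between versions.

    # Minimal end-to-end sketch combining the steps above (assumes mlx-lm is installed).
    from mlx_lm import load, generate

    model, tokenizer = load("mlx-community/Llama-3.2-1B-Instruct-MLXTuned")

    prompt = "Write a haiku about autumn."  # example prompt

    # Apply the chat template when the tokenizer provides one, as in the steps above.
    if hasattr(tokenizer, "apply_chat_template") and tokenizer.chat_template is not None:
        messages = [{"role": "user", "content": prompt}]
        prompt = tokenizer.apply_chat_template(
            messages, tokenize=False, add_generation_prompt=True
        )

    # max_tokens is an illustrative value; supported arguments may differ by version.
    response = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
    print(response)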
License
The model is distributed under the Llama 3.2 Community License Agreement. Users are granted a non-exclusive, worldwide, non-transferable, and royalty-free limited license to use, reproduce, distribute, and modify the Llama Materials. Usage is subject to compliance with the Acceptable Use Policy and additional commercial terms if applicable.