Tencent Hunyuan Large


Introduction

Hunyuan-Large is an open-source Mixture of Experts (MoE) model developed by Tencent, designed to tackle the challenges of optimizing resource consumption while maintaining high performance in large language models. With 389 billion total parameters and 52 billion active parameters, it is currently the largest open-source Transformer-based MoE model. The model leverages high-quality synthetic data, KV cache compression, expert-specific learning rate scaling, and long-context processing capabilities to enhance performance across various tasks.

Architecture

The Hunyuan-Large model utilizes several innovative techniques:

  • High-Quality Synthetic Data: Enhances the model's capability to learn richer representations and generalize better to unseen data.
  • KV Cache Compression: Employs Grouped Query Attention (GQA) and Cross-Layer Attention (CLA) to reduce memory and computational overhead.
  • Expert-Specific Learning Rate Scaling: Assigns different learning rates to different experts for effective learning.
  • Long-Context Processing Capability: Supports long text sequences, with pre-trained models handling up to 256K tokens and instruct models supporting up to 128K.
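To illustrate the KV cache compression idea behind GQA, here is a minimal NumPy sketch in which groups of query heads share a single key/value head, so the cached K/V tensors shrink by the group factor. The head counts and dimensions below are illustrative toy values, not Hunyuan-Large's actual configuration.

```python
import numpy as np

def gqa_attention(x, wq, wk, wv, n_q_heads, n_kv_heads):
    """Toy grouped-query attention: n_q_heads query heads share
    n_kv_heads key/value heads (group size = n_q_heads // n_kv_heads)."""
    seq, d_model = x.shape
    d_head = d_model // n_q_heads
    group = n_q_heads // n_kv_heads

    # Project inputs; K/V have fewer heads than Q, shrinking the KV cache.
    q = (x @ wq).reshape(seq, n_q_heads, d_head)
    k = (x @ wk).reshape(seq, n_kv_heads, d_head)
    v = (x @ wv).reshape(seq, n_kv_heads, d_head)

    out = np.empty_like(q)
    for h in range(n_q_heads):
        kv = h // group  # map each query head to its shared KV head
        scores = q[:, h, :] @ k[:, kv, :].T / np.sqrt(d_head)
        scores -= scores.max(axis=-1, keepdims=True)  # stable softmax
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)
        out[:, h, :] = w @ v[:, kv, :]
    return out.reshape(seq, d_model)

rng = np.random.default_rng(0)
seq, d_model, n_q, n_kv = 4, 32, 8, 2
d_head = d_model // n_q
x = rng.standard_normal((seq, d_model))
wq = rng.standard_normal((d_model, d_model)) * 0.1
# K/V projections are smaller: d_model -> n_kv * d_head
wk = rng.standard_normal((d_model, n_kv * d_head)) * 0.1
wv = rng.standard_normal((d_model, n_kv * d_head)) * 0.1
y = gqa_attention(x, wq, wv=wv, wk=wk, n_q_heads=n_q, n_kv_heads=n_kv)
```

With 8 query heads sharing 2 KV heads, the cached K/V tensors are a quarter the size of full multi-head attention; CLA reduces the cache further by sharing it across layers.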

Benchmarks

The model undergoes extensive benchmarking to validate its effectiveness and safety across various languages and tasks. Hunyuan-Large achieves superior performance in commonsense understanding, reasoning, and classical NLP tasks, as well as mathematics datasets, surpassing other models in several benchmarks like MMLU, CMMLU, and C-Eval.

Guide: Running Locally

  1. Installation:

    • Clone the repository from GitHub.
    • Install dependencies using pip install -r requirements.txt.
  2. Model Download:

    • Download the model weights (the checkpoints are distributed via the Hugging Face Hub).
  3. Inference:

    • Use the provided scripts for inference and deployment with TRT-LLM and vLLM.
  4. Hardware Recommendations:

    • For optimal performance, use cloud GPUs such as NVIDIA A100 or V100 available on platforms like AWS, Azure, or Google Cloud.
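The hardware recommendation above is driven largely by KV cache size at long context. A rough back-of-envelope estimate shows why GQA and CLA matter; the layer count, head counts, and head dimension below are illustrative assumptions, not the published Hunyuan-Large configuration, and the CLA factor assumes the cache is shared across pairs of layers.

```python
def kv_cache_gib(seq_len, n_layers, n_kv_heads, d_head, bytes_per=2):
    """KV cache size in GiB: 2x for keys and values, fp16 = 2 bytes/element."""
    return 2 * seq_len * n_layers * n_kv_heads * d_head * bytes_per / 2**30

seq = 256_000              # pretrained long-context window (256K tokens)
layers, d_head = 64, 128   # illustrative assumptions

mha = kv_cache_gib(seq, layers, n_kv_heads=64, d_head=d_head)  # full multi-head
gqa = kv_cache_gib(seq, layers, n_kv_heads=8, d_head=d_head)   # grouped queries
cla = gqa / 2              # assumed: cross-layer sharing halves it again

print(f"MHA: {mha:.1f} GiB  GQA: {gqa:.1f} GiB  GQA+CLA: {cla:.1f} GiB")
```

Under these toy assumptions a 256K-token cache drops from hundreds of GiB with full multi-head attention to a few tens of GiB with GQA and CLA, which is what makes long-context serving feasible on a small number of data-center GPUs.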

License

The model is released under the Tencent License. For detailed license terms, refer to the license file.
