PolyCoder-2.7B

NinedayWang

Introduction

PolyCoder-2.7B is a language model with 2.7 billion parameters, designed for code generation and presented in the paper "A Systematic Evaluation of Large Language Models of Code." The model was trained on a 249 GB corpus of code drawn from 12 programming languages.

Architecture

The PolyCoder-2.7B model uses the GPT-NeoX architecture and is implemented in PyTorch. It is tailored specifically to text generation over programming-language source code.
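As a quick sanity check, the checkpoint's configuration can be inspected without loading the full 2.7B weights. This is a minimal sketch assuming the Hugging Face Hub repository name NinedayWang/PolyCoder-2.7B (implied by the uploader above); substitute a local path if the model was downloaded elsewhere.

    from transformers import AutoConfig

    # Inspect the checkpoint's configuration without loading the 2.7B weights.
    # "NinedayWang/PolyCoder-2.7B" is the assumed Hub repository name.
    config = AutoConfig.from_pretrained("NinedayWang/PolyCoder-2.7B")
    print(config.model_type)          # expected to report the GPT-NeoX architecture
    print(config.num_hidden_layers)   # depth of the transformer stack
    print(config.hidden_size)         # width of each layer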

Training

PolyCoder-2.7B was trained on a large corpus of code spanning 12 different programming languages, so the model can generate relevant code snippets across multiple languages rather than being tied to a single programming environment.

Guide: Running Locally

To run PolyCoder-2.7B locally, follow these steps:

  1. Install the Transformers Library: Ensure the required version of the Transformers library is installed:

    pip install transformers==4.23.0
    
  2. Hardware Requirements: Due to the model's size, it is recommended to use a machine with a powerful GPU. Consider using cloud services that offer access to GPUs, such as AWS, Google Cloud, or Azure.

  3. Download and Initial Setup: Clone the model repository and set up the environment as per the instructions provided in the repository.

  4. Run the Model: Use the inference scripts available in the repository to generate code or perform other text-generation tasks. The model can also be loaded directly through the Transformers library, as in the sketch below.
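The following is a minimal sketch of loading and prompting the model with the Transformers library. It assumes the Hugging Face Hub checkpoint NinedayWang/PolyCoder-2.7B and a CUDA-capable GPU; half precision is used only to reduce memory and is an assumption, not a requirement stated by the authors.

    import torch
    from transformers import AutoTokenizer, AutoModelForCausalLM

    MODEL_ID = "NinedayWang/PolyCoder-2.7B"  # assumed Hub repository; use a local path if preferred

    device = "cuda" if torch.cuda.is_available() else "cpu"
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        # fp16 on GPU to reduce memory; fall back to fp32 on CPU
        torch_dtype=torch.float16 if device == "cuda" else torch.float32,
    ).to(device)

    # Prompt with the start of a function and let the model complete it
    prompt = "def binary_search(arr, low, high, x):"
    inputs = tokenizer(prompt, return_tensors="pt").to(device)
    outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Greedy decoding is used here for reproducibility; sampling parameters such as temperature and top_p can be passed to generate for more varied completions.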

License

For licensing details and usage guidelines for the PolyCoder-2.7B model, refer to the official repository: Code-LMs GitHub. Make sure to comply with any citation requirements specified by the authors.
