gpt2-finetuned-recipes-cooking_v2

mrm8488

Introduction

gpt2-finetuned-recipes-cooking_v2 is a model available on Hugging Face, fine-tuned for generating cooking recipes. Built on the GPT-2 architecture, it is designed to produce diverse and creative recipe text in English.

Architecture

This model is based on GPT-2, a decoder-only transformer known for generating coherent, contextually relevant text. It has been fine-tuned specifically for recipe generation, improving its performance in that domain.
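As a decoder-only transformer, GPT-2 applies a causal attention mask so each token can attend only to earlier positions, which is what makes left-to-right text generation possible. A minimal stdlib-only sketch of such a mask (a toy illustration, not the model's actual implementation):

```python
# Toy causal attention mask of the kind used in GPT-2's decoder blocks:
# position i may attend only to positions j <= i (1 = attend, 0 = masked).
def causal_mask(seq_len):
    return [[1 if j <= i else 0 for j in range(seq_len)]
            for i in range(seq_len)]

mask = causal_mask(4)
# Row 0 attends only to itself; the last row attends to every position.
```

In the real model this mask is applied to the attention scores before the softmax, so future tokens contribute zero weight.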

Training

The model was fine-tuned on a dataset of cooking recipes, allowing it to generate text relevant to culinary content. Fine-tuning adjusted the pre-trained GPT-2 weights to the narrower requirements of recipe generation.
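Fine-tuning optimizes the same objective as GPT-2's pre-training: next-token cross-entropy over the training corpus. A toy stdlib-only sketch of that loss at a single position (illustrative only; the real model computes this over a vocabulary of roughly 50,000 tokens with learned logits):

```python
import math

def next_token_loss(logits, target_index):
    """Cross-entropy of a softmax over `logits` against the true next token."""
    m = max(logits)                           # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return -math.log(probs[target_index])

# Three-token toy vocabulary: the correct next token has the highest logit,
# so the loss is small; pointing at a low-logit token gives a larger loss.
loss = next_token_loss([2.0, 0.5, -1.0], target_index=0)
```

Training nudges the weights so the logit of the actual next token in each recipe rises, driving this loss down across the corpus.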

Guide: Running Locally

To run the gpt2-finetuned-recipes-cooking_v2 model locally, follow these basic steps:

  1. Install Dependencies: Ensure you have Python installed, then install PyTorch and the transformers library from Hugging Face:

    pip install torch transformers
    
  2. Load the Model: Download the tokenizer and model from the Hugging Face Hub:

    from transformers import GPT2LMHeadModel, GPT2Tokenizer
    
    tokenizer = GPT2Tokenizer.from_pretrained("mrm8488/gpt2-finetuned-recipes-cooking_v2")
    model = GPT2LMHeadModel.from_pretrained("mrm8488/gpt2-finetuned-recipes-cooking_v2")
    
  3. Generate Text: Input a prompt to generate a recipe:

    input_text = "HuggingFace Cake:"
    input_ids = tokenizer.encode(input_text, return_tensors='pt')
    output = model.generate(input_ids, max_length=150, num_return_sequences=1)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
    
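With its default settings, `model.generate` in step 3 performs greedy decoding: it repeatedly feeds the growing sequence back through the model and appends the highest-probability next token until `max_length` is reached. A toy sketch of that loop, using a hypothetical `next_token_scores` function as a stand-in for the model's forward pass (not part of the transformers API):

```python
def next_token_scores(sequence, vocab):
    # Stand-in for a model forward pass: favor the token after the last one,
    # cycling through the toy vocabulary.
    last = vocab.index(sequence[-1])
    return [1.0 if i == (last + 1) % len(vocab) else 0.0
            for i in range(len(vocab))]

def greedy_generate(prompt, vocab, max_length):
    sequence = list(prompt)
    while len(sequence) < max_length:
        scores = next_token_scores(sequence, vocab)
        best = scores.index(max(scores))      # argmax = the greedy choice
        sequence.append(vocab[best])
    return sequence

vocab = ["mix", "bake", "serve"]
out = greedy_generate(["mix"], vocab, max_length=4)
# → ["mix", "bake", "serve", "mix"]
```

For more varied recipes, `generate` also accepts sampling parameters such as `do_sample=True`, `top_k`, and `temperature`, which replace the argmax step with a draw from the predicted distribution.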

For optimal performance, consider using cloud GPU services such as AWS, GCP, or Azure to handle the computational load.

License

The model is shared under the MIT License. This allows for broad use, modification, and distribution, ensuring that developers can incorporate it into their projects with minimal restrictions.
