How to use stmnk/codet5-small-code-summarization-python with Transformers:

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("stmnk/codet5-small-code-summarization-python")
model = AutoModelForSeq2SeqLM.from_pretrained("stmnk/codet5-small-code-summarization-python")
```
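The loader above only instantiates the tokenizer and model; a minimal end-to-end summarisation call might look like the sketch below. The prompt format and generation settings (`num_beams`, `max_length`) are assumptions for illustration, not values published by the model author.

```python
# Sketch: summarise a Python function with the fine-tuned checkpoint.
# Generation settings here are illustrative assumptions, not the
# author's published configuration.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "stmnk/codet5-small-code-summarization-python"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

code = """def add(a, b):
    return a + b"""

# Tokenize the source code and generate a natural-language summary
inputs = tokenizer(code, return_tensors="pt", truncation=True, max_length=512)
output_ids = model.generate(**inputs, max_length=48, num_beams=4)
summary = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(summary)
```

Beam search tends to give more fluent one-line docstrings than greedy decoding here, at the cost of a slower generate call.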
---
language:
- py
- en
thumbnail: url to a thumbnail used in social sharing
tags:
- Code2TextGeneration
- Code2TextSummarisation
license: apache-2.0
datasets:
- code_x_glue_ct_code_to_text # python split
metrics:
- code-x-bleu
---
Pretrained model: https://huggingface.co/Salesforce/codet5-small

Fine-tuning dataset: https://huggingface.co/datasets/code_x_glue_ct_code_to_text (only the Python split)

Official inference checkpoint (for comparison; base size, not small): https://storage.googleapis.com/sfr-codet5-data-research/finetuned_models/summarize_python_codet5_base.bin

For fine-tuning process metrics, see this W&B report.
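The code-x-bleu metric above refers to CodeXGLUE's smoothed sentence-level BLEU for code-to-text. As a rough, self-contained sketch (a simplification with add-one smoothing, not the official CodeXGLUE evaluation script), smoothed BLEU-4 can be computed like this:

```python
# Simplified smoothed sentence-level BLEU-4 (add-one smoothing).
# This is an illustrative stand-in, NOT the official CodeXGLUE scorer.
from collections import Counter
import math

def ngrams(tokens, n):
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def smoothed_bleu4(reference, candidate):
    ref, cand = reference.split(), candidate.split()
    log_prec = 0.0
    for n in range(1, 5):
        ref_counts = Counter(ngrams(ref, n))
        cand_counts = Counter(ngrams(cand, n))
        # clipped n-gram overlap between candidate and reference
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(len(cand) - n + 1, 0)
        # add-one smoothing keeps a zero-overlap order from zeroing the score
        log_prec += math.log((overlap + 1) / (total + 1))
    # brevity penalty for candidates shorter than the reference
    bp = 1.0 if len(cand) >= len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(log_prec / 4)
```

A candidate identical to the reference scores 1.0; a correct prefix of the reference is penalised by the brevity penalty rather than by n-gram precision.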