Tags: Feature Extraction · Transformers · Safetensors · Fairseq · French · pantagruel_uni · fill-mask · data2vec2 · JEPA · text · custom_code
Instructions to use PantagrueLLM/Text_Base_FR_OSCAR with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use PantagrueLLM/Text_Base_FR_OSCAR with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="PantagrueLLM/Text_Base_FR_OSCAR", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("PantagrueLLM/Text_Base_FR_OSCAR", trust_remote_code=True, dtype="auto")
```

- Fairseq
How to use PantagrueLLM/Text_Base_FR_OSCAR with Fairseq:
```python
from fairseq.checkpoint_utils import load_model_ensemble_and_task_from_hf_hub

models, cfg, task = load_model_ensemble_and_task_from_hf_hub(
    "PantagrueLLM/Text_Base_FR_OSCAR"
)
```

- Notebooks
- Google Colab
- Kaggle
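A `feature-extraction` pipeline typically returns, for each input, a nested list of per-token hidden-state vectors (roughly `[1][num_tokens][hidden_dim]`). One common way to turn that into a single sentence embedding is mean pooling. The sketch below is illustrative, not part of the model card: it uses hand-written dummy data in place of real pipeline output so it runs without downloading the model, and the hidden size of 4 is made up for readability.

```python
# Dummy stand-in for pipeline output: one input, three tokens,
# hidden size 4 (illustrative only; the real model's size differs).
dummy_output = [[[0.1, 0.2, 0.3, 0.4],
                 [0.5, 0.6, 0.7, 0.8],
                 [0.9, 1.0, 1.1, 1.2]]]

def mean_pool(features):
    """Average per-token vectors into a single sentence embedding."""
    tokens = features[0]                 # drop the batch dimension
    dim = len(tokens[0])
    return [sum(tok[d] for tok in tokens) / len(tokens) for d in range(dim)]

sentence_vec = mean_pool(dummy_output)
print(len(sentence_vec))                 # one vector of hidden_dim floats
```

With a real pipeline, you would pass `pipe("Une phrase en français.")` and pool its output the same way.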