Instructions for using deepset/gbert-base locally with the Transformers library.
How to use deepset/gbert-base with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="deepset/gbert-base")
```

```python
# Load model directly
from transformers import AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained("deepset/gbert-base", dtype="auto")
```
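A minimal usage sketch for the fill-mask pipeline above. The German example sentence is an assumption for illustration, not taken from the model card; the pipeline returns a list of candidate completions for the `[MASK]` token, each with a score:

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="deepset/gbert-base")

# Hypothetical example sentence; the model fills in the [MASK] token.
predictions = pipe("Die Hauptstadt von Deutschland ist [MASK].")

for p in predictions:
    # Each prediction has the filled-in token, a probability score,
    # and the full completed sequence.
    print(f"{p['token_str']!r}  score={p['score']:.3f}")
```

Each entry in `predictions` is a dict with `token_str`, `score`, `token`, and `sequence` keys, sorted by descending score.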