Instructions for using NbAiLab/nb-bert-base with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use NbAiLab/nb-bert-base with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="NbAiLab/nb-bert-base")

# Load the model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("NbAiLab/nb-bert-base", dtype="auto")
```

- Inference
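The fill-mask pipeline returns a list of candidate dicts, each with a `token_str`, `score`, and completed `sequence`. A minimal sketch of picking the top candidate, using a hypothetical result list (the Norwegian sentence and scores below are illustrative, not actual model output):

```python
# Each fill-mask prediction is a dict with "sequence", "token_str", and "score".
def top_prediction(predictions):
    """Return (token, score) for the highest-scoring candidate."""
    best = max(predictions, key=lambda p: p["score"])
    return best["token_str"], best["score"]

# Hypothetical output for a sentence like "Dette er en [MASK] dag.":
sample = [
    {"sequence": "Dette er en fin dag.", "token_str": "fin", "score": 0.42},
    {"sequence": "Dette er en god dag.", "token_str": "god", "score": 0.31},
]

print(top_prediction(sample))  # → ('fin', 0.42)
```

In practice you would pass the masked sentence to `pipe(...)` and feed its return value to a helper like this.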
- Notebooks
- Google Colab
- Kaggle