Instructions for using michaelfeil/ct2fast-codegen-16B-multi with libraries, inference providers, notebooks, and local apps.
- Libraries
- Transformers
How to use michaelfeil/ct2fast-codegen-16B-multi with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("michaelfeil/ct2fast-codegen-16B-multi", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
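Since the repository holds CTranslate2-converted weights (hence the "ct2fast" prefix), it can also be loaded through the author's hf-hub-ctranslate2 helper library. A minimal sketch, assuming that package is installed (`pip install hf-hub-ctranslate2`) and a CUDA device is available; the import is deferred so the loader can be defined without the dependency present:

```python
def load_ct2_generator(device: str = "cuda"):
    """Sketch: load the CTranslate2 weights with hf-hub-ctranslate2.

    Assumptions: the hf-hub-ctranslate2 package is installed and, for
    the int8_float16 compute type, a CUDA device is available.
    """
    from hf_hub_ctranslate2 import GeneratorCT2fromHfHub  # deferred import

    return GeneratorCT2fromHfHub(
        model_name_or_path="michaelfeil/ct2fast-codegen-16B-multi",
        device=device,
        compute_type="int8_float16",
    )
```

Once loaded, the generator's `generate` method takes a list of prompt strings, e.g. `model.generate(text=["def hello_world():"])`.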
- Xet hash: 1f77b398f2f6fe8a3048974cdaf16dc9cec7c5bacf74ddd663d70aa130902567
- Size of remote file: 32.1 GB
- SHA256: e5537b669e8bf51067b3f3214e9241c397f08798cf767d8055758fe480b1bc68
Xet efficiently stores large files inside Git by intelligently splitting them into unique chunks, which accelerates uploads and downloads.