TFLai/Turkish-Dialog-Dataset
How to use TurkishCodeMan/DeepSeek-R1-Turkish-Dialog-Dataset with Unsloth Studio:
macOS, Linux, WSL:

curl -fsSL https://unsloth.ai/install.sh | sh
# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888
# Then open http://localhost:8888 in your browser
# Search for TurkishCodeMan/DeepSeek-R1-Turkish-Dialog-Dataset to start chatting
Windows (PowerShell):

irm https://unsloth.ai/install.ps1 | iex
# Run Unsloth Studio
unsloth studio -H 0.0.0.0 -p 8888
# Then open http://localhost:8888 in your browser
# Search for TurkishCodeMan/DeepSeek-R1-Turkish-Dialog-Dataset to start chatting
Hugging Face Spaces:

# No setup required
# Open https://huggingface.co/spaces/unsloth/studio in your browser
# Search for TurkishCodeMan/DeepSeek-R1-Turkish-Dialog-Dataset to start chatting
pip install unsloth
from unsloth import FastModel
model, tokenizer = FastModel.from_pretrained(
model_name="TurkishCodeMan/DeepSeek-R1-Turkish-Dialog-Dataset",
max_seq_length=2048,
)
Fine-Tuned Language Model for Turkish Conversations
This model was fine-tuned from DeepSeek-R1 on TFLai/Turkish-Dialog-Dataset. It is optimized for Turkish dialogue generation, chatbots, and text-completion tasks.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "[your-username]/DeepSeek-R1-Turkish-Finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Dialog example
input_text = "Merhaba! Nasılsın?"
inputs = tokenizer.encode(input_text, return_tensors="pt")
outputs = model.generate(inputs, max_length=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
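Note that `model.generate` returns the prompt tokens followed by the continuation, so the decoded string begins with the input text. A minimal sketch of building a multi-turn prompt and stripping the echoed prompt from the decoded output (pure Python, no model needed; the `User:`/`Assistant:` role tags here are illustrative assumptions, not the dataset's actual turn format):

```python
def build_prompt(turns):
    # turns: list of (role, text) pairs; role tags are illustrative only
    lines = [f"{role}: {text}" for role, text in turns]
    lines.append("Assistant:")  # cue the model to answer as the assistant
    return "\n".join(lines)

def extract_reply(decoded, prompt):
    # generate() echoes the prompt; keep only the newly generated text
    reply = decoded[len(prompt):] if decoded.startswith(prompt) else decoded
    return reply.strip()

prompt = build_prompt([("User", "Merhaba! Nasılsın?")])
# Simulate a decoded generation that echoes the prompt:
decoded = prompt + " İyiyim, teşekkürler! Sen nasılsın?"
print(extract_reply(decoded, prompt))  # → İyiyim, teşekkürler! Sen nasılsın?
```

In practice you would pass `tokenizer.decode(outputs[0], skip_special_tokens=True)` as `decoded` and the original `input_text`-derived prompt as `prompt`.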
Base model: deepseek-ai/DeepSeek-R1-Distill-Llama-8B