Open to Collab

DedeProGames PRO

AI & ML interests

Thinking and Agentic Finetuning

Recent Activity

reacted to SeaWolf-AI's post with 🔥 about 9 hours ago
🔥 128 Blackwell GPUs — Thank You, Hugging Face

I've been awarded 128 NVIDIA Blackwell GPUs through NIPA (Korea's National IT Industry Promotion Agency). Sharing this here first — because Hugging Face is where it all started.

I design LLM architectures from scratch. HF was my lab — dissecting Transformers internals, analyzing thousands of checkpoints, iterating on Spaces with global feedback. Our FINAL Bench reached #5 globally in HF dataset popularity, and this research is exactly what earned the GPU grant.
👉 https://huggingface.co/spaces/FINAL-Bench/Leaderboard

These 128 Blackwells will scale AETHER-Net — our Proto-AGI architecture (Emergence Engine · Meta-Cognition · SLAI · Multi-Intelligence · Synergy & Critique) — validated at 0.8B with MoE expansion to 2.1B params. Next stop: 166B.

People I must thank:
@John6666 — Guardian of this ecosystem. Never misses a forum question, interested in every project, active 24/7. I've genuinely wondered if you're a machine. Remarkable.
@bartowski — Master of quantization. The hidden infrastructure of open-source LLMs. Countless experiments are possible thanks to you.
@SaylorTwift — You see what others miss. Insight that cuts to the essence. Deep respect.

My promise: AETHER-Net design docs, training recipes, checkpoints, and failure logs — all shared here openly.

🤗 Thank you, Hugging Face. Let's turn the next page together. 🚀

vidraft · VIDRAFT
#OpenScience #HuggingFace #ProtoAGI #AETHER #LLMArchitecture #Blackwell #NIPA
reacted to SeaWolf-AI's post with 🤝 about 9 hours ago
updated a model about 14 hours ago
OrionLLM/GRM2-3b

Organizations

Hugging Science · Orion LLM Labs