Collections

Discover the best community collections!

Collections trending this week
ColBERT-Zero 🐶
First large-scale fully pre-trained ColBERT model using only public data, outperforming GTE-ModernColBERT and GTE-ModernBERT
Cerebras REAP
Sparse MoE models compressed with the REAP (Router-weighted Expert Activation Pruning) method
DINOv3
DINOv3: foundation models producing excellent dense features, outperforming SotA without fine-tuning - https://arxiv.org/abs/2508.10104
Claude 4.5 Opus
Distilled models and datasets for Claude 4.5 Opus.