Company

Hugging Face

HF
The central hub of the open-source AI ecosystem. Hugging Face hosts a model repository (over 500,000 models), datasets (100,000+ datasets), the Transformers library for using models in Python, and Spaces for deploying demos. It is to AI what GitHub is to code: the place where the community shares, discovers, and collaborates on models.

Why It Matters

If you use open-weight models, you use Hugging Face. Every Llama, Mistral, Qwen, and community fine-tune is downloaded from there. The Transformers library is the de facto standard for loading and running models. The Hub's model cards, discussions, and leaderboards shape community knowledge. Hugging Face is infrastructure: most of the open-source AI ecosystem depends on it.
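The download path described above can be sketched with the `huggingface_hub` client library. This is a minimal sketch, assuming the `huggingface_hub` package is installed and the Hub is reachable over the network; the repo id is a real public model used purely for illustration.

```python
# Minimal sketch: fetching a single file from a public Hub model repo.
# Assumes the huggingface_hub package is installed and network access works.
from huggingface_hub import hf_hub_download

# Downloads (and locally caches) just the config file of the repo,
# without pulling the full model weights.
config_path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")
print(config_path)
```

The same call pattern is what Transformers uses under the hood when it resolves a repo id to cached files on disk.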

Deep Dive

Hugging Face started as a chatbot company and pivoted to become the platform layer for open-source AI. The Transformers library provides a unified API for thousands of model architectures — you can load a BERT model or a Llama model with the same AutoModel.from_pretrained() call. This standardization dramatically lowered the barrier to using new models: instead of each lab releasing its own framework, models are shared in a common format on the Hub.
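The unified API described above can be sketched as follows. This assumes the `transformers` and `torch` packages are installed and the Hub is reachable; swapping the BERT repo id for a Llama repo id would use the identical calls, since the concrete architecture is resolved from the repo's `config.json`.

```python
# Minimal sketch: the same Auto* classes load any supported architecture.
# Assumes transformers and torch are installed and the Hub is reachable.
from transformers import AutoModel, AutoTokenizer

# The architecture (here BERT) is picked from the repo's config.json,
# not from the calling code.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("Hugging Face unifies model loading.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

This is the standardization the paragraph describes: one loading interface across thousands of architectures, with the format details living in the repo rather than in user code.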

Beyond the Hub

Hugging Face's ecosystem extends beyond model hosting: Datasets for sharing and loading training data, Evaluate for benchmarking, Accelerate for distributed training, PEFT for parameter-efficient fine-tuning, TRL for RLHF/DPO training, and Inference Endpoints for deploying models to production. They also offer a free Inference API for testing models and Spaces for deploying Gradio/Streamlit apps. The business model is enterprise features (private repos, dedicated compute, security) layered on top of the free community platform.

Related Concepts

In The News

Hugging Face TRL v1.0 Standardizes Post-Training Into Production Pipeline
Apr 01, 2026