Hugging Face

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Pintoza (talk | contribs) at 00:56, 8 July 2023 (Added Template: Differentiable Computing). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Hugging Face, Inc.
Company type: Private
Industry: Artificial intelligence, machine learning, software development
Founded: 2016 in New York City
Headquarters: New York City, U.S.
Area served: Worldwide
Key people:
  • Clément Delangue (CEO)
  • Julien Chaumond (CTO)
  • Thomas Wolf (CSO)
Products: Transformers, Datasets, Spaces
Revenue: US$15 million (2022)
Number of employees: 170 (2023)
Website: huggingface.co

Hugging Face, Inc. is an American company that develops tools for building applications using machine learning.[1] It is most notable for its Transformers library built for natural language processing applications and its platform that allows users to share machine learning models and datasets.

History

The company was founded in 2016 by French entrepreneurs Clément Delangue, Julien Chaumond, and Thomas Wolf, originally as the developer of a chatbot app targeted at teenagers.[2] After open-sourcing the model behind the chatbot, the company pivoted to focus on being a platform for machine learning.

In March 2021, Hugging Face raised $40 million in a Series B funding round.[3]

On April 28, 2021, the company launched the BigScience Research Workshop in collaboration with several other research groups to release an open large language model.[4] In 2022, the workshop concluded with the announcement of BLOOM, a multilingual large language model with 176 billion parameters.[5]

On December 21, 2021, the company announced its acquisition of Gradio, a software library used to make interactive browser demos of machine learning models.[6]

On May 5, 2022, the company announced its Series C funding round led by Coatue and Sequoia.[7] The company received a $2 billion valuation.

On May 13, 2022, the company introduced its Student Ambassador Program to help fulfill its mission to teach machine learning to 5 million people by 2023.[8]

On May 26, 2022, the company announced a partnership with Graphcore to optimize its Transformers library for the Graphcore IPU.[9]

On August 3, 2022, the company announced the Private Hub, an enterprise version of its public Hugging Face Hub that supports SaaS or on-premise deployment.[10]

In February 2023, the company announced a partnership with Amazon Web Services (AWS) that would make Hugging Face's products available to AWS customers as building blocks for their custom applications. The company also said the next generation of BLOOM would run on Trainium, a proprietary machine learning chip created by AWS.[11][12]

Services and technologies

Transformers Library

The Transformers library is a Python package that contains open-source implementations of transformer models for text, image, and audio tasks. It is compatible with the PyTorch, TensorFlow and JAX deep learning libraries and includes implementations of notable models like BERT and GPT-2.[13] The library was originally called "pytorch-pretrained-bert"[14] which was then renamed to "pytorch-transformers" and finally "transformers."
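A minimal sketch of typical Transformers usage, via the library's `pipeline` API (this assumes the `transformers` package and a backend such as PyTorch are installed, and that a default pretrained model can be downloaded on first use):

```python
# Sketch: run a pretrained model through the high-level pipeline API.
# Assumes `transformers` plus a deep learning backend (e.g. PyTorch)
# are installed; the default model is fetched from the Hub on first use.
from transformers import pipeline

# Build a sentiment-analysis pipeline backed by a pretrained model.
classifier = pipeline("sentiment-analysis")

# The pipeline returns a list of dicts with a label and a score.
result = classifier("Hugging Face makes machine learning easier.")
print(result)
```

The same `pipeline` entry point covers other tasks (e.g. `"text-generation"`, `"image-classification"`), which is how one library serves text, image, and audio models.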

Hugging Face Hub

The Hugging Face Hub is a platform (centralized web service) for hosting:[15]

  • Git-based code repositories, with features similar to GitHub, including discussions and pull requests for projects;
  • models, also with Git-based version control;
  • datasets, mainly in text, images, and audio;
  • web applications ("spaces" and "widgets"), intended for small-scale demos of machine learning applications.
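Because Hub repositories are ordinary Git repositories, individual files can be fetched programmatically. A small sketch using the `huggingface_hub` client library (an assumption: the package is installed and network access is available; `bert-base-uncased` is used only as a well-known example repository):

```python
# Sketch: fetch one file from a model repository on the Hugging Face Hub.
# Assumes the `huggingface_hub` package is installed and the network is
# reachable; files are cached locally after the first download.
from huggingface_hub import hf_hub_download

# Download the model's configuration file and get the local cache path.
path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")
print(path)
```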

Other Libraries

In addition to Transformers and the Hugging Face Hub, the Hugging Face ecosystem contains libraries for other tasks, such as dataset processing ("Datasets"), model evaluation ("Evaluate"), simulation ("Simulate"), and machine learning demos ("Gradio").[16]

References

  1. ^ "Hugging Face – The AI community building the future". huggingface.co. Retrieved 2022-08-20.
  2. ^ "Hugging Face wants to become your artificial BFF". TechCrunch. 9 March 2017. Retrieved 2022-08-20.
  3. ^ "Hugging Face raises $40 million for its natural language processing library". 11 March 2021.
  4. ^ "Inside BigScience, the quest to build a powerful open language model". 10 January 2022.
  5. ^ "BLOOM". bigscience.huggingface.co. Retrieved 2022-08-20.
  6. ^ "Gradio is joining Hugging Face!". huggingface.co. Retrieved 2022-08-20.
  7. ^ Cai, Kenrick. "The $2 Billion Emoji: Hugging Face Wants To Be Launchpad For A Machine Learning Revolution". Forbes. Retrieved 2022-08-20.
  8. ^ "Student Ambassador Program's call for applications is open!". huggingface.co. Retrieved 2022-08-20.
  9. ^ "Graphcore and Hugging Face Launch New Lineup of IPU-Ready Transformers". huggingface.co. Retrieved 2022-08-19.
  10. ^ "Introducing the Private Hub: A New Way to Build With Machine Learning". huggingface.co. Retrieved 2022-08-20.
  11. ^ Bass, Dina (2023-02-21). "Amazon's Cloud Unit Partners With Startup Hugging Face as AI Deals Heat Up". Bloomberg News.
  12. ^ Nellis, Stephen (2023-02-21). "Amazon Web Services pairs with Hugging Face to target AI developers". Reuters.
  13. ^ "🤗 Transformers". huggingface.co. Retrieved 2022-08-20.
  14. ^ "First release". GitHub. Nov 17, 2018. Retrieved 28 March 2023.
  15. ^ "Hugging Face Hub documentation". huggingface.co. Retrieved 2022-08-20.
  16. ^ "Hugging Face - Documentation". huggingface.co. Retrieved 2023-02-18.