Hugging Face

From Wikipedia, the free encyclopedia
Hugging Face, Inc.
  • Company type: Private
  • Industry: Artificial intelligence, machine learning, software development
  • Founded: 2016
  • Headquarters: New York City, United States
  • Area served: Worldwide
  • Key people: Clément Delangue (CEO), Julien Chaumond (CTO), Thomas Wolf (CSO)
  • Products: Models, datasets, spaces
  • Revenue: US$15 million (2022)
  • Number of employees: 170 (2023)
  • Website: huggingface.co

Hugging Face, Inc. is a French-American company incorporated under the Delaware General Corporation Law[1] and based in New York City that develops computation tools for building applications using machine learning. It is most notable for its transformers library built for natural language processing applications and its platform that allows users to share machine learning models and datasets and showcase their work.

History

The company was founded in 2016 by French entrepreneurs Clément Delangue, Julien Chaumond, and Thomas Wolf in New York City, originally as a company that developed a chatbot app targeted at teenagers.[2] The company was named after the "hugging face" emoji.[2] After open-sourcing the model behind the chatbot, the company pivoted to focus on being a platform for machine learning.

In March 2021, Hugging Face raised US$40 million in a Series B funding round.[3]

On April 28, 2021, the company launched the BigScience Research Workshop in collaboration with several other research groups to release an open large language model.[4] In 2022, the workshop concluded with the announcement of BLOOM, a multilingual large language model with 176 billion parameters.[5][6]

In December 2021, the company acquired Gradio, an open-source library built for developing machine learning applications in Python.[7]

On May 5, 2022, the company announced its Series C funding round, led by Coatue and Sequoia.[8] The company received a $2 billion valuation.

On August 3, 2022, the company announced the Private Hub, an enterprise version of its public Hugging Face Hub that supports SaaS or on-premises deployment.[9]

In February 2023, the company announced a partnership with Amazon Web Services (AWS) that would make Hugging Face's products available to AWS customers as building blocks for their custom applications. The company also said the next generation of BLOOM would run on Trainium, a proprietary machine learning chip created by AWS.[10][11][12]

In August 2023, the company announced that it had raised $235 million in a Series D funding round at a $4.5 billion valuation. The round was led by Salesforce, with notable participation from Google, Amazon, Nvidia, AMD, Intel, IBM, and Qualcomm.[13]

In June 2024, the company announced, along with Meta and Scaleway, the launch of a new AI accelerator program for European startups. The initiative aims to help startups integrate open foundation models into their products, accelerating the EU AI ecosystem. The program, based at STATION F in Paris, will run from September 2024 to February 2025. Selected startups will receive mentoring, access to AI models and tools, and Scaleway's computing power.[14]

Services and technologies

Transformers Library

The Transformers library is a Python package that contains open-source implementations of transformer models for text, image, and audio tasks. It is compatible with the PyTorch, TensorFlow, and JAX deep learning libraries and includes implementations of notable models like BERT and GPT-2.[15] The library was originally called "pytorch-pretrained-bert",[16] which was then renamed to "pytorch-transformers" and finally "transformers".
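A minimal sketch of loading one of those pretrained checkpoints through the library's Auto classes (assuming the `transformers` package with a PyTorch backend is installed and the `bert-base-uncased` checkpoint can be downloaded from the Hub):

```python
# Minimal sketch: load a pretrained BERT checkpoint via the Transformers
# Auto classes and run one sentence through it. Assumes the `transformers`
# package (with PyTorch) is installed and the model can be downloaded.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence and obtain contextual embeddings from the encoder.
inputs = tokenizer("Hugging Face builds machine learning tools.", return_tensors="pt")
outputs = model(**inputs)

# last_hidden_state has shape (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```

The same checkpoint name works unchanged with the TensorFlow and JAX model classes (`TFAutoModel`, `FlaxAutoModel`), which is the cross-framework compatibility the paragraph above describes.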

A JavaScript version (transformers.js[17]) has also been developed, allowing models to run directly in the browser.

Hugging Face Hub

The Hugging Face Hub is a platform (centralized web service) for hosting:[18]

  • Git-based code repositories, including discussions and pull requests for projects;
  • models, also with Git-based version control;
  • datasets, mainly in text, images, and audio;
  • web applications ("spaces" and "widgets"), intended for small-scale demos of machine learning applications.
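Files in these hosted repositories can be fetched programmatically; a minimal sketch using the official `huggingface_hub` client library (assuming the package is installed and huggingface.co is reachable; `bert-base-uncased` is just an example repository):

```python
# Minimal sketch: download a single file from a model repository on the
# Hugging Face Hub. Assumes the `huggingface_hub` package is installed
# and network access to huggingface.co is available.
import json

from huggingface_hub import hf_hub_download

# Fetch the configuration file of an example model repository; the file
# is cached locally and its filesystem path is returned.
config_path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")

with open(config_path) as f:
    config = json.load(f)

print(config["model_type"])
```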

There are numerous pre-trained models that support common tasks in different modalities, such as:

  • Natural language processing: text classification, named entity recognition, question answering, language modeling, summarization, translation, multiple choice, and text generation.
  • Computer vision: image classification, object detection, and segmentation.
  • Audio: automatic speech recognition and audio classification.
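Many of these tasks can be tried with a single call through the Transformers `pipeline` API; a minimal sketch for text classification (assuming the `transformers` package is installed and a default checkpoint for the task can be downloaded):

```python
# Minimal sketch: run a pretrained text-classification pipeline.
# Assumes the `transformers` package is installed; when no model name is
# given, the library downloads a default checkpoint for the task.
from transformers import pipeline

classifier = pipeline("text-classification")
result = classifier("Hugging Face makes sharing models easy.")

# `result` is a list with one dict per input: a predicted label and score.
print(result[0]["label"], round(result[0]["score"], 3))
```

Swapping the task string (e.g. `"summarization"`, `"automatic-speech-recognition"`, `"image-classification"`) selects a pipeline and default model for the other modalities listed above.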

Other libraries

[Figure: Gradio UI example]

In addition to Transformers and the Hugging Face Hub, the Hugging Face ecosystem contains libraries for other tasks, such as dataset processing ("Datasets"), model evaluation ("Evaluate"), and machine learning demos ("Gradio").[19]

References

  1. ^ "Terms of Service – Hugging Face". huggingface.co. Retrieved 2024-05-24.
  2. ^ a b "Hugging Face wants to become your artificial BFF". TechCrunch. 9 March 2017. Archived from the original on 2022-09-25. Retrieved 2023-09-17.
  3. ^ "Hugging Face raises $40 million for its natural language processing library". 11 March 2021. Archived from the original on 28 July 2023. Retrieved 5 August 2022.
  4. ^ "Inside BigScience, the quest to build a powerful open language model". 10 January 2022. Archived from the original on 1 July 2022. Retrieved 5 August 2022.
  5. ^ "BLOOM". bigscience.huggingface.co. Archived from the original on 2022-11-14. Retrieved 2022-08-20.
  6. ^ "Inside a radical new project to democratize AI". MIT Technology Review. Archived from the original on 2022-12-04. Retrieved 2023-08-25.
  7. ^ Nataraj, Poornima (2021-12-23). "Hugging Face Acquires Gradio, A Customizable UI Components Library For Python". Analytics India Magazine. Retrieved 2024-01-26.
  8. ^ Cai, Kenrick. "The $2 Billion Emoji: Hugging Face Wants To Be Launchpad For A Machine Learning Revolution". Forbes. Archived from the original on 2022-11-03. Retrieved 2022-08-20.
  9. ^ "Introducing the Private Hub: A New Way to Build With Machine Learning". huggingface.co. Archived from the original on 2022-11-14. Retrieved 2022-08-20.
  10. ^ Bass, Dina (2023-02-21). "Amazon's Cloud Unit Partners With Startup Hugging Face as AI Deals Heat Up". Bloomberg News. Archived from the original on 2023-05-22. Retrieved 2023-02-22.
  11. ^ Nellis, Stephen (2023-02-21). "Amazon Web Services pairs with Hugging Face to target AI developers". Reuters. Archived from the original on 2023-05-30. Retrieved 2023-02-22.
  12. ^ "AWS and Hugging Face collaborate to make generative AI more accessible and cost efficient | AWS Machine Learning Blog". aws.amazon.com. 2023-02-21. Archived from the original on 2023-08-25. Retrieved 2023-08-25.
  13. ^ Leswing, Kif (2023-08-24). "Google, Amazon, Nvidia and other tech giants invest in AI startup Hugging Face, sending its valuation to $4.5 billion". CNBC. Archived from the original on 2023-08-24. Retrieved 2023-08-24.
  14. ^ "META Collaboration Launches AI Accelerator for European Startups". Yahoo Finance. 2024-06-25. Retrieved 2024-07-11.
  15. ^ "🤗 Transformers". huggingface.co. Archived from the original on 2023-09-27. Retrieved 2022-08-20.
  16. ^ "First release". GitHub. 17 November 2018. Archived from the original on 30 April 2023. Retrieved 28 March 2023.
  17. ^ "xenova/transformers.js". GitHub.
  18. ^ "Hugging Face Hub documentation". huggingface.co. Archived from the original on 2023-09-20. Retrieved 2022-08-20.
  19. ^ "Hugging Face - Documentation". huggingface.co. Archived from the original on 2023-09-30. Retrieved 2023-02-18.