
Metaseq

A codebase for working with Open Pre-trained Transformers, originally forked from fairseq.

Community Integrations

Using OPT with 🤗 Transformers

The OPT 125M–66B models are now available in Hugging Face Transformers. You can access them under the facebook organization on the Hugging Face Hub.
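
As a quick illustration, here is a minimal sketch of loading one of the smaller checkpoints through the standard Transformers generation API. It assumes the facebook/opt-125m Hub checkpoint and a local `pip install transformers torch`; it is not metaseq's own inference path.

```python
# Minimal sketch: generating text from OPT-125M via Hugging Face Transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

inputs = tokenizer("Hello, my name is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The larger checkpoints follow the same API; only the checkpoint name and the memory requirements change.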

Using OPT-175B with Alpa

The OPT 125M–175B models are now supported in the Alpa project, which enables serving OPT-175B with more flexible parallelism strategies on older generations of GPUs, such as the 40GB A100, V100, T4, and M60.

Using OPT with Colossal-AI

The OPT models are now supported in Colossal-AI, which helps users run OPT training and inference efficiently, reducing the hardware budget required for large models and lowering the labor cost of learning and deployment.

Using OPT with CTranslate2

The OPT 125M–66B models can be executed with CTranslate2, which is a fast inference engine for Transformer models. The project integrates the SmoothQuant technique to allow 8-bit quantization of OPT models. See the usage example to get started.
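
As a rough sketch of that workflow (paths here are illustrative, and the linked usage example is authoritative): a Hub checkpoint is first converted to the CTranslate2 format, then loaded through the ctranslate2.Generator API, which operates on token strings rather than raw text.

```python
# Minimal sketch: running a converted OPT checkpoint with CTranslate2.
# Assumes the model was converted first, e.g.:
#   ct2-transformers-converter --model facebook/opt-125m --output_dir opt-125m-ct2
# ("opt-125m-ct2" is an illustrative output path.)
import ctranslate2
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
generator = ctranslate2.Generator("opt-125m-ct2")

# Encode the prompt and convert ids to token strings for CTranslate2.
start_tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode("Hello, my name is"))
results = generator.generate_batch([start_tokens], max_length=30)
print(tokenizer.decode(results[0].sequences_ids[0]))
```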

Using OPT with FasterTransformer

The OPT models can be served with FasterTransformer, a highly optimized inference framework written and maintained by NVIDIA. We provide instructions to convert OPT checkpoints into the FasterTransformer format and a usage example with some benchmark results.

Using OPT with DeepSpeed

The OPT models can be fine-tuned using DeepSpeed. See the DeepSpeed-Chat example to get started.
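
For orientation, the sketch below wraps an OPT checkpoint in a DeepSpeed engine for a single fine-tuning step. The config values are illustrative assumptions, and this stands in for, rather than reproduces, the DeepSpeed-Chat recipe.

```python
# Minimal sketch: one fine-tuning step on OPT-125M with DeepSpeed.
# Illustrative config; launch with the DeepSpeed CLI, e.g. `deepspeed train.py`.
import deepspeed
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")

ds_config = {
    "train_micro_batch_size_per_gpu": 1,
    "optimizer": {"type": "AdamW", "params": {"lr": 1e-5}},
    "fp16": {"enabled": True},
}
engine, _, _, _ = deepspeed.initialize(
    model=model, model_parameters=model.parameters(), config=ds_config
)

batch = tokenizer("metaseq is a codebase for OPT.", return_tensors="pt").to(engine.device)
loss = engine(input_ids=batch["input_ids"], labels=batch["input_ids"]).loss
engine.backward(loss)  # DeepSpeed handles loss scaling and accumulation internally
engine.step()
```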

Getting Started in Metaseq

Follow the setup instructions here to get started.

Documentation on workflows

Background Info

Support

If you have any questions, bug reports, or feature requests regarding either the codebase or the models released in the projects section, please don't hesitate to post on our GitHub Issues page.

Please remember to follow our Code of Conduct.

Contributing

We welcome PRs from the community!

You can find information about contributing to metaseq in our Contributing document.

The Team

Metaseq is currently maintained by the CODEOWNERS: Susan Zhang, Naman Goyal, Punit Singh Koura, Moya Chen, Kurt Shuster, David Esiobu, Igor Molybog, Peter Albert, Andrew Poulton, Nikolay Bashlykov, Binh Tang, Uriel Singer, Yuchen Zhang, Armen Aghajanyan, Lili Yu, and Adam Polyak.

License

The majority of metaseq is licensed under the MIT license; however, portions of the project are available under separate license terms: