Facebook AI Research Sequence-to-Sequence Toolkit written in Python.

Fairseq(-py) is a sequence modeling toolkit that allows researchers and developers to train custom models for translation, summarization, language modeling and other text generation tasks.

We provide reference implementations of various sequence modeling papers.

We also provide pre-trained models for translation and language modeling with a convenient torch.hub interface:

import torch

en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.en-de.single_model')
en2de.translate('Hello world', beam=5)
# 'Hallo Welt'

See the PyTorch Hub tutorials for translation and RoBERTa for more examples.
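
For instance, RoBERTa can be loaded through the same torch.hub interface. The sketch below follows the hub tutorial pattern; the feature shape shown assumes the roberta.large configuration (hidden size 1024):

import torch

# load pre-trained RoBERTa via torch.hub (downloads the model on first use)
roberta = torch.hub.load('pytorch/fairseq', 'roberta.large')
roberta.eval()  # disable dropout for deterministic features

# apply RoBERTa's BPE encoding, then extract contextual features
tokens = roberta.encode('Hello world!')      # tensor of token ids
features = roberta.extract_features(tokens)  # shape: (1, num_tokens, 1024)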

Requirements and Installation

  • PyTorch version >= 1.10.0
  • Python version >= 3.8
  • For training new models, you'll also need an NVIDIA GPU and NCCL (a quick sanity check is sketched after this list)
  • To install fairseq and develop locally:
git clone https://github.com/pytorch/fairseq
cd fairseq
pip install --editable ./

# on MacOS:
# CFLAGS="-stdlib=libc++" pip install --editable ./

# to install the latest stable release (0.10.x)
# pip install fairseq
  • For faster training install NVIDIA's apex library:
git clone https://github.com/NVIDIA/apex
cd apex
pip install -v --no-cache-dir --global-option="--cpp_ext" --global-option="--cuda_ext" \
  --global-option="--deprecated_fused_adam" --global-option="--xentropy" \
  --global-option="--fast_multihead_attn" ./
  • For large datasets install PyArrow: pip install pyarrow
  • If you use Docker make sure to increase the shared memory size either with --ipc=host or --shm-size as command line options to nvidia-docker run.
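
As a quick sanity check for the GPU and NCCL requirements above (a sketch, not an official setup step), you can query the local PyTorch build:

import torch

print(torch.__version__)          # should be >= 1.10.0
print(torch.cuda.is_available())  # True if an NVIDIA GPU is usable for training
print(torch.cuda.nccl.version())  # NCCL version bundled with this PyTorch build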

Getting Started

The full documentation contains instructions for getting started, training new models and extending fairseq with new model types and tasks.
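
Trained checkpoints can then be loaded for inference much like the hub models. The sketch below uses fairseq's from_pretrained helper; the checkpoint and data paths are hypothetical placeholders standing in for your own training run:

from fairseq.models.transformer import TransformerModel

# hypothetical paths: point these at your own checkpoint and binarized data
model = TransformerModel.from_pretrained(
    'checkpoints/',                        # directory containing the checkpoint
    checkpoint_file='checkpoint_best.pt',  # which checkpoint to load
    data_name_or_path='data-bin/',         # binarized data dir (provides the dictionaries)
)
model.translate('Hello world!', beam=5)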

Pre-trained models and examples

We provide pre-trained models and pre-processed, binarized test sets for several tasks, as well as example training and evaluation commands.
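
For example, the hub models shown earlier also accept a batch of sentences and alternative decoding settings; a small sketch (option names as used by fairseq's generation arguments, which may vary by version):

import torch

en2de = torch.hub.load('pytorch/fairseq', 'transformer.wmt19.en-de.single_model')

# batched translation: pass a list of sentences
en2de.translate(['Hello world!', 'How are you?'], beam=5)

# top-k sampling instead of beam search
en2de.sample('Hello world!', sampling=True, sampling_topk=10)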

We also have more detailed READMEs to reproduce results from specific papers.

Join the fairseq community

License

fairseq(-py) is MIT-licensed. The license applies to the pre-trained models as well.

Citation

Please cite as:

@inproceedings{ott2019fairseq,
  title = {fairseq: A Fast, Extensible Toolkit for Sequence Modeling},
  author = {Myle Ott and Sergey Edunov and Alexei Baevski and Angela Fan and Sam Gross and Nathan Ng and David Grangier and Michael Auli},
  booktitle = {Proceedings of NAACL-HLT 2019: Demonstrations},
  year = {2019},
}
