transformers

Get started

  • Quick tour
  • Installation
  • Philosophy
  • Glossary

Using 🤗 Transformers

  • Summary of the tasks
  • Summary of the models
  • Preprocessing data
  • Training and fine-tuning
  • Model sharing and uploading
  • Tokenizer summary
  • Multi-lingual models

Advanced guides

  • Pretrained models
  • Examples
  • Fine-tuning with custom datasets
  • 🤗 Transformers Notebooks
  • Converting TensorFlow Checkpoints
  • Migrating from previous packages
  • How to contribute to transformers
  • Testing
  • Exporting transformers models

Research

  • BERTology
  • Perplexity of fixed-length models
  • Benchmarks

Main Classes

  • Configuration
  • Logging
  • Models
  • Optimization
  • Model outputs
  • Pipelines
  • Processors
  • Tokenizer
  • Trainer
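
A minimal sketch of how several of these classes fit together, assuming a standard transformers install with PyTorch available; the checkpoint name is just an illustrative public model, not the only option:

    from transformers import pipeline, AutoTokenizer, AutoModelForSequenceClassification

    # A pipeline bundles a tokenizer, a model, and pre/post-processing
    # into a single callable (see "Pipelines" above).
    classifier = pipeline("sentiment-analysis")
    print(classifier("Transformers makes NLP easy."))

    # The same pieces can be driven directly for finer control
    # (see "Tokenizer", "Models", and "Model outputs" above).
    name = "distilbert-base-uncased-finetuned-sst-2-english"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSequenceClassification.from_pretrained(name)
    inputs = tokenizer("Transformers makes NLP easy.", return_tensors="pt")
    outputs = model(**inputs)  # forward pass; the logits come first in the output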

Models

  • ALBERT
  • AutoClasses
  • Bart
  • BERT
  • BertGeneration
  • CamemBERT
  • CTRL
  • DialoGPT
  • DistilBERT
  • DPR
  • ELECTRA
  • Encoder Decoder Models
  • FlauBERT
  • FSMT
  • Funnel Transformer
  • LayoutLM
  • Longformer
  • LXMERT
  • MarianMT
  • MBart
  • MobileBERT
  • OpenAI GPT
  • OpenAI GPT2
  • Pegasus
  • RAG
  • Reformer
  • RetriBERT
  • RoBERTa
  • T5
  • Transformer XL
  • XLM
  • XLM-RoBERTa
  • XLNet
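
A minimal sketch of the AutoClasses entry above, assuming PyTorch weights exist for each illustrative checkpoint: the Auto* factories read the checkpoint's configuration and instantiate the matching architecture-specific classes from the list above.

    from transformers import AutoTokenizer, AutoModel

    # The same two calls dispatch to BertModel, RobertaModel, etc.,
    # based on each checkpoint's configuration.
    for name in ["bert-base-uncased", "roberta-base", "distilbert-base-uncased"]:
        tokenizer = AutoTokenizer.from_pretrained(name)
        model = AutoModel.from_pretrained(name)
        print(name, "->", model.__class__.__name__)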

Internal Helpers

  • Custom Layers and Utilities
  • Utilities for pipelines
  • Utilities for Tokenizers