Google open-sources Switch Transformer models in T5X/JAX

Google Brain has open-sourced the Switch Transformer models, including the 1.6T-parameter Switch-C and the 395B-parameter Switch-XXL, in T5X/JAX.

Check out the GitHub repository here.


What is JAX?

JAX (Just After eXecution) is a machine/deep learning library developed at Google. All JAX operations are built on XLA (Accelerated Linear Algebra), a domain-specific compiler for linear algebra, also developed by Google, that uses whole-program optimizations to speed up computation. XLA has been reported to accelerate BERT training by almost 7.3 times.
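As a quick illustration (a minimal sketch, not code from the released repository), wrapping an ordinary numerical function in `jax.jit` is all it takes for XLA to trace it and compile an optimized program for the available backend:

```python
import jax
import jax.numpy as jnp

def predict(w, b, x):
    # A toy linear layer followed by a tanh nonlinearity.
    return jnp.tanh(x @ w + b)

# jit traces the function once and compiles it with XLA for the
# target backend (CPU, GPU, or TPU); later calls reuse the compiled program.
predict_compiled = jax.jit(predict)

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (4, 3))
b = jnp.zeros(3)
x = jnp.ones((2, 4))

print(predict_compiled(w, b, x))  # first call compiles; subsequent calls are fast
```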

T5X

T5X is a modular, composable, and research-friendly framework for high-performance, configurable training, evaluation, and inference of sequence models (starting with language) at many scales.

T5X makes it possible to express many different classes of language tasks in a uniform text-to-text format, so a single encoder-decoder architecture can handle all of them without any task-specific parameters.
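For intuition, here is an illustrative (not T5X-specific) sketch of that uniformity, using task prefixes in the style of the T5 paper; every task reduces to mapping an input string to an output string:

```python
# Illustrative examples only: in the text-to-text convention that T5X builds
# on, translation, classification, and summarization all share one format.
examples = [
    # translation
    ("translate English to German: That is good.", "Das ist gut."),
    # sentiment classification
    ("sst2 sentence: it confirms fincher's status as a film maker.",
     "positive"),
    # summarization (input truncated here for brevity)
    ("summarize: state authorities dispatched emergency crews tuesday ...",
     "six people hospitalized after a storm in attala county."),
]

for source, target in examples:
    # One encoder-decoder model maps source text to target text for all tasks.
    print(f"{source!r} -> {target!r}")
```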

Switch Transformer

In deep learning, models usually reuse the same parameters for all inputs. The Switch Transformer, however, uses a mixture-of-experts (MoE) approach and selects different parameters for each incoming example.

The model is mainly used for NLP research. The transformer uses an algorithm called Switch Routing: instead of activating multiple experts and combining their outputs, it chooses a single expert to process each input. This simplifies the routing computation and reduces communication costs, since the individual experts are hosted on different devices.
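The following is a minimal JAX sketch of that top-1 routing idea, not the reference implementation: the experts here are toy dense matrices, the name `switch_layer` is hypothetical, and for simplicity every expert is applied to every token rather than tokens being dispatched across devices as in the real system:

```python
import jax
import jax.numpy as jnp

def switch_layer(params, tokens):
    # tokens: [num_tokens, d_model]
    # params["router"]:  [d_model, num_experts]
    # params["experts"]: [num_experts, d_model, d_model] (toy dense experts)
    logits = tokens @ params["router"]             # [num_tokens, num_experts]
    probs = jax.nn.softmax(logits, axis=-1)
    expert_idx = jnp.argmax(probs, axis=-1)        # top-1: one expert per token
    gate = jnp.max(probs, axis=-1, keepdims=True)  # scale by router probability

    # Apply every expert, then select per token (simple but wasteful; real
    # implementations send each token only to its chosen expert's device).
    all_outputs = jnp.einsum("td,edk->etk", tokens, params["experts"])
    selected = all_outputs[expert_idx, jnp.arange(tokens.shape[0])]
    return gate * selected

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
d_model, num_experts, num_tokens = 8, 4, 5
params = {
    "router": jax.random.normal(k1, (d_model, num_experts)) * 0.02,
    "experts": jax.random.normal(k2, (num_experts, d_model, d_model)) * 0.02,
}
tokens = jax.random.normal(k3, (num_tokens, d_model))
print(switch_layer(params, tokens).shape)  # (5, 8)
```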