Enabling PyTorch on XLA Devices (e.g. Google TPU)
JAX - A curated list of resources https://github.com/google/jax
The purpose of this repo is to make it easy to get started with JAX, Fla...
Zero-copy MPI communication of JAX arrays, for turbo-charged HPC applica...
Julia on TPUs
ALBERT model pretraining and fine-tuning using TensorFlow 2.0
Simple and efficient RevNet library for PyTorch with XLA and DeepSpeed s...
TensorFlow wheels built for the latest CUDA/cuDNN with performance fl...
TensorFlow 1.x-based pretrained model invocation, supporting single-machine multi-GPU training, gradient accumulation, XLA acceleration, mixed...
Presents comprehensive benchmarks of XLA-compatible pre-trained models i...
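The common thread across the projects above is XLA compilation. A minimal sketch of what that looks like in practice, using JAX's `jit` (assumes `jax` is installed; the function and values here are illustrative, not taken from any of the listed repos):

```python
import jax
import jax.numpy as jnp

@jax.jit  # trace and compile this function with XLA on first call
def scaled_sum(x, y):
    return jnp.sum(x * 2.0 + y)

x = jnp.arange(4.0)        # [0., 1., 2., 3.]
y = jnp.ones(4)
result = scaled_sum(x, y)  # subsequent calls reuse the compiled kernel
```

The same compiler backs PyTorch/XLA on TPUs and TensorFlow's `jit_compile` path; only the frontend tracing differs.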