This is the official repository for the paper "Flora: Low-Rank Adapters Are Secretly Gradient Compressors".
Updated Jun 3, 2024 · Python
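The Flora paper's title claims that low-rank adapters act as gradient compressors. As a rough illustration only (this is not the repository's actual implementation, and the variable names are made up here), the core idea of compressing a gradient with a random down-projection and reconstructing it with the matching up-projection can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: a 64x64 "gradient" compressed to rank 8.
m, n, r = 64, 64, 8
G = rng.standard_normal((m, n))  # stand-in for a dense gradient matrix

# Down-project: keep only an (m x r) sketch instead of the full gradient.
# Entries of P are scaled so that E[P @ P.T] = I, making the
# reconstruction below unbiased in expectation.
P = rng.standard_normal((n, r)) / np.sqrt(r)
C = G @ P  # compressed gradient, shape (m, r)

# Up-project: a low-rank approximation of the original gradient.
G_hat = C @ P.T

print(C.size / G.size)  # fraction of memory kept: 512/4096 = 0.125
```

This stores only `m * r` numbers instead of `m * n`, which is the same memory saving a rank-`r` adapter provides; the actual Flora algorithm is described in the paper.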
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and Flax.
Accelerate your training with this open-source library, offering streamlined training and serving options in JAX. 🚀
Flax Engine – multi-platform 3D game engine
Large language models (LLMs) made easy: EasyLM is a one-stop solution for pre-training, fine-tuning, evaluating, and serving LLMs in JAX/Flax.
This is a JAX/Flax-based transformer language model trained on a Japanese dataset. It is based on the official Flax example code (lm1b).
Tevatron - A flexible toolkit for neural retrieval research and development.
Orbax provides common utility libraries for JAX users.
An experimental code base for system identification with JAX.
Flax Engine Documentation
NAACL '24 (Demo) / MLSys @ NeurIPS '23 - RedCoast: A Lightweight Tool to Automate Distributed Training and Inference
Clean single-file implementation of offline RL algorithms in JAX
Deep Learning examples using the JAX ecosystem of libraries
bioflax provides a JAX implementation of biologically plausible learning algorithms
Code I used for my YouTube videos
A JAX-based library for designing and training transformer models from scratch.