Fast, Modern, Memory Efficient, and Low Precision PyTorch Optimizers
D2 is a strongly-typed, statically-typed, (mostly) inferred-type compiled language.
Lua-Based Machine, Deep And Reinforcement Learning Library (For Roblox And Pure Lua). Contains 34 Models!
Learn the DSPy framework by coding a text adventure game
Nx-powered Neural Networks
🧑🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Optim4RL is a Jax framework of learning to optimize for reinforcement learning.
Zeroth-order optimizers, gradient chaining, random gradient approximation
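The random gradient approximation mentioned here can be illustrated with a minimal SPSA-style sketch (plain NumPy; the function names are illustrative, not this library's API). The idea is to estimate a gradient from two function evaluations along a random direction, so no analytic gradients are needed:

```python
import numpy as np

def spsa_gradient(f, x, eps=1e-2, rng=None):
    """Two-point random gradient estimate (SPSA-style, illustrative sketch).

    Perturbs x along a random Rademacher (+/-1) direction and uses a central
    difference to approximate the directional derivative along it.
    """
    if rng is None:
        rng = np.random.default_rng()
    u = rng.choice([-1.0, 1.0], size=x.shape)  # random +/-1 perturbation
    return (f(x + eps * u) - f(x - eps * u)) / (2 * eps) * u

# Zeroth-order descent on f(x) = ||x||^2, using only function evaluations
f = lambda x: float(np.sum(x ** 2))
x = np.array([2.0, -1.5])
rng = np.random.default_rng(0)
for _ in range(500):
    x = x - 0.05 * spsa_gradient(f, x, rng=rng)
```

On this exactly quadratic objective the central difference is unbiased, so the iterate contracts toward the minimum at the origin despite never computing an analytic gradient.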
JAX implementation of 'Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training'
PyTorch library to test optimizers by visualizing how they descend on your own images. You can draw a custom loss landscape and see what different optimizers do.
🦁 Lion, a new optimizer discovered by Google Brain using genetic algorithms that is purportedly better than Adam(W), in PyTorch
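Lion's update rule is small enough to sketch in a few lines (a NumPy sketch of the rule as described in the paper, not the linked repo's implementation; the helper name is illustrative). Its distinctive property is that the update is the sign of an interpolated momentum, so every parameter moves by exactly the learning rate regardless of gradient magnitude:

```python
import numpy as np

def lion_step(param, grad, m, lr=1e-4, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion update (illustrative sketch).

    Direction: sign of a beta1-interpolation between momentum and gradient,
    plus decoupled weight decay. Momentum is then updated with beta2.
    """
    update = np.sign(beta1 * m + (1 - beta1) * grad)
    new_param = param - lr * (update + wd * param)
    new_m = beta2 * m + (1 - beta2) * grad  # note: different coefficient than the update
    return new_param, new_m

p = np.array([0.5, -1.2, 3.0])
g = np.array([0.3, -20.0, 1e-4])  # wildly different gradient scales
m = np.zeros_like(p)
p_new, m_new = lion_step(p, g, m, lr=1e-2)
# Sign-based update: every coordinate moves by exactly lr, regardless of gradient scale
```

This uniform, sign-based step is also why Lion needs only one momentum buffer, giving it a smaller memory footprint than Adam(W).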
Integration to get optimizer information from the SolarEdge portal
Cardis Optimizer is a simple but powerful optimizer that will have your PC running better in minutes!
A set of NBA optimizers and GPP tools to help you win daily fantasy sports
Summarize Massive Datasets using Submodular Optimization
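Submodular summarization of this kind is typically done greedily: because objectives like facility location are monotone submodular, the greedy solution is within a (1 − 1/e) factor of optimal. A minimal NumPy sketch under that assumption (illustrative names, not this project's API):

```python
import numpy as np

def greedy_summary(sim, k):
    """Greedily pick k exemplars maximizing the facility-location objective
    F(S) = sum_i max_{j in S} sim[i, j] (monotone submodular; illustrative sketch)."""
    n = sim.shape[0]
    selected = []
    best = np.zeros(n)  # best[i]: similarity of point i to its closest chosen exemplar
    for _ in range(k):
        # Marginal gain of adding each candidate column j
        gains = np.maximum(sim, best[:, None]).sum(axis=0) - best.sum()
        j = int(np.argmax(gains))
        selected.append(j)
        best = np.maximum(best, sim[:, j])
    return selected

# Summarize 6 points forming two clusters by picking 2 exemplars
pts = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
sim = np.exp(-np.abs(pts[:, None] - pts[None, :]))  # similarity from distance
summary = greedy_summary(sim, 2)
```

Greedy picks one representative per cluster, since after the first exemplar is chosen the marginal gain of its neighbors collapses while the other cluster's gain stays high.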
Polyvalent neural network
Tutorials on optimizers for deep neural networks
A curated list of optimizers for machine learning.