ppo
Here are 631 public repositories matching this topic...
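The common thread across these repositories is Proximal Policy Optimization (PPO), a policy-gradient algorithm whose core idea is a clipped surrogate objective that keeps each policy update close to the previous policy. As a rough orientation, here is a minimal sketch of that clipped loss in plain Python; the function name and list-based inputs are illustrative, not taken from any repository listed below.

```python
import math

def ppo_clip_loss(logp_new, logp_old, advantages, clip_eps=0.2):
    """Clipped surrogate loss used by PPO (illustrative sketch).

    logp_new / logp_old: log-probabilities of the taken actions under the
    current and the data-collecting policy; advantages: estimated advantages.
    Returns the loss to minimize (negative of the clipped objective).
    """
    total = 0.0
    for ln, lo, adv in zip(logp_new, logp_old, advantages):
        ratio = math.exp(ln - lo)            # pi_new(a|s) / pi_old(a|s)
        unclipped = ratio * adv
        # Clamp the ratio to [1 - eps, 1 + eps] before weighting the advantage.
        clipped = max(min(ratio, 1.0 + clip_eps), 1.0 - clip_eps) * adv
        total += min(unclipped, clipped)     # pessimistic (lower) bound
    return -total / len(advantages)
```

When the new and old policies agree (ratio = 1), the loss reduces to the ordinary policy-gradient surrogate; when the ratio drifts outside the clip range, the gradient through that sample is cut off, which is what makes PPO updates conservative.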
XuanCe: A Comprehensive and Unified Deep Reinforcement Learning Library
-
Updated
Jun 11, 2024 - Python
High-quality single file implementation of Deep Reinforcement Learning algorithms with research-friendly features (PPO, DQN, C51, DDPG, TD3, SAC, PPG)
-
Updated
Jun 10, 2024 - Python
Reinforcement Learning Agents in .NET
-
Updated
Jun 10, 2024 - C#
An elegant PyTorch deep reinforcement learning library.
-
Updated
Jun 10, 2024 - Python
Easy RL: a Chinese-language reinforcement learning tutorial (the "Mushroom Book" 🍄), readable online at https://datawhalechina.github.io/easy-rl/
-
Updated
Jun 9, 2024 - Jupyter Notebook
NAACL '24 (Demo) / MLSys @ NeurIPS '23 - RedCoast: A Lightweight Tool to Automate Distributed Training and Inference
-
Updated
Jun 9, 2024 - Python
Clean, Robust, and Unified PyTorch implementation of popular DRL Algorithms (Q-learning, Duel DDQN, PER, C51, Noisy DQN, PPO, DDPG, TD3, SAC, ASL)
-
Updated
Jun 8, 2024 - Python
This is the official implementation of Multi-Agent PPO (MAPPO).
-
Updated
Jun 8, 2024 - Python
Testing MLP, DQN, PPO, SAC, and policy-gradient agents on the Snake game
-
Updated
Jun 8, 2024 - C++
[WIP] RL agent for the SuperTuxKart game.
-
Updated
Jun 7, 2024 - Python
-
Updated
Jun 5, 2024 - Jupyter Notebook
Mini RL Lab
-
Updated
Jun 5, 2024 - Python
-
Updated
Jun 5, 2024 - Python
This project involves creating a custom Blackjack environment and training an AI using reinforcement learning techniques, specifically Proximal Policy Optimization (PPO) and Deep Q-Network (DQN). The goal is to teach the AI to play Blackjack and achieve the best possible win rate.
-
Updated
Jun 5, 2024 - Python
Massively Parallel Deep Reinforcement Learning. 🔥
-
Updated
Jun 5, 2024 - Python
An RL approach to enable cost-effective, intelligent interactions between a local agent and a remote LLM
-
Updated
Jun 4, 2024 - Python