PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference
A PyTorch library for all things Reinforcement Learning (RL) for Combinatorial Optimization (CO)
This repository contains the code to reproduce the experiments reported in the article "Dynamical Mean-Field Theory of Self-Attention Neural Networks".
Source code for the GAtt method in "Revisiting Attention Weights as Interpretations of Message-Passing Neural Networks".
Julia Implementation of Transformer models
Generative Pre-trained Transformer in PyTorch
A compilation of the best multi-agent papers
Scenic: A Jax Library for Computer Vision Research and Beyond
🚀🚀🚀 A collection of awesome public projects from the YOLO object detection series.
PyTorch implementations of various token mixers (attention mechanisms, MLPs, etc.) for understanding computer vision papers and other tasks.
QuillGPT is a PyTorch implementation of the GPT decoder block based on the architecture from the "Attention Is All You Need" paper by Vaswani et al. The repository also includes two pre-trained models (Shakespearean GPT and Harpoon GPT), a Streamlit playground, a containerized FastAPI microservice, and training and inference scripts and notebooks.
Reference implementation of "Softmax Attention with Constant Cost per Token" (Heinsen, 2024)
Foundational material on explainable AI (XAI), emphasizing how XAI methodologies can expose latent biases in datasets and surface useful insights.
Visualizing the attention of vision-language models
Unified-modal Salient Object Detection via Adaptive Prompt Learning
Alignment-Free RGBT Salient Object Detection: Semantics-guided Asymmetric Correlation Network and A Unified Benchmark
A collection of memory efficient attention operators implemented in the Triton language.
[ICML 2024] Outlier-Efficient Hopfield Layers for Large Transformer-Based Models
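Common to nearly all of the projects above is the attention mechanism itself. As a shared reference point, the following is a minimal, self-contained sketch of scaled dot-product attention (Vaswani et al., 2017) in PyTorch; it is an illustrative example written for this list, not code drawn from any of the repositories above.

import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    """Minimal scaled dot-product attention (Vaswani et al., 2017).

    q, k, v: tensors of shape (batch, seq_len, d_model).
    mask:    optional boolean tensor broadcastable to the score matrix,
             True where attention is allowed.
    """
    d_k = q.size(-1)
    # Similarity scores between queries and keys, scaled by sqrt(d_k)
    # to keep softmax gradients well-behaved at larger dimensions.
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        scores = scores.masked_fill(~mask, float("-inf"))
    # Each query gets a probability distribution over all keys.
    weights = torch.softmax(scores, dim=-1)
    # Output is the weighted sum of values.
    return weights @ v

# Usage: a batch of 2 sequences, length 5, 16-dimensional features.
q = k = v = torch.randn(2, 5, 16)
out = scaled_dot_product_attention(q, k, v)
print(out.shape)  # torch.Size([2, 5, 16])

Many of the repositories listed here (memory-efficient Triton operators, constant-cost softmax attention, token mixers) can be read as optimizations or generalizations of exactly this computation.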