Train transformer-based models.
Stretching GPU performance for GEMMs and tensor contractions.
ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
Machine Learning - Projects.
Supercharge Your Model Training
NGC-Learn: Neurobiological Learning and Biomimetic Systems Simulation in Python
Repository with the material for the workshop on PINNs at MAPI-3 2024
A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)
Introductions to key concepts in quantum programming, as well as tutorials and implementations from cutting-edge quantum computing research.
Evaluation and Tracking for LLM Experiments
Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020]
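The Mish entry above refers to the activation f(x) = x · tanh(softplus(x)); a minimal standalone sketch of that formula (not code from the official repository):

```python
import math

def softplus(x: float) -> float:
    # softplus(x) = ln(1 + e^x), written to stay stable for large |x|
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x: float) -> float:
    # Mish (Misra, BMVC 2020): x * tanh(softplus(x))
    return x * math.tanh(softplus(x))
```

Like its description suggests, Mish is smooth and non-monotonic: it approaches the identity for large positive inputs and dips slightly below zero for negative ones.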
Deep Learning for humans
EBOP Model Automatic input Value Estimation Neural network
My first ML sandbox
Language modeling and instruction tuning for Russian
High-efficiency floating-point neural network inference operators for mobile, server, and Web
A resource-conscious neural network implementation for MCUs
ActTensor: Activation Functions for TensorFlow. https://pypi.org/project/ActTensor-tf/ Authors: Pouya Ardehkhani, Pegah Ardehkhani
Generating Neural Spatial Interaction Tables