MINERS ⛏️: The semantic retrieval benchmark for evaluating multilingual language models.
Project repository for the development of a Question-Answering (QA) information retrieval system fine-tuned on customer queries.
This repository contains the implementation of Transformer Architecture from the paper "Attention is All You Need"
Chinese NLP solutions (large language models, data, models, training, inference)
Ongoing research training transformer models at scale
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
TAO71 I4.0 is an AI developed by TAO71, written in C# and Python.
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
This repository is a curated collection of the most exciting and influential CVPR 2024 papers. 🔥 [Paper + Code + Demo]
Easy and lightning fast training of 🤗 Transformers on Habana Gaudi processor (HPU)
Unify Efficient Fine-Tuning of 100+ LLMs
A low-memory high-performance CPU-based API for Meta's No Language Left Behind (NLLB) using CTranslate2, hosted on Hugging Face Spaces.
State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
Train transformer-based models.
🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning.
🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
💡 All-in-one open-source embeddings database for semantic search, LLM orchestration and language model workflows
Repository for the paper "Advancing Time Series Forecasting: Variance-Aware Loss Functions in Transformers"
A PyTorch-based Speech Toolkit