llama3
Here are 329 public repositories matching this topic...
Choose the model that's right for you
Updated Jun 5, 2024 - Python
Chat locally using leading open models built by the community, optimized and accelerated by NVIDIA's enterprise-ready inference runtime
Updated Jun 11, 2024 - Python
Easy "1-line" calling of all LLMs from OpenAI, MS Azure, AWS Bedrock, GCP Vertex, and Ollama
Updated Jun 11, 2024 - Python
Detailed description given in the README
Updated May 19, 2024 - Python
Create synthetic datasets for training and testing Large Language Models (LLMs) in a Question-Answering (QA) context.
Updated Apr 27, 2024 - Python
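A synthetic QA record for such a dataset might look like the sketch below. The field names and JSONL layout are illustrative assumptions, not the repository's actual schema; JSONL is simply a common format for LLM fine-tuning data.

```python
import json

def make_qa_record(context: str, question: str, answer: str) -> dict:
    # Hypothetical schema: these field names are illustrative,
    # not taken from the repository above.
    return {"context": context, "question": question, "answer": answer}

records = [
    make_qa_record(
        context="Llama 3 was released by Meta in April 2024.",
        question="Who released Llama 3?",
        answer="Meta",
    ),
]

# Serialize one record per line (JSONL), a common fine-tuning format.
jsonl = "\n".join(json.dumps(r) for r in records)
print(jsonl)
```

In practice the question/answer pairs would be generated by prompting a model over each context chunk rather than written by hand.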
A simple summarizer tool using Llama 3 8B.
Updated May 14, 2024 - Python
Run a "super-human" AI from the year 2124 in your terminal with only three lines of code, today!
Updated May 21, 2024 - Python
This project is intended for users who want to extract and analyze information from multiple PDF files.
Updated May 28, 2024 - Python
⚗️ Llama 3 8B Instruct model repository, trained by Meta and managed with DVC
Updated Jun 5, 2024 - Python
Multi-agent workflows with Llama3: A private on-device multi-agent framework
Updated Jun 5, 2024 - Python
How to create a local RAG (Retrieval-Augmented Generation) pipeline that processes your PDF file(s) and lets you chat with them using Ollama and LangChain.
Updated May 11, 2024 - Jupyter Notebook
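The core retrieval step of such a pipeline can be sketched without Ollama or LangChain: split the document into chunks, score each chunk against the query, and pass the best chunks to the model as context. Below is a minimal word-overlap retriever; a real pipeline would use embedding similarity instead, and the document text here is a stand-in for extracted PDF content.

```python
def chunk_text(text: str, size: int = 50) -> list[str]:
    # Naive fixed-size word chunking; real pipelines split on pages/paragraphs.
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def retrieve(query: str, chunks: list[str], k: int = 2) -> list[str]:
    # Score chunks by word overlap with the query (embeddings in practice).
    q = set(query.lower().split())
    scored = sorted(chunks, key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return scored[:k]

# Stand-in for text extracted from a PDF.
doc = ("Llama 3 is an open model family from Meta. Ollama runs such models "
       "locally. LangChain wires retrieval and generation together.")
chunks = chunk_text(doc, size=8)
question = "Which models does Ollama run locally?"
context = "\n".join(retrieve(question, chunks))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```

The resulting prompt would then be sent to a locally served model (for example via Ollama's chat endpoint), which is the generation half of the pipeline.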
Fine-tuned Llama 3 models for context-based question answering in Bengali.
Updated May 25, 2024