Turnkey self-hosted offline transcription and diarization service with LLM summary
Updated Jun 2, 2024 - Python
Gateway and load balancer to your LLM inference endpoints
experiment: create AI Agents
🔍 AI search engine - self-host with local or cloud LLMs
🤯 Lobe Chat - an open-source, modern-design LLMs/AI chat framework. Supports multiple AI providers (OpenAI / Claude 3 / Gemini / Ollama / Bedrock / Azure / Mistral / Perplexity), multi-modal input (Vision/TTS), and a plugin system. One-click FREE deployment of your private ChatGPT chat application.
A SpringBoot initialization template for Java web projects. It integrates commonly used frameworks (Mybatis-Plus, ShardingSphere, Redis, RabbitMQ, Elasticsearch, SaToken, OSS, Caffeine, MongoDB, etc.) and Spring AI. The template is intended for starting front-end/back-end separated projects so you can quickly build your own on top of it; it is also suitable for beginners, and is continuously updated by the author.
Private & local AI personal knowledge management app.
User-friendly WebUI for LLMs (Formerly Ollama WebUI)
A modern and easy-to-use client for Ollama
A Rust voice assistant made to be easy to set up, customize, and learn from.
A package manager for Go
REST API proxy to Vertex AI that exposes the REST interface of ollama, so ollama clients can talk to Vertex AI unchanged. Written in Go.
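A proxy like the one above accepts standard ollama REST calls such as `POST /api/generate`. A minimal Go sketch of building that request body (the model name here is a hypothetical Vertex AI model, not taken from the project):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// generateRequest mirrors the JSON body of ollama's /api/generate endpoint.
type generateRequest struct {
	Model  string `json:"model"`
	Prompt string `json:"prompt"`
	Stream bool   `json:"stream"`
}

func main() {
	body, err := json.Marshal(generateRequest{
		Model:  "gemini-1.5-pro", // hypothetical model name for illustration
		Prompt: "Why is the sky blue?",
		Stream: false, // ask for a single JSON response instead of a stream
	})
	if err != nil {
		panic(err)
	}
	// A client would POST this body to http://localhost:11434/api/generate
	// (ollama's default address), which the proxy forwards to Vertex AI.
	fmt.Println(string(body))
}
```

Because the proxy keeps ollama's wire format, existing ollama clients and UIs need no changes beyond pointing at the proxy's address.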
🤖 Neovim code suggestion and completion (just like GitHub Copilot, but locally using Ollama)