# llmchat

**Repository Path**: bcledger_admin/llmchat

## Basic Information

- **Project Name**: llmchat
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: MIT
- **Default Branch**: attachment-support
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2025-02-06
- **Last Updated**: 2025-02-06

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

![og_6x](https://github.com/user-attachments/assets/4813a6b5-3294-4056-88bb-c536a45c220c)

[demo.webm](https://github.com/user-attachments/assets/1c555c20-5adf-4c7b-8e55-96f5abcc3563)

LLMChat

The most intuitive all-in-one AI chat interface.

## Key Features

- 🧠 **Multiple LLM Providers**: Supports various language models, including Ollama.
- 🔌 **Plugins Library**: Extend functionality with an expandable plugin system, including function-calling capabilities.
- 🌐 **Web Search Plugin**: Lets the AI fetch and use real-time web data.
- 🤖 **Custom Assistants**: Create and tailor AI assistants for specific tasks or domains.
- 🗣️ **Text-to-Speech**: Converts AI-generated text responses to speech.
- 🎙️ **Speech-to-Text**: (Coming soon) Enables voice input for more natural interaction.
- 💾 **Local Storage**: Stores data locally in the browser via IndexedDB for faster access and privacy.
- 📤📥 **Data Portability**: Easily import or export chat data for backup and migration.
- 📚 **Knowledge Spaces**: (Coming soon) Build custom knowledge bases for specialized topics.
- 📝 **Prompt Library**: Use predefined prompts to guide AI conversations efficiently.
- 👤 **Personalization**: A memory plugin enables more contextual and personalized responses.
- 📱 **Progressive Web App (PWA)**: Installable on various devices for a native-like app experience.

## Tech Stack

- 🌍 **Next.js**
- 🔤 **TypeScript**
- 🧩 **LangChain**
- 📦 **Zustand**
- 🔄 **React Query**
- 🗄️ **Supabase**
- 🎨 **Tailwind CSS**
- ✨ **Framer Motion**
- 🖌️ **Shadcn**
- 📝 **Tiptap**

## Roadmap

- 🎙️ **Speech-to-Text**: Coming soon.
- 📚 **Knowledge Spaces**: Coming soon.

## Quick Start

To get the project running locally:

### Prerequisites

- Ensure you have `yarn` or `bun` installed.

### Installation

1. Clone the repository:

   ```bash
   git clone https://github.com/your-repo/llmchat.git
   cd llmchat
   ```

2. Install dependencies:

   ```bash
   yarn install
   # or
   bun install
   ```

3. Start the development server:

   ```bash
   yarn dev
   # or
   bun dev
   ```

4. Open your browser and navigate to `http://localhost:3000`.

## Deployment

Instructions for deploying the project will be added soon.
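The data-portability feature described above (export/import of chat data) can be sketched as a simple JSON round trip. This is an illustrative sketch only: the `Chat`/`Message` shapes and the `exportChats`/`importChats` names are assumptions for the example, not LLMChat's actual types or API.

```typescript
// Hypothetical sketch of chat export/import via JSON serialization.
// Chat/Message shapes and function names are illustrative, not LLMChat's real API.

interface Message {
  role: "user" | "assistant";
  content: string;
}

interface Chat {
  id: string;
  title: string;
  messages: Message[];
}

// Serialize chats into a portable JSON string, with a version field
// so future importers can detect older export formats.
function exportChats(chats: Chat[]): string {
  return JSON.stringify({ version: 1, chats }, null, 2);
}

// Parse a previously exported payload back into chat objects,
// with a basic sanity check on the structure.
function importChats(payload: string): Chat[] {
  const data = JSON.parse(payload);
  if (!Array.isArray(data?.chats)) {
    throw new Error("Invalid export file: missing chats array");
  }
  return data.chats as Chat[];
}

const sample: Chat[] = [
  { id: "1", title: "Hello", messages: [{ role: "user", content: "Hi" }] },
];
const roundTripped = importChats(exportChats(sample));
console.log(roundTripped[0].title); // prints "Hello"
```

In the real app this payload would be written to and read from a downloaded file, with the chats themselves coming from the in-browser IndexedDB store.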