# R Shiny Interface for Chatting with LLMs Offline via Ollama

Experience seamless, private, and offline AI conversations right on your machine! `shiny.ollama` provides a user-friendly R Shiny interface to interact with LLMs locally, powered by Ollama.
## ⚠️ Disclaimer

**Important:** `shiny.ollama` requires Ollama to be installed on your system; without it, this package will not function. Install Ollama first (see https://ollama.com), then install the package as described below.
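Because the package depends on a local Ollama installation, a quick shell check can confirm it is available before you launch the app (this assumes the `ollama` CLI is on your `PATH` when installed):

```shell
# Check whether the ollama CLI is available (assumes it is on PATH if installed)
if command -v ollama >/dev/null 2>&1; then
  ollama_status="installed"
else
  ollama_status="missing"
fi
echo "Ollama status: $ollama_status"
```

If the status is `missing`, install Ollama from its official site and re-run the check.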
## 📦 Installation

### From CRAN (Recommended)

```r
install.packages("shiny.ollama")
```

### From GitHub (Latest Development Version)

```r
# Install devtools if not already installed
install.packages("devtools")
devtools::install_github("ineelhere/shiny.ollama")
```
## ✨ Features
- 🔒 Fully Offline: No internet required – complete privacy
- 🎛 Model Selection: Easily choose from available LLM models
- 💬 Message Input: Engage in conversations with AI
- 💾 Save & Download Chats: Export your chat history
- 🖥 User-Friendly Interface: Powered by R Shiny
## 🚀 Quick Start
Launch the Shiny app in R with:
```r
library(shiny.ollama)

# Start the application
shiny.ollama::run_app()
```
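For context, the app talks to Ollama's local HTTP API, which listens on port 11434 by default. The sketch below shows the shape of a request payload for Ollama's `/api/generate` endpoint; the model name `llama3` is only an example — substitute any model you have pulled locally:

```shell
# JSON payload for Ollama's /api/generate endpoint (model name is an example)
payload='{"model": "llama3", "prompt": "Hello", "stream": false}'
echo "$payload"

# With Ollama running locally, the payload could be sent like this:
#   curl http://localhost:11434/api/generate -d "$payload"
```

The chat interface handles these requests for you; a direct call like this is only useful for troubleshooting whether the Ollama server is responding.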