
R Shiny Interface for Chatting with LLMs Offline via Ollama

Experience seamless, private, and offline AI conversations right on your machine! shiny.ollama provides a user-friendly R Shiny interface to interact with LLMs locally, powered by Ollama.


⚠️ Disclaimer

Important: shiny.ollama requires Ollama to be installed on your system. Without it, this package will not function. Follow the Installation Guide below to set up Ollama first.

📦 Installation

From CRAN (Stable Version)

install.packages("shiny.ollama")

From GitHub (Latest Development Version)

# Install devtools if not already installed
install.packages("devtools")

devtools::install_github("ineelhere/shiny.ollama")
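
To confirm the package installed correctly, check its version from R (packageVersion() is base R):

# Prints the installed version of shiny.ollama
packageVersion("shiny.ollama")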

✨ Features

  • 🔒 Fully Offline: No internet required – complete privacy
  • 🎛 Model Selection: Easily choose from your locally installed Ollama models
  • 💬 Message Input: Engage in conversations with AI
  • 💾 Save & Download Chats: Export your chat history
  • 🖥 User-Friendly Interface: Powered by R Shiny

🚀 Quick Start

Launch the Shiny app in R with:

library(shiny.ollama)

# Start the application
shiny.ollama::run_app()
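
If the app launches but no models appear, the Ollama server may not be running. As a quick check from R, you can ping the server's default address (a minimal sketch; http://localhost:11434 is Ollama's default endpoint and replies with a short status message):

# Returns "Ollama is running" when the local server is up (default port 11434)
tryCatch(
  readLines("http://localhost:11434", warn = FALSE),
  error = function(e) message("Ollama server not reachable - is it running?")
)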

📥 How to Install Ollama

To use this package, install Ollama first:

  1. 🔗 Download Ollama from https://ollama.com (macOS, Windows, and Linux are supported).

  2. 🛠 Install it by following the provided instructions.

  3. ✅ Verify your installation:

    ollama --version

    If successful, the version number will be displayed.

  4. 📌 Pull a model (e.g., Llama 3.3) to get started, as shown below.
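
    For example, to download Llama 3.3 (llama3.3 is one of many tags in the Ollama model library; any listed tag works):

    ollama pull llama3.3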

📄 License

This project is licensed under the Apache License 2.0.

💡 Contributions, feedback, and feature requests are always welcome! Stay tuned for more updates. 🚀