Ollama AI NPC - Local LLM Integration for ServUO 57+ (v1.1)

Author: Unstable Kitsune
Requirements
Ollama installed locally (or access to any OpenAI-compatible API endpoint)
Transform your ServUO world with AI-powered NPCs that actually understand and respond to player questions naturally. The system integrates with your local Ollama instance (or any OpenAI-compatible API) to provide intelligent, context-aware conversations, and responses are handled asynchronously so your server never blocks.


FEATURES​

  • Fully configurable via in-game [props command
  • Supports ANY Ollama model (llama3.2, mistral, qwen, etc.)
  • Customizable system prompts via external text file
  • Configurable endpoint and API key support
  • Async responses - NO server lag or blocking
  • Proper serialization for world saves
  • Compatible with remote OpenAI-compatible APIs
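Under the hood, "any OpenAI-compatible API" means the NPC script sends an ordinary chat-completions request with a system prompt and the player's question. A minimal sketch of that request in Python (the endpoint URL, model name, and prompt text below are illustrative defaults, not values taken from the script itself):

```python
import json
import urllib.request

# Illustrative defaults; in the actual script these are configurable
# via the in-game [props command and an external prompt file.
ENDPOINT = "http://localhost:11434/v1/chat/completions"  # Ollama's OpenAI-compatible route
MODEL = "llama3.2"  # any model pulled into Ollama works

def build_request(player_question: str, system_prompt: str) -> urllib.request.Request:
    """Build the chat-completions request an AI NPC would send."""
    payload = {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": player_question},
        ],
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Where can I buy reagents?", "You are a town healer in Britain.")
print(req.get_full_url())
```

Swapping in a remote OpenAI-compatible service is just a matter of changing the endpoint and adding an `Authorization: Bearer <key>` header, which is what the configurable endpoint/API-key support refers to.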

CREDITS & LICENSE​

Original Concept: Unstable Kitsune (Kitsunic Realms)
Generic Version: Released for the ServUO Community

LICENSE: Free to use and modify
Attribution appreciated but not required
Share your improvements with the community!

HAPPY ADVENTURING!
May your NPCs be wise and your servers stable.

Latest Updates

  1. OllamaNPC v1.1 - Memory System Integration

    How to update: drag and drop/overwrite your current OllamaNPC.cs, OR delete your current...

