Run Local LLMs on Ubuntu with Ollama and Open WebUI (GPU Acceleration, API, and Troubleshooting)

October 27, 2025

Tags: AI inference, Docker, Edge AI, Local LLM, NVIDIA GPU, Ollama, Open WebUI, Ubuntu