# Embed the AI Chat Widget
Add a self-hosted AI chat widget to any website with a single script tag. Fully customizable and privacy-first.
## 1. Quick Start
Paste this script tag just before the closing `</body>` tag on your website:

```html
<script
  src="https://laravelgpt.com/widget.js"
  data-chat-server="https://ai.izdrail.com"
  data-model="mistral:7b"
  data-title="AI Assistant"
  data-intro-message="Hello! How can I help you today?"
  data-dark-mode="true"
></script>
```
## 2. Configuration Options
| Attribute | Type | Default | Description |
|---|---|---|---|
| `data-chat-server` | string | `https://ai.izdrail.com` | Your Ollama API endpoint |
| `data-model` | string | `mistral:7b` | Default LLM model |
| `data-title` | string | `AI Assistant` | Chat header title |
| `data-intro-message` | string | Welcome message | First message shown to visitors |
| `data-dark-mode` | boolean | `true` | Enable dark theme |
| `data-vrm-model-url` | string | – | URL to a custom `.vrm` 3D model |
| `data-position` | string | `bottom-right` | Widget position: `bottom-right` or `bottom-left` |
| `data-primary-color` | string | `#7C3AED` | Primary gradient color (hex) |
| `data-secondary-color` | string | `#EC4899` | Secondary gradient color (hex) |
| `data-trigger-size` | number | `60` | Trigger button size in pixels |
| `data-proxy-path` | string | – | Reverse proxy URL (avoids CORS entirely); requests go to `{proxyPath}/api/generate` |
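For instance, several of the optional attributes above can be combined in one embed. The values below are illustrative, not defaults:

```html
<!-- Illustrative embed: pins the widget bottom-left with custom colors.
     Attribute names come from the table above; the values are examples. -->
<script
  src="https://laravelgpt.com/widget.js"
  data-position="bottom-left"
  data-primary-color="#0EA5E9"
  data-secondary-color="#22C55E"
  data-trigger-size="72"
  data-dark-mode="false"
></script>
```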
## 3. Advanced: URL Configuration
You can also pass configuration via the `conf` URL parameter (URL-encoded JSON) for dynamic setups:

```html
<script src="https://laravelgpt.com/widget.js?conf=%7B%22chatServer%22%3A%22https%3A%2F%2Fai.izdrail.com%22%2C%22model%22%3A%22mistral%3A7b%22%7D"></script>
```
URL config takes priority over data attributes.
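Rather than hand-encoding, the `conf` value can be generated from a plain object. A minimal sketch — the `chatServer` and `model` keys mirror the encoded example above; whether other options follow the same camelCase naming is an assumption:

```javascript
// Build the `conf` URL parameter from a plain config object.
// Keys mirror the encoded example above (chatServer, model).
const config = {
  chatServer: "https://ai.izdrail.com",
  model: "mistral:7b",
};

// JSON-encode, then percent-encode for safe use in a query string.
const conf = encodeURIComponent(JSON.stringify(config));
const src = `https://laravelgpt.com/widget.js?conf=${conf}`;

console.log(src);
// → "https://laravelgpt.com/widget.js?conf=%7B%22chatServer%22%3A%22https%3A%2F%2Fai.izdrail.com%22%2C%22model%22%3A%22mistral%3A7b%22%7D"
```

This produces exactly the URL shown in the snippet above, which is easier to maintain than editing percent-encoded JSON by hand.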
## CORS Troubleshooting
When embedding the widget on a different domain than your Ollama server,
the browser blocks cross-origin requests unless the server explicitly allows them.
### Solution 1: Enable CORS on Ollama (easiest)
Start Ollama with permissive CORS:

```shell
OLLAMA_ORIGINS=* ollama serve
```

Or set the `OLLAMA_ORIGINS` environment variable in your systemd/supervisor config; see the Ollama documentation for details.
### Solution 2: Reverse proxy on your domain (recommended for production)
Set up a reverse proxy on your own domain so the widget fetches from the same origin (no CORS needed).
Example with nginx:

```nginx
# /etc/nginx/sites-enabled/your-site.com
location /ollama-proxy/ {
    proxy_pass http://127.0.0.1:11434/;
    proxy_set_header Host $host;
}
```
Then configure the widget with `data-proxy-path`:

```html
<script
  src="https://laravelgpt.com/widget.js"
  data-proxy-path="https://your-site.com/ollama-proxy"
></script>
```
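The resulting request URL can be sketched as follows. `resolveEndpoint` is a hypothetical helper, not the widget's actual internals, but it follows the `{proxyPath}/api/generate` rule from the options table:

```javascript
// Hypothetical helper (not the widget's real code): when data-proxy-path
// is set it takes precedence over data-chat-server, and "/api/generate"
// is appended, per the {proxyPath}/api/generate rule in the options table.
function resolveEndpoint({ proxyPath, chatServer }) {
  const base = (proxyPath ?? chatServer).replace(/\/$/, ""); // drop trailing slash
  return `${base}/api/generate`;
}

console.log(resolveEndpoint({ proxyPath: "https://your-site.com/ollama-proxy" }));
// → "https://your-site.com/ollama-proxy/api/generate"
```

Because that URL is on your own domain, the browser treats the request as same-origin and no CORS headers are required.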
### Solution 3: Local development
During development, the Astro dev server proxies `/ollama-api/*` to `https://ai.izdrail.com/api/*` automatically. Just run `npm run dev` and the widget works out of the box.