feat(core): add support for Exo AI provider

2025-02-08 12:08:14 +01:00
parent d93b198b09
commit f89888a542
6 changed files with 249 additions and 4 deletions


@@ -110,6 +110,19 @@ SmartAi supports multiple AI providers. Configure each provider with its corresp
}
```
### Exo
- **Models:** Configurable (supports LLaMA, Mistral, LLaVA, Qwen, and DeepSeek)
- **Features:** Chat, Streaming
- **Configuration Example:**
```typescript
exo: {
  baseUrl: 'http://localhost:8080/v1', // Optional
  apiKey: 'your-api-key' // Optional for local deployments
}
```
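Exo exposes an OpenAI-compatible HTTP API, so you can verify that the endpoint configured above is reachable before wiring it into SmartAi. The sketch below is a minimal connectivity check; the `/v1/chat/completions` route follows the OpenAI convention, and the model name is only an example for whatever your local Exo instance serves:

```typescript
// Minimal connectivity check against a local Exo endpoint (assumed OpenAI-compatible).
// The model name is an example; use whichever model your Exo deployment serves.
const response = await fetch('http://localhost:8080/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // Only needed if your deployment requires an API key.
    Authorization: 'Bearer your-api-key',
  },
  body: JSON.stringify({
    model: 'llama-3.1-8b',
    messages: [{ role: 'user', content: 'Hello from SmartAi!' }],
  }),
});
const data = await response.json();
console.log(data.choices[0].message.content);
```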
## Quick Start
Initialize SmartAi with the provider configurations you plan to use:
@@ -126,6 +139,10 @@ const smartAi = new SmartAi({
  ollama: {
    baseUrl: 'http://localhost:11434',
    model: 'llama2'
  },
  exo: {
    baseUrl: 'http://localhost:8080/v1',
    apiKey: 'your-api-key'
  }
});
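Once initialized, requests can be routed to the Exo provider just like any other configured provider. The exact call surface depends on your SmartAi version; the `exoProvider` accessor and the `chat()` option names below mirror the pattern used by the other providers and are assumptions, not the confirmed API:

```typescript
// Hypothetical usage sketch: `exoProvider`, the chat() options, and the shape of
// the response are assumptions for illustration — consult the SmartAi docs for
// the exact interface in your installed version.
const reply = await smartAi.exoProvider.chat({
  systemMessage: 'You are a helpful assistant.',
  userMessage: 'Summarize what Exo does in one sentence.',
  messageHistory: [],
});
console.log(reply.message);
```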