feat(core): add support for Exo AI provider
readme.md
@@ -110,6 +110,19 @@ SmartAi supports multiple AI providers. Configure each provider with its corresp
}
```

### Exo

- **Models:** Configurable (supports LLaMA, Mistral, LLaVA, Qwen, and DeepSeek)
- **Features:** Chat, Streaming
- **Configuration Example:**

```typescript
exo: {
  baseUrl: 'http://localhost:8080/v1', // Optional
  apiKey: 'your-api-key' // Optional for local deployments
}
```

## Quick Start

Initialize SmartAi with the provider configurations you plan to use:
@@ -126,6 +139,10 @@ const smartAi = new SmartAi({
  ollama: {
    baseUrl: 'http://localhost:11434',
    model: 'llama2'
  },
  exo: {
    baseUrl: 'http://localhost:8080/v1',
    apiKey: 'your-api-key'
  }
});
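The Exo options introduced by this commit can be sketched as a standalone shape. This is a minimal illustration only: the `ExoOptions` interface name is an assumption for this sketch and is not the package's actual exported type; the field names and values mirror the readme diff above.

```typescript
// Illustrative sketch of the Exo provider options added in this commit.
// NOTE: `ExoOptions` is a hypothetical name for this example, not the
// library's real exported type; fields mirror the readme diff.
interface ExoOptions {
  baseUrl?: string; // optional, e.g. a local Exo cluster's OpenAI-compatible endpoint
  apiKey?: string;  // optional for local deployments
}

const exoOptions: ExoOptions = {
  baseUrl: 'http://localhost:8080/v1',
  apiKey: 'your-api-key',
};

console.log(exoOptions.baseUrl);
```

Both fields being optional matches the diff's comments: a local Exo deployment may need neither a custom URL nor an API key.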