fix(core): update

parent f628a71184
commit 3a5f2d52e5

@@ -5,21 +5,20 @@
     "githost": "code.foss.global",
     "gitscope": "push.rocks",
     "gitrepo": "smartai",
-    "description": "Provides a standardized interface for integrating and conversing with multiple AI models, supporting operations like chat and potentially audio responses.",
+    "description": "A TypeScript library for integrating and interacting with multiple AI models, offering capabilities for chat and potentially audio responses.",
     "npmPackagename": "@push.rocks/smartai",
     "license": "MIT",
     "projectDomain": "push.rocks",
     "keywords": [
-      "AI models integration",
-      "OpenAI GPT",
-      "Anthropic AI",
-      "text-to-speech",
-      "conversation stream",
+      "AI integration",
+      "chatbot",
       "TypeScript",
-      "ESM",
-      "streaming API",
-      "modular design",
-      "development tool"
+      "OpenAI",
+      "Anthropic",
+      "multi-model support",
+      "audio responses",
+      "text-to-speech",
+      "streaming chat"
     ]
   }
 },

 package.json | 21

@@ -2,7 +2,7 @@
   "name": "@push.rocks/smartai",
   "version": "0.0.12",
   "private": false,
-  "description": "Provides a standardized interface for integrating and conversing with multiple AI models, supporting operations like chat and potentially audio responses.",
+  "description": "A TypeScript library for integrating and interacting with multiple AI models, offering capabilities for chat and potentially audio responses.",
   "main": "dist_ts/index.js",
   "typings": "dist_ts/index.d.ts",
   "type": "module",
@@ -57,15 +57,14 @@
     "readme.md"
   ],
   "keywords": [
-    "AI models integration",
-    "OpenAI GPT",
-    "Anthropic AI",
-    "text-to-speech",
-    "conversation stream",
+    "AI integration",
+    "chatbot",
     "TypeScript",
-    "ESM",
-    "streaming API",
-    "modular design",
-    "development tool"
+    "OpenAI",
+    "Anthropic",
+    "multi-model support",
+    "audio responses",
+    "text-to-speech",
+    "streaming chat"
   ]
 }

 readme.md | 95

@@ -14,99 +14,82 @@ This command installs the package and adds it to your project's dependencies.
 
 ## Usage
 
-The usage section delves into how to leverage the `@push.rocks/smartai` package to interact with AI models in an application. This package simplifies the integration and conversation with AI models by providing a standardized interface. The examples below demonstrate the package's capabilities in engaging with AI models for chat operations and potentially handling audio responses using TypeScript and ESM syntax.
+The `@push.rocks/smartai` package is a comprehensive solution for integrating and interacting with various AI models, designed to support operations ranging from chat interactions to possibly handling audio responses. This documentation will guide you through the process of utilizing `@push.rocks/smartai` in your applications, focusing on TypeScript and ESM syntax to demonstrate its full capabilities.
 
-### Integrating AI Models
+### Getting Started
 
-#### Importing the Module
+Before you begin, ensure you have installed the package in your project as described in the **Install** section above. Once installed, you can start integrating AI functionalities into your application.
 
-Start by importing `SmartAi` and the AI providers you wish to use from `@push.rocks/smartai`.
+### Initializing SmartAi
+
+The first step is to import and initialize the `SmartAi` class with appropriate options, including tokens for the AI services you plan to use:
 
 ```typescript
-import { SmartAi, OpenAiProvider, AnthropicProvider } from '@push.rocks/smartai';
-```
+import { SmartAi } from '@push.rocks/smartai';
 
-#### Initializing `SmartAi`
-
-Create an instance of `SmartAi` with the necessary credentials for accessing the AI services.
-
-```typescript
 const smartAi = new SmartAi({
   openaiToken: 'your-openai-access-token',
   anthropicToken: 'your-anthropic-access-token'
 });
+
+await smartAi.start();
 ```
 
-### Chatting with the AI
+### Creating Conversations with AI
 
-#### Creating a Conversation
+`SmartAi` provides a flexible interface to create and manage conversations with different AI providers. You can create a conversation with any supported AI provider like OpenAI or Anthropic by specifying the provider you want to use:
 
-To begin a conversation, choose the AI provider you'd like to use. For instance, to use OpenAI:
-
 ```typescript
-async function createOpenAiConversation() {
-  const conversation = await smartAi.createOpenApiConversation();
-  // Use the conversation for chatting
-}
+const openAiConversation = await smartAi.createConversation('openai');
+const anthropicConversation = await smartAi.createConversation('anthropic');
 ```
 
-Similarly, for an Anthropic AI conversation:
+### Chatting with AI
 
+Once you have a conversation instance, you can start sending messages to the AI and receive responses. Each conversation object provides methods to interact in a synchronous or asynchronous manner, depending on your use case.
+
+#### Synchronous Chat Example
+
+Here's how you can have a synchronous chat with OpenAI:
+
 ```typescript
-async function createAnthropicConversation() {
-  const conversation = await smartAi.createAnthropicConversation();
-  // Use the conversation for chatting
-}
+const response = await openAiConversation.chat({
+  systemMessage: 'This is a greeting from the system.',
+  userMessage: 'Hello, AI! How are you today?',
+  messageHistory: [] // Previous messages in the conversation
+});
+
+console.log(response.message); // Log the response from AI
 ```
 
-### Streaming Chat with OpenAI
+#### Streaming Chat Example
 
-For more advanced scenarios, like a streaming chat with OpenAI, you would interact with the chat stream directly:
+For real-time, streaming interactions, you can utilize the streaming capabilities provided by the conversation object. This enables a continuous exchange of messages between your application and the AI:
 
 ```typescript
-// Assuming a conversation has been created and initialized...
-const inputStreamWriter = conversation.getInputStreamWriter();
-const outputStream = conversation.getOutputStream();
+const inputStreamWriter = openAiConversation.getInputStreamWriter();
+const outputStream = openAiConversation.getOutputStream();
 
-// Write a message to the input stream for the AI to process
-await inputStreamWriter.write('Hello, how can I help you today?');
+inputStreamWriter.write('Hello, AI! Can you stream responses?');
 
-// Listen to the output stream for responses from AI
 const reader = outputStream.getReader();
-reader.read().then(function processText({ done, value }) {
+reader.read().then(function processText({done, value}) {
   if (done) {
-    console.log("No more messages from AI");
+    console.log('Stream finished.');
     return;
   }
-  console.log("AI says:", value);
-  // Continue reading messages
-  reader.read().then(processText);
+  console.log('AI says:', value);
+  reader.read().then(processText); // Continue reading messages
 });
 ```
 
-### Handling Audio Responses
+### Extending Conversations
 
-The package may also support converting text responses from the AI into audio. While specific implementation details depend on the AI provider's capabilities, a generic approach would involve creating a text-to-speech instance and utilizing it:
+The modular design of `@push.rocks/smartai` allows you to extend conversations with additional features, such as handling audio responses or integrating other AI-powered functionalities. Utilize the provided AI providers' APIs to explore and implement a wide range of AI interactions within your conversations.
 
-```typescript
-// This is a hypothetical function call as the implementation might vary
-const tts = await TTS.createWithOpenAi(smartAi);
-
-// The TTS instance would then be used to convert text to speech
-```
-
-### Extensive Feature Set
-
-`@push.rocks/smartai` provides comprehensive support for interacting with various AI models, not limited to text chat. It encompasses audio responses, potentially incorporating AI-powered analyses, and other multi-modal interactions.
-
-Refer to the specific AI providers’ documentation through `@push.rocks/smartai`, such as OpenAI and Anthropic, for detailed guidance on utilizing the full spectrum of capabilities, including the implementation of custom conversation flows, handling streaming data efficiently, and generating audio responses from AI conversations.
-
 ### Conclusion
 
-Equipped with `@push.rocks/smartai`, developers can streamline the integration of sophisticated AI interactions into their applications. The package facilitates robust communication with AI models, supporting diverse operations from simple chats to complex audio feedback mechanisms, all within a unified, easy-to-use interface.
+With `@push.rocks/smartai`, integrating AI functionalities into your applications becomes streamlined and efficient. By leveraging the standardized interface provided by the package, you can easily converse with multiple AI models, expanding the capabilities of your applications with cutting-edge AI features. Whether you're implementing simple chat interactions or complex, real-time communication flows, `@push.rocks/smartai` offers the tools and flexibility needed to create engaging, AI-enhanced experiences.
 
-Explore the package more to uncover its full potential in creating engaging, AI-enhanced interactions in your applications.
-
-
 ## License and Legal Information
 
@@ -3,6 +3,6 @@
 */
 export const commitinfo = {
   name: '@push.rocks/smartai',
-  version: '0.0.12',
-  description: 'Provides a standardized interface for integrating and conversing with multiple AI models, supporting operations like chat and potentially audio responses.'
+  version: '0.0.13',
+  description: 'A TypeScript library for integrating and interacting with multiple AI models, offering capabilities for chat and potentially audio responses.'
 }