@push.rocks/smartai

Provides a standardized interface for integrating and conversing with multiple AI models, supporting operations like chat and potentially audio responses.

Install

To add @push.rocks/smartai to your project, run the following command in your terminal:

npm install @push.rocks/smartai

This command installs the package and adds it to your project's dependencies.

Usage

The @push.rocks/smartai package provides a single, standardized interface for interacting with multiple AI models, covering operations that range from chat interactions to, potentially, audio responses. This documentation walks you through using @push.rocks/smartai in your applications, with examples written in TypeScript and ESM syntax.

Getting Started

Before you begin, ensure you have installed the package in your project as described in the Install section above. Once installed, you can start integrating AI functionalities into your application.

Initializing SmartAi

The first step is to import and initialize the SmartAi class with appropriate options, including tokens for the AI services you plan to use:

import { SmartAi } from '@push.rocks/smartai';

const smartAi = new SmartAi({
  openaiToken: 'your-openai-access-token',
  anthropicToken: 'your-anthropic-access-token'
});

await smartAi.start();

Creating Conversations with AI

SmartAi provides a flexible interface for creating and managing conversations with different AI providers. Create a conversation with any supported provider, such as OpenAI or Anthropic, by passing the provider's name:

const openAiConversation = await smartAi.createConversation('openai');
const anthropicConversation = await smartAi.createConversation('anthropic');

Chatting with AI

Once you have a conversation instance, you can send messages to the AI and receive responses. Each conversation object lets you interact either by awaiting a complete response or by streaming output as it is generated, depending on your use case.

Synchronous Chat Example

Here's how to send a chat request to OpenAI and await the complete response:

const response = await openAiConversation.chat({
  systemMessage: 'This is a greeting from the system.',
  userMessage: 'Hello, AI! How are you today?',
  messageHistory: [] // Previous messages in the conversation
});

console.log(response.message); // Log the response from AI
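
To let the AI see earlier turns, pass them via messageHistory on the next call. Below is a minimal sketch of a follow-up request, assuming each history entry is an object with role and content fields (verify the exact shape against the package's type definitions):

const followUpResponse = await openAiConversation.chat({
  systemMessage: 'This is a greeting from the system.',
  userMessage: 'Can you elaborate on that?',
  messageHistory: [
    // Assumed entry shape { role, content } — check the package typings.
    { role: 'user', content: 'Hello, AI! How are you today?' },
    { role: 'assistant', content: response.message }
  ]
});

console.log(followUpResponse.message);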

Streaming Chat Example

For real-time, streaming interactions, you can utilize the streaming capabilities provided by the conversation object. This enables a continuous exchange of messages between your application and the AI:

const inputStreamWriter = openAiConversation.getInputStreamWriter();
const outputStream = openAiConversation.getOutputStream();

inputStreamWriter.write('Hello, AI! Can you stream responses?');

const reader = outputStream.getReader();
reader.read().then(function processText({done, value}) {
  if (done) {
    console.log('Stream finished.');
    return;
  }
  console.log('AI says:', value);
  reader.read().then(processText); // Continue reading messages
});
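
The same read loop can also be written with async/await, using only the standard Web Streams reader API shown above. This sketch assumes the output stream yields string chunks, as in the example:

async function logResponses(stream: ReadableStream<string>) {
  const streamReader = stream.getReader();
  while (true) {
    const { done, value } = await streamReader.read();
    if (done) {
      console.log('Stream finished.');
      return;
    }
    console.log('AI says:', value);
  }
}

await logResponses(outputStream);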

Extending Conversations

The modular design of @push.rocks/smartai allows you to extend conversations with additional features, such as handling audio responses or integrating other AI-powered functionality. You can build on the provider conversations to implement a wide range of AI interactions in your application, as sketched below.
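
For example, a small wrapper can carry the running message history across turns so that each call to chat() sees the previous exchanges. This is only a sketch built on the chat() call documented above; the history entry shape ({ role, content }) is an assumption to verify against the package's typings:

// Hypothetical helper: keeps a running history so each chat() call sees prior turns.
class ChatSession {
  private history: Array<{ role: 'user' | 'assistant'; content: string }> = [];

  constructor(
    private conversation: { chat: (options: any) => Promise<{ message: string }> },
    private systemMessage: string
  ) {}

  async ask(userMessage: string): Promise<string> {
    const result = await this.conversation.chat({
      systemMessage: this.systemMessage,
      userMessage,
      messageHistory: this.history // assumed entry shape: { role, content }
    });
    this.history.push({ role: 'user', content: userMessage });
    this.history.push({ role: 'assistant', content: result.message });
    return result.message;
  }
}

const session = new ChatSession(openAiConversation, 'You are a helpful assistant.');
console.log(await session.ask('Summarize what you can do.'));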

Conclusion

With @push.rocks/smartai, integrating AI functionalities into your applications becomes streamlined and efficient. By leveraging the standardized interface provided by the package, you can easily converse with multiple AI models, expanding the capabilities of your applications with cutting-edge AI features. Whether you're implementing simple chat interactions or complex, real-time communication flows, @push.rocks/smartai offers the tools and flexibility needed to create engaging, AI-enhanced experiences.

This repository contains open-source code that is licensed under the MIT License. A copy of the MIT License can be found in the license file within this repository.

Please note: The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.

Trademarks

This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH and are not included within the scope of the MIT license granted herein. Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines, and any usage must be approved in writing by Task Venture Capital GmbH.

Company Information

Task Venture Capital GmbH
Registered at the District Court of Bremen, HRB 35230 HB, Germany

For any legal inquiries or if you require further information, please contact us via email at hello@task.vc.

By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.