The systemd service that monitors and keeps a node online

# @serve.zone/spark

sparks the servezone services

## Install

To install @serve.zone/spark, run the following command in your terminal:

```bash
npm install @serve.zone/spark --save
```

## Usage

### Getting Started

To use @serve.zone/spark in your project, import and initialize it in your TypeScript code. Ensure you have TypeScript and the necessary build tools set up in your project.

First, import @serve.zone/spark:

```typescript
import { Spark } from '@serve.zone/spark';
```

### Initializing Spark

Create an instance of the Spark class to start using Spark. This instance will serve as the main entry point for interacting with the Spark functionalities.

```typescript
const sparkInstance = new Spark();
```

### Running Spark as a Daemon

To run Spark as a daemon, which is useful for maintaining and configuring servers at the base OS level, use the CLI bundled with Spark. This is typically done from a command-line terminal, but it can also be automated from your Node.js scripts if required.

```bash
spark installdaemon
```

The command above sets up Spark as a system service, enabling it to run and maintain server configurations automatically.
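If you prefer to trigger the installation from code, as noted above, a minimal sketch using Node's `child_process` might look like this (it assumes the `spark` CLI is on the `PATH` and that the script runs with the privileges needed to register a systemd service):

```typescript
import { execFileSync } from 'node:child_process';

// The documented subcommand for installing the daemon.
function buildInstallArgs(): string[] {
  return ['installdaemon'];
}

// Runs `spark installdaemon`; requires the spark CLI on the PATH and
// elevated privileges, since it registers a system service.
function installSparkDaemon(): void {
  execFileSync('spark', buildInstallArgs(), { stdio: 'inherit' });
}
```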

### Updating Spark or Maintained Services

Spark can self-update and manage updates for its maintained services. Trigger an update check and process by calling the updateServices method on the Spark instance.

```typescript
await sparkInstance.sparkUpdateManager.updateServices();
```
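To run such update checks periodically rather than once, the call can be wrapped in a small scheduler. A sketch with the updater injected as a callback (the hourly interval is an arbitrary choice for illustration, not a Spark default):

```typescript
type Updater = () => Promise<void>;

// Invoke `update` every `intervalMs` milliseconds, logging failures
// instead of letting a rejected promise crash the process.
function scheduleUpdates(update: Updater, intervalMs: number): ReturnType<typeof setInterval> {
  return setInterval(() => {
    update().catch((err) => console.error('update failed:', err));
  }, intervalMs);
}

// Wiring it to Spark (assuming `sparkInstance` from `new Spark()`):
// scheduleUpdates(() => sparkInstance.sparkUpdateManager.updateServices(), 60 * 60 * 1000);
```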

### Managing Configuration and Logging

Spark allows for extensive configuration and logging customization. Use the SparkLocalConfig and logging features to tailor Spark's operation to your needs.

```typescript
import { logger } from '@serve.zone/spark';

// Accessing the local configuration
const localConfig = sparkInstance.sparkLocalConfig;

// Utilizing the logger for custom log messages
logger.log('info', 'Custom log message');
```

## Advanced Usage

@serve.zone/spark offers a suite of tools for detailed server and service management, including but not limited to task scheduling, daemon management, and service updates. Explore the SparkTaskManager for scheduling specific tasks, SparkUpdateManager for handling service updates, and SparkLocalConfig for configuration.

### Example: Scheduling Custom Tasks

```typescript
import { Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();
const myTask = {
  name: 'customTask',
  taskFunction: async () => {
    console.log('Running custom task');
  },
};

sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(myTask, '* * * * * *');
```

The example above creates a simple task that logs a message every second, demonstrating how to use Spark's task manager for custom scheduled tasks.
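The schedule string is a six-field cron expression, where the extra leading field selects seconds. The standalone matcher below is only an illustration of that format (it supports `*` and comma lists, not ranges or step values) and is not Spark's actual scheduler:

```typescript
// Match one cron field against a value; "*" matches anything,
// otherwise the field is a comma list of numbers.
function fieldMatches(field: string, value: number): boolean {
  if (field === '*') return true;
  return field.split(',').some((part) => Number(part) === value);
}

// Check a six-field cron expression (sec min hour day-of-month month day-of-week)
// against a Date; "* * * * * *" therefore matches every second.
function cronMatches(expr: string, d: Date): boolean {
  const [sec, min, hour, dom, month, dow] = expr.split(' ');
  return (
    fieldMatches(sec, d.getSeconds()) &&
    fieldMatches(min, d.getMinutes()) &&
    fieldMatches(hour, d.getHours()) &&
    fieldMatches(dom, d.getDate()) &&
    fieldMatches(month, d.getMonth() + 1) &&
    fieldMatches(dow, d.getDay())
  );
}
```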

### Advanced Configuration

For advanced configurations, including Docker and service management:

- Use SparkUpdateManager to handle Docker image updates, service creation, and management.
- Access and modify Docker and service configurations through Spark's integration with configuration files and environment variables.

```typescript
// Managing Docker services with Spark
await sparkInstance.sparkUpdateManager.dockerHost.someDockerMethod();

// Example: creating a Docker service
const newServiceDefinition = {...};
await sparkInstance.sparkUpdateManager.createService(newServiceDefinition);
```
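One common way to feed configuration into such a setup is through environment variables read at startup. A minimal sketch is below; the variable names (`SPARK_DOCKER_HOST`, `SPARK_UPDATE_INTERVAL`) are purely illustrative and are not settings that Spark actually defines:

```typescript
type Env = Record<string, string | undefined>;

// Read illustrative settings with fallbacks; every SPARK_* name here
// is an assumption for this sketch, not a documented Spark variable.
function readEnvConfig(env: Env = process.env) {
  return {
    dockerHost: env.SPARK_DOCKER_HOST ?? 'unix:///var/run/docker.sock',
    updateIntervalMinutes: Number(env.SPARK_UPDATE_INTERVAL ?? '60'),
  };
}
```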

## Conclusion

@serve.zone/spark provides a comprehensive toolkit for orchestrating and managing server environments and Docker-based services. By leveraging its CLI and programmatic interfaces, you can automate and streamline server operations, configurations, updates, and task scheduling, ensuring your infrastructure is responsive, updated, and maintained efficiently.