A comprehensive tool for maintaining and configuring servers, with Docker integration and advanced task scheduling, targeted at the serve.zone infrastructure. It is primarily designed to be driven by @serve.zone/cloudly as a cluster-node system manager, maintaining and configuring servers at the base OS level.
### Getting Started

To use `@serve.zone/spark` in your project, install it and instantiate it in your TypeScript code. Ensure TypeScript and the necessary build tooling are set up in your project.
Create an instance of the `Spark` class to start using Spark. This instance will serve as the main entry point for interacting with Spark functionalities.
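For example:

```typescript
import { Spark } from '@serve.zone/spark';

// The instance is the entry point for all further Spark operations.
const sparkInstance = new Spark();
```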
### Running Spark as a Daemon

To run Spark as a daemon, which is useful for maintaining and configuring servers at the OS level, use the CLI bundled with Spark. This is ideally done from a command-line terminal, outside of your code, but it can also be automated from your Node.js scripts if required.
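Assuming the spark CLI is installed globally, installing the daemon is typically a single command (the exact subcommand may vary between versions; consult the CLI's help output):

```bash
spark installdaemon
```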
The command above sets up Spark as a system service, enabling it to run and maintain server configurations automatically.
### Updating Spark or Maintained Services
Spark can self-update and manage updates for its maintained services. Trigger an update check and process by calling the `updateServices` method on the Spark instance.
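For example, from within your own code (a minimal sketch; `updateServices` is assumed here to return a promise):

```typescript
import { Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();

// Check for and apply updates to Spark itself and its maintained services.
await sparkInstance.updateServices();
```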
Spark allows extensive configuration and logging customization. Use the `SparkLocalConfig` and logging features to tailor Spark's operation to your needs.
### Advanced Usage

`@serve.zone/spark` offers tools for detailed server and service management, including but not limited to task scheduling, daemon management, and service updates. Explore the `SparkTaskManager` for scheduling specific tasks, the `SparkUpdateManager` for handling service updates, and `SparkLocalConfig` for configuration.
The example below illustrates how the Spark daemon can be initiated using the `SmartDaemon` class from `@push.rocks/smartdaemon`.
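This is a minimal sketch, assuming smartdaemon's `addService`/`save`/`enable`/`start` service API; the option values are illustrative, not Spark's actual service definition:

```typescript
import { SmartDaemon } from '@push.rocks/smartdaemon';

const smartDaemon = new SmartDaemon();

// Illustrative service definition; spark registers something similar
// when installing itself as a daemon.
const sparkService = await smartDaemon.addService({
  name: 'spark',
  description: 'spark daemon maintaining this node',
  command: 'spark asdaemon',
  workingDir: '/opt/spark',
  version: '1.0.0',
});

await sparkService.save();   // write the service unit
await sparkService.enable(); // enable start on boot
await sparkService.start();  // start immediately
```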
### Configuration Management
Extensive configuration management is possible through the `SparkLocalConfig` and other configuration classes. This feature allows you to make your application's behavior adaptable based on different environments and requirements.
```typescript
// Example of setting up a local config
import { Spark, SparkLocalConfig } from '@serve.zone/spark';

const sparkInstance = new Spark();
const localConfig = new SparkLocalConfig(sparkInstance);
```
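How values are persisted depends on the version in use; the following sketch assumes `SparkLocalConfig` exposes a key-value store as `kvStore` (backed by `@push.rocks/npmextra`) with `writeKey`/`readKey` methods:

```typescript
// Assumed API: persist and read back an environment-specific setting.
await localConfig.kvStore.writeKey('environment', 'production');
const environment = await localConfig.kvStore.readKey('environment');
```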
### Logging

Logging is a crucial part of any automation tool, and `@serve.zone/spark` provides rich logging through its built-in logging library.
```typescript
import { logger, Spark } from '@serve.zone/spark';
const sparkInstance = new Spark();
logger.log('info', 'Spark instance created.');
// Logging at various severity levels
logger.log('debug', 'This is a debug message');
logger.log('warn', 'This is a warning message');
logger.log('error', 'This is an error message');
logger.log('ok', 'This is a success message');
```
### Real-World Scenarios
#### Automated System Update and Restart
In real-world scenarios, you might want to automate system updates and reboots to ensure your services are running the latest security patches and features.
```typescript
import { Spark } from '@serve.zone/spark';
import { SmartShell } from '@push.rocks/smartshell';

const sparkInstance = new Spark();
const shell = new SmartShell({ executor: 'bash' });
```
Building on this setup, the sketch below creates and schedules a task that updates and restarts the server every Sunday at 3 AM using Spark's task management capabilities.
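Continuing from the setup above, it assumes the Spark instance exposes a `taskbuffer` TaskManager as `sparkTaskManager.taskmanager` and that `addAndScheduleTask` accepts a six-field cron string; verify both names against the current API.

```typescript
import { Task } from '@push.rocks/taskbuffer';

// Task that updates OS packages and reboots (runs with the daemon's privileges).
const updateTask = new Task({
  name: 'weeklySystemUpdate',
  taskFunction: async () => {
    await shell.exec('apt-get update && apt-get upgrade -y');
    await shell.exec('reboot');
  },
});

// Assumed accessor and call: '0 0 3 * * 0' = every Sunday at 03:00.
sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(updateTask, '0 0 3 * * 0');
```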
#### Integrating with Docker for Service Deployment
Spark's tight integration with Docker makes it an excellent tool for deploying containerized applications across your infrastructure.
```typescript
import { Spark } from '@serve.zone/spark';
import { DockerHost } from '@apiclient.xyz/docker';
const sparkInstance = new Spark();
const dockerHost = new DockerHost(); // connects to the local Docker daemon
```
With the Docker connection in place, an image can be pulled and deployed as a new service in your infrastructure using Spark's Docker integration.
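The sketch below assumes static helpers along the lines of `DockerImage.createFromRegistry` and `DockerService.createService`; verify the exact names and option shapes against the `@apiclient.xyz/docker` documentation.

```typescript
import { DockerHost, DockerImage, DockerService } from '@apiclient.xyz/docker';

const dockerHost = new DockerHost();

// Assumed helper: pull an image from its registry.
const image = await DockerImage.createFromRegistry(dockerHost, {
  imageUrl: 'nginx',
  imageTag: 'latest',
});

// Assumed helper: deploy the pulled image as a service.
await DockerService.createService(dockerHost, {
  name: 'my-nginx',
  image,
  labels: {},
  networks: [],
  networkAlias: 'my-nginx',
  secrets: [],
  ports: ['80:80'],
});
```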
### Managing Secrets
Managing secrets and sensitive data is crucial in any configuration and automation tool. Spark's integration with Docker allows you to handle secrets securely.
```typescript
import { Spark, SparkUpdateManager } from '@serve.zone/spark';
import { DockerSecret } from '@apiclient.xyz/docker';

const sparkInstance = new Spark();
const updateManager = new SparkUpdateManager(sparkInstance);

// Hypothetical call shape (including the dockerHost accessor); verify against @apiclient.xyz/docker.
await DockerSecret.createSecret(updateManager.dockerHost, { name: 'dbPassword', contentArg: 'topSecret', version: '1.0.0', labels: {} });
```
This example shows how to create a Docker secret using Spark's `SparkUpdateManager` class, ensuring that sensitive information is securely stored and managed.
### License and Legal Information

This repository contains open-source code that is licensed under the MIT License. A copy of the MIT License can be found in the [license](license) file within this repository.
**Please note:** The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.
### Trademarks
This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH and are not included within the scope of the MIT license granted herein. Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines, and any usage must be approved in writing by Task Venture Capital GmbH.
### Company Information
Task Venture Capital GmbH
Registered at the District Court Bremen, HRB 35230 HB, Germany
For any legal inquiries or if you require further information, please contact us via email at hello@task.vc.
By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.