# @serve.zone/spark

A tool to maintain and configure servers on the base OS level for the Servezone infrastructure.

## Install

To install `@serve.zone/spark`, run the following command in your terminal:

```sh
npm install @serve.zone/spark --save
```

## Usage

### Getting Started

To use `@serve.zone/spark` in your project, include and initialize it in your TypeScript code. Ensure you have TypeScript and the necessary build tools set up in your project.

First, import `@serve.zone/spark`:

```typescript
import { Spark } from '@serve.zone/spark';
```

### Initializing Spark

Create an instance of the `Spark` class to start using Spark. This instance is the main entry point for interacting with Spark's functionality.

```typescript
const sparkInstance = new Spark();
```

### Running Spark as a Daemon

To run Spark as a daemon, which is useful for maintaining and configuring servers on the base OS level, use the CLI bundled with Spark. This is ideally done from a command-line terminal, but it can also be automated from your Node.js scripts if required (see the sketch below).

```shell
spark installdaemon
```

The command above sets up Spark as a system service, enabling it to run and maintain server configurations automatically.

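If you want to trigger the installation from a Node.js script rather than a terminal, a minimal sketch using `SmartShell` from `@push.rocks/smartshell` (the same helper used in the real-world scenario further below) could look like this. The shell setup is an assumption and may need adjusting for your environment:

```typescript
import { SmartShell } from '@push.rocks/smartshell';

// Assumption: the Spark CLI is on the PATH and a bash shell is available,
// as in the update-and-restart example later in this README.
const shell = new SmartShell({ executor: 'bash' });

const installSparkDaemon = async () => {
  // Runs the same CLI command shown above.
  await shell.exec('spark installdaemon');
};

installSparkDaemon();
```
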
### Updating Spark or Maintained Services

Spark can self-update and manage updates for its maintained services. Trigger an update check and apply available updates by calling the `updateServices` method on the Spark instance.

```typescript
await sparkInstance.sparkUpdateManager.updateServices();
```

### Managing Configuration and Logging

Spark allows for extensive configuration and logging customization. Use the `SparkLocalConfig` and logging features to tailor Spark's operation to your needs.

```typescript
import { logger } from '@serve.zone/spark';

// Accessing the local configuration
const localConfig = sparkInstance.sparkLocalConfig;

// Utilizing the logger for custom log messages
logger.log('info', 'Custom log message');
```

### Advanced Usage

`@serve.zone/spark` offers a suite of tools for detailed server and service management, including task scheduling, daemon management, and service updates. Explore `SparkTaskManager` for scheduling specific tasks, `SparkUpdateManager` for handling service updates, and `SparkLocalConfig` for configuration; the sketch below shows where these live on a `Spark` instance.

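This minimal sketch assumes the managers are exposed as properties on a `Spark` instance, matching how they are used in the examples throughout this README:

```typescript
import { Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();

// Assumed property names, taken from their usage elsewhere in this README:
const taskManager = sparkInstance.sparkTaskManager;     // schedule recurring tasks
const updateManager = sparkInstance.sparkUpdateManager; // update services and Docker images
const localConfig = sparkInstance.sparkLocalConfig;     // read and write persistent local configuration
```
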
### Example: Scheduling Custom Tasks

```typescript
import { Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();

const myTask = {
  name: 'customTask',
  taskFunction: async () => {
    console.log('Running custom task');
  },
};

sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(myTask, '* * * * * *');
```

The example above creates a simple task that logs a message every second, demonstrating how to use Spark's task manager for custom scheduled tasks. The schedule string is a six-field cron expression whose first field denotes seconds; a less frequent variation is sketched below.

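For less aggressive schedules, reuse the same call with a different cron string. This is a small, hypothetical variation of the example above, assuming the six-field format (with a leading seconds field) implied by the "every second" example:

```typescript
// Run the same task once per day at midnight instead of every second.
sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(myTask, '0 0 0 * * *');
```
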
### Detailed Service Management

For advanced configurations, including Docker and service management:

- Use `SparkUpdateManager` to handle Docker image updates, service creation, and management.
- Access and modify Docker and service configurations through Spark's integration with configuration files and environment variables.

```typescript
// Managing Docker services with Spark
await sparkInstance.sparkUpdateManager.dockerHost.someDockerMethod();

// Example: creating a Docker service (definition details omitted)
const newServiceDefinition = {...};
await sparkInstance.sparkUpdateManager.createService(newServiceDefinition);
```

### CLI Commands

Spark provides several CLI commands for interacting with and managing system services:

#### Installing Spark as a Daemon

```shell
spark installdaemon
```

Sets up Spark as a system service to maintain server configurations automatically.

#### Updating the Daemon

```shell
spark updatedaemon
```

Updates the daemon service if a new version is available.

#### Running Spark as Daemon

```shell
spark asdaemon
```

Runs Spark in daemon mode, which is suitable for executing automated tasks.

#### Viewing Logs

```shell
spark logs
```

Shows the logs of the Spark daemon service.

#### Cleaning Up Services

```shell
spark prune
```

Stops and cleans up all Docker services (stacks, networks, secrets, etc.) and prunes the Docker system.

### Programmatic Daemon Management

You can also manage the daemon programmatically, as shown in the following examples:

```typescript
import { SmartDaemon } from '@push.rocks/smartdaemon';
import { Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();
const smartDaemon = new SmartDaemon();

const startDaemon = async () => {
  const sparkService = await smartDaemon.addService({
    name: 'spark',
    version: sparkInstance.sparkInfo.projectInfo.version,
    command: 'spark asdaemon',
    description: 'Spark daemon service',
    workingDir: '/path/to/project',
  });
  await sparkService.save();
  await sparkService.enable();
  await sparkService.start();
};

const updateDaemon = async () => {
  const sparkService = await smartDaemon.addService({
    name: 'spark',
    version: sparkInstance.sparkInfo.projectInfo.version,
    command: 'spark asdaemon',
    description: 'Spark daemon service',
    workingDir: '/path/to/project',
  });
  await sparkService.reload();
};

await startDaemon();
await updateDaemon();
```

This illustrates how to initiate and update the Spark daemon using the `SmartDaemon` class from `@push.rocks/smartdaemon`.

### Configuration Management

Extensive configuration management is possible through `SparkLocalConfig` and other configuration classes. This lets you adapt Spark's behavior to different environments and requirements.

```typescript
// Example: setting and reading local config values
import { Spark, SparkLocalConfig } from '@serve.zone/spark';

const sparkInstance = new Spark();
const localConfig = new SparkLocalConfig(sparkInstance);

await localConfig.kvStore.set('someKey', 'someValue');

// Retrieving a value from local config
const someConfigValue = await localConfig.kvStore.get('someKey');

console.log(someConfigValue); // Outputs: someValue
```

### Detailed Log Management

Logging is a crucial aspect of any automation tool, and `@serve.zone/spark` offers rich logging functionality through its built-in logging library.

```typescript
import { logger, Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();

logger.log('info', 'Spark instance created.');

// Using the logger at various levels of severity
logger.log('debug', 'This is a debug message');
logger.log('warn', 'This is a warning message');
logger.log('error', 'This is an error message');
logger.log('ok', 'This is a success message');
```

### Real-World Scenarios

#### Automated System Update and Restart

In real-world scenarios, you might want to automate system updates and reboots to ensure your services are running the latest security patches and features.

```typescript
import { Spark } from '@serve.zone/spark';
import { SmartShell } from '@push.rocks/smartshell';

const sparkInstance = new Spark();
const shell = new SmartShell({ executor: 'bash' });

const updateAndRestart = async () => {
  await shell.exec('apt-get update && apt-get upgrade -y');
  console.log('System updated.');
  await shell.exec('reboot');
};

sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(
  { name: 'updateAndRestart', taskFunction: updateAndRestart },
  '0 0 3 * * 0' // Every Sunday at 3:00 AM (six-field cron with a leading seconds field, as above)
);
```

This example creates and schedules a task that updates and restarts the server every Sunday at 3 AM using Spark's task management capabilities.

#### Integrating with Docker for Service Deployment

Spark's tight integration with Docker makes it an excellent tool for deploying containerized applications across your infrastructure.

```typescript
import { Spark } from '@serve.zone/spark';
import { DockerHost } from '@apiclient.xyz/docker';

const sparkInstance = new Spark();
const dockerHost = new DockerHost({});

const deployService = async () => {
  const image = await dockerHost.pullImage('my-docker-repo/my-service:latest');
  const newService = await dockerHost.createService({
    name: 'my-service',
    image,
    ports: ['80:8080'],
    environmentVariables: {
      NODE_ENV: 'production',
    },
  });
  console.log(`Service ${newService.name} deployed.`);
};

deployService();
```

This example demonstrates how to pull a Docker image and deploy it as a new service in your infrastructure using Spark's Docker integration.

### Managing Secrets

Managing secrets and sensitive data is crucial in any configuration and automation tool. Spark's integration with Docker allows you to handle secrets securely.

```typescript
import { Spark, SparkUpdateManager } from '@serve.zone/spark';
import { DockerSecret } from '@apiclient.xyz/docker';

const sparkInstance = new Spark();
const updateManager = new SparkUpdateManager(sparkInstance);

const createDockerSecret = async () => {
  const secret = await DockerSecret.createSecret(updateManager.dockerHost, {
    name: 'dbPassword',
    contentArg: 'superSecretPassword',
  });
  console.log(`Secret ${secret.Spec.Name} created.`);
};

createDockerSecret();
```

This example shows how to create a Docker secret using Spark's `SparkUpdateManager` class, ensuring that sensitive information is securely stored and managed.

### Conclusion

`@serve.zone/spark` is a comprehensive toolkit for orchestrating and managing server environments and Docker-based services. By leveraging its CLI and programmatic interfaces, you can automate and streamline server operations, configuration, updates, and task scheduling, keeping your infrastructure responsive, up to date, and efficiently maintained.