fix(core): update

Philipp Kunz 2024-06-13 15:19:59 +02:00
parent 8a8e901205
commit 23661f60e5
4 changed files with 208 additions and 10 deletions

View File

@@ -5,7 +5,7 @@
 "githost": "gitlab.com",
 "gitscope": "losslessone/services/initzone",
 "gitrepo": "spark",
-"description": "A tool to maintain and configure servers on the base OS level for the Servezone infrastructure.",
+"description": "A comprehensive tool for maintaining and configuring servers, integrating with Docker and supporting advanced task scheduling, targeted at the Servezone infrastructure.",
 "npmPackagename": "@losslessone_private/spark",
 "license": "MIT",
 "projectDomain": "https://lossless.one",
@@ -20,7 +20,11 @@
 "continuous deployment",
 "deployment automation",
 "service orchestration",
-"node.js"
+"node.js",
+"task scheduling",
+"CLI",
+"logging",
+"server maintenance"
 ]
 }
 },

View File

@@ -2,7 +2,7 @@
 "name": "@serve.zone/spark",
 "version": "1.0.87",
 "private": false,
-"description": "A tool to maintain and configure servers on the base OS level for the Servezone infrastructure.",
+"description": "A comprehensive tool for maintaining and configuring servers, integrating with Docker and supporting advanced task scheduling, targeted at the Servezone infrastructure.",
 "main": "dist_ts/index.js",
 "typings": "dist_ts/index.d.ts",
 "author": "Task Venture Capital GmbH",
@@ -67,6 +67,10 @@
 "continuous deployment",
 "deployment automation",
 "service orchestration",
-"node.js"
+"node.js",
+"task scheduling",
+"CLI",
+"logging",
+"server maintenance"
 ]
 }

readme.md
View File

@@ -1,8 +1,9 @@
 # @serve.zone/spark
-sparks the servezone services
+A tool to maintain and configure servers on the base OS level for the Servezone infrastructure.
 ## Install
 To install `@serve.zone/spark`, run the following command in your terminal:
 ```sh
 npm install @serve.zone/spark --save
 ```
@@ -13,6 +14,7 @@ npm install @serve.zone/spark --save
 To use `@serve.zone/spark` in your project, you need to include and initiate it in your TypeScript project. Ensure you have TypeScript and the necessary build tools set up in your project.
 First, import `@serve.zone/spark`:
 ```typescript
 import { Spark } from '@serve.zone/spark';
 ```
@@ -73,7 +75,7 @@ sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(myTask, '* * * * *
 The example above creates a simple task that logs a message every second, demonstrating how to use Spark's task manager for custom scheduled tasks.
-### Advanced Configuration
+### Detailed Service Management
 For advanced configurations, including Docker and service management:
 - Use `SparkUpdateManager` to handle Docker image updates, service creation, and management.
@@ -88,6 +90,194 @@ const newServiceDefinition = {...};
 await sparkInstance.sparkUpdateManager.createService(newServiceDefinition);
 ```
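The shape of `newServiceDefinition` is left open above. As a purely illustrative sketch (the field names below are assumptions for this example, loosely mirroring the Docker deployment snippet further down in this readme, not a documented schema), it could look like this:
```typescript
// Hypothetical service definition: field names are illustrative assumptions,
// not a documented SparkUpdateManager schema.
import { Spark } from '@serve.zone/spark';

const sparkInstance = new Spark();

const newServiceDefinition = {
  name: 'my-service',
  image: 'my-docker-repo/my-service:latest',
  ports: ['80:8080'],
  environmentVariables: {
    NODE_ENV: 'production',
  },
};

await sparkInstance.sparkUpdateManager.createService(newServiceDefinition);
```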
### CLI Commands
Spark provides several CLI commands for interacting with and managing system services:
#### Installing Spark as a Daemon
```shell
spark installdaemon
```
Sets up Spark as a system service to maintain server configurations automatically.
#### Updating the Daemon
```shell
spark updatedaemon
```
Updates the daemon service if a new version is available.
#### Running Spark as Daemon
```shell
spark asdaemon
```
Runs Spark in daemon mode, which is suitable for executing automated tasks.
#### Viewing Logs
```shell
spark logs
```
Displays the logs of the Spark daemon service.
#### Cleaning Up Services
```shell
spark prune
```
Stops and cleans up all Docker services (stacks, networks, secrets, etc.) and prunes the Docker system.
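As a rough orientation only (this is not Spark's actual implementation), a comparable manual cleanup could be scripted with the Docker CLI via `@push.rocks/smartshell`, which is also used later in this readme:
```typescript
// Illustrative approximation of the kind of cleanup `spark prune` performs;
// not the command's actual implementation.
import { SmartShell } from '@push.rocks/smartshell';

const shell = new SmartShell({ executor: 'bash' });

const pruneManually = async () => {
  // remove all Docker stacks, then prune unused containers, networks, images and volumes
  await shell.exec('docker stack ls --format "{{.Name}}" | xargs -r -n1 docker stack rm');
  await shell.exec('docker system prune -af --volumes');
};

pruneManually();
```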
### Programmatic Daemon Management
You can also manage the daemon programmatically as shown in the following examples:
```typescript
import { SmartDaemon } from '@push.rocks/smartdaemon';
import { Spark } from '@serve.zone/spark';
const sparkInstance = new Spark();
const smartDaemon = new SmartDaemon();
const startDaemon = async () => {
const sparkService = await smartDaemon.addService({
name: 'spark',
version: sparkInstance.sparkInfo.projectInfo.version,
command: 'spark asdaemon',
description: 'Spark daemon service',
workingDir: '/path/to/project',
});
await sparkService.save();
await sparkService.enable();
await sparkService.start();
};
const updateDaemon = async () => {
const sparkService = await smartDaemon.addService({
name: 'spark',
version: sparkInstance.sparkInfo.projectInfo.version,
command: 'spark asdaemon',
description: 'Spark daemon service',
workingDir: '/path/to/project',
});
await sparkService.reload();
};
startDaemon();
updateDaemon();
```
This illustrates how to start and update the Spark daemon using the `SmartDaemon` class from `@push.rocks/smartdaemon`.
### Configuration Management
Extensive configuration management is available through `SparkLocalConfig` and other configuration classes, letting you adapt the application's behavior to different environments and requirements.
```typescript
// Example of setting a local config value
import { Spark, SparkLocalConfig } from '@serve.zone/spark';

const sparkInstance = new Spark();
const localConfig = new SparkLocalConfig(sparkInstance);
await localConfig.kvStore.set('someKey', 'someValue');
// Retrieving a value from local config
const someConfigValue = await localConfig.kvStore.get('someKey');
console.log(someConfigValue); // Outputs: someValue
```
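One way this can be combined with the task manager shown earlier is to gate a scheduled task on a stored value. The sketch below is illustrative: the key `maintenanceEnabled` and the task itself are made-up examples, while the `kvStore` and `addAndScheduleTask` calls follow the usage shown elsewhere in this readme.
```typescript
// Illustrative only: skip a scheduled task unless a local config flag is set.
import { Spark, SparkLocalConfig } from '@serve.zone/spark';

const sparkInstance = new Spark();
const localConfig = new SparkLocalConfig(sparkInstance);

const conditionalMaintenance = {
  name: 'conditionalMaintenance',
  taskFunction: async () => {
    const enabled = await localConfig.kvStore.get('maintenanceEnabled'); // hypothetical key
    if (enabled !== 'true') {
      console.log('Maintenance disabled via local config, skipping run.');
      return;
    }
    console.log('Running maintenance...');
  },
};

// schedule hourly, using the same task manager API shown earlier
sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(conditionalMaintenance, '0 * * * *');
```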
### Detailed Log Management
Logging is a crucial aspect of any automation tool, and `@serve.zone/spark` offers rich logging functionality through its built-in logging library.
```typescript
import { logger, Spark } from '@serve.zone/spark';
const sparkInstance = new Spark();
logger.log('info', 'Spark instance created.');
// Using logger in various levels of severity
logger.log('debug', 'This is a debug message');
logger.log('warn', 'This is a warning message');
logger.log('error', 'This is an error message');
logger.log('ok', 'This is a success message');
```
### Real-World Scenarios
#### Automated System Update and Restart
In real-world scenarios, you might want to automate system updates and reboots so that your services always run with the latest security patches and features.
```typescript
import { Spark } from '@serve.zone/spark';
import { SmartShell } from '@push.rocks/smartshell';
const sparkInstance = new Spark();
const shell = new SmartShell({ executor: 'bash' });
const updateAndRestart = async () => {
await shell.exec('apt-get update && apt-get upgrade -y');
console.log('System updated.');
await shell.exec('reboot');
};
sparkInstance.sparkTaskManager.taskmanager.addAndScheduleTask(
{ name: 'updateAndRestart', taskFunction: updateAndRestart },
'0 3 * * 7' // Every Sunday at 3 AM
);
```
This example demonstrates creating and scheduling a task to update and restart the server every Sunday at 3 AM using Spark's task management capabilities.
#### Integrating with Docker for Service Deployment
Spark's tight integration with Docker makes it an excellent tool for deploying containerized applications across your infrastructure.
```typescript
import { Spark } from '@serve.zone/spark';
import { DockerHost } from '@apiclient.xyz/docker';
const sparkInstance = new Spark();
const dockerHost = new DockerHost({});
const deployService = async () => {
const image = await dockerHost.pullImage('my-docker-repo/my-service:latest');
const newService = await dockerHost.createService({
name: 'my-service',
image,
ports: ['80:8080'],
environmentVariables: {
NODE_ENV: 'production',
},
});
console.log(`Service ${newService.name} deployed.`);
};
deployService();
```
This example demonstrates how to pull a Docker image and deploy it as a new service in your infrastructure using Spark's Docker integration.
### Managing Secrets
Managing secrets and sensitive data is crucial in any configuration and automation tool. Spark's integration with Docker allows you to handle secrets securely.
```typescript
import { Spark, SparkUpdateManager } from '@serve.zone/spark';
import { DockerSecret } from '@apiclient.xyz/docker';
const sparkInstance = new Spark();
const updateManager = new SparkUpdateManager(sparkInstance);
const createDockerSecret = async () => {
const secret = await DockerSecret.createSecret(updateManager.dockerHost, {
name: 'dbPassword',
contentArg: 'superSecretPassword',
});
console.log(`Secret ${secret.Spec.Name} created.`);
};
createDockerSecret();
```
This example shows how to create a Docker secret using Spark's `SparkUpdateManager` class, ensuring that sensitive information is securely stored and managed.
 ### Conclusion
-`@serve.zone/spark` provides a comprehensive toolkit for orchestrating and managing server environments and Docker-based services. By leveraging its CLI and programmatic interfaces, you can automate and streamline server operations, configurations, updates, and task scheduling, ensuring your infrastructure is responsive, updated, and maintained efficiently.
+`@serve.zone/spark` is a comprehensive toolkit for orchestrating and managing server environments and Docker-based services. By leveraging its CLI and programmatic interfaces, you can automate and streamline server operations, configurations, updates, and task scheduling, ensuring your infrastructure is responsive, updated, and maintained efficiently.

View File

@@ -3,6 +3,6 @@
 */
 export const commitinfo = {
 name: '@serve.zone/spark',
-version: '1.0.87',
-description: 'A tool to maintain and configure servers on the base OS level for the Servezone infrastructure.'
+version: '1.0.88',
+description: 'A comprehensive tool for maintaining and configuring servers, integrating with Docker and supporting advanced task scheduling, targeted at the Servezone infrastructure.'
 }