# @serve.zone/spark
Spark is a Deno-powered server management agent for serve.zone hosts. It installs as a system daemon, activates Docker Swarm, schedules host/service maintenance tasks, and provides the bootstrap profiles currently used by Cloudly and Coreflow node deployments.
## Issue Reporting and Security
For reporting bugs, issues, or security vulnerabilities, please visit [community.foss.global](https://community.foss.global/), the central community hub for all issue reporting. Developers who sign our contribution agreement, comply with its terms, and complete identity verification can also get a [code.foss.global](https://code.foss.global/) account to submit pull requests directly.
## Current Role
Spark is intentionally small and operational. It is not a general-purpose configuration management framework; it is the serve.zone node-side utility that knows how to run itself as a daemon and keep selected Docker services up to date.
The current implementation does four main things:
- Installs and updates a `smartdaemon_spark` systemd service through `@push.rocks/smartdaemon`.
- Runs in an explicit mode: `cloudly` or `coreflow-node`.
- Activates Docker Swarm through `@apiclient.xyz/docker` when daemon mode starts.
- Schedules recurring tasks with `@push.rocks/taskbuffer` for Spark updates, host package updates, and managed Docker service updates.
## Installation
Install a released binary:
```bash
curl -sSL https://code.foss.global/serve.zone/spark/raw/branch/main/install.sh | sudo bash
```
Install through the npm package wrapper with pnpm:
```bash
pnpm add --global @serve.zone/spark
```
The wrapper downloads the matching release binary for the current OS/architecture. Release builds currently target Linux x64/ARM64, macOS x64/ARM64, and Windows x64.
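
That platform check can be sketched as a small selection function. Note that the `spark-<os>-<arch>` asset-naming scheme below is hypothetical, used only for illustration; check the actual release assets for the real file names:

```typescript
// Illustrative sketch: map the current platform to a release asset name.
// The supported-platform matrix mirrors the release targets listed above;
// the asset-name format is an assumption, not the wrapper's real scheme.
function releaseAssetFor(os: string, arch: string): string {
  const supported: Record<string, string[]> = {
    linux: ["x64", "arm64"],
    darwin: ["x64", "arm64"],
    windows: ["x64"],
  };
  if (!supported[os]?.includes(arch)) {
    throw new Error(`No Spark release build for ${os}/${arch}`);
  }
  // Windows binaries carry an .exe suffix; other platforms do not.
  return `spark-${os}-${arch}${os === "windows" ? ".exe" : ""}`;
}
```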
## Requirements
- Linux with systemd for daemon operation.
- Docker for service and Swarm management.
- Root privileges for daemon installation, Docker maintenance, package updates, and `prune`.
- Deno only when running from source.
macOS and Windows binaries are built for CLI/library availability, but the operational daemon paths are Linux/systemd oriented.
## Quick Start
Install Spark as a daemon:
```bash
sudo spark installdaemon
```
Run daemon mode with a profile:
```bash
sudo spark asdaemon --mode cloudly
```
or:
```bash
sudo spark asdaemon --mode coreflow-node
```
The selected mode is persisted in a user-home `npmextra` key/value store under the `servezone_spark` identity. Later `spark asdaemon` calls can reuse the stored mode when no `--mode` flag is provided.
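
The resolution order (explicit flag first, stored value as fallback) can be sketched as follows. This is a simplified stand-in for illustration, not the real npmextra-backed store or its API:

```typescript
// Simplified sketch of mode resolution. The real implementation persists the
// mode in an npmextra key/value store under the "servezone_spark" identity;
// a Map stands in for that store here.
type SparkMode = "cloudly" | "coreflow-node";

function resolveMode(
  cliFlag: string | undefined,
  store: Map<string, string>, // stand-in for the persisted key/value store
): SparkMode {
  // An explicit --mode flag wins; otherwise fall back to the stored mode.
  const candidate = cliFlag ?? store.get("mode");
  if (candidate !== "cloudly" && candidate !== "coreflow-node") {
    throw new Error("No valid mode: pass --mode cloudly|coreflow-node");
  }
  // Persist so later flag-less `spark asdaemon` invocations reuse it.
  store.set("mode", candidate);
  return candidate;
}
```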
Follow daemon logs:
```bash
sudo spark logs
```
## CLI Reference
```bash
spark <command> [options]
```
| Command | Purpose |
| --- | --- |
| `installdaemon` | Create, enable, and start the Spark daemon service. |
| `updatedaemon` | Reload the daemon service definition for the current Spark version. |
| `asdaemon --mode cloudly` | Run the daemon loop with the Cloudly profile. |
| `asdaemon --mode coreflow-node` | Run the daemon loop with the Coreflow node profile. |
| `logs` | Follow `journalctl -u smartdaemon_spark -f`. |
| `prune` | Stop Spark, remove Docker stacks/services/secrets, remove selected networks, prune Docker, restart Docker, and restart Spark. |
`prune` is destructive. Use it only on nodes where Spark owns the Docker runtime state or where losing all stacks, services, and secrets is intended.
## Daemon Behavior
`Spark.daemonStart()` starts two subsystems:
- `SparkServicesManager.start()` activates Docker Swarm.
- `SparkTaskManager.start()` schedules recurring maintenance tasks.
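
A minimal sketch of that two-step startup, with Swarm activation completing before task scheduling begins (the internal names here are stand-ins, not the actual implementation):

```typescript
// Sketch only: daemonStart() awaits the services manager (Swarm activation)
// before the task manager (recurring maintenance), so maintenance tasks do
// not begin scheduling until Swarm is active.
type Subsystem = { start(): Promise<void> };

class SparkDaemonSketch {
  readonly startOrder: string[] = [];
  private servicesManager: Subsystem = {
    start: async () => void this.startOrder.push("services"),
  };
  private taskManager: Subsystem = {
    start: async () => void this.startOrder.push("tasks"),
  };

  async daemonStart(): Promise<void> {
    await this.servicesManager.start(); // activate Docker Swarm
    await this.taskManager.start();     // then schedule maintenance tasks
  }
}
```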
Scheduled tasks:
| Task | Schedule | Action |
| --- | --- | --- |
| `updateServices` | Every 2 minutes at second 30 | Checks managed Docker services and recreates them when images change. |
| `updateSpark` | Every minute | Checks for a newer Spark release and reloads the daemon after upgrade. |
| `updateHost` | Daily at midnight | Runs apt update/upgrade/autoremove/autoclean. |
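
Assuming cron-like semantics, the firing conditions above can be restated as simple time predicates. This is illustrative only: the actual scheduling is done by `@push.rocks/taskbuffer`, and the exact second/minute alignment (even minutes for `updateServices`, second 0 for the others) is an assumption:

```typescript
// Illustrative predicates restating the schedule table. Alignment details
// (even minutes, second 0) are assumptions, not taskbuffer's actual config.
function shouldRunUpdateServices(d: Date): boolean {
  // "Every 2 minutes at second 30" — even minutes assumed.
  return d.getSeconds() === 30 && d.getMinutes() % 2 === 0;
}

function shouldRunUpdateSpark(d: Date): boolean {
  // "Every minute" — second 0 assumed.
  return d.getSeconds() === 0;
}

function shouldRunUpdateHost(d: Date): boolean {
  // "Daily at midnight".
  return d.getHours() === 0 && d.getMinutes() === 0 && d.getSeconds() === 0;
}
```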
The managed service list is populated by the selected mode before daemon startup. Service updates use Docker images, Docker secrets, and published port mappings.
## Programmatic Usage
Spark exports the main `Spark` class from `mod.ts`:
```typescript
import { Spark } from './mod.ts';
const spark = new Spark();
await spark.daemonStart();
```
The public instance exposes:
| Property | Purpose |
| --- | --- |
| `smartdaemon` | systemd service integration. |
| `sparkConfig` | persisted mode/config key-value store. |
| `sparkTaskManager` | taskbuffer scheduler and built-in maintenance tasks. |
| `sparkUpdateManager` | Docker Swarm activation and managed service update logic. |
| `sparkInfo` | package metadata lookup. |
## Development
Run from source:
```bash
deno task dev
```
Quality and build tasks:
```bash
deno task check
deno task lint
deno task fmt
deno task test
deno task compile
```
The package scripts intentionally defer to Deno tasks; `pnpm build` only reports that no Node build is needed.
Source map:
| Path | Purpose |
| --- | --- |
| `mod.ts` | CLI entry point and library export. |
| `ts/spark.cli.ts` | Command routing. |
| `ts/spark.classes.spark.ts` | Main class wiring. |
| `ts/spark.classes.updatemanager.ts` | Docker Swarm and service update manager. |
| `ts/spark.classes.taskmanager.ts` | Scheduled maintenance tasks. |
| `ts/spark.classes.config.ts` | Persisted mode/config store. |
| `scripts/compile-all.sh` | Multi-platform Deno compilation. |
## License and Legal Information
This repository contains open-source code licensed under the MIT License. A copy of the license can be found in the [license](./license) file.
**Please note:** The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.
### Trademarks
This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH or third parties, and are not included within the scope of the MIT license granted herein.

Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines or the guidelines of the respective third-party owners, and any usage must be approved in writing. Third-party trademarks used herein are the property of their respective owners and used only in a descriptive manner, e.g. for an implementation of an API or similar.
### Company Information
Task Venture Capital GmbH\
Registered at District Court Bremen HRB 35230 HB, Germany
For any legal inquiries or further information, please contact us via email at hello@task.vc.
By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.