docs: refresh readme and legal info
2026-05-07 20:22:13 +00:00
parent 9244ff8610
commit 82dd073e50
+94 -483
# @serve.zone/spark
Spark is a Deno-powered server management agent for serve.zone hosts. It installs as a system daemon, activates Docker Swarm, schedules host/service maintenance tasks, and provides the bootstrap profiles currently used by Cloudly and Coreflow node deployments.
## Issue Reporting and Security
For reporting bugs, issues, or security vulnerabilities, please visit [community.foss.global/](https://community.foss.global/). This is the central community hub for all issue reporting. Developers who sign and comply with our contribution agreement and go through identification can also get a [code.foss.global/](https://code.foss.global/) account to submit Pull Requests directly.
## Current Role
Spark is intentionally small and operational. It is not a general-purpose configuration management framework; it is the serve.zone node-side utility that knows how to run itself as a daemon and keep selected Docker services moving.
The current implementation does four main things:
- Installs and updates a `smartdaemon_spark` systemd service through `@push.rocks/smartdaemon`.
- Runs in an explicit mode: `cloudly` or `coreflow-node`.
- Activates Docker Swarm through `@apiclient.xyz/docker` when daemon mode starts.
- Schedules recurring tasks with `@push.rocks/taskbuffer` for Spark updates, host package updates, and managed Docker service updates.
## Installation
Install a released binary:
```bash
curl -sSL https://code.foss.global/serve.zone/spark/raw/branch/main/install.sh | sudo bash
```
Install through the npm package wrapper with pnpm:
```bash
pnpm add --global @serve.zone/spark
```
The wrapper downloads the matching release binary for the current OS/architecture. Release builds currently target Linux x64/ARM64, macOS x64/ARM64, and Windows x64.
## Requirements
- Linux with systemd for daemon operation.
- Docker for service and Swarm management.
- Root privileges for daemon installation, Docker maintenance, package updates, and `prune`.
- Deno only when running from source.
macOS and Windows binaries are built for CLI/library availability, but the operational daemon paths are Linux/systemd oriented.
## Quick Start
Install Spark as a daemon:
```bash
sudo spark installdaemon
```
Run daemon mode with a profile:
```bash
sudo spark asdaemon --mode cloudly
```
or:
```bash
sudo spark asdaemon --mode coreflow-node
```
The selected mode is persisted in a user-home `npmextra` key/value store under the `servezone_spark` identity. Later `spark asdaemon` calls can reuse the stored mode when no `--mode` flag is provided.
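A minimal sketch for reading or presetting that stored mode from code (assumes a repository checkout so `./mod.ts` resolves):

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// kvStore is the npmextra-backed key/value store mentioned above.
const mode = await spark.sparkConfig.kvStore.readKey('mode');
console.log(`Persisted mode: ${mode ?? 'not set'}`);

// Writing a mode has the same effect as passing --mode once.
await spark.sparkConfig.kvStore.writeKey('mode', 'cloudly');
```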
Follow daemon logs:
```bash
sudo spark logs
```
## CLI Reference
```bash
spark <command> [options]
```
| Command | Purpose |
| --- | --- |
| `installdaemon` | Create, enable, and start the Spark daemon service. |
| `updatedaemon` | Reload the daemon service definition for the current Spark version. |
| `asdaemon --mode cloudly` | Run the daemon loop with the Cloudly profile. |
| `asdaemon --mode coreflow-node` | Run the daemon loop with the Coreflow node profile. |
| `logs` | Follow `journalctl -u smartdaemon_spark -f`. |
| `prune` | Stop Spark, remove Docker stacks/services/secrets, remove selected networks, prune Docker, restart Docker, and restart Spark. |
`prune` is destructive. Use it only on nodes where Spark owns the Docker runtime state or where losing all stacks, services, and secrets is intended.
## Daemon Behavior
`Spark.daemonStart()` starts two subsystems:
- `SparkServicesManager.start()` activates Docker Swarm.
- `SparkTaskManager.start()` schedules recurring maintenance tasks.
Scheduled tasks:
| Task | Schedule | Action |
| --- | --- | --- |
| `updateServices` | Every 2 minutes at second 30 | Checks managed Docker services and recreates them when images change. |
| `updateSpark` | Every minute | Checks for a newer Spark release and reloads the daemon after upgrade. |
| `updateHost` | Daily at midnight | Runs apt update/upgrade/autoremove/autoclean. |
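Custom tasks can be scheduled next to the built-in ones through the underlying taskbuffer instance. A minimal sketch (the task name, body, and cron expression are illustrative):

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Illustrative custom task; replace the body with real work.
const backupTask = {
  name: 'daily-backup',
  taskFunction: async () => {
    console.log('Running backup...');
  },
};

// Cron expression: every day at 2 AM.
spark.sparkTaskManager.taskmanager.addAndScheduleTask(backupTask, '0 2 * * *');
```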
The managed service list is populated by the selected mode before daemon startup. Service updates use Docker images, Docker secrets, and published port mappings.
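A managed service entry looks roughly like this (the image, port, and secret values below are placeholders):

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Placeholder service definition; Spark recreates the service
// whenever the referenced image changes.
spark.sparkUpdateManager.services.push({
  name: 'my-app',
  image: 'code.foss.global/myorg/myapp',
  url: 'myapp',
  environment: 'production',
  port: '3000',
  secretJson: {
    API_KEY: Deno.env.get('API_KEY') ?? '',
  },
});

await spark.daemonStart();
```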
## Programmatic Usage
Spark exports the main `Spark` class from `mod.ts`:
```typescript
import { Spark } from './mod.ts';
// Create a Spark instance
const spark = new Spark();
// Start the daemon programmatically
await spark.daemonStart();
```
The public instance exposes:
| Property | Purpose |
| --- | --- |
| `smartdaemon` | Systemd service integration. |
| `sparkConfig` | Persisted mode/config key-value store. |
| `sparkTaskManager` | Taskbuffer scheduler and built-in maintenance tasks. |
| `sparkUpdateManager` | Docker Swarm activation and managed service update logic. |
| `sparkInfo` | Package metadata lookup. |
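For example, one pass of the managed-service update check can be triggered by hand through `sparkUpdateManager` (a sketch; in daemon mode this runs automatically on the schedule above):

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Runs a single managed-service update check and recreation pass.
await spark.sparkUpdateManager.updateServices();
```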
## Development
Run from source:
```bash
deno task dev
```
Quality and build tasks:
```bash
deno task check
deno task lint
deno task fmt
deno task test
deno task compile
```
The package scripts intentionally defer to Deno tasks; `pnpm build` only reports that no Node build is needed.
Source layout:
| Path | Purpose |
| --- | --- |
| `mod.ts` | CLI entry point and library export. |
| `ts/spark.cli.ts` | Command routing. |
| `ts/spark.classes.spark.ts` | Main class wiring. |
| `ts/spark.classes.updatemanager.ts` | Docker Swarm and service update manager. |
| `ts/spark.classes.taskmanager.ts` | Scheduled maintenance tasks. |
| `ts/spark.classes.config.ts` | Persisted mode/config store. |
| `scripts/compile-all.sh` | Multi-platform Deno compilation. |
## License and Legal Information
This repository contains open-source code licensed under the MIT License. A copy of the license can be found in the [license](./license) file.
**Please note:** The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.
### Trademarks
This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH or third parties, and are not included within the scope of the MIT license granted herein.
Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines or the guidelines of the respective third-party owners, and any usage must be approved in writing. Third-party trademarks used herein are the property of their respective owners and used only in a descriptive manner, e.g. for an implementation of an API or similar.
### Company Information
Task Venture Capital GmbH\
Registered at District Court Bremen HRB 35230 HB, Germany
For any legal inquiries or further information, please contact us via email at hello@task.vc.
By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.