docs: Create comprehensive Deno-based readme

- Updated installation instructions for binary distribution
- Removed all npm/Node.js references
- Added detailed CLI command reference
- Included programmatic usage examples
- Added architecture diagrams
- Added troubleshooting section
- Added emojis for modern appeal
- Documented Deno development workflow
- Kept legal section up to date

# @serve.zone/spark 🔥

> **A powerful Deno-powered server management tool for the modern infrastructure**

Spark is a comprehensive tool for maintaining and configuring servers at the OS level, with deep Docker integration and advanced task scheduling capabilities. Built for the serve.zone infrastructure, Spark serves as the backbone for [@serve.zone/cloudly](https://code.foss.global/serve.zone/cloudly) cluster management, handling everything from daemon orchestration to container lifecycle management.

## ✨ Features

- 🚀 **Standalone Binary** - No runtime dependencies, just download and run
- 🐳 **Docker Integration** - Native support for Docker services, stacks, secrets, and networks
- ⚙️ **Daemon Management** - Systemd integration for reliable service operation
- 📅 **Task Scheduling** - Cron-like task scheduling for automation
- 🔄 **Auto-Updates** - Self-updating capabilities for zero-downtime deployments
- 🔐 **Secure Secrets** - Docker secrets management for sensitive data
- 📊 **Comprehensive Logging** - Built-in logging with multiple severity levels
- 🎯 **Mode Support** - Cloudly and CoreFlow node operation modes

## 🚀 Installation

### Quick Install (Recommended)

Install the latest version via our installation script:

```bash
curl -sSL https://code.foss.global/serve.zone/spark/raw/branch/master/install.sh | sudo bash
```

### Specific Version

```bash
curl -sSL https://code.foss.global/serve.zone/spark/raw/branch/master/install.sh | sudo bash -s -- --version v1.2.2
```

### Manual Installation

Download the binary for your platform from the [releases page](https://code.foss.global/serve.zone/spark/releases) and make it executable:

```bash
# Example for Linux x64
wget https://code.foss.global/serve.zone/spark/releases/download/v1.2.2/spark-linux-x64
chmod +x spark-linux-x64
sudo mv spark-linux-x64 /usr/local/bin/spark
```

### Supported Platforms

- 🐧 Linux (x86_64, ARM64)
- 🍎 macOS (Intel, Apple Silicon)
- 🪟 Windows (x86_64)

## 🎯 Quick Start

### Install as System Daemon

Set up Spark to run as a systemd service:

```bash
sudo spark installdaemon
```

This command:

- Creates a systemd service unit
- Enables automatic startup on boot
- Starts the Spark daemon immediately

### Configure Operation Mode

Spark supports different operation modes for various use cases:

```bash
# For Cloudly cluster management
sudo spark asdaemon --mode cloudly

# For CoreFlow node management
sudo spark asdaemon --mode coreflow-node
```

### View Logs

Monitor Spark daemon activity in real-time:

```bash
sudo spark logs
```

## 📖 CLI Reference

### Core Commands

#### `spark installdaemon`

Installs Spark as a system daemon service. This sets up a systemd unit that automatically starts on boot.

```bash
sudo spark installdaemon
```

#### `spark updatedaemon`

Updates the daemon service configuration to the current Spark version.

```bash
sudo spark updatedaemon
```

#### `spark asdaemon [--mode MODE]`

Runs Spark in daemon mode. Requires a mode to be specified (either via the `--mode` flag or from saved configuration).

```bash
sudo spark asdaemon --mode cloudly
```

**Available modes:**

- `cloudly` - Manages Cloudly services
- `coreflow-node` - Manages CoreFlow node services

#### `spark logs`

Displays real-time logs from the Spark daemon service.

```bash
sudo spark logs
```

#### `spark prune`

Performs a complete cleanup of Docker resources and restarts services. Use with caution!

```bash
sudo spark prune
```

This command:

1. Stops the Spark daemon
2. Removes all Docker stacks
3. Removes all Docker services
4. Removes all Docker secrets
5. Removes specified Docker networks
6. Prunes the Docker system
7. Restarts Docker
8. Restarts the Spark daemon
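
Roughly the same cleanup can be reproduced by hand with the plain Docker CLI and systemctl. The sketch below approximates that sequence using Deno's subprocess API; it is an illustration under assumptions, not Spark's internal implementation, and the systemd unit name (`smartdaemon_spark`, see Troubleshooting below) may differ on your machine.

```typescript
// Approximate manual equivalent of `spark prune` (illustrative only).
const run = async (cmd: string, args: string[]) => {
  const { code } = await new Deno.Command(cmd, { args }).output();
  if (code !== 0) console.warn(`${cmd} ${args.join(' ')} exited with ${code}`);
};

await run('systemctl', ['stop', 'smartdaemon_spark']);
await run('bash', ['-c', 'docker stack ls --format "{{.Name}}" | xargs -r docker stack rm']);
await run('bash', ['-c', 'docker service ls -q | xargs -r docker service rm']);
await run('bash', ['-c', 'docker secret ls -q | xargs -r docker secret rm']);
await run('docker', ['system', 'prune', '-f']);
await run('systemctl', ['restart', 'docker']);
await run('systemctl', ['start', 'smartdaemon_spark']);
```
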
### Advanced Usage

#### Check Version

```bash
spark --version
```

#### Get Help

```bash
spark help
```

## 🔧 Programmatic Usage

While Spark is primarily designed as a CLI tool and daemon, you can also use it as a library in your Deno projects.

### Import from Deno

```typescript
import { Spark } from 'https://code.foss.global/serve.zone/spark/raw/branch/master/mod.ts';
```

### Basic Usage

```typescript
import { Spark } from './mod.ts';

// Create a Spark instance
const spark = new Spark();

// Start the daemon programmatically
await spark.daemonStart();
```

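For finer-grained control over the systemd unit itself, the daemon can also be managed through `@push.rocks/smartdaemon`, which Spark uses internally for its systemd integration. A minimal sketch, assuming the import is mapped in your configuration; the version and working directory shown are illustrative:

```typescript
import { SmartDaemon } from '@push.rocks/smartdaemon';

const smartDaemon = new SmartDaemon();

// Register spark as a managed systemd service (field values illustrative).
const sparkService = await smartDaemon.addService({
  name: 'spark',
  version: '1.2.2', // illustrative
  command: 'spark asdaemon',
  description: 'Spark daemon service',
  workingDir: '/path/to/project', // illustrative
});

await sparkService.save();
await sparkService.enable();
await sparkService.start();
```
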
### Task Scheduling

Schedule automated tasks using the built-in task manager:

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Define a custom task
const backupTask = {
  name: 'daily-backup',
  taskFunction: async () => {
    console.log('Running backup...');
    // Your backup logic here
  },
};

// Schedule it to run daily at 2 AM
spark.sparkTaskManager.taskmanager.addAndScheduleTask(
  backupTask,
  '0 2 * * *'
);
```

### Service Management

Manage Docker services programmatically:

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Add a service to manage
spark.sparkUpdateManager.services.push({
  name: 'my-app',
  image: 'code.foss.global/myorg/myapp',
  url: 'myapp',
  environment: 'production',
  port: '3000',
  secretJson: {
    API_KEY: 'secret-value',
    DATABASE_URL: 'postgresql://...',
  },
});

// Start managing services
await spark.sparkUpdateManager.start();
```

### Configuration Management

Access and modify Spark's configuration:

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Write configuration
await spark.sparkConfig.kvStore.writeKey('mode', 'cloudly');

// Read configuration
const mode = await spark.sparkConfig.kvStore.readKey('mode');
console.log(`Current mode: ${mode}`);
```

### Logging

Use Spark's built-in logger for consistent logging:

```typescript
import { logger } from './ts/spark.logging.ts';

// Log at different levels
logger.log('info', 'Application starting...');
logger.log('ok', 'Service deployed successfully');
logger.log('warn', 'High memory usage detected');
logger.log('error', 'Failed to connect to database');
logger.log('success', 'Backup completed');
```

## 🏗️ Architecture

### Core Components

```
┌─────────────────────────────────────────┐
│             Spark Instance              │
├─────────────────────────────────────────┤
│                                         │
│   ┌─────────────────────────────────┐   │
│   │  SparkConfig                    │   │
│   │  - KV Store                     │   │
│   │  - Mode Configuration           │   │
│   └─────────────────────────────────┘   │
│                                         │
│   ┌─────────────────────────────────┐   │
│   │  SparkTaskManager               │   │
│   │  - Cron Scheduling              │   │
│   │  - Task Execution               │   │
│   └─────────────────────────────────┘   │
│                                         │
│   ┌─────────────────────────────────┐   │
│   │  SparkServicesManager           │   │
│   │  - Docker Integration           │   │
│   │  - Service Updates              │   │
│   │  - Secret Management            │   │
│   └─────────────────────────────────┘   │
│                                         │
│   ┌─────────────────────────────────┐   │
│   │  SmartDaemon                    │   │
│   │  - Systemd Integration          │   │
│   │  - Service Lifecycle            │   │
│   └─────────────────────────────────┘   │
│                                         │
└─────────────────────────────────────────┘
```

### Key Classes

- **`Spark`** - Main orchestrator class that coordinates all components
- **`SparkConfig`** - Handles configuration storage and retrieval
- **`SparkTaskManager`** - Manages scheduled tasks and automation
- **`SparkServicesManager`** - Manages Docker services and updates
- **`SparkInfo`** - Provides project and version information
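
A hypothetical sketch of how these components surface on a `Spark` instance, using the property names that appear in the examples in this readme; treat the exact mapping as illustrative:

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

console.log(spark.sparkConfig);        // SparkConfig: KV store, mode configuration
console.log(spark.sparkTaskManager);   // SparkTaskManager: cron scheduling, task execution
console.log(spark.sparkUpdateManager); // service management: Docker integration, updates
```
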
## 🔄 Update Management

Spark includes self-updating capabilities:

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Check for and apply updates
await spark.sparkUpdateManager.updateServices();
```

The update manager:

- Pulls latest Docker images
- Manages service rollouts
- Handles zero-downtime deployments
- Manages Docker secrets securely
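
For example, the update check can be put on a schedule by combining it with the task manager shown earlier; a sketch using only the APIs from this readme:

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Re-check managed services for updates every hour, on the hour
spark.sparkTaskManager.taskmanager.addAndScheduleTask(
  {
    name: 'hourly-update-check',
    taskFunction: async () => {
      await spark.sparkUpdateManager.updateServices();
    },
  },
  '0 * * * *'
);
```
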
## 🐳 Docker Integration

### Service Definition

```typescript
const serviceDefinition = {
  name: 'api-server',
  image: 'code.foss.global/myorg/api',
  url: 'api',
  environment: 'production',
  port: '8080',
  secretJson: {
    JWT_SECRET: 'your-jwt-secret',
    DB_PASSWORD: 'your-db-password',
  },
};

spark.sparkUpdateManager.services.push(serviceDefinition);
```

### Stack Management

Spark manages Docker stacks for complex multi-service deployments:

```bash
# View running stacks
docker stack ls

# Spark manages these automatically
```

## 🛠️ Development

### Prerequisites

- [Deno](https://deno.land/) v2.x or later

### Running from Source

```bash
# Clone the repository
git clone https://code.foss.global/serve.zone/spark.git
cd spark

# Run directly
deno run --allow-all mod.ts

# Run tests
deno test --allow-all test/

# Type check
deno check mod.ts

# Format code
deno fmt

# Lint
deno lint
```

### Building Binaries

Compile for all supported platforms:

```bash
bash scripts/compile-all.sh
```

Binaries will be output to `dist/binaries/`.

### Compile for Specific Platform

```bash
# Linux x64
deno compile --allow-all --output spark-linux-x64 --target x86_64-unknown-linux-gnu mod.ts

# macOS ARM64
deno compile --allow-all --output spark-macos-arm64 --target aarch64-apple-darwin mod.ts
```

## 🔐 Security

### Permissions

Spark requires the following Deno permissions:

- `--allow-net` - API communication, Docker socket access
- `--allow-read` - Configuration files, project files
- `--allow-write` - Logs, configuration updates
- `--allow-run` - systemctl, Docker commands
- `--allow-env` - Environment variables
- `--allow-sys` - System information
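
When running from source with granular flags instead of `--allow-all`, you can confirm at startup that each of these was actually granted, using Deno's standard permissions API; a minimal sketch:

```typescript
// Query each permission Spark needs and report its state.
const needed: Deno.PermissionDescriptor[] = [
  { name: 'net' },
  { name: 'read' },
  { name: 'write' },
  { name: 'run' },
  { name: 'env' },
  { name: 'sys' },
];

for (const desc of needed) {
  const status = await Deno.permissions.query(desc);
  console.log(`${desc.name}: ${status.state}`); // "granted", "prompt", or "denied"
}
```
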
### Secrets Management

Always use Docker secrets for sensitive data:

```typescript
const serviceWithSecrets = {
  name: 'secure-app',
  image: 'myapp:latest',
  secretJson: {
    API_KEY: Deno.env.get('API_KEY')!,
    DB_PASSWORD: Deno.env.get('DB_PASSWORD')!,
  },
};
```

## 🐛 Troubleshooting

### Service Won't Start

Check the daemon status:

```bash
sudo systemctl status smartdaemon_spark
```

View recent logs:

```bash
sudo journalctl -u smartdaemon_spark -n 100
```

### Docker Issues

Verify Docker is running:

```bash
sudo systemctl status docker
```

Check Docker socket permissions:

```bash
sudo ls -la /var/run/docker.sock
```

### Configuration Issues

Check the current mode:

```bash
# Run Spark programmatically (deno eval implies all permissions)
deno eval "
import { Spark } from './mod.ts';
const s = new Spark();
const mode = await s.sparkConfig.kvStore.readKey('mode');
console.log('Mode:', mode);
"
```

## 📝 Examples

### Automated System Maintenance

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Schedule weekly system updates
const updateTask = {
  name: 'system-update',
  taskFunction: async () => {
    const shell = new Deno.Command('bash', {
      args: ['-c', 'apt-get update && apt-get upgrade -y'],
    });
    await shell.output();
  },
};

// Every Sunday at 3 AM
spark.sparkTaskManager.taskmanager.addAndScheduleTask(
  updateTask,
  '0 3 * * 0'
);

await spark.daemonStart();
```

### Multi-Service Deployment

```typescript
import { Spark } from './mod.ts';

const spark = new Spark();

// Add multiple services
const services = [
  {
    name: 'frontend',
    image: 'code.foss.global/myorg/frontend',
    url: 'frontend',
    port: '80',
    environment: 'production',
  },
  {
    name: 'backend',
    image: 'code.foss.global/myorg/backend',
    url: 'backend',
    port: '3000',
    environment: 'production',
    secretJson: {
      DATABASE_URL: Deno.env.get('DATABASE_URL')!,
    },
  },
  {
    name: 'worker',
    image: 'code.foss.global/myorg/worker',
    url: 'worker',
    environment: 'production',
  },
];

services.forEach(svc => spark.sparkUpdateManager.services.push(svc));

await spark.daemonStart();
```

## 🤝 Support

For issues, questions, or contributions:

- 🐛 [Report Issues](https://code.foss.global/serve.zone/spark/issues)
- 📖 [View Source](https://code.foss.global/serve.zone/spark)

## License and Legal Information