spark/.serena/memories/spark_project_overview.md
Juergen Kunz 526b4f46dd
feat(migration): Migrate from Node.js to Deno runtime
Major migration to the Deno runtime, following the nupst project pattern:

Core Changes:
- Created deno.json configuration with tasks, imports, and settings
- Created mod.ts as main entry point with Deno permissions
- Updated all TypeScript imports from .js to .ts extensions
- Replaced Node.js APIs (process.exit) with Deno equivalents (Deno.exit)
- Updated path imports to use @std/path from JSR (see the combined sketch after this list)
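
For illustration only, a minimal sketch of the change pattern applied across the source files (the `--fail` flag and the named `TaskManager` export are assumptions; the file names come from the project overview below):

```typescript
// Before (Node.js):
//   import * as path from 'path';
//   import { TaskManager } from './spark.classes.taskmanager.js';
//   process.exit(1);

// After (Deno):
import * as path from '@std/path';                            // JSR, resolved via the deno.json import map
import { TaskManager } from './spark.classes.taskmanager.ts'; // .ts instead of .js extension

console.log(typeof TaskManager);          // named export assumed; the class lives in ts/
console.log(path.join(Deno.cwd(), 'ts')); // @std/path provides the same join() as node:path

if (Deno.args.includes('--fail')) {
  Deno.exit(1);                           // replaces process.exit(1)
}
```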

Dependencies:
- Migrated all npm dependencies to use npm: prefix in import map
- Added Deno standard library imports (@std/path, @std/assert)
- Configured import aliases for all @push.rocks and @serve.zone packages (see the example after this list)
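
Illustrative only; the exact alias entries and version pins in deno.json are assumptions. The shape is: the import map resolves bare specifiers to `npm:` packages so source files keep their familiar imports, and Deno also accepts inline `npm:` specifiers where no alias exists:

```typescript
// With a deno.json import-map entry such as
//   "@push.rocks/smartlog": "npm:@push.rocks/smartlog"
// source files keep the bare specifier they used under Node.js:
import * as smartlog from '@push.rocks/smartlog';

// The npm: prefix can also be used directly, without an alias:
import * as smartdaemon from 'npm:@push.rocks/smartdaemon';

console.log(typeof smartlog, typeof smartdaemon);
```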

Build & Distribution:
- Created install.sh for downloading pre-compiled binaries
- Created uninstall.sh for clean system removal
- Created scripts/compile-all.sh for multi-platform compilation
- Supports Linux (x64, ARM64), macOS (x64, ARM64), Windows (x64); see the compile sketch after this list
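
The repository's compile-all.sh is a shell script; purely as a sketch, the same loop could look like this in TypeScript, using the standard `deno compile` target triples for the platforms listed above (the output paths and `--allow-all` permission set are assumptions):

```typescript
// Hypothetical TypeScript rendering of what scripts/compile-all.sh does:
// cross-compile mod.ts for each supported platform with `deno compile`.
const targets = [
  'x86_64-unknown-linux-gnu',  // Linux x64
  'aarch64-unknown-linux-gnu', // Linux ARM64
  'x86_64-apple-darwin',       // macOS x64
  'aarch64-apple-darwin',      // macOS ARM64
  'x86_64-pc-windows-msvc',    // Windows x64
];

for (const target of targets) {
  const cmd = new Deno.Command('deno', {
    args: [
      'compile',
      '--allow-all',                      // permission flags assumed; the real script may be narrower
      '--target', target,
      '--output', `dist/spark-${target}`, // output naming assumed
      'mod.ts',
    ],
  });
  const { code } = await cmd.output();
  if (code !== 0) {
    console.error(`compile failed for ${target}`);
    Deno.exit(1);
  }
}
```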

Testing:
- Migrated tests to Deno test framework using @std/assert
- Created test.simple.ts for basic verification
- Updated test structure to use Deno.test instead of tap (see the example after this list)
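
A sketch of the resulting test shape (the module path, the named `Spark` export, and the no-argument constructor are assumptions; the earlier tap-style form is shown only for contrast):

```typescript
// Before (tap / tapbundle style, roughly):
//   tap.test('should create a Spark instance', async () => {
//     expect(sparkInstance).toBeInstanceOf(spark.Spark);
//   });

// After (Deno test runner with @std/assert):
import { assertInstanceOf } from '@std/assert';
import { Spark } from '../ts/spark.classes.spark.ts'; // path and export assumed

Deno.test('should create a Spark instance', () => {
  const sparkInstance = new Spark(); // constructor shape assumed
  assertInstanceOf(sparkInstance, Spark);
});
```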

CI/CD:
- Created .gitea/workflows/ci.yml for type checking, linting, and builds
- Created .gitea/workflows/release.yml for automated releases
- Created .gitea/release-template.md for release documentation

Cleanup:
- Removed package.json, pnpm-lock.yaml, tsconfig.json
- Removed Node.js CLI files (cli.js, cli.child.ts, cli.ts.js)
- Removed dist_ts/ compiled output directory
- Removed npmextra.json configuration

This migration enables standalone binary distribution without a Node.js runtime
dependency while maintaining all existing functionality.
2025-10-23 23:22:16 +00:00


# Spark Project Overview
## Project Purpose
Spark is a comprehensive tool for maintaining and configuring servers. It integrates with Docker, supports advanced task scheduling, and targets the Servezone infrastructure, where it is used by @serve.zone/cloudly as a cluster node server system manager.
## Tech Stack
- **Language**: TypeScript
- **Runtime**: Node.js (currently)
- **Package Manager**: pnpm
- **Build Tool**: @git.zone/tsbuild
- **Test Framework**: @git.zone/tstest with @push.rocks/tapbundle
- **CLI Framework**: @push.rocks/smartcli
- **Version**: 1.2.2
## Directory Structure
```
spark/
├── ts/ # TypeScript source files
├── test/ # Test files (single test.nonci.ts)
├── dist_ts/ # Compiled TypeScript output
├── cli.js # CLI entry point
├── cli.child.ts # Child process CLI
├── cli.ts.js # TypeScript CLI wrapper
└── package.json # Dependencies and scripts
```
## Key Dependencies
- **@serve.zone/api**: API client for Servezone
- **@serve.zone/interfaces**: Interface definitions
- **@apiclient.xyz/docker**: Docker API client
- **@push.rocks/\*** packages: Various utilities (smartlog, smartfile, smartcli, smartdaemon, etc.)
## Main Components
1. **CLI** (spark.cli.ts): Command-line interface with commands like installdaemon, updatedaemon, asdaemon
2. **Spark** (spark.classes.spark.ts): Main application class (see the composition sketch after this list)
3. **TaskManager** (spark.classes.taskmanager.ts): Task scheduling
4. **UpdateManager** (spark.classes.updatemanager.ts): Service updates
5. **Config** (spark.classes.config.ts): Configuration management
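A rough composition sketch, hypothetical and inferred only from the file names above, not read from the source:
```typescript
// Hypothetical outline: the real classes in ts/ will differ in detail.
class Config {}        // spark.classes.config.ts (configuration management)
class TaskManager {}   // spark.classes.taskmanager.ts (task scheduling)
class UpdateManager {} // spark.classes.updatemanager.ts (service updates)

class Spark {          // spark.classes.spark.ts (main application class)
  public config = new Config();
  public taskManager = new TaskManager();
  public updateManager = new UpdateManager();
}

// spark.cli.ts maps CLI commands (installdaemon, updatedaemon, asdaemon, ...)
// onto a shared Spark instance and its managers.
const spark = new Spark();
console.log(spark instanceof Spark); // true
```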
## Commands
- `pnpm build`: Build the TypeScript code
- `pnpm test`: Run tests
- `spark installdaemon`: Install as system daemon
- `spark updatedaemon`: Update daemon service
- `spark asdaemon`: Run as daemon
- `spark logs`: View daemon logs
- `spark prune`: Clean up resources