@push.rocks/taskbuffer 🚀
A powerful, flexible, and TypeScript-first task management library for orchestrating asynchronous operations with style. From simple task execution to complex distributed workflows with real-time progress tracking, taskbuffer has got you covered.
Install 📦
npm install @push.rocks/taskbuffer --save
Or with pnpm (recommended):
pnpm add @push.rocks/taskbuffer
Why taskbuffer? 🤔
In the modern JavaScript ecosystem, managing asynchronous tasks efficiently is crucial. Whether you're building a data pipeline, managing API rate limits, or orchestrating complex workflows, @push.rocks/taskbuffer provides the tools you need:
- 🎯 TypeScript-first: Built with TypeScript for TypeScript - enjoy complete type safety and excellent IDE support
- ⚡ Flexible execution: From simple tasks to complex parallel workflows with dependencies
- 🔄 Smart buffering: Control concurrent executions with intelligent buffer management
- ⏰ Built-in scheduling: Cron-based task scheduling without additional dependencies
- 🎭 Multiple paradigms: Support for debounced, throttled, and one-time execution patterns
- 📊 Progress tracking: Real-time step-by-step progress monitoring for UI integration
- 🔌 Extensible: Clean architecture that's easy to extend and customize
- 🏃 Zero dependencies on external schedulers: Everything you need is included
Core Concepts 🎓
Task
The fundamental unit of work. A task wraps an asynchronous function and provides powerful execution control, now with step-by-step progress tracking.
Taskchain
Sequential task execution - tasks run one after another, with results passed along the chain.
Taskparallel
Parallel task execution - multiple tasks run simultaneously for maximum performance.
TaskManager
Centralized task scheduling and management using cron expressions, with rich metadata collection.
TaskDebounced
Debounced task execution - prevents rapid repeated executions, only running after a quiet period.
TaskOnce
Singleton task execution - ensures a task runs exactly once, perfect for initialization routines.
TaskStep 🆕
Granular progress tracking - define named steps with percentage weights for real-time progress monitoring.
Quick Start 🏁
Basic Task Execution
import { Task } from '@push.rocks/taskbuffer';
// Create a simple task
const myTask = new Task({
name: 'DataProcessor',
taskFunction: async () => {
const data = await fetchData();
return processData(data);
},
});
// Execute the task
const result = await myTask.trigger();
Task with Progress Steps 🆕
Track granular progress for complex operations - perfect for UI progress bars:
const dataProcessingTask = new Task({
name: 'DataProcessor',
steps: [
{ name: 'validate', description: 'Validating input data', percentage: 15 },
{ name: 'fetch', description: 'Fetching external resources', percentage: 25 },
{ name: 'transform', description: 'Transforming data', percentage: 35 },
{ name: 'save', description: 'Saving to database', percentage: 25 }
] as const, // Use 'as const' for full type safety
taskFunction: async (inputData) => {
// TypeScript knows these step names!
dataProcessingTask.notifyStep('validate');
const validated = await validateData(inputData);
dataProcessingTask.notifyStep('fetch');
const external = await fetchExternalData();
dataProcessingTask.notifyStep('transform');
const transformed = await transformData(validated, external);
dataProcessingTask.notifyStep('save');
const result = await saveToDatabase(transformed);
return result;
}
});
// Monitor progress in real-time
const result = await dataProcessingTask.trigger();
console.log(`Final progress: ${dataProcessingTask.getProgress()}%`); // 100%
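If you want to surface progress while the task is still running, you can poll the same accessors from outside the task function. A minimal sketch using only `getProgress()` and `getStepsMetadata()` as shown above (the polling interval and logging are illustrative):

```typescript
// Start the task without awaiting it right away
const running = dataProcessingTask.trigger();

// Poll progress while the task runs; in a real UI you would feed these
// values into your progress component instead of logging them
const timer = setInterval(() => {
  console.log(`Progress: ${dataProcessingTask.getProgress()}%`);
  for (const step of dataProcessingTask.getStepsMetadata()) {
    console.log(`  ${step.name}: ${step.status}`);
  }
}, 250);

await running;
clearInterval(timer);
```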
TypeScript Generics Support 🔬
TaskBuffer leverages TypeScript's powerful generics system for complete type safety across your task chains and workflows.
Generic Task Functions
Tasks support generic type parameters for both input and output types:
import { Task, ITaskFunction } from '@push.rocks/taskbuffer';
// Define typed interfaces
interface UserData {
  id: string;
  name: string;
  email: string;
  isActive?: boolean;
}
interface ProcessedUser {
userId: string;
displayName: string;
normalized: boolean;
}
// Create strongly typed tasks
const processUserTask = new Task<ProcessedUser>({
name: 'ProcessUser',
taskFunction: async (user: UserData): Promise<ProcessedUser> => {
return {
userId: user.id,
displayName: user.name.toUpperCase(),
normalized: true
};
}
});
// Type safety enforced at compile time
const result: ProcessedUser = await processUserTask.trigger({
id: '123',
name: 'John Doe',
email: 'john@example.com'
});
Generic Setup Values
Tasks can accept setup values through generics, perfect for configuration:
interface TaskConfig {
apiEndpoint: string;
retryCount: number;
timeout: number;
}
const configuredTask = new Task<TaskConfig>({
name: 'ConfiguredTask',
taskSetup: async (): Promise<TaskConfig> => ({
apiEndpoint: 'https://api.example.com',
retryCount: 3,
timeout: 5000
}),
taskFunction: async (data: any, setupValue: TaskConfig) => {
// setupValue is fully typed!
for (let i = 0; i < setupValue.retryCount; i++) {
try {
return await fetchWithTimeout(
setupValue.apiEndpoint,
setupValue.timeout
);
} catch (error) {
if (i === setupValue.retryCount - 1) throw error;
}
}
}
});
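Triggering the configured task then looks like any other call; the setup value is resolved for you and handed to `taskFunction` as its second argument, as shown above. The payload below is a hypothetical example:

```typescript
// `searchQuery` is a hypothetical input value for this sketch
const searchQuery = { term: 'users', limit: 50 };
const response = await configuredTask.trigger(searchQuery);
console.log('Fetched with configured endpoint:', response);
```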
Type-Safe Task Chains
Chain tasks with preserved type flow:
// Each task knows its input and output types
const fetchTask = new Task<void>({
name: 'FetchUsers',
taskFunction: async (): Promise<UserData[]> => {
return await api.getUsers();
}
});
const filterTask = new Task<void>({
name: 'FilterActive',
taskFunction: async (users: UserData[]): Promise<UserData[]> => {
return users.filter(user => user.isActive);
}
});
const mapTask = new Task<void>({
name: 'MapToProcessed',
taskFunction: async (users: UserData[]): Promise<ProcessedUser[]> => {
return users.map(transformUser);
}
});
// Type safety flows through the chain
const chain = new Taskchain({
name: 'UserPipeline',
taskArray: [fetchTask, filterTask, mapTask]
});
const finalResult: ProcessedUser[] = await chain.trigger();
Progress Tracking & Metadata 📊 🆕
TaskBuffer now provides comprehensive progress tracking and metadata collection, perfect for building dashboards and monitoring systems.
Step-by-Step Progress
Define weighted steps for accurate progress calculation:
const migrationTask = new Task({
name: 'DatabaseMigration',
steps: [
{ name: 'backup', description: 'Backing up database', percentage: 20 },
{ name: 'schema', description: 'Updating schema', percentage: 30 },
{ name: 'data', description: 'Migrating data', percentage: 40 },
{ name: 'validate', description: 'Validating integrity', percentage: 10 }
] as const,
taskFunction: async () => {
migrationTask.notifyStep('backup');
await backupDatabase();
console.log(`Progress: ${migrationTask.getProgress()}%`); // ~20%
migrationTask.notifyStep('schema');
await updateSchema();
console.log(`Progress: ${migrationTask.getProgress()}%`); // ~50%
migrationTask.notifyStep('data');
await migrateData();
console.log(`Progress: ${migrationTask.getProgress()}%`); // ~90%
migrationTask.notifyStep('validate');
await validateIntegrity();
console.log(`Progress: ${migrationTask.getProgress()}%`); // 100%
}
});
// Get detailed step information
const steps = migrationTask.getStepsMetadata();
steps.forEach(step => {
console.log(`${step.name}: ${step.status} (${step.percentage}%)`);
if (step.duration) {
console.log(` Duration: ${step.duration}ms`);
}
});
Task Metadata Collection
Get comprehensive metadata about task execution:
const task = new Task({
name: 'DataProcessor',
buffered: true,
bufferMax: 5,
steps: [
{ name: 'process', description: 'Processing', percentage: 100 }
] as const,
taskFunction: async () => {
task.notifyStep('process');
await processData();
}
});
// Get complete task metadata
const metadata = task.getMetadata();
console.log({
name: metadata.name,
status: metadata.status, // 'idle' | 'running' | 'completed' | 'failed'
progress: metadata.currentProgress, // 0-100
currentStep: metadata.currentStep,
runCount: metadata.runCount,
lastRun: metadata.lastRun,
buffered: metadata.buffered,
bufferMax: metadata.bufferMax
});
TaskManager Enhanced Metadata
The TaskManager now provides rich metadata for monitoring and dashboards:
const manager = new TaskManager();
// Add tasks with step tracking
manager.addAndScheduleTask(backupTask, '0 2 * * *'); // 2 AM daily
manager.addAndScheduleTask(cleanupTask, '0 */6 * * *'); // Every 6 hours
// Get metadata for all tasks
const allTasksMetadata = manager.getAllTasksMetadata();
allTasksMetadata.forEach(task => {
console.log(`Task: ${task.name}`);
console.log(` Status: ${task.status}`);
console.log(` Progress: ${task.currentProgress}%`);
console.log(` Run count: ${task.runCount}`);
console.log(` Schedule: ${task.cronSchedule}`);
});
// Get scheduled tasks with next run times
const scheduledTasks = manager.getScheduledTasks();
scheduledTasks.forEach(task => {
console.log(`${task.name}: Next run at ${task.nextRun}`);
if (task.steps) {
console.log(` Steps: ${task.steps.length}`);
}
});
// Get upcoming executions
const nextRuns = manager.getNextScheduledRuns(10);
console.log('Next 10 scheduled executions:', nextRuns);
Execute and Track Tasks
Execute tasks with full lifecycle tracking and automatic cleanup:
const manager = new TaskManager();
const analyticsTask = new Task({
name: 'Analytics',
steps: [
{ name: 'collect', description: 'Collecting metrics', percentage: 30 },
{ name: 'analyze', description: 'Analyzing data', percentage: 50 },
{ name: 'report', description: 'Generating report', percentage: 20 }
] as const,
taskFunction: async () => {
analyticsTask.notifyStep('collect');
const metrics = await collectMetrics();
analyticsTask.notifyStep('analyze');
const analysis = await analyzeData(metrics);
analyticsTask.notifyStep('report');
return await generateReport(analysis);
}
});
// Execute with automatic cleanup and metadata collection
const report = await manager.addExecuteRemoveTask(analyticsTask, {
trackProgress: true
});
console.log('Execution Report:', {
taskName: report.taskName,
duration: report.duration,
stepsCompleted: report.stepsCompleted,
finalProgress: report.progress,
result: report.result
});
Frontend Integration Example
Perfect for building real-time progress UIs:
// WebSocket server for real-time updates
io.on('connection', (socket) => {
socket.on('startTask', async (taskId) => {
const task = new Task({
name: taskId,
steps: [
{ name: 'start', description: 'Starting...', percentage: 10 },
{ name: 'process', description: 'Processing...', percentage: 70 },
{ name: 'finish', description: 'Finishing...', percentage: 20 }
] as const,
taskFunction: async () => {
task.notifyStep('start');
socket.emit('progress', {
step: 'start',
progress: task.getProgress(),
metadata: task.getStepsMetadata()
});
task.notifyStep('process');
socket.emit('progress', {
step: 'process',
progress: task.getProgress(),
metadata: task.getStepsMetadata()
});
task.notifyStep('finish');
socket.emit('progress', {
step: 'finish',
progress: task.getProgress(),
metadata: task.getStepsMetadata()
});
}
});
await task.trigger();
socket.emit('complete', task.getMetadata());
});
});
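On the browser side, a matching socket.io client only needs to listen for the same events. A sketch assuming a standard `socket.io-client` setup; the server URL, task id, and `updateProgressBar` helper are hypothetical:

```typescript
import { io } from 'socket.io-client';

const socket = io('https://tasks.example.com'); // hypothetical server URL

socket.emit('startTask', 'nightly-import'); // hypothetical task id

socket.on('progress', ({ step, progress, metadata }) => {
  updateProgressBar(progress); // hypothetical UI helper
  console.log(`Step ${step} reported`, metadata);
});

socket.on('complete', (taskMetadata) => {
  console.log('Task finished:', taskMetadata);
});
```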
Buffer Behavior Deep Dive 🌊
The buffer system in TaskBuffer provides intelligent control over concurrent executions, preventing system overload while maximizing throughput.
How Buffering Works
When a task is buffered, TaskBuffer manages a queue of executions:
const bufferedTask = new Task({
name: 'BufferedOperation',
taskFunction: async (data) => {
console.log(`Processing: ${data}`);
await simulateWork();
return `Processed: ${data}`;
},
buffered: true,
bufferMax: 3 // Maximum 3 concurrent executions
});
// Trigger 10 executions rapidly
for (let i = 0; i < 10; i++) {
bufferedTask.trigger(`Item ${i}`);
}
// What happens:
// 1. First 3 tasks start immediately
// 2. Items 4-10 are queued
// 3. As each task completes, next queued item starts
// 4. Never more than 3 tasks running simultaneously
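Because `trigger()` returns a promise (as used throughout this README), one way to wait for the whole batch to drain is to collect those promises. A sketch, assuming every queued call eventually runs as described above:

```typescript
// Collect the promises so the batch can be awaited as a whole
const pending: Promise<string>[] = [];
for (let i = 0; i < 10; i++) {
  pending.push(bufferedTask.trigger(`Item ${i}`));
}

const results = await Promise.all(pending);
console.log(`Completed ${results.length} buffered executions`);
```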
Buffer Truncation Behavior
When the buffer limit is reached, new calls overwrite the newest buffer slot instead of growing the queue:
const truncatingTask = new Task({
name: 'TruncatingBuffer',
taskFunction: async (data) => {
await processData(data);
},
buffered: true,
bufferMax: 5 // Maximum 5 in buffer
});
// Rapid fire 100 calls
for (let i = 0; i < 100; i++) {
truncatingTask.trigger(`Data ${i}`);
}
// Buffer behavior:
// - First 5 calls: Added to buffer and start processing
// - Calls 6-100: Each overwrites the 5th buffer slot
// - Result: Only processes items 0,1,2,3, and 99 (last one)
// - This prevents memory overflow in high-frequency scenarios
Advanced Buffer Strategies
1. Sliding Window Buffer
Perfect for real-time data processing where only recent items matter:
const slidingWindowTask = new Task({
name: 'SlidingWindow',
taskFunction: async (data) => {
return await analyzeRecentData(data);
},
buffered: true,
bufferMax: 10, // Keep last 10 items
execDelay: 100 // Process every 100ms
});
// In a real-time stream scenario
dataStream.on('data', (chunk) => {
slidingWindowTask.trigger(chunk);
// Older items automatically dropped when buffer full
});
2. Throttled Buffer
Combine buffering with execution delays for rate limiting:
const apiRateLimiter = new Task({
name: 'RateLimitedAPI',
taskFunction: async (request) => {
return await api.call(request);
},
buffered: true,
bufferMax: 10, // Max 10 queued requests
execDelay: 1000 // 1 second between executions
});
// Requests are queued and executed at 1/second
// Prevents API rate limit violations
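In practice you would funnel every outgoing request through the limiter and await the queued results; a sketch with hypothetical endpoints:

```typescript
// Route all outgoing calls through the rate-limited task
const endpoints = ['/users/1', '/users/2', '/orders?page=1']; // hypothetical endpoints
const responses = await Promise.all(
  endpoints.map((endpoint) => apiRateLimiter.trigger(endpoint)),
);
console.log(`Fetched ${responses.length} responses without hitting the rate limit`);
```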
3. Priority Buffer (Custom Implementation)
Implement priority queuing with buffer management:
class PriorityBufferedTask extends Task {
  private priorityQueue: Array<{ data: any; priority: number }> = [];

  triggerWithPriority(data: any, priority: number) {
    if (this.priorityQueue.length >= this.bufferMax) {
      // Drop the lowest-priority item when the queue is full
      this.priorityQueue.sort((a, b) => b.priority - a.priority);
      this.priorityQueue.pop();
    }
    this.priorityQueue.push({ data, priority });
    // Always execute the highest-priority item next
    this.priorityQueue.sort((a, b) => b.priority - a.priority);
    const next = this.priorityQueue.shift()!;
    return this.trigger(next.data);
  }
}
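Usage of such a subclass might then look like this (a sketch; `processData` stands in for your own handler, as in the earlier examples):

```typescript
const priorityTask = new PriorityBufferedTask({
  name: 'PriorityQueue',
  taskFunction: async (data) => processData(data),
  buffered: true,
  bufferMax: 5,
});

priorityTask.triggerWithPriority({ job: 'cleanup' }, 1);   // low priority
priorityTask.triggerWithPriority({ job: 'checkout' }, 10); // high priority runs first
```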
Buffer Monitoring
Track buffer utilization and performance:
const monitoredTask = new Task({
name: 'MonitoredBuffer',
taskFunction: async (data) => {
const startTime = Date.now();
const result = await processData(data);
console.log(`Processing time: ${Date.now() - startTime}ms`);
console.log(`Buffer utilization: ${monitoredTask.bufferRunner.bufferCounter}/${monitoredTask.bufferMax}`);
return result;
},
buffered: true,
bufferMax: 20
});
// Monitor buffer saturation
setInterval(() => {
const utilization = (monitoredTask.bufferRunner.bufferCounter / monitoredTask.bufferMax) * 100;
if (utilization > 80) {
console.warn(`Buffer near capacity: ${utilization.toFixed(1)}%`);
}
}, 1000);
Buffer Best Practices
1. Choose appropriate buffer sizes:
   - I/O operations: 5-10 concurrent
   - CPU-intensive work: roughly the number of CPU cores
   - API calls: based on the provider's rate limits
2. Handle buffer overflow gracefully:
const task = new Task({
  taskFunction: async (data) => {
    try {
      return await process(data);
    } catch (error) {
      if (error.code === 'BUFFER_OVERFLOW') {
        // Implement a backoff strategy
        await delay(1000);
        return task.trigger(data);
      }
      throw error;
    }
  },
  buffered: true,
  bufferMax: 10,
});
3. Monitor and adjust dynamically:
// Adjust buffer size based on system load
const adaptiveTask = new Task({
  name: 'AdaptiveBuffer',
  taskFunction: async (data) => {
    const cpuLoad = await getSystemLoad();
    if (cpuLoad > 0.8) {
      adaptiveTask.bufferMax = Math.max(2, adaptiveTask.bufferMax - 1);
    } else if (cpuLoad < 0.5) {
      adaptiveTask.bufferMax = Math.min(20, adaptiveTask.bufferMax + 1);
    }
    return await process(data);
  },
  buffered: true,
  bufferMax: 10,
});
Common Patterns 🎨
Task Chains - Sequential Workflows
Build complex workflows where each step depends on the previous:
import { Task, Taskchain } from '@push.rocks/taskbuffer';
const fetchTask = new Task({
name: 'FetchData',
taskFunction: async () => {
const response = await fetch('/api/data');
return response.json();
},
});
const transformTask = new Task({
name: 'TransformData',
taskFunction: async (data) => {
return data.map((item) => ({
...item,
processed: true,
timestamp: Date.now(),
}));
},
});
const saveTask = new Task({
name: 'SaveData',
taskFunction: async (transformedData) => {
await database.bulkInsert(transformedData);
return { saved: transformedData.length };
},
});
const workflow = new Taskchain({
name: 'DataPipeline',
taskArray: [fetchTask, transformTask, saveTask],
});
// Execute the entire chain
const result = await workflow.trigger();
console.log(`Processed ${result.saved} items`);
Parallel Execution - Maximum Performance
Execute multiple independent tasks simultaneously:
import { Task, Taskparallel } from '@push.rocks/taskbuffer';
const tasks = ['user', 'posts', 'comments'].map(
(resource) =>
new Task({
name: `Fetch${resource}`,
taskFunction: async () => {
const data = await fetch(`/api/${resource}`);
return data.json();
},
}),
);
const parallelFetch = new Taskparallel({
taskArray: tasks,
});
// All tasks execute simultaneously
const [users, posts, comments] = await parallelFetch.trigger();
Scheduled Tasks with TaskManager
Run tasks on a schedule using cron expressions:
import { Task, TaskManager } from '@push.rocks/taskbuffer';
const backupTask = new Task({
name: 'DatabaseBackup',
steps: [
{ name: 'dump', description: 'Creating dump', percentage: 70 },
{ name: 'upload', description: 'Uploading to S3', percentage: 30 }
] as const,
taskFunction: async () => {
backupTask.notifyStep('dump');
await performBackup();
backupTask.notifyStep('upload');
await uploadToS3();
console.log(`Backup completed at ${new Date().toISOString()}`);
},
});
const manager = new TaskManager();
// Add and schedule tasks
manager.addAndScheduleTask(backupTask, '0 0 * * *'); // Daily at midnight
// Start the scheduler
manager.start();
// Monitor scheduled tasks
const scheduled = manager.getScheduledTasks();
console.log('Scheduled tasks:', scheduled);
// Later... stop if needed
manager.stop();
Debounced Tasks - Smart Throttling
Prevent task spam with intelligent debouncing:
import { TaskDebounced } from '@push.rocks/taskbuffer';
const saveTask = new TaskDebounced({
name: 'AutoSave',
taskFunction: async (content: string) => {
await saveToDatabase(content);
console.log('Content saved');
},
debounceTimeInMillis: 2000, // Wait 2 seconds of inactivity
});
// Rapid calls will be debounced
input.addEventListener('input', (e) => {
saveTask.trigger(e.target.value);
});
One-Time Tasks - Initialize Once
Ensure initialization code runs exactly once:
import { TaskOnce } from '@push.rocks/taskbuffer';
const initTask = new TaskOnce({
name: 'SystemInitialization',
taskFunction: async () => {
await database.connect();
await cache.initialize();
await loadConfiguration();
console.log('System initialized');
},
});
// Safe to call multiple times - only runs once
await initTask.trigger();
await initTask.trigger(); // This won't run again
Advanced Features 🔥
Task Dependencies with Pre/Post Hooks
Create sophisticated task relationships:
const validationTask = new Task({
name: 'ValidateInput',
taskFunction: async (data) => {
if (!isValid(data)) {
throw new Error('Validation failed');
}
return data;
},
});
const cleanupTask = new Task({
  name: 'Cleanup',
  taskFunction: async () => {
    await releaseTempResources(); // hypothetical cleanup helper
  },
});

const mainTask = new Task({
  name: 'ProcessData',
  taskFunction: async (data) => {
    return await complexProcessing(data);
  },
  preTask: validationTask, // Runs before the main task
  afterTask: cleanupTask, // Runs after the main task
});
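Triggering `mainTask` then runs validation, processing, and cleanup in order; a minimal sketch, where `rawInput` is a hypothetical payload:

```typescript
// preTask (validation) and afterTask (cleanup) run automatically around the main task
const processed = await mainTask.trigger(rawInput);
```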
Task Runners - Distributed Execution
The TaskRunner system enables distributed task execution across multiple workers:
import { TaskRunner } from '@push.rocks/taskbuffer';
const runner = new TaskRunner({
name: 'WorkerNode1',
maxConcurrentTasks: 5,
});
// Register tasks this runner can handle
runner.registerTask(dataProcessingTask);
runner.registerTask(imageResizeTask);
// Start processing
runner.start();
Dynamic Task Creation
Create tasks on-the-fly based on runtime conditions:
const dynamicWorkflow = async (config: Config) => {
  const tasks = config.steps.map((step) => {
    const stepTask = new Task({
      name: step.name,
      steps: step.substeps?.map((s) => ({
        name: s.id,
        description: s.label,
        percentage: s.weight,
      })),
      taskFunction: async (input) => {
        for (const substep of step.substeps || []) {
          stepTask.notifyStep(substep.id);
          await processStep(substep, input);
        }
        return input;
      },
    });
    return stepTask;
  });

  const chain = new Taskchain({
    name: 'DynamicWorkflow',
    taskArray: tasks,
  });

  return await chain.trigger();
};
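Calling the factory is then a matter of passing whatever configuration your application uses. The object below is a hypothetical example of the assumed `Config` shape:

```typescript
// Hypothetical config describing the workflow to build at runtime
const config = {
  steps: [
    {
      name: 'ImportCsv',
      substeps: [
        { id: 'read', label: 'Reading file', weight: 40 },
        { id: 'parse', label: 'Parsing rows', weight: 60 },
      ],
    },
    { name: 'Publish', substeps: [] },
  ],
};

const output = await dynamicWorkflow(config);
```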
API Reference 📚
Task Options
| Option | Type | Description |
| --- | --- | --- |
| `name` | `string` | Unique identifier for the task |
| `taskFunction` | `Function` | Async function to execute |
| `steps` | `Array` | Step definitions with name, description, percentage |
| `buffered` | `boolean` | Enable buffer management |
| `bufferMax` | `number` | Maximum concurrent executions |
| `execDelay` | `number` | Delay between executions (ms) |
| `timeout` | `number` | Task timeout (ms) |
| `preTask` | `Task` | Task to run before |
| `afterTask` | `Task` | Task to run after |
Task Methods
| Method | Description |
| --- | --- |
| `trigger(x?)` | Execute the task |
| `notifyStep(stepName)` | Mark a step as active (typed step names!) |
| `getProgress()` | Get current progress percentage (0-100) |
| `getStepsMetadata()` | Get all steps with their current status |
| `getMetadata()` | Get complete task metadata |
| `resetSteps()` | Reset all steps to pending state |
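`resetSteps()` is useful when the same task instance is triggered repeatedly and progress should start from zero each run. A small sketch, reusing the migration task from the progress-tracking example above (the exact reset semantics are assumed from the table):

```typescript
migrationTask.resetSteps(); // all steps back to pending
console.log(migrationTask.getProgress()); // expected to be 0 again after the reset
await migrationTask.trigger(); // runs again with fresh step tracking
```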
TaskManager Methods
| Method | Description |
| --- | --- |
| `addTask(task)` | Add a task to the manager |
| `addAndScheduleTask(task, cron)` | Add and schedule a task |
| `getTaskByName(name)` | Get a specific task by name |
| `getTaskMetadata(name)` | Get metadata for a specific task |
| `getAllTasksMetadata()` | Get metadata for all tasks |
| `getScheduledTasks()` | Get all scheduled tasks with info |
| `getNextScheduledRuns(limit)` | Get upcoming scheduled executions |
| `addExecuteRemoveTask(task, opts)` | Execute task with lifecycle tracking |
| `triggerTaskByName(name)` | Trigger a task by its name |
| `scheduleTaskByName(name, cron)` | Schedule a task using a cron expression |
| `descheduleTaskByName(name)` | Remove a task from the schedule |
| `start()` | Start the scheduler |
| `stop()` | Stop the scheduler |
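The by-name helpers make it easy to control tasks that were registered elsewhere in your application; a sketch using only the methods from the table above and the `backupTask` from the scheduling example:

```typescript
manager.addTask(backupTask);
manager.scheduleTaskByName('DatabaseBackup', '0 3 * * *'); // move the backup to 3 AM
manager.triggerTaskByName('DatabaseBackup'); // run it once right now
manager.descheduleTaskByName('DatabaseBackup'); // stop future scheduled runs
```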
Taskchain Methods
| Method | Description |
| --- | --- |
| `addTask(task)` | Add a task to the chain |
| `removeTask(taskName)` | Remove a task from the chain |
| `trigger(initialValue)` | Execute the chain |
| `reset()` | Reset chain state |
Performance Tips 🏎️
- Use buffering for I/O operations: Prevents overwhelming external services (see the combined sketch after this list)
- Leverage parallel execution: When tasks are independent, run them simultaneously
- Implement proper error handling: Use try-catch in task functions
- Monitor task execution: Use the built-in stats and logging
- Set appropriate timeouts: Prevent hanging tasks from blocking your system
- Use step tracking wisely: Don't create too many granular steps - aim for meaningful progress points
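As a rough illustration of how these tips combine, here is a hedged sketch of a task tuned for an external HTTP API, using only the options from the reference above:

```typescript
const tunedTask = new Task({
  name: 'TunedApiCall',
  taskFunction: async (endpoint: string) => {
    const response = await fetch(`https://api.example.com${endpoint}`);
    return response.json();
  },
  buffered: true,
  bufferMax: 5, // I/O bound: keep a small concurrent window
  execDelay: 200, // space executions out to respect rate limits
  timeout: 5000, // give up on calls that hang for more than 5 seconds
});
```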
Error Handling 🛡️
const robustTask = new Task({
name: 'RobustOperation',
steps: [
{ name: 'try', description: 'Attempting operation', percentage: 80 },
{ name: 'retry', description: 'Retrying on failure', percentage: 20 }
] as const,
taskFunction: async (input) => {
try {
robustTask.notifyStep('try');
return await riskyOperation(input);
} catch (error) {
// Log error
console.error(`Task failed: ${error.message}`);
// Optionally retry
if (error.retryable) {
robustTask.notifyStep('retry');
return await riskyOperation(input);
}
// Or return default value
return defaultValue;
}
},
timeout: 5000, // Fail if takes longer than 5 seconds
});
Real-World Examples 🌍
API Rate Limiting with Progress
const apiClient = new Task({
name: 'RateLimitedAPI',
steps: [
{ name: 'wait', description: 'Rate limit delay', percentage: 10 },
{ name: 'call', description: 'API call', percentage: 90 }
] as const,
taskFunction: async (endpoint: string) => {
apiClient.notifyStep('wait');
await delay(100); // Rate limiting
apiClient.notifyStep('call');
return await fetch(`https://api.example.com${endpoint}`);
},
buffered: true,
  bufferMax: 10, // Up to 10 queued requests
  execDelay: 100, // 100 ms between executions ~ 10 req/s max
});
Database Migration Pipeline with Progress
const backupTask = new Task({
  name: 'Backup',
  steps: [
    { name: 'backup', description: 'Creating backup', percentage: 100 }
  ] as const,
  taskFunction: async () => {
    backupTask.notifyStep('backup');
    return await createBackup();
  },
});

const schemaTask = new Task({
  name: 'SchemaUpdate',
  steps: [
    { name: 'analyze', description: 'Analyzing changes', percentage: 30 },
    { name: 'apply', description: 'Applying migrations', percentage: 70 }
  ] as const,
  taskFunction: async () => {
    schemaTask.notifyStep('analyze');
    const changes = await analyzeSchema();
    schemaTask.notifyStep('apply');
    return await applyMigrations(changes);
  },
});

const migrationChain = new Taskchain({
  name: 'DatabaseMigration',
  taskArray: [
    backupTask,
    schemaTask,
    // ... more tasks
  ],
});
// Execute with progress monitoring
const result = await migrationChain.trigger();
Microservice Health Monitoring Dashboard
const healthMonitor = new TaskManager();
services.forEach((service) => {
const healthCheck = new Task({
name: `HealthCheck:${service.name}`,
steps: [
{ name: 'ping', description: 'Pinging service', percentage: 30 },
{ name: 'check', description: 'Checking health', percentage: 50 },
{ name: 'report', description: 'Reporting status', percentage: 20 }
] as const,
taskFunction: async () => {
healthCheck.notifyStep('ping');
const responsive = await ping(service.url);
healthCheck.notifyStep('check');
const healthy = await checkHealth(service.url);
healthCheck.notifyStep('report');
if (!healthy) {
await alertOps(service);
}
return { service: service.name, healthy, timestamp: Date.now() };
},
});
healthMonitor.addAndScheduleTask(healthCheck, '*/1 * * * *'); // Every minute
});
// Dashboard endpoint
app.get('/api/health/dashboard', (req, res) => {
const metadata = healthMonitor.getAllTasksMetadata();
res.json({
services: metadata.map(task => ({
name: task.name.replace('HealthCheck:', ''),
status: task.status,
lastCheck: task.lastRun,
nextCheck: healthMonitor.getScheduledTasks()
.find(s => s.name === task.name)?.nextRun,
progress: task.currentProgress,
currentStep: task.currentStep
}))
});
});
Testing 🧪
import { expect, tap } from '@git.zone/tstest';
import { Task, TaskManager } from '@push.rocks/taskbuffer';
tap.test('should track task progress through steps', async () => {
const task = new Task({
name: 'TestTask',
steps: [
{ name: 'step1', description: 'First step', percentage: 50 },
{ name: 'step2', description: 'Second step', percentage: 50 }
] as const,
taskFunction: async () => {
task.notifyStep('step1');
expect(task.getProgress()).toBeLessThanOrEqual(50);
task.notifyStep('step2');
expect(task.getProgress()).toBeLessThanOrEqual(100);
}
});
await task.trigger();
expect(task.getProgress()).toEqual(100);
});
tap.test('should collect execution metadata', async () => {
const manager = new TaskManager();
const task = new Task({
name: 'MetadataTest',
taskFunction: async () => 'result'
});
const report = await manager.addExecuteRemoveTask(task);
expect(report.taskName).toEqual('MetadataTest');
expect(report.result).toEqual('result');
expect(report.duration).toBeGreaterThan(0);
});
tap.start();
Support 💬
- 📧 Email: hello@task.vc
- 🐛 Issues: GitHub Issues
- 📖 Docs: Documentation
License and Legal Information
This repository contains open-source code that is licensed under the MIT License. A copy of the MIT License can be found in the license file within this repository.
Please note: The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.
Trademarks
This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH and are not included within the scope of the MIT license granted herein. Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines, and any usage must be approved in writing by Task Venture Capital GmbH.
Company Information
Task Venture Capital GmbH
Registered at District court Bremen HRB 35230 HB, Germany
For any legal inquiries or if you require further information, please contact us via email at hello@task.vc.
By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.