# SmartProxy Performance Optimization Plan
## Executive Summary
This plan addresses critical performance issues in SmartProxy that impact scalability, responsiveness, and stability. The approach is phased, starting with critical event loop blockers and progressing to long-term architectural improvements.
## Phase 1: Critical Issues (Week 1)
### 1.1 Eliminate Busy Wait Loop
**Issue**: `ts/proxies/nftables-proxy/nftables-proxy.ts:235-238` blocks the entire event loop
**Solution**:
```typescript
// Create utility function in ts/core/utils/async-utils.ts
export async function delay(ms: number): Promise<void> {
  return new Promise(resolve => setTimeout(resolve, ms));
}

// Replace busy wait in nftables-proxy.ts
// OLD:
const waitUntil = Date.now() + retryDelayMs;
while (Date.now() < waitUntil) { }

// NEW:
await delay(retryDelayMs);
```
**Implementation**:
1. Create `async-utils.ts` with common async utilities
2. Replace all synchronous sleeps with async delay
3. Ensure all calling functions are async (see the sketch below)
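For step 3, converting a synchronous caller usually means pushing `async` up the call chain. A minimal sketch, with illustrative names and an assumed import path rather than the actual SmartProxy code:

```typescript
import { delay } from '../../core/utils/async-utils.js'; // path assumed

// Hypothetical caller: a retry wrapper that previously busy-waited between attempts
async function restartWithRetry(
  restart: () => Promise<void>,
  maxAttempts = 3,
  retryDelayMs = 1000
): Promise<void> {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      await restart();
      return;
    } catch (err) {
      if (attempt === maxAttempts) throw err;
      await delay(retryDelayMs); // yields to the event loop instead of spinning
    }
  }
}
```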
### 1.2 Async Filesystem Operations
**Issue**: Multiple synchronous filesystem operations blocking the event loop
**Solution Architecture**:
```typescript
// Create ts/core/utils/fs-utils.ts
import * as plugins from '../../plugins.js';

export class AsyncFileSystem {
  static async exists(path: string): Promise<boolean> {
    try {
      await plugins.fs.promises.access(path);
      return true;
    } catch {
      return false;
    }
  }

  static async ensureDir(path: string): Promise<void> {
    await plugins.fs.promises.mkdir(path, { recursive: true });
  }

  static async readFile(path: string): Promise<string> {
    return plugins.fs.promises.readFile(path, 'utf8');
  }

  static async writeFile(path: string, data: string): Promise<void> {
    // Ensure directory exists
    const dir = plugins.path.dirname(path);
    await this.ensureDir(dir);
    await plugins.fs.promises.writeFile(path, data);
  }

  static async remove(path: string): Promise<void> {
    try {
      await plugins.fs.promises.unlink(path);
    } catch (error: any) {
      if (error.code !== 'ENOENT') throw error;
    }
  }

  static async readJSON<T>(path: string): Promise<T> {
    const content = await this.readFile(path);
    return JSON.parse(content);
  }

  static async writeJSON(path: string, data: any): Promise<void> {
    await this.writeFile(path, JSON.stringify(data, null, 2));
  }
}
```
**Migration Strategy**:
1. **Certificate Manager** (`ts/proxies/http-proxy/certificate-manager.ts`)
```typescript
// OLD:
constructor(private options: IHttpProxyOptions) {
  if (!fs.existsSync(this.certDir)) {
    fs.mkdirSync(this.certDir, { recursive: true });
  }
}

// NEW:
private initialized = false;

constructor(private options: IHttpProxyOptions) {}

async initialize(): Promise<void> {
  if (this.initialized) return;
  await AsyncFileSystem.ensureDir(this.certDir);
  this.initialized = true;
}

async getCertificate(domain: string): Promise<{ cert: string; key: string } | null> {
  await this.initialize();
  const certPath = path.join(this.certDir, `${domain}.crt`);
  const keyPath = path.join(this.certDir, `${domain}.key`);

  if (await AsyncFileSystem.exists(certPath) && await AsyncFileSystem.exists(keyPath)) {
    const [cert, key] = await Promise.all([
      AsyncFileSystem.readFile(certPath),
      AsyncFileSystem.readFile(keyPath)
    ]);
    return { cert, key };
  }
  return null;
}
```
2. **Certificate Store** (`ts/proxies/smart-proxy/cert-store.ts`)
```typescript
// Convert all methods to async
export class CertStore {
  constructor(private storePath: string) {}

  async init(): Promise<void> {
    await AsyncFileSystem.ensureDir(this.storePath);
  }

  async hasCertificate(domain: string): Promise<boolean> {
    const certPath = this.getCertificatePath(domain);
    return AsyncFileSystem.exists(certPath);
  }

  async getCertificate(domain: string): Promise<ICertificateInfo | null> {
    if (!await this.hasCertificate(domain)) return null;
    const metaPath = path.join(this.getCertificatePath(domain), 'meta.json');
    return AsyncFileSystem.readJSON<ICertificateInfo>(metaPath);
  }
}
```
3. **NFTables Proxy** (`ts/proxies/nftables-proxy/nftables-proxy.ts`)
```typescript
// Replace execSync with execAsync (where execAsync = promisify(exec))
private async execNftCommand(command: string): Promise<string> {
  const maxRetries = 3;
  let lastError: Error | null = null;

  for (let i = 0; i < maxRetries; i++) {
    try {
      const { stdout } = await this.execAsync(command);
      return stdout;
    } catch (err: any) {
      lastError = err;
      if (i < maxRetries - 1) {
        await delay(this.retryDelayMs);
      }
    }
  }
  throw new NftExecutionError(`Failed after ${maxRetries} attempts: ${lastError?.message}`);
}
```
## Dependencies Between Phases
### Critical Path
```
Phase 1.1 (Busy Wait) ─┐
                       ├─> Phase 2.1 (Timer Management) ─> Phase 3.2 (Worker Threads)
Phase 1.2 (Async FS) ──┘                                          │
                                                                  ├─> Phase 4.1 (Monitoring)
Phase 2.2 (Connection Pool) ──────────────────────────────────────┘
```
### Phase Dependencies
- **Phase 1** must complete before Phase 2 (foundation for async operations)
- **Phase 2.1** enables proper cleanup for Phase 3.2 worker threads
- **Phase 3** optimizations depend on stable async foundation
- **Phase 4** monitoring requires all components to be instrumented
## Phase 2: Resource Management (Week 2)
### 2.1 Timer Lifecycle Management
**Issue**: Timers created without cleanup references causing memory leaks
**Solution Pattern**:
```typescript
// Create base class in ts/core/utils/lifecycle-component.ts
export abstract class LifecycleComponent {
  private timers: Set<NodeJS.Timeout> = new Set();
  private listeners: Array<{ target: any, event: string, handler: Function }> = [];
  protected isShuttingDown = false;

  protected setInterval(handler: Function, timeout: number): NodeJS.Timeout {
    const timer = setInterval(() => {
      if (!this.isShuttingDown) {
        handler();
      }
    }, timeout);
    this.timers.add(timer);
    return timer;
  }

  protected setTimeout(handler: Function, timeout: number): NodeJS.Timeout {
    const timer = setTimeout(() => {
      this.timers.delete(timer);
      if (!this.isShuttingDown) {
        handler();
      }
    }, timeout);
    this.timers.add(timer);
    return timer;
  }

  protected addEventListener(target: any, event: string, handler: Function): void {
    target.on(event, handler);
    this.listeners.push({ target, event, handler });
  }

  protected async cleanup(): Promise<void> {
    this.isShuttingDown = true;

    // Clear all timers
    for (const timer of this.timers) {
      clearInterval(timer);
      clearTimeout(timer);
    }
    this.timers.clear();

    // Remove all listeners
    for (const { target, event, handler } of this.listeners) {
      target.removeListener(event, handler);
    }
    this.listeners = [];
  }
}
```
**Implementation**:
1. Extend LifecycleComponent in:
- `HttpProxy`
- `SmartProxy`
- `ConnectionManager`
- `RequestHandler`
- `SharedSecurityManager`
2. Replace direct timer/listener usage with lifecycle methods (usage sketch below)
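For example, a component that extends the base class registers its timers through the inherited helpers and tears everything down in one call. The class below is an illustrative sketch, not an actual SmartProxy component:

```typescript
class ExampleManager extends LifecycleComponent {
  constructor() {
    super();
    // Tracked by LifecycleComponent and cleared by cleanup()
    this.setInterval(() => this.flushMetrics(), 5000);
  }

  private flushMetrics(): void {
    // periodic work that must stop on shutdown
  }

  public async stop(): Promise<void> {
    await this.cleanup(); // clears timers and removes registered listeners
  }
}
```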
### 2.2 Connection Pool Enhancement
**Issue**: No backpressure mechanism and synchronous operations
**Solution**:
```typescript
// First, implement an efficient BinaryHeap for O(log n) operations
// Create ts/core/utils/binary-heap.ts
export class BinaryHeap<T> {
  private heap: T[] = [];

  constructor(
    private compareFn: (a: T, b: T) => number,
    private extractKey?: (item: T) => string
  ) {}

  insert(item: T): void {
    this.heap.push(item);
    this.bubbleUp(this.heap.length - 1);
  }

  extract(): T | undefined {
    if (this.heap.length === 0) return undefined;
    if (this.heap.length === 1) return this.heap.pop();
    const result = this.heap[0];
    this.heap[0] = this.heap.pop()!;
    this.bubbleDown(0);
    return result;
  }

  extractIf(predicate: (item: T) => boolean): T | undefined {
    const index = this.heap.findIndex(predicate);
    if (index === -1) return undefined;
    if (index === this.heap.length - 1) return this.heap.pop();
    const result = this.heap[index];
    this.heap[index] = this.heap.pop()!;
    // Restore heap property
    this.bubbleUp(index);
    this.bubbleDown(index);
    return result;
  }

  size(): number {
    return this.heap.length;
  }

  sizeFor(key: string): number {
    if (!this.extractKey) return this.heap.length;
    return this.heap.filter(item => this.extractKey!(item) === key).length;
  }

  private bubbleUp(index: number): void {
    while (index > 0) {
      const parentIndex = Math.floor((index - 1) / 2);
      if (this.compareFn(this.heap[index], this.heap[parentIndex]) >= 0) break;
      [this.heap[index], this.heap[parentIndex]] =
        [this.heap[parentIndex], this.heap[index]];
      index = parentIndex;
    }
  }

  private bubbleDown(index: number): void {
    while (true) {
      let minIndex = index;
      const leftChild = 2 * index + 1;
      const rightChild = 2 * index + 2;
      if (leftChild < this.heap.length &&
          this.compareFn(this.heap[leftChild], this.heap[minIndex]) < 0) {
        minIndex = leftChild;
      }
      if (rightChild < this.heap.length &&
          this.compareFn(this.heap[rightChild], this.heap[minIndex]) < 0) {
        minIndex = rightChild;
      }
      if (minIndex === index) break;
      [this.heap[index], this.heap[minIndex]] =
        [this.heap[minIndex], this.heap[index]];
      index = minIndex;
    }
  }
}
// Enhanced connection pool with queue and heap
export class EnhancedConnectionPool extends LifecycleComponent {
  private connectionQueue: Array<{
    resolve: (socket: net.Socket) => void;
    reject: (error: Error) => void;
    host: string;
    port: number;
    timestamp: number;
  }> = [];
  private connectionHeap: BinaryHeap<IConnectionEntry>;
  private metricsCollector: ConnectionMetrics;

  constructor(private options: IConnectionPoolOptions) {
    super();
    // Priority: least recently used connections first
    this.connectionHeap = new BinaryHeap(
      (a, b) => a.lastUsed - b.lastUsed,
      (item) => item.poolKey
    );
    this.metricsCollector = new ConnectionMetrics();
    this.startQueueProcessor();
  }

  private startQueueProcessor(): void {
    // Process queue periodically to handle timeouts and retries
    this.setInterval(() => {
      const now = Date.now();
      const timeout = this.options.connectionQueueTimeout || 30000;

      // Remove timed out requests
      this.connectionQueue = this.connectionQueue.filter(item => {
        if (now - item.timestamp > timeout) {
          item.reject(new Error(`Connection pool timeout for ${item.host}:${item.port}`));
          this.metricsCollector.recordTimeout();
          return false;
        }
        return true;
      });

      // Try to fulfill queued requests
      this.processQueue();
    }, 1000);
  }

  private processQueue(): void {
    if (this.connectionQueue.length === 0) return;

    // Group by destination
    const grouped = new Map<string, typeof this.connectionQueue>();
    for (const item of this.connectionQueue) {
      const key = `${item.host}:${item.port}`;
      if (!grouped.has(key)) grouped.set(key, []);
      grouped.get(key)!.push(item);
    }

    // Try to fulfill requests for each destination
    for (const [poolKey, requests] of grouped) {
      const available = this.connectionHeap.extractIf(
        conn => conn.poolKey === poolKey && conn.isIdle && !conn.socket.destroyed
      );

      if (available) {
        const request = requests.shift()!;
        this.connectionQueue = this.connectionQueue.filter(item => item !== request);
        available.isIdle = false;
        available.lastUsed = Date.now();
        request.resolve(available.socket);
        this.metricsCollector.recordReuse();
      }
    }
  }

  async getConnection(host: string, port: number): Promise<net.Socket> {
    const poolKey = `${host}:${port}`;

    // Try to get existing connection
    const connection = this.connectionHeap.extractIf(
      conn => conn.poolKey === poolKey && conn.isIdle && !conn.socket.destroyed
    );

    if (connection) {
      connection.isIdle = false;
      connection.lastUsed = Date.now();
      this.metricsCollector.recordReuse();

      // Validate connection is still alive
      if (await this.validateConnection(connection.socket)) {
        return connection.socket;
      }

      // Connection is dead, try another
      connection.socket.destroy();
      return this.getConnection(host, port);
    }

    // Check pool size
    const poolSize = this.connectionHeap.sizeFor(poolKey);
    if (poolSize < this.options.connectionPoolSize) {
      return this.createConnection(host, port);
    }

    // Queue the request (createConnection / queueConnectionRequest are elided in this sketch)
    return this.queueConnectionRequest(host, port);
  }

  private async validateConnection(socket: net.Socket): Promise<boolean> {
    // Cheap liveness check: a destroyed or non-readable/writable socket is unusable.
    // A deeper check could send a probe or rely on TCP keep-alive.
    return !socket.destroyed && socket.readable && socket.writable;
  }

  returnConnection(socket: net.Socket, host: string, port: number): void {
    const poolKey = `${host}:${port}`;

    // Check for queued requests first
    const queuedIndex = this.connectionQueue.findIndex(
      item => item.host === host && item.port === port
    );
    if (queuedIndex >= 0) {
      const queued = this.connectionQueue.splice(queuedIndex, 1)[0];
      queued.resolve(socket);
      this.metricsCollector.recordDirectHandoff();
      return;
    }

    // Return to pool
    this.connectionHeap.insert({
      socket,
      poolKey,
      lastUsed: Date.now(),
      isIdle: true,
      created: Date.now()
    });
  }

  getMetrics(): IConnectionPoolMetrics {
    return {
      ...this.metricsCollector.getMetrics(),
      poolSize: this.connectionHeap.size(),
      queueLength: this.connectionQueue.length
    };
  }
}
```
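The pool above references `IConnectionEntry` and `ConnectionMetrics`, which are not defined elsewhere in this plan. A minimal sketch of what they could look like:

```typescript
import * as net from 'net';

// Shape of a pooled connection as used by the heap above
interface IConnectionEntry {
  socket: net.Socket;
  poolKey: string;
  lastUsed: number;
  created: number;
  isIdle: boolean;
}

// Simple counters; the real implementation may expose more detail
class ConnectionMetrics {
  private reuses = 0;
  private timeouts = 0;
  private directHandoffs = 0;

  recordReuse(): void { this.reuses++; }
  recordTimeout(): void { this.timeouts++; }
  recordDirectHandoff(): void { this.directHandoffs++; }

  getMetrics() {
    return { reuses: this.reuses, timeouts: this.timeouts, directHandoffs: this.directHandoffs };
  }
}
```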
## Phase 3: Performance Optimizations (Week 3)
### 3.1 JSON Operations Optimization
**Issue**: Frequent JSON.stringify for cache keys
**Solution**:
```typescript
// Create ts/core/utils/hash-utils.ts
import * as crypto from 'crypto';

export class HashUtils {
  private static readonly objectCache = new WeakMap<object, string>();

  static hashObject(obj: any): string {
    // Check cache first
    if (typeof obj === 'object' && obj !== null) {
      const cached = this.objectCache.get(obj);
      if (cached) return cached;
    }

    // Create stable string representation
    const str = this.stableStringify(obj);
    const hash = crypto.createHash('sha256').update(str).digest('hex').slice(0, 16);

    // Cache if object
    if (typeof obj === 'object' && obj !== null) {
      this.objectCache.set(obj, hash);
    }
    return hash;
  }

  private static stableStringify(obj: any): string {
    if (obj === null || typeof obj !== 'object') {
      return JSON.stringify(obj);
    }
    if (Array.isArray(obj)) {
      return '[' + obj.map(item => this.stableStringify(item)).join(',') + ']';
    }
    const keys = Object.keys(obj).sort();
    const pairs = keys.map(key => `"${key}":${this.stableStringify(obj[key])}`);
    return '{' + pairs.join(',') + '}';
  }
}

// Update function-cache.ts
private computeContextHash(context: IRouteContext): string {
  return HashUtils.hashObject({
    domain: context.domain,
    path: context.path,
    clientIp: context.clientIp
  });
}
```
### 3.2 Worker Thread Integration
**Issue**: CPU-intensive operations blocking event loop
**Solution Architecture**:
```typescript
// Create ts/core/workers/worker-pool.ts
import { Worker } from 'worker_threads';

export class WorkerPool {
  private workers: Worker[] = [];
  private queue: Array<{
    task: any;
    resolve: Function;
    reject: Function;
  }> = [];
  private busyWorkers = new Set<Worker>();

  constructor(
    private workerScript: string,
    private poolSize: number = 4
  ) {
    this.initializeWorkers();
  }

  private initializeWorkers(): void {
    for (let i = 0; i < this.poolSize; i++) {
      this.workers.push(new Worker(this.workerScript));
    }
  }

  private async getAvailableWorker(): Promise<Worker> {
    const idle = this.workers.find(worker => !this.busyWorkers.has(worker));
    if (idle) {
      this.busyWorkers.add(idle);
      return idle;
    }
    // No idle worker: park the request until releaseWorker() hands one back
    return new Promise<Worker>(resolve => {
      this.queue.push({ task: null, resolve, reject: () => {} });
    });
  }

  private releaseWorker(worker: Worker): void {
    const waiting = this.queue.shift();
    if (waiting) {
      // Hand the worker straight to the next queued request (it stays marked busy)
      waiting.resolve(worker);
    } else {
      this.busyWorkers.delete(worker);
    }
  }

  async execute<T>(task: any): Promise<T> {
    const worker = await this.getAvailableWorker();
    return new Promise((resolve, reject) => {
      const messageHandler = (result: any) => {
        worker.off('message', messageHandler);
        worker.off('error', errorHandler);
        this.releaseWorker(worker);
        resolve(result);
      };
      const errorHandler = (error: Error) => {
        worker.off('message', messageHandler);
        worker.off('error', errorHandler);
        this.releaseWorker(worker);
        reject(error);
      };
      worker.on('message', messageHandler);
      worker.on('error', errorHandler);
      worker.postMessage(task);
    });
  }
}

// Create ts/core/workers/nftables-worker.ts
import { parentPort } from 'worker_threads';
import { exec } from 'child_process';
import { promisify } from 'util';

const execAsync = promisify(exec);

parentPort?.on('message', async (task) => {
  try {
    const result = await execAsync(task.command);
    parentPort?.postMessage({ success: true, result });
  } catch (error: any) {
    parentPort?.postMessage({ success: false, error: error.message });
  }
});
```
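A possible call site in the NFTables proxy, sketched under the assumption that the compiled worker file sits next to the rest of the build output; the path construction and wrapper name are illustrative:

```typescript
// Assumed worker location relative to the calling module
const nftablesWorkers = new WorkerPool(
  new URL('../../core/workers/nftables-worker.js', import.meta.url).pathname,
  2
);

async function runNftCommandOffloaded(command: string): Promise<string> {
  // Response shape mirrors what nftables-worker.ts posts back
  const response = await nftablesWorkers.execute<{
    success: boolean;
    result?: { stdout: string; stderr: string };
    error?: string;
  }>({ command });

  if (!response.success) {
    throw new Error(`nft command failed: ${response.error}`);
  }
  return response.result!.stdout;
}
```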
## Phase 4: Monitoring & Metrics (Week 4)
### 4.1 Event Loop Monitoring
```typescript
// Create ts/core/monitoring/performance-monitor.ts
export class PerformanceMonitor extends LifecycleComponent {
  private metrics = {
    eventLoopLag: [] as number[],
    activeConnections: 0,
    memoryUsage: {} as NodeJS.MemoryUsage,
    cpuUsage: {} as NodeJS.CpuUsage
  };

  start() {
    // Monitor event loop lag
    let lastCheck = process.hrtime.bigint();
    this.setInterval(() => {
      const now = process.hrtime.bigint();
      const expectedInterval = 100n * 1000000n; // 100ms in nanoseconds
      const actualInterval = now - lastCheck;
      const lag = Number(actualInterval - expectedInterval) / 1000000; // Convert to ms

      this.metrics.eventLoopLag.push(lag);
      if (this.metrics.eventLoopLag.length > 100) {
        this.metrics.eventLoopLag.shift();
      }
      lastCheck = now;
    }, 100);

    // Monitor system resources
    this.setInterval(() => {
      this.metrics.memoryUsage = process.memoryUsage();
      this.metrics.cpuUsage = process.cpuUsage();
    }, 5000);
  }

  getMetrics() {
    const samples = this.metrics.eventLoopLag;
    const avgLag = samples.length
      ? samples.reduce((a, b) => a + b, 0) / samples.length
      : 0;

    return {
      eventLoopLag: {
        current: samples[samples.length - 1] ?? 0,
        average: avgLag,
        max: samples.length ? Math.max(...samples) : 0
      },
      memory: this.metrics.memoryUsage,
      cpu: this.metrics.cpuUsage,
      activeConnections: this.metrics.activeConnections
    };
  }
}
```
## Testing Strategy
### Unit Tests
1. Create tests for each new utility class (see the sketch after this list)
2. Mock filesystem and network operations
3. Test error scenarios and edge cases
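A minimal sketch of such a unit test, assuming tapbundle exposes the usual jest-style matchers (`toEqual`, `toBeGreaterThanOrEqual`) alongside `toBeLessThan`:

```typescript
// test/core/async-utils.test.ts (paths assumed)
import { tap, expect } from '@git.zone/tstest/tapbundle';
import { delay } from '../../ts/core/utils/async-utils.js';
import { HashUtils } from '../../ts/core/utils/hash-utils.js';

tap.test('delay should resolve after roughly the requested time', async () => {
  const start = Date.now();
  await delay(50);
  const elapsed = Date.now() - start;
  expect(elapsed).toBeGreaterThanOrEqual(45);
  expect(elapsed).toBeLessThan(500);
});

tap.test('hashObject should be stable across key order', async () => {
  const a = HashUtils.hashObject({ domain: 'example.com', path: '/api' });
  const b = HashUtils.hashObject({ path: '/api', domain: 'example.com' });
  expect(a).toEqual(b);
});
```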
### Integration Tests
1. Test async migration with real filesystem
2. Verify timer cleanup on shutdown (see the sketch after this list)
3. Test connection pool under load
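A sketch of the timer-cleanup check (import path assumed):

```typescript
import { tap, expect } from '@git.zone/tstest/tapbundle';
import { LifecycleComponent } from '../../ts/core/utils/lifecycle-component.js';

tap.test('cleanup should stop all tracked timers', async () => {
  let ticks = 0;

  class TestComponent extends LifecycleComponent {
    public start() { this.setInterval(() => ticks++, 10); }
    public async stop() { await this.cleanup(); }
  }

  const component = new TestComponent();
  component.start();
  await new Promise(resolve => setTimeout(resolve, 50));
  await component.stop();

  const ticksAtShutdown = ticks;
  await new Promise(resolve => setTimeout(resolve, 50));
  expect(ticks).toEqual(ticksAtShutdown); // no further ticks after cleanup
});
```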
### Performance Tests
```typescript
// Create test/performance/event-loop-test.ts
import { tap, expect } from '@git.zone/tstest/tapbundle';

tap.test('should not block event loop', async () => {
  const intervals: number[] = [];
  let lastTime = Date.now();

  const timer = setInterval(() => {
    const now = Date.now();
    intervals.push(now - lastTime);
    lastTime = now;
  }, 10);

  // Run operations that might block
  await runPotentiallyBlockingOperation();

  clearInterval(timer);

  // Check that no interval exceeded 50ms (allowing some tolerance)
  const maxInterval = Math.max(...intervals);
  expect(maxInterval).toBeLessThan(50);
});
```
## Migration Timeline
### Week 1: Critical Fixes
- Day 1-2: Fix busy wait loop
- Day 3-4: Convert critical sync operations
- Day 5: Testing and validation
### Week 2: Resource Management
- Day 1-2: Implement LifecycleComponent
- Day 3-4: Migrate components
- Day 5: Connection pool enhancement
### Week 3: Optimizations
- Day 1-2: JSON operation optimization
- Day 3-4: Worker thread integration
- Day 5: Performance testing
### Week 4: Monitoring & Polish
- Day 1-2: Performance monitoring
- Day 3-4: Load testing
- Day 5: Documentation and release
## Error Handling Strategy
### Graceful Degradation
```typescript
// Create ts/core/utils/error-handler.ts
export class ErrorHandler {
  private static errorCounts = new Map<string, number>();
  private static circuitBreakers = new Map<string, CircuitBreaker>();

  static async withFallback<T>(
    operation: () => Promise<T>,
    fallback: () => Promise<T>,
    context: string
  ): Promise<T> {
    const breaker = this.getCircuitBreaker(context);
    if (breaker.isOpen()) {
      return fallback();
    }

    try {
      const result = await operation();
      breaker.recordSuccess();
      return result;
    } catch (error) {
      breaker.recordFailure();
      this.recordError(context, error);
      if (breaker.isOpen()) {
        logger.warn(`Circuit breaker opened for ${context}`);
      }
      return fallback();
    }
  }

  private static getCircuitBreaker(context: string): CircuitBreaker {
    if (!this.circuitBreakers.has(context)) {
      this.circuitBreakers.set(context, new CircuitBreaker({
        failureThreshold: 5,
        resetTimeout: 60000
      }));
    }
    return this.circuitBreakers.get(context)!;
  }
}

// Usage example in Certificate Manager
async getCertificate(domain: string): Promise<CertificateInfo | null> {
  return ErrorHandler.withFallback(
    // Try async operation
    async () => {
      await this.initialize();
      return this.loadCertificateAsync(domain);
    },
    // Fallback to sync if needed
    async () => {
      logger.warn(`Falling back to sync certificate load for ${domain}`);
      return this.loadCertificateSync(domain);
    },
    'certificate-load'
  );
}
```
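`CircuitBreaker` is referenced above but not specified in this plan. A minimal sketch of the behaviour `ErrorHandler` assumes (open after a failure threshold, half-open again after the reset timeout):

```typescript
class CircuitBreaker {
  private failures = 0;
  private openedAt = 0;

  constructor(private options: { failureThreshold: number; resetTimeout: number }) {}

  isOpen(): boolean {
    if (this.failures < this.options.failureThreshold) return false;
    // Half-open after the reset timeout: let one trial call through
    if (Date.now() - this.openedAt > this.options.resetTimeout) {
      this.failures = this.options.failureThreshold - 1;
      return false;
    }
    return true;
  }

  recordSuccess(): void {
    this.failures = 0;
  }

  recordFailure(): void {
    this.failures++;
    if (this.failures === this.options.failureThreshold) {
      this.openedAt = Date.now();
    }
  }
}
```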
## Backward Compatibility
### API Preservation
1. **Maintain existing interfaces** - All public APIs remain unchanged
2. **Progressive enhancement** - New features are opt-in via configuration
3. **Sync method wrappers** - Provide sync-looking APIs that use async internally
```typescript
// Example: Maintaining backward compatibility
export class CertStore {
  // Old sync API (deprecated but maintained)
  getCertificateSync(domain: string): ICertificateInfo | null {
    console.warn('getCertificateSync is deprecated, use getCertificate');
    return this.syncFallbackGetCertificate(domain);
  }

  // New async API
  async getCertificate(domain: string): Promise<ICertificateInfo | null> {
    return this.asyncGetCertificate(domain);
  }

  // Smart detection for gradual migration
  getCertificateAuto(domain: string, callback?: (err: Error | null, cert: ICertificateInfo | null) => void) {
    if (callback) {
      // Callback style for compatibility
      this.getCertificate(domain)
        .then(cert => callback(null, cert))
        .catch(err => callback(err, null));
    } else {
      // Return promise for modern usage
      return this.getCertificate(domain);
    }
  }
}
```
### Configuration Compatibility
```typescript
// Support both old and new configuration formats
interface SmartProxyOptions {
  // Old options (maintained)
  preserveSourceIP?: boolean;
  defaultAllowedIPs?: string[];

  // New performance options (added)
  performance?: {
    asyncFilesystem?: boolean;
    enhancedConnectionPool?: boolean;
    workerThreads?: boolean;
  };
}
```
## Monitoring Dashboard
### Real-time Metrics Visualization
```typescript
// Create ts/core/monitoring/dashboard-server.ts
import * as http from 'http';
import * as WebSocket from 'ws'; // 'ws' is assumed here for WebSocket.Server

export class MonitoringDashboard {
  private httpServer: http.Server;
  private wsServer: WebSocket.Server;
  private metricsHistory: MetricsHistory;
  private performanceMonitor: PerformanceMonitor;

  async start(port: number = 9090): Promise<void> {
    this.httpServer = http.createServer(this.handleRequest.bind(this));
    this.wsServer = new WebSocket.Server({ server: this.httpServer });

    this.wsServer.on('connection', (ws) => {
      // Send current metrics
      ws.send(JSON.stringify({
        type: 'initial',
        data: this.metricsHistory.getLast(100)
      }));

      // Subscribe to updates
      const interval = setInterval(() => {
        if (ws.readyState === WebSocket.OPEN) {
          ws.send(JSON.stringify({
            type: 'update',
            data: this.performanceMonitor.getMetrics()
          }));
        }
      }, 1000);

      ws.on('close', () => clearInterval(interval));
    });

    this.httpServer.listen(port);
    logger.info(`Monitoring dashboard available at http://localhost:${port}`);
  }

  private handleRequest(req: http.IncomingMessage, res: http.ServerResponse) {
    if (req.url === '/') {
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end(this.getDashboardHTML());
    } else if (req.url === '/metrics') {
      res.writeHead(200, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify(this.performanceMonitor.getMetrics()));
    }
  }

  private getDashboardHTML(): string {
    return `
<!DOCTYPE html>
<html>
<head>
  <title>SmartProxy Performance Monitor</title>
  <script src="https://cdn.jsdelivr.net/npm/chart.js"></script>
  <style>
    body { font-family: Arial, sans-serif; padding: 20px; }
    .metric { display: inline-block; margin: 10px; padding: 15px; background: #f0f0f0; }
    .chart-container { width: 45%; display: inline-block; margin: 2%; }
  </style>
</head>
<body>
  <h1>SmartProxy Performance Monitor</h1>
  <div id="metrics">
    <div class="metric">
      <h3>Event Loop Lag</h3>
      <div id="eventLoopLag">--</div>
    </div>
    <div class="metric">
      <h3>Active Connections</h3>
      <div id="activeConnections">--</div>
    </div>
    <div class="metric">
      <h3>Memory Usage</h3>
      <div id="memoryUsage">--</div>
    </div>
    <div class="metric">
      <h3>Connection Pool</h3>
      <div id="connectionPool">--</div>
    </div>
  </div>
  <div class="chart-container">
    <canvas id="eventLoopChart"></canvas>
  </div>
  <div class="chart-container">
    <canvas id="connectionChart"></canvas>
  </div>
  <script>
    const ws = new WebSocket('ws://localhost:9090');
    const eventLoopData = [];
    const connectionData = [];

    // Initialize charts
    const eventLoopChart = new Chart(document.getElementById('eventLoopChart'), {
      type: 'line',
      data: {
        labels: [],
        datasets: [{
          label: 'Event Loop Lag (ms)',
          data: eventLoopData,
          borderColor: 'rgb(255, 99, 132)',
          tension: 0.1
        }]
      },
      options: {
        responsive: true,
        scales: { y: { beginAtZero: true } }
      }
    });

    ws.onmessage = (event) => {
      const message = JSON.parse(event.data);
      updateMetrics(message.data);
    };

    function updateMetrics(metrics) {
      document.getElementById('eventLoopLag').textContent =
        metrics.eventLoopLag.current.toFixed(2) + ' ms';
      document.getElementById('activeConnections').textContent =
        metrics.activeConnections;
      document.getElementById('memoryUsage').textContent =
        (metrics.memory.heapUsed / 1024 / 1024).toFixed(2) + ' MB';

      // Update charts
      const now = new Date().toLocaleTimeString();
      eventLoopData.push(metrics.eventLoopLag.current);
      eventLoopChart.data.labels.push(now);
      if (eventLoopData.length > 60) {
        eventLoopData.shift();
        eventLoopChart.data.labels.shift();
      }
      eventLoopChart.update();
    }
  </script>
</body>
</html>
`;
  }
}
```
## Performance Benchmarking
### Benchmark Suite
```typescript
// Create test/performance/benchmark.ts
import { SmartProxy } from '../../ts/index.js';

export class PerformanceBenchmark {
  async runConnectionStressTest(): Promise<BenchmarkResult> {
    const proxy = new SmartProxy({ /* config */ });
    await proxy.start();

    const results = {
      connectionRate: 0,
      avgLatency: 0,
      maxConnections: 0,
      eventLoopLag: [] as number[]
    };

    // Monitor event loop during test
    const lagSamples: number[] = [];
    let lastCheck = process.hrtime.bigint();
    const monitor = setInterval(() => {
      const now = process.hrtime.bigint();
      const lag = Number(now - lastCheck - 100_000_000n) / 1_000_000;
      lagSamples.push(lag);
      lastCheck = now;
    }, 100);

    // Create connections with increasing rate
    const startTime = Date.now();
    let connectionCount = 0;

    for (let rate = 100; rate <= 10000; rate += 100) {
      const connections = await this.createConnections(rate);
      connectionCount += connections.length;

      // Check if performance degrades
      const avgLag = lagSamples.slice(-10).reduce((a, b) => a + b) / 10;
      if (avgLag > 50) {
        results.maxConnections = connectionCount;
        break;
      }

      await this.delay(1000);
    }

    clearInterval(monitor);
    await proxy.stop();

    results.connectionRate = connectionCount / ((Date.now() - startTime) / 1000);
    results.avgLatency = this.calculateAvgLatency();
    results.eventLoopLag = lagSamples;

    return results;
  }
}
```
## Documentation Updates
### API Documentation
1. **Update JSDoc comments** for all modified methods
2. **Add migration guide** for async transitions
3. **Performance tuning guide** with recommended settings
### Example Updates
```typescript
/**
 * Gets a certificate for the specified domain
 * @param domain - The domain to get certificate for
 * @returns Promise resolving to certificate info or null
 * @since v20.0.0 - Now returns Promise (breaking change)
 * @example
 * // Old way (deprecated)
 * const cert = certStore.getCertificateSync('example.com');
 *
 * // New way
 * const cert = await certStore.getCertificate('example.com');
 *
 * // Compatibility mode
 * certStore.getCertificateAuto('example.com', (err, cert) => {
 *   if (err) console.error(err);
 *   else console.log(cert);
 * });
 */
async getCertificate(domain: string): Promise<ICertificateInfo | null> {
  // Implementation
}
```
## Rollback Strategy
Each phase is designed to be independently deployable with feature flags:
```typescript
export const PerformanceFlags = {
  useAsyncFilesystem: process.env.SMARTPROXY_ASYNC_FS !== 'false',
  useEnhancedPool: process.env.SMARTPROXY_ENHANCED_POOL === 'true',
  useWorkerThreads: process.env.SMARTPROXY_WORKERS === 'true',
  enableMonitoring: process.env.SMARTPROXY_MONITORING === 'true'
};
```
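For example, a flag can decide which pool implementation gets constructed, keeping current behaviour as the default (`ConnectionPool` as the name of the existing implementation is an assumption):

```typescript
const pool = PerformanceFlags.useEnhancedPool
  ? new EnhancedConnectionPool(options)
  : new ConnectionPool(options); // existing implementation, name assumed
```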
### Gradual Rollout Plan
1. **Development**: All flags enabled
2. **Staging**: Monitor metrics for 1 week
3. **Production**:
- 10% traffic → 25% → 50% → 100%
- Monitor key metrics at each stage
- Rollback if metrics degrade
## Success Metrics
1. **Event Loop Lag**: < 10ms average, < 50ms max
2. **Connection Handling**: Support 10k+ concurrent connections
3. **Memory Usage**: Stable under sustained load
4. **CPU Usage**: Efficient utilization across cores
5. **Response Time**: < 5ms overhead for proxy operations
## Risk Mitigation
1. **Backward Compatibility**: Maintain existing APIs
2. **Gradual Rollout**: Use feature flags
3. **Monitoring**: Track metrics before/after changes
4. **Testing**: Comprehensive test coverage
5. **Documentation**: Update all API docs
## Summary of Key Optimizations
### Immediate Impact (Phase 1)
1. **Eliminate busy wait loop** - Unblocks event loop immediately
2. **Async filesystem operations** - Prevents I/O blocking
3. **Proper error handling** - Graceful degradation with fallbacks
### Performance Enhancements (Phase 2-3)
1. **Enhanced connection pooling** - O(log n) operations with BinaryHeap
2. **Resource lifecycle management** - Prevents memory leaks
3. **Worker threads** - Offloads CPU-intensive operations
4. **Optimized JSON operations** - Reduces parsing overhead
### Monitoring & Validation (Phase 4)
1. **Real-time dashboard** - Visual performance monitoring
2. **Event loop lag tracking** - Early warning system
3. **Automated benchmarking** - Regression prevention
## Implementation Checklist
### Phase 1: Critical Fixes (Priority: URGENT)
- [ ] Create `ts/core/utils/async-utils.ts` with delay function
- [ ] Fix busy wait loop in `nftables-proxy.ts`
- [ ] Create `ts/core/utils/fs-utils.ts` with AsyncFileSystem class
- [ ] Migrate `certificate-manager.ts` to async operations
- [ ] Migrate `cert-store.ts` to async operations
- [ ] Replace `execSync` with `execAsync` in `nftables-proxy.ts`
- [ ] Add comprehensive unit tests for async operations
- [ ] Performance test to verify event loop improvements
### Phase 2: Resource Management
- [ ] Implement LifecycleComponent base class
- [ ] Migrate components to extend LifecycleComponent
- [ ] Implement BinaryHeap data structure
- [ ] Create EnhancedConnectionPool with queue support
- [ ] Add connection validation and health checks
- [ ] Implement proper timer cleanup in all components
- [ ] Add integration tests for resource management
### Phase 3: Performance Optimizations
- [ ] Implement HashUtils for efficient object hashing
- [ ] Create WorkerPool for CPU-intensive operations
- [ ] Migrate NFTables operations to worker threads
- [ ] Optimize JSON operations with caching
- [ ] Add performance benchmarks
### Phase 4: Monitoring & Polish
- [ ] Implement PerformanceMonitor class
- [ ] Create monitoring dashboard with WebSocket updates
- [ ] Add comprehensive metrics collection
- [ ] Document all API changes
- [ ] Create migration guide
- [ ] Update examples and tutorials
## Next Steps
1. **Immediate Action**: Fix the busy wait loop (blocks entire event loop)
2. **Code Review**: Review this plan with the team
3. **Feature Branch**: Create `feature/performance-optimization`
4. **Phase 1 Implementation**: Complete within 1 week
5. **Staging Deployment**: Test with real traffic patterns
6. **Gradual Rollout**: Use feature flags for production
7. **Monitor & Iterate**: Track metrics and adjust as needed
## Expected Outcomes
After implementing all phases:
- **10x improvement** in concurrent connection handling
- **90% reduction** in event loop blocking
- **50% reduction** in memory usage under load
- **Zero memory leaks** with proper resource cleanup
- **Real-time visibility** into performance metrics
- **Graceful degradation** under extreme load
## Version Plan
- **v19.6.0**: Phase 1 (Critical fixes) - Backward compatible
- **v19.7.0**: Phase 2 (Resource management) - Backward compatible
- **v19.8.0**: Phase 3 (Optimizations) - Backward compatible
- **v20.0.0**: Phase 4 (Full async) - Breaking changes with migration path