8 Commits

| SHA1 | Message | CI (Default, tags) | Date |
| --- | --- | --- | --- |
| dbc8566aad | v2.5.0 | security passed (43s), test failed (44s), release/metadata skipped | 2025-11-27 21:11:04 +00:00 |
| bd64a7b140 | feat(pypi,rubygems): Add PyPI and RubyGems protocol implementations, upstream caching, and auth/storage improvements | — | 2025-11-27 21:11:04 +00:00 |
| ae8dec9142 | v2.4.0 | security passed (45s), test failed (45s), release/metadata skipped | 2025-11-27 20:59:49 +00:00 |
| 19da87a9df | feat(core): Add pluggable auth providers, storage hooks, multi-upstream cache awareness, and PyPI/RubyGems protocol implementations | — | 2025-11-27 20:59:49 +00:00 |
| 99b01733e7 | v2.3.0 | security passed (39s), test failed (45s), release/metadata skipped | 2025-11-27 14:20:01 +00:00 |
| 0610077eec | feat(upstream): Add upstream proxy/cache subsystem and integrate per-protocol upstreams | — | 2025-11-27 14:20:01 +00:00 |
| cfadc89b5a | v2.2.3 | security passed (45s), test failed (38s), release/metadata skipped | 2025-11-27 12:41:38 +00:00 |
| eb91a3f75b | fix(tests): Use unique test run IDs and add S3 cleanup in test helpers to avoid cross-run conflicts | 2025-11-27 12:41:38 +00:00 |
44 changed files with 5031 additions and 518 deletions


@@ -1,5 +1,48 @@
# Changelog
## 2025-11-27 - 2.5.0 - feat(pypi,rubygems)
Add PyPI and RubyGems protocol implementations, upstream caching, and auth/storage improvements
- Implemented full PyPI support (PEP 503 Simple API HTML, PEP 691 JSON API, legacy upload handling, name normalization, hash verification, content negotiation, package/file storage and metadata management).
- Implemented RubyGems support (compact index, /versions, /info, /names endpoints, gem upload, yank/unyank, platform handling and file storage).
- Expanded RegistryStorage with protocol-specific helpers for OCI, NPM, Maven, Cargo, Composer, PyPI, and RubyGems (get/put/delete/list helpers, metadata handling, context-aware hooks).
- Added AuthManager and DefaultAuthProvider improvements: unified token creation/validation for multiple protocols (npm, oci, maven, composer, cargo, pypi, rubygems) and OCI JWT support.
- Added upstream infrastructure: BaseUpstream, UpstreamCache (S3-backed optional, stale-while-revalidate, negative caching), circuit breaker with retries/backoff and resilience defaults.
- Added various protocol registries (NPM, Maven, Cargo, OCI, PyPI) with request routing, permission checks, and optional upstream proxying/caching.
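
One spec-level detail worth calling out from the PyPI work above: PEP 503 requires project names to be normalized before lookup. A minimal sketch of that rule (the function name is hypothetical; the release's actual helper may differ):

```typescript
// PEP 503 name normalization: collapse runs of '-', '_' and '.' into a
// single hyphen, then lowercase. 'Foo__Bar.baz' and 'foo-bar-baz' refer to
// the same project as far as the Simple API is concerned.
function normalizePypiName(name: string): string {
  return name.replace(/[-_.]+/g, '-').toLowerCase();
}
```

This matters for routing `/simple/<name>/` requests: storage lookups should use the normalized form regardless of how the client spells the name.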
## 2025-11-27 - 2.4.0 - feat(core)
Add pluggable auth providers, storage hooks, multi-upstream cache awareness, and PyPI/RubyGems protocol implementations
- Introduce pluggable authentication: IAuthProvider interface and DefaultAuthProvider (in-memory) with OCI JWT support and UUID tokens.
- AuthManager now accepts a custom provider and delegates all auth operations (authenticate, validateToken, create/revoke tokens, authorize, listUserTokens).
- Add storage hooks (IStorageHooks) and hook contexts: beforePut/afterPut/afterGet/beforeDelete/afterDelete. RegistryStorage now supports hooks, context management (setContext/withContext) and invokes hooks around operations.
- RegistryStorage expanded with many protocol-specific helper methods (OCI, NPM, Maven, Cargo, Composer, PyPI, RubyGems) and improved S3/SmartBucket integration.
- Upstream improvements: BaseUpstream and UpstreamCache became multi-upstream aware (cache keys now include upstream URL), cache operations are async and support negative caching, stale-while-revalidate, ETag/metadata persistence, and S3-backed storage layer.
- Circuit breaker, retry, resilience and scope-rule routing enhancements for upstreams; upstream fetch logic updated to prefer primary upstream for cache keys and background revalidation behavior.
- SmartRegistry API extended to accept custom authProvider and storageHooks, and now wires RegistryStorage and AuthManager with those options. Core exports updated to expose auth and storage interfaces and DefaultAuthProvider.
- Add full PyPI (PEP 503/691, upload API) and RubyGems (Compact Index, API v1, uploads/yank/unyank, specs endpoints) registry implementations with parsing, upload/download, metadata management and upstream proxying.
- Add utility helpers: binary buffer helpers (toBuffer/isBinaryData), pypi and rubygems helper modules, and numerous protocol-specific helpers and tests referenced in readme.hints.
- These changes are additive and designed to be backward compatible; bumping minor version.
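
The multi-upstream cache-key change above can be pictured with a small sketch (the function name and key format are assumptions for illustration, not the library's actual layout):

```typescript
import { createHash } from 'crypto';

// Including the upstream URL in the key keeps identically named resources
// fetched from different upstreams from colliding in the shared S3 cache.
function upstreamCacheKey(upstreamUrl: string, resourcePath: string): string {
  const digest = createHash('sha256')
    .update(`${upstreamUrl}|${resourcePath}`)
    .digest('hex');
  return `upstream-cache/${digest}`;
}
```

Under such a scheme, cache entries written while proxying npmjs.org can never be served for a same-named package owned by a private upstream.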
## 2025-11-27 - 2.3.0 - feat(upstream)
Add upstream proxy/cache subsystem and integrate per-protocol upstreams
- Introduce a complete upstream subsystem (BaseUpstream, UpstreamCache, CircuitBreaker) with caching, negative-cache, stale-while-revalidate, retries, exponential backoff and per-upstream circuit breakers.
- Add upstream interfaces and defaults (ts/upstream/interfaces.upstream.ts) and export upstream utilities from ts/upstream/index.ts and root ts/index.ts.
- Implement protocol-specific upstream clients for npm, pypi, maven, composer, cargo and rubygems (classes.*upstream.ts) to fetch metadata and artifacts from configured upstream registries.
- Integrate upstream usage into registries: registries now accept an upstream config, attempt to fetch missing metadata/artifacts from upstreams, cache results locally, and expose destroy() to stop upstream resources.
- Add SmartRequest and minimatch to dependencies and expose smartrequest/minimatch via ts/plugins.ts for HTTP requests and glob-based scope matching.
- Update package.json to add @push.rocks/smartrequest and minimatch dependencies.
- Various registry implementations updated to utilize upstreams (npm, pypi, maven, composer, cargo, rubygems, oci) including URL rewrites and caching behavior.
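
The retry behavior mentioned above follows the usual exponential-backoff-with-jitter shape; a sketch with illustrative constants (the subsystem's real resilience defaults may differ):

```typescript
// Delay before retry attempt N: exponential growth capped at maxMs, with
// full jitter so many clients retrying a recovering upstream spread out
// instead of hammering it in lockstep.
function backoffDelayMs(attempt: number, baseMs = 250, maxMs = 10_000): number {
  const exponential = Math.min(baseMs * 2 ** attempt, maxMs);
  return Math.floor(Math.random() * exponential);
}
```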
## 2025-11-27 - 2.2.3 - fix(tests)
Use unique test run IDs and add S3 cleanup in test helpers to avoid cross-run conflicts
- Add generateTestRunId() helper in test/helpers/registry.ts to produce unique IDs for each test run
- Update PyPI and Composer native CLI tests to use generated testPackageName / unauth-pkg-<id> to avoid package name collisions between runs
- Import smartbucket and add S3 bucket cleanup logic in test helpers to remove leftover objects between test runs
- Improve test robustness by skipping upload-dependent checks when tools (twine/composer) are not available and logging outputs for debugging
## 2025-11-25 - 2.2.2 - fix(npm)
Replace console logging with structured Smartlog in NPM registry and silence RubyGems helper error logging

View File

@@ -1,6 +1,6 @@
 {
   "name": "@push.rocks/smartregistry",
-  "version": "2.2.2",
+  "version": "2.5.0",
   "private": false,
   "description": "A composable TypeScript library implementing OCI, NPM, Maven, Cargo, Composer, PyPI, and RubyGems registries for building unified container and package registries",
   "main": "dist_ts/index.js",
@@ -50,8 +50,10 @@
     "@push.rocks/smartbucket": "^4.3.0",
     "@push.rocks/smartlog": "^3.1.10",
     "@push.rocks/smartpath": "^6.0.0",
+    "@push.rocks/smartrequest": "^5.0.1",
     "@tsclass/tsclass": "^9.3.0",
-    "adm-zip": "^0.5.10"
+    "adm-zip": "^0.5.10",
+    "minimatch": "^10.1.1"
   },
   "packageManager": "pnpm@10.18.1+sha512.77a884a165cbba2d8d1c19e3b4880eee6d2fcabd0d879121e282196b80042351d5eb3ca0935fa599da1dc51265cc68816ad2bddd2a2de5ea9fdf92adbec7cd34"
 }

pnpm-lock.yaml (generated)

@@ -20,12 +20,18 @@ importers:
       '@push.rocks/smartpath':
         specifier: ^6.0.0
         version: 6.0.0
+      '@push.rocks/smartrequest':
+        specifier: ^5.0.1
+        version: 5.0.1
       '@tsclass/tsclass':
         specifier: ^9.3.0
         version: 9.3.0
       adm-zip:
         specifier: ^0.5.10
         version: 0.5.16
+      minimatch:
+        specifier: ^10.1.1
+        version: 10.1.1
     devDependencies:
       '@git.zone/tsbuild':
         specifier: ^3.1.0

readme.md

@@ -4,7 +4,7 @@
## Issue Reporting and Security

For reporting bugs, issues, or security vulnerabilities, please visit [community.foss.global/](https://community.foss.global/). This is the central community hub for all issue reporting. Developers who sign and comply with our contribution agreement and go through identification can also get a [code.foss.global/](https://code.foss.global/) account to submit Pull Requests directly.

## ✨ Features
@@ -82,6 +82,19 @@ For reporting bugs, issues, or security vulnerabilities, please visit [community
- ✅ Dependency resolution
- ✅ Legacy API compatibility
### 🌐 Upstream Proxy & Caching
- **Multi-Upstream Support**: Configure multiple upstream registries per protocol with priority ordering
- **Scope-Based Routing**: Route specific packages/scopes to different upstreams (e.g., `@company/*` → private registry)
- **S3-Backed Cache**: Persistent caching using existing S3 storage with URL-based cache paths
- **Circuit Breaker**: Automatic failover with configurable thresholds
- **Stale-While-Revalidate**: Serve cached content while refreshing in background
- **Content-Aware TTLs**: Different TTLs for immutable (tarballs) vs mutable (metadata) content
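
The content-aware TTL idea above reduces to a simple branch on whether the requested path names an immutable artifact; a sketch with illustrative values (not the library's actual defaults):

```typescript
// Immutable artifacts (tarballs, wheels, gems, crates) can be cached for a
// long time; mutable metadata (indexes, package documents) must expire fast.
function cacheTtlSeconds(path: string): number {
  const isImmutableArtifact = /\.(tgz|tar\.gz|whl|gem|crate|zip)$/.test(path);
  return isImmutableArtifact ? 7 * 24 * 3600 : 300; // 7 days vs 5 minutes
}
```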
### 🔌 Enterprise Extensibility
- **Pluggable Auth Provider** (`IAuthProvider`): Integrate LDAP, OAuth, SSO, or custom auth systems
- **Storage Event Hooks** (`IStorageHooks`): Quota tracking, audit logging, virus scanning, cache invalidation
- **Request Actor Context**: Pass user/org info through requests for audit trails and rate limiting
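
The scope-based routing listed under the upstream features boils down to glob matching against package names. A self-contained sketch of the include/exclude decision (the library uses minimatch; a hand-rolled `*`-only matcher stands in here so the snippet has no dependencies, and first-match-wins semantics is an assumption):

```typescript
interface IScopeRule {
  pattern: string; // glob, e.g. '@company/*'
  action: 'include' | 'exclude';
}

// Convert a simple glob ('*' wildcard only) to an anchored RegExp.
function globToRegExp(glob: string): RegExp {
  const escaped = glob
    .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
    .replace(/\*/g, '.*');
  return new RegExp(`^${escaped}$`);
}

// First matching rule wins; with no matching rule the upstream is usable.
function upstreamAllows(rules: IScopeRule[], packageName: string): boolean {
  for (const rule of rules) {
    if (globToRegExp(rule.pattern).test(packageName)) {
      return rule.action === 'include';
    }
  }
  return true;
}
```

For example, an upstream configured with `{ pattern: '@company/*', action: 'exclude' }` would be skipped for `@company/secret-lib` but consulted for `lodash`.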
## 📥 Installation

```bash
@@ -648,6 +661,217 @@ const canWrite = await authManager.authorize(
);
```
### 🌐 Upstream Proxy Configuration
```typescript
import { SmartRegistry, IRegistryConfig } from '@push.rocks/smartregistry';
const config: IRegistryConfig = {
storage: { /* S3 config */ },
auth: { /* Auth config */ },
npm: {
enabled: true,
basePath: '/npm',
upstream: {
enabled: true,
upstreams: [
{
id: 'company-private',
name: 'Company Private NPM',
url: 'https://npm.internal.company.com',
priority: 1, // Lower = higher priority
enabled: true,
scopeRules: [
{ pattern: '@company/*', action: 'include' },
{ pattern: '@internal/*', action: 'include' },
],
auth: { type: 'bearer', token: process.env.NPM_PRIVATE_TOKEN },
},
{
id: 'npmjs',
name: 'NPM Public Registry',
url: 'https://registry.npmjs.org',
priority: 10,
enabled: true,
scopeRules: [
{ pattern: '@company/*', action: 'exclude' },
{ pattern: '@internal/*', action: 'exclude' },
],
auth: { type: 'none' },
cache: { defaultTtlSeconds: 300 },
resilience: { timeoutMs: 30000, maxRetries: 3 },
},
],
cache: { enabled: true, staleWhileRevalidate: true },
},
},
oci: {
enabled: true,
basePath: '/oci',
upstream: {
enabled: true,
upstreams: [
{
id: 'dockerhub',
name: 'Docker Hub',
url: 'https://registry-1.docker.io',
priority: 1,
enabled: true,
auth: { type: 'none' },
},
{
id: 'ghcr',
name: 'GitHub Container Registry',
url: 'https://ghcr.io',
priority: 2,
enabled: true,
scopeRules: [{ pattern: 'ghcr.io/*', action: 'include' }],
auth: { type: 'bearer', token: process.env.GHCR_TOKEN },
},
],
},
},
};
const registry = new SmartRegistry(config);
await registry.init();
// Requests for @company/* packages go to private registry
// Other packages proxy through to npmjs.org with caching
```
### 🔌 Custom Auth Provider
```typescript
import { SmartRegistry, IAuthProvider, IAuthToken, ICredentials, ITokenOptions, TRegistryProtocol } from '@push.rocks/smartregistry';
// Implement custom auth (e.g., LDAP, OAuth)
class LdapAuthProvider implements IAuthProvider {
  // Simple in-memory session store backing the tokens issued below
  private sessionStore = new Map<string, any>();

  constructor(private ldapClient: LdapClient) {}
async authenticate(credentials: ICredentials): Promise<string | null> {
const result = await this.ldapClient.bind(credentials.username, credentials.password);
return result.success ? credentials.username : null;
}
async validateToken(token: string, protocol?: TRegistryProtocol): Promise<IAuthToken | null> {
const session = await this.sessionStore.get(token);
if (!session) return null;
return {
userId: session.userId,
scopes: session.scopes,
readonly: session.readonly,
created: session.created,
};
}
async createToken(userId: string, protocol: TRegistryProtocol, options?: ITokenOptions): Promise<string> {
const token = crypto.randomUUID();
await this.sessionStore.set(token, { userId, protocol, ...options });
return token;
}
async revokeToken(token: string): Promise<void> {
await this.sessionStore.delete(token);
}
async authorize(token: IAuthToken | null, resource: string, action: string): Promise<boolean> {
if (!token) return action === 'read'; // Anonymous read-only
// Check LDAP groups, roles, etc.
return this.checkPermissions(token.userId, resource, action);
}
}
// Use custom provider
const registry = new SmartRegistry({
...config,
authProvider: new LdapAuthProvider(ldapClient),
});
```
### 📊 Storage Hooks (Quota & Audit)
```typescript
import { SmartRegistry, IStorageHooks, IStorageHookContext } from '@push.rocks/smartregistry';
const storageHooks: IStorageHooks = {
// Block uploads that exceed quota
async beforePut(ctx: IStorageHookContext) {
if (ctx.actor?.orgId) {
const usage = await getStorageUsage(ctx.actor.orgId);
const quota = await getQuota(ctx.actor.orgId);
if (usage + (ctx.metadata?.size || 0) > quota) {
return { allowed: false, reason: 'Storage quota exceeded' };
}
}
return { allowed: true };
},
// Update usage tracking after successful upload
async afterPut(ctx: IStorageHookContext) {
if (ctx.actor?.orgId && ctx.metadata?.size) {
await incrementUsage(ctx.actor.orgId, ctx.metadata.size);
}
// Audit log
await auditLog.write({
action: 'storage.put',
key: ctx.key,
protocol: ctx.protocol,
actor: ctx.actor,
timestamp: ctx.timestamp,
});
},
// Prevent deletion of protected packages
async beforeDelete(ctx: IStorageHookContext) {
if (await isProtectedPackage(ctx.key)) {
return { allowed: false, reason: 'Cannot delete protected package' };
}
return { allowed: true };
},
// Log all access for compliance
async afterGet(ctx: IStorageHookContext) {
await accessLog.write({
action: 'storage.get',
key: ctx.key,
actor: ctx.actor,
timestamp: ctx.timestamp,
});
},
};
const registry = new SmartRegistry({
...config,
storageHooks,
});
```
### 👤 Request Actor Context
```typescript
// Pass actor information through requests for audit/quota tracking
const response = await registry.handleRequest({
method: 'PUT',
path: '/npm/my-package',
headers: { 'Authorization': 'Bearer <token>' },
query: {},
body: packageData,
actor: {
userId: 'user123',
tokenId: 'token-abc',
ip: req.ip,
userAgent: req.headers['user-agent'],
orgId: 'org-456',
sessionId: 'session-xyz',
},
});
// Actor info is available in storage hooks for quota/audit
```
## ⚙️ Configuration

### Storage Configuration


@@ -1,11 +1,63 @@
import * as qenv from '@push.rocks/qenv';
import * as crypto from 'crypto';
import * as smartarchive from '@push.rocks/smartarchive';
import * as smartbucket from '@push.rocks/smartbucket';
import { SmartRegistry } from '../../ts/classes.smartregistry.js';
import type { IRegistryConfig } from '../../ts/core/interfaces.core.js';

const testQenv = new qenv.Qenv('./', './.nogit');
/**
 * Generate a unique test run ID for avoiding conflicts between test runs
 * Uses timestamp + random suffix for uniqueness
 */
export function generateTestRunId(): string {
  const timestamp = Date.now().toString(36);
  const random = Math.random().toString(36).substring(2, 6);
  return `${timestamp}${random}`;
}

/**
 * Clean up S3 bucket contents for a fresh test run
 * @param prefix Optional prefix to delete (e.g., 'cargo/', 'npm/', 'composer/')
 */
export async function cleanupS3Bucket(prefix?: string): Promise<void> {
  const s3AccessKey = await testQenv.getEnvVarOnDemand('S3_ACCESSKEY');
  const s3SecretKey = await testQenv.getEnvVarOnDemand('S3_SECRETKEY');
  const s3Endpoint = await testQenv.getEnvVarOnDemand('S3_ENDPOINT');
  const s3Port = await testQenv.getEnvVarOnDemand('S3_PORT');
  const s3 = new smartbucket.SmartBucket({
    accessKey: s3AccessKey || 'minioadmin',
    accessSecret: s3SecretKey || 'minioadmin',
    endpoint: s3Endpoint || 'localhost',
    port: parseInt(s3Port || '9000', 10),
    useSsl: false,
  });
  try {
    const bucket = await s3.getBucket('test-registry');
    if (bucket) {
      // Delete only objects under the given prefix, or everything when no prefix is given
      const files = await bucket.fastList(prefix ? { prefix } : {});
      for (const file of files) {
        await bucket.fastRemove({ path: file.name });
      }
    }
  } catch (error) {
    // Bucket might not exist yet, that's fine
    console.log('Cleanup: No bucket to clean or error:', error);
  }
}
/**
 * Create a test SmartRegistry instance with all protocols enabled
 */


@@ -6,7 +6,7 @@
 import { expect, tap } from '@git.zone/tstest/tapbundle';
 import { tapNodeTools } from '@git.zone/tstest/tapbundle_serverside';
 import { SmartRegistry } from '../ts/index.js';
-import { createTestRegistry, createTestTokens, createComposerZip } from './helpers/registry.js';
+import { createTestRegistry, createTestTokens, createComposerZip, generateTestRunId } from './helpers/registry.js';
 import type { IRequestContext, IResponse } from '../ts/core/interfaces.core.js';
 import * as http from 'http';
 import * as url from 'url';
@@ -21,6 +21,11 @@ let registryPort: number;
 let composerToken: string;
 let testDir: string;
 let composerHome: string;
+let hasComposer = false;
+
+// Unique test run ID to avoid conflicts between test runs
+const testRunId = generateTestRunId();
+const testPackageName = `testvendor/test-pkg-${testRunId}`;
 
 /**
  * Create HTTP server wrapper around SmartRegistry
@@ -235,12 +240,11 @@ tap.test('Composer CLI: should verify composer is installed', async () => {
   try {
     const result = await tapNodeTools.runCommand('composer --version');
     console.log('Composer version output:', result.stdout.substring(0, 200));
+    hasComposer = result.exitCode === 0;
     expect(result.exitCode).toEqual(0);
   } catch (error) {
     console.log('Composer CLI not available, skipping native CLI tests');
-    // Skip remaining tests if Composer is not installed
-    tap.skip.test('Composer CLI: remaining tests skipped - composer not available');
-    return;
+    hasComposer = false;
   }
 });
@@ -284,27 +288,32 @@ tap.test('Composer CLI: should verify server is responding', async () => {
 });
 
 tap.test('Composer CLI: should upload a package via API', async () => {
-  const vendorPackage = 'testvendor/test-package';
   const version = '1.0.0';
 
-  await uploadComposerPackage(vendorPackage, version, composerToken, registryUrl);
+  await uploadComposerPackage(testPackageName, version, composerToken, registryUrl);
 
-  // Verify package exists via packages.json
-  const response = await fetch(`${registryUrl}/composer/packages.json`);
-  expect(response.status).toEqual(200);
-  const packagesJson = await response.json();
-  expect(packagesJson.packages).toBeDefined();
-  expect(packagesJson.packages[vendorPackage]).toBeDefined();
+  // Verify package exists via p2 metadata endpoint (more reliable than packages.json for new packages)
+  const metadataResponse = await fetch(`${registryUrl}/composer/p2/${testPackageName}.json`);
+  expect(metadataResponse.status).toEqual(200);
+  const metadata = await metadataResponse.json();
+  expect(metadata.packages).toBeDefined();
+  expect(metadata.packages[testPackageName]).toBeDefined();
+  expect(metadata.packages[testPackageName].length).toBeGreaterThan(0);
 });
 
 tap.test('Composer CLI: should require package from registry', async () => {
+  if (!hasComposer) {
+    console.log('Skipping - composer not available');
+    return;
+  }
+
   const projectDir = path.join(testDir, 'consumer-project');
   createComposerProject(projectDir, registryUrl);
 
   // Try to require the package we uploaded
   const result = await runComposerCommand(
-    'require testvendor/test-package:1.0.0 --no-interaction',
+    `require ${testPackageName}:1.0.0 --no-interaction`,
     projectDir
   );
 
   console.log('composer require output:', result.stdout);
@@ -314,8 +323,15 @@ tap.test('Composer CLI: should require package from registry', async () => {
 });
 
 tap.test('Composer CLI: should verify package in vendor directory', async () => {
+  if (!hasComposer) {
+    console.log('Skipping - composer not available');
+    return;
+  }
+
   const projectDir = path.join(testDir, 'consumer-project');
-  const packageDir = path.join(projectDir, 'vendor', 'testvendor', 'test-package');
+  // Parse vendor/package from testPackageName (e.g., "testvendor/test-pkg-abc123")
+  const [vendor, pkg] = testPackageName.split('/');
+  const packageDir = path.join(projectDir, 'vendor', vendor, pkg);
 
   expect(fs.existsSync(packageDir)).toEqual(true);
@@ -325,25 +341,36 @@ tap.test('Composer CLI: should verify package in vendor directory', async () =>
 });
 
 tap.test('Composer CLI: should upload second version', async () => {
-  const vendorPackage = 'testvendor/test-package';
   const version = '2.0.0';
 
-  await uploadComposerPackage(vendorPackage, version, composerToken, registryUrl);
+  await uploadComposerPackage(testPackageName, version, composerToken, registryUrl);
 
-  // Verify both versions exist
-  const response = await fetch(`${registryUrl}/composer/packages.json`);
-  const packagesJson = await response.json();
-  expect(packagesJson.packages[vendorPackage]['1.0.0']).toBeDefined();
-  expect(packagesJson.packages[vendorPackage]['2.0.0']).toBeDefined();
+  // Verify both versions exist via p2 metadata endpoint (Composer v2 format)
+  const response = await fetch(`${registryUrl}/composer/p2/${testPackageName}.json`);
+  expect(response.status).toEqual(200);
+  const metadata = await response.json();
+  expect(metadata.packages).toBeDefined();
+  expect(metadata.packages[testPackageName]).toBeDefined();
+
+  // Check that both versions are present
+  const versions = metadata.packages[testPackageName];
+  expect(versions.length).toBeGreaterThanOrEqual(2);
+  const versionNumbers = versions.map((v: any) => v.version);
+  expect(versionNumbers).toContain('1.0.0');
+  expect(versionNumbers).toContain('2.0.0');
 });
 
 tap.test('Composer CLI: should update to new version', async () => {
+  if (!hasComposer) {
+    console.log('Skipping - composer not available');
+    return;
+  }
+
   const projectDir = path.join(testDir, 'consumer-project');
 
   // Update to version 2.0.0
   const result = await runComposerCommand(
-    'require testvendor/test-package:2.0.0 --no-interaction',
+    `require ${testPackageName}:2.0.0 --no-interaction`,
     projectDir
   );
 
   console.log('composer update output:', result.stdout);
@@ -355,11 +382,16 @@ tap.test('Composer CLI: should update to new version', async () => {
   expect(fs.existsSync(lockPath)).toEqual(true);
   const lockContent = JSON.parse(fs.readFileSync(lockPath, 'utf-8'));
-  const pkg = lockContent.packages.find((p: any) => p.name === 'testvendor/test-package');
+  const pkg = lockContent.packages.find((p: any) => p.name === testPackageName);
   expect(pkg?.version).toEqual('2.0.0');
 });
 
 tap.test('Composer CLI: should search for packages', async () => {
+  if (!hasComposer) {
+    console.log('Skipping - composer not available');
+    return;
+  }
+
   const projectDir = path.join(testDir, 'consumer-project');
 
   // Search for packages (may not work on all Composer versions)
@@ -375,23 +407,33 @@ tap.test('Composer CLI: should search for packages', async () => {
 });
 
 tap.test('Composer CLI: should show package info', async () => {
+  if (!hasComposer) {
+    console.log('Skipping - composer not available');
+    return;
+  }
+
   const projectDir = path.join(testDir, 'consumer-project');
 
   const result = await runComposerCommand(
-    'show testvendor/test-package --no-interaction',
+    `show ${testPackageName} --no-interaction`,
     projectDir
   );
 
   console.log('composer show output:', result.stdout);
   expect(result.exitCode).toEqual(0);
-  expect(result.stdout).toContain('testvendor/test-package');
+  expect(result.stdout).toContain(testPackageName);
 });
 
 tap.test('Composer CLI: should remove package', async () => {
+  if (!hasComposer) {
+    console.log('Skipping - composer not available');
+    return;
+  }
+
   const projectDir = path.join(testDir, 'consumer-project');
 
   const result = await runComposerCommand(
-    'remove testvendor/test-package --no-interaction',
+    `remove ${testPackageName} --no-interaction`,
     projectDir
   );
 
   console.log('composer remove output:', result.stdout);
@@ -399,7 +441,8 @@ tap.test('Composer CLI: should remove package', async () => {
   expect(result.exitCode).toEqual(0);
 
   // Verify package is removed from vendor
-  const packageDir = path.join(projectDir, 'vendor', 'testvendor', 'test-package');
+  const [vendor, pkg] = testPackageName.split('/');
+  const packageDir = path.join(projectDir, 'vendor', vendor, pkg);
   expect(fs.existsSync(packageDir)).toEqual(false);
 });


@@ -6,7 +6,7 @@
 import { expect, tap } from '@git.zone/tstest/tapbundle';
 import { tapNodeTools } from '@git.zone/tstest/tapbundle_serverside';
 import { SmartRegistry } from '../ts/index.js';
-import { createTestRegistry, createTestTokens, createPythonWheel, createPythonSdist } from './helpers/registry.js';
+import { createTestRegistry, createTestTokens, createPythonWheel, createPythonSdist, generateTestRunId } from './helpers/registry.js';
 import type { IRequestContext, IResponse } from '../ts/core/interfaces.core.js';
 import * as http from 'http';
 import * as url from 'url';
@@ -24,6 +24,10 @@ let pipHome: string;
 let hasPip = false;
 let hasTwine = false;
+
+// Unique test run ID to avoid conflicts between test runs
+const testRunId = generateTestRunId();
+const testPackageName = `test-pypi-pkg-${testRunId}`;
 
 /**
  * Create HTTP server wrapper around SmartRegistry
 */
@@ -347,9 +351,8 @@ tap.test('PyPI CLI: should upload wheel with twine', async () => {
     return;
   }
 
-  const packageName = 'test-pypi-pkg';
   const version = '1.0.0';
-  const wheelPath = await createTestWheelFile(packageName, version, testDir);
+  const wheelPath = await createTestWheelFile(testPackageName, version, testDir);
 
   expect(fs.existsSync(wheelPath)).toEqual(true);
@@ -369,9 +372,7 @@ tap.test('PyPI CLI: should verify package in simple index', async () => {
     return;
   }
 
-  const packageName = 'test-pypi-pkg';
-  const response = await fetch(`${registryUrl}/pypi/simple/${packageName}/`);
+  const response = await fetch(`${registryUrl}/pypi/simple/${testPackageName}/`);
   expect(response.status).toEqual(200);
 
   const html = await response.text();
@@ -384,9 +385,8 @@ tap.test('PyPI CLI: should upload sdist with twine', async () => {
     return;
   }
 
-  const packageName = 'test-pypi-pkg';
   const version = '1.1.0';
-  const sdistPath = await createTestSdistFile(packageName, version, testDir);
+  const sdistPath = await createTestSdistFile(testPackageName, version, testDir);
 
   expect(fs.existsSync(sdistPath)).toEqual(true);
@@ -406,9 +406,7 @@ tap.test('PyPI CLI: should list all versions in simple index', async () => {
return; return;
} }
const packageName = 'test-pypi-pkg'; const response = await fetch(`${registryUrl}/pypi/simple/${testPackageName}/`);
const response = await fetch(`${registryUrl}/pypi/simple/${packageName}/`);
expect(response.status).toEqual(200); expect(response.status).toEqual(200);
const html = await response.text(); const html = await response.text();
@@ -422,14 +420,12 @@ tap.test('PyPI CLI: should get JSON metadata', async () => {
return; return;
} }
const packageName = 'test-pypi-pkg'; const response = await fetch(`${registryUrl}/pypi/pypi/${testPackageName}/json`);
const response = await fetch(`${registryUrl}/pypi/pypi/${packageName}/json`);
expect(response.status).toEqual(200); expect(response.status).toEqual(200);
const metadata = await response.json(); const metadata = await response.json();
expect(metadata.info).toBeDefined(); expect(metadata.info).toBeDefined();
expect(metadata.info.name).toEqual(packageName); expect(metadata.info.name).toEqual(testPackageName);
expect(metadata.releases).toBeDefined(); expect(metadata.releases).toBeDefined();
expect(metadata.releases['1.0.0']).toBeDefined(); expect(metadata.releases['1.0.0']).toBeDefined();
}); });
@@ -445,7 +441,7 @@ tap.test('PyPI CLI: should download package with pip', async () => {
// Download (not install) the package // Download (not install) the package
const result = await runPipCommand( const result = await runPipCommand(
`download test-pypi-pkg==1.0.0 --dest "${downloadDir}" --no-deps`, `download ${testPackageName}==1.0.0 --dest "${downloadDir}" --no-deps`,
testDir testDir
); );
console.log('pip download output:', result.stdout); console.log('pip download output:', result.stdout);
@@ -457,14 +453,17 @@ tap.test('PyPI CLI: should download package with pip', async () => {
}); });
tap.test('PyPI CLI: should search for packages via API', async () => { tap.test('PyPI CLI: should search for packages via API', async () => {
const packageName = 'test-pypi-pkg'; if (!hasTwine) {
console.log('Skipping - twine not available (no packages uploaded)');
return;
}
// Use the JSON API to search/list // Use the JSON API to search/list
const response = await fetch(`${registryUrl}/pypi/pypi/${packageName}/json`); const response = await fetch(`${registryUrl}/pypi/pypi/${testPackageName}/json`);
expect(response.status).toEqual(200); expect(response.status).toEqual(200);
const metadata = await response.json(); const metadata = await response.json();
expect(metadata.info.name).toEqual(packageName); expect(metadata.info.name).toEqual(testPackageName);
}); });
tap.test('PyPI CLI: should fail upload without auth', async () => { tap.test('PyPI CLI: should fail upload without auth', async () => {
@@ -473,9 +472,9 @@ tap.test('PyPI CLI: should fail upload without auth', async () => {
return; return;
} }
const packageName = 'unauth-pkg'; const unauthPkgName = `unauth-pkg-${testRunId}`;
const version = '1.0.0'; const version = '1.0.0';
const wheelPath = await createTestWheelFile(packageName, version, testDir); const wheelPath = await createTestWheelFile(unauthPkgName, version, testDir);
// Create a pypirc without proper credentials // Create a pypirc without proper credentials
const badPypircPath = path.join(testDir, '.bad-pypirc'); const badPypircPath = path.join(testDir, '.bad-pypirc');


@@ -3,6 +3,6 @@
  */
 export const commitinfo = {
   name: '@push.rocks/smartregistry',
-  version: '2.2.2',
+  version: '2.5.0',
   description: 'A composable TypeScript library implementing OCI, NPM, Maven, Cargo, Composer, PyPI, and RubyGems registries for building unified container and package registries'
 }


@@ -3,6 +3,7 @@ import { BaseRegistry } from '../core/classes.baseregistry.js';
 import { RegistryStorage } from '../core/classes.registrystorage.js';
 import { AuthManager } from '../core/classes.authmanager.js';
 import type { IRequestContext, IResponse, IAuthToken } from '../core/interfaces.core.js';
+import type { IProtocolUpstreamConfig } from '../upstream/interfaces.upstream.js';
 import type {
   ICargoIndexEntry,
   ICargoPublishMetadata,
@@ -13,6 +14,7 @@ import type {
   ICargoSearchResponse,
   ICargoSearchResult,
 } from './interfaces.cargo.js';
+import { CargoUpstream } from './classes.cargoupstream.js';

 /**
  * Cargo/crates.io registry implementation
@@ -25,12 +27,14 @@ export class CargoRegistry extends BaseRegistry {
   private basePath: string = '/cargo';
   private registryUrl: string;
   private logger: Smartlog;
+  private upstream: CargoUpstream | null = null;

   constructor(
     storage: RegistryStorage,
     authManager: AuthManager,
     basePath: string = '/cargo',
-    registryUrl: string = 'http://localhost:5000/cargo'
+    registryUrl: string = 'http://localhost:5000/cargo',
+    upstreamConfig?: IProtocolUpstreamConfig
   ) {
     super();
     this.storage = storage;
@@ -50,6 +54,20 @@ export class CargoRegistry extends BaseRegistry {
       }
     });
     this.logger.enableConsole();
+
+    // Initialize upstream if configured
+    if (upstreamConfig?.enabled) {
+      this.upstream = new CargoUpstream(upstreamConfig, undefined, this.logger);
+    }
+  }
+
+  /**
+   * Clean up resources (timers, connections, etc.)
+   */
+  public destroy(): void {
+    if (this.upstream) {
+      this.upstream.stop();
+    }
   }

   public async init(): Promise<void> {
@@ -207,7 +225,25 @@ export class CargoRegistry extends BaseRegistry {
    * Serve index file for a crate
    */
   private async handleIndexFile(crateName: string): Promise<IResponse> {
-    const index = await this.storage.getCargoIndex(crateName);
+    let index = await this.storage.getCargoIndex(crateName);
+
+    // Try upstream if not found locally
+    if ((!index || index.length === 0) && this.upstream) {
+      const upstreamIndex = await this.upstream.fetchCrateIndex(crateName);
+      if (upstreamIndex) {
+        // Parse the newline-delimited JSON
+        const parsedIndex: ICargoIndexEntry[] = upstreamIndex
+          .split('\n')
+          .filter(line => line.trim())
+          .map(line => JSON.parse(line));
+        if (parsedIndex.length > 0) {
+          // Cache locally
+          await this.storage.putCargoIndex(crateName, parsedIndex);
+          index = parsedIndex;
+        }
+      }
+    }

     if (!index || index.length === 0) {
       return {
@@ -399,7 +435,16 @@ export class CargoRegistry extends BaseRegistry {
   ): Promise<IResponse> {
     this.logger.log('debug', 'handleDownload', { crate: crateName, version });

-    const crateFile = await this.storage.getCargoCrate(crateName, version);
+    let crateFile = await this.storage.getCargoCrate(crateName, version);
+
+    // Try upstream if not found locally
+    if (!crateFile && this.upstream) {
+      crateFile = await this.upstream.fetchCrate(crateName, version);
+      if (crateFile) {
+        // Cache locally
+        await this.storage.putCargoCrate(crateName, version, crateFile);
+      }
+    }

     if (!crateFile) {
       return {


@@ -0,0 +1,159 @@
import * as plugins from '../plugins.js';
import { BaseUpstream } from '../upstream/classes.baseupstream.js';
import type {
  IProtocolUpstreamConfig,
  IUpstreamFetchContext,
  IUpstreamRegistryConfig,
} from '../upstream/interfaces.upstream.js';

/**
 * Cargo-specific upstream implementation.
 *
 * Handles:
 * - Crate metadata (index) fetching
 * - Crate file (.crate) downloading
 * - Sparse index protocol support
 * - Content-addressable caching for .crate files
 */
export class CargoUpstream extends BaseUpstream {
  protected readonly protocolName = 'cargo';

  /** Base URL for crate downloads (may differ from index URL) */
  private readonly downloadUrl: string;

  constructor(
    config: IProtocolUpstreamConfig,
    downloadUrl?: string,
    logger?: plugins.smartlog.Smartlog,
  ) {
    super(config, logger);
    // Default to crates.io download URL if not specified
    this.downloadUrl = downloadUrl || 'https://static.crates.io/crates';
  }

  /**
   * Fetch crate metadata from the sparse index.
   */
  public async fetchCrateIndex(crateName: string): Promise<string | null> {
    const path = this.buildIndexPath(crateName);
    const context: IUpstreamFetchContext = {
      protocol: 'cargo',
      resource: crateName,
      resourceType: 'index',
      path,
      method: 'GET',
      headers: {
        'accept': 'text/plain',
      },
      query: {},
    };
    const result = await this.fetch(context);
    if (!result || !result.success) {
      return null;
    }
    if (Buffer.isBuffer(result.body)) {
      return result.body.toString('utf8');
    }
    return typeof result.body === 'string' ? result.body : null;
  }

  /**
   * Fetch a crate file from upstream.
   * Crate downloads typically go to a different URL than the index,
   * so this delegates to the download-URL-aware helper below.
   */
  public async fetchCrate(crateName: string, version: string): Promise<Buffer | null> {
    return this.fetchCrateFile(crateName, version);
  }

  /**
   * Fetch crate file directly from the download URL.
   */
  private async fetchCrateFile(crateName: string, version: string): Promise<Buffer | null> {
    const context: IUpstreamFetchContext = {
      protocol: 'cargo',
      resource: crateName,
      resourceType: 'crate',
      path: `/${crateName}/${crateName}-${version}.crate`,
      method: 'GET',
      headers: {
        'accept': 'application/octet-stream',
      },
      query: {},
    };
    const result = await this.fetch(context);
    if (!result || !result.success) {
      return null;
    }
    return Buffer.isBuffer(result.body) ? result.body : Buffer.from(result.body);
  }

  /**
   * Build the sparse index path for a crate.
   *
   * Path structure:
   * - 1 char: /1/{name}
   * - 2 chars: /2/{name}
   * - 3 chars: /3/{first char}/{name}
   * - 4+ chars: /{first 2}/{next 2}/{name}
   */
  private buildIndexPath(crateName: string): string {
    const lowerName = crateName.toLowerCase();
    const len = lowerName.length;
    if (len === 1) {
      return `/1/${lowerName}`;
    } else if (len === 2) {
      return `/2/${lowerName}`;
    } else if (len === 3) {
      return `/3/${lowerName[0]}/${lowerName}`;
    } else {
      return `/${lowerName.slice(0, 2)}/${lowerName.slice(2, 4)}/${lowerName}`;
    }
  }

  /**
   * Override URL building for Cargo-specific handling.
   */
  protected buildUpstreamUrl(
    upstream: IUpstreamRegistryConfig,
    context: IUpstreamFetchContext,
  ): string {
    let baseUrl = upstream.url;
    // For crate downloads, use the download URL
    if (context.resourceType === 'crate') {
      baseUrl = this.downloadUrl;
    }
    // Remove trailing slash
    if (baseUrl.endsWith('/')) {
      baseUrl = baseUrl.slice(0, -1);
    }
    return `${baseUrl}${context.path}`;
  }
}
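The path rules in `buildIndexPath` follow the crates.io sparse-index layout. As a standalone sketch (the `sparseIndexPath` function name is illustrative, not part of the library):

```typescript
// Standalone sketch of the sparse-index path rule used by
// CargoUpstream.buildIndexPath. Names are lowercased first, then
// bucketed by length.
function sparseIndexPath(crateName: string): string {
  const name = crateName.toLowerCase();
  if (name.length === 1) return `/1/${name}`;             // "a"   -> /1/a
  if (name.length === 2) return `/2/${name}`;             // "io"  -> /2/io
  if (name.length === 3) return `/3/${name[0]}/${name}`;  // "syn" -> /3/s/syn
  // 4+ chars: first two characters, next two characters, full name
  return `/${name.slice(0, 2)}/${name.slice(2, 4)}/${name}`; // "serde" -> /se/rd/serde
}

console.log(sparseIndexPath('serde')); // /se/rd/serde
```

A registry serving the sparse protocol answers `GET {index-url}{path}` for these paths with newline-delimited JSON, one version entry per line, which is exactly what `handleIndexFile` parses and caches.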


@@ -3,4 +3,5 @@
  */
 export { CargoRegistry } from './classes.cargoregistry.js';
+export { CargoUpstream } from './classes.cargoupstream.js';
 export * from './interfaces.cargo.js';


@@ -11,8 +11,39 @@ import { PypiRegistry } from './pypi/classes.pypiregistry.js';
 import { RubyGemsRegistry } from './rubygems/classes.rubygemsregistry.js';

 /**
- * Main registry orchestrator
- * Routes requests to appropriate protocol handlers (OCI, NPM, Maven, Cargo, Composer, PyPI, or RubyGems)
+ * Main registry orchestrator.
+ * Routes requests to appropriate protocol handlers (OCI, NPM, Maven, Cargo, Composer, PyPI, or RubyGems).
+ *
+ * Supports pluggable authentication and storage hooks:
+ *
+ * @example
+ * ```typescript
+ * // Basic usage with default in-memory auth
+ * const registry = new SmartRegistry(config);
+ *
+ * // With custom auth provider (LDAP, OAuth, etc.)
+ * const registry = new SmartRegistry({
+ *   ...config,
+ *   authProvider: new LdapAuthProvider(ldapClient),
+ * });
+ *
+ * // With storage hooks for quota tracking
+ * const registry = new SmartRegistry({
+ *   ...config,
+ *   storageHooks: {
+ *     beforePut: async (ctx) => {
+ *       const quota = await getQuota(ctx.actor?.orgId);
+ *       if (ctx.metadata?.size > quota) {
+ *         return { allowed: false, reason: 'Quota exceeded' };
+ *       }
+ *       return { allowed: true };
+ *     },
+ *     afterPut: async (ctx) => {
+ *       await auditLog('storage.put', ctx);
+ *     }
+ *   }
+ * });
+ * ```
  */
 export class SmartRegistry {
   private storage: RegistryStorage;
@@ -23,8 +54,12 @@ export class SmartRegistry {
   constructor(config: IRegistryConfig) {
     this.config = config;
-    this.storage = new RegistryStorage(config.storage);
-    this.authManager = new AuthManager(config.auth);
+
+    // Create storage with optional hooks
+    this.storage = new RegistryStorage(config.storage, config.storageHooks);
+
+    // Create auth manager with optional custom provider
+    this.authManager = new AuthManager(config.auth, config.authProvider);
   }

   /**
@@ -46,7 +81,13 @@ export class SmartRegistry {
       realm: this.config.auth.ociTokens.realm,
       service: this.config.auth.ociTokens.service,
     } : undefined;
-    const ociRegistry = new OciRegistry(this.storage, this.authManager, ociBasePath, ociTokens);
+    const ociRegistry = new OciRegistry(
+      this.storage,
+      this.authManager,
+      ociBasePath,
+      ociTokens,
+      this.config.oci.upstream
+    );
     await ociRegistry.init();
     this.registries.set('oci', ociRegistry);
   }
@@ -55,7 +96,13 @@ export class SmartRegistry {
     if (this.config.npm?.enabled) {
       const npmBasePath = this.config.npm.basePath ?? '/npm';
       const registryUrl = `http://localhost:5000${npmBasePath}`; // TODO: Make configurable
-      const npmRegistry = new NpmRegistry(this.storage, this.authManager, npmBasePath, registryUrl);
+      const npmRegistry = new NpmRegistry(
+        this.storage,
+        this.authManager,
+        npmBasePath,
+        registryUrl,
+        this.config.npm.upstream
+      );
       await npmRegistry.init();
       this.registries.set('npm', npmRegistry);
     }
@@ -64,7 +111,13 @@ export class SmartRegistry {
     if (this.config.maven?.enabled) {
       const mavenBasePath = this.config.maven.basePath ?? '/maven';
       const registryUrl = `http://localhost:5000${mavenBasePath}`; // TODO: Make configurable
-      const mavenRegistry = new MavenRegistry(this.storage, this.authManager, mavenBasePath, registryUrl);
+      const mavenRegistry = new MavenRegistry(
+        this.storage,
+        this.authManager,
+        mavenBasePath,
+        registryUrl,
+        this.config.maven.upstream
+      );
       await mavenRegistry.init();
       this.registries.set('maven', mavenRegistry);
     }
@@ -73,7 +126,13 @@ export class SmartRegistry {
     if (this.config.cargo?.enabled) {
       const cargoBasePath = this.config.cargo.basePath ?? '/cargo';
       const registryUrl = `http://localhost:5000${cargoBasePath}`; // TODO: Make configurable
-      const cargoRegistry = new CargoRegistry(this.storage, this.authManager, cargoBasePath, registryUrl);
+      const cargoRegistry = new CargoRegistry(
+        this.storage,
+        this.authManager,
+        cargoBasePath,
+        registryUrl,
+        this.config.cargo.upstream
+      );
       await cargoRegistry.init();
       this.registries.set('cargo', cargoRegistry);
     }
@@ -82,7 +141,13 @@ export class SmartRegistry {
     if (this.config.composer?.enabled) {
       const composerBasePath = this.config.composer.basePath ?? '/composer';
       const registryUrl = `http://localhost:5000${composerBasePath}`; // TODO: Make configurable
-      const composerRegistry = new ComposerRegistry(this.storage, this.authManager, composerBasePath, registryUrl);
+      const composerRegistry = new ComposerRegistry(
+        this.storage,
+        this.authManager,
+        composerBasePath,
+        registryUrl,
+        this.config.composer.upstream
+      );
       await composerRegistry.init();
       this.registries.set('composer', composerRegistry);
     }
@@ -91,7 +156,13 @@ export class SmartRegistry {
     if (this.config.pypi?.enabled) {
       const pypiBasePath = this.config.pypi.basePath ?? '/pypi';
       const registryUrl = `http://localhost:5000`; // TODO: Make configurable
-      const pypiRegistry = new PypiRegistry(this.storage, this.authManager, pypiBasePath, registryUrl);
+      const pypiRegistry = new PypiRegistry(
+        this.storage,
+        this.authManager,
+        pypiBasePath,
+        registryUrl,
+        this.config.pypi.upstream
+      );
       await pypiRegistry.init();
       this.registries.set('pypi', pypiRegistry);
     }
@@ -100,7 +171,13 @@ export class SmartRegistry {
     if (this.config.rubygems?.enabled) {
       const rubygemsBasePath = this.config.rubygems.basePath ?? '/rubygems';
       const registryUrl = `http://localhost:5000${rubygemsBasePath}`; // TODO: Make configurable
-      const rubygemsRegistry = new RubyGemsRegistry(this.storage, this.authManager, rubygemsBasePath, registryUrl);
+      const rubygemsRegistry = new RubyGemsRegistry(
+        this.storage,
+        this.authManager,
+        rubygemsBasePath,
+        registryUrl,
+        this.config.rubygems.upstream
+      );
       await rubygemsRegistry.init();
       this.registries.set('rubygems', rubygemsRegistry);
     }


@@ -7,6 +7,7 @@ import { BaseRegistry } from '../core/classes.baseregistry.js';
 import type { RegistryStorage } from '../core/classes.registrystorage.js';
 import type { AuthManager } from '../core/classes.authmanager.js';
 import type { IRequestContext, IResponse, IAuthToken } from '../core/interfaces.core.js';
+import type { IProtocolUpstreamConfig } from '../upstream/interfaces.upstream.js';
 import { isBinaryData, toBuffer } from '../core/helpers.buffer.js';
 import type {
   IComposerPackage,
@@ -22,24 +23,41 @@ import {
   generatePackagesJson,
   sortVersions,
 } from './helpers.composer.js';
+import { ComposerUpstream } from './classes.composerupstream.js';

 export class ComposerRegistry extends BaseRegistry {
   private storage: RegistryStorage;
   private authManager: AuthManager;
   private basePath: string = '/composer';
   private registryUrl: string;
+  private upstream: ComposerUpstream | null = null;

   constructor(
     storage: RegistryStorage,
     authManager: AuthManager,
     basePath: string = '/composer',
-    registryUrl: string = 'http://localhost:5000/composer'
+    registryUrl: string = 'http://localhost:5000/composer',
+    upstreamConfig?: IProtocolUpstreamConfig
   ) {
     super();
     this.storage = storage;
     this.authManager = authManager;
     this.basePath = basePath;
     this.registryUrl = registryUrl;
+
+    // Initialize upstream if configured
+    if (upstreamConfig?.enabled) {
+      this.upstream = new ComposerUpstream(upstreamConfig);
+    }
+  }
+
+  /**
+   * Clean up resources (timers, connections, etc.)
+   */
+  public destroy(): void {
+    if (this.upstream) {
+      this.upstream.stop();
+    }
   }

   public async init(): Promise<void> {
@@ -161,7 +179,26 @@ export class ComposerRegistry extends BaseRegistry {
     token: IAuthToken | null
   ): Promise<IResponse> {
     // Read operations are public, no authentication required
-    const metadata = await this.storage.getComposerPackageMetadata(vendorPackage);
+    let metadata = await this.storage.getComposerPackageMetadata(vendorPackage);
+
+    // Try upstream if not found locally
+    if (!metadata && this.upstream) {
+      const [vendor, packageName] = vendorPackage.split('/');
+      if (vendor && packageName) {
+        const upstreamMetadata = includeDev
+          ? await this.upstream.fetchPackageDevMetadata(vendor, packageName)
+          : await this.upstream.fetchPackageMetadata(vendor, packageName);
+        if (upstreamMetadata && upstreamMetadata.packages) {
+          // Store upstream metadata locally
+          metadata = {
+            packages: upstreamMetadata.packages,
+            lastModified: new Date().toUTCString(),
+          };
+          await this.storage.putComposerPackageMetadata(vendorPackage, metadata);
+        }
+      }
+    }

     if (!metadata) {
       return {


@@ -0,0 +1,200 @@
import * as plugins from '../plugins.js';
import { BaseUpstream } from '../upstream/classes.baseupstream.js';
import type {
  IProtocolUpstreamConfig,
  IUpstreamFetchContext,
  IUpstreamRegistryConfig,
} from '../upstream/interfaces.upstream.js';

/**
 * Composer-specific upstream implementation.
 *
 * Handles:
 * - Package metadata fetching (packages.json, provider-includes)
 * - Package version metadata (p2/{vendor}/{package}.json)
 * - Dist file (zip) proxying
 * - Packagist v2 API support
 */
export class ComposerUpstream extends BaseUpstream {
  protected readonly protocolName = 'composer';

  constructor(
    config: IProtocolUpstreamConfig,
    logger?: plugins.smartlog.Smartlog,
  ) {
    super(config, logger);
  }

  /**
   * Fetch the root packages.json from upstream.
   */
  public async fetchPackagesJson(): Promise<any | null> {
    const context: IUpstreamFetchContext = {
      protocol: 'composer',
      resource: '*',
      resourceType: 'root',
      path: '/packages.json',
      method: 'GET',
      headers: {
        'accept': 'application/json',
      },
      query: {},
    };
    const result = await this.fetch(context);
    if (!result || !result.success) {
      return null;
    }
    if (Buffer.isBuffer(result.body)) {
      return JSON.parse(result.body.toString('utf8'));
    }
    return result.body;
  }

  /**
   * Fetch package metadata using v2 API (p2/{vendor}/{package}.json).
   */
  public async fetchPackageMetadata(vendor: string, packageName: string): Promise<any | null> {
    const fullName = `${vendor}/${packageName}`;
    const path = `/p2/${vendor}/${packageName}.json`;
    const context: IUpstreamFetchContext = {
      protocol: 'composer',
      resource: fullName,
      resourceType: 'metadata',
      path,
      method: 'GET',
      headers: {
        'accept': 'application/json',
      },
      query: {},
    };
    const result = await this.fetch(context);
    if (!result || !result.success) {
      return null;
    }
    if (Buffer.isBuffer(result.body)) {
      return JSON.parse(result.body.toString('utf8'));
    }
    return result.body;
  }

  /**
   * Fetch package metadata with dev versions (p2/{vendor}/{package}~dev.json).
   */
  public async fetchPackageDevMetadata(vendor: string, packageName: string): Promise<any | null> {
    const fullName = `${vendor}/${packageName}`;
    const path = `/p2/${vendor}/${packageName}~dev.json`;
    const context: IUpstreamFetchContext = {
      protocol: 'composer',
      resource: fullName,
      resourceType: 'metadata-dev',
      path,
      method: 'GET',
      headers: {
        'accept': 'application/json',
      },
      query: {},
    };
    const result = await this.fetch(context);
    if (!result || !result.success) {
      return null;
    }
    if (Buffer.isBuffer(result.body)) {
      return JSON.parse(result.body.toString('utf8'));
    }
    return result.body;
  }

  /**
   * Fetch a provider-includes file.
   */
  public async fetchProviderIncludes(path: string): Promise<any | null> {
    const context: IUpstreamFetchContext = {
      protocol: 'composer',
      resource: '*',
      resourceType: 'provider',
      path: path.startsWith('/') ? path : `/${path}`,
      method: 'GET',
      headers: {
        'accept': 'application/json',
      },
      query: {},
    };
    const result = await this.fetch(context);
    if (!result || !result.success) {
      return null;
    }
    if (Buffer.isBuffer(result.body)) {
      return JSON.parse(result.body.toString('utf8'));
    }
    return result.body;
  }

  /**
   * Fetch a dist file (zip) from upstream.
   */
  public async fetchDist(url: string): Promise<Buffer | null> {
    // Parse the URL to get the path
    let path: string;
    try {
      const parsed = new URL(url);
      path = parsed.pathname;
    } catch {
      path = url;
    }
    const context: IUpstreamFetchContext = {
      protocol: 'composer',
      resource: '*',
      resourceType: 'dist',
      path,
      method: 'GET',
      headers: {
        'accept': 'application/zip, application/octet-stream',
      },
      query: {},
    };
    const result = await this.fetch(context);
    if (!result || !result.success) {
      return null;
    }
    return Buffer.isBuffer(result.body) ? result.body : Buffer.from(result.body);
  }

  /**
   * Override URL building for Composer-specific handling.
   */
  protected buildUpstreamUrl(
    upstream: IUpstreamRegistryConfig,
    context: IUpstreamFetchContext,
  ): string {
    let baseUrl = upstream.url;
    // Remove trailing slash
    if (baseUrl.endsWith('/')) {
      baseUrl = baseUrl.slice(0, -1);
    }
    return `${baseUrl}${context.path}`;
  }
}
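The Packagist v2 metadata endpoints used by `fetchPackageMetadata` and `fetchPackageDevMetadata` differ only in a `~dev` suffix. As a standalone sketch (the `p2MetadataPath` helper name is illustrative, not part of the library):

```typescript
// Sketch of the Packagist v2 metadata paths built by ComposerUpstream.
// Stable releases: /p2/{vendor}/{package}.json
// Dev versions:    /p2/{vendor}/{package}~dev.json
function p2MetadataPath(vendor: string, pkg: string, dev = false): string {
  return `/p2/${vendor}/${pkg}${dev ? '~dev' : ''}.json`;
}

console.log(p2MetadataPath('monolog', 'monolog'));       // /p2/monolog/monolog.json
console.log(p2MetadataPath('monolog', 'monolog', true)); // /p2/monolog/monolog~dev.json
```

Both responses carry a `packages` map, which is why the registry's fallback in `handlePackageMetadata` can cache either variant under the same `vendor/package` key.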


@@ -4,5 +4,6 @@
  */
 export { ComposerRegistry } from './classes.composerregistry.js';
+export { ComposerUpstream } from './classes.composerupstream.js';
 export * from './interfaces.composer.js';
 export * from './helpers.composer.js';


@@ -1,109 +1,79 @@
 import type { IAuthConfig, IAuthToken, ICredentials, TRegistryProtocol } from './interfaces.core.js';
-import * as crypto from 'crypto';
+import type { IAuthProvider, ITokenOptions } from './interfaces.auth.js';
+import { DefaultAuthProvider } from './classes.defaultauthprovider.js';

 /**
- * Unified authentication manager for all registry protocols
- * Handles both NPM UUID tokens and OCI JWT tokens
+ * Unified authentication manager for all registry protocols.
+ * Delegates to a pluggable IAuthProvider for actual auth operations.
+ *
+ * @example
+ * ```typescript
+ * // Use default in-memory provider
+ * const auth = new AuthManager(config);
+ *
+ * // Use custom provider (LDAP, OAuth, etc.)
+ * const auth = new AuthManager(config, new LdapAuthProvider(ldapClient));
+ * ```
  */
 export class AuthManager {
-  private tokenStore: Map<string, IAuthToken> = new Map();
-  private userCredentials: Map<string, string> = new Map(); // username -> password hash (mock)
+  private provider: IAuthProvider;

-  constructor(private config: IAuthConfig) {}
+  constructor(
+    private config: IAuthConfig,
+    provider?: IAuthProvider
+  ) {
+    // Use provided provider or default in-memory implementation
+    this.provider = provider || new DefaultAuthProvider(config);
+  }

   /**
    * Initialize the auth manager
    */
   public async init(): Promise<void> {
-    // Initialize token store (in-memory for now)
-    // In production, this could be Redis or a database
+    if (this.provider.init) {
+      await this.provider.init();
+    }
   }

   // ========================================================================
-  // UUID TOKEN CREATION (Base method for NPM, Maven, etc.)
+  // UNIFIED AUTHENTICATION (Delegated to Provider)
   // ========================================================================

   /**
-   * Create a UUID-based token with custom scopes (base method)
-   * @param userId - User ID
-   * @param protocol - Protocol type
-   * @param scopes - Permission scopes
-   * @param readonly - Whether the token is readonly
-   * @returns UUID token string
+   * Authenticate user credentials
+   * @param credentials - Username and password
+   * @returns User ID or null
    */
-  private async createUuidToken(
-    userId: string,
-    protocol: TRegistryProtocol,
-    scopes: string[],
-    readonly: boolean = false
-  ): Promise<string> {
-    const token = this.generateUuid();
-    const authToken: IAuthToken = {
-      type: protocol,
-      userId,
-      scopes,
-      readonly,
-      metadata: {
-        created: new Date().toISOString(),
-      },
-    };
-    this.tokenStore.set(token, authToken);
-    return token;
+  public async authenticate(credentials: ICredentials): Promise<string | null> {
+    return this.provider.authenticate(credentials);
   }

   /**
-   * Generic protocol token creation (internal helper)
-   * @param userId - User ID
-   * @param protocol - Protocol type (npm, maven, composer, etc.)
-   * @param readonly - Whether the token is readonly
-   * @returns UUID token string
-   */
-  private async createProtocolToken(
-    userId: string,
-    protocol: TRegistryProtocol,
-    readonly: boolean
-  ): Promise<string> {
-    const scopes = readonly
-      ? [`${protocol}:*:*:read`]
-      : [`${protocol}:*:*:*`];
-    return this.createUuidToken(userId, protocol, scopes, readonly);
-  }
-
-  /**
-   * Generic protocol token validation (internal helper)
-   * @param token - UUID token string
-   * @param protocol - Expected protocol type
+   * Validate any token (NPM, Maven, OCI, PyPI, RubyGems, Composer, Cargo)
+   * @param tokenString - Token string (UUID or JWT)
+   * @param protocol - Expected protocol type (optional, improves performance)
    * @returns Auth token object or null
    */
-  private async validateProtocolToken(
-    token: string,
-    protocol: TRegistryProtocol
+  public async validateToken(
+    tokenString: string,
+    protocol?: TRegistryProtocol
   ): Promise<IAuthToken | null> {
-    if (!this.isValidUuid(token)) {
-      return null;
-    }
-    const authToken = this.tokenStore.get(token);
-    if (!authToken || authToken.type !== protocol) {
-      return null;
-    }
-    // Check expiration if set
-    if (authToken.expiresAt && authToken.expiresAt < new Date()) {
-      this.tokenStore.delete(token);
-      return null;
-    }
-    return authToken;
+    return this.provider.validateToken(tokenString, protocol);
   }

   /**
-   * Generic protocol token revocation (internal helper)
-   * @param token - UUID token string
+   * Check if token has permission for an action
+   * @param token - Auth token (or null for anonymous)
+   * @param resource - Resource being accessed (e.g., "npm:package:foo")
+   * @param action - Action being performed (read, write, push, pull, delete)
+   * @returns true if authorized
    */
-  private async revokeProtocolToken(token: string): Promise<void> {
-    this.tokenStore.delete(token);
+  public async authorize(
+    token: IAuthToken | null,
+    resource: string,
+    action: string
+  ): Promise<boolean> {
+    return this.provider.authorize(token, resource, action);
   }

   // ========================================================================
@@ -120,7 +90,7 @@ export class AuthManager {
     if (!this.config.npmTokens.enabled) {
       throw new Error('NPM tokens are not enabled');
     }
-    return this.createProtocolToken(userId, 'npm', readonly);
+    return this.provider.createToken(userId, 'npm', { readonly });
   }

   /**
@@ -129,7 +99,7 @@ export class AuthManager {
    * @returns Auth token object or null
    */
   public async validateNpmToken(token: string): Promise<IAuthToken | null> {
-    return this.validateProtocolToken(token, 'npm');
+    return this.provider.validateToken(token, 'npm');
   }

   /**
@@ -137,7 +107,7 @@ export class AuthManager {
    * @param token - NPM UUID token
    */
   public async revokeNpmToken(token: string): Promise<void> {
-    return this.revokeProtocolToken(token);
+    return this.provider.revokeToken(token);
   }

   /**
@@ -149,20 +119,12 @@ export class AuthManager {
     key: string;
     readonly: boolean;
     created: string;
+    protocol?: TRegistryProtocol;
   }>> {
-    const tokens: Array<{key: string; readonly: boolean; created: string}> = [];
-
-    for (const [token, authToken] of this.tokenStore.entries()) {
-      if (authToken.userId === userId) {
-        tokens.push({
-          key: this.hashToken(token),
-          readonly: authToken.readonly || false,
-          created: authToken.metadata?.created || 'unknown',
-        });
-      }
-    }
-    return tokens;
+    if (this.provider.listUserTokens) {
+      return this.provider.listUserTokens(userId);
+    }
+    return [];
   }

   // ========================================================================
@@ -174,39 +136,17 @@ export class AuthManager {
* @param userId - User ID * @param userId - User ID
* @param scopes - Permission scopes * @param scopes - Permission scopes
* @param expiresIn - Expiration time in seconds * @param expiresIn - Expiration time in seconds
* @returns JWT token string (HMAC-SHA256 signed) * @returns JWT token string
*/ */
public async createOciToken( public async createOciToken(
userId: string, userId: string,
scopes: string[], scopes: string[],
expiresIn: number = 3600 expiresIn: number = 3600
): Promise<string> { ): Promise<string> {
if (!this.config.ociTokens.enabled) { if (!this.config.ociTokens?.enabled) {
throw new Error('OCI tokens are not enabled'); throw new Error('OCI tokens are not enabled');
} }
return this.provider.createToken(userId, 'oci', { scopes, expiresIn });
const now = Math.floor(Date.now() / 1000);
const payload = {
iss: this.config.ociTokens.realm,
sub: userId,
aud: this.config.ociTokens.service,
exp: now + expiresIn,
nbf: now,
iat: now,
access: this.scopesToOciAccess(scopes),
};
// Create JWT with HMAC-SHA256 signature
const header = { alg: 'HS256', typ: 'JWT' };
const headerB64 = Buffer.from(JSON.stringify(header)).toString('base64url');
const payloadB64 = Buffer.from(JSON.stringify(payload)).toString('base64url');
const signature = crypto
.createHmac('sha256', this.config.jwtSecret)
.update(`${headerB64}.${payloadB64}`)
.digest('base64url');
return `${headerB64}.${payloadB64}.${signature}`;
} }
/** /**
@@ -215,80 +155,7 @@ export class AuthManager {
* @returns Auth token object or null * @returns Auth token object or null
*/ */
public async validateOciToken(jwt: string): Promise<IAuthToken | null> { public async validateOciToken(jwt: string): Promise<IAuthToken | null> {
try { return this.provider.validateToken(jwt, 'oci');
const parts = jwt.split('.');
if (parts.length !== 3) {
return null;
}
const [headerB64, payloadB64, signatureB64] = parts;
// Verify signature
const expectedSignature = crypto
.createHmac('sha256', this.config.jwtSecret)
.update(`${headerB64}.${payloadB64}`)
.digest('base64url');
if (signatureB64 !== expectedSignature) {
return null;
}
// Decode and parse payload
const payload = JSON.parse(Buffer.from(payloadB64, 'base64url').toString('utf-8'));
// Check expiration
const now = Math.floor(Date.now() / 1000);
if (payload.exp && payload.exp < now) {
return null;
}
// Check not-before time
if (payload.nbf && payload.nbf > now) {
return null;
}
// Convert to unified token format
const scopes = this.ociAccessToScopes(payload.access || []);
return {
type: 'oci',
userId: payload.sub,
scopes,
expiresAt: payload.exp ? new Date(payload.exp * 1000) : undefined,
metadata: {
iss: payload.iss,
aud: payload.aud,
},
};
} catch (error) {
return null;
}
}
// ========================================================================
// UNIFIED AUTHENTICATION
// ========================================================================
/**
* Authenticate user credentials
* @param credentials - Username and password
* @returns User ID or null
*/
public async authenticate(credentials: ICredentials): Promise<string | null> {
// Mock authentication - in production, verify against database
const storedPassword = this.userCredentials.get(credentials.username);
if (!storedPassword) {
// Auto-register for testing (remove in production)
this.userCredentials.set(credentials.username, credentials.password);
return credentials.username;
}
if (storedPassword === credentials.password) {
return credentials.username;
}
return null;
} }
// ======================================================================== // ========================================================================
@@ -302,7 +169,7 @@ export class AuthManager {
* @returns Maven UUID token * @returns Maven UUID token
*/ */
public async createMavenToken(userId: string, readonly: boolean = false): Promise<string> { public async createMavenToken(userId: string, readonly: boolean = false): Promise<string> {
return this.createProtocolToken(userId, 'maven', readonly); return this.provider.createToken(userId, 'maven', { readonly });
} }
/** /**
@@ -311,7 +178,7 @@ export class AuthManager {
* @returns Auth token object or null * @returns Auth token object or null
*/ */
public async validateMavenToken(token: string): Promise<IAuthToken | null> { public async validateMavenToken(token: string): Promise<IAuthToken | null> {
return this.validateProtocolToken(token, 'maven'); return this.provider.validateToken(token, 'maven');
} }
/** /**
@@ -319,7 +186,7 @@ export class AuthManager {
* @param token - Maven UUID token * @param token - Maven UUID token
*/ */
public async revokeMavenToken(token: string): Promise<void> { public async revokeMavenToken(token: string): Promise<void> {
return this.revokeProtocolToken(token); return this.provider.revokeToken(token);
} }
// ======================================================================== // ========================================================================
@@ -333,7 +200,7 @@ export class AuthManager {
* @returns Composer UUID token * @returns Composer UUID token
*/ */
public async createComposerToken(userId: string, readonly: boolean = false): Promise<string> { public async createComposerToken(userId: string, readonly: boolean = false): Promise<string> {
return this.createProtocolToken(userId, 'composer', readonly); return this.provider.createToken(userId, 'composer', { readonly });
} }
/** /**
@@ -342,7 +209,7 @@ export class AuthManager {
* @returns Auth token object or null * @returns Auth token object or null
*/ */
public async validateComposerToken(token: string): Promise<IAuthToken | null> { public async validateComposerToken(token: string): Promise<IAuthToken | null> {
return this.validateProtocolToken(token, 'composer'); return this.provider.validateToken(token, 'composer');
} }
/** /**
@@ -350,7 +217,7 @@ export class AuthManager {
* @param token - Composer UUID token * @param token - Composer UUID token
*/ */
public async revokeComposerToken(token: string): Promise<void> { public async revokeComposerToken(token: string): Promise<void> {
return this.revokeProtocolToken(token); return this.provider.revokeToken(token);
} }
// ======================================================================== // ========================================================================
@@ -364,7 +231,7 @@ export class AuthManager {
* @returns Cargo UUID token * @returns Cargo UUID token
*/ */
public async createCargoToken(userId: string, readonly: boolean = false): Promise<string> { public async createCargoToken(userId: string, readonly: boolean = false): Promise<string> {
return this.createProtocolToken(userId, 'cargo', readonly); return this.provider.createToken(userId, 'cargo', { readonly });
} }
/** /**
@@ -373,7 +240,7 @@ export class AuthManager {
* @returns Auth token object or null * @returns Auth token object or null
*/ */
public async validateCargoToken(token: string): Promise<IAuthToken | null> { public async validateCargoToken(token: string): Promise<IAuthToken | null> {
return this.validateProtocolToken(token, 'cargo'); return this.provider.validateToken(token, 'cargo');
} }
/** /**
@@ -381,7 +248,7 @@ export class AuthManager {
* @param token - Cargo UUID token * @param token - Cargo UUID token
*/ */
public async revokeCargoToken(token: string): Promise<void> { public async revokeCargoToken(token: string): Promise<void> {
return this.revokeProtocolToken(token); return this.provider.revokeToken(token);
} }
// ======================================================================== // ========================================================================
@@ -395,7 +262,7 @@ export class AuthManager {
* @returns PyPI UUID token * @returns PyPI UUID token
*/ */
public async createPypiToken(userId: string, readonly: boolean = false): Promise<string> { public async createPypiToken(userId: string, readonly: boolean = false): Promise<string> {
return this.createProtocolToken(userId, 'pypi', readonly); return this.provider.createToken(userId, 'pypi', { readonly });
} }
/** /**
@@ -404,7 +271,7 @@ export class AuthManager {
* @returns Auth token object or null * @returns Auth token object or null
*/ */
public async validatePypiToken(token: string): Promise<IAuthToken | null> { public async validatePypiToken(token: string): Promise<IAuthToken | null> {
return this.validateProtocolToken(token, 'pypi'); return this.provider.validateToken(token, 'pypi');
} }
/** /**
@@ -412,7 +279,7 @@ export class AuthManager {
* @param token - PyPI UUID token * @param token - PyPI UUID token
*/ */
public async revokePypiToken(token: string): Promise<void> { public async revokePypiToken(token: string): Promise<void> {
return this.revokeProtocolToken(token); return this.provider.revokeToken(token);
} }
// ======================================================================== // ========================================================================
@@ -426,7 +293,7 @@ export class AuthManager {
* @returns RubyGems UUID token * @returns RubyGems UUID token
*/ */
public async createRubyGemsToken(userId: string, readonly: boolean = false): Promise<string> { public async createRubyGemsToken(userId: string, readonly: boolean = false): Promise<string> {
return this.createProtocolToken(userId, 'rubygems', readonly); return this.provider.createToken(userId, 'rubygems', { readonly });
} }
/** /**
@@ -435,7 +302,7 @@ export class AuthManager {
* @returns Auth token object or null * @returns Auth token object or null
*/ */
public async validateRubyGemsToken(token: string): Promise<IAuthToken | null> { public async validateRubyGemsToken(token: string): Promise<IAuthToken | null> {
return this.validateProtocolToken(token, 'rubygems'); return this.provider.validateToken(token, 'rubygems');
} }
/** /**
@@ -443,211 +310,6 @@ export class AuthManager {
* @param token - RubyGems UUID token * @param token - RubyGems UUID token
*/ */
public async revokeRubyGemsToken(token: string): Promise<void> { public async revokeRubyGemsToken(token: string): Promise<void> {
return this.revokeProtocolToken(token); return this.provider.revokeToken(token);
}
// ========================================================================
// UNIFIED AUTHENTICATION
// ========================================================================
/**
* Validate any token (NPM, Maven, OCI, PyPI, RubyGems, Composer, Cargo)
* Optimized: O(1) lookup when protocol hint provided
* @param tokenString - Token string (UUID or JWT)
* @param protocol - Expected protocol type (optional, improves performance)
* @returns Auth token object or null
*/
public async validateToken(
tokenString: string,
protocol?: TRegistryProtocol
): Promise<IAuthToken | null> {
// OCI uses JWT (contains dots), not UUID - check first if OCI is expected
if (protocol === 'oci' || tokenString.includes('.')) {
const ociToken = await this.validateOciToken(tokenString);
if (ociToken && (!protocol || protocol === 'oci')) {
return ociToken;
}
// If protocol was explicitly OCI but validation failed, return null
if (protocol === 'oci') {
return null;
}
}
// UUID-based tokens: single O(1) Map lookup
if (this.isValidUuid(tokenString)) {
const authToken = this.tokenStore.get(tokenString);
if (authToken) {
// If protocol specified, verify it matches
if (protocol && authToken.type !== protocol) {
return null;
}
// Check expiration
if (authToken.expiresAt && authToken.expiresAt < new Date()) {
this.tokenStore.delete(tokenString);
return null;
}
return authToken;
}
}
return null;
}
/**
* Check if token has permission for an action
* @param token - Auth token
* @param resource - Resource being accessed (e.g., "package:foo" or "repository:bar")
* @param action - Action being performed (read, write, push, pull, delete)
* @returns true if authorized
*/
public async authorize(
token: IAuthToken | null,
resource: string,
action: string
): Promise<boolean> {
if (!token) {
return false;
}
// Check readonly flag
if (token.readonly && ['write', 'push', 'delete'].includes(action)) {
return false;
}
// Check scopes
for (const scope of token.scopes) {
if (this.matchesScope(scope, resource, action)) {
return true;
}
}
return false;
}
// ========================================================================
// HELPER METHODS
// ========================================================================
/**
* Check if a scope matches a resource and action
* Scope format: "{protocol}:{type}:{name}:{action}"
* Examples:
* - "npm:*:*" - All NPM access
* - "npm:package:foo:*" - All actions on package foo
* - "npm:package:foo:read" - Read-only on package foo
* - "oci:repository:*:pull" - Pull from any OCI repo
*/
private matchesScope(scope: string, resource: string, action: string): boolean {
const scopeParts = scope.split(':');
const resourceParts = resource.split(':');
// Scope must have at least protocol:type:name:action
if (scopeParts.length < 4) {
return false;
}
const [scopeProtocol, scopeType, scopeName, scopeAction] = scopeParts;
const [resourceProtocol, resourceType, resourceName] = resourceParts;
// Check protocol
if (scopeProtocol !== '*' && scopeProtocol !== resourceProtocol) {
return false;
}
// Check type
if (scopeType !== '*' && scopeType !== resourceType) {
return false;
}
// Check name
if (scopeName !== '*' && scopeName !== resourceName) {
return false;
}
// Check action
if (scopeAction !== '*' && scopeAction !== action) {
// Map action aliases
const actionAliases: Record<string, string[]> = {
read: ['pull', 'get'],
write: ['push', 'put', 'post'],
};
const aliases = actionAliases[scopeAction] || [];
if (!aliases.includes(action)) {
return false;
}
}
return true;
}
/**
* Convert unified scopes to OCI access array
*/
private scopesToOciAccess(scopes: string[]): Array<{
type: string;
name: string;
actions: string[];
}> {
const access: Array<{type: string; name: string; actions: string[]}> = [];
for (const scope of scopes) {
const parts = scope.split(':');
if (parts.length >= 4 && parts[0] === 'oci') {
access.push({
type: parts[1],
name: parts[2],
actions: [parts[3]],
});
}
}
return access;
}
/**
* Convert OCI access array to unified scopes
*/
private ociAccessToScopes(access: Array<{
type: string;
name: string;
actions: string[];
}>): string[] {
const scopes: string[] = [];
for (const item of access) {
for (const action of item.actions) {
scopes.push(`oci:${item.type}:${item.name}:${action}`);
}
}
return scopes;
}
/**
* Generate UUID for NPM tokens
*/
private generateUuid(): string {
return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, (c) => {
const r = (Math.random() * 16) | 0;
const v = c === 'x' ? r : (r & 0x3) | 0x8;
return v.toString(16);
});
}
/**
* Check if string is a valid UUID
*/
private isValidUuid(str: string): boolean {
const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
return uuidRegex.test(str);
}
/**
* Hash a token for identification (SHA-512 mock)
*/
private hashToken(token: string): string {
// In production, use actual SHA-512
return `sha512-${token.substring(0, 16)}...`;
} }
} }
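For reference, the JWT mechanics that `createOciToken`/`validateOciToken` now delegate to the provider are plain HMAC-SHA256 signing over base64url-encoded segments. A standalone sketch of that sign/verify round trip (the secret and payload fields here are illustrative stand-ins, not the registry's real config):

```typescript
import * as crypto from 'crypto';

const SECRET = 'test-secret'; // stand-in for config.jwtSecret

// Sign: base64url(header).base64url(payload).HMAC-SHA256 over both segments
function signJwt(payload: object): string {
  const headerB64 = Buffer.from(JSON.stringify({ alg: 'HS256', typ: 'JWT' })).toString('base64url');
  const payloadB64 = Buffer.from(JSON.stringify(payload)).toString('base64url');
  const signature = crypto
    .createHmac('sha256', SECRET)
    .update(`${headerB64}.${payloadB64}`)
    .digest('base64url');
  return `${headerB64}.${payloadB64}.${signature}`;
}

// Verify: recompute the signature, compare, then decode the payload
function verifyJwt(jwt: string): Record<string, unknown> | null {
  const parts = jwt.split('.');
  if (parts.length !== 3) return null;
  const [headerB64, payloadB64, signatureB64] = parts;
  const expected = crypto
    .createHmac('sha256', SECRET)
    .update(`${headerB64}.${payloadB64}`)
    .digest('base64url');
  if (signatureB64 !== expected) return null;
  return JSON.parse(Buffer.from(payloadB64, 'base64url').toString('utf-8'));
}

const token = signJwt({ sub: 'user123', exp: Math.floor(Date.now() / 1000) + 3600 });
const decoded = verifyJwt(token);
```

Verification recomputes the HMAC over `header.payload` and rejects on any mismatch, so tampering with either segment invalidates the token.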

View File

@@ -0,0 +1,393 @@
import * as crypto from 'crypto';
import type { IAuthProvider, ITokenOptions } from './interfaces.auth.js';
import type { IAuthConfig, IAuthToken, ICredentials, TRegistryProtocol } from './interfaces.core.js';
/**
* Default in-memory authentication provider.
* This is the reference implementation that stores tokens in memory.
* For production use, implement IAuthProvider with Redis, database, or external auth.
*/
export class DefaultAuthProvider implements IAuthProvider {
private tokenStore: Map<string, IAuthToken> = new Map();
private userCredentials: Map<string, string> = new Map(); // username -> password hash (mock)
constructor(private config: IAuthConfig) {}
/**
* Initialize the auth provider
*/
public async init(): Promise<void> {
// Initialize token store (in-memory for now)
// In production, this could be Redis or a database
}
// ========================================================================
// IAuthProvider Implementation
// ========================================================================
/**
* Authenticate user credentials
*/
public async authenticate(credentials: ICredentials): Promise<string | null> {
// Mock authentication - in production, verify against database/LDAP
const storedPassword = this.userCredentials.get(credentials.username);
if (!storedPassword) {
// Auto-register for testing (remove in production)
this.userCredentials.set(credentials.username, credentials.password);
return credentials.username;
}
if (storedPassword === credentials.password) {
return credentials.username;
}
return null;
}
/**
* Validate any token (NPM, Maven, OCI, PyPI, RubyGems, Composer, Cargo)
*/
public async validateToken(
tokenString: string,
protocol?: TRegistryProtocol
): Promise<IAuthToken | null> {
// OCI uses JWT (contains dots), not UUID - check first if OCI is expected
if (protocol === 'oci' || tokenString.includes('.')) {
const ociToken = await this.validateOciToken(tokenString);
if (ociToken && (!protocol || protocol === 'oci')) {
return ociToken;
}
// If protocol was explicitly OCI but validation failed, return null
if (protocol === 'oci') {
return null;
}
}
// UUID-based tokens: single O(1) Map lookup
if (this.isValidUuid(tokenString)) {
const authToken = this.tokenStore.get(tokenString);
if (authToken) {
// If protocol specified, verify it matches
if (protocol && authToken.type !== protocol) {
return null;
}
// Check expiration
if (authToken.expiresAt && authToken.expiresAt < new Date()) {
this.tokenStore.delete(tokenString);
return null;
}
return authToken;
}
}
return null;
}
/**
* Create a new token for a user
*/
public async createToken(
userId: string,
protocol: TRegistryProtocol,
options?: ITokenOptions
): Promise<string> {
// OCI tokens use JWT
if (protocol === 'oci') {
return this.createOciToken(userId, options?.scopes || ['oci:*:*:*'], options?.expiresIn || 3600);
}
// All other protocols use UUID tokens
const token = this.generateUuid();
const scopes = options?.scopes || (options?.readonly
? [`${protocol}:*:*:read`]
: [`${protocol}:*:*:*`]);
const authToken: IAuthToken = {
type: protocol,
userId,
scopes,
readonly: options?.readonly,
expiresAt: options?.expiresIn ? new Date(Date.now() + options.expiresIn * 1000) : undefined,
metadata: {
created: new Date().toISOString(),
},
};
this.tokenStore.set(token, authToken);
return token;
}
/**
* Revoke a token
*/
public async revokeToken(token: string): Promise<void> {
this.tokenStore.delete(token);
}
/**
* Check if token has permission for an action
*/
public async authorize(
token: IAuthToken | null,
resource: string,
action: string
): Promise<boolean> {
if (!token) {
return false;
}
// Check readonly flag
if (token.readonly && ['write', 'push', 'delete'].includes(action)) {
return false;
}
// Check scopes
for (const scope of token.scopes) {
if (this.matchesScope(scope, resource, action)) {
return true;
}
}
return false;
}
/**
* List all tokens for a user
*/
public async listUserTokens(userId: string): Promise<Array<{
key: string;
readonly: boolean;
created: string;
protocol?: TRegistryProtocol;
}>> {
const tokens: Array<{key: string; readonly: boolean; created: string; protocol?: TRegistryProtocol}> = [];
for (const [token, authToken] of this.tokenStore.entries()) {
if (authToken.userId === userId) {
tokens.push({
key: this.hashToken(token),
readonly: authToken.readonly || false,
created: authToken.metadata?.created || 'unknown',
protocol: authToken.type,
});
}
}
return tokens;
}
// ========================================================================
// OCI JWT Token Methods
// ========================================================================
/**
* Create an OCI JWT token
*/
private async createOciToken(
userId: string,
scopes: string[],
expiresIn: number = 3600
): Promise<string> {
if (!this.config.ociTokens?.enabled) {
throw new Error('OCI tokens are not enabled');
}
const now = Math.floor(Date.now() / 1000);
const payload = {
iss: this.config.ociTokens.realm,
sub: userId,
aud: this.config.ociTokens.service,
exp: now + expiresIn,
nbf: now,
iat: now,
access: this.scopesToOciAccess(scopes),
};
// Create JWT with HMAC-SHA256 signature
const header = { alg: 'HS256', typ: 'JWT' };
const headerB64 = Buffer.from(JSON.stringify(header)).toString('base64url');
const payloadB64 = Buffer.from(JSON.stringify(payload)).toString('base64url');
const signature = crypto
.createHmac('sha256', this.config.jwtSecret)
.update(`${headerB64}.${payloadB64}`)
.digest('base64url');
return `${headerB64}.${payloadB64}.${signature}`;
}
/**
* Validate an OCI JWT token
*/
private async validateOciToken(jwt: string): Promise<IAuthToken | null> {
try {
const parts = jwt.split('.');
if (parts.length !== 3) {
return null;
}
const [headerB64, payloadB64, signatureB64] = parts;
// Verify signature
const expectedSignature = crypto
.createHmac('sha256', this.config.jwtSecret)
.update(`${headerB64}.${payloadB64}`)
.digest('base64url');
if (signatureB64 !== expectedSignature) {
return null;
}
// Decode and parse payload
const payload = JSON.parse(Buffer.from(payloadB64, 'base64url').toString('utf-8'));
// Check expiration
const now = Math.floor(Date.now() / 1000);
if (payload.exp && payload.exp < now) {
return null;
}
// Check not-before time
if (payload.nbf && payload.nbf > now) {
return null;
}
// Convert to unified token format
const scopes = this.ociAccessToScopes(payload.access || []);
return {
type: 'oci',
userId: payload.sub,
scopes,
expiresAt: payload.exp ? new Date(payload.exp * 1000) : undefined,
metadata: {
iss: payload.iss,
aud: payload.aud,
},
};
} catch (error) {
return null;
}
}
// ========================================================================
// Helper Methods
// ========================================================================
/**
* Check if a scope matches a resource and action
*/
private matchesScope(scope: string, resource: string, action: string): boolean {
const scopeParts = scope.split(':');
const resourceParts = resource.split(':');
// Scope must have at least protocol:type:name:action
if (scopeParts.length < 4) {
return false;
}
const [scopeProtocol, scopeType, scopeName, scopeAction] = scopeParts;
const [resourceProtocol, resourceType, resourceName] = resourceParts;
// Check protocol
if (scopeProtocol !== '*' && scopeProtocol !== resourceProtocol) {
return false;
}
// Check type
if (scopeType !== '*' && scopeType !== resourceType) {
return false;
}
// Check name
if (scopeName !== '*' && scopeName !== resourceName) {
return false;
}
// Check action
if (scopeAction !== '*' && scopeAction !== action) {
// Map action aliases
const actionAliases: Record<string, string[]> = {
read: ['pull', 'get'],
write: ['push', 'put', 'post'],
};
const aliases = actionAliases[scopeAction] || [];
if (!aliases.includes(action)) {
return false;
}
}
return true;
}
/**
* Convert unified scopes to OCI access array
*/
private scopesToOciAccess(scopes: string[]): Array<{
type: string;
name: string;
actions: string[];
}> {
const access: Array<{type: string; name: string; actions: string[]}> = [];
for (const scope of scopes) {
const parts = scope.split(':');
if (parts.length >= 4 && parts[0] === 'oci') {
access.push({
type: parts[1],
name: parts[2],
actions: [parts[3]],
});
}
}
return access;
}
/**
* Convert OCI access array to unified scopes
*/
private ociAccessToScopes(access: Array<{
type: string;
name: string;
actions: string[];
}>): string[] {
const scopes: string[] = [];
for (const item of access) {
for (const action of item.actions) {
scopes.push(`oci:${item.type}:${item.name}:${action}`);
}
}
return scopes;
}
/**
* Generate UUID for tokens
*/
private generateUuid(): string {
return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, (c) => {
const r = (Math.random() * 16) | 0;
const v = c === 'x' ? r : (r & 0x3) | 0x8;
return v.toString(16);
});
}
/**
* Check if string is a valid UUID
*/
private isValidUuid(str: string): boolean {
const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
return uuidRegex.test(str);
}
/**
* Hash a token for identification
*/
private hashToken(token: string): string {
return `sha512-${token.substring(0, 16)}...`;
}
}
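The scope grammar enforced above is `{protocol}:{type}:{name}:{action}`, with `*` wildcards and read/write action aliases. A minimal standalone copy of the `matchesScope` logic, for experimenting outside the class:

```typescript
// Standalone copy of DefaultAuthProvider.matchesScope, for illustration.
// Scope format: "{protocol}:{type}:{name}:{action}"; "*" matches anything.
function matchesScope(scope: string, resource: string, action: string): boolean {
  const scopeParts = scope.split(':');
  const [resourceProtocol, resourceType, resourceName] = resource.split(':');
  // Scope must have at least protocol:type:name:action
  if (scopeParts.length < 4) return false;
  const [scopeProtocol, scopeType, scopeName, scopeAction] = scopeParts;
  if (scopeProtocol !== '*' && scopeProtocol !== resourceProtocol) return false;
  if (scopeType !== '*' && scopeType !== resourceType) return false;
  if (scopeName !== '*' && scopeName !== resourceName) return false;
  if (scopeAction !== '*' && scopeAction !== action) {
    // "read" also covers pull/get; "write" also covers push/put/post
    const aliases: Record<string, string[]> = {
      read: ['pull', 'get'],
      write: ['push', 'put', 'post'],
    };
    return (aliases[scopeAction] || []).includes(action);
  }
  return true;
}
```

A readonly token created with scope `{protocol}:*:*:read` therefore authorizes `read`, `pull`, and `get` on any resource of that protocol, but nothing that maps to write.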

View File

@@ -1,17 +1,54 @@
import * as plugins from '../plugins.js'; import * as plugins from '../plugins.js';
import type { IStorageConfig, IStorageBackend } from './interfaces.core.js'; import type { IStorageConfig, IStorageBackend, TRegistryProtocol } from './interfaces.core.js';
import type {
IStorageHooks,
IStorageHookContext,
IStorageActor,
IStorageMetadata,
} from './interfaces.storage.js';
/** /**
* Storage abstraction layer for registry * Storage abstraction layer for registry.
* Provides a unified interface over SmartBucket * Provides a unified interface over SmartBucket with optional hooks
* for quota tracking, audit logging, cache invalidation, etc.
*
* @example
* ```typescript
* // Basic usage
* const storage = new RegistryStorage(config);
*
* // With hooks for quota tracking
* const storage = new RegistryStorage(config, {
* beforePut: async (ctx) => {
* const quota = await getQuota(ctx.actor?.orgId);
* const usage = await getUsage(ctx.actor?.orgId);
* if (usage + (ctx.metadata?.size || 0) > quota) {
* return { allowed: false, reason: 'Quota exceeded' };
* }
* return { allowed: true };
* },
* afterPut: async (ctx) => {
* await updateUsage(ctx.actor?.orgId, ctx.metadata?.size || 0);
* }
* });
* ```
*/ */
export class RegistryStorage implements IStorageBackend { export class RegistryStorage implements IStorageBackend {
private smartBucket: plugins.smartbucket.SmartBucket; private smartBucket: plugins.smartbucket.SmartBucket;
private bucket: plugins.smartbucket.Bucket; private bucket: plugins.smartbucket.Bucket;
private bucketName: string; private bucketName: string;
private hooks?: IStorageHooks;
constructor(private config: IStorageConfig) { constructor(private config: IStorageConfig, hooks?: IStorageHooks) {
this.bucketName = config.bucketName; this.bucketName = config.bucketName;
this.hooks = hooks;
}
/**
* Set storage hooks (can be called after construction)
*/
public setHooks(hooks: IStorageHooks): void {
this.hooks = hooks;
} }
/** /**
@@ -34,7 +71,24 @@ export class RegistryStorage implements IStorageBackend {
*/ */
public async getObject(key: string): Promise<Buffer | null> { public async getObject(key: string): Promise<Buffer | null> {
try { try {
return await this.bucket.fastGet({ path: key }); const data = await this.bucket.fastGet({ path: key });
// Call afterGet hook (non-blocking)
if (this.hooks?.afterGet && data) {
const context = this.currentContext;
if (context) {
this.hooks.afterGet({
operation: 'get',
key,
protocol: context.protocol,
actor: context.actor,
metadata: context.metadata,
timestamp: new Date(),
}).catch(() => {}); // Don't fail on hook errors
}
}
return data;
} catch (error) { } catch (error) {
return null; return null;
} }
@@ -48,19 +102,159 @@ export class RegistryStorage implements IStorageBackend {
data: Buffer, data: Buffer,
metadata?: Record<string, string> metadata?: Record<string, string>
): Promise<void> { ): Promise<void> {
// Call beforePut hook if available
if (this.hooks?.beforePut) {
const context = this.currentContext;
if (context) {
const hookContext: IStorageHookContext = {
operation: 'put',
key,
protocol: context.protocol,
actor: context.actor,
metadata: {
...context.metadata,
size: data.length,
},
timestamp: new Date(),
};
const result = await this.hooks.beforePut(hookContext);
if (!result.allowed) {
throw new Error(result.reason || 'Storage operation denied by hook');
}
}
}
// Note: SmartBucket doesn't support metadata yet // Note: SmartBucket doesn't support metadata yet
await this.bucket.fastPut({ await this.bucket.fastPut({
path: key, path: key,
contents: data, contents: data,
overwrite: true, // Always overwrite existing objects overwrite: true, // Always overwrite existing objects
}); });
// Call afterPut hook (non-blocking)
if (this.hooks?.afterPut) {
const context = this.currentContext;
if (context) {
this.hooks.afterPut({
operation: 'put',
key,
protocol: context.protocol,
actor: context.actor,
metadata: {
...context.metadata,
size: data.length,
},
timestamp: new Date(),
}).catch(() => {}); // Don't fail on hook errors
}
}
} }
/** /**
* Delete an object * Delete an object
*/ */
public async deleteObject(key: string): Promise<void> { public async deleteObject(key: string): Promise<void> {
// Call beforeDelete hook if available
if (this.hooks?.beforeDelete) {
const context = this.currentContext;
if (context) {
const hookContext: IStorageHookContext = {
operation: 'delete',
key,
protocol: context.protocol,
actor: context.actor,
metadata: context.metadata,
timestamp: new Date(),
};
const result = await this.hooks.beforeDelete(hookContext);
if (!result.allowed) {
throw new Error(result.reason || 'Delete operation denied by hook');
}
}
}
await this.bucket.fastRemove({ path: key }); await this.bucket.fastRemove({ path: key });
// Call afterDelete hook (non-blocking)
if (this.hooks?.afterDelete) {
const context = this.currentContext;
if (context) {
this.hooks.afterDelete({
operation: 'delete',
key,
protocol: context.protocol,
actor: context.actor,
metadata: context.metadata,
timestamp: new Date(),
}).catch(() => {}); // Don't fail on hook errors
}
}
}
// ========================================================================
// CONTEXT FOR HOOKS
// ========================================================================
/**
* Current operation context for hooks.
* Set this before performing storage operations to enable hooks.
*/
private currentContext?: {
protocol: TRegistryProtocol;
actor?: IStorageActor;
metadata?: IStorageMetadata;
};
/**
* Set the current operation context for hooks.
* Call this before performing storage operations.
*
* @example
* ```typescript
* storage.setContext({
* protocol: 'npm',
* actor: { userId: 'user123', ip: '192.168.1.1' },
* metadata: { packageName: 'lodash', version: '4.17.21' }
* });
* await storage.putNpmTarball('lodash', '4.17.21', tarball);
* storage.clearContext();
* ```
*/
public setContext(context: {
protocol: TRegistryProtocol;
actor?: IStorageActor;
metadata?: IStorageMetadata;
}): void {
this.currentContext = context;
}
/**
* Clear the current operation context.
*/
public clearContext(): void {
this.currentContext = undefined;
}
/**
* Execute a function with a temporary context.
* Context is automatically cleared after execution.
*/
public async withContext<T>(
context: {
protocol: TRegistryProtocol;
actor?: IStorageActor;
metadata?: IStorageMetadata;
},
fn: () => Promise<T>
): Promise<T> {
this.setContext(context);
try {
return await fn();
} finally {
this.clearContext();
}
}
/**

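The `withContext` helper is the usual set/try/finally pattern. A minimal standalone sketch of the same mechanics (the names here are illustrative stand-ins, not the real `RegistryStorage` API):

```typescript
// Minimal stand-in mirroring withContext(): set state, run, always clear.
type Ctx = { protocol: string; actor?: { userId?: string } };

class ContextHolder {
  private current?: Ctx;

  setContext(ctx: Ctx): void { this.current = ctx; }
  clearContext(): void { this.current = undefined; }
  getContext(): Ctx | undefined { return this.current; }

  // Context is cleared even if fn throws, thanks to finally.
  async withContext<T>(ctx: Ctx, fn: () => Promise<T>): Promise<T> {
    this.setContext(ctx);
    try {
      return await fn();
    } finally {
      this.clearContext();
    }
  }
}

async function demo(): Promise<void> {
  const holder = new ContextHolder();
  const seen = await holder.withContext(
    { protocol: 'npm', actor: { userId: 'user123' } },
    async () => holder.getContext()?.protocol,
  );
  console.log(seen);                // 'npm' inside the callback
  console.log(holder.getContext()); // undefined afterwards
}
demo();
```

The `finally` is what makes `withContext` safer than manual `setContext`/`clearContext` pairs: a thrown error inside the callback cannot leak the actor context into the next storage operation.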
View File

@@ -2,9 +2,16 @@
 * Core registry infrastructure exports
 */
// Core interfaces
export * from './interfaces.core.js';
// Auth interfaces and provider
export * from './interfaces.auth.js';
export { DefaultAuthProvider } from './classes.defaultauthprovider.js';
// Storage interfaces and hooks
export * from './interfaces.storage.js';
// Classes
export { BaseRegistry } from './classes.baseregistry.js';
export { RegistryStorage } from './classes.registrystorage.js';

View File

@@ -0,0 +1,91 @@
import type { IAuthToken, ICredentials, TRegistryProtocol } from './interfaces.core.js';
/**
* Options for creating a token
*/
export interface ITokenOptions {
/** Whether the token is readonly */
readonly?: boolean;
/** Permission scopes */
scopes?: string[];
/** Expiration time in seconds */
expiresIn?: number;
}
/**
* Pluggable authentication provider interface.
* Implement this to integrate external auth systems (LDAP, OAuth, SSO, OIDC).
*
* @example
* ```typescript
* class LdapAuthProvider implements IAuthProvider {
* constructor(private ldap: LdapClient, private redis: RedisClient) {}
*
* async authenticate(credentials: ICredentials): Promise<string | null> {
* return await this.ldap.bind(credentials.username, credentials.password);
* }
*
* async validateToken(token: string): Promise<IAuthToken | null> {
* return await this.redis.get(`token:${token}`);
* }
* // ...
* }
* ```
*/
export interface IAuthProvider {
/**
* Initialize the auth provider (optional)
*/
init?(): Promise<void>;
/**
* Authenticate user credentials (login flow)
* @param credentials - Username and password
* @returns User ID on success, null on failure
*/
authenticate(credentials: ICredentials): Promise<string | null>;
/**
* Validate an existing token
* @param token - Token string (UUID or JWT)
* @param protocol - Optional protocol hint for optimization
* @returns Auth token info or null if invalid
*/
validateToken(token: string, protocol?: TRegistryProtocol): Promise<IAuthToken | null>;
/**
* Create a new token for a user
* @param userId - User ID
* @param protocol - Protocol type (npm, oci, maven, etc.)
* @param options - Token options (readonly, scopes, expiration)
* @returns Token string
*/
createToken(userId: string, protocol: TRegistryProtocol, options?: ITokenOptions): Promise<string>;
/**
* Revoke a token
* @param token - Token string to revoke
*/
revokeToken(token: string): Promise<void>;
/**
* Check if user has permission for an action
* @param token - Auth token (or null for anonymous)
* @param resource - Resource being accessed (e.g., "npm:package:lodash")
* @param action - Action being performed (read, write, push, pull, delete)
* @returns true if authorized
*/
authorize(token: IAuthToken | null, resource: string, action: string): Promise<boolean>;
/**
* List all tokens for a user (optional)
* @param userId - User ID
* @returns List of token info
*/
listUserTokens?(userId: string): Promise<Array<{
key: string;
readonly: boolean;
created: string;
protocol?: TRegistryProtocol;
}>>;
}

View File

@@ -3,6 +3,9 @@
 */
import type * as plugins from '../plugins.js';
import type { IProtocolUpstreamConfig } from '../upstream/interfaces.upstream.js';
import type { IAuthProvider } from './interfaces.auth.js';
import type { IStorageHooks } from './interfaces.storage.js';
/**
 * Registry protocol types
@@ -86,6 +89,8 @@ export interface IProtocolConfig {
enabled: boolean;
basePath: string;
features?: Record<string, boolean>;
/** Upstream registry configuration for proxying/caching */
upstream?: IProtocolUpstreamConfig;
}
/**
@@ -94,6 +99,20 @@ export interface IProtocolConfig {
export interface IRegistryConfig {
storage: IStorageConfig;
auth: IAuthConfig;
/**
* Custom authentication provider.
* If not provided, uses the default in-memory auth provider.
* Implement IAuthProvider to integrate LDAP, OAuth, SSO, etc.
*/
authProvider?: IAuthProvider;
/**
* Storage event hooks for quota tracking, audit logging, etc.
* Called before/after storage operations.
*/
storageHooks?: IStorageHooks;
oci?: IProtocolConfig;
npm?: IProtocolConfig;
maven?: IProtocolConfig;
@@ -149,6 +168,24 @@ export interface IRegistryError {
}>;
}
/**
* Actor information - identifies who is performing the request
*/
export interface IRequestActor {
/** User ID (from validated token) */
userId?: string;
/** Token ID/hash for audit purposes */
tokenId?: string;
/** Client IP address */
ip?: string;
/** Client User-Agent */
userAgent?: string;
/** Organization ID (for multi-tenant setups) */
orgId?: string;
/** Session ID */
sessionId?: string;
}
/**
 * Base request context
 */
@@ -165,6 +202,11 @@ export interface IRequestContext {
 */
rawBody?: Buffer;
token?: string;
/**
* Actor information - identifies who is performing the request.
* Populated after authentication for audit logging, quota enforcement, etc.
*/
actor?: IRequestActor;
}
/**

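The new `storageHooks` config field feeds the `beforePut`/`afterPut` enforcement shown earlier: `beforePut` can veto a write, while `afterPut` is fire-and-forget. A self-contained sketch of that flow as a quota guard (all names here are illustrative stand-ins for the real interfaces):

```typescript
// Illustrative stand-ins for the hook context and result shapes.
interface HookContext { key: string; size: number; orgId?: string }
interface BeforePutResult { allowed: boolean; reason?: string }

// A quota hook in the style of IStorageHooks.beforePut: deny writes past a cap.
const QUOTA_BYTES = 1024;
const usage = new Map<string, number>();

async function beforePut(ctx: HookContext): Promise<BeforePutResult> {
  const used = usage.get(ctx.orgId ?? 'default') ?? 0;
  if (used + ctx.size > QUOTA_BYTES) {
    return { allowed: false, reason: 'Quota exceeded' };
  }
  return { allowed: true };
}

async function afterPut(ctx: HookContext): Promise<void> {
  const org = ctx.orgId ?? 'default';
  usage.set(org, (usage.get(org) ?? 0) + ctx.size);
}

// Mirrors the enforcement order in the storage layer:
// beforePut blocks the operation, afterPut is fire-and-forget.
async function putObject(key: string, data: Uint8Array, orgId?: string): Promise<void> {
  const result = await beforePut({ key, size: data.length, orgId });
  if (!result.allowed) throw new Error(result.reason ?? 'Put operation denied by hook');
  // ...actual bucket write would happen here...
  afterPut({ key, size: data.length, orgId }).catch(() => {}); // don't fail on hook errors
}
```

With a 1 KiB cap, a 1000-byte write succeeds and a subsequent 100-byte write for the same org is rejected, while a different org keeps its own budget.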
View File

@@ -0,0 +1,130 @@
import type { TRegistryProtocol } from './interfaces.core.js';
/**
* Actor information from request context
*/
export interface IStorageActor {
userId?: string;
tokenId?: string;
ip?: string;
userAgent?: string;
orgId?: string;
sessionId?: string;
}
/**
* Metadata about the storage operation
*/
export interface IStorageMetadata {
/** Content type of the object */
contentType?: string;
/** Size in bytes */
size?: number;
/** Content digest (e.g., sha256:abc123) */
digest?: string;
/** Package/artifact name */
packageName?: string;
/** Version */
version?: string;
}
/**
* Context passed to storage hooks
*/
export interface IStorageHookContext {
/** Type of operation */
operation: 'put' | 'delete' | 'get';
/** Storage key/path */
key: string;
/** Protocol that triggered this operation */
protocol: TRegistryProtocol;
/** Actor who performed the operation (if known) */
actor?: IStorageActor;
/** Metadata about the object */
metadata?: IStorageMetadata;
/** Timestamp of the operation */
timestamp: Date;
}
/**
* Result from a beforePut hook that can modify the operation
*/
export interface IBeforePutResult {
/** Whether to allow the operation */
allowed: boolean;
/** Optional reason for rejection */
reason?: string;
/** Optional modified metadata */
metadata?: IStorageMetadata;
}
/**
* Result from a beforeDelete hook
*/
export interface IBeforeDeleteResult {
/** Whether to allow the operation */
allowed: boolean;
/** Optional reason for rejection */
reason?: string;
}
/**
* Storage event hooks for quota tracking, audit logging, cache invalidation, etc.
*
* @example
* ```typescript
* const quotaHooks: IStorageHooks = {
* async beforePut(context) {
* const quota = await getQuota(context.actor?.orgId);
* const currentUsage = await getUsage(context.actor?.orgId);
* if (currentUsage + (context.metadata?.size || 0) > quota) {
* return { allowed: false, reason: 'Quota exceeded' };
* }
* return { allowed: true };
* },
*
* async afterPut(context) {
* await updateUsage(context.actor?.orgId, context.metadata?.size || 0);
* await auditLog('storage.put', context);
* },
*
* async afterDelete(context) {
* await invalidateCache(context.key);
* }
* };
* ```
*/
export interface IStorageHooks {
/**
* Called before storing an object.
* Return { allowed: false } to reject the operation.
* Use for quota checks, virus scanning, validation, etc.
*/
beforePut?(context: IStorageHookContext): Promise<IBeforePutResult>;
/**
* Called after successfully storing an object.
* Use for quota tracking, audit logging, notifications, etc.
*/
afterPut?(context: IStorageHookContext): Promise<void>;
/**
* Called before deleting an object.
* Return { allowed: false } to reject the operation.
* Use for preventing deletion of protected resources.
*/
beforeDelete?(context: IStorageHookContext): Promise<IBeforeDeleteResult>;
/**
* Called after successfully deleting an object.
* Use for quota updates, audit logging, cache invalidation, etc.
*/
afterDelete?(context: IStorageHookContext): Promise<void>;
/**
* Called after reading an object.
* Use for access logging, analytics, etc.
* Note: This is called even for cache hits.
*/
afterGet?(context: IStorageHookContext): Promise<void>;
}

View File

@@ -9,6 +9,9 @@ export { SmartRegistry } from './classes.smartregistry.js';
// Core infrastructure
export * from './core/index.js';
// Upstream infrastructure
export * from './upstream/index.js';
// OCI Registry
export * from './oci/index.js';

View File

@@ -7,6 +7,7 @@ import { BaseRegistry } from '../core/classes.baseregistry.js';
import type { RegistryStorage } from '../core/classes.registrystorage.js';
import type { AuthManager } from '../core/classes.authmanager.js';
import type { IRequestContext, IResponse, IAuthToken } from '../core/interfaces.core.js';
import type { IProtocolUpstreamConfig } from '../upstream/interfaces.upstream.js';
import { toBuffer } from '../core/helpers.buffer.js';
import type { IMavenCoordinate, IMavenMetadata, IChecksums } from './interfaces.maven.js';
import {
@@ -21,6 +22,7 @@ import {
extractGAVFromPom,
gavToPath,
} from './helpers.maven.js';
import { MavenUpstream } from './classes.mavenupstream.js';
/**
 * Maven Registry class
@@ -31,18 +33,34 @@ export class MavenRegistry extends BaseRegistry {
private authManager: AuthManager;
private basePath: string = '/maven';
private registryUrl: string;
private upstream: MavenUpstream | null = null;
constructor(
storage: RegistryStorage,
authManager: AuthManager,
basePath: string,
registryUrl: string,
upstreamConfig?: IProtocolUpstreamConfig
) {
super();
this.storage = storage;
this.authManager = authManager;
this.basePath = basePath;
this.registryUrl = registryUrl;
// Initialize upstream if configured
if (upstreamConfig?.enabled) {
this.upstream = new MavenUpstream(upstreamConfig);
}
}
/**
* Clean up resources (timers, connections, etc.)
*/
public destroy(): void {
if (this.upstream) {
this.upstream.stop();
}
}
public async init(): Promise<void> {
@@ -234,7 +252,23 @@ export class MavenRegistry extends BaseRegistry {
version: string,
filename: string
): Promise<IResponse> {
let data = await this.storage.getMavenArtifact(groupId, artifactId, version, filename);
// Try upstream if not found locally
if (!data && this.upstream) {
// Parse the filename to extract extension and classifier
const { extension, classifier } = this.parseFilename(filename, artifactId, version);
if (extension) {
data = await this.upstream.fetchArtifact(groupId, artifactId, version, extension, classifier);
if (data) {
// Cache the artifact locally
await this.storage.putMavenArtifact(groupId, artifactId, version, filename, data);
// Generate and store checksums
const checksums = await calculateChecksums(data);
await this.storeChecksums(groupId, artifactId, version, filename, checksums);
}
}
}
if (!data) {
return {
@@ -462,7 +496,17 @@ export class MavenRegistry extends BaseRegistry {
// ========================================================================
private async getMetadata(groupId: string, artifactId: string): Promise<IResponse> {
let metadataBuffer = await this.storage.getMavenMetadata(groupId, artifactId);
// Try upstream if not found locally
if (!metadataBuffer && this.upstream) {
const upstreamMetadata = await this.upstream.fetchMetadata(groupId, artifactId);
if (upstreamMetadata) {
metadataBuffer = Buffer.from(upstreamMetadata, 'utf-8');
// Cache the metadata locally
await this.storage.putMavenMetadata(groupId, artifactId, metadataBuffer);
}
}
if (!metadataBuffer) {
// Generate empty metadata if none exists
@@ -578,4 +622,41 @@ export class MavenRegistry extends BaseRegistry {
return contentTypes[extension] || 'application/octet-stream';
}
/**
* Parse a Maven filename to extract extension and classifier.
* Filename format: {artifactId}-{version}[-{classifier}].{extension}
*/
private parseFilename(
filename: string,
artifactId: string,
version: string
): { extension: string; classifier?: string } {
const prefix = `${artifactId}-${version}`;
if (!filename.startsWith(prefix)) {
// Fallback: just get the extension
const lastDot = filename.lastIndexOf('.');
return { extension: lastDot > 0 ? filename.slice(lastDot + 1) : '' };
}
const remainder = filename.slice(prefix.length);
// remainder is either ".extension" or "-classifier.extension"
if (remainder.startsWith('.')) {
return { extension: remainder.slice(1) };
}
if (remainder.startsWith('-')) {
const lastDot = remainder.lastIndexOf('.');
if (lastDot > 1) {
return {
classifier: remainder.slice(1, lastDot),
extension: remainder.slice(lastDot + 1),
};
}
}
return { extension: '' };
}
}

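The parsing rules above can be exercised in isolation. A standalone copy of the `parseFilename` logic with illustrative inputs:

```typescript
// Standalone copy of parseFilename's logic for
// {artifactId}-{version}[-{classifier}].{extension}
function parseFilename(
  filename: string,
  artifactId: string,
  version: string,
): { extension: string; classifier?: string } {
  const prefix = `${artifactId}-${version}`;
  if (!filename.startsWith(prefix)) {
    // Fallback: just take the extension after the last dot.
    const lastDot = filename.lastIndexOf('.');
    return { extension: lastDot > 0 ? filename.slice(lastDot + 1) : '' };
  }
  const remainder = filename.slice(prefix.length);
  // remainder is either ".extension" or "-classifier.extension"
  if (remainder.startsWith('.')) {
    return { extension: remainder.slice(1) };
  }
  if (remainder.startsWith('-')) {
    const lastDot = remainder.lastIndexOf('.');
    if (lastDot > 1) {
      return { classifier: remainder.slice(1, lastDot), extension: remainder.slice(lastDot + 1) };
    }
  }
  return { extension: '' };
}

console.log(parseFilename('guava-33.0.0.jar', 'guava', '33.0.0'));
// → { extension: 'jar' }
console.log(parseFilename('guava-33.0.0-sources.jar', 'guava', '33.0.0'));
// → { classifier: 'sources', extension: 'jar' }
```

Classifier-bearing filenames like `-sources.jar` or `-javadoc.jar` are the reason the simple "split on last dot" fallback is not enough: the classifier must be recovered so the upstream fetch requests the correct artifact variant.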
View File

@@ -0,0 +1,220 @@
import * as plugins from '../plugins.js';
import { BaseUpstream } from '../upstream/classes.baseupstream.js';
import type {
IProtocolUpstreamConfig,
IUpstreamFetchContext,
IUpstreamRegistryConfig,
} from '../upstream/interfaces.upstream.js';
import type { IMavenCoordinate } from './interfaces.maven.js';
/**
* Maven-specific upstream implementation.
*
* Handles:
* - Artifact fetching (JAR, POM, WAR, etc.)
* - Metadata fetching (maven-metadata.xml)
* - Checksum files (.md5, .sha1, .sha256, .sha512)
* - SNAPSHOT version handling
* - Content-addressable caching for release artifacts
*/
export class MavenUpstream extends BaseUpstream {
protected readonly protocolName = 'maven';
constructor(
config: IProtocolUpstreamConfig,
logger?: plugins.smartlog.Smartlog,
) {
super(config, logger);
}
/**
* Fetch an artifact from upstream registries.
*/
public async fetchArtifact(
groupId: string,
artifactId: string,
version: string,
extension: string,
classifier?: string,
): Promise<Buffer | null> {
const path = this.buildArtifactPath(groupId, artifactId, version, extension, classifier);
const resource = `${groupId}:${artifactId}`;
const context: IUpstreamFetchContext = {
protocol: 'maven',
resource,
resourceType: 'artifact',
path,
method: 'GET',
headers: {},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
return Buffer.isBuffer(result.body) ? result.body : Buffer.from(result.body);
}
/**
* Fetch maven-metadata.xml from upstream.
*/
public async fetchMetadata(groupId: string, artifactId: string, version?: string): Promise<string | null> {
const groupPath = groupId.replace(/\./g, '/');
let path: string;
if (version) {
// Version-level metadata (for SNAPSHOTs)
path = `/${groupPath}/${artifactId}/${version}/maven-metadata.xml`;
} else {
// Artifact-level metadata (lists all versions)
path = `/${groupPath}/${artifactId}/maven-metadata.xml`;
}
const resource = `${groupId}:${artifactId}`;
const context: IUpstreamFetchContext = {
protocol: 'maven',
resource,
resourceType: 'metadata',
path,
method: 'GET',
headers: {
'accept': 'application/xml, text/xml',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return result.body.toString('utf8');
}
return typeof result.body === 'string' ? result.body : null;
}
/**
* Fetch a checksum file from upstream.
*/
public async fetchChecksum(
groupId: string,
artifactId: string,
version: string,
extension: string,
checksumType: 'md5' | 'sha1' | 'sha256' | 'sha512',
classifier?: string,
): Promise<string | null> {
const basePath = this.buildArtifactPath(groupId, artifactId, version, extension, classifier);
const path = `${basePath}.${checksumType}`;
const resource = `${groupId}:${artifactId}`;
const context: IUpstreamFetchContext = {
protocol: 'maven',
resource,
resourceType: 'checksum',
path,
method: 'GET',
headers: {
'accept': 'text/plain',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return result.body.toString('utf8').trim();
}
return typeof result.body === 'string' ? result.body.trim() : null;
}
/**
* Check if an artifact exists in upstream (HEAD request).
*/
public async headArtifact(
groupId: string,
artifactId: string,
version: string,
extension: string,
classifier?: string,
): Promise<{ exists: boolean; size?: number; lastModified?: string } | null> {
const path = this.buildArtifactPath(groupId, artifactId, version, extension, classifier);
const resource = `${groupId}:${artifactId}`;
const context: IUpstreamFetchContext = {
protocol: 'maven',
resource,
resourceType: 'artifact',
path,
method: 'HEAD',
headers: {},
query: {},
};
const result = await this.fetch(context);
if (!result) {
return null;
}
if (!result.success) {
return { exists: false };
}
return {
exists: true,
size: result.headers['content-length'] ? parseInt(result.headers['content-length'], 10) : undefined,
lastModified: result.headers['last-modified'],
};
}
/**
* Build the path for a Maven artifact.
*/
private buildArtifactPath(
groupId: string,
artifactId: string,
version: string,
extension: string,
classifier?: string,
): string {
const groupPath = groupId.replace(/\./g, '/');
let filename = `${artifactId}-${version}`;
if (classifier) {
filename += `-${classifier}`;
}
filename += `.${extension}`;
return `/${groupPath}/${artifactId}/${version}/${filename}`;
}
/**
* Override URL building for Maven-specific handling.
*/
protected buildUpstreamUrl(
upstream: IUpstreamRegistryConfig,
context: IUpstreamFetchContext,
): string {
let baseUrl = upstream.url;
// Remove trailing slash
if (baseUrl.endsWith('/')) {
baseUrl = baseUrl.slice(0, -1);
}
return `${baseUrl}${context.path}`;
}
}

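The path these helpers build follows the standard Maven repository layout. A standalone sketch of the `buildArtifactPath` rule:

```typescript
// Standard Maven layout:
// /{group dots→slashes}/{artifact}/{version}/{artifact}-{version}[-{classifier}].{ext}
function buildArtifactPath(
  groupId: string,
  artifactId: string,
  version: string,
  extension: string,
  classifier?: string,
): string {
  const groupPath = groupId.replace(/\./g, '/');
  let filename = `${artifactId}-${version}`;
  if (classifier) filename += `-${classifier}`;
  filename += `.${extension}`;
  return `/${groupPath}/${artifactId}/${version}/${filename}`;
}

console.log(buildArtifactPath('com.google.guava', 'guava', '33.0.0', 'jar'));
// → /com/google/guava/guava/33.0.0/guava-33.0.0.jar
console.log(buildArtifactPath('org.example', 'lib', '1.0', 'jar', 'sources'));
// → /org/example/lib/1.0/lib-1.0-sources.jar
```

Checksum paths are the same string with `.md5`/`.sha1`/`.sha256`/`.sha512` appended, which is exactly how `fetchChecksum` derives its request path.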
View File

@@ -3,5 +3,6 @@
 */
export { MavenRegistry } from './classes.mavenregistry.js';
export { MavenUpstream } from './classes.mavenupstream.js';
export * from './interfaces.maven.js';
export * from './helpers.maven.js';

View File

@@ -3,6 +3,8 @@ import { BaseRegistry } from '../core/classes.baseregistry.js';
import { RegistryStorage } from '../core/classes.registrystorage.js';
import { AuthManager } from '../core/classes.authmanager.js';
import type { IRequestContext, IResponse, IAuthToken } from '../core/interfaces.core.js';
import type { IProtocolUpstreamConfig } from '../upstream/interfaces.upstream.js';
import { NpmUpstream } from './classes.npmupstream.js';
import type {
IPackument,
INpmVersion,
@@ -25,12 +27,14 @@ export class NpmRegistry extends BaseRegistry {
private basePath: string = '/npm';
private registryUrl: string;
private logger: Smartlog;
private upstream: NpmUpstream | null = null;
constructor(
storage: RegistryStorage,
authManager: AuthManager,
basePath: string = '/npm',
registryUrl: string = 'http://localhost:5000/npm',
upstreamConfig?: IProtocolUpstreamConfig
) {
super();
this.storage = storage;
@@ -50,6 +54,14 @@ export class NpmRegistry extends BaseRegistry {
}
});
this.logger.enableConsole();
// Initialize upstream if configured
if (upstreamConfig?.enabled) {
this.upstream = new NpmUpstream(upstreamConfig, registryUrl, this.logger);
this.logger.log('info', 'NPM upstream initialized', {
upstreams: upstreamConfig.upstreams.map(u => u.name),
});
}
}
public async init(): Promise<void> {
@@ -209,13 +221,28 @@ export class NpmRegistry extends BaseRegistry {
token: IAuthToken | null,
query: Record<string, string>
): Promise<IResponse> {
let packument = await this.storage.getNpmPackument(packageName);
this.logger.log('debug', `getPackument: ${packageName}`, {
packageName,
found: !!packument,
versions: packument ? Object.keys(packument.versions).length : 0
});
// If not found locally, try upstream
if (!packument && this.upstream) {
this.logger.log('debug', `getPackument: fetching from upstream`, { packageName });
const upstreamPackument = await this.upstream.fetchPackument(packageName);
if (upstreamPackument) {
this.logger.log('debug', `getPackument: found in upstream`, {
packageName,
versions: Object.keys(upstreamPackument.versions || {}).length
});
packument = upstreamPackument;
// Optionally cache the packument locally (without tarballs)
// We don't store tarballs here - they'll be fetched on demand
}
}
if (!packument) {
return {
status: 404,
@@ -255,11 +282,21 @@ export class NpmRegistry extends BaseRegistry {
token: IAuthToken | null
): Promise<IResponse> {
this.logger.log('debug', 'handlePackageVersion', { packageName, version });
let packument = await this.storage.getNpmPackument(packageName);
this.logger.log('debug', 'handlePackageVersion packument', { found: !!packument });
if (packument) {
this.logger.log('debug', 'handlePackageVersion versions', { versions: Object.keys(packument.versions || {}) });
}
// If not found locally, try upstream
if (!packument && this.upstream) {
this.logger.log('debug', 'handlePackageVersion: fetching from upstream', { packageName });
const upstreamPackument = await this.upstream.fetchPackument(packageName);
if (upstreamPackument) {
packument = upstreamPackument;
}
}
if (!packument) {
return {
status: 404,
@@ -529,7 +566,7 @@ export class NpmRegistry extends BaseRegistry {
token: IAuthToken | null
): Promise<IResponse> {
// Extract version from filename: package-name-1.0.0.tgz
const versionMatch = filename.match(/-([\d.]+(?:-[a-z0-9.]+)?)\.tgz$/i);
if (!versionMatch) {
return {
status: 400,
@@ -539,7 +576,26 @@ export class NpmRegistry extends BaseRegistry {
}
const version = versionMatch[1];
let tarball = await this.storage.getNpmTarball(packageName, version);
// If not found locally, try upstream
if (!tarball && this.upstream) {
this.logger.log('debug', 'handleTarballDownload: fetching from upstream', {
packageName,
version,
});
const upstreamTarball = await this.upstream.fetchTarball(packageName, version);
if (upstreamTarball) {
tarball = upstreamTarball;
// Cache the tarball locally for future requests
await this.storage.putNpmTarball(packageName, version, tarball);
this.logger.log('debug', 'handleTarballDownload: cached tarball locally', {
packageName,
version,
size: tarball.length,
});
}
}
if (!tarball) {
return {

View File

@@ -0,0 +1,260 @@
import * as plugins from '../plugins.js';
import { BaseUpstream } from '../upstream/classes.baseupstream.js';
import type {
IProtocolUpstreamConfig,
IUpstreamFetchContext,
IUpstreamResult,
IUpstreamRegistryConfig,
} from '../upstream/interfaces.upstream.js';
import type { IPackument, INpmVersion } from './interfaces.npm.js';
/**
* NPM-specific upstream implementation.
*
* Handles:
* - Package metadata (packument) fetching
* - Tarball proxying
* - Scoped package routing (@scope/* patterns)
* - NPM-specific URL rewriting
*/
export class NpmUpstream extends BaseUpstream {
protected readonly protocolName = 'npm';
/** Local registry URL for rewriting tarball URLs */
private readonly localRegistryUrl: string;
constructor(
config: IProtocolUpstreamConfig,
localRegistryUrl: string,
logger?: plugins.smartlog.Smartlog,
) {
super(config, logger);
this.localRegistryUrl = localRegistryUrl;
}
/**
* Fetch a packument from upstream registries.
*/
public async fetchPackument(packageName: string): Promise<IPackument | null> {
const context: IUpstreamFetchContext = {
protocol: 'npm',
resource: packageName,
resourceType: 'packument',
path: `/${encodeURIComponent(packageName).replace('%40', '@')}`,
method: 'GET',
headers: {
'accept': 'application/json',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
// Parse and process packument
let packument: IPackument;
if (Buffer.isBuffer(result.body)) {
packument = JSON.parse(result.body.toString('utf8'));
} else {
packument = result.body;
}
// Rewrite tarball URLs to point to local registry
packument = this.rewriteTarballUrls(packument);
return packument;
}
/**
* Fetch a specific version from upstream registries.
*/
public async fetchVersion(packageName: string, version: string): Promise<INpmVersion | null> {
const context: IUpstreamFetchContext = {
protocol: 'npm',
resource: packageName,
resourceType: 'version',
path: `/${encodeURIComponent(packageName).replace('%40', '@')}/${version}`,
method: 'GET',
headers: {
'accept': 'application/json',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
let versionData: INpmVersion;
if (Buffer.isBuffer(result.body)) {
versionData = JSON.parse(result.body.toString('utf8'));
} else {
versionData = result.body;
}
// Rewrite tarball URL
if (versionData.dist?.tarball) {
versionData.dist.tarball = this.rewriteSingleTarballUrl(
packageName,
versionData.version,
versionData.dist.tarball,
);
}
return versionData;
}
/**
* Fetch a tarball from upstream registries.
*/
public async fetchTarball(packageName: string, version: string): Promise<Buffer | null> {
// First, try to get the tarball URL from packument
const packument = await this.fetchPackument(packageName);
let tarballPath: string;
if (packument?.versions?.[version]?.dist?.tarball) {
// Extract path from original (upstream) tarball URL
const tarballUrl = packument.versions[version].dist.tarball;
try {
const url = new URL(tarballUrl);
tarballPath = url.pathname;
} catch {
// Fallback to standard NPM tarball path
tarballPath = this.buildTarballPath(packageName, version);
}
} else {
tarballPath = this.buildTarballPath(packageName, version);
}
const context: IUpstreamFetchContext = {
protocol: 'npm',
resource: packageName,
resourceType: 'tarball',
path: tarballPath,
method: 'GET',
headers: {
'accept': 'application/octet-stream',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
return Buffer.isBuffer(result.body) ? result.body : Buffer.from(result.body);
}
/**
* Search packages in upstream registries.
*/
public async search(text: string, size: number = 20, from: number = 0): Promise<any | null> {
const context: IUpstreamFetchContext = {
protocol: 'npm',
resource: '*',
resourceType: 'search',
path: '/-/v1/search',
method: 'GET',
headers: {
'accept': 'application/json',
},
query: {
text,
size: size.toString(),
from: from.toString(),
},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return JSON.parse(result.body.toString('utf8'));
}
return result.body;
}
/**
* Build the standard NPM tarball path.
*/
private buildTarballPath(packageName: string, version: string): string {
// NPM uses: /{package}/-/{package-name}-{version}.tgz
// For scoped packages: /@scope/name/-/name-version.tgz
if (packageName.startsWith('@')) {
const [scope, name] = packageName.split('/');
return `/${scope}/${name}/-/${name}-${version}.tgz`;
} else {
return `/${packageName}/-/${packageName}-${version}.tgz`;
}
}
/**
* Rewrite all tarball URLs in a packument to point to local registry.
*/
private rewriteTarballUrls(packument: IPackument): IPackument {
if (!packument.versions) {
return packument;
}
const rewritten = { ...packument };
rewritten.versions = {};
for (const [version, versionData] of Object.entries(packument.versions)) {
const newVersionData = { ...versionData };
if (newVersionData.dist?.tarball) {
newVersionData.dist = {
...newVersionData.dist,
tarball: this.rewriteSingleTarballUrl(
packument.name,
version,
newVersionData.dist.tarball,
),
};
}
rewritten.versions[version] = newVersionData;
}
return rewritten;
}
/**
* Rewrite a single tarball URL to point to local registry.
*/
private rewriteSingleTarballUrl(
packageName: string,
version: string,
_originalUrl: string,
): string {
// Generate local tarball URL
// Format: {localRegistryUrl}/{package}/-/{package-name}-{version}.tgz
const safeName = packageName.replace('@', '').replace('/', '-');
return `${this.localRegistryUrl}/${packageName}/-/${safeName}-${version}.tgz`;
}
/**
* Override URL building for NPM-specific handling.
*/
protected buildUpstreamUrl(
upstream: IUpstreamRegistryConfig,
context: IUpstreamFetchContext,
): string {
// NPM registries often don't have trailing slashes
let baseUrl = upstream.url;
if (baseUrl.endsWith('/')) {
baseUrl = baseUrl.slice(0, -1);
}
return `${baseUrl}${context.path}`;
}
}

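The tarball path rule that `buildTarballPath` encodes (and that `rewriteSingleTarballUrl` mirrors when flattening scoped names) can be exercised standalone. This is a minimal sketch of the same convention; the `tarballPath` helper name is illustrative, not part of the library:

```typescript
// npm tarball path convention:
//   unscoped:  /{name}/-/{name}-{version}.tgz
//   scoped:    /@scope/name/-/{name}-{version}.tgz  (scope dropped from the filename)
function tarballPath(packageName: string, version: string): string {
  if (packageName.startsWith('@')) {
    const [scope, name] = packageName.split('/');
    return `/${scope}/${name}/-/${name}-${version}.tgz`;
  }
  return `/${packageName}/-/${packageName}-${version}.tgz`;
}

// tarballPath('express', '4.18.2')       → '/express/-/express-4.18.2.tgz'
// tarballPath('@angular/core', '17.0.0') → '/@angular/core/-/core-17.0.0.tgz'
```

Note that for scoped packages the scope appears in the directory part of the path but not in the tarball filename itself.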
View File

@@ -3,4 +3,5 @@
  */
 export { NpmRegistry } from './classes.npmregistry.js';
+export { NpmUpstream } from './classes.npmupstream.js';
 export * from './interfaces.npm.js';

View File

@@ -1,7 +1,10 @@
+import { Smartlog } from '@push.rocks/smartlog';
 import { BaseRegistry } from '../core/classes.baseregistry.js';
 import { RegistryStorage } from '../core/classes.registrystorage.js';
 import { AuthManager } from '../core/classes.authmanager.js';
 import type { IRequestContext, IResponse, IAuthToken, IRegistryError } from '../core/interfaces.core.js';
+import type { IProtocolUpstreamConfig } from '../upstream/interfaces.upstream.js';
+import { OciUpstream } from './classes.ociupstream.js';
 import type {
   IUploadSession,
   IOciManifest,
@@ -21,18 +24,42 @@ export class OciRegistry extends BaseRegistry {
   private basePath: string = '/oci';
   private cleanupInterval?: NodeJS.Timeout;
   private ociTokens?: { realm: string; service: string };
+  private upstream: OciUpstream | null = null;
+  private logger: Smartlog;

   constructor(
     storage: RegistryStorage,
     authManager: AuthManager,
     basePath: string = '/oci',
-    ociTokens?: { realm: string; service: string }
+    ociTokens?: { realm: string; service: string },
+    upstreamConfig?: IProtocolUpstreamConfig
   ) {
     super();
     this.storage = storage;
     this.authManager = authManager;
     this.basePath = basePath;
     this.ociTokens = ociTokens;
+
+    // Initialize logger
+    this.logger = new Smartlog({
+      logContext: {
+        company: 'push.rocks',
+        companyunit: 'smartregistry',
+        containerName: 'oci-registry',
+        environment: (process.env.NODE_ENV as any) || 'development',
+        runtime: 'node',
+        zone: 'oci'
+      }
+    });
+    this.logger.enableConsole();
+
+    // Initialize upstream if configured
+    if (upstreamConfig?.enabled) {
+      this.upstream = new OciUpstream(upstreamConfig, basePath, this.logger);
+      this.logger.log('info', 'OCI upstream initialized', {
+        upstreams: upstreamConfig.upstreams.map(u => u.name),
+      });
+    }
   }

   public async init(): Promise<void> {
@@ -302,16 +329,50 @@ export class OciRegistry extends BaseRegistry {
     if (!reference.startsWith('sha256:')) {
       const tags = await this.getTagsData(repository);
       digest = tags[reference];
-      if (!digest) {
-        return {
-          status: 404,
-          headers: {},
-          body: this.createError('MANIFEST_UNKNOWN', 'Manifest not found'),
-        };
-      }
     }
+
+    // Try local storage first (if we have a digest)
+    let manifestData: Buffer | null = null;
+    let contentType: string | null = null;
+
+    if (digest) {
+      manifestData = await this.storage.getOciManifest(repository, digest);
+      if (manifestData) {
+        contentType = await this.storage.getOciManifestContentType(repository, digest);
+        if (!contentType) {
+          contentType = this.detectManifestContentType(manifestData);
+        }
+      }
+    }
+
+    // If not found locally, try upstream
+    if (!manifestData && this.upstream) {
+      this.logger.log('debug', 'getManifest: fetching from upstream', { repository, reference });
+      const upstreamResult = await this.upstream.fetchManifest(repository, reference);
+      if (upstreamResult) {
+        manifestData = Buffer.from(JSON.stringify(upstreamResult.manifest), 'utf8');
+        contentType = upstreamResult.contentType;
+        digest = upstreamResult.digest;
+
+        // Cache the manifest locally
+        await this.storage.putOciManifest(repository, digest, manifestData, contentType);
+
+        // If reference is a tag, update tags mapping
+        if (!reference.startsWith('sha256:')) {
+          const tags = await this.getTagsData(repository);
+          tags[reference] = digest;
+          const tagsPath = `oci/tags/${repository}/tags.json`;
+          await this.storage.putObject(tagsPath, Buffer.from(JSON.stringify(tags), 'utf-8'));
+        }
+
+        this.logger.log('debug', 'getManifest: cached manifest locally', {
+          repository,
+          reference,
+          digest,
+        });
+      }
+    }
-    const manifestData = await this.storage.getOciManifest(repository, digest);
+
     if (!manifestData) {
       return {
         status: 404,
@@ -320,17 +381,10 @@ export class OciRegistry extends BaseRegistry {
       };
     }

-    // Get stored content type, falling back to detecting from manifest content
-    let contentType = await this.storage.getOciManifestContentType(repository, digest);
-    if (!contentType) {
-      // Fallback: detect content type from manifest content
-      contentType = this.detectManifestContentType(manifestData);
-    }
-
     return {
       status: 200,
       headers: {
-        'Content-Type': contentType,
+        'Content-Type': contentType || 'application/vnd.oci.image.manifest.v1+json',
         'Docker-Content-Digest': digest,
       },
       body: manifestData,
@@ -466,7 +520,25 @@ export class OciRegistry extends BaseRegistry {
       return this.createUnauthorizedResponse(repository, 'pull');
     }

-    const data = await this.storage.getOciBlob(digest);
+    // Try local storage first
+    let data = await this.storage.getOciBlob(digest);
+
+    // If not found locally, try upstream
+    if (!data && this.upstream) {
+      this.logger.log('debug', 'getBlob: fetching from upstream', { repository, digest });
+      const upstreamBlob = await this.upstream.fetchBlob(repository, digest);
+      if (upstreamBlob) {
+        data = upstreamBlob;
+        // Cache the blob locally (blobs are content-addressable and immutable)
+        await this.storage.putOciBlob(digest, data);
+        this.logger.log('debug', 'getBlob: cached blob locally', {
+          repository,
+          digest,
+          size: data.length,
+        });
+      }
+    }

     if (!data) {
       return {
         status: 404,

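The blob caching above is safe precisely because OCI blobs are content-addressable: a blob's identity is the SHA-256 of its bytes, so a cached copy can never go stale and can be re-verified at any time. A minimal sketch of how such a digest is derived with Node's built-in crypto (the `contentDigest` helper name is illustrative):

```typescript
import { createHash } from 'node:crypto';

// Compute an OCI/Docker content digest ("sha256:<hex>") for a blob or manifest body.
function contentDigest(data: Buffer | string): string {
  const hex = createHash('sha256').update(data).digest('hex');
  return `sha256:${hex}`;
}

// A cached blob can be verified before serving:
//   contentDigest(blobBytes) === requestedDigest
```

The same derivation is what makes `Docker-Content-Digest` response headers verifiable by clients.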
View File

@@ -0,0 +1,263 @@
import * as plugins from '../plugins.js';
import { BaseUpstream } from '../upstream/classes.baseupstream.js';
import type {
IProtocolUpstreamConfig,
IUpstreamFetchContext,
IUpstreamResult,
IUpstreamRegistryConfig,
} from '../upstream/interfaces.upstream.js';
import type { IOciManifest, IOciImageIndex, ITagList } from './interfaces.oci.js';
/**
* OCI-specific upstream implementation.
*
* Handles:
* - Manifest fetching (image manifests and index manifests)
* - Blob proxying (layers, configs)
* - Tag list fetching
* - Content-addressable caching (blobs are immutable)
* - Docker Hub authentication flow
*/
export class OciUpstream extends BaseUpstream {
protected readonly protocolName = 'oci';
/** Local registry base path for URL building */
private readonly localBasePath: string;
constructor(
config: IProtocolUpstreamConfig,
localBasePath: string = '/oci',
logger?: plugins.smartlog.Smartlog,
) {
super(config, logger);
this.localBasePath = localBasePath;
}
/**
* Fetch a manifest from upstream registries.
*/
public async fetchManifest(
repository: string,
reference: string,
): Promise<{ manifest: IOciManifest | IOciImageIndex; contentType: string; digest: string } | null> {
const context: IUpstreamFetchContext = {
protocol: 'oci',
resource: repository,
resourceType: 'manifest',
path: `/v2/${repository}/manifests/${reference}`,
method: 'GET',
headers: {
'accept': [
'application/vnd.oci.image.manifest.v1+json',
'application/vnd.oci.image.index.v1+json',
'application/vnd.docker.distribution.manifest.v2+json',
'application/vnd.docker.distribution.manifest.list.v2+json',
'application/vnd.docker.distribution.manifest.v1+json',
].join(', '),
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
let manifest: IOciManifest | IOciImageIndex;
if (Buffer.isBuffer(result.body)) {
manifest = JSON.parse(result.body.toString('utf8'));
} else {
manifest = result.body;
}
const contentType = result.headers['content-type'] || 'application/vnd.oci.image.manifest.v1+json';
const digest = result.headers['docker-content-digest'] || '';
return { manifest, contentType, digest };
}
/**
* Check if a manifest exists in upstream (HEAD request).
*/
public async headManifest(
repository: string,
reference: string,
): Promise<{ exists: boolean; contentType?: string; digest?: string; size?: number } | null> {
const context: IUpstreamFetchContext = {
protocol: 'oci',
resource: repository,
resourceType: 'manifest',
path: `/v2/${repository}/manifests/${reference}`,
method: 'HEAD',
headers: {
'accept': [
'application/vnd.oci.image.manifest.v1+json',
'application/vnd.oci.image.index.v1+json',
'application/vnd.docker.distribution.manifest.v2+json',
'application/vnd.docker.distribution.manifest.list.v2+json',
].join(', '),
},
query: {},
};
const result = await this.fetch(context);
if (!result) {
return null;
}
if (!result.success) {
return { exists: false };
}
return {
exists: true,
contentType: result.headers['content-type'],
digest: result.headers['docker-content-digest'],
size: result.headers['content-length'] ? parseInt(result.headers['content-length'], 10) : undefined,
};
}
/**
* Fetch a blob from upstream registries.
*/
public async fetchBlob(repository: string, digest: string): Promise<Buffer | null> {
const context: IUpstreamFetchContext = {
protocol: 'oci',
resource: repository,
resourceType: 'blob',
path: `/v2/${repository}/blobs/${digest}`,
method: 'GET',
headers: {
'accept': 'application/octet-stream',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
return Buffer.isBuffer(result.body) ? result.body : Buffer.from(result.body);
}
/**
* Check if a blob exists in upstream (HEAD request).
*/
public async headBlob(
repository: string,
digest: string,
): Promise<{ exists: boolean; size?: number } | null> {
const context: IUpstreamFetchContext = {
protocol: 'oci',
resource: repository,
resourceType: 'blob',
path: `/v2/${repository}/blobs/${digest}`,
method: 'HEAD',
headers: {},
query: {},
};
const result = await this.fetch(context);
if (!result) {
return null;
}
if (!result.success) {
return { exists: false };
}
return {
exists: true,
size: result.headers['content-length'] ? parseInt(result.headers['content-length'], 10) : undefined,
};
}
/**
* Fetch the tag list for a repository.
*/
public async fetchTags(repository: string, n?: number, last?: string): Promise<ITagList | null> {
const query: Record<string, string> = {};
if (n) query.n = n.toString();
if (last) query.last = last;
const context: IUpstreamFetchContext = {
protocol: 'oci',
resource: repository,
resourceType: 'tags',
path: `/v2/${repository}/tags/list`,
method: 'GET',
headers: {
'accept': 'application/json',
},
query,
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
let tagList: ITagList;
if (Buffer.isBuffer(result.body)) {
tagList = JSON.parse(result.body.toString('utf8'));
} else {
tagList = result.body;
}
return tagList;
}
/**
* Override URL building for OCI-specific handling.
* OCI registries use /v2/ prefix and may require special handling for Docker Hub.
*/
protected buildUpstreamUrl(
upstream: IUpstreamRegistryConfig,
context: IUpstreamFetchContext,
): string {
let baseUrl = upstream.url;
// Remove trailing slash
if (baseUrl.endsWith('/')) {
baseUrl = baseUrl.slice(0, -1);
}
// Handle Docker Hub special case
// Docker Hub uses registry-1.docker.io but library images need special handling
if (baseUrl.includes('docker.io') || baseUrl.includes('registry-1.docker.io')) {
// For library images (e.g., "nginx" -> "library/nginx")
// Note: the repository may itself contain slashes, so split on the
// endpoint segment rather than the first path separator.
const pathParts = context.path.match(/^\/v2\/(.+)\/(manifests|blobs|tags)\/(.+)$/);
if (pathParts) {
const [, repository, kind, rest] = pathParts;
// If the repository has no namespace, it's an official library image
if (!repository.includes('/')) {
return `${baseUrl}/v2/library/${repository}/${kind}/${rest}`;
}
}
}
return `${baseUrl}${context.path}`;
}
/**
* Override header building for OCI-specific authentication.
* OCI registries may require token-based auth obtained from a separate endpoint.
*/
protected buildHeaders(
upstream: IUpstreamRegistryConfig,
context: IUpstreamFetchContext,
): Record<string, string> {
const headers = super.buildHeaders(upstream, context);
// OCI registries typically use Docker-Distribution-API-Version header
headers['docker-distribution-api-version'] = 'registry/2.0';
return headers;
}
}

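The Docker Hub special case in `buildUpstreamUrl` can be isolated into a pure function for clarity: official images like `nginx` live under the `library/` namespace on `registry-1.docker.io`, while namespaced repositories pass through untouched. A simplified standalone sketch (the `dockerHubPath` helper is illustrative):

```typescript
// Rewrite a /v2/... path for Docker Hub: repositories without a namespace
// (e.g. "nginx") are official images that live under "library/" upstream.
function dockerHubPath(path: string): string {
  const match = path.match(/^\/v2\/(.+)\/(manifests|blobs|tags)\/(.+)$/);
  if (match) {
    const [, repository, kind, rest] = match;
    if (!repository.includes('/')) {
      return `/v2/library/${repository}/${kind}/${rest}`;
    }
  }
  return path;
}

// dockerHubPath('/v2/nginx/manifests/latest')     → '/v2/library/nginx/manifests/latest'
// dockerHubPath('/v2/myorg/app/manifests/latest') → unchanged
```

Matching on the endpoint segment (`manifests`, `blobs`, `tags`) rather than the first `/` is what keeps multi-segment repository names intact.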
View File

@@ -3,4 +3,5 @@
  */
 export { OciRegistry } from './classes.ociregistry.js';
+export { OciUpstream } from './classes.ociupstream.js';
 export * from './interfaces.oci.js';

View File

@@ -8,10 +8,16 @@ import * as smartarchive from '@push.rocks/smartarchive';
 import * as smartbucket from '@push.rocks/smartbucket';
 import * as smartlog from '@push.rocks/smartlog';
 import * as smartpath from '@push.rocks/smartpath';
+import * as smartrequest from '@push.rocks/smartrequest';

-export { smartarchive, smartbucket, smartlog, smartpath };
+export { smartarchive, smartbucket, smartlog, smartpath, smartrequest };

 // @tsclass scope
 import * as tsclass from '@tsclass/tsclass';
 export { tsclass };
+
+// third party
+import { minimatch } from 'minimatch';
+export { minimatch };
export { minimatch };

View File

@@ -3,6 +3,7 @@ import { BaseRegistry } from '../core/classes.baseregistry.js';
 import { RegistryStorage } from '../core/classes.registrystorage.js';
 import { AuthManager } from '../core/classes.authmanager.js';
 import type { IRequestContext, IResponse, IAuthToken } from '../core/interfaces.core.js';
+import type { IProtocolUpstreamConfig } from '../upstream/interfaces.upstream.js';
 import { isBinaryData, toBuffer } from '../core/helpers.buffer.js';
 import type {
   IPypiPackageMetadata,
@@ -11,6 +12,7 @@ import type {
   IPypiUploadResponse,
 } from './interfaces.pypi.js';
 import * as helpers from './helpers.pypi.js';
+import { PypiUpstream } from './classes.pypiupstream.js';

 /**
  * PyPI registry implementation
@@ -22,12 +24,14 @@ export class PypiRegistry extends BaseRegistry {
   private basePath: string = '/pypi';
   private registryUrl: string;
   private logger: Smartlog;
+  private upstream: PypiUpstream | null = null;

   constructor(
     storage: RegistryStorage,
     authManager: AuthManager,
     basePath: string = '/pypi',
-    registryUrl: string = 'http://localhost:5000'
+    registryUrl: string = 'http://localhost:5000',
+    upstreamConfig?: IProtocolUpstreamConfig
   ) {
     super();
     this.storage = storage;
@@ -47,6 +51,20 @@ export class PypiRegistry extends BaseRegistry {
       }
     });
     this.logger.enableConsole();
+
+    // Initialize upstream if configured
+    if (upstreamConfig?.enabled) {
+      this.upstream = new PypiUpstream(upstreamConfig, registryUrl, this.logger);
+    }
   }
+
+  /**
+   * Clean up resources (timers, connections, etc.)
+   */
+  public destroy(): void {
+    if (this.upstream) {
+      this.upstream.stop();
+    }
+  }

   public async init(): Promise<void> {
@@ -214,7 +232,45 @@ export class PypiRegistry extends BaseRegistry {
     const normalized = helpers.normalizePypiPackageName(packageName);

     // Get package metadata
-    const metadata = await this.storage.getPypiPackageMetadata(normalized);
+    let metadata = await this.storage.getPypiPackageMetadata(normalized);
+
+    // Try upstream if not found locally
+    if (!metadata && this.upstream) {
+      const upstreamHtml = await this.upstream.fetchSimplePackage(normalized);
+      if (upstreamHtml) {
+        // Parse the HTML to extract file information and cache it
+        // For now, just return the upstream HTML directly (caching can be improved later)
+        const acceptHeader = context.headers['accept'] || context.headers['Accept'] || '';
+        const preferJson = acceptHeader.includes('application/vnd.pypi.simple') &&
+          acceptHeader.includes('json');
+
+        if (preferJson) {
+          // Try to get JSON format from upstream
+          const upstreamJson = await this.upstream.fetchPackageJson(normalized);
+          if (upstreamJson) {
+            return {
+              status: 200,
+              headers: {
+                'Content-Type': 'application/vnd.pypi.simple.v1+json',
+                'Cache-Control': 'public, max-age=300'
+              },
+              body: upstreamJson,
+            };
+          }
+        }
+
+        // Return HTML format
+        return {
+          status: 200,
+          headers: {
+            'Content-Type': 'text/html; charset=utf-8',
+            'Cache-Control': 'public, max-age=300'
+          },
+          body: upstreamHtml,
+        };
+      }
+    }

     if (!metadata) {
       return this.errorResponse(404, 'Package not found');
     }
@@ -449,7 +505,16 @@ export class PypiRegistry extends BaseRegistry {
    */
   private async handleDownload(packageName: string, filename: string): Promise<IResponse> {
     const normalized = helpers.normalizePypiPackageName(packageName);
-    const fileData = await this.storage.getPypiPackageFile(normalized, filename);
+    let fileData = await this.storage.getPypiPackageFile(normalized, filename);
+
+    // Try upstream if not found locally
+    if (!fileData && this.upstream) {
+      fileData = await this.upstream.fetchPackageFile(normalized, filename);
+      if (fileData) {
+        // Cache locally
+        await this.storage.putPypiPackageFile(normalized, filename, fileData);
+      }
+    }

     if (!fileData) {
       return {

View File

@@ -0,0 +1,211 @@
import * as plugins from '../plugins.js';
import { BaseUpstream } from '../upstream/classes.baseupstream.js';
import type {
IProtocolUpstreamConfig,
IUpstreamFetchContext,
IUpstreamRegistryConfig,
} from '../upstream/interfaces.upstream.js';
/**
* PyPI-specific upstream implementation.
*
* Handles:
* - Simple API (HTML) - PEP 503
* - JSON API - PEP 691
* - Package file downloads (wheels, sdists)
* - Package name normalization
*/
export class PypiUpstream extends BaseUpstream {
protected readonly protocolName = 'pypi';
/** Local registry URL for rewriting download URLs */
private readonly localRegistryUrl: string;
constructor(
config: IProtocolUpstreamConfig,
localRegistryUrl: string,
logger?: plugins.smartlog.Smartlog,
) {
super(config, logger);
this.localRegistryUrl = localRegistryUrl;
}
/**
* Fetch Simple API index (list of all packages) in HTML format.
*/
public async fetchSimpleIndex(): Promise<string | null> {
const context: IUpstreamFetchContext = {
protocol: 'pypi',
resource: '*',
resourceType: 'index',
path: '/simple/',
method: 'GET',
headers: {
'accept': 'text/html',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return result.body.toString('utf8');
}
return typeof result.body === 'string' ? result.body : null;
}
/**
* Fetch Simple API package page (list of files) in HTML format.
*/
public async fetchSimplePackage(packageName: string): Promise<string | null> {
const normalizedName = this.normalizePackageName(packageName);
const path = `/simple/${normalizedName}/`;
const context: IUpstreamFetchContext = {
protocol: 'pypi',
resource: packageName,
resourceType: 'simple',
path,
method: 'GET',
headers: {
'accept': 'text/html',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return result.body.toString('utf8');
}
return typeof result.body === 'string' ? result.body : null;
}
/**
* Fetch package metadata using JSON API (PEP 691).
*/
public async fetchPackageJson(packageName: string): Promise<any | null> {
const normalizedName = this.normalizePackageName(packageName);
const path = `/simple/${normalizedName}/`;
const context: IUpstreamFetchContext = {
protocol: 'pypi',
resource: packageName,
resourceType: 'metadata',
path,
method: 'GET',
headers: {
'accept': 'application/vnd.pypi.simple.v1+json',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return JSON.parse(result.body.toString('utf8'));
}
return result.body;
}
/**
* Fetch full package info from PyPI JSON API (/pypi/{package}/json).
*/
public async fetchPypiJson(packageName: string): Promise<any | null> {
const normalizedName = this.normalizePackageName(packageName);
const path = `/pypi/${normalizedName}/json`;
const context: IUpstreamFetchContext = {
protocol: 'pypi',
resource: packageName,
resourceType: 'pypi-json',
path,
method: 'GET',
headers: {
'accept': 'application/json',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return JSON.parse(result.body.toString('utf8'));
}
return result.body;
}
/**
* Fetch a package file (wheel or sdist) from upstream.
*/
public async fetchPackageFile(packageName: string, filename: string): Promise<Buffer | null> {
const normalizedName = this.normalizePackageName(packageName);
const path = `/packages/${normalizedName}/${filename}`;
const context: IUpstreamFetchContext = {
protocol: 'pypi',
resource: packageName,
resourceType: 'package',
path,
method: 'GET',
headers: {
'accept': 'application/octet-stream',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
return Buffer.isBuffer(result.body) ? result.body : Buffer.from(result.body);
}
/**
* Normalize a PyPI package name according to PEP 503.
* - Lowercase all characters
* - Replace runs of ., -, _ with single -
*/
private normalizePackageName(name: string): string {
return name.toLowerCase().replace(/[-_.]+/g, '-');
}
/**
* Override URL building for PyPI-specific handling.
*/
protected buildUpstreamUrl(
upstream: IUpstreamRegistryConfig,
context: IUpstreamFetchContext,
): string {
let baseUrl = upstream.url;
// Remove trailing slash
if (baseUrl.endsWith('/')) {
baseUrl = baseUrl.slice(0, -1);
}
return `${baseUrl}${context.path}`;
}
}

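The PEP 503 rule implemented by `normalizePackageName` above collapses case and separator differences so that `Foo.Bar_baz` and `foo-bar-baz` resolve to the same project. A standalone sketch of the same one-liner (the `normalizePep503` helper name is illustrative):

```typescript
// PEP 503: lowercase the name, then collapse runs of '.', '-', '_' into a single '-'.
function normalizePep503(name: string): string {
  return name.toLowerCase().replace(/[-_.]+/g, '-');
}

// normalizePep503('Flask')               → 'flask'
// normalizePep503('ruamel.yaml')         → 'ruamel-yaml'
// normalizePep503('typing__extensions')  → 'typing-extensions'
```

Applying this on both the lookup and storage paths is what lets the registry treat any spelling a client uses as the same package.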
View File

@@ -5,4 +5,5 @@
 export * from './interfaces.pypi.js';
 export * from './classes.pypiregistry.js';
+export { PypiUpstream } from './classes.pypiupstream.js';
 export * as pypiHelpers from './helpers.pypi.js';
export * as pypiHelpers from './helpers.pypi.js'; export * as pypiHelpers from './helpers.pypi.js';

View File

@@ -3,6 +3,7 @@ import { BaseRegistry } from '../core/classes.baseregistry.js';
 import { RegistryStorage } from '../core/classes.registrystorage.js';
 import { AuthManager } from '../core/classes.authmanager.js';
 import type { IRequestContext, IResponse, IAuthToken } from '../core/interfaces.core.js';
+import type { IProtocolUpstreamConfig } from '../upstream/interfaces.upstream.js';
 import type {
   IRubyGemsMetadata,
   IRubyGemsVersionMetadata,
@@ -12,6 +13,7 @@ import type {
   ICompactIndexInfoEntry,
 } from './interfaces.rubygems.js';
 import * as helpers from './helpers.rubygems.js';
+import { RubygemsUpstream } from './classes.rubygemsupstream.js';

 /**
  * RubyGems registry implementation
@@ -23,12 +25,14 @@ export class RubyGemsRegistry extends BaseRegistry {
   private basePath: string = '/rubygems';
   private registryUrl: string;
   private logger: Smartlog;
+  private upstream: RubygemsUpstream | null = null;

   constructor(
     storage: RegistryStorage,
     authManager: AuthManager,
     basePath: string = '/rubygems',
-    registryUrl: string = 'http://localhost:5000/rubygems'
+    registryUrl: string = 'http://localhost:5000/rubygems',
+    upstreamConfig?: IProtocolUpstreamConfig
   ) {
     super();
     this.storage = storage;
@@ -48,6 +52,20 @@ export class RubyGemsRegistry extends BaseRegistry {
       }
     });
     this.logger.enableConsole();
+
+    // Initialize upstream if configured
+    if (upstreamConfig?.enabled) {
+      this.upstream = new RubygemsUpstream(upstreamConfig, this.logger);
+    }
   }
+
+  /**
+   * Clean up resources (timers, connections, etc.)
+   */
+  public destroy(): void {
+    if (this.upstream) {
+      this.upstream.stop();
+    }
+  }

   public async init(): Promise<void> {
@@ -215,7 +233,17 @@ export class RubyGemsRegistry extends BaseRegistry {
    * Handle /info/{gem} endpoint (Compact Index)
    */
   private async handleInfoFile(gemName: string): Promise<IResponse> {
-    const content = await this.storage.getRubyGemsInfo(gemName);
+    let content = await this.storage.getRubyGemsInfo(gemName);
+
+    // Try upstream if not found locally
+    if (!content && this.upstream) {
+      const upstreamInfo = await this.upstream.fetchInfo(gemName);
+      if (upstreamInfo) {
+        // Cache locally
+        await this.storage.putRubyGemsInfo(gemName, upstreamInfo);
+        content = upstreamInfo;
+      }
+    }

     if (!content) {
       return {
@@ -245,12 +273,21 @@ export class RubyGemsRegistry extends BaseRegistry {
       return this.errorResponse(400, 'Invalid gem filename');
     }

-    const gemData = await this.storage.getRubyGemsGem(
+    let gemData = await this.storage.getRubyGemsGem(
       parsed.name,
       parsed.version,
       parsed.platform
     );
+
+    // Try upstream if not found locally
+    if (!gemData && this.upstream) {
+      gemData = await this.upstream.fetchGem(parsed.name, parsed.version);
+      if (gemData) {
+        // Cache locally
+        await this.storage.putRubyGemsGem(parsed.name, parsed.version, gemData, parsed.platform);
+      }
+    }

     if (!gemData) {
       return this.errorResponse(404, 'Gem not found');
     }

View File

@@ -0,0 +1,230 @@
import * as plugins from '../plugins.js';
import { BaseUpstream } from '../upstream/classes.baseupstream.js';
import type {
IProtocolUpstreamConfig,
IUpstreamFetchContext,
IUpstreamRegistryConfig,
} from '../upstream/interfaces.upstream.js';
/**
* RubyGems-specific upstream implementation.
*
* Handles:
* - Compact Index format (/versions, /info/{gem}, /names)
* - Gem file (.gem) downloading
* - Gem spec fetching
* - HTTP Range requests for incremental updates
*/
export class RubygemsUpstream extends BaseUpstream {
protected readonly protocolName = 'rubygems';
constructor(
config: IProtocolUpstreamConfig,
logger?: plugins.smartlog.Smartlog,
) {
super(config, logger);
}
/**
* Fetch the /versions file (master list of all gems).
*/
public async fetchVersions(etag?: string): Promise<{ data: string; etag?: string } | null> {
const headers: Record<string, string> = {
'accept': 'text/plain',
};
if (etag) {
headers['if-none-match'] = etag;
}
const context: IUpstreamFetchContext = {
protocol: 'rubygems',
resource: '*',
resourceType: 'versions',
path: '/versions',
method: 'GET',
headers,
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
let data: string;
if (Buffer.isBuffer(result.body)) {
data = result.body.toString('utf8');
} else if (typeof result.body === 'string') {
data = result.body;
} else {
return null;
}
return {
data,
etag: result.headers['etag'],
};
}
/**
* Fetch gem info file (/info/{gemname}).
*/
public async fetchInfo(gemName: string): Promise<string | null> {
const context: IUpstreamFetchContext = {
protocol: 'rubygems',
resource: gemName,
resourceType: 'info',
path: `/info/${gemName}`,
method: 'GET',
headers: {
'accept': 'text/plain',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return result.body.toString('utf8');
}
return typeof result.body === 'string' ? result.body : null;
}
/**
* Fetch the /names file (list of all gem names).
*/
public async fetchNames(): Promise<string | null> {
const context: IUpstreamFetchContext = {
protocol: 'rubygems',
resource: '*',
resourceType: 'names',
path: '/names',
method: 'GET',
headers: {
'accept': 'text/plain',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return result.body.toString('utf8');
}
return typeof result.body === 'string' ? result.body : null;
}
/**
* Fetch a gem file.
*/
public async fetchGem(gemName: string, version: string): Promise<Buffer | null> {
const path = `/gems/${gemName}-${version}.gem`;
const context: IUpstreamFetchContext = {
protocol: 'rubygems',
resource: gemName,
resourceType: 'gem',
path,
method: 'GET',
headers: {
'accept': 'application/octet-stream',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
return Buffer.isBuffer(result.body) ? result.body : Buffer.from(result.body);
}
/**
* Fetch gem spec (quick spec).
*/
public async fetchQuickSpec(gemName: string, version: string): Promise<Buffer | null> {
const path = `/quick/Marshal.4.8/${gemName}-${version}.gemspec.rz`;
const context: IUpstreamFetchContext = {
protocol: 'rubygems',
resource: gemName,
resourceType: 'spec',
path,
method: 'GET',
headers: {
'accept': 'application/octet-stream',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
return Buffer.isBuffer(result.body) ? result.body : Buffer.from(result.body);
}
/**
* Fetch gem versions JSON from API.
*/
public async fetchVersionsJson(gemName: string): Promise<any[] | null> {
const path = `/api/v1/versions/${gemName}.json`;
const context: IUpstreamFetchContext = {
protocol: 'rubygems',
resource: gemName,
resourceType: 'versions-json',
path,
method: 'GET',
headers: {
'accept': 'application/json',
},
query: {},
};
const result = await this.fetch(context);
if (!result || !result.success) {
return null;
}
if (Buffer.isBuffer(result.body)) {
return JSON.parse(result.body.toString('utf8'));
}
return Array.isArray(result.body) ? result.body : null;
}
/**
* Override URL building for RubyGems-specific handling.
*/
protected buildUpstreamUrl(
upstream: IUpstreamRegistryConfig,
context: IUpstreamFetchContext,
): string {
let baseUrl = upstream.url;
// Remove trailing slash
if (baseUrl.endsWith('/')) {
baseUrl = baseUrl.slice(0, -1);
}
return `${baseUrl}${context.path}`;
}
}
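The fetchers above derive their upstream paths purely from the gem name and version. A minimal standalone sketch of that path construction (the helper names are illustrative, not methods of the actual class):

```typescript
// Illustrative helpers mirroring the path templates used by fetchGem and
// fetchQuickSpec above; they are not exported by RubygemsUpstream itself.
function buildGemPath(gemName: string, version: string): string {
  return `/gems/${gemName}-${version}.gem`;
}

function buildQuickSpecPath(gemName: string, version: string): string {
  return `/quick/Marshal.4.8/${gemName}-${version}.gemspec.rz`;
}

console.log(buildGemPath('rails', '7.1.2'));
// → /gems/rails-7.1.2.gem
console.log(buildQuickSpecPath('rails', '7.1.2'));
// → /quick/Marshal.4.8/rails-7.1.2.gemspec.rz
```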

View File

@@ -5,4 +5,5 @@
export * from './interfaces.rubygems.js';
export * from './classes.rubygemsregistry.js';
export { RubygemsUpstream } from './classes.rubygemsupstream.js';
export * as rubygemsHelpers from './helpers.rubygems.js';

View File

@@ -0,0 +1,526 @@
import * as plugins from '../plugins.js';
import type {
IUpstreamRegistryConfig,
IUpstreamAuthConfig,
IUpstreamCacheConfig,
IUpstreamResilienceConfig,
IUpstreamResult,
IUpstreamFetchContext,
IProtocolUpstreamConfig,
IUpstreamScopeRule,
TCircuitState,
} from './interfaces.upstream.js';
import {
DEFAULT_CACHE_CONFIG,
DEFAULT_RESILIENCE_CONFIG,
} from './interfaces.upstream.js';
import { CircuitBreaker, CircuitOpenError, withCircuitBreaker } from './classes.circuitbreaker.js';
import { UpstreamCache } from './classes.upstreamcache.js';
/**
* Base class for protocol-specific upstream implementations.
*
* Provides:
* - Multi-upstream routing with priority
* - Scope-based filtering (glob patterns)
* - Authentication handling
* - Circuit breaker per upstream
* - Caching with TTL
* - Retry with exponential backoff
* - 429 rate limit handling
*/
export abstract class BaseUpstream {
/** Protocol name for logging */
protected abstract readonly protocolName: string;
/** Upstream configuration */
protected readonly config: IProtocolUpstreamConfig;
/** Resolved cache configuration */
protected readonly cacheConfig: IUpstreamCacheConfig;
/** Resolved resilience configuration */
protected readonly resilienceConfig: IUpstreamResilienceConfig;
/** Circuit breakers per upstream */
protected readonly circuitBreakers: Map<string, CircuitBreaker> = new Map();
/** Upstream cache */
protected readonly cache: UpstreamCache;
/** Logger instance */
protected readonly logger: plugins.smartlog.Smartlog;
constructor(config: IProtocolUpstreamConfig, logger?: plugins.smartlog.Smartlog) {
this.config = config;
this.cacheConfig = { ...DEFAULT_CACHE_CONFIG, ...config.cache };
this.resilienceConfig = { ...DEFAULT_RESILIENCE_CONFIG, ...config.resilience };
this.cache = new UpstreamCache(this.cacheConfig);
this.logger = logger || new plugins.smartlog.Smartlog({
logContext: {
company: 'smartregistry',
companyunit: 'upstream',
environment: 'production',
runtime: 'node',
}
});
// Initialize circuit breakers for each upstream
for (const upstream of config.upstreams) {
const upstreamResilience = { ...this.resilienceConfig, ...upstream.resilience };
this.circuitBreakers.set(upstream.id, new CircuitBreaker(upstream.id, upstreamResilience));
}
}
/**
* Check if upstream is enabled.
*/
public isEnabled(): boolean {
return this.config.enabled;
}
/**
* Get all configured upstreams.
*/
public getUpstreams(): IUpstreamRegistryConfig[] {
return this.config.upstreams;
}
/**
* Get circuit breaker state for an upstream.
*/
public getCircuitState(upstreamId: string): TCircuitState | null {
const breaker = this.circuitBreakers.get(upstreamId);
return breaker ? breaker.getState() : null;
}
/**
* Get cache statistics.
*/
public getCacheStats() {
return this.cache.getStats();
}
/**
* Fetch a resource from upstreams.
* Tries upstreams in priority order, respecting circuit breakers and scope rules.
*/
public async fetch(context: IUpstreamFetchContext): Promise<IUpstreamResult | null> {
if (!this.config.enabled) {
return null;
}
// Get applicable upstreams sorted by priority
const applicableUpstreams = this.getApplicableUpstreams(context.resource);
if (applicableUpstreams.length === 0) {
return null;
}
// Use the first applicable upstream's URL for cache key
const primaryUpstreamUrl = applicableUpstreams[0]?.url;
// Check cache first
const cached = await this.cache.get(context, primaryUpstreamUrl);
if (cached && !cached.stale) {
return {
success: true,
status: 200,
headers: cached.headers,
body: cached.data,
upstreamId: cached.upstreamId,
fromCache: true,
latencyMs: 0,
};
}
// Check for negative cache (recent 404)
if (await this.cache.hasNegative(context, primaryUpstreamUrl)) {
return {
success: false,
status: 404,
headers: {},
upstreamId: 'cache',
fromCache: true,
latencyMs: 0,
};
}
// If we have stale cache, return it immediately and revalidate in background
if (cached?.stale && this.cacheConfig.staleWhileRevalidate) {
// Fire and forget revalidation
this.revalidateInBackground(context, applicableUpstreams);
return {
success: true,
status: 200,
headers: cached.headers,
body: cached.data,
upstreamId: cached.upstreamId,
fromCache: true,
latencyMs: 0,
};
}
// Try each upstream in order
let lastError: Error | null = null;
for (const upstream of applicableUpstreams) {
const breaker = this.circuitBreakers.get(upstream.id);
if (!breaker) continue;
try {
const result = await withCircuitBreaker(
breaker,
() => this.fetchFromUpstream(upstream, context),
);
// Cache successful responses
if (result.success && result.body) {
await this.cache.set(
context,
Buffer.isBuffer(result.body) ? result.body : Buffer.from(JSON.stringify(result.body)),
result.headers['content-type'] || 'application/octet-stream',
result.headers,
upstream.id,
upstream.url,
);
}
// Cache 404 responses
if (result.status === 404) {
await this.cache.setNegative(context, upstream.id, upstream.url);
}
return result;
} catch (error) {
if (error instanceof CircuitOpenError) {
this.logger.log('debug', `Circuit open for upstream ${upstream.id}, trying next`);
} else {
this.logger.log('warn', `Upstream ${upstream.id} failed: ${(error as Error).message}`);
}
lastError = error as Error;
// Continue to next upstream
}
}
// All upstreams failed
if (lastError) {
this.logger.log('error', `All upstreams failed for ${context.resource}: ${lastError.message}`);
}
return null;
}
/**
* Invalidate cache for a resource pattern.
*/
public async invalidateCache(pattern: RegExp): Promise<number> {
return this.cache.invalidatePattern(pattern);
}
/**
* Clear all cache entries.
*/
public async clearCache(): Promise<void> {
await this.cache.clear();
}
/**
* Stop the upstream (cleanup resources).
*/
public stop(): void {
this.cache.stop();
}
/**
* Get upstreams that apply to a resource, sorted by priority.
*/
protected getApplicableUpstreams(resource: string): IUpstreamRegistryConfig[] {
return this.config.upstreams
.filter(upstream => {
if (!upstream.enabled) return false;
// Check circuit breaker
const breaker = this.circuitBreakers.get(upstream.id);
if (breaker && !breaker.canRequest()) return false;
// Check scope rules
return this.matchesScopeRules(resource, upstream.scopeRules);
})
.sort((a, b) => a.priority - b.priority);
}
/**
* Check if a resource matches scope rules.
* Empty rules = match all.
*/
protected matchesScopeRules(resource: string, rules?: IUpstreamScopeRule[]): boolean {
if (!rules || rules.length === 0) {
return true;
}
// Rules are evaluated in order; the last matching rule wins.
// Default is exclude: a resource matches only if some rule includes it.
let matched = false;
for (const rule of rules) {
const isMatch = plugins.minimatch(resource, rule.pattern);
if (isMatch) {
matched = rule.action === 'include';
}
}
return matched;
}
/**
* Fetch from a specific upstream with retry logic.
*/
protected async fetchFromUpstream(
upstream: IUpstreamRegistryConfig,
context: IUpstreamFetchContext,
): Promise<IUpstreamResult> {
const upstreamResilience = { ...this.resilienceConfig, ...upstream.resilience };
const startTime = Date.now();
let lastError: Error | null = null;
for (let attempt = 0; attempt <= upstreamResilience.maxRetries; attempt++) {
try {
const result = await this.executeRequest(upstream, context, upstreamResilience.timeoutMs);
return {
...result,
upstreamId: upstream.id,
fromCache: false,
latencyMs: Date.now() - startTime,
};
} catch (error) {
lastError = error as Error;
// Don't retry on 4xx errors (except 429)
if (this.isNonRetryableError(error)) {
break;
}
// Calculate delay with exponential backoff and jitter
if (attempt < upstreamResilience.maxRetries) {
const delay = this.calculateBackoffDelay(
attempt,
upstreamResilience.retryDelayMs,
upstreamResilience.retryMaxDelayMs,
);
await this.sleep(delay);
}
}
}
throw lastError || new Error('Request failed');
}
/**
* Execute a single HTTP request to an upstream.
*/
protected async executeRequest(
upstream: IUpstreamRegistryConfig,
context: IUpstreamFetchContext,
timeoutMs: number,
): Promise<Omit<IUpstreamResult, 'upstreamId' | 'fromCache' | 'latencyMs'>> {
// Build the full URL
const url = this.buildUpstreamUrl(upstream, context);
// Build headers with auth
const headers = this.buildHeaders(upstream, context);
// Make the request using SmartRequest
const request = plugins.smartrequest.SmartRequest.create()
.url(url)
.method(context.method as any)
.headers(headers)
.timeout(timeoutMs)
.handle429Backoff({ maxRetries: 3, fallbackDelay: 1000, maxWaitTime: 30000 });
// Add query params if present
if (Object.keys(context.query).length > 0) {
request.query(context.query);
}
let response: plugins.smartrequest.ICoreResponse;
switch (context.method.toUpperCase()) {
case 'GET':
response = await request.get();
break;
case 'HEAD':
// SmartRequest exposes no head() helper, so set the method explicitly
response = await request.method('HEAD').get();
break;
default:
response = await request.get();
}
// Parse response
const responseHeaders: Record<string, string> = {};
for (const [key, value] of Object.entries(response.headers)) {
responseHeaders[key.toLowerCase()] = Array.isArray(value) ? value[0] : value;
}
let body: Buffer | any;
const contentType = responseHeaders['content-type'] || '';
if (response.ok) {
if (contentType.includes('application/json')) {
body = await response.json();
} else {
const arrayBuffer = await response.arrayBuffer();
body = Buffer.from(arrayBuffer);
}
}
return {
success: response.ok,
status: response.status,
headers: responseHeaders,
body,
};
}
/**
* Build the full URL for an upstream request.
* Subclasses can override for protocol-specific URL building.
*/
protected buildUpstreamUrl(upstream: IUpstreamRegistryConfig, context: IUpstreamFetchContext): string {
// Remove leading slash if URL already has trailing slash
let path = context.path;
if (upstream.url.endsWith('/') && path.startsWith('/')) {
path = path.slice(1);
}
return `${upstream.url}${path}`;
}
/**
* Build headers including authentication.
*/
protected buildHeaders(
upstream: IUpstreamRegistryConfig,
context: IUpstreamFetchContext,
): Record<string, string> {
const headers: Record<string, string> = { ...context.headers };
// Remove host header (will be set by HTTP client)
delete headers['host'];
// Add authentication
this.addAuthHeaders(headers, upstream.auth);
return headers;
}
/**
* Add authentication headers based on auth config.
*/
protected addAuthHeaders(headers: Record<string, string>, auth: IUpstreamAuthConfig): void {
switch (auth.type) {
case 'basic':
if (auth.username && auth.password) {
const credentials = Buffer.from(`${auth.username}:${auth.password}`).toString('base64');
headers['authorization'] = `Basic ${credentials}`;
}
break;
case 'bearer':
if (auth.token) {
headers['authorization'] = `Bearer ${auth.token}`;
}
break;
case 'api-key':
if (auth.token) {
const headerName = auth.headerName || 'authorization';
headers[headerName.toLowerCase()] = auth.token;
}
break;
case 'none':
default:
// No authentication
break;
}
}
/**
* Check if an error should not be retried.
*/
protected isNonRetryableError(error: unknown): boolean {
// Check for HTTP status errors
if (error && typeof error === 'object' && 'status' in error) {
const status = (error as { status: number }).status;
// Don't retry 4xx errors except 429 (rate limited)
if (status >= 400 && status < 500 && status !== 429) {
return true;
}
}
return false;
}
/**
* Calculate backoff delay with exponential backoff and jitter.
*/
protected calculateBackoffDelay(
attempt: number,
baseDelayMs: number,
maxDelayMs: number,
): number {
// Exponential backoff: delay = base * 2^attempt
const exponentialDelay = baseDelayMs * Math.pow(2, attempt);
// Cap at max delay
const cappedDelay = Math.min(exponentialDelay, maxDelayMs);
// Add jitter (±25%)
const jitter = cappedDelay * 0.25 * (Math.random() * 2 - 1);
return Math.floor(cappedDelay + jitter);
}
/**
* Sleep for a specified duration.
*/
protected sleep(ms: number): Promise<void> {
return new Promise(resolve => setTimeout(resolve, ms));
}
/**
* Revalidate cache in background.
*/
protected async revalidateInBackground(
context: IUpstreamFetchContext,
upstreams: IUpstreamRegistryConfig[],
): Promise<void> {
try {
for (const upstream of upstreams) {
const breaker = this.circuitBreakers.get(upstream.id);
if (!breaker || !breaker.canRequest()) continue;
try {
const result = await withCircuitBreaker(
breaker,
() => this.fetchFromUpstream(upstream, context),
);
if (result.success && result.body) {
await this.cache.set(
context,
Buffer.isBuffer(result.body) ? result.body : Buffer.from(JSON.stringify(result.body)),
result.headers['content-type'] || 'application/octet-stream',
result.headers,
upstream.id,
upstream.url,
);
return; // Successfully revalidated
}
} catch {
// Continue to next upstream
}
}
} catch (error) {
this.logger.log('debug', `Background revalidation failed: ${(error as Error).message}`);
}
}
}
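The scope-rule semantics in `matchesScopeRules` (empty rules match everything; otherwise last matching rule wins, defaulting to exclude) can be sketched standalone. The real class delegates pattern matching to minimatch; this simplified re-implementation converts a basic `*` glob to a RegExp instead:

```typescript
// Simplified re-implementation of matchesScopeRules for illustration only.
interface IScopeRule {
  pattern: string;
  action: 'include' | 'exclude';
}

// Convert a basic '*' glob to a RegExp (stand-in for minimatch).
function globToRegExp(pattern: string): RegExp {
  const escaped = pattern
    .replace(/[.+^${}()|[\]\\]/g, '\\$&')
    .replace(/\*/g, '.*');
  return new RegExp(`^${escaped}$`);
}

function matchesScopeRules(resource: string, rules?: IScopeRule[]): boolean {
  if (!rules || rules.length === 0) return true; // empty rules match everything
  let matched = false; // default: excluded
  for (const rule of rules) {
    if (globToRegExp(rule.pattern).test(resource)) {
      matched = rule.action === 'include'; // last matching rule wins
    }
  }
  return matched;
}

const rules: IScopeRule[] = [
  { pattern: '@mycorp/*', action: 'include' },
  { pattern: '@mycorp/internal-*', action: 'exclude' },
];
console.log(matchesScopeRules('@mycorp/utils', rules));         // → true
console.log(matchesScopeRules('@mycorp/internal-auth', rules)); // → false
console.log(matchesScopeRules('lodash', rules));                // → false
```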

View File

@@ -0,0 +1,238 @@
import type { TCircuitState, IUpstreamResilienceConfig } from './interfaces.upstream.js';
import { DEFAULT_RESILIENCE_CONFIG } from './interfaces.upstream.js';
/**
* Circuit breaker implementation for upstream resilience.
*
* States:
* - CLOSED: Normal operation, requests pass through
* - OPEN: Circuit is tripped, requests fail fast
* - HALF_OPEN: Testing if upstream has recovered
*
* Transitions:
* - CLOSED → OPEN: When failure count exceeds threshold
* - OPEN → HALF_OPEN: After reset timeout expires
* - HALF_OPEN → CLOSED: On successful request
* - HALF_OPEN → OPEN: On failed request
*/
export class CircuitBreaker {
/** Unique identifier for logging and metrics */
public readonly id: string;
/** Current circuit state */
private state: TCircuitState = 'CLOSED';
/** Count of consecutive failures */
private failureCount: number = 0;
/** Timestamp when circuit was opened */
private openedAt: number = 0;
/** Number of successful requests in half-open state */
private halfOpenSuccesses: number = 0;
/** Configuration */
private readonly config: IUpstreamResilienceConfig;
/** Number of successes required to close circuit from half-open */
private readonly halfOpenThreshold: number = 2;
constructor(id: string, config?: Partial<IUpstreamResilienceConfig>) {
this.id = id;
this.config = { ...DEFAULT_RESILIENCE_CONFIG, ...config };
}
/**
* Get current circuit state.
*/
public getState(): TCircuitState {
// Check if we should transition from OPEN to HALF_OPEN
if (this.state === 'OPEN') {
const elapsed = Date.now() - this.openedAt;
if (elapsed >= this.config.circuitBreakerResetMs) {
this.transitionTo('HALF_OPEN');
}
}
return this.state;
}
/**
* Check if circuit allows requests.
* Returns true if requests should be allowed.
*/
public canRequest(): boolean {
const currentState = this.getState();
return currentState !== 'OPEN';
}
/**
* Record a successful request.
* May transition circuit from HALF_OPEN to CLOSED.
*/
public recordSuccess(): void {
if (this.state === 'HALF_OPEN') {
this.halfOpenSuccesses++;
if (this.halfOpenSuccesses >= this.halfOpenThreshold) {
this.transitionTo('CLOSED');
}
} else if (this.state === 'CLOSED') {
// Reset failure count on success
this.failureCount = 0;
}
}
/**
* Record a failed request.
* May transition circuit from CLOSED/HALF_OPEN to OPEN.
*/
public recordFailure(): void {
if (this.state === 'HALF_OPEN') {
// Any failure in half-open immediately opens circuit
this.transitionTo('OPEN');
} else if (this.state === 'CLOSED') {
this.failureCount++;
if (this.failureCount >= this.config.circuitBreakerThreshold) {
this.transitionTo('OPEN');
}
}
}
/**
* Force circuit to open state.
* Useful for manual intervention or external health checks.
*/
public forceOpen(): void {
this.transitionTo('OPEN');
}
/**
* Force circuit to closed state.
* Useful for manual intervention after fixing upstream issues.
*/
public forceClose(): void {
this.transitionTo('CLOSED');
}
/**
* Reset circuit to initial state.
*/
public reset(): void {
this.state = 'CLOSED';
this.failureCount = 0;
this.openedAt = 0;
this.halfOpenSuccesses = 0;
}
/**
* Get circuit metrics for monitoring.
*/
public getMetrics(): ICircuitBreakerMetrics {
return {
id: this.id,
state: this.getState(),
failureCount: this.failureCount,
openedAt: this.openedAt > 0 ? new Date(this.openedAt) : null,
timeUntilHalfOpen: this.state === 'OPEN'
? Math.max(0, this.config.circuitBreakerResetMs - (Date.now() - this.openedAt))
: 0,
halfOpenSuccesses: this.halfOpenSuccesses,
threshold: this.config.circuitBreakerThreshold,
resetMs: this.config.circuitBreakerResetMs,
};
}
/**
* Transition to a new state with proper cleanup.
*/
private transitionTo(newState: TCircuitState): void {
const previousState = this.state;
this.state = newState;
switch (newState) {
case 'OPEN':
this.openedAt = Date.now();
this.halfOpenSuccesses = 0;
break;
case 'HALF_OPEN':
this.halfOpenSuccesses = 0;
break;
case 'CLOSED':
this.failureCount = 0;
this.openedAt = 0;
this.halfOpenSuccesses = 0;
break;
}
// Log state transition (useful for debugging and monitoring)
// In production, this would emit events or metrics
if (previousState !== newState) {
// State changed - could emit event here
}
}
}
/**
* Metrics for circuit breaker monitoring.
*/
export interface ICircuitBreakerMetrics {
/** Circuit breaker identifier */
id: string;
/** Current state */
state: TCircuitState;
/** Number of consecutive failures */
failureCount: number;
/** When circuit was opened (null if never opened) */
openedAt: Date | null;
/** Milliseconds until circuit transitions to half-open (0 if not open) */
timeUntilHalfOpen: number;
/** Number of successes in half-open state */
halfOpenSuccesses: number;
/** Failure threshold for opening circuit */
threshold: number;
/** Reset timeout in milliseconds */
resetMs: number;
}
/**
* Execute a function with circuit breaker protection.
*
* @param breaker The circuit breaker to use
* @param fn The async function to execute
* @param fallback Optional fallback function when circuit is open
* @returns The result of fn or fallback
* @throws CircuitOpenError if circuit is open and no fallback provided
*/
export async function withCircuitBreaker<T>(
breaker: CircuitBreaker,
fn: () => Promise<T>,
fallback?: () => Promise<T>,
): Promise<T> {
if (!breaker.canRequest()) {
if (fallback) {
return fallback();
}
throw new CircuitOpenError(breaker.id);
}
try {
const result = await fn();
breaker.recordSuccess();
return result;
} catch (error) {
breaker.recordFailure();
throw error;
}
}
/**
* Error thrown when circuit is open and no fallback is provided.
*/
export class CircuitOpenError extends Error {
public readonly circuitId: string;
constructor(circuitId: string) {
super(`Circuit breaker '${circuitId}' is open`);
this.name = 'CircuitOpenError';
this.circuitId = circuitId;
}
}
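The CLOSED → OPEN → HALF_OPEN cycle above can be exercised with a compact standalone re-implementation (for illustration only; unlike the real class, a single half-open success closes the circuit here, where `CircuitBreaker` requires two):

```typescript
// Minimal stand-in for the CircuitBreaker state machine above.
type TState = 'CLOSED' | 'OPEN' | 'HALF_OPEN';

class MiniBreaker {
  private state: TState = 'CLOSED';
  private failures = 0;
  private openedAt = 0;
  constructor(private threshold = 5, private resetMs = 30000) {}

  getState(): TState {
    // OPEN → HALF_OPEN once the reset timeout has elapsed
    if (this.state === 'OPEN' && Date.now() - this.openedAt >= this.resetMs) {
      this.state = 'HALF_OPEN';
    }
    return this.state;
  }
  canRequest(): boolean {
    return this.getState() !== 'OPEN';
  }
  recordFailure(): void {
    if (this.state === 'HALF_OPEN') { this.open(); return; } // any failure re-opens
    if (this.state === 'CLOSED' && ++this.failures >= this.threshold) this.open();
  }
  recordSuccess(): void {
    if (this.state === 'HALF_OPEN') { this.state = 'CLOSED'; this.failures = 0; }
    else if (this.state === 'CLOSED') this.failures = 0;
  }
  private open(): void { this.state = 'OPEN'; this.openedAt = Date.now(); }
}

const breaker = new MiniBreaker(5, 100);
for (let i = 0; i < 5; i++) breaker.recordFailure();
console.log(breaker.getState());   // → OPEN
console.log(breaker.canRequest()); // → false
```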

View File

@@ -0,0 +1,626 @@
import type {
ICacheEntry,
IUpstreamCacheConfig,
IUpstreamFetchContext,
} from './interfaces.upstream.js';
import { DEFAULT_CACHE_CONFIG } from './interfaces.upstream.js';
import type { IStorageBackend } from '../core/interfaces.core.js';
/**
* Cache metadata stored alongside cache entries.
*/
interface ICacheMetadata {
contentType: string;
headers: Record<string, string>;
cachedAt: string;
expiresAt?: string;
etag?: string;
upstreamId: string;
upstreamUrl: string;
}
/**
* S3-backed upstream cache with in-memory hot layer.
*
* Features:
* - TTL-based expiration
* - Stale-while-revalidate support
* - Negative caching (404s)
* - Content-type aware caching
* - ETag support for conditional requests
* - Multi-upstream support via URL-based cache paths
* - Persistent S3 storage with in-memory hot layer
*
* Cache paths are structured as:
* cache/{escaped-upstream-url}/{protocol}:{method}:{path}
*
* @example
* ```typescript
* // In-memory only (default)
* const cache = new UpstreamCache(config);
*
* // With S3 persistence
* const cache = new UpstreamCache(config, 10000, storage);
* ```
*/
export class UpstreamCache {
/** In-memory hot cache */
private readonly memoryCache: Map<string, ICacheEntry> = new Map();
/** Configuration */
private readonly config: IUpstreamCacheConfig;
/** Maximum in-memory cache entries */
private readonly maxMemoryEntries: number;
/** S3 storage backend (optional) */
private readonly storage?: IStorageBackend;
/** Cleanup interval handle */
private cleanupInterval: ReturnType<typeof setInterval> | null = null;
constructor(
config?: Partial<IUpstreamCacheConfig>,
maxMemoryEntries: number = 10000,
storage?: IStorageBackend
) {
this.config = { ...DEFAULT_CACHE_CONFIG, ...config };
this.maxMemoryEntries = maxMemoryEntries;
this.storage = storage;
// Start periodic cleanup if caching is enabled
if (this.config.enabled) {
this.startCleanup();
}
}
/**
* Check if caching is enabled.
*/
public isEnabled(): boolean {
return this.config.enabled;
}
/**
* Check if S3 storage is configured.
*/
public hasStorage(): boolean {
return !!this.storage;
}
/**
* Get cached entry for a request context.
* Checks memory first, then falls back to S3.
* Returns null if not found or expired (unless stale-while-revalidate).
*/
public async get(context: IUpstreamFetchContext, upstreamUrl?: string): Promise<ICacheEntry | null> {
if (!this.config.enabled) {
return null;
}
const key = this.buildCacheKey(context, upstreamUrl);
// Check memory cache first
let entry = this.memoryCache.get(key);
// If not in memory and we have storage, check S3
if (!entry && this.storage) {
entry = (await this.loadFromStorage(key)) ?? undefined; // null → undefined to match Map.get()
if (entry) {
// Promote to memory cache
this.memoryCache.set(key, entry);
}
}
if (!entry) {
return null;
}
const now = new Date();
// Check if entry is expired
if (entry.expiresAt && entry.expiresAt < now) {
// Check if we can serve stale content
if (this.config.staleWhileRevalidate && !entry.stale) {
const staleAge = (now.getTime() - entry.expiresAt.getTime()) / 1000;
if (staleAge <= this.config.staleMaxAgeSeconds) {
// Mark as stale and return
entry.stale = true;
return entry;
}
}
// Entry is too old, remove it
this.memoryCache.delete(key);
if (this.storage) {
await this.deleteFromStorage(key).catch(() => {});
}
return null;
}
return entry;
}
/**
* Store a response in the cache (memory and optionally S3).
*/
public async set(
context: IUpstreamFetchContext,
data: Buffer,
contentType: string,
headers: Record<string, string>,
upstreamId: string,
upstreamUrl: string,
options?: ICacheSetOptions,
): Promise<void> {
if (!this.config.enabled) {
return;
}
// Enforce max memory entries limit
if (this.memoryCache.size >= this.maxMemoryEntries) {
this.evictOldest();
}
const key = this.buildCacheKey(context, upstreamUrl);
const now = new Date();
// Determine TTL based on content type
const ttlSeconds = options?.ttlSeconds ?? this.determineTtl(context, contentType, headers);
const entry: ICacheEntry = {
data,
contentType,
headers,
cachedAt: now,
expiresAt: ttlSeconds > 0 ? new Date(now.getTime() + ttlSeconds * 1000) : undefined,
etag: headers['etag'] || options?.etag,
upstreamId,
stale: false,
};
// Store in memory
this.memoryCache.set(key, entry);
// Store in S3 if available
if (this.storage) {
await this.saveToStorage(key, entry, upstreamUrl).catch(() => {});
}
}
/**
* Store a negative cache entry (404 response).
*/
public async setNegative(context: IUpstreamFetchContext, upstreamId: string, upstreamUrl: string): Promise<void> {
if (!this.config.enabled || this.config.negativeCacheTtlSeconds <= 0) {
return;
}
const key = this.buildCacheKey(context, upstreamUrl);
const now = new Date();
const entry: ICacheEntry = {
data: Buffer.from(''),
contentType: 'application/octet-stream',
headers: {},
cachedAt: now,
expiresAt: new Date(now.getTime() + this.config.negativeCacheTtlSeconds * 1000),
upstreamId,
stale: false,
};
this.memoryCache.set(key, entry);
if (this.storage) {
await this.saveToStorage(key, entry, upstreamUrl).catch(() => {});
}
}
/**
* Check if there's a negative cache entry for this context.
*/
public async hasNegative(context: IUpstreamFetchContext, upstreamUrl?: string): Promise<boolean> {
const entry = await this.get(context, upstreamUrl);
return entry !== null && entry.data.length === 0;
}
/**
* Invalidate a specific cache entry.
*/
public async invalidate(context: IUpstreamFetchContext, upstreamUrl?: string): Promise<boolean> {
const key = this.buildCacheKey(context, upstreamUrl);
const deleted = this.memoryCache.delete(key);
if (this.storage) {
await this.deleteFromStorage(key).catch(() => {});
}
return deleted;
}
/**
* Invalidate all entries matching a pattern.
* Useful for invalidating all versions of a package.
*/
public async invalidatePattern(pattern: RegExp): Promise<number> {
let count = 0;
for (const key of this.memoryCache.keys()) {
if (pattern.test(key)) {
this.memoryCache.delete(key);
if (this.storage) {
await this.deleteFromStorage(key).catch(() => {});
}
count++;
}
}
return count;
}
/**
* Invalidate all entries from a specific upstream.
*/
public async invalidateUpstream(upstreamId: string): Promise<number> {
let count = 0;
for (const [key, entry] of this.memoryCache.entries()) {
if (entry.upstreamId === upstreamId) {
this.memoryCache.delete(key);
if (this.storage) {
await this.deleteFromStorage(key).catch(() => {});
}
count++;
}
}
return count;
}
/**
* Clear all cache entries (memory and S3).
*/
public async clear(): Promise<void> {
this.memoryCache.clear();
// Note: S3 cleanup would require listing and deleting all cache/* objects
// This is left as a future enhancement for bulk cleanup
}
/**
* Get cache statistics.
*/
public getStats(): ICacheStats {
let freshCount = 0;
let staleCount = 0;
let negativeCount = 0;
let totalSize = 0;
const now = new Date();
for (const entry of this.memoryCache.values()) {
totalSize += entry.data.length;
if (entry.data.length === 0) {
negativeCount++;
} else if (entry.stale || (entry.expiresAt && entry.expiresAt < now)) {
staleCount++;
} else {
freshCount++;
}
}
return {
totalEntries: this.memoryCache.size,
freshEntries: freshCount,
staleEntries: staleCount,
negativeEntries: negativeCount,
totalSizeBytes: totalSize,
maxEntries: this.maxMemoryEntries,
enabled: this.config.enabled,
hasStorage: !!this.storage,
};
}
/**
* Stop the cache and cleanup.
*/
public stop(): void {
if (this.cleanupInterval) {
clearInterval(this.cleanupInterval);
this.cleanupInterval = null;
}
}
// ========================================================================
// Storage Methods
// ========================================================================
/**
* Build storage path for a cache key.
* Escapes upstream URL for safe use in S3 paths.
*/
private buildStoragePath(key: string): string {
return `cache/${key}`;
}
/**
* Build storage path for cache metadata.
*/
private buildMetadataPath(key: string): string {
return `cache/${key}.meta`;
}
/**
* Load a cache entry from S3 storage.
*/
private async loadFromStorage(key: string): Promise<ICacheEntry | null> {
if (!this.storage) return null;
try {
const dataPath = this.buildStoragePath(key);
const metaPath = this.buildMetadataPath(key);
// Load data and metadata in parallel
const [data, metaBuffer] = await Promise.all([
this.storage.getObject(dataPath),
this.storage.getObject(metaPath),
]);
if (!data || !metaBuffer) {
return null;
}
const meta: ICacheMetadata = JSON.parse(metaBuffer.toString('utf-8'));
return {
data,
contentType: meta.contentType,
headers: meta.headers,
cachedAt: new Date(meta.cachedAt),
expiresAt: meta.expiresAt ? new Date(meta.expiresAt) : undefined,
etag: meta.etag,
upstreamId: meta.upstreamId,
stale: false,
};
} catch {
return null;
}
}
/**
* Save a cache entry to S3 storage.
*/
private async saveToStorage(key: string, entry: ICacheEntry, upstreamUrl: string): Promise<void> {
if (!this.storage) return;
const dataPath = this.buildStoragePath(key);
const metaPath = this.buildMetadataPath(key);
const meta: ICacheMetadata = {
contentType: entry.contentType,
headers: entry.headers,
cachedAt: entry.cachedAt.toISOString(),
expiresAt: entry.expiresAt?.toISOString(),
etag: entry.etag,
upstreamId: entry.upstreamId,
upstreamUrl,
};
// Save data and metadata in parallel
await Promise.all([
this.storage.putObject(dataPath, entry.data),
this.storage.putObject(metaPath, Buffer.from(JSON.stringify(meta), 'utf-8')),
]);
}
/**
* Delete a cache entry from S3 storage.
*/
private async deleteFromStorage(key: string): Promise<void> {
if (!this.storage) return;
const dataPath = this.buildStoragePath(key);
const metaPath = this.buildMetadataPath(key);
await Promise.all([
this.storage.deleteObject(dataPath).catch(() => {}),
this.storage.deleteObject(metaPath).catch(() => {}),
]);
}
// ========================================================================
// Helper Methods
// ========================================================================
/**
* Escape a URL for safe use in storage paths.
*/
private escapeUrl(url: string): string {
// Remove protocol prefix and escape special characters
return url
.replace(/^https?:\/\//, '')
.replace(/[\/\\:*?"<>|]/g, '_')
.replace(/__+/g, '_');
}
/**
* Build a unique cache key for a request context.
* Includes escaped upstream URL for multi-upstream support.
*/
private buildCacheKey(context: IUpstreamFetchContext, upstreamUrl?: string): string {
// Include method, protocol, path, and sorted query params
const queryString = Object.keys(context.query)
.sort()
.map(k => `${k}=${context.query[k]}`)
.join('&');
const baseKey = `${context.protocol}:${context.method}:${context.path}${queryString ? '?' + queryString : ''}`;
if (upstreamUrl) {
return `${this.escapeUrl(upstreamUrl)}/${baseKey}`;
}
return baseKey;
}
/**
* Determine TTL based on content characteristics.
*/
private determineTtl(
context: IUpstreamFetchContext,
contentType: string,
headers: Record<string, string>,
): number {
// Check for Cache-Control header
const cacheControl = headers['cache-control'];
if (cacheControl) {
const maxAgeMatch = cacheControl.match(/max-age=(\d+)/);
if (maxAgeMatch) {
return parseInt(maxAgeMatch[1], 10);
}
if (cacheControl.includes('no-store') || cacheControl.includes('no-cache')) {
return 0;
}
}
// Check if content is immutable (content-addressable)
if (this.isImmutableContent(context, contentType)) {
return this.config.immutableTtlSeconds;
}
// Default TTL for mutable content
return this.config.defaultTtlSeconds;
}
/**
* Check if content is immutable (content-addressable).
*/
private isImmutableContent(context: IUpstreamFetchContext, contentType: string): boolean {
// OCI blobs with digest are immutable
if (context.protocol === 'oci' && context.resourceType === 'blob') {
return true;
}
// NPM tarballs are immutable (versioned)
if (context.protocol === 'npm' && context.resourceType === 'tarball') {
return true;
}
// Maven artifacts with version are immutable
if (context.protocol === 'maven' && context.resourceType === 'artifact') {
return true;
}
// Cargo crate files are immutable
if (context.protocol === 'cargo' && context.resourceType === 'crate') {
return true;
}
// Composer dist files are immutable
if (context.protocol === 'composer' && context.resourceType === 'dist') {
return true;
}
// PyPI package files are immutable
if (context.protocol === 'pypi' && context.resourceType === 'package') {
return true;
}
// RubyGems .gem files are immutable
if (context.protocol === 'rubygems' && context.resourceType === 'gem') {
return true;
}
return false;
}
/**
* Evict oldest entries to make room for new ones.
*/
private evictOldest(): void {
// Evict 10% of max entries
const evictCount = Math.ceil(this.maxMemoryEntries * 0.1);
let evicted = 0;
// First, try to evict stale entries
const now = new Date();
for (const [key, entry] of this.memoryCache.entries()) {
if (evicted >= evictCount) break;
if (entry.stale || (entry.expiresAt && entry.expiresAt < now)) {
this.memoryCache.delete(key);
evicted++;
}
}
// If not enough evicted, evict oldest by cachedAt
if (evicted < evictCount) {
const entries = Array.from(this.memoryCache.entries())
.sort((a, b) => a[1].cachedAt.getTime() - b[1].cachedAt.getTime());
for (const [key] of entries) {
if (evicted >= evictCount) break;
this.memoryCache.delete(key);
evicted++;
}
}
}
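The age-based fallback in `evictOldest` can be illustrated on a bare `Map`; this is a sketch of just the second phase (sort by `cachedAt`, delete oldest first), with a hypothetical entry shape.

```typescript
// Sketch of oldest-first eviction (mirrors evictOldest's fallback phase; names hypothetical).
interface IAgedEntry {
  cachedAt: Date;
}

function evictOldestEntries(cache: Map<string, IAgedEntry>, evictCount: number): string[] {
  const evictedKeys: string[] = [];
  // Sort ascending by cachedAt so the oldest entries come first.
  const entries = Array.from(cache.entries())
    .sort((a, b) => a[1].cachedAt.getTime() - b[1].cachedAt.getTime());
  for (const [key] of entries) {
    if (evictedKeys.length >= evictCount) break;
    cache.delete(key);
    evictedKeys.push(key);
  }
  return evictedKeys;
}
```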
/**
* Start periodic cleanup of expired entries.
*/
private startCleanup(): void {
// Run cleanup every minute
this.cleanupInterval = setInterval(() => {
this.cleanup();
}, 60000);
// Don't keep the process alive just for cleanup
if (this.cleanupInterval.unref) {
this.cleanupInterval.unref();
}
}
/**
* Remove all expired entries from memory cache.
*/
private cleanup(): void {
const now = new Date();
const staleDeadline = new Date(now.getTime() - this.config.staleMaxAgeSeconds * 1000);
for (const [key, entry] of this.memoryCache.entries()) {
if (entry.expiresAt) {
// Remove if past stale deadline
if (entry.expiresAt < staleDeadline) {
this.memoryCache.delete(key);
}
}
}
}
}
/**
* Options for cache set operation.
*/
export interface ICacheSetOptions {
/** Override TTL in seconds */
ttlSeconds?: number;
/** ETag for conditional requests */
etag?: string;
}
/**
* Cache statistics.
*/
export interface ICacheStats {
/** Total number of cached entries in memory */
totalEntries: number;
/** Number of fresh (non-expired) entries */
freshEntries: number;
/** Number of stale entries (expired but still usable) */
staleEntries: number;
/** Number of negative cache entries */
negativeEntries: number;
/** Total size of cached data in bytes (memory only) */
totalSizeBytes: number;
/** Maximum allowed memory entries */
maxEntries: number;
/** Whether caching is enabled */
enabled: boolean;
/** Whether S3 storage is configured */
hasStorage: boolean;
}

ts/upstream/index.ts Normal file

@@ -0,0 +1,11 @@
// Interfaces and types
export * from './interfaces.upstream.js';
// Classes
export { CircuitBreaker, CircuitOpenError, withCircuitBreaker } from './classes.circuitbreaker.js';
export type { ICircuitBreakerMetrics } from './classes.circuitbreaker.js';
export { UpstreamCache } from './classes.upstreamcache.js';
export type { ICacheSetOptions, ICacheStats } from './classes.upstreamcache.js';
export { BaseUpstream } from './classes.baseupstream.js';

ts/upstream/interfaces.upstream.ts Normal file

@@ -0,0 +1,195 @@
import type { TRegistryProtocol } from '../core/interfaces.core.js';
/**
* Scope rule for routing requests to specific upstreams.
* Uses glob patterns for flexible matching.
*/
export interface IUpstreamScopeRule {
/** Glob pattern (e.g., "@company/*", "com.example.*", "library/*") */
pattern: string;
/** Whether matching resources should be included or excluded */
action: 'include' | 'exclude';
}
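Scope rule patterns are globs; a minimal matcher can be sketched by widening `*` into `.*` after escaping regex metacharacters. Both helpers below are hypothetical, and the include/exclude policy shown (any matching exclude rejects, otherwise any matching include accepts, empty rule set matches all) is one plausible reading of the doc comments, not necessarily the real router's behavior.

```typescript
// Minimal glob matcher sketch (assumption: '*' matches any run of characters).
function matchesPattern(pattern: string, resource: string): boolean {
  // Escape regex metacharacters, then widen '*' into '.*'.
  const escaped = pattern.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
  const regex = new RegExp('^' + escaped.replace(/\*/g, '.*') + '$');
  return regex.test(resource);
}

// Hypothetical policy: excluded if any exclude rule matches, otherwise
// included if any include rule matches; an empty rule set matches everything.
function isResourceInScope(
  rules: { pattern: string; action: 'include' | 'exclude' }[],
  resource: string,
): boolean {
  if (rules.length === 0) return true;
  let included = false;
  for (const rule of rules) {
    if (!matchesPattern(rule.pattern, resource)) continue;
    if (rule.action === 'exclude') return false;
    included = true;
  }
  return included;
}
```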
/**
* Authentication configuration for an upstream registry.
* Supports multiple auth strategies.
*/
export interface IUpstreamAuthConfig {
/** Authentication type */
type: 'none' | 'basic' | 'bearer' | 'api-key';
/** Username for basic auth */
username?: string;
/** Password for basic auth */
password?: string;
/** Token for bearer or api-key auth */
token?: string;
/** Custom header name for api-key auth (default: 'Authorization') */
headerName?: string;
}
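A sketch of turning an `IUpstreamAuthConfig`-shaped object into request headers (the helper is hypothetical; Node's `Buffer` is assumed for base64, and header names are normalized to lowercase here to match the `Record<string, string>` headers used elsewhere in this module).

```typescript
// Hypothetical helper: derive outgoing request headers from an auth config.
function buildAuthHeaders(auth: {
  type: 'none' | 'basic' | 'bearer' | 'api-key';
  username?: string;
  password?: string;
  token?: string;
  headerName?: string;
}): Record<string, string> {
  switch (auth.type) {
    case 'basic': {
      // Basic auth: base64("user:pass") per RFC 7617 (Node's Buffer assumed).
      const encoded = Buffer.from(`${auth.username}:${auth.password}`).toString('base64');
      return { authorization: `Basic ${encoded}` };
    }
    case 'bearer':
      return { authorization: `Bearer ${auth.token}` };
    case 'api-key':
      // Custom header name defaults to 'Authorization' per the doc comment above;
      // lowercasing is a normalization choice made in this sketch.
      return { [(auth.headerName ?? 'Authorization').toLowerCase()]: auth.token ?? '' };
    default:
      return {};
  }
}
```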
/**
* Cache configuration for upstream content.
*/
export interface IUpstreamCacheConfig {
/** Whether caching is enabled */
enabled: boolean;
/** Default TTL in seconds for mutable content (default: 300 = 5 min) */
defaultTtlSeconds: number;
/** TTL in seconds for immutable/content-addressable content (default: 2592000 = 30 days) */
immutableTtlSeconds: number;
/** Whether to serve stale content while revalidating in background */
staleWhileRevalidate: boolean;
/** Maximum age in seconds for stale content (default: 3600 = 1 hour) */
staleMaxAgeSeconds: number;
/** TTL in seconds for negative cache entries (404s) (default: 60 = 1 min) */
negativeCacheTtlSeconds: number;
}
/**
* Resilience configuration for upstream requests.
*/
export interface IUpstreamResilienceConfig {
/** Request timeout in milliseconds (default: 30000) */
timeoutMs: number;
/** Maximum number of retry attempts (default: 3) */
maxRetries: number;
/** Initial retry delay in milliseconds (default: 1000) */
retryDelayMs: number;
/** Maximum retry delay in milliseconds (default: 30000) */
retryMaxDelayMs: number;
/** Number of failures before circuit breaker opens (default: 5) */
circuitBreakerThreshold: number;
/** Time in milliseconds before circuit breaker attempts reset (default: 30000) */
circuitBreakerResetMs: number;
}
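The `retryDelayMs`/`retryMaxDelayMs` pair suggests capped exponential backoff; a sketch follows. The doubling schedule itself is an assumption — only the field names come from the interface above.

```typescript
// Sketch: capped exponential backoff derived from IUpstreamResilienceConfig fields.
// attempt 0 -> retryDelayMs, attempt 1 -> 2x, attempt 2 -> 4x, ... capped at retryMaxDelayMs.
function retryDelay(attempt: number, retryDelayMs: number, retryMaxDelayMs: number): number {
  return Math.min(retryDelayMs * 2 ** attempt, retryMaxDelayMs);
}
```

With the documented defaults (1000 ms initial, 30000 ms cap), the schedule is 1s, 2s, 4s, … until the cap.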
/**
* Configuration for a single upstream registry.
*/
export interface IUpstreamRegistryConfig {
/** Unique identifier for this upstream */
id: string;
/** Human-readable name */
name: string;
/** Base URL of the upstream registry (e.g., "https://registry.npmjs.org") */
url: string;
/** Priority for routing (lower = higher priority, 1 = first) */
priority: number;
/** Whether this upstream is enabled */
enabled: boolean;
/** Scope rules for routing (empty = match all) */
scopeRules?: IUpstreamScopeRule[];
/** Authentication configuration */
auth: IUpstreamAuthConfig;
/** Cache configuration overrides */
cache?: Partial<IUpstreamCacheConfig>;
/** Resilience configuration overrides */
resilience?: Partial<IUpstreamResilienceConfig>;
}
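An example object in the shape of `IUpstreamRegistryConfig` (values are illustrative; the npmjs URL comes from the field's own doc comment, and the scope rule shown is a made-up private scope).

```typescript
// Illustrative upstream configuration; field values are examples, not recommendations.
const npmjsUpstream = {
  id: 'npmjs',
  name: 'npm public registry',
  url: 'https://registry.npmjs.org',
  priority: 1, // lower = higher priority, so this upstream is consulted first
  enabled: true,
  scopeRules: [
    { pattern: '@internal/*', action: 'exclude' as const }, // never proxy this private scope
  ],
  auth: { type: 'none' as const },
  cache: { defaultTtlSeconds: 300 }, // per-upstream override of the protocol default
};
```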
/**
* Protocol-level upstream configuration.
* Configures upstream behavior for a specific protocol (npm, oci, etc.)
*/
export interface IProtocolUpstreamConfig {
/** Whether upstream is enabled for this protocol */
enabled: boolean;
/** List of upstream registries, ordered by priority */
upstreams: IUpstreamRegistryConfig[];
/** Protocol-level cache configuration defaults */
cache?: Partial<IUpstreamCacheConfig>;
/** Protocol-level resilience configuration defaults */
resilience?: Partial<IUpstreamResilienceConfig>;
}
/**
* Result of an upstream fetch operation.
*/
export interface IUpstreamResult {
/** Whether the fetch was successful (2xx status) */
success: boolean;
/** HTTP status code */
status: number;
/** Response headers */
headers: Record<string, string>;
/** Response body (Buffer for binary, object for JSON) */
body?: Buffer | any;
/** ID of the upstream that served the request */
upstreamId: string;
/** Whether the response was served from cache */
fromCache: boolean;
/** Request latency in milliseconds */
latencyMs: number;
}
/**
* Circuit breaker state.
*/
export type TCircuitState = 'CLOSED' | 'OPEN' | 'HALF_OPEN';
/**
* Context for an upstream fetch request.
*/
export interface IUpstreamFetchContext {
/** Protocol type */
protocol: TRegistryProtocol;
/** Resource identifier (package name, artifact name, etc.) */
resource: string;
/** Type of resource being fetched (packument, tarball, manifest, blob, etc.) */
resourceType: string;
/** Original request path */
path: string;
/** HTTP method */
method: string;
/** Request headers */
headers: Record<string, string>;
/** Query parameters */
query: Record<string, string>;
}
/**
* Cache entry stored in the upstream cache.
*/
export interface ICacheEntry {
/** Cached data */
data: Buffer;
/** Content type of the cached data */
contentType: string;
/** Original response headers */
headers: Record<string, string>;
/** When the entry was cached */
cachedAt: Date;
/** When the entry expires */
expiresAt?: Date;
/** ETag for conditional requests */
etag?: string;
/** ID of the upstream that provided the data */
upstreamId: string;
/** Whether the entry is stale but still usable */
stale?: boolean;
}
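An entry's freshness can be decided from `expiresAt` plus the stale-while-revalidate window; this hypothetical helper classifies an `ICacheEntry`-shaped record as fresh, stale (expired but still servable), or dead.

```typescript
// Hypothetical helper: classify a cache entry as 'fresh', 'stale', or 'dead'.
// 'stale' means past expiresAt but within the stale-while-revalidate window.
function classifyEntry(
  entry: { expiresAt?: Date },
  staleMaxAgeSeconds: number,
  now: Date = new Date(),
): 'fresh' | 'stale' | 'dead' {
  // No expiry recorded, or not yet expired: still fresh.
  if (!entry.expiresAt || entry.expiresAt > now) return 'fresh';
  const staleDeadline = new Date(entry.expiresAt.getTime() + staleMaxAgeSeconds * 1000);
  return staleDeadline > now ? 'stale' : 'dead';
}
```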
/**
* Default cache configuration values.
*/
export const DEFAULT_CACHE_CONFIG: IUpstreamCacheConfig = {
enabled: true,
defaultTtlSeconds: 300, // 5 minutes
immutableTtlSeconds: 2592000, // 30 days
staleWhileRevalidate: true,
staleMaxAgeSeconds: 3600, // 1 hour
negativeCacheTtlSeconds: 60, // 1 minute
};
/**
* Default resilience configuration values.
*/
export const DEFAULT_RESILIENCE_CONFIG: IUpstreamResilienceConfig = {
timeoutMs: 30000,
maxRetries: 3,
retryDelayMs: 1000,
retryMaxDelayMs: 30000,
circuitBreakerThreshold: 5,
circuitBreakerResetMs: 30000,
};
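Since per-upstream `cache` and `resilience` are `Partial<>` overrides layered on protocol-level defaults, the natural resolution is a spread merge where later layers win; a sketch, with the defaults inlined from `DEFAULT_CACHE_CONFIG` above (the merge order is an assumption).

```typescript
// Sketch: resolve an effective cache config from defaults -> protocol -> upstream layers.
// Values inlined from DEFAULT_CACHE_CONFIG; later layers win (assumption).
const CACHE_DEFAULTS = {
  enabled: true,
  defaultTtlSeconds: 300,
  immutableTtlSeconds: 2592000,
  staleWhileRevalidate: true,
  staleMaxAgeSeconds: 3600,
  negativeCacheTtlSeconds: 60,
};

function resolveCacheConfig(
  protocolOverrides?: Partial<typeof CACHE_DEFAULTS>,
  upstreamOverrides?: Partial<typeof CACHE_DEFAULTS>,
): typeof CACHE_DEFAULTS {
  return { ...CACHE_DEFAULTS, ...protocolOverrides, ...upstreamOverrides };
}
```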