Compare commits

6 Commits

| Author | SHA1 | Date |
|---|---|---|
|  | 351680159b |  |
|  | 0cabf284ed |  |
|  | dbc8566aad |  |
|  | bd64a7b140 |  |
|  | ae8dec9142 |  |
|  | 19da87a9df |  |
35 changelog.md

@@ -1,5 +1,40 @@
# Changelog

## 2025-11-27 - 2.6.0 - feat(core)

Add core registry infrastructure: storage, auth, upstream cache, and protocol handlers

- Introduce RegistryStorage: unified storage abstraction with hook support (before/after put/delete/get) and helpers for OCI, NPM, Maven, Cargo, Composer, PyPI, and RubyGems paths and operations
- Add DefaultAuthProvider and AuthManager: in-memory token store, UUID tokens for package protocols, OCI JWT creation/validation, token lifecycle (create/validate/revoke), and authorization checking
- Add SmartRegistry orchestrator to initialize and route requests to protocol handlers (OCI, NPM, Maven, Cargo, Composer, PyPI, RubyGems)
- Implement upstream subsystem: UpstreamCache (in-memory + optional S3 persistence), BaseUpstream with multi-upstream routing, scope rules, retries, TTLs, stale-while-revalidate, and negative caching
- Add circuit breaker implementation for upstream resilience with exponential backoff and per-upstream breakers (see the sketch after this list)
- Add protocol implementations and helpers: NpmRegistry/NpmUpstream (packument/tarball handling and tarball URL rewriting), PypiRegistry (PEP 503/691 support, uploads, metadata), MavenRegistry (artifact/metadata handling and checksum generation), CargoRegistry (sparse index, publish/download/yank)
- Utility exports and helpers: buffer helpers, plugins aggregator, path helpers, and various protocol-specific helper modules
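
A minimal, hypothetical sketch of the per-upstream circuit breaker idea mentioned in the list above (class name, thresholds, and backoff constants are assumptions for illustration, not the library's actual API):

```typescript
// Sketch only: one breaker instance would be kept per upstream id.
class SimpleCircuitBreaker {
  private failures = 0;
  private openedAt = 0;

  constructor(
    private readonly failureThreshold = 5,    // assumed default
    private readonly baseCooldownMs = 1_000,  // assumed default
    private readonly maxCooldownMs = 60_000,  // assumed cap
  ) {}

  /** True while the breaker is open and its current cooldown has not elapsed. */
  isOpen(): boolean {
    if (this.failures < this.failureThreshold) return false;
    // Exponential backoff: the cooldown doubles with each failure past the threshold.
    const exponent = this.failures - this.failureThreshold;
    const cooldownMs = Math.min(this.baseCooldownMs * 2 ** exponent, this.maxCooldownMs);
    return Date.now() - this.openedAt < cooldownMs;
  }

  recordSuccess(): void {
    this.failures = 0;
  }

  recordFailure(): void {
    this.failures++;
    if (this.failures >= this.failureThreshold) {
      this.openedAt = Date.now();
    }
  }
}
```

With one breaker per upstream, a fetch loop can skip any upstream whose breaker reports open and fall through to the next one by priority.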

## 2025-11-27 - 2.5.0 - feat(pypi,rubygems)

Add PyPI and RubyGems protocol implementations, upstream caching, and auth/storage improvements

- Implemented full PyPI support (PEP 503 Simple API HTML, PEP 691 JSON API, legacy upload handling, name normalization, hash verification, content negotiation, package/file storage, and metadata management); a normalization sketch follows this list.
- Implemented RubyGems support (compact index, /versions, /info, /names endpoints, gem upload, yank/unyank, platform handling, and file storage).
- Expanded RegistryStorage with protocol-specific helpers for OCI, NPM, Maven, Cargo, Composer, PyPI, and RubyGems (get/put/delete/list helpers, metadata handling, context-aware hooks).
- Added AuthManager and DefaultAuthProvider improvements: unified token creation/validation for multiple protocols (npm, oci, maven, composer, cargo, pypi, rubygems) and OCI JWT support.
- Added upstream infrastructure: BaseUpstream, UpstreamCache (optionally S3-backed, stale-while-revalidate, negative caching), circuit breaker with retries/backoff and resilience defaults.
- Added various protocol registries (NPM, Maven, Cargo, OCI, PyPI) with request routing, permission checks, and optional upstream proxying/caching.
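
The name normalization referenced in the PyPI bullet is defined by PEP 503: runs of `-`, `_`, and `.` collapse to a single `-` and the result is lowercased. A standalone sketch (the function name is illustrative, not necessarily the helper used in this repo):

```typescript
/**
 * Normalize a PyPI project name per PEP 503.
 * Example: normalizePypiName('Friendly_Bard.Project') === 'friendly-bard-project'
 */
function normalizePypiName(name: string): string {
  return name.replace(/[-_.]+/g, '-').toLowerCase();
}
```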

## 2025-11-27 - 2.4.0 - feat(core)

Add pluggable auth providers, storage hooks, multi-upstream cache awareness, and PyPI/RubyGems protocol implementations

- Introduce pluggable authentication: IAuthProvider interface and DefaultAuthProvider (in-memory) with OCI JWT support and UUID tokens.
- AuthManager now accepts a custom provider and delegates all auth operations (authenticate, validateToken, create/revoke tokens, authorize, listUserTokens).
- Add storage hooks (IStorageHooks) and hook contexts: beforePut/afterPut/afterGet/beforeDelete/afterDelete. RegistryStorage now supports hooks and context management (setContext/withContext) and invokes hooks around operations.
- RegistryStorage expanded with many protocol-specific helper methods (OCI, NPM, Maven, Cargo, Composer, PyPI, RubyGems) and improved S3/SmartBucket integration.
- Upstream improvements: BaseUpstream and UpstreamCache became multi-upstream aware (cache keys now include the upstream URL; see the sketch after this list); cache operations are async and support negative caching, stale-while-revalidate, ETag/metadata persistence, and an S3-backed storage layer.
- Circuit breaker, retry, resilience, and scope-rule routing enhancements for upstreams; upstream fetch logic updated to prefer the primary upstream for cache keys and background revalidation behavior.
- SmartRegistry API extended to accept a custom authProvider and storageHooks, and now wires RegistryStorage and AuthManager with those options. Core exports updated to expose auth and storage interfaces and DefaultAuthProvider.
- Add full PyPI (PEP 503/691, upload API) and RubyGems (Compact Index, API v1, uploads/yank/unyank, specs endpoints) registry implementations with parsing, upload/download, metadata management, and upstream proxying.
- Add utility helpers: binary buffer helpers (toBuffer/isBinaryData), pypi and rubygems helper modules, and numerous protocol-specific helpers and tests referenced in readme.hints.
- These changes are additive and designed to be backward compatible; bumping the minor version.
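
One way a multi-upstream-aware cache key could be composed (the bullet above only states that keys now include the upstream URL; the exact format used by UpstreamCache is not shown in this diff, so treat this as an assumption):

```typescript
/**
 * Illustrative only: scoping the key by upstream host keeps responses from
 * different upstreams for the same resource path from colliding in the cache.
 */
function buildUpstreamCacheKey(upstreamUrl: string, resourcePath: string): string {
  const host = new URL(upstreamUrl).host;
  return `upstream-cache/${host}/${resourcePath.replace(/^\/+/, '')}`;
}

// buildUpstreamCacheKey('https://registry.npmjs.org', '/lodash/-/lodash-4.17.21.tgz')
// => 'upstream-cache/registry.npmjs.org/lodash/-/lodash-4.17.21.tgz'
```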

## 2025-11-27 - 2.3.0 - feat(upstream)

Add upstream proxy/cache subsystem and integrate per-protocol upstreams

package.json

@@ -1,6 +1,6 @@
{
  "name": "@push.rocks/smartregistry",
-  "version": "2.3.0",
+  "version": "2.6.0",
  "private": false,
  "description": "A composable TypeScript library implementing OCI, NPM, Maven, Cargo, Composer, PyPI, and RubyGems registries for building unified container and package registries",
  "main": "dist_ts/index.js",

226 readme.md

@@ -4,7 +4,7 @@

## Issue Reporting and Security

-For reporting bugs, issues, or security vulnerabilities, please visit [community.foss.global/](https://community.foss.global/). This is the central community hub for all issue reporting. Developers who want to sign a contribution agreement and go through identification can also get a [code.foss.global/](https://code.foss.global/) account to submit Pull Requests directly.
+For reporting bugs, issues, or security vulnerabilities, please visit [community.foss.global/](https://community.foss.global/). This is the central community hub for all issue reporting. Developers who sign and comply with our contribution agreement and go through identification can also get a [code.foss.global/](https://code.foss.global/) account to submit Pull Requests directly.

## ✨ Features

@@ -82,6 +82,19 @@
- ✅ Dependency resolution
- ✅ Legacy API compatibility

### 🌐 Upstream Proxy & Caching
- **Multi-Upstream Support**: Configure multiple upstream registries per protocol with priority ordering
- **Scope-Based Routing**: Route specific packages/scopes to different upstreams (e.g., `@company/*` → private registry)
- **S3-Backed Cache**: Persistent caching using existing S3 storage with URL-based cache paths
- **Circuit Breaker**: Automatic failover with configurable thresholds
- **Stale-While-Revalidate**: Serve cached content while refreshing in the background
- **Content-Aware TTLs**: Different TTLs for immutable (tarballs) vs mutable (metadata) content

### 🔌 Enterprise Extensibility
- **Pluggable Auth Provider** (`IAuthProvider`): Integrate LDAP, OAuth, SSO, or custom auth systems
- **Storage Event Hooks** (`IStorageHooks`): Quota tracking, audit logging, virus scanning, cache invalidation
- **Request Actor Context**: Pass user/org info through requests for audit trails and rate limiting

## 📥 Installation

```bash

@@ -648,6 +661,217 @@ const canWrite = await authManager.authorize(
);
```

### 🌐 Upstream Proxy Configuration

```typescript
import { SmartRegistry, IRegistryConfig } from '@push.rocks/smartregistry';

const config: IRegistryConfig = {
  storage: { /* S3 config */ },
  auth: { /* Auth config */ },
  npm: {
    enabled: true,
    basePath: '/npm',
    upstream: {
      enabled: true,
      upstreams: [
        {
          id: 'company-private',
          name: 'Company Private NPM',
          url: 'https://npm.internal.company.com',
          priority: 1, // Lower = higher priority
          enabled: true,
          scopeRules: [
            { pattern: '@company/*', action: 'include' },
            { pattern: '@internal/*', action: 'include' },
          ],
          auth: { type: 'bearer', token: process.env.NPM_PRIVATE_TOKEN },
        },
        {
          id: 'npmjs',
          name: 'NPM Public Registry',
          url: 'https://registry.npmjs.org',
          priority: 10,
          enabled: true,
          scopeRules: [
            { pattern: '@company/*', action: 'exclude' },
            { pattern: '@internal/*', action: 'exclude' },
          ],
          auth: { type: 'none' },
          cache: { defaultTtlSeconds: 300 },
          resilience: { timeoutMs: 30000, maxRetries: 3 },
        },
      ],
      cache: { enabled: true, staleWhileRevalidate: true },
    },
  },
  oci: {
    enabled: true,
    basePath: '/oci',
    upstream: {
      enabled: true,
      upstreams: [
        {
          id: 'dockerhub',
          name: 'Docker Hub',
          url: 'https://registry-1.docker.io',
          priority: 1,
          enabled: true,
          auth: { type: 'none' },
        },
        {
          id: 'ghcr',
          name: 'GitHub Container Registry',
          url: 'https://ghcr.io',
          priority: 2,
          enabled: true,
          scopeRules: [{ pattern: 'ghcr.io/*', action: 'include' }],
          auth: { type: 'bearer', token: process.env.GHCR_TOKEN },
        },
      ],
    },
  },
};

const registry = new SmartRegistry(config);
await registry.init();

// Requests for @company/* packages go to private registry
// Other packages proxy through to npmjs.org with caching
```
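
The `staleWhileRevalidate: true` flag above is only configured, not shown in action. Conceptually, the cache keeps serving an expired entry while a background fetch refreshes it; the following is a minimal sketch of that pattern (the entry shape, TTL values, and helper names are assumptions, not this library's internals):

```typescript
interface ICacheEntrySketch {
  body: Buffer;
  storedAt: number; // epoch ms
  ttlMs: number;    // freshness window
  staleMs: number;  // additional window in which stale data may still be served
}

async function getWithStaleWhileRevalidate(
  cache: Map<string, ICacheEntrySketch>,
  key: string,
  fetchFresh: () => Promise<Buffer>,
): Promise<Buffer> {
  const entry = cache.get(key);
  const now = Date.now();

  // Fresh hit: serve directly.
  if (entry && now - entry.storedAt < entry.ttlMs) {
    return entry.body;
  }

  // Stale hit: answer immediately and refresh in the background.
  if (entry && now - entry.storedAt < entry.ttlMs + entry.staleMs) {
    void fetchFresh()
      .then((body) => cache.set(key, { ...entry, body, storedAt: Date.now() }))
      .catch(() => { /* keep serving the stale copy if the refresh fails */ });
    return entry.body;
  }

  // Miss or too stale: fetch synchronously and cache (TTLs here are arbitrary).
  const body = await fetchFresh();
  cache.set(key, { body, storedAt: now, ttlMs: 300_000, staleMs: 3_600_000 });
  return body;
}
```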

### 🔌 Custom Auth Provider

```typescript
import { SmartRegistry, IAuthProvider, IAuthToken, ICredentials, TRegistryProtocol } from '@push.rocks/smartregistry';

// Implement custom auth (e.g., LDAP, OAuth)
class LdapAuthProvider implements IAuthProvider {
  constructor(private ldapClient: LdapClient) {}

  async authenticate(credentials: ICredentials): Promise<string | null> {
    const result = await this.ldapClient.bind(credentials.username, credentials.password);
    return result.success ? credentials.username : null;
  }

  async validateToken(token: string, protocol?: TRegistryProtocol): Promise<IAuthToken | null> {
    const session = await this.sessionStore.get(token);
    if (!session) return null;
    return {
      userId: session.userId,
      scopes: session.scopes,
      readonly: session.readonly,
      created: session.created,
    };
  }

  async createToken(userId: string, protocol: TRegistryProtocol, options?: ITokenOptions): Promise<string> {
    const token = crypto.randomUUID();
    await this.sessionStore.set(token, { userId, protocol, ...options });
    return token;
  }

  async revokeToken(token: string): Promise<void> {
    await this.sessionStore.delete(token);
  }

  async authorize(token: IAuthToken | null, resource: string, action: string): Promise<boolean> {
    if (!token) return action === 'read'; // Anonymous read-only
    // Check LDAP groups, roles, etc.
    return this.checkPermissions(token.userId, resource, action);
  }
}

// Use custom provider
const registry = new SmartRegistry({
  ...config,
  authProvider: new LdapAuthProvider(ldapClient),
});
```

### 📊 Storage Hooks (Quota & Audit)

```typescript
import { SmartRegistry, IStorageHooks, IStorageHookContext } from '@push.rocks/smartregistry';

const storageHooks: IStorageHooks = {
  // Block uploads that exceed quota
  async beforePut(ctx: IStorageHookContext) {
    if (ctx.actor?.orgId) {
      const usage = await getStorageUsage(ctx.actor.orgId);
      const quota = await getQuota(ctx.actor.orgId);

      if (usage + (ctx.metadata?.size || 0) > quota) {
        return { allowed: false, reason: 'Storage quota exceeded' };
      }
    }
    return { allowed: true };
  },

  // Update usage tracking after successful upload
  async afterPut(ctx: IStorageHookContext) {
    if (ctx.actor?.orgId && ctx.metadata?.size) {
      await incrementUsage(ctx.actor.orgId, ctx.metadata.size);
    }

    // Audit log
    await auditLog.write({
      action: 'storage.put',
      key: ctx.key,
      protocol: ctx.protocol,
      actor: ctx.actor,
      timestamp: ctx.timestamp,
    });
  },

  // Prevent deletion of protected packages
  async beforeDelete(ctx: IStorageHookContext) {
    if (await isProtectedPackage(ctx.key)) {
      return { allowed: false, reason: 'Cannot delete protected package' };
    }
    return { allowed: true };
  },

  // Log all access for compliance
  async afterGet(ctx: IStorageHookContext) {
    await accessLog.write({
      action: 'storage.get',
      key: ctx.key,
      actor: ctx.actor,
      timestamp: ctx.timestamp,
    });
  },
};

const registry = new SmartRegistry({
  ...config,
  storageHooks,
});
```

### 👤 Request Actor Context

```typescript
// Pass actor information through requests for audit/quota tracking
const response = await registry.handleRequest({
  method: 'PUT',
  path: '/npm/my-package',
  headers: { 'Authorization': 'Bearer <token>' },
  query: {},
  body: packageData,
  actor: {
    userId: 'user123',
    tokenId: 'token-abc',
    ip: req.ip,
    userAgent: req.headers['user-agent'],
    orgId: 'org-456',
    sessionId: 'session-xyz',
  },
});

// Actor info is available in storage hooks for quota/audit
```

## ⚙️ Configuration

### Storage Configuration

test/helpers/registry.ts

@@ -3,7 +3,9 @@ import * as crypto from 'crypto';
import * as smartarchive from '@push.rocks/smartarchive';
import * as smartbucket from '@push.rocks/smartbucket';
import { SmartRegistry } from '../../ts/classes.smartregistry.js';
-import type { IRegistryConfig } from '../../ts/core/interfaces.core.js';
+import type { IRegistryConfig, IAuthToken, TRegistryProtocol } from '../../ts/core/interfaces.core.js';
+import type { IAuthProvider, ITokenOptions } from '../../ts/core/interfaces.auth.js';
+import type { IStorageHooks, IStorageHookContext, IBeforePutResult, IBeforeDeleteResult } from '../../ts/core/interfaces.storage.js';

const testQenv = new qenv.Qenv('./', './.nogit');

@@ -608,3 +610,228 @@ export function calculateRubyGemsChecksums(data: Buffer) {
    sha256: crypto.createHash('sha256').update(data).digest('hex'),
  };
}

// ============================================================================
// Enterprise Extensibility Test Helpers
// ============================================================================

/**
 * Create a mock auth provider for testing pluggable authentication.
 * Allows customizing behavior for different test scenarios.
 */
export function createMockAuthProvider(overrides?: Partial<IAuthProvider>): IAuthProvider {
  const tokens = new Map<string, IAuthToken>();

  return {
    init: async () => {},
    authenticate: async (credentials) => {
      // Default: always authenticate successfully
      return credentials.username;
    },
    validateToken: async (token, protocol) => {
      const stored = tokens.get(token);
      if (stored && (!protocol || stored.type === protocol)) {
        return stored;
      }
      // Mock token for tests
      if (token === 'valid-mock-token') {
        return {
          type: 'npm' as TRegistryProtocol,
          userId: 'mock-user',
          scopes: ['npm:*:*:*'],
        };
      }
      return null;
    },
    createToken: async (userId, protocol, options) => {
      const tokenId = `mock-${protocol}-${Date.now()}`;
      const authToken: IAuthToken = {
        type: protocol,
        userId,
        scopes: options?.scopes || [`${protocol}:*:*:*`],
        readonly: options?.readonly,
        expiresAt: options?.expiresIn ? new Date(Date.now() + options.expiresIn * 1000) : undefined,
      };
      tokens.set(tokenId, authToken);
      return tokenId;
    },
    revokeToken: async (token) => {
      tokens.delete(token);
    },
    authorize: async (token, resource, action) => {
      if (!token) return false;
      if (token.readonly && ['write', 'push', 'delete'].includes(action)) {
        return false;
      }
      return true;
    },
    listUserTokens: async (userId) => {
      const result: Array<{ key: string; readonly: boolean; created: string; protocol?: TRegistryProtocol }> = [];
      for (const [key, token] of tokens.entries()) {
        if (token.userId === userId) {
          result.push({
            key: `hash-${key.substring(0, 8)}`,
            readonly: token.readonly || false,
            created: new Date().toISOString(),
            protocol: token.type,
          });
        }
      }
      return result;
    },
    ...overrides,
  };
}

/**
 * Create test storage hooks that track all calls.
 * Useful for verifying hook invocation order and parameters.
 */
export function createTrackingHooks(options?: {
  beforePutAllowed?: boolean;
  beforeDeleteAllowed?: boolean;
  throwOnAfterPut?: boolean;
  throwOnAfterGet?: boolean;
}): {
  hooks: IStorageHooks;
  calls: Array<{ method: string; context: IStorageHookContext; timestamp: number }>;
} {
  const calls: Array<{ method: string; context: IStorageHookContext; timestamp: number }> = [];

  return {
    calls,
    hooks: {
      beforePut: async (ctx) => {
        calls.push({ method: 'beforePut', context: ctx, timestamp: Date.now() });
        return {
          allowed: options?.beforePutAllowed !== false,
          reason: options?.beforePutAllowed === false ? 'Blocked by test' : undefined,
        };
      },
      afterPut: async (ctx) => {
        calls.push({ method: 'afterPut', context: ctx, timestamp: Date.now() });
        if (options?.throwOnAfterPut) {
          throw new Error('Test error in afterPut');
        }
      },
      beforeDelete: async (ctx) => {
        calls.push({ method: 'beforeDelete', context: ctx, timestamp: Date.now() });
        return {
          allowed: options?.beforeDeleteAllowed !== false,
          reason: options?.beforeDeleteAllowed === false ? 'Blocked by test' : undefined,
        };
      },
      afterDelete: async (ctx) => {
        calls.push({ method: 'afterDelete', context: ctx, timestamp: Date.now() });
      },
      afterGet: async (ctx) => {
        calls.push({ method: 'afterGet', context: ctx, timestamp: Date.now() });
        if (options?.throwOnAfterGet) {
          throw new Error('Test error in afterGet');
        }
      },
    },
  };
}

/**
 * Create a blocking storage hooks implementation for quota testing.
 */
export function createQuotaHooks(maxSizeBytes: number): {
  hooks: IStorageHooks;
  currentUsage: { bytes: number };
} {
  const currentUsage = { bytes: 0 };

  return {
    currentUsage,
    hooks: {
      beforePut: async (ctx) => {
        const size = ctx.metadata?.size || 0;
        if (currentUsage.bytes + size > maxSizeBytes) {
          return { allowed: false, reason: `Quota exceeded: ${currentUsage.bytes + size} > ${maxSizeBytes}` };
        }
        return { allowed: true };
      },
      afterPut: async (ctx) => {
        currentUsage.bytes += ctx.metadata?.size || 0;
      },
      afterDelete: async (ctx) => {
        currentUsage.bytes -= ctx.metadata?.size || 0;
        if (currentUsage.bytes < 0) currentUsage.bytes = 0;
      },
    },
  };
}

/**
 * Create a SmartBucket storage backend for upstream cache testing.
 */
export async function createTestStorageBackend(): Promise<{
  storage: {
    getObject: (key: string) => Promise<Buffer | null>;
    putObject: (key: string, data: Buffer) => Promise<void>;
    deleteObject: (key: string) => Promise<void>;
    listObjects: (prefix: string) => Promise<string[]>;
  };
  bucket: smartbucket.Bucket;
  cleanup: () => Promise<void>;
}> {
  const s3AccessKey = await testQenv.getEnvVarOnDemand('S3_ACCESSKEY');
  const s3SecretKey = await testQenv.getEnvVarOnDemand('S3_SECRETKEY');
  const s3Endpoint = await testQenv.getEnvVarOnDemand('S3_ENDPOINT');
  const s3Port = await testQenv.getEnvVarOnDemand('S3_PORT');

  const s3 = new smartbucket.SmartBucket({
    accessKey: s3AccessKey || 'minioadmin',
    accessSecret: s3SecretKey || 'minioadmin',
    endpoint: s3Endpoint || 'localhost',
    port: parseInt(s3Port || '9000', 10),
    useSsl: false,
  });

  const testRunId = generateTestRunId();
  const bucketName = 'test-cache-' + testRunId.substring(0, 8);
  const bucket = await s3.createBucket(bucketName);

  const storage = {
    getObject: async (key: string): Promise<Buffer | null> => {
      try {
        const file = await bucket.fastGet({ path: key });
        if (!file) return null;
        const stream = await file.createReadStream();
        const chunks: Buffer[] = [];
        for await (const chunk of stream) {
          chunks.push(Buffer.from(chunk));
        }
        return Buffer.concat(chunks);
      } catch {
        return null;
      }
    },
    putObject: async (key: string, data: Buffer): Promise<void> => {
      await bucket.fastPut({ path: key, contents: data, overwrite: true });
    },
    deleteObject: async (key: string): Promise<void> => {
      await bucket.fastRemove({ path: key });
    },
    listObjects: async (prefix: string): Promise<string[]> => {
      const files = await bucket.fastList({ prefix });
      return files.map(f => f.name);
    },
  };

  const cleanup = async () => {
    try {
      const files = await bucket.fastList({});
      for (const file of files) {
        await bucket.fastRemove({ path: file.name });
      }
      await s3.removeBucket(bucketName);
    } catch {
      // Ignore cleanup errors
    }
  };

  return { storage, bucket, cleanup };
}

412 test/test.auth.provider.ts (new file)

@@ -0,0 +1,412 @@
import { expect, tap } from '@git.zone/tstest/tapbundle';
import { DefaultAuthProvider } from '../ts/core/classes.defaultauthprovider.js';
import { AuthManager } from '../ts/core/classes.authmanager.js';
import type { IAuthProvider } from '../ts/core/interfaces.auth.js';
import type { IAuthConfig, IAuthToken } from '../ts/core/interfaces.core.js';
import { createMockAuthProvider } from './helpers/registry.js';

// ============================================================================
// Test State
// ============================================================================

let provider: DefaultAuthProvider;
let authConfig: IAuthConfig;

// ============================================================================
// Setup
// ============================================================================

tap.test('setup: should create DefaultAuthProvider', async () => {
  authConfig = {
    jwtSecret: 'test-secret-key-for-jwt-signing',
    tokenStore: 'memory',
    npmTokens: { enabled: true },
    ociTokens: {
      enabled: true,
      realm: 'https://auth.example.com/token',
      service: 'test-registry',
    },
    mavenTokens: { enabled: true },
    cargoTokens: { enabled: true },
    composerTokens: { enabled: true },
    pypiTokens: { enabled: true },
    rubygemsTokens: { enabled: true },
  };

  provider = new DefaultAuthProvider(authConfig);
  await provider.init();
  expect(provider).toBeInstanceOf(DefaultAuthProvider);
});

// ============================================================================
// Authentication Tests
// ============================================================================

tap.test('authenticate: should authenticate new user (auto-registration)', async () => {
  const userId = await provider.authenticate({
    username: 'newuser',
    password: 'newpassword',
  });

  expect(userId).toEqual('newuser');
});

tap.test('authenticate: should authenticate existing user with correct password', async () => {
  // First registration
  await provider.authenticate({
    username: 'existinguser',
    password: 'correctpass',
  });

  // Second authentication with same credentials
  const userId = await provider.authenticate({
    username: 'existinguser',
    password: 'correctpass',
  });

  expect(userId).toEqual('existinguser');
});

tap.test('authenticate: should reject authentication with wrong password', async () => {
  // First registration
  await provider.authenticate({
    username: 'passworduser',
    password: 'originalpass',
  });

  // Attempt with wrong password
  const userId = await provider.authenticate({
    username: 'passworduser',
    password: 'wrongpass',
  });

  expect(userId).toBeNull();
});

// ============================================================================
// Token Creation Tests
// ============================================================================

tap.test('createToken: should create NPM token with correct scopes', async () => {
  const token = await provider.createToken('testuser', 'npm', {
    scopes: ['npm:package:*:*'],
  });

  expect(token).toBeTruthy();
  expect(typeof token).toEqual('string');

  // Validate the token
  const validated = await provider.validateToken(token, 'npm');
  expect(validated).toBeTruthy();
  expect(validated!.type).toEqual('npm');
  expect(validated!.userId).toEqual('testuser');
  expect(validated!.scopes).toContain('npm:package:*:*');
});

tap.test('createToken: should create Maven token', async () => {
  const token = await provider.createToken('mavenuser', 'maven', {
    readonly: true,
  });

  expect(token).toBeTruthy();

  const validated = await provider.validateToken(token, 'maven');
  expect(validated).toBeTruthy();
  expect(validated!.type).toEqual('maven');
  expect(validated!.readonly).toBeTrue();
});

tap.test('createToken: should create OCI JWT token with correct claims', async () => {
  const token = await provider.createToken('ociuser', 'oci', {
    scopes: ['oci:repository:myrepo:push', 'oci:repository:myrepo:pull'],
    expiresIn: 3600,
  });

  expect(token).toBeTruthy();
  // OCI tokens are JWTs (contain dots)
  expect(token.split('.').length).toEqual(3);

  const validated = await provider.validateToken(token, 'oci');
  expect(validated).toBeTruthy();
  expect(validated!.type).toEqual('oci');
  expect(validated!.userId).toEqual('ociuser');
  expect(validated!.scopes.length).toBeGreaterThan(0);
});

tap.test('createToken: should create token with expiration', async () => {
  const token = await provider.createToken('expiryuser', 'npm', {
    expiresIn: 60, // 60 seconds
  });

  const validated = await provider.validateToken(token, 'npm');
  expect(validated).toBeTruthy();
  expect(validated!.expiresAt).toBeTruthy();
  expect(validated!.expiresAt!.getTime()).toBeGreaterThan(Date.now());
});

// ============================================================================
// Token Validation Tests
// ============================================================================

tap.test('validateToken: should validate UUID token (NPM, Maven, etc.)', async () => {
  const npmToken = await provider.createToken('validateuser', 'npm');
  const validated = await provider.validateToken(npmToken);

  expect(validated).toBeTruthy();
  expect(validated!.type).toEqual('npm');
  expect(validated!.userId).toEqual('validateuser');
});

tap.test('validateToken: should validate OCI JWT token', async () => {
  const ociToken = await provider.createToken('ocivalidate', 'oci', {
    scopes: ['oci:repository:*:*'],
  });

  const validated = await provider.validateToken(ociToken, 'oci');

  expect(validated).toBeTruthy();
  expect(validated!.type).toEqual('oci');
  expect(validated!.userId).toEqual('ocivalidate');
});

tap.test('validateToken: should reject expired tokens', async () => {
  const token = await provider.createToken('expireduser', 'npm', {
    expiresIn: -1, // Already expired (in the past)
  });

  // The token should be created but will fail validation due to expiry
  const validated = await provider.validateToken(token, 'npm');

  // Token should be rejected because it's expired
  expect(validated).toBeNull();
});

tap.test('validateToken: should reject invalid token', async () => {
  const validated = await provider.validateToken('invalid-random-token');
  expect(validated).toBeNull();
});

tap.test('validateToken: should reject token with wrong protocol', async () => {
  const npmToken = await provider.createToken('protocoluser', 'npm');

  // Try to validate as Maven token
  const validated = await provider.validateToken(npmToken, 'maven');
  expect(validated).toBeNull();
});

// ============================================================================
// Token Revocation Tests
// ============================================================================

tap.test('revokeToken: should revoke tokens', async () => {
  const token = await provider.createToken('revokeuser', 'npm');

  // Verify token works before revocation
  let validated = await provider.validateToken(token);
  expect(validated).toBeTruthy();

  // Revoke the token
  await provider.revokeToken(token);

  // Token should no longer be valid
  validated = await provider.validateToken(token);
  expect(validated).toBeNull();
});

// ============================================================================
// Authorization Tests
// ============================================================================

tap.test('authorize: should authorize read actions for readonly tokens', async () => {
  const token = await provider.createToken('readonlyuser', 'npm', {
    readonly: true,
    scopes: ['npm:package:*:read'],
  });

  const validated = await provider.validateToken(token);

  const canRead = await provider.authorize(validated, 'npm:package:lodash', 'read');
  expect(canRead).toBeTrue();

  const canPull = await provider.authorize(validated, 'npm:package:lodash', 'pull');
  expect(canPull).toBeTrue();
});

tap.test('authorize: should deny write actions for readonly tokens', async () => {
  const token = await provider.createToken('readonlyuser2', 'npm', {
    readonly: true,
    scopes: ['npm:package:*:*'],
  });

  const validated = await provider.validateToken(token);

  const canWrite = await provider.authorize(validated, 'npm:package:lodash', 'write');
  expect(canWrite).toBeFalse();

  const canPush = await provider.authorize(validated, 'npm:package:lodash', 'push');
  expect(canPush).toBeFalse();

  const canDelete = await provider.authorize(validated, 'npm:package:lodash', 'delete');
  expect(canDelete).toBeFalse();
});

tap.test('authorize: should match scopes with wildcards', async () => {
  // The scope system uses literal * as wildcard, not glob patterns
  // npm:*:*:* means "all types, all names, all actions under npm"
  const token = await provider.createToken('wildcarduser', 'npm', {
    scopes: ['npm:*:*:*'],
  });

  const validated = await provider.validateToken(token);

  // Should match any npm resource with full wildcard scope
  const canAccessAnyPackage = await provider.authorize(validated, 'npm:package:lodash', 'read');
  expect(canAccessAnyPackage).toBeTrue();

  const canAccessScopedPackage = await provider.authorize(validated, 'npm:package:@myorg/foo', 'write');
  expect(canAccessScopedPackage).toBeTrue();
});

tap.test('authorize: should deny access with null token', async () => {
  const canAccess = await provider.authorize(null, 'npm:package:lodash', 'read');
  expect(canAccess).toBeFalse();
});

// ============================================================================
// List Tokens Tests
// ============================================================================

tap.test('listUserTokens: should list user tokens', async () => {
  // Create multiple tokens for the same user
  const userId = 'listtokenuser';
  await provider.createToken(userId, 'npm');
  await provider.createToken(userId, 'maven', { readonly: true });
  await provider.createToken(userId, 'cargo');

  const tokens = await provider.listUserTokens!(userId);

  expect(tokens.length).toBeGreaterThanOrEqual(3);

  // Check that tokens have expected properties
  for (const token of tokens) {
    expect(token.key).toBeTruthy();
    expect(typeof token.readonly).toEqual('boolean');
    expect(token.created).toBeTruthy();
  }

  // Verify we have different protocols
  const protocols = tokens.map(t => t.protocol);
  expect(protocols).toContain('npm');
  expect(protocols).toContain('maven');
  expect(protocols).toContain('cargo');
});

// ============================================================================
// AuthManager Integration Tests
// ============================================================================

tap.test('AuthManager: should accept custom IAuthProvider', async () => {
  const mockProvider = createMockAuthProvider({
    authenticate: async (credentials) => {
      if (credentials.username === 'custom' && credentials.password === 'pass') {
        return 'custom-user-id';
      }
      return null;
    },
  });

  const manager = new AuthManager(authConfig, mockProvider);
  await manager.init();

  // Use the custom provider
  const userId = await manager.authenticate({
    username: 'custom',
    password: 'pass',
  });

  expect(userId).toEqual('custom-user-id');

  // Wrong credentials should fail
  const failed = await manager.authenticate({
    username: 'custom',
    password: 'wrong',
  });

  expect(failed).toBeNull();
});

tap.test('AuthManager: should use default provider when none specified', async () => {
  const manager = new AuthManager(authConfig);
  await manager.init();

  // Should use DefaultAuthProvider internally
  const userId = await manager.authenticate({
    username: 'defaultuser',
    password: 'defaultpass',
  });

  expect(userId).toEqual('defaultuser');
});

tap.test('AuthManager: should delegate token creation to provider', async () => {
  let tokenCreated = false;
  const mockProvider = createMockAuthProvider({
    createToken: async (userId, protocol, options) => {
      tokenCreated = true;
      return `mock-token-${protocol}-${userId}`;
    },
  });

  const manager = new AuthManager(authConfig, mockProvider);
  await manager.init();

  const token = await manager.createNpmToken('delegateuser', false);

  expect(tokenCreated).toBeTrue();
  expect(token).toContain('mock-token-npm');
});

// ============================================================================
// Edge Cases
// ============================================================================

tap.test('edge: should handle concurrent token operations', async () => {
  const promises: Promise<string>[] = [];

  // Create 10 tokens concurrently
  for (let i = 0; i < 10; i++) {
    promises.push(provider.createToken(`concurrent-user-${i}`, 'npm'));
  }

  const tokens = await Promise.all(promises);

  // All tokens should be unique
  const uniqueTokens = new Set(tokens);
  expect(uniqueTokens.size).toEqual(10);

  // All tokens should be valid
  for (const token of tokens) {
    const validated = await provider.validateToken(token);
    expect(validated).toBeTruthy();
  }
});

tap.test('edge: should handle empty scopes', async () => {
  const token = await provider.createToken('emptyuser', 'npm', {
    scopes: [],
  });

  const validated = await provider.validateToken(token);
  expect(validated).toBeTruthy();
  // Even with empty scopes, token should be valid
});

// ============================================================================
// Cleanup
// ============================================================================

tap.test('cleanup', async () => {
  // No cleanup needed for in-memory provider
});

export default tap.start();

506 test/test.storage.hooks.ts (new file)

@@ -0,0 +1,506 @@
import { expect, tap } from '@git.zone/tstest/tapbundle';
import * as qenv from '@push.rocks/qenv';
import { RegistryStorage } from '../ts/core/classes.registrystorage.js';
import type { IStorageConfig } from '../ts/core/interfaces.core.js';
import type { IStorageHooks, IStorageHookContext } from '../ts/core/interfaces.storage.js';
import { createTrackingHooks, createQuotaHooks, generateTestRunId } from './helpers/registry.js';

const testQenv = new qenv.Qenv('./', './.nogit');

// ============================================================================
// Test State
// ============================================================================

let storage: RegistryStorage;
let storageConfig: IStorageConfig;
let testRunId: string;

// ============================================================================
// Setup
// ============================================================================

tap.test('setup: should create storage config', async () => {
  testRunId = generateTestRunId();

  const s3AccessKey = await testQenv.getEnvVarOnDemand('S3_ACCESSKEY');
  const s3SecretKey = await testQenv.getEnvVarOnDemand('S3_SECRETKEY');
  const s3Endpoint = await testQenv.getEnvVarOnDemand('S3_ENDPOINT');
  const s3Port = await testQenv.getEnvVarOnDemand('S3_PORT');

  storageConfig = {
    accessKey: s3AccessKey || 'minioadmin',
    accessSecret: s3SecretKey || 'minioadmin',
    endpoint: s3Endpoint || 'localhost',
    port: parseInt(s3Port || '9000', 10),
    useSsl: false,
    region: 'us-east-1',
    bucketName: `test-hooks-${testRunId}`,
  };

  expect(storageConfig.bucketName).toBeTruthy();
});

// ============================================================================
// beforePut Hook Tests
// ============================================================================

tap.test('beforePut: should be called before storage', async () => {
  const tracker = createTrackingHooks();

  storage = new RegistryStorage(storageConfig, tracker.hooks);
  await storage.init();

  // Set context and put object
  storage.setContext({
    protocol: 'npm',
    actor: { userId: 'testuser' },
    metadata: { packageName: 'test-package' },
  });

  await storage.putObject('test/beforeput-called.txt', Buffer.from('test data'));
  storage.clearContext();

  // Verify beforePut was called
  const beforePutCalls = tracker.calls.filter(c => c.method === 'beforePut');
  expect(beforePutCalls.length).toEqual(1);
  expect(beforePutCalls[0].context.operation).toEqual('put');
  expect(beforePutCalls[0].context.key).toEqual('test/beforeput-called.txt');
  expect(beforePutCalls[0].context.protocol).toEqual('npm');
});

tap.test('beforePut: returning {allowed: false} should block storage', async () => {
  const tracker = createTrackingHooks({ beforePutAllowed: false });

  const blockingStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await blockingStorage.init();

  blockingStorage.setContext({
    protocol: 'npm',
    actor: { userId: 'testuser' },
  });

  let errorThrown = false;
  try {
    await blockingStorage.putObject('test/should-not-exist.txt', Buffer.from('blocked data'));
  } catch (error) {
    errorThrown = true;
    expect((error as Error).message).toContain('Blocked by test');
  }

  blockingStorage.clearContext();

  expect(errorThrown).toBeTrue();

  // Verify object was NOT stored
  const result = await blockingStorage.getObject('test/should-not-exist.txt');
  expect(result).toBeNull();
});

// ============================================================================
// afterPut Hook Tests
// ============================================================================

tap.test('afterPut: should be called after successful storage', async () => {
  const tracker = createTrackingHooks();

  const trackedStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await trackedStorage.init();

  trackedStorage.setContext({
    protocol: 'maven',
    actor: { userId: 'maven-user' },
  });

  await trackedStorage.putObject('test/afterput-test.txt', Buffer.from('after put test'));
  trackedStorage.clearContext();

  // Give async hook time to complete
  await new Promise(resolve => setTimeout(resolve, 100));

  const afterPutCalls = tracker.calls.filter(c => c.method === 'afterPut');
  expect(afterPutCalls.length).toEqual(1);
  expect(afterPutCalls[0].context.operation).toEqual('put');
});

tap.test('afterPut: should receive correct metadata (size, key, protocol)', async () => {
  const tracker = createTrackingHooks();

  const metadataStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await metadataStorage.init();

  const testData = Buffer.from('metadata test data - some content here');

  metadataStorage.setContext({
    protocol: 'cargo',
    actor: { userId: 'cargo-user', ip: '192.168.1.100' },
    metadata: { packageName: 'my-crate', version: '1.0.0' },
  });

  await metadataStorage.putObject('test/metadata-test.txt', testData);
  metadataStorage.clearContext();

  await new Promise(resolve => setTimeout(resolve, 100));

  const afterPutCalls = tracker.calls.filter(c => c.method === 'afterPut');
  expect(afterPutCalls.length).toBeGreaterThanOrEqual(1);

  const call = afterPutCalls[afterPutCalls.length - 1];
  expect(call.context.metadata?.size).toEqual(testData.length);
  expect(call.context.key).toEqual('test/metadata-test.txt');
  expect(call.context.protocol).toEqual('cargo');
  expect(call.context.actor?.userId).toEqual('cargo-user');
  expect(call.context.actor?.ip).toEqual('192.168.1.100');
});

// ============================================================================
// beforeDelete Hook Tests
// ============================================================================

tap.test('beforeDelete: should be called before deletion', async () => {
  const tracker = createTrackingHooks();

  const deleteStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await deleteStorage.init();

  // First, store an object
  deleteStorage.setContext({ protocol: 'npm' });
  await deleteStorage.putObject('test/to-delete.txt', Buffer.from('delete me'));

  // Now delete it
  await deleteStorage.deleteObject('test/to-delete.txt');
  deleteStorage.clearContext();

  const beforeDeleteCalls = tracker.calls.filter(c => c.method === 'beforeDelete');
  expect(beforeDeleteCalls.length).toEqual(1);
  expect(beforeDeleteCalls[0].context.operation).toEqual('delete');
  expect(beforeDeleteCalls[0].context.key).toEqual('test/to-delete.txt');
});

tap.test('beforeDelete: returning {allowed: false} should block deletion', async () => {
  const tracker = createTrackingHooks({ beforeDeleteAllowed: false });

  const protectedStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await protectedStorage.init();

  // First store an object
  protectedStorage.setContext({ protocol: 'npm' });
  await protectedStorage.putObject('test/protected.txt', Buffer.from('protected data'));

  // Try to delete - should be blocked
  let errorThrown = false;
  try {
    await protectedStorage.deleteObject('test/protected.txt');
  } catch (error) {
    errorThrown = true;
    expect((error as Error).message).toContain('Blocked by test');
  }

  protectedStorage.clearContext();

  expect(errorThrown).toBeTrue();

  // Verify object still exists
  const result = await protectedStorage.getObject('test/protected.txt');
  expect(result).toBeTruthy();
});

// ============================================================================
// afterDelete Hook Tests
// ============================================================================

tap.test('afterDelete: should be called after successful deletion', async () => {
  const tracker = createTrackingHooks();

  const afterDeleteStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await afterDeleteStorage.init();

  afterDeleteStorage.setContext({ protocol: 'pypi' });
  await afterDeleteStorage.putObject('test/delete-tracked.txt', Buffer.from('to be deleted'));
  await afterDeleteStorage.deleteObject('test/delete-tracked.txt');
  afterDeleteStorage.clearContext();

  await new Promise(resolve => setTimeout(resolve, 100));

  const afterDeleteCalls = tracker.calls.filter(c => c.method === 'afterDelete');
  expect(afterDeleteCalls.length).toEqual(1);
  expect(afterDeleteCalls[0].context.operation).toEqual('delete');
});

// ============================================================================
// afterGet Hook Tests
// ============================================================================

tap.test('afterGet: should be called after reading object', async () => {
  const tracker = createTrackingHooks();

  const getStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await getStorage.init();

  // Store an object first
  getStorage.setContext({ protocol: 'rubygems' });
  await getStorage.putObject('test/read-test.txt', Buffer.from('read me'));

  // Clear calls to focus on the get
  tracker.calls.length = 0;

  // Read the object
  const data = await getStorage.getObject('test/read-test.txt');
  getStorage.clearContext();

  await new Promise(resolve => setTimeout(resolve, 100));

  expect(data).toBeTruthy();
  expect(data!.toString()).toEqual('read me');

  const afterGetCalls = tracker.calls.filter(c => c.method === 'afterGet');
  expect(afterGetCalls.length).toEqual(1);
  expect(afterGetCalls[0].context.operation).toEqual('get');
});

// ============================================================================
// Context Tests
// ============================================================================

tap.test('context: hooks should receive actor information', async () => {
  const tracker = createTrackingHooks();

  const actorStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await actorStorage.init();

  actorStorage.setContext({
    protocol: 'composer',
    actor: {
      userId: 'user-123',
      tokenId: 'token-abc',
      ip: '10.0.0.1',
      userAgent: 'composer/2.0',
      orgId: 'org-456',
      sessionId: 'session-xyz',
    },
  });

  await actorStorage.putObject('test/actor-test.txt', Buffer.from('actor test'));
  actorStorage.clearContext();

  const beforePutCall = tracker.calls.find(c => c.method === 'beforePut');
  expect(beforePutCall).toBeTruthy();
  expect(beforePutCall!.context.actor?.userId).toEqual('user-123');
  expect(beforePutCall!.context.actor?.tokenId).toEqual('token-abc');
  expect(beforePutCall!.context.actor?.ip).toEqual('10.0.0.1');
  expect(beforePutCall!.context.actor?.userAgent).toEqual('composer/2.0');
  expect(beforePutCall!.context.actor?.orgId).toEqual('org-456');
  expect(beforePutCall!.context.actor?.sessionId).toEqual('session-xyz');
});

tap.test('withContext: should set and clear context correctly', async () => {
  const tracker = createTrackingHooks();

  const contextStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await contextStorage.init();

  // Use withContext to ensure automatic cleanup
  await contextStorage.withContext(
    {
      protocol: 'oci',
      actor: { userId: 'oci-user' },
    },
    async () => {
      await contextStorage.putObject('test/with-context.txt', Buffer.from('context managed'));
    }
  );

  const call = tracker.calls.find(c => c.method === 'beforePut');
  expect(call).toBeTruthy();
  expect(call!.context.protocol).toEqual('oci');
  expect(call!.context.actor?.userId).toEqual('oci-user');
});

tap.test('withContext: should clear context even on error', async () => {
  const tracker = createTrackingHooks({ beforePutAllowed: false });

  const errorStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await errorStorage.init();

  let errorThrown = false;
  try {
    await errorStorage.withContext(
      {
        protocol: 'npm',
        actor: { userId: 'error-user' },
      },
      async () => {
        await errorStorage.putObject('test/error-context.txt', Buffer.from('will fail'));
      }
    );
  } catch {
    errorThrown = true;
  }

  expect(errorThrown).toBeTrue();

  // Verify context was cleared - next operation without context should work
  // (hooks won't be called without context)
  tracker.hooks.beforePut = async () => ({ allowed: true });
  await errorStorage.putObject('test/after-error.txt', Buffer.from('ok'));
});

// ============================================================================
// Graceful Degradation Tests
// ============================================================================

tap.test('graceful: hooks should not fail the operation if afterPut throws', async () => {
  const tracker = createTrackingHooks({ throwOnAfterPut: true });

  const gracefulStorage = new RegistryStorage(storageConfig, tracker.hooks);
  await gracefulStorage.init();

  gracefulStorage.setContext({ protocol: 'npm' });

  // This should NOT throw even though afterPut throws
  let errorThrown = false;
  try {
    await gracefulStorage.putObject('test/graceful-afterput.txt', Buffer.from('should succeed'));
  } catch {
    errorThrown = true;
  }

  gracefulStorage.clearContext();

  expect(errorThrown).toBeFalse();
|
||||||
|
|
||||||
|
// Verify object was stored
|
||||||
|
const data = await gracefulStorage.getObject('test/graceful-afterput.txt');
|
||||||
|
expect(data).toBeTruthy();
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('graceful: hooks should not fail the operation if afterGet throws', async () => {
|
||||||
|
const tracker = createTrackingHooks({ throwOnAfterGet: true });
|
||||||
|
|
||||||
|
const gracefulGetStorage = new RegistryStorage(storageConfig, tracker.hooks);
|
||||||
|
await gracefulGetStorage.init();
|
||||||
|
|
||||||
|
// Store first
|
||||||
|
gracefulGetStorage.setContext({ protocol: 'maven' });
|
||||||
|
await gracefulGetStorage.putObject('test/graceful-afterget.txt', Buffer.from('read me gracefully'));
|
||||||
|
|
||||||
|
// Read should succeed even though afterGet throws
|
||||||
|
let errorThrown = false;
|
||||||
|
try {
|
||||||
|
const data = await gracefulGetStorage.getObject('test/graceful-afterget.txt');
|
||||||
|
expect(data).toBeTruthy();
|
||||||
|
} catch {
|
||||||
|
errorThrown = true;
|
||||||
|
}
|
||||||
|
|
||||||
|
gracefulGetStorage.clearContext();
|
||||||
|
|
||||||
|
expect(errorThrown).toBeFalse();
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Quota Hooks Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('quota: should block storage when quota exceeded', async () => {
|
||||||
|
const maxSize = 100; // 100 bytes max
|
||||||
|
const quotaTracker = createQuotaHooks(maxSize);
|
||||||
|
|
||||||
|
const quotaStorage = new RegistryStorage(storageConfig, quotaTracker.hooks);
|
||||||
|
await quotaStorage.init();
|
||||||
|
|
||||||
|
quotaStorage.setContext({
|
||||||
|
protocol: 'npm',
|
||||||
|
actor: { userId: 'quota-user' },
|
||||||
|
});
|
||||||
|
|
||||||
|
// Store 50 bytes - should succeed
|
||||||
|
await quotaStorage.putObject('test/quota-1.txt', Buffer.from('x'.repeat(50)));
|
||||||
|
expect(quotaTracker.currentUsage.bytes).toEqual(50);
|
||||||
|
|
||||||
|
// Try to store 60 more bytes - should fail (total would be 110)
|
||||||
|
let errorThrown = false;
|
||||||
|
try {
|
||||||
|
await quotaStorage.putObject('test/quota-2.txt', Buffer.from('x'.repeat(60)));
|
||||||
|
} catch (error) {
|
||||||
|
errorThrown = true;
|
||||||
|
expect((error as Error).message).toContain('Quota exceeded');
|
||||||
|
}
|
||||||
|
|
||||||
|
quotaStorage.clearContext();
|
||||||
|
|
||||||
|
expect(errorThrown).toBeTrue();
|
||||||
|
expect(quotaTracker.currentUsage.bytes).toEqual(50); // Still 50, not 110
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('quota: should update usage after delete', async () => {
|
||||||
|
const maxSize = 200;
|
||||||
|
const quotaTracker = createQuotaHooks(maxSize);
|
||||||
|
|
||||||
|
const quotaDelStorage = new RegistryStorage(storageConfig, quotaTracker.hooks);
|
||||||
|
await quotaDelStorage.init();
|
||||||
|
|
||||||
|
quotaDelStorage.setContext({
|
||||||
|
protocol: 'npm',
|
||||||
|
metadata: { size: 75 },
|
||||||
|
});
|
||||||
|
|
||||||
|
// Store and track
|
||||||
|
await quotaDelStorage.putObject('test/quota-del.txt', Buffer.from('x'.repeat(75)));
|
||||||
|
expect(quotaTracker.currentUsage.bytes).toEqual(75);
|
||||||
|
|
||||||
|
// Delete and verify usage decreases
|
||||||
|
await quotaDelStorage.deleteObject('test/quota-del.txt');
|
||||||
|
|
||||||
|
await new Promise(resolve => setTimeout(resolve, 100));
|
||||||
|
|
||||||
|
quotaDelStorage.clearContext();
|
||||||
|
|
||||||
|
// Usage should be reduced (though exact value depends on metadata)
|
||||||
|
expect(quotaTracker.currentUsage.bytes).toBeLessThanOrEqual(75);
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// setHooks Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('setHooks: should allow setting hooks after construction', async () => {
|
||||||
|
const lateStorage = new RegistryStorage(storageConfig);
|
||||||
|
await lateStorage.init();
|
||||||
|
|
||||||
|
// Initially no hooks
|
||||||
|
await lateStorage.putObject('test/no-hooks.txt', Buffer.from('no hooks yet'));
|
||||||
|
|
||||||
|
// Add hooks later
|
||||||
|
const tracker = createTrackingHooks();
|
||||||
|
lateStorage.setHooks(tracker.hooks);
|
||||||
|
|
||||||
|
lateStorage.setContext({ protocol: 'npm' });
|
||||||
|
await lateStorage.putObject('test/with-late-hooks.txt', Buffer.from('now with hooks'));
|
||||||
|
lateStorage.clearContext();
|
||||||
|
|
||||||
|
const beforePutCalls = tracker.calls.filter(c => c.method === 'beforePut');
|
||||||
|
expect(beforePutCalls.length).toEqual(1);
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Cleanup
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cleanup: should clean up test bucket', async () => {
|
||||||
|
if (storage) {
|
||||||
|
// Clean up test objects
|
||||||
|
const prefixes = ['test/'];
|
||||||
|
for (const prefix of prefixes) {
|
||||||
|
try {
|
||||||
|
const objects = await storage.listObjects(prefix);
|
||||||
|
for (const obj of objects) {
|
||||||
|
await storage.deleteObject(obj);
|
||||||
|
}
|
||||||
|
} catch {
|
||||||
|
// Ignore cleanup errors
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
export default tap.start();
|
||||||
598
test/test.upstream.cache.s3.ts
Normal file
598
test/test.upstream.cache.s3.ts
Normal file
@@ -0,0 +1,598 @@
|
|||||||
|
import { expect, tap } from '@git.zone/tstest/tapbundle';
|
||||||
|
import * as qenv from '@push.rocks/qenv';
|
||||||
|
import * as smartbucket from '@push.rocks/smartbucket';
|
||||||
|
import { UpstreamCache } from '../ts/upstream/classes.upstreamcache.js';
|
||||||
|
import type { IUpstreamFetchContext, IUpstreamCacheConfig } from '../ts/upstream/interfaces.upstream.js';
|
||||||
|
import type { IStorageBackend } from '../ts/core/interfaces.core.js';
|
||||||
|
import { generateTestRunId } from './helpers/registry.js';
|
||||||
|
|
||||||
|
const testQenv = new qenv.Qenv('./', './.nogit');
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Test State
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
let cache: UpstreamCache;
|
||||||
|
let storageBackend: IStorageBackend;
|
||||||
|
let s3Bucket: smartbucket.Bucket;
|
||||||
|
let smartBucket: smartbucket.SmartBucket;
|
||||||
|
let testRunId: string;
|
||||||
|
let bucketName: string;
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Helper Functions
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
function createFetchContext(overrides?: Partial<IUpstreamFetchContext>): IUpstreamFetchContext {
|
||||||
|
// Use resource name as path to ensure unique cache keys
|
||||||
|
const resource = overrides?.resource || 'lodash';
|
||||||
|
return {
|
||||||
|
protocol: 'npm',
|
||||||
|
resource,
|
||||||
|
resourceType: 'packument',
|
||||||
|
path: `/${resource}`,
|
||||||
|
method: 'GET',
|
||||||
|
headers: {},
|
||||||
|
query: {},
|
||||||
|
...overrides,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Setup
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('setup: should create S3 storage backend', async () => {
|
||||||
|
testRunId = generateTestRunId();
|
||||||
|
bucketName = `test-ucache-${testRunId.substring(0, 8)}`;
|
||||||
|
|
||||||
|
const s3AccessKey = await testQenv.getEnvVarOnDemand('S3_ACCESSKEY');
|
||||||
|
const s3SecretKey = await testQenv.getEnvVarOnDemand('S3_SECRETKEY');
|
||||||
|
const s3Endpoint = await testQenv.getEnvVarOnDemand('S3_ENDPOINT');
|
||||||
|
const s3Port = await testQenv.getEnvVarOnDemand('S3_PORT');
|
||||||
|
|
||||||
|
smartBucket = new smartbucket.SmartBucket({
|
||||||
|
accessKey: s3AccessKey || 'minioadmin',
|
||||||
|
accessSecret: s3SecretKey || 'minioadmin',
|
||||||
|
endpoint: s3Endpoint || 'localhost',
|
||||||
|
port: parseInt(s3Port || '9000', 10),
|
||||||
|
useSsl: false,
|
||||||
|
});
|
||||||
|
|
||||||
|
s3Bucket = await smartBucket.createBucket(bucketName);
|
||||||
|
|
||||||
|
// Create storage backend adapter
|
||||||
|
storageBackend = {
|
||||||
|
getObject: async (key: string): Promise<Buffer | null> => {
|
||||||
|
try {
|
||||||
|
// fastGet returns Buffer directly (or undefined if not found)
|
||||||
|
const data = await s3Bucket.fastGet({ path: key });
|
||||||
|
if (!data) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
return data;
|
||||||
|
} catch (error) {
|
||||||
|
// fastGet throws if object doesn't exist
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
},
|
||||||
|
putObject: async (key: string, data: Buffer): Promise<void> => {
|
||||||
|
await s3Bucket.fastPut({ path: key, contents: data, overwrite: true });
|
||||||
|
},
|
||||||
|
deleteObject: async (key: string): Promise<void> => {
|
||||||
|
await s3Bucket.fastRemove({ path: key });
|
||||||
|
},
|
||||||
|
listObjects: async (prefix: string): Promise<string[]> => {
|
||||||
|
const paths: string[] = [];
|
||||||
|
for await (const path of s3Bucket.listAllObjects(prefix)) {
|
||||||
|
paths.push(path);
|
||||||
|
}
|
||||||
|
return paths;
|
||||||
|
},
|
||||||
|
};
|
||||||
|
|
||||||
|
expect(storageBackend).toBeTruthy();
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('setup: should create UpstreamCache with S3 storage', async () => {
|
||||||
|
cache = new UpstreamCache(
|
||||||
|
{ enabled: true, defaultTtlSeconds: 300 },
|
||||||
|
10000,
|
||||||
|
storageBackend
|
||||||
|
);
|
||||||
|
|
||||||
|
expect(cache.isEnabled()).toBeTrue();
|
||||||
|
expect(cache.hasStorage()).toBeTrue();
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Basic Cache Operations
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cache: should store cache entry in S3', async () => {
|
||||||
|
const context = createFetchContext({ resource: 'store-test' });
|
||||||
|
const testData = Buffer.from(JSON.stringify({ name: 'store-test', version: '1.0.0' }));
|
||||||
|
const upstreamUrl = 'https://registry.npmjs.org';
|
||||||
|
|
||||||
|
await cache.set(
|
||||||
|
context,
|
||||||
|
testData,
|
||||||
|
'application/json',
|
||||||
|
{ 'etag': '"abc123"' },
|
||||||
|
'npmjs',
|
||||||
|
upstreamUrl
|
||||||
|
);
|
||||||
|
|
||||||
|
// Verify in S3
|
||||||
|
const stats = cache.getStats();
|
||||||
|
expect(stats.totalEntries).toBeGreaterThanOrEqual(1);
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('cache: should retrieve cache entry from S3', async () => {
|
||||||
|
const context = createFetchContext({ resource: 'retrieve-test' });
|
||||||
|
const testData = Buffer.from('retrieve test data');
|
||||||
|
const upstreamUrl = 'https://registry.npmjs.org';
|
||||||
|
|
||||||
|
await cache.set(
|
||||||
|
context,
|
||||||
|
testData,
|
||||||
|
'application/octet-stream',
|
||||||
|
{},
|
||||||
|
'npmjs',
|
||||||
|
upstreamUrl
|
||||||
|
);
|
||||||
|
|
||||||
|
const entry = await cache.get(context, upstreamUrl);
|
||||||
|
|
||||||
|
expect(entry).toBeTruthy();
|
||||||
|
expect(entry!.data.toString()).toEqual('retrieve test data');
|
||||||
|
expect(entry!.contentType).toEqual('application/octet-stream');
|
||||||
|
expect(entry!.upstreamId).toEqual('npmjs');
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Multi-Upstream URL Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cache: should include upstream URL in cache path', async () => {
|
||||||
|
const context = createFetchContext({ resource: 'url-path-test' });
|
||||||
|
const testData = Buffer.from('url path test');
|
||||||
|
|
||||||
|
await cache.set(
|
||||||
|
context,
|
||||||
|
testData,
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'npmjs',
|
||||||
|
'https://registry.npmjs.org'
|
||||||
|
);
|
||||||
|
|
||||||
|
// The cache key should include the escaped URL
|
||||||
|
const entry = await cache.get(context, 'https://registry.npmjs.org');
|
||||||
|
expect(entry).toBeTruthy();
|
||||||
|
expect(entry!.data.toString()).toEqual('url path test');
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('cache: should handle multiple upstreams with different URLs', async () => {
|
||||||
|
const context = createFetchContext({ resource: '@company/private-pkg' });
|
||||||
|
|
||||||
|
// Store from private upstream
|
||||||
|
const privateData = Buffer.from('private package data');
|
||||||
|
await cache.set(
|
||||||
|
context,
|
||||||
|
privateData,
|
||||||
|
'application/json',
|
||||||
|
{},
|
||||||
|
'private-npm',
|
||||||
|
'https://npm.company.com'
|
||||||
|
);
|
||||||
|
|
||||||
|
// Store from public upstream (same resource name, different upstream)
|
||||||
|
const publicData = Buffer.from('public package data');
|
||||||
|
await cache.set(
|
||||||
|
context,
|
||||||
|
publicData,
|
||||||
|
'application/json',
|
||||||
|
{},
|
||||||
|
'public-npm',
|
||||||
|
'https://registry.npmjs.org'
|
||||||
|
);
|
||||||
|
|
||||||
|
// Retrieve from private - should get private data
|
||||||
|
const privateEntry = await cache.get(context, 'https://npm.company.com');
|
||||||
|
expect(privateEntry).toBeTruthy();
|
||||||
|
expect(privateEntry!.data.toString()).toEqual('private package data');
|
||||||
|
expect(privateEntry!.upstreamId).toEqual('private-npm');
|
||||||
|
|
||||||
|
// Retrieve from public - should get public data
|
||||||
|
const publicEntry = await cache.get(context, 'https://registry.npmjs.org');
|
||||||
|
expect(publicEntry).toBeTruthy();
|
||||||
|
expect(publicEntry!.data.toString()).toEqual('public package data');
|
||||||
|
expect(publicEntry!.upstreamId).toEqual('public-npm');
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// TTL and Expiration Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cache: should respect TTL expiration', async () => {
|
||||||
|
// Create cache with very short TTL
|
||||||
|
const shortTtlCache = new UpstreamCache(
|
||||||
|
{
|
||||||
|
enabled: true,
|
||||||
|
defaultTtlSeconds: 1, // 1 second TTL
|
||||||
|
staleWhileRevalidate: false,
|
||||||
|
staleMaxAgeSeconds: 0,
|
||||||
|
immutableTtlSeconds: 1,
|
||||||
|
negativeCacheTtlSeconds: 1,
|
||||||
|
},
|
||||||
|
1000,
|
||||||
|
storageBackend
|
||||||
|
);
|
||||||
|
|
||||||
|
const context = createFetchContext({ resource: 'ttl-test' });
|
||||||
|
const testData = Buffer.from('expires soon');
|
||||||
|
|
||||||
|
await shortTtlCache.set(
|
||||||
|
context,
|
||||||
|
testData,
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'test-upstream',
|
||||||
|
'https://test.example.com'
|
||||||
|
);
|
||||||
|
|
||||||
|
// Should exist immediately
|
||||||
|
let entry = await shortTtlCache.get(context, 'https://test.example.com');
|
||||||
|
expect(entry).toBeTruthy();
|
||||||
|
|
||||||
|
// Wait for expiration
|
||||||
|
await new Promise(resolve => setTimeout(resolve, 1500));
|
||||||
|
|
||||||
|
// Should be expired now
|
||||||
|
entry = await shortTtlCache.get(context, 'https://test.example.com');
|
||||||
|
expect(entry).toBeNull();
|
||||||
|
|
||||||
|
shortTtlCache.stop();
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('cache: should serve stale content during stale-while-revalidate window', async () => {
|
||||||
|
const staleCache = new UpstreamCache(
|
||||||
|
{
|
||||||
|
enabled: true,
|
||||||
|
defaultTtlSeconds: 1, // 1 second fresh
|
||||||
|
staleWhileRevalidate: true,
|
||||||
|
staleMaxAgeSeconds: 60, // 60 seconds stale window
|
||||||
|
immutableTtlSeconds: 1,
|
||||||
|
negativeCacheTtlSeconds: 1,
|
||||||
|
},
|
||||||
|
1000,
|
||||||
|
storageBackend
|
||||||
|
);
|
||||||
|
|
||||||
|
const context = createFetchContext({ resource: 'stale-test' });
|
||||||
|
const testData = Buffer.from('stale but usable');
|
||||||
|
|
||||||
|
await staleCache.set(
|
||||||
|
context,
|
||||||
|
testData,
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'stale-upstream',
|
||||||
|
'https://stale.example.com'
|
||||||
|
);
|
||||||
|
|
||||||
|
// Wait for fresh period to expire
|
||||||
|
await new Promise(resolve => setTimeout(resolve, 1500));
|
||||||
|
|
||||||
|
// Should still be available but marked as stale
|
||||||
|
const entry = await staleCache.get(context, 'https://stale.example.com');
|
||||||
|
expect(entry).toBeTruthy();
|
||||||
|
expect(entry!.stale).toBeTrue();
|
||||||
|
expect(entry!.data.toString()).toEqual('stale but usable');
|
||||||
|
|
||||||
|
staleCache.stop();
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('cache: should reject content past stale deadline', async () => {
|
||||||
|
const veryShortCache = new UpstreamCache(
|
||||||
|
{
|
||||||
|
enabled: true,
|
||||||
|
defaultTtlSeconds: 1,
|
||||||
|
staleWhileRevalidate: true,
|
||||||
|
staleMaxAgeSeconds: 1, // Only 1 second stale window
|
||||||
|
immutableTtlSeconds: 1,
|
||||||
|
negativeCacheTtlSeconds: 1,
|
||||||
|
},
|
||||||
|
1000,
|
||||||
|
storageBackend
|
||||||
|
);
|
||||||
|
|
||||||
|
const context = createFetchContext({ resource: 'very-stale-test' });
|
||||||
|
await veryShortCache.set(
|
||||||
|
context,
|
||||||
|
Buffer.from('will expire completely'),
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'short-upstream',
|
||||||
|
'https://short.example.com'
|
||||||
|
);
|
||||||
|
|
||||||
|
// Wait for both fresh AND stale periods to expire
|
||||||
|
await new Promise(resolve => setTimeout(resolve, 2500));
|
||||||
|
|
||||||
|
const entry = await veryShortCache.get(context, 'https://short.example.com');
|
||||||
|
expect(entry).toBeNull();
|
||||||
|
|
||||||
|
veryShortCache.stop();
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Negative Cache Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cache: should store negative cache entries (404)', async () => {
|
||||||
|
const context = createFetchContext({ resource: 'not-found-pkg' });
|
||||||
|
const upstreamUrl = 'https://registry.npmjs.org';
|
||||||
|
|
||||||
|
await cache.setNegative(context, 'npmjs', upstreamUrl);
|
||||||
|
|
||||||
|
const hasNegative = await cache.hasNegative(context, upstreamUrl);
|
||||||
|
expect(hasNegative).toBeTrue();
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('cache: should retrieve negative cache entries', async () => {
|
||||||
|
const context = createFetchContext({ resource: 'negative-retrieve-test' });
|
||||||
|
const upstreamUrl = 'https://registry.npmjs.org';
|
||||||
|
|
||||||
|
await cache.setNegative(context, 'npmjs', upstreamUrl);
|
||||||
|
|
||||||
|
const entry = await cache.get(context, upstreamUrl);
|
||||||
|
expect(entry).toBeTruthy();
|
||||||
|
expect(entry!.data.length).toEqual(0); // Empty buffer indicates 404
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Eviction Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cache: should evict oldest entries when memory limit reached', async () => {
|
||||||
|
// Create cache with very small limit
|
||||||
|
const smallCache = new UpstreamCache(
|
||||||
|
{ enabled: true, defaultTtlSeconds: 300 },
|
||||||
|
5, // Only 5 entries
|
||||||
|
storageBackend
|
||||||
|
);
|
||||||
|
|
||||||
|
// Add 10 entries
|
||||||
|
for (let i = 0; i < 10; i++) {
|
||||||
|
const context = createFetchContext({ resource: `evict-test-${i}` });
|
||||||
|
await smallCache.set(
|
||||||
|
context,
|
||||||
|
Buffer.from(`data ${i}`),
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'evict-upstream',
|
||||||
|
'https://evict.example.com'
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
const stats = smallCache.getStats();
|
||||||
|
// Should have evicted some entries
|
||||||
|
expect(stats.totalEntries).toBeLessThanOrEqual(5);
|
||||||
|
|
||||||
|
smallCache.stop();
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Query Parameter Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cache: cache key should include query parameters', async () => {
|
||||||
|
const context1 = createFetchContext({
|
||||||
|
resource: 'query-test',
|
||||||
|
query: { version: '1.0.0' },
|
||||||
|
});
|
||||||
|
|
||||||
|
const context2 = createFetchContext({
|
||||||
|
resource: 'query-test',
|
||||||
|
query: { version: '2.0.0' },
|
||||||
|
});
|
||||||
|
|
||||||
|
const upstreamUrl = 'https://registry.npmjs.org';
|
||||||
|
|
||||||
|
// Store with v1 query
|
||||||
|
await cache.set(
|
||||||
|
context1,
|
||||||
|
Buffer.from('version 1 data'),
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'query-upstream',
|
||||||
|
upstreamUrl
|
||||||
|
);
|
||||||
|
|
||||||
|
// Store with v2 query
|
||||||
|
await cache.set(
|
||||||
|
context2,
|
||||||
|
Buffer.from('version 2 data'),
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'query-upstream',
|
||||||
|
upstreamUrl
|
||||||
|
);
|
||||||
|
|
||||||
|
// Retrieve v1 - should get v1 data
|
||||||
|
const entry1 = await cache.get(context1, upstreamUrl);
|
||||||
|
expect(entry1).toBeTruthy();
|
||||||
|
expect(entry1!.data.toString()).toEqual('version 1 data');
|
||||||
|
|
||||||
|
// Retrieve v2 - should get v2 data
|
||||||
|
const entry2 = await cache.get(context2, upstreamUrl);
|
||||||
|
expect(entry2).toBeTruthy();
|
||||||
|
expect(entry2!.data.toString()).toEqual('version 2 data');
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// S3 Persistence Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cache: should load from S3 on memory cache miss', async () => {
|
||||||
|
// Use a unique resource name for this test
|
||||||
|
const uniqueResource = `persist-test-${Date.now()}`;
|
||||||
|
const persistContext = createFetchContext({ resource: uniqueResource });
|
||||||
|
const upstreamUrl = 'https://persist.example.com';
|
||||||
|
|
||||||
|
// Store in first cache instance
|
||||||
|
await cache.set(
|
||||||
|
persistContext,
|
||||||
|
Buffer.from('persisted data'),
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'persist-upstream',
|
||||||
|
upstreamUrl
|
||||||
|
);
|
||||||
|
|
||||||
|
// Wait for S3 write to complete
|
||||||
|
await new Promise(resolve => setTimeout(resolve, 200));
|
||||||
|
|
||||||
|
// Verify the entry is in the original cache's memory
|
||||||
|
const originalEntry = await cache.get(persistContext, upstreamUrl);
|
||||||
|
expect(originalEntry).toBeTruthy();
|
||||||
|
|
||||||
|
// Create a new cache instance (simulates restart) with SAME storage backend
|
||||||
|
const freshCache = new UpstreamCache(
|
||||||
|
{ enabled: true, defaultTtlSeconds: 300 },
|
||||||
|
10000,
|
||||||
|
storageBackend
|
||||||
|
);
|
||||||
|
|
||||||
|
// Fresh cache has empty memory, should load from S3
|
||||||
|
const entry = await freshCache.get(persistContext, upstreamUrl);
|
||||||
|
|
||||||
|
expect(entry).toBeTruthy();
|
||||||
|
expect(entry!.data.toString()).toEqual('persisted data');
|
||||||
|
|
||||||
|
freshCache.stop();
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Cache Stats Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cache: should return accurate stats', async () => {
|
||||||
|
const statsCache = new UpstreamCache(
|
||||||
|
{ enabled: true, defaultTtlSeconds: 300 },
|
||||||
|
1000,
|
||||||
|
storageBackend
|
||||||
|
);
|
||||||
|
|
||||||
|
// Add some entries
|
||||||
|
for (let i = 0; i < 3; i++) {
|
||||||
|
const context = createFetchContext({ resource: `stats-test-${i}` });
|
||||||
|
await statsCache.set(
|
||||||
|
context,
|
||||||
|
Buffer.from(`stats data ${i}`),
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'stats-upstream',
|
||||||
|
'https://stats.example.com'
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Add a negative entry
|
||||||
|
const negContext = createFetchContext({ resource: 'stats-negative' });
|
||||||
|
await statsCache.setNegative(negContext, 'stats-upstream', 'https://stats.example.com');
|
||||||
|
|
||||||
|
const stats = statsCache.getStats();
|
||||||
|
|
||||||
|
expect(stats.totalEntries).toBeGreaterThanOrEqual(4);
|
||||||
|
expect(stats.enabled).toBeTrue();
|
||||||
|
expect(stats.hasStorage).toBeTrue();
|
||||||
|
expect(stats.maxEntries).toEqual(1000);
|
||||||
|
|
||||||
|
statsCache.stop();
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Invalidation Tests
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cache: should invalidate specific cache entry', async () => {
|
||||||
|
const invalidateContext = createFetchContext({ resource: 'invalidate-test' });
|
||||||
|
const upstreamUrl = 'https://invalidate.example.com';
|
||||||
|
|
||||||
|
await cache.set(
|
||||||
|
invalidateContext,
|
||||||
|
Buffer.from('to be invalidated'),
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'inv-upstream',
|
||||||
|
upstreamUrl
|
||||||
|
);
|
||||||
|
|
||||||
|
// Verify it exists
|
||||||
|
let entry = await cache.get(invalidateContext, upstreamUrl);
|
||||||
|
expect(entry).toBeTruthy();
|
||||||
|
|
||||||
|
// Invalidate
|
||||||
|
const deleted = await cache.invalidate(invalidateContext, upstreamUrl);
|
||||||
|
expect(deleted).toBeTrue();
|
||||||
|
|
||||||
|
// Should be gone
|
||||||
|
entry = await cache.get(invalidateContext, upstreamUrl);
|
||||||
|
expect(entry).toBeNull();
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('cache: should invalidate entries matching pattern', async () => {
|
||||||
|
const upstreamUrl = 'https://pattern.example.com';
|
||||||
|
|
||||||
|
// Add multiple entries
|
||||||
|
for (const name of ['pattern-a', 'pattern-b', 'other-c']) {
|
||||||
|
const context = createFetchContext({ resource: name });
|
||||||
|
await cache.set(
|
||||||
|
context,
|
||||||
|
Buffer.from(`data for ${name}`),
|
||||||
|
'text/plain',
|
||||||
|
{},
|
||||||
|
'pattern-upstream',
|
||||||
|
upstreamUrl
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Invalidate entries matching 'pattern-*'
|
||||||
|
const count = await cache.invalidatePattern(/pattern-/);
|
||||||
|
expect(count).toBeGreaterThanOrEqual(2);
|
||||||
|
|
||||||
|
// pattern-a should be gone
|
||||||
|
const entryA = await cache.get(createFetchContext({ resource: 'pattern-a' }), upstreamUrl);
|
||||||
|
expect(entryA).toBeNull();
|
||||||
|
|
||||||
|
// other-c should still exist
|
||||||
|
const entryC = await cache.get(createFetchContext({ resource: 'other-c' }), upstreamUrl);
|
||||||
|
expect(entryC).toBeTruthy();
|
||||||
|
});
|
||||||
|
|
||||||
|
// ============================================================================
|
||||||
|
// Cleanup
|
||||||
|
// ============================================================================
|
||||||
|
|
||||||
|
tap.test('cleanup: should stop cache and clean up bucket', async () => {
|
||||||
|
if (cache) {
|
||||||
|
cache.stop();
|
||||||
|
}
|
||||||
|
|
||||||
|
// Clean up test bucket
|
||||||
|
if (s3Bucket) {
|
||||||
|
try {
|
||||||
|
const files = await s3Bucket.fastList({});
|
||||||
|
for (const file of files) {
|
||||||
|
await s3Bucket.fastRemove({ path: file.name });
|
||||||
|
}
|
||||||
|
await smartBucket.removeBucket(bucketName);
|
||||||
|
} catch {
|
||||||
|
// Ignore cleanup errors
|
||||||
|
}
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
export default tap.start();
|
||||||
@@ -3,6 +3,6 @@
|
|||||||
*/
|
*/
|
||||||
export const commitinfo = {
|
export const commitinfo = {
|
||||||
name: '@push.rocks/smartregistry',
|
name: '@push.rocks/smartregistry',
|
||||||
version: '2.3.0',
|
version: '2.6.0',
|
||||||
description: 'A composable TypeScript library implementing OCI, NPM, Maven, Cargo, Composer, PyPI, and RubyGems registries for building unified container and package registries'
|
description: 'A composable TypeScript library implementing OCI, NPM, Maven, Cargo, Composer, PyPI, and RubyGems registries for building unified container and package registries'
|
||||||
}
|
}
|
||||||
|
|||||||
@@ -11,8 +11,39 @@ import { PypiRegistry } from './pypi/classes.pypiregistry.js';
|
|||||||
import { RubyGemsRegistry } from './rubygems/classes.rubygemsregistry.js';
|
import { RubyGemsRegistry } from './rubygems/classes.rubygemsregistry.js';
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Main registry orchestrator
|
* Main registry orchestrator.
|
||||||
* Routes requests to appropriate protocol handlers (OCI, NPM, Maven, Cargo, Composer, PyPI, or RubyGems)
|
* Routes requests to appropriate protocol handlers (OCI, NPM, Maven, Cargo, Composer, PyPI, or RubyGems).
|
||||||
|
*
|
||||||
|
* Supports pluggable authentication and storage hooks:
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* // Basic usage with default in-memory auth
|
||||||
|
* const registry = new SmartRegistry(config);
|
||||||
|
*
|
||||||
|
* // With custom auth provider (LDAP, OAuth, etc.)
|
||||||
|
* const registry = new SmartRegistry({
|
||||||
|
* ...config,
|
||||||
|
* authProvider: new LdapAuthProvider(ldapClient),
|
||||||
|
* });
|
||||||
|
*
|
||||||
|
* // With storage hooks for quota tracking
|
||||||
|
* const registry = new SmartRegistry({
|
||||||
|
* ...config,
|
||||||
|
* storageHooks: {
|
||||||
|
* beforePut: async (ctx) => {
|
||||||
|
* const quota = await getQuota(ctx.actor?.orgId);
|
||||||
|
* if (ctx.metadata?.size > quota) {
|
||||||
|
* return { allowed: false, reason: 'Quota exceeded' };
|
||||||
|
* }
|
||||||
|
* return { allowed: true };
|
||||||
|
* },
|
||||||
|
* afterPut: async (ctx) => {
|
||||||
|
* await auditLog('storage.put', ctx);
|
||||||
|
* }
|
||||||
|
* }
|
||||||
|
* });
|
||||||
|
* ```
|
||||||
*/
|
*/
|
||||||
export class SmartRegistry {
|
export class SmartRegistry {
|
||||||
private storage: RegistryStorage;
|
private storage: RegistryStorage;
|
||||||
@@ -23,8 +54,12 @@ export class SmartRegistry {
|
|||||||
|
|
||||||
constructor(config: IRegistryConfig) {
|
constructor(config: IRegistryConfig) {
|
||||||
this.config = config;
|
this.config = config;
|
||||||
this.storage = new RegistryStorage(config.storage);
|
|
||||||
this.authManager = new AuthManager(config.auth);
|
// Create storage with optional hooks
|
||||||
|
this.storage = new RegistryStorage(config.storage, config.storageHooks);
|
||||||
|
|
||||||
|
// Create auth manager with optional custom provider
|
||||||
|
this.authManager = new AuthManager(config.auth, config.authProvider);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
|
|||||||
@@ -1,109 +1,79 @@
|
|||||||
import type { IAuthConfig, IAuthToken, ICredentials, TRegistryProtocol } from './interfaces.core.js';
|
import type { IAuthConfig, IAuthToken, ICredentials, TRegistryProtocol } from './interfaces.core.js';
|
||||||
import * as crypto from 'crypto';
|
import type { IAuthProvider, ITokenOptions } from './interfaces.auth.js';
|
||||||
|
import { DefaultAuthProvider } from './classes.defaultauthprovider.js';
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Unified authentication manager for all registry protocols
|
* Unified authentication manager for all registry protocols.
|
||||||
* Handles both NPM UUID tokens and OCI JWT tokens
|
* Delegates to a pluggable IAuthProvider for actual auth operations.
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* // Use default in-memory provider
|
||||||
|
* const auth = new AuthManager(config);
|
||||||
|
*
|
||||||
|
* // Use custom provider (LDAP, OAuth, etc.)
|
||||||
|
* const auth = new AuthManager(config, new LdapAuthProvider(ldapClient));
|
||||||
|
* ```
|
||||||
*/
|
*/
|
||||||
export class AuthManager {
|
export class AuthManager {
|
||||||
private tokenStore: Map<string, IAuthToken> = new Map();
|
private provider: IAuthProvider;
|
||||||
private userCredentials: Map<string, string> = new Map(); // username -> password hash (mock)
|
|
||||||
|
|
||||||
constructor(private config: IAuthConfig) {}
|
constructor(
|
||||||
|
private config: IAuthConfig,
|
||||||
|
provider?: IAuthProvider
|
||||||
|
) {
|
||||||
|
// Use provided provider or default in-memory implementation
|
||||||
|
this.provider = provider || new DefaultAuthProvider(config);
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Initialize the auth manager
|
* Initialize the auth manager
|
||||||
*/
|
*/
|
||||||
public async init(): Promise<void> {
|
public async init(): Promise<void> {
|
||||||
// Initialize token store (in-memory for now)
|
if (this.provider.init) {
|
||||||
// In production, this could be Redis or a database
|
await this.provider.init();
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// ========================================================================
|
// ========================================================================
|
||||||
// UUID TOKEN CREATION (Base method for NPM, Maven, etc.)
|
// UNIFIED AUTHENTICATION (Delegated to Provider)
|
||||||
// ========================================================================
|
// ========================================================================
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Create a UUID-based token with custom scopes (base method)
|
* Authenticate user credentials
|
||||||
* @param userId - User ID
|
* @param credentials - Username and password
|
||||||
* @param protocol - Protocol type
|
* @returns User ID or null
|
||||||
* @param scopes - Permission scopes
|
|
||||||
* @param readonly - Whether the token is readonly
|
|
||||||
* @returns UUID token string
|
|
||||||
*/
|
*/
|
||||||
private async createUuidToken(
|
public async authenticate(credentials: ICredentials): Promise<string | null> {
|
||||||
userId: string,
|
return this.provider.authenticate(credentials);
|
||||||
protocol: TRegistryProtocol,
|
|
||||||
scopes: string[],
|
|
||||||
readonly: boolean = false
|
|
||||||
): Promise<string> {
|
|
||||||
const token = this.generateUuid();
|
|
||||||
const authToken: IAuthToken = {
|
|
||||||
type: protocol,
|
|
||||||
userId,
|
|
||||||
scopes,
|
|
||||||
readonly,
|
|
||||||
metadata: {
|
|
||||||
created: new Date().toISOString(),
|
|
||||||
},
|
|
||||||
};
|
|
||||||
|
|
||||||
this.tokenStore.set(token, authToken);
|
|
||||||
return token;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Generic protocol token creation (internal helper)
|
* Validate any token (NPM, Maven, OCI, PyPI, RubyGems, Composer, Cargo)
|
||||||
* @param userId - User ID
|
* @param tokenString - Token string (UUID or JWT)
|
||||||
* @param protocol - Protocol type (npm, maven, composer, etc.)
|
* @param protocol - Expected protocol type (optional, improves performance)
|
||||||
* @param readonly - Whether the token is readonly
|
|
||||||
* @returns UUID token string
|
|
||||||
*/
|
|
||||||
private async createProtocolToken(
|
|
||||||
userId: string,
|
|
||||||
protocol: TRegistryProtocol,
|
|
||||||
readonly: boolean
|
|
||||||
): Promise<string> {
|
|
||||||
const scopes = readonly
|
|
||||||
? [`${protocol}:*:*:read`]
|
|
||||||
: [`${protocol}:*:*:*`];
|
|
||||||
return this.createUuidToken(userId, protocol, scopes, readonly);
|
|
||||||
}
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Generic protocol token validation (internal helper)
|
|
||||||
* @param token - UUID token string
|
|
||||||
* @param protocol - Expected protocol type
|
|
||||||
* @returns Auth token object or null
|
* @returns Auth token object or null
|
||||||
*/
|
*/
|
||||||
private async validateProtocolToken(
|
public async validateToken(
|
||||||
token: string,
|
tokenString: string,
|
||||||
protocol: TRegistryProtocol
|
protocol?: TRegistryProtocol
|
||||||
): Promise<IAuthToken | null> {
|
): Promise<IAuthToken | null> {
|
||||||
if (!this.isValidUuid(token)) {
|
return this.provider.validateToken(tokenString, protocol);
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
const authToken = this.tokenStore.get(token);
|
|
||||||
if (!authToken || authToken.type !== protocol) {
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Check expiration if set
|
|
||||||
if (authToken.expiresAt && authToken.expiresAt < new Date()) {
|
|
||||||
this.tokenStore.delete(token);
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
return authToken;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Generic protocol token revocation (internal helper)
|
* Check if token has permission for an action
|
||||||
* @param token - UUID token string
|
* @param token - Auth token (or null for anonymous)
|
||||||
|
* @param resource - Resource being accessed (e.g., "npm:package:foo")
|
||||||
|
* @param action - Action being performed (read, write, push, pull, delete)
|
||||||
|
* @returns true if authorized
|
||||||
*/
|
*/
|
||||||
private async revokeProtocolToken(token: string): Promise<void> {
|
public async authorize(
|
||||||
this.tokenStore.delete(token);
|
token: IAuthToken | null,
|
||||||
|
resource: string,
|
||||||
|
action: string
|
||||||
|
): Promise<boolean> {
|
||||||
|
return this.provider.authorize(token, resource, action);
|
||||||
}
|
}
|
||||||
|
|
||||||
// ========================================================================
|
// ========================================================================
|
||||||
@@ -120,7 +90,7 @@ export class AuthManager {
|
|||||||
if (!this.config.npmTokens.enabled) {
|
if (!this.config.npmTokens.enabled) {
|
||||||
throw new Error('NPM tokens are not enabled');
|
throw new Error('NPM tokens are not enabled');
|
||||||
}
|
}
|
||||||
return this.createProtocolToken(userId, 'npm', readonly);
|
return this.provider.createToken(userId, 'npm', { readonly });
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -129,7 +99,7 @@ export class AuthManager {
|
|||||||
* @returns Auth token object or null
|
* @returns Auth token object or null
|
||||||
*/
|
*/
|
||||||
public async validateNpmToken(token: string): Promise<IAuthToken | null> {
|
public async validateNpmToken(token: string): Promise<IAuthToken | null> {
|
||||||
return this.validateProtocolToken(token, 'npm');
|
return this.provider.validateToken(token, 'npm');
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -137,7 +107,7 @@ export class AuthManager {
|
|||||||
* @param token - NPM UUID token
|
* @param token - NPM UUID token
|
||||||
*/
|
*/
|
||||||
public async revokeNpmToken(token: string): Promise<void> {
|
public async revokeNpmToken(token: string): Promise<void> {
|
||||||
return this.revokeProtocolToken(token);
|
return this.provider.revokeToken(token);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -149,20 +119,12 @@ export class AuthManager {
|
|||||||
key: string;
|
key: string;
|
||||||
readonly: boolean;
|
readonly: boolean;
|
||||||
created: string;
|
created: string;
|
||||||
|
protocol?: TRegistryProtocol;
|
||||||
}>> {
|
}>> {
|
||||||
const tokens: Array<{key: string; readonly: boolean; created: string}> = [];
|
if (this.provider.listUserTokens) {
|
||||||
|
return this.provider.listUserTokens(userId);
|
||||||
for (const [token, authToken] of this.tokenStore.entries()) {
|
|
||||||
if (authToken.userId === userId) {
|
|
||||||
tokens.push({
|
|
||||||
key: this.hashToken(token),
|
|
||||||
readonly: authToken.readonly || false,
|
|
||||||
created: authToken.metadata?.created || 'unknown',
|
|
||||||
});
|
|
||||||
}
|
|
||||||
}
|
}
|
||||||
|
return [];
|
||||||
return tokens;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// ========================================================================
|
// ========================================================================
|
||||||
@@ -174,39 +136,17 @@ export class AuthManager {
|
|||||||
* @param userId - User ID
|
* @param userId - User ID
|
||||||
* @param scopes - Permission scopes
|
* @param scopes - Permission scopes
|
||||||
* @param expiresIn - Expiration time in seconds
|
* @param expiresIn - Expiration time in seconds
|
||||||
* @returns JWT token string (HMAC-SHA256 signed)
|
* @returns JWT token string
|
||||||
*/
|
*/
|
||||||
public async createOciToken(
|
public async createOciToken(
|
||||||
userId: string,
|
userId: string,
|
||||||
scopes: string[],
|
scopes: string[],
|
||||||
expiresIn: number = 3600
|
expiresIn: number = 3600
|
||||||
): Promise<string> {
|
): Promise<string> {
|
||||||
if (!this.config.ociTokens.enabled) {
|
if (!this.config.ociTokens?.enabled) {
|
||||||
throw new Error('OCI tokens are not enabled');
|
throw new Error('OCI tokens are not enabled');
|
||||||
}
|
}
|
||||||
|
return this.provider.createToken(userId, 'oci', { scopes, expiresIn });
|
||||||
const now = Math.floor(Date.now() / 1000);
|
|
||||||
const payload = {
|
|
||||||
iss: this.config.ociTokens.realm,
|
|
||||||
sub: userId,
|
|
||||||
aud: this.config.ociTokens.service,
|
|
||||||
exp: now + expiresIn,
|
|
||||||
nbf: now,
|
|
||||||
iat: now,
|
|
||||||
access: this.scopesToOciAccess(scopes),
|
|
||||||
};
|
|
||||||
|
|
||||||
// Create JWT with HMAC-SHA256 signature
|
|
||||||
const header = { alg: 'HS256', typ: 'JWT' };
|
|
||||||
const headerB64 = Buffer.from(JSON.stringify(header)).toString('base64url');
|
|
||||||
const payloadB64 = Buffer.from(JSON.stringify(payload)).toString('base64url');
|
|
||||||
|
|
||||||
const signature = crypto
|
|
||||||
.createHmac('sha256', this.config.jwtSecret)
|
|
||||||
.update(`${headerB64}.${payloadB64}`)
|
|
||||||
.digest('base64url');
|
|
||||||
|
|
||||||
return `${headerB64}.${payloadB64}.${signature}`;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -215,80 +155,7 @@ export class AuthManager {
|
|||||||
* @returns Auth token object or null
|
* @returns Auth token object or null
|
||||||
*/
|
*/
|
||||||
public async validateOciToken(jwt: string): Promise<IAuthToken | null> {
|
public async validateOciToken(jwt: string): Promise<IAuthToken | null> {
|
||||||
try {
|
return this.provider.validateToken(jwt, 'oci');
|
||||||
const parts = jwt.split('.');
|
|
||||||
if (parts.length !== 3) {
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
const [headerB64, payloadB64, signatureB64] = parts;
|
|
||||||
|
|
||||||
// Verify signature
|
|
||||||
const expectedSignature = crypto
|
|
||||||
.createHmac('sha256', this.config.jwtSecret)
|
|
||||||
.update(`${headerB64}.${payloadB64}`)
|
|
||||||
.digest('base64url');
|
|
||||||
|
|
||||||
if (signatureB64 !== expectedSignature) {
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Decode and parse payload
|
|
||||||
const payload = JSON.parse(Buffer.from(payloadB64, 'base64url').toString('utf-8'));
|
|
||||||
|
|
||||||
// Check expiration
|
|
||||||
const now = Math.floor(Date.now() / 1000);
|
|
||||||
if (payload.exp && payload.exp < now) {
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Check not-before time
|
|
||||||
if (payload.nbf && payload.nbf > now) {
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
// Convert to unified token format
|
|
||||||
const scopes = this.ociAccessToScopes(payload.access || []);
|
|
||||||
|
|
||||||
return {
|
|
||||||
type: 'oci',
|
|
||||||
userId: payload.sub,
|
|
||||||
scopes,
|
|
||||||
expiresAt: payload.exp ? new Date(payload.exp * 1000) : undefined,
|
|
||||||
metadata: {
|
|
||||||
iss: payload.iss,
|
|
||||||
aud: payload.aud,
|
|
||||||
},
|
|
||||||
};
|
|
||||||
} catch (error) {
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
// ========================================================================
|
|
||||||
// UNIFIED AUTHENTICATION
|
|
||||||
// ========================================================================
|
|
||||||
|
|
||||||
/**
|
|
||||||
* Authenticate user credentials
|
|
||||||
* @param credentials - Username and password
|
|
||||||
* @returns User ID or null
|
|
||||||
*/
|
|
||||||
public async authenticate(credentials: ICredentials): Promise<string | null> {
|
|
||||||
// Mock authentication - in production, verify against database
|
|
||||||
const storedPassword = this.userCredentials.get(credentials.username);
|
|
||||||
|
|
||||||
if (!storedPassword) {
|
|
||||||
// Auto-register for testing (remove in production)
|
|
||||||
this.userCredentials.set(credentials.username, credentials.password);
|
|
||||||
return credentials.username;
|
|
||||||
}
|
|
||||||
|
|
||||||
if (storedPassword === credentials.password) {
|
|
||||||
return credentials.username;
|
|
||||||
}
|
|
||||||
|
|
||||||
return null;
|
|
||||||
}
|
}
|
||||||
|
|
||||||
// ========================================================================
|
// ========================================================================
|
||||||
@@ -302,7 +169,7 @@ export class AuthManager {
|
|||||||
* @returns Maven UUID token
|
* @returns Maven UUID token
|
||||||
*/
|
*/
|
||||||
public async createMavenToken(userId: string, readonly: boolean = false): Promise<string> {
|
public async createMavenToken(userId: string, readonly: boolean = false): Promise<string> {
|
||||||
return this.createProtocolToken(userId, 'maven', readonly);
|
return this.provider.createToken(userId, 'maven', { readonly });
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -311,7 +178,7 @@ export class AuthManager {
|
|||||||
* @returns Auth token object or null
|
* @returns Auth token object or null
|
||||||
*/
|
*/
|
||||||
public async validateMavenToken(token: string): Promise<IAuthToken | null> {
|
public async validateMavenToken(token: string): Promise<IAuthToken | null> {
|
||||||
return this.validateProtocolToken(token, 'maven');
|
return this.provider.validateToken(token, 'maven');
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -319,7 +186,7 @@ export class AuthManager {
|
|||||||
* @param token - Maven UUID token
|
* @param token - Maven UUID token
|
||||||
*/
|
*/
|
||||||
public async revokeMavenToken(token: string): Promise<void> {
|
public async revokeMavenToken(token: string): Promise<void> {
|
||||||
return this.revokeProtocolToken(token);
|
return this.provider.revokeToken(token);
|
||||||
}
|
}
|
||||||
|
|
||||||
// ========================================================================
|
// ========================================================================
|
||||||
@@ -333,7 +200,7 @@ export class AuthManager {
|
|||||||
* @returns Composer UUID token
|
* @returns Composer UUID token
|
||||||
*/
|
*/
|
||||||
public async createComposerToken(userId: string, readonly: boolean = false): Promise<string> {
|
public async createComposerToken(userId: string, readonly: boolean = false): Promise<string> {
|
||||||
return this.createProtocolToken(userId, 'composer', readonly);
|
return this.provider.createToken(userId, 'composer', { readonly });
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -342,7 +209,7 @@ export class AuthManager {
|
|||||||
* @returns Auth token object or null
|
* @returns Auth token object or null
|
||||||
*/
|
*/
|
||||||
public async validateComposerToken(token: string): Promise<IAuthToken | null> {
|
public async validateComposerToken(token: string): Promise<IAuthToken | null> {
|
||||||
return this.validateProtocolToken(token, 'composer');
|
return this.provider.validateToken(token, 'composer');
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -350,7 +217,7 @@ export class AuthManager {
|
|||||||
* @param token - Composer UUID token
|
* @param token - Composer UUID token
|
||||||
*/
|
*/
|
||||||
public async revokeComposerToken(token: string): Promise<void> {
|
public async revokeComposerToken(token: string): Promise<void> {
|
||||||
return this.revokeProtocolToken(token);
|
return this.provider.revokeToken(token);
|
||||||
}
|
}
|
||||||
|
|
||||||
// ========================================================================
|
// ========================================================================
|
||||||
@@ -364,7 +231,7 @@ export class AuthManager {
|
|||||||
* @returns Cargo UUID token
|
* @returns Cargo UUID token
|
||||||
*/
|
*/
|
||||||
public async createCargoToken(userId: string, readonly: boolean = false): Promise<string> {
|
public async createCargoToken(userId: string, readonly: boolean = false): Promise<string> {
|
||||||
return this.createProtocolToken(userId, 'cargo', readonly);
|
return this.provider.createToken(userId, 'cargo', { readonly });
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -373,7 +240,7 @@ export class AuthManager {
    * @returns Auth token object or null
    */
   public async validateCargoToken(token: string): Promise<IAuthToken | null> {
-    return this.validateProtocolToken(token, 'cargo');
+    return this.provider.validateToken(token, 'cargo');
   }
 
   /**
@@ -381,7 +248,7 @@ export class AuthManager {
    * @param token - Cargo UUID token
    */
   public async revokeCargoToken(token: string): Promise<void> {
-    return this.revokeProtocolToken(token);
+    return this.provider.revokeToken(token);
   }
 
   // ========================================================================
@@ -395,7 +262,7 @@ export class AuthManager {
    * @returns PyPI UUID token
    */
   public async createPypiToken(userId: string, readonly: boolean = false): Promise<string> {
-    return this.createProtocolToken(userId, 'pypi', readonly);
+    return this.provider.createToken(userId, 'pypi', { readonly });
   }
 
   /**
@@ -404,7 +271,7 @@ export class AuthManager {
    * @returns Auth token object or null
    */
   public async validatePypiToken(token: string): Promise<IAuthToken | null> {
-    return this.validateProtocolToken(token, 'pypi');
+    return this.provider.validateToken(token, 'pypi');
   }
 
   /**
@@ -412,7 +279,7 @@ export class AuthManager {
    * @param token - PyPI UUID token
    */
   public async revokePypiToken(token: string): Promise<void> {
-    return this.revokeProtocolToken(token);
+    return this.provider.revokeToken(token);
   }
 
   // ========================================================================
@@ -426,7 +293,7 @@ export class AuthManager {
    * @returns RubyGems UUID token
    */
   public async createRubyGemsToken(userId: string, readonly: boolean = false): Promise<string> {
-    return this.createProtocolToken(userId, 'rubygems', readonly);
+    return this.provider.createToken(userId, 'rubygems', { readonly });
   }
 
   /**
@@ -435,7 +302,7 @@ export class AuthManager {
    * @returns Auth token object or null
    */
   public async validateRubyGemsToken(token: string): Promise<IAuthToken | null> {
-    return this.validateProtocolToken(token, 'rubygems');
+    return this.provider.validateToken(token, 'rubygems');
   }
 
   /**
@@ -443,211 +310,6 @@ export class AuthManager {
    * @param token - RubyGems UUID token
    */
   public async revokeRubyGemsToken(token: string): Promise<void> {
-    return this.revokeProtocolToken(token);
-  }
-
-  // ========================================================================
-  // UNIFIED AUTHENTICATION
-  // ========================================================================
-
-  /**
-   * Validate any token (NPM, Maven, OCI, PyPI, RubyGems, Composer, Cargo)
-   * Optimized: O(1) lookup when protocol hint provided
-   * @param tokenString - Token string (UUID or JWT)
-   * @param protocol - Expected protocol type (optional, improves performance)
-   * @returns Auth token object or null
-   */
-  public async validateToken(
-    tokenString: string,
-    protocol?: TRegistryProtocol
-  ): Promise<IAuthToken | null> {
-    // OCI uses JWT (contains dots), not UUID - check first if OCI is expected
-    if (protocol === 'oci' || tokenString.includes('.')) {
-      const ociToken = await this.validateOciToken(tokenString);
-      if (ociToken && (!protocol || protocol === 'oci')) {
-        return ociToken;
-      }
-      // If protocol was explicitly OCI but validation failed, return null
-      if (protocol === 'oci') {
-        return null;
-      }
-    }
-
-    // UUID-based tokens: single O(1) Map lookup
-    if (this.isValidUuid(tokenString)) {
-      const authToken = this.tokenStore.get(tokenString);
-      if (authToken) {
-        // If protocol specified, verify it matches
-        if (protocol && authToken.type !== protocol) {
-          return null;
-        }
-        // Check expiration
-        if (authToken.expiresAt && authToken.expiresAt < new Date()) {
-          this.tokenStore.delete(tokenString);
-          return null;
-        }
-        return authToken;
-      }
-    }
-
-    return null;
-  }
-
-  /**
-   * Check if token has permission for an action
-   * @param token - Auth token
-   * @param resource - Resource being accessed (e.g., "package:foo" or "repository:bar")
-   * @param action - Action being performed (read, write, push, pull, delete)
-   * @returns true if authorized
-   */
-  public async authorize(
-    token: IAuthToken | null,
-    resource: string,
-    action: string
-  ): Promise<boolean> {
-    if (!token) {
-      return false;
-    }
-
-    // Check readonly flag
-    if (token.readonly && ['write', 'push', 'delete'].includes(action)) {
-      return false;
-    }
-
-    // Check scopes
-    for (const scope of token.scopes) {
-      if (this.matchesScope(scope, resource, action)) {
-        return true;
-      }
-    }
-
-    return false;
-  }
-
-  // ========================================================================
-  // HELPER METHODS
-  // ========================================================================
-
-  /**
-   * Check if a scope matches a resource and action
-   * Scope format: "{protocol}:{type}:{name}:{action}"
-   * Examples:
-   * - "npm:*:*" - All NPM access
-   * - "npm:package:foo:*" - All actions on package foo
-   * - "npm:package:foo:read" - Read-only on package foo
-   * - "oci:repository:*:pull" - Pull from any OCI repo
-   */
-  private matchesScope(scope: string, resource: string, action: string): boolean {
-    const scopeParts = scope.split(':');
-    const resourceParts = resource.split(':');
-
-    // Scope must have at least protocol:type:name:action
-    if (scopeParts.length < 4) {
-      return false;
-    }
-
-    const [scopeProtocol, scopeType, scopeName, scopeAction] = scopeParts;
-    const [resourceProtocol, resourceType, resourceName] = resourceParts;
-
-    // Check protocol
-    if (scopeProtocol !== '*' && scopeProtocol !== resourceProtocol) {
-      return false;
-    }
-
-    // Check type
-    if (scopeType !== '*' && scopeType !== resourceType) {
-      return false;
-    }
-
-    // Check name
-    if (scopeName !== '*' && scopeName !== resourceName) {
-      return false;
-    }
-
-    // Check action
-    if (scopeAction !== '*' && scopeAction !== action) {
-      // Map action aliases
-      const actionAliases: Record<string, string[]> = {
-        read: ['pull', 'get'],
-        write: ['push', 'put', 'post'],
-      };
-
-      const aliases = actionAliases[scopeAction] || [];
-      if (!aliases.includes(action)) {
-        return false;
-      }
-    }
-
-    return true;
-  }
-
-  /**
-   * Convert unified scopes to OCI access array
-   */
-  private scopesToOciAccess(scopes: string[]): Array<{
-    type: string;
-    name: string;
-    actions: string[];
-  }> {
-    const access: Array<{type: string; name: string; actions: string[]}> = [];
-
-    for (const scope of scopes) {
-      const parts = scope.split(':');
-      if (parts.length >= 4 && parts[0] === 'oci') {
-        access.push({
-          type: parts[1],
-          name: parts[2],
-          actions: [parts[3]],
-        });
-      }
-    }
-
-    return access;
-  }
-
-  /**
-   * Convert OCI access array to unified scopes
-   */
-  private ociAccessToScopes(access: Array<{
-    type: string;
-    name: string;
-    actions: string[];
-  }>): string[] {
-    const scopes: string[] = [];
-
-    for (const item of access) {
-      for (const action of item.actions) {
-        scopes.push(`oci:${item.type}:${item.name}:${action}`);
-      }
-    }
-
-    return scopes;
-  }
-
-  /**
-   * Generate UUID for NPM tokens
-   */
-  private generateUuid(): string {
-    return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, (c) => {
-      const r = (Math.random() * 16) | 0;
-      const v = c === 'x' ? r : (r & 0x3) | 0x8;
-      return v.toString(16);
-    });
-  }
-
-  /**
-   * Check if string is a valid UUID
-   */
-  private isValidUuid(str: string): boolean {
-    const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
-    return uuidRegex.test(str);
-  }
-
-  /**
-   * Hash a token for identification (SHA-512 mock)
-   */
-  private hashToken(token: string): string {
-    // In production, use actual SHA-512
-    return `sha512-${token.substring(0, 16)}...`;
+    return this.provider.revokeToken(token);
   }
 }
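
With the protocol-specific helpers now delegating to the pluggable provider, callers keep the same AuthManager surface. A minimal usage sketch, assuming an `authManager` instance wired as in this diff and the unified `validateToken`/`authorize` delegations described in the changelog; the user ID, package name, and scope values are illustrative:

```typescript
// Issue a read-only PyPI token for a user.
const token = await authManager.createPypiToken('user-123', true);

// The unified validateToken accepts an optional protocol hint for an O(1) lookup.
const authToken = await authManager.validateToken(token, 'pypi');

// authorize() resolves scopes such as "pypi:*:*:read" against resource/action pairs.
const canRead = await authManager.authorize(authToken, 'pypi:package:requests', 'read');  // true
const canPush = await authManager.authorize(authToken, 'pypi:package:requests', 'write'); // false, token is readonly
```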
ts/core/classes.defaultauthprovider.ts (new file, 393 lines)
@@ -0,0 +1,393 @@
import * as crypto from 'crypto';
import type { IAuthProvider, ITokenOptions } from './interfaces.auth.js';
import type { IAuthConfig, IAuthToken, ICredentials, TRegistryProtocol } from './interfaces.core.js';

/**
 * Default in-memory authentication provider.
 * This is the reference implementation that stores tokens in memory.
 * For production use, implement IAuthProvider with Redis, database, or external auth.
 */
export class DefaultAuthProvider implements IAuthProvider {
  private tokenStore: Map<string, IAuthToken> = new Map();
  private userCredentials: Map<string, string> = new Map(); // username -> password hash (mock)

  constructor(private config: IAuthConfig) {}

  /**
   * Initialize the auth provider
   */
  public async init(): Promise<void> {
    // Initialize token store (in-memory for now)
    // In production, this could be Redis or a database
  }

  // ========================================================================
  // IAuthProvider Implementation
  // ========================================================================

  /**
   * Authenticate user credentials
   */
  public async authenticate(credentials: ICredentials): Promise<string | null> {
    // Mock authentication - in production, verify against database/LDAP
    const storedPassword = this.userCredentials.get(credentials.username);

    if (!storedPassword) {
      // Auto-register for testing (remove in production)
      this.userCredentials.set(credentials.username, credentials.password);
      return credentials.username;
    }

    if (storedPassword === credentials.password) {
      return credentials.username;
    }

    return null;
  }

  /**
   * Validate any token (NPM, Maven, OCI, PyPI, RubyGems, Composer, Cargo)
   */
  public async validateToken(
    tokenString: string,
    protocol?: TRegistryProtocol
  ): Promise<IAuthToken | null> {
    // OCI uses JWT (contains dots), not UUID - check first if OCI is expected
    if (protocol === 'oci' || tokenString.includes('.')) {
      const ociToken = await this.validateOciToken(tokenString);
      if (ociToken && (!protocol || protocol === 'oci')) {
        return ociToken;
      }
      // If protocol was explicitly OCI but validation failed, return null
      if (protocol === 'oci') {
        return null;
      }
    }

    // UUID-based tokens: single O(1) Map lookup
    if (this.isValidUuid(tokenString)) {
      const authToken = this.tokenStore.get(tokenString);
      if (authToken) {
        // If protocol specified, verify it matches
        if (protocol && authToken.type !== protocol) {
          return null;
        }
        // Check expiration
        if (authToken.expiresAt && authToken.expiresAt < new Date()) {
          this.tokenStore.delete(tokenString);
          return null;
        }
        return authToken;
      }
    }

    return null;
  }

  /**
   * Create a new token for a user
   */
  public async createToken(
    userId: string,
    protocol: TRegistryProtocol,
    options?: ITokenOptions
  ): Promise<string> {
    // OCI tokens use JWT
    if (protocol === 'oci') {
      return this.createOciToken(userId, options?.scopes || ['oci:*:*:*'], options?.expiresIn || 3600);
    }

    // All other protocols use UUID tokens
    const token = this.generateUuid();
    const scopes = options?.scopes || (options?.readonly
      ? [`${protocol}:*:*:read`]
      : [`${protocol}:*:*:*`]);

    const authToken: IAuthToken = {
      type: protocol,
      userId,
      scopes,
      readonly: options?.readonly,
      expiresAt: options?.expiresIn ? new Date(Date.now() + options.expiresIn * 1000) : undefined,
      metadata: {
        created: new Date().toISOString(),
      },
    };

    this.tokenStore.set(token, authToken);
    return token;
  }

  /**
   * Revoke a token
   */
  public async revokeToken(token: string): Promise<void> {
    this.tokenStore.delete(token);
  }

  /**
   * Check if token has permission for an action
   */
  public async authorize(
    token: IAuthToken | null,
    resource: string,
    action: string
  ): Promise<boolean> {
    if (!token) {
      return false;
    }

    // Check readonly flag
    if (token.readonly && ['write', 'push', 'delete'].includes(action)) {
      return false;
    }

    // Check scopes
    for (const scope of token.scopes) {
      if (this.matchesScope(scope, resource, action)) {
        return true;
      }
    }

    return false;
  }

  /**
   * List all tokens for a user
   */
  public async listUserTokens(userId: string): Promise<Array<{
    key: string;
    readonly: boolean;
    created: string;
    protocol?: TRegistryProtocol;
  }>> {
    const tokens: Array<{key: string; readonly: boolean; created: string; protocol?: TRegistryProtocol}> = [];

    for (const [token, authToken] of this.tokenStore.entries()) {
      if (authToken.userId === userId) {
        tokens.push({
          key: this.hashToken(token),
          readonly: authToken.readonly || false,
          created: authToken.metadata?.created || 'unknown',
          protocol: authToken.type,
        });
      }
    }

    return tokens;
  }

  // ========================================================================
  // OCI JWT Token Methods
  // ========================================================================

  /**
   * Create an OCI JWT token
   */
  private async createOciToken(
    userId: string,
    scopes: string[],
    expiresIn: number = 3600
  ): Promise<string> {
    if (!this.config.ociTokens?.enabled) {
      throw new Error('OCI tokens are not enabled');
    }

    const now = Math.floor(Date.now() / 1000);
    const payload = {
      iss: this.config.ociTokens.realm,
      sub: userId,
      aud: this.config.ociTokens.service,
      exp: now + expiresIn,
      nbf: now,
      iat: now,
      access: this.scopesToOciAccess(scopes),
    };

    // Create JWT with HMAC-SHA256 signature
    const header = { alg: 'HS256', typ: 'JWT' };
    const headerB64 = Buffer.from(JSON.stringify(header)).toString('base64url');
    const payloadB64 = Buffer.from(JSON.stringify(payload)).toString('base64url');

    const signature = crypto
      .createHmac('sha256', this.config.jwtSecret)
      .update(`${headerB64}.${payloadB64}`)
      .digest('base64url');

    return `${headerB64}.${payloadB64}.${signature}`;
  }

  /**
   * Validate an OCI JWT token
   */
  private async validateOciToken(jwt: string): Promise<IAuthToken | null> {
    try {
      const parts = jwt.split('.');
      if (parts.length !== 3) {
        return null;
      }

      const [headerB64, payloadB64, signatureB64] = parts;

      // Verify signature
      const expectedSignature = crypto
        .createHmac('sha256', this.config.jwtSecret)
        .update(`${headerB64}.${payloadB64}`)
        .digest('base64url');

      if (signatureB64 !== expectedSignature) {
        return null;
      }

      // Decode and parse payload
      const payload = JSON.parse(Buffer.from(payloadB64, 'base64url').toString('utf-8'));

      // Check expiration
      const now = Math.floor(Date.now() / 1000);
      if (payload.exp && payload.exp < now) {
        return null;
      }

      // Check not-before time
      if (payload.nbf && payload.nbf > now) {
        return null;
      }

      // Convert to unified token format
      const scopes = this.ociAccessToScopes(payload.access || []);

      return {
        type: 'oci',
        userId: payload.sub,
        scopes,
        expiresAt: payload.exp ? new Date(payload.exp * 1000) : undefined,
        metadata: {
          iss: payload.iss,
          aud: payload.aud,
        },
      };
    } catch (error) {
      return null;
    }
  }

  // ========================================================================
  // Helper Methods
  // ========================================================================

  /**
   * Check if a scope matches a resource and action
   */
  private matchesScope(scope: string, resource: string, action: string): boolean {
    const scopeParts = scope.split(':');
    const resourceParts = resource.split(':');

    // Scope must have at least protocol:type:name:action
    if (scopeParts.length < 4) {
      return false;
    }

    const [scopeProtocol, scopeType, scopeName, scopeAction] = scopeParts;
    const [resourceProtocol, resourceType, resourceName] = resourceParts;

    // Check protocol
    if (scopeProtocol !== '*' && scopeProtocol !== resourceProtocol) {
      return false;
    }

    // Check type
    if (scopeType !== '*' && scopeType !== resourceType) {
      return false;
    }

    // Check name
    if (scopeName !== '*' && scopeName !== resourceName) {
      return false;
    }

    // Check action
    if (scopeAction !== '*' && scopeAction !== action) {
      // Map action aliases
      const actionAliases: Record<string, string[]> = {
        read: ['pull', 'get'],
        write: ['push', 'put', 'post'],
      };

      const aliases = actionAliases[scopeAction] || [];
      if (!aliases.includes(action)) {
        return false;
      }
    }

    return true;
  }

  /**
   * Convert unified scopes to OCI access array
   */
  private scopesToOciAccess(scopes: string[]): Array<{
    type: string;
    name: string;
    actions: string[];
  }> {
    const access: Array<{type: string; name: string; actions: string[]}> = [];

    for (const scope of scopes) {
      const parts = scope.split(':');
      if (parts.length >= 4 && parts[0] === 'oci') {
        access.push({
          type: parts[1],
          name: parts[2],
          actions: [parts[3]],
        });
      }
    }

    return access;
  }

  /**
   * Convert OCI access array to unified scopes
   */
  private ociAccessToScopes(access: Array<{
    type: string;
    name: string;
    actions: string[];
  }>): string[] {
    const scopes: string[] = [];

    for (const item of access) {
      for (const action of item.actions) {
        scopes.push(`oci:${item.type}:${item.name}:${action}`);
      }
    }

    return scopes;
  }

  /**
   * Generate UUID for tokens
   */
  private generateUuid(): string {
    return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, (c) => {
      const r = (Math.random() * 16) | 0;
      const v = c === 'x' ? r : (r & 0x3) | 0x8;
      return v.toString(16);
    });
  }

  /**
   * Check if string is a valid UUID
   */
  private isValidUuid(str: string): boolean {
    const uuidRegex = /^[0-9a-f]{8}-[0-9a-f]{4}-4[0-9a-f]{3}-[89ab][0-9a-f]{3}-[0-9a-f]{12}$/i;
    return uuidRegex.test(str);
  }

  /**
   * Hash a token for identification
   */
  private hashToken(token: string): string {
    return `sha512-${token.substring(0, 16)}...`;
  }
}
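
A short sketch of exercising DefaultAuthProvider on its own. Only the config fields the provider actually reads (jwtSecret, ociTokens.enabled/realm/service) are filled in; their values are placeholders, and the full IAuthConfig shape is defined in interfaces.core.ts and may require more:

```typescript
import { DefaultAuthProvider } from './classes.defaultauthprovider.js';
import type { IAuthConfig } from './interfaces.core.js';

const config = {
  jwtSecret: 'change-me',
  ociTokens: { enabled: true, realm: 'https://registry.example.com/token', service: 'registry' },
} as IAuthConfig;

const provider = new DefaultAuthProvider(config);
await provider.init();

// UUID token for npm, scoped to "npm:*:*:read" because readonly is set.
const npmToken = await provider.createToken('alice', 'npm', { readonly: true });
console.log(await provider.validateToken(npmToken, 'npm')); // IAuthToken or null

// OCI tokens come back as HS256-signed JWTs instead of UUIDs.
const ociJwt = await provider.createToken('alice', 'oci', {
  scopes: ['oci:repository:library/app:pull'],
  expiresIn: 600,
});
const ociToken = await provider.validateToken(ociJwt, 'oci');
console.log(await provider.authorize(ociToken, 'oci:repository:library/app', 'pull')); // true
```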
@@ -1,17 +1,54 @@
 import * as plugins from '../plugins.js';
-import type { IStorageConfig, IStorageBackend } from './interfaces.core.js';
+import type { IStorageConfig, IStorageBackend, TRegistryProtocol } from './interfaces.core.js';
+import type {
+  IStorageHooks,
+  IStorageHookContext,
+  IStorageActor,
+  IStorageMetadata,
+} from './interfaces.storage.js';
 
 /**
- * Storage abstraction layer for registry
- * Provides a unified interface over SmartBucket
+ * Storage abstraction layer for registry.
+ * Provides a unified interface over SmartBucket with optional hooks
+ * for quota tracking, audit logging, cache invalidation, etc.
+ *
+ * @example
+ * ```typescript
+ * // Basic usage
+ * const storage = new RegistryStorage(config);
+ *
+ * // With hooks for quota tracking
+ * const storage = new RegistryStorage(config, {
+ *   beforePut: async (ctx) => {
+ *     const quota = await getQuota(ctx.actor?.orgId);
+ *     const usage = await getUsage(ctx.actor?.orgId);
+ *     if (usage + (ctx.metadata?.size || 0) > quota) {
+ *       return { allowed: false, reason: 'Quota exceeded' };
+ *     }
+ *     return { allowed: true };
+ *   },
+ *   afterPut: async (ctx) => {
+ *     await updateUsage(ctx.actor?.orgId, ctx.metadata?.size || 0);
+ *   }
+ * });
+ * ```
  */
 export class RegistryStorage implements IStorageBackend {
   private smartBucket: plugins.smartbucket.SmartBucket;
   private bucket: plugins.smartbucket.Bucket;
   private bucketName: string;
+  private hooks?: IStorageHooks;
 
-  constructor(private config: IStorageConfig) {
+  constructor(private config: IStorageConfig, hooks?: IStorageHooks) {
     this.bucketName = config.bucketName;
+    this.hooks = hooks;
+  }
+
+  /**
+   * Set storage hooks (can be called after construction)
+   */
+  public setHooks(hooks: IStorageHooks): void {
+    this.hooks = hooks;
   }
 
   /**
@@ -34,7 +71,24 @@ export class RegistryStorage implements IStorageBackend {
    */
   public async getObject(key: string): Promise<Buffer | null> {
     try {
-      return await this.bucket.fastGet({ path: key });
+      const data = await this.bucket.fastGet({ path: key });
+
+      // Call afterGet hook (non-blocking)
+      if (this.hooks?.afterGet && data) {
+        const context = this.currentContext;
+        if (context) {
+          this.hooks.afterGet({
+            operation: 'get',
+            key,
+            protocol: context.protocol,
+            actor: context.actor,
+            metadata: context.metadata,
+            timestamp: new Date(),
+          }).catch(() => {}); // Don't fail on hook errors
+        }
+      }
+
+      return data;
     } catch (error) {
       return null;
     }
@@ -48,19 +102,159 @@ export class RegistryStorage implements IStorageBackend {
     data: Buffer,
     metadata?: Record<string, string>
   ): Promise<void> {
+    // Call beforePut hook if available
+    if (this.hooks?.beforePut) {
+      const context = this.currentContext;
+      if (context) {
+        const hookContext: IStorageHookContext = {
+          operation: 'put',
+          key,
+          protocol: context.protocol,
+          actor: context.actor,
+          metadata: {
+            ...context.metadata,
+            size: data.length,
+          },
+          timestamp: new Date(),
+        };
+
+        const result = await this.hooks.beforePut(hookContext);
+        if (!result.allowed) {
+          throw new Error(result.reason || 'Storage operation denied by hook');
+        }
+      }
+    }
+
     // Note: SmartBucket doesn't support metadata yet
     await this.bucket.fastPut({
       path: key,
       contents: data,
       overwrite: true, // Always overwrite existing objects
     });
+
+    // Call afterPut hook (non-blocking)
+    if (this.hooks?.afterPut) {
+      const context = this.currentContext;
+      if (context) {
+        this.hooks.afterPut({
+          operation: 'put',
+          key,
+          protocol: context.protocol,
+          actor: context.actor,
+          metadata: {
+            ...context.metadata,
+            size: data.length,
+          },
+          timestamp: new Date(),
+        }).catch(() => {}); // Don't fail on hook errors
+      }
+    }
   }
 
   /**
    * Delete an object
    */
   public async deleteObject(key: string): Promise<void> {
+    // Call beforeDelete hook if available
+    if (this.hooks?.beforeDelete) {
+      const context = this.currentContext;
+      if (context) {
+        const hookContext: IStorageHookContext = {
+          operation: 'delete',
+          key,
+          protocol: context.protocol,
+          actor: context.actor,
+          metadata: context.metadata,
+          timestamp: new Date(),
+        };
+
+        const result = await this.hooks.beforeDelete(hookContext);
+        if (!result.allowed) {
+          throw new Error(result.reason || 'Delete operation denied by hook');
+        }
+      }
+    }
+
     await this.bucket.fastRemove({ path: key });
+
+    // Call afterDelete hook (non-blocking)
+    if (this.hooks?.afterDelete) {
+      const context = this.currentContext;
+      if (context) {
+        this.hooks.afterDelete({
+          operation: 'delete',
+          key,
+          protocol: context.protocol,
+          actor: context.actor,
+          metadata: context.metadata,
+          timestamp: new Date(),
+        }).catch(() => {}); // Don't fail on hook errors
+      }
+    }
+  }
+
+  // ========================================================================
+  // CONTEXT FOR HOOKS
+  // ========================================================================
+
+  /**
+   * Current operation context for hooks.
+   * Set this before performing storage operations to enable hooks.
+   */
+  private currentContext?: {
+    protocol: TRegistryProtocol;
+    actor?: IStorageActor;
+    metadata?: IStorageMetadata;
+  };
+
+  /**
+   * Set the current operation context for hooks.
+   * Call this before performing storage operations.
+   *
+   * @example
+   * ```typescript
+   * storage.setContext({
+   *   protocol: 'npm',
+   *   actor: { userId: 'user123', ip: '192.168.1.1' },
+   *   metadata: { packageName: 'lodash', version: '4.17.21' }
+   * });
+   * await storage.putNpmTarball('lodash', '4.17.21', tarball);
+   * storage.clearContext();
+   * ```
+   */
+  public setContext(context: {
+    protocol: TRegistryProtocol;
+    actor?: IStorageActor;
+    metadata?: IStorageMetadata;
+  }): void {
+    this.currentContext = context;
+  }
+
+  /**
+   * Clear the current operation context.
+   */
+  public clearContext(): void {
+    this.currentContext = undefined;
+  }
+
+  /**
+   * Execute a function with a temporary context.
+   * Context is automatically cleared after execution.
+   */
+  public async withContext<T>(
+    context: {
+      protocol: TRegistryProtocol;
+      actor?: IStorageActor;
+      metadata?: IStorageMetadata;
+    },
+    fn: () => Promise<T>
+  ): Promise<T> {
+    this.setContext(context);
+    try {
+      return await fn();
+    } finally {
+      this.clearContext();
+    }
   }
 
   /**
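
The hook plumbing only fires when an operation context is set, so callers typically wrap writes in `withContext`. A sketch, assuming `storage` was constructed with the quota hooks from the class docblock; the object key and metadata values here are illustrative, not a path convention defined by this diff:

```typescript
await storage.withContext(
  {
    protocol: 'npm',
    actor: { userId: 'user-123', orgId: 'acme' },
    metadata: { packageName: 'lodash', version: '4.17.21', size: tarball.length },
  },
  async () => {
    // beforePut runs first and may throw 'Quota exceeded';
    // afterPut fires non-blocking once the object is stored.
    await storage.putObject('npm/lodash/lodash-4.17.21.tgz', tarball);
  }
);
// The context is cleared automatically, even if putObject throws.
```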
@@ -2,9 +2,16 @@
  * Core registry infrastructure exports
  */
 
-// Interfaces
+// Core interfaces
 export * from './interfaces.core.js';
 
+// Auth interfaces and provider
+export * from './interfaces.auth.js';
+export { DefaultAuthProvider } from './classes.defaultauthprovider.js';
+
+// Storage interfaces and hooks
+export * from './interfaces.storage.js';
+
 // Classes
 export { BaseRegistry } from './classes.baseregistry.js';
 export { RegistryStorage } from './classes.registrystorage.js';
ts/core/interfaces.auth.ts (new file, 91 lines)
@@ -0,0 +1,91 @@
import type { IAuthToken, ICredentials, TRegistryProtocol } from './interfaces.core.js';

/**
 * Options for creating a token
 */
export interface ITokenOptions {
  /** Whether the token is readonly */
  readonly?: boolean;
  /** Permission scopes */
  scopes?: string[];
  /** Expiration time in seconds */
  expiresIn?: number;
}

/**
 * Pluggable authentication provider interface.
 * Implement this to integrate external auth systems (LDAP, OAuth, SSO, OIDC).
 *
 * @example
 * ```typescript
 * class LdapAuthProvider implements IAuthProvider {
 *   constructor(private ldap: LdapClient, private redis: RedisClient) {}
 *
 *   async authenticate(credentials: ICredentials): Promise<string | null> {
 *     return await this.ldap.bind(credentials.username, credentials.password);
 *   }
 *
 *   async validateToken(token: string): Promise<IAuthToken | null> {
 *     return await this.redis.get(`token:${token}`);
 *   }
 *   // ...
 * }
 * ```
 */
export interface IAuthProvider {
  /**
   * Initialize the auth provider (optional)
   */
  init?(): Promise<void>;

  /**
   * Authenticate user credentials (login flow)
   * @param credentials - Username and password
   * @returns User ID on success, null on failure
   */
  authenticate(credentials: ICredentials): Promise<string | null>;

  /**
   * Validate an existing token
   * @param token - Token string (UUID or JWT)
   * @param protocol - Optional protocol hint for optimization
   * @returns Auth token info or null if invalid
   */
  validateToken(token: string, protocol?: TRegistryProtocol): Promise<IAuthToken | null>;

  /**
   * Create a new token for a user
   * @param userId - User ID
   * @param protocol - Protocol type (npm, oci, maven, etc.)
   * @param options - Token options (readonly, scopes, expiration)
   * @returns Token string
   */
  createToken(userId: string, protocol: TRegistryProtocol, options?: ITokenOptions): Promise<string>;

  /**
   * Revoke a token
   * @param token - Token string to revoke
   */
  revokeToken(token: string): Promise<void>;

  /**
   * Check if user has permission for an action
   * @param token - Auth token (or null for anonymous)
   * @param resource - Resource being accessed (e.g., "npm:package:lodash")
   * @param action - Action being performed (read, write, push, pull, delete)
   * @returns true if authorized
   */
  authorize(token: IAuthToken | null, resource: string, action: string): Promise<boolean>;

  /**
   * List all tokens for a user (optional)
   * @param userId - User ID
   * @returns List of token info
   */
  listUserTokens?(userId: string): Promise<Array<{
    key: string;
    readonly: boolean;
    created: string;
    protocol?: TRegistryProtocol;
  }>>;
}
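
Beyond the LDAP example in the docblock, even a very small provider satisfies the interface. A hedged sketch of a static provider that could be useful in tests; every token format and credential value here is made up, and only the IAuthToken fields visible in this diff are populated:

```typescript
import type { IAuthProvider, ITokenOptions } from './interfaces.auth.js';
import type { IAuthToken, ICredentials, TRegistryProtocol } from './interfaces.core.js';

class StaticAuthProvider implements IAuthProvider {
  private tokens = new Map<string, IAuthToken>();

  async authenticate(credentials: ICredentials): Promise<string | null> {
    // Accept a single shared test password.
    return credentials.password === 'letmein' ? credentials.username : null;
  }

  async createToken(userId: string, protocol: TRegistryProtocol, options?: ITokenOptions): Promise<string> {
    const token = `${protocol}-${userId}-${Date.now()}`;
    this.tokens.set(token, {
      type: protocol,
      userId,
      scopes: options?.scopes ?? [`${protocol}:*:*:*`],
      readonly: options?.readonly,
    });
    return token;
  }

  async validateToken(token: string): Promise<IAuthToken | null> {
    return this.tokens.get(token) ?? null;
  }

  async revokeToken(token: string): Promise<void> {
    this.tokens.delete(token);
  }

  async authorize(token: IAuthToken | null, _resource: string, action: string): Promise<boolean> {
    if (!token) return false;
    // Minimal policy: readonly tokens cannot mutate anything.
    return !(token.readonly && ['write', 'push', 'delete'].includes(action));
  }
}
```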
@@ -4,6 +4,8 @@
 
 import type * as plugins from '../plugins.js';
 import type { IProtocolUpstreamConfig } from '../upstream/interfaces.upstream.js';
+import type { IAuthProvider } from './interfaces.auth.js';
+import type { IStorageHooks } from './interfaces.storage.js';
 
 /**
  * Registry protocol types
@@ -97,6 +99,20 @@ export interface IProtocolConfig {
 export interface IRegistryConfig {
   storage: IStorageConfig;
   auth: IAuthConfig;
+
+  /**
+   * Custom authentication provider.
+   * If not provided, uses the default in-memory auth provider.
+   * Implement IAuthProvider to integrate LDAP, OAuth, SSO, etc.
+   */
+  authProvider?: IAuthProvider;
+
+  /**
+   * Storage event hooks for quota tracking, audit logging, etc.
+   * Called before/after storage operations.
+   */
+  storageHooks?: IStorageHooks;
+
   oci?: IProtocolConfig;
   npm?: IProtocolConfig;
   maven?: IProtocolConfig;
@@ -152,6 +168,24 @@ export interface IRegistryError {
   }>;
 }
 
+/**
+ * Actor information - identifies who is performing the request
+ */
+export interface IRequestActor {
+  /** User ID (from validated token) */
+  userId?: string;
+  /** Token ID/hash for audit purposes */
+  tokenId?: string;
+  /** Client IP address */
+  ip?: string;
+  /** Client User-Agent */
+  userAgent?: string;
+  /** Organization ID (for multi-tenant setups) */
+  orgId?: string;
+  /** Session ID */
+  sessionId?: string;
+}
+
 /**
  * Base request context
  */
@@ -168,6 +202,11 @@ export interface IRequestContext {
    */
   rawBody?: Buffer;
   token?: string;
+  /**
+   * Actor information - identifies who is performing the request.
+   * Populated after authentication for audit logging, quota enforcement, etc.
+   */
+  actor?: IRequestActor;
 }
 
 /**
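
The two new IRegistryConfig fields are wired together like this; the sketch uses Partial<IRegistryConfig> to skip the storage/auth sub-configs whose full shapes live elsewhere in interfaces.core.ts, and MyLdapAuthProvider is a hypothetical IAuthProvider implementation, not a class from this diff:

```typescript
import type { IRegistryConfig } from './interfaces.core.js';
import type { IStorageHooks } from './interfaces.storage.js';

const storageHooks: IStorageHooks = {
  async afterPut(ctx) {
    // Simple audit line built from the hook context fields defined above.
    console.log(`[audit] ${ctx.actor?.userId ?? 'anonymous'} put ${ctx.key} (${ctx.metadata?.size ?? 0} bytes)`);
  },
};

const config: Partial<IRegistryConfig> = {
  authProvider: new MyLdapAuthProvider(), // omit to fall back to the in-memory default provider
  storageHooks,
};
```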
ts/core/interfaces.storage.ts (new file, 130 lines)
@@ -0,0 +1,130 @@
import type { TRegistryProtocol } from './interfaces.core.js';

/**
 * Actor information from request context
 */
export interface IStorageActor {
  userId?: string;
  tokenId?: string;
  ip?: string;
  userAgent?: string;
  orgId?: string;
  sessionId?: string;
}

/**
 * Metadata about the storage operation
 */
export interface IStorageMetadata {
  /** Content type of the object */
  contentType?: string;
  /** Size in bytes */
  size?: number;
  /** Content digest (e.g., sha256:abc123) */
  digest?: string;
  /** Package/artifact name */
  packageName?: string;
  /** Version */
  version?: string;
}

/**
 * Context passed to storage hooks
 */
export interface IStorageHookContext {
  /** Type of operation */
  operation: 'put' | 'delete' | 'get';
  /** Storage key/path */
  key: string;
  /** Protocol that triggered this operation */
  protocol: TRegistryProtocol;
  /** Actor who performed the operation (if known) */
  actor?: IStorageActor;
  /** Metadata about the object */
  metadata?: IStorageMetadata;
  /** Timestamp of the operation */
  timestamp: Date;
}

/**
 * Result from a beforePut hook that can modify the operation
 */
export interface IBeforePutResult {
  /** Whether to allow the operation */
  allowed: boolean;
  /** Optional reason for rejection */
  reason?: string;
  /** Optional modified metadata */
  metadata?: IStorageMetadata;
}

/**
 * Result from a beforeDelete hook
 */
export interface IBeforeDeleteResult {
  /** Whether to allow the operation */
  allowed: boolean;
  /** Optional reason for rejection */
  reason?: string;
}

/**
 * Storage event hooks for quota tracking, audit logging, cache invalidation, etc.
 *
 * @example
 * ```typescript
 * const quotaHooks: IStorageHooks = {
 *   async beforePut(context) {
 *     const quota = await getQuota(context.actor?.orgId);
 *     const currentUsage = await getUsage(context.actor?.orgId);
 *     if (currentUsage + (context.metadata?.size || 0) > quota) {
 *       return { allowed: false, reason: 'Quota exceeded' };
 *     }
 *     return { allowed: true };
 *   },
 *
 *   async afterPut(context) {
 *     await updateUsage(context.actor?.orgId, context.metadata?.size || 0);
 *     await auditLog('storage.put', context);
 *   },
 *
 *   async afterDelete(context) {
 *     await invalidateCache(context.key);
 *   }
 * };
 * ```
 */
export interface IStorageHooks {
  /**
   * Called before storing an object.
   * Return { allowed: false } to reject the operation.
   * Use for quota checks, virus scanning, validation, etc.
   */
  beforePut?(context: IStorageHookContext): Promise<IBeforePutResult>;

  /**
   * Called after successfully storing an object.
   * Use for quota tracking, audit logging, notifications, etc.
   */
  afterPut?(context: IStorageHookContext): Promise<void>;

  /**
   * Called before deleting an object.
   * Return { allowed: false } to reject the operation.
   * Use for preventing deletion of protected resources.
   */
  beforeDelete?(context: IStorageHookContext): Promise<IBeforeDeleteResult>;

  /**
   * Called after successfully deleting an object.
   * Use for quota updates, audit logging, cache invalidation, etc.
   */
  afterDelete?(context: IStorageHookContext): Promise<void>;

  /**
   * Called after reading an object.
   * Use for access logging, analytics, etc.
   * Note: This is called even for cache hits.
   */
  afterGet?(context: IStorageHookContext): Promise<void>;
}
|
|||||||
return null;
|
return null;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Get applicable upstreams sorted by priority
|
||||||
|
const applicableUpstreams = this.getApplicableUpstreams(context.resource);
|
||||||
|
|
||||||
|
if (applicableUpstreams.length === 0) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Use the first applicable upstream's URL for cache key
|
||||||
|
const primaryUpstreamUrl = applicableUpstreams[0]?.url;
|
||||||
|
|
||||||
// Check cache first
|
// Check cache first
|
||||||
const cached = this.cache.get(context);
|
const cached = await this.cache.get(context, primaryUpstreamUrl);
|
||||||
if (cached && !cached.stale) {
|
if (cached && !cached.stale) {
|
||||||
return {
|
return {
|
||||||
success: true,
|
success: true,
|
||||||
@@ -125,7 +135,7 @@ export abstract class BaseUpstream {
|
|||||||
}
|
}
|
||||||
|
|
||||||
// Check for negative cache (recent 404)
|
// Check for negative cache (recent 404)
|
||||||
if (this.cache.hasNegative(context)) {
|
if (await this.cache.hasNegative(context, primaryUpstreamUrl)) {
|
||||||
return {
|
return {
|
||||||
success: false,
|
success: false,
|
||||||
status: 404,
|
status: 404,
|
||||||
@@ -136,13 +146,6 @@ export abstract class BaseUpstream {
|
|||||||
};
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
// Get applicable upstreams sorted by priority
|
|
||||||
const applicableUpstreams = this.getApplicableUpstreams(context.resource);
|
|
||||||
|
|
||||||
if (applicableUpstreams.length === 0) {
|
|
||||||
return null;
|
|
||||||
}
|
|
||||||
|
|
||||||
// If we have stale cache, return it immediately and revalidate in background
|
// If we have stale cache, return it immediately and revalidate in background
|
||||||
if (cached?.stale && this.cacheConfig.staleWhileRevalidate) {
|
if (cached?.stale && this.cacheConfig.staleWhileRevalidate) {
|
||||||
// Fire and forget revalidation
|
// Fire and forget revalidation
|
||||||
@@ -173,18 +176,19 @@ export abstract class BaseUpstream {
|
|||||||
|
|
||||||
// Cache successful responses
|
// Cache successful responses
|
||||||
if (result.success && result.body) {
|
if (result.success && result.body) {
|
||||||
this.cache.set(
|
await this.cache.set(
|
||||||
context,
|
context,
|
||||||
Buffer.isBuffer(result.body) ? result.body : Buffer.from(JSON.stringify(result.body)),
|
Buffer.isBuffer(result.body) ? result.body : Buffer.from(JSON.stringify(result.body)),
|
||||||
result.headers['content-type'] || 'application/octet-stream',
|
result.headers['content-type'] || 'application/octet-stream',
|
||||||
result.headers,
|
result.headers,
|
||||||
upstream.id,
|
upstream.id,
|
||||||
|
upstream.url,
|
||||||
);
|
);
|
||||||
}
|
}
|
||||||
|
|
||||||
// Cache 404 responses
|
// Cache 404 responses
|
||||||
if (result.status === 404) {
|
if (result.status === 404) {
|
||||||
this.cache.setNegative(context, upstream.id);
|
await this.cache.setNegative(context, upstream.id, upstream.url);
|
||||||
}
|
}
|
||||||
|
|
||||||
return result;
|
return result;
|
||||||
@@ -210,15 +214,15 @@ export abstract class BaseUpstream {
|
|||||||
/**
|
/**
|
||||||
* Invalidate cache for a resource pattern.
|
* Invalidate cache for a resource pattern.
|
||||||
*/
|
*/
|
||||||
public invalidateCache(pattern: RegExp): number {
|
public async invalidateCache(pattern: RegExp): Promise<number> {
|
||||||
return this.cache.invalidatePattern(pattern);
|
return this.cache.invalidatePattern(pattern);
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Clear all cache entries.
|
* Clear all cache entries.
|
||||||
*/
|
*/
|
||||||
public clearCache(): void {
|
public async clearCache(): Promise<void> {
|
||||||
this.cache.clear();
|
await this.cache.clear();
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
@@ -501,12 +505,13 @@ export abstract class BaseUpstream {
|
|||||||
);
|
);
|
||||||
|
|
||||||
if (result.success && result.body) {
|
if (result.success && result.body) {
|
||||||
this.cache.set(
|
await this.cache.set(
|
||||||
context,
|
context,
|
||||||
Buffer.isBuffer(result.body) ? result.body : Buffer.from(JSON.stringify(result.body)),
|
Buffer.isBuffer(result.body) ? result.body : Buffer.from(JSON.stringify(result.body)),
|
||||||
result.headers['content-type'] || 'application/octet-stream',
|
result.headers['content-type'] || 'application/octet-stream',
|
||||||
result.headers,
|
result.headers,
|
||||||
upstream.id,
|
upstream.id,
|
||||||
|
upstream.url,
|
||||||
);
|
);
|
||||||
return; // Successfully revalidated
|
return; // Successfully revalidated
|
||||||
}
|
}
|
||||||
|
|||||||
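
Because the cache layer is now awaited end to end, registry code that clears or invalidates upstream entries needs an `await`. A sketch, assuming `upstream` is any concrete BaseUpstream subclass (for example an NpmUpstream instance); the package name in the pattern is illustrative:

```typescript
// Invalidate every cached response whose key mentions the package,
// e.g. after a new version is published upstream.
const invalidated = await upstream.invalidateCache(/lodash/);
console.log(`${invalidated} cache entries dropped`);

// Drop everything, e.g. when rotating upstream credentials.
await upstream.clearCache();
```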
@@ -4,9 +4,23 @@ import type {
|
|||||||
IUpstreamFetchContext,
|
IUpstreamFetchContext,
|
||||||
} from './interfaces.upstream.js';
|
} from './interfaces.upstream.js';
|
||||||
import { DEFAULT_CACHE_CONFIG } from './interfaces.upstream.js';
|
import { DEFAULT_CACHE_CONFIG } from './interfaces.upstream.js';
|
||||||
|
import type { IStorageBackend } from '../core/interfaces.core.js';
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* In-memory cache for upstream responses.
|
* Cache metadata stored alongside cache entries.
|
||||||
|
*/
|
||||||
|
interface ICacheMetadata {
|
||||||
|
contentType: string;
|
||||||
|
headers: Record<string, string>;
|
||||||
|
cachedAt: string;
|
||||||
|
expiresAt?: string;
|
||||||
|
etag?: string;
|
||||||
|
upstreamId: string;
|
||||||
|
upstreamUrl: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* S3-backed upstream cache with in-memory hot layer.
|
||||||
*
|
*
|
||||||
* Features:
|
* Features:
|
||||||
* - TTL-based expiration
|
* - TTL-based expiration
|
||||||
@@ -14,26 +28,45 @@ import { DEFAULT_CACHE_CONFIG } from './interfaces.upstream.js';
|
|||||||
* - Negative caching (404s)
|
* - Negative caching (404s)
|
||||||
* - Content-type aware caching
|
* - Content-type aware caching
|
||||||
* - ETag support for conditional requests
|
* - ETag support for conditional requests
|
||||||
|
* - Multi-upstream support via URL-based cache paths
|
||||||
|
* - Persistent S3 storage with in-memory hot layer
|
||||||
*
|
*
|
||||||
* Note: This is an in-memory implementation. For production with persistence,
|
* Cache paths are structured as:
|
||||||
* extend this class to use RegistryStorage for S3-backed caching.
|
* cache/{escaped-upstream-url}/{protocol}:{method}:{path}
|
||||||
|
*
|
||||||
|
* @example
|
||||||
|
* ```typescript
|
||||||
|
* // In-memory only (default)
|
||||||
|
* const cache = new UpstreamCache(config);
|
||||||
|
*
|
||||||
|
* // With S3 persistence
|
||||||
|
* const cache = new UpstreamCache(config, 10000, storage);
|
||||||
|
* ```
|
||||||
*/
|
*/
|
||||||
export class UpstreamCache {
|
export class UpstreamCache {
|
||||||
/** Cache storage */
|
/** In-memory hot cache */
|
||||||
private readonly cache: Map<string, ICacheEntry> = new Map();
|
private readonly memoryCache: Map<string, ICacheEntry> = new Map();
|
||||||
|
|
||||||
/** Configuration */
|
/** Configuration */
|
||||||
private readonly config: IUpstreamCacheConfig;
|
private readonly config: IUpstreamCacheConfig;
|
||||||
|
|
||||||
/** Maximum cache entries (prevents memory bloat) */
|
/** Maximum in-memory cache entries */
|
||||||
private readonly maxEntries: number;
|
private readonly maxMemoryEntries: number;
|
||||||
|
|
||||||
|
/** S3 storage backend (optional) */
|
||||||
|
private readonly storage?: IStorageBackend;
|
||||||
|
|
||||||
/** Cleanup interval handle */
|
/** Cleanup interval handle */
|
||||||
private cleanupInterval: ReturnType<typeof setInterval> | null = null;
|
private cleanupInterval: ReturnType<typeof setInterval> | null = null;
|
||||||
|
|
||||||
constructor(config?: Partial<IUpstreamCacheConfig>, maxEntries: number = 10000) {
|
constructor(
|
||||||
|
config?: Partial<IUpstreamCacheConfig>,
|
||||||
|
maxMemoryEntries: number = 10000,
|
||||||
|
storage?: IStorageBackend
|
||||||
|
) {
|
||||||
this.config = { ...DEFAULT_CACHE_CONFIG, ...config };
|
this.config = { ...DEFAULT_CACHE_CONFIG, ...config };
|
||||||
this.maxEntries = maxEntries;
|
this.maxMemoryEntries = maxMemoryEntries;
|
||||||
|
this.storage = storage;
|
||||||
|
|
||||||
// Start periodic cleanup if caching is enabled
|
// Start periodic cleanup if caching is enabled
|
||||||
if (this.config.enabled) {
|
if (this.config.enabled) {
|
||||||
@@ -48,17 +81,36 @@ export class UpstreamCache {
|
|||||||
return this.config.enabled;
|
return this.config.enabled;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Check if S3 storage is configured.
|
||||||
|
*/
|
||||||
|
public hasStorage(): boolean {
|
||||||
|
return !!this.storage;
|
||||||
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Get cached entry for a request context.
|
* Get cached entry for a request context.
|
||||||
|
* Checks memory first, then falls back to S3.
|
||||||
* Returns null if not found or expired (unless stale-while-revalidate).
|
* Returns null if not found or expired (unless stale-while-revalidate).
|
||||||
*/
|
*/
|
||||||
public get(context: IUpstreamFetchContext): ICacheEntry | null {
|
public async get(context: IUpstreamFetchContext, upstreamUrl?: string): Promise<ICacheEntry | null> {
|
||||||
if (!this.config.enabled) {
|
if (!this.config.enabled) {
|
||||||
return null;
|
return null;
|
||||||
}
|
}
|
||||||
|
|
||||||
const key = this.buildCacheKey(context);
|
const key = this.buildCacheKey(context, upstreamUrl);
|
||||||
const entry = this.cache.get(key);
|
|
||||||
|
// Check memory cache first
|
||||||
|
let entry = this.memoryCache.get(key);
|
||||||
|
|
||||||
|
// If not in memory and we have storage, check S3
|
||||||
|
if (!entry && this.storage) {
|
||||||
|
entry = await this.loadFromStorage(key);
|
||||||
|
if (entry) {
|
||||||
|
// Promote to memory cache
|
||||||
|
this.memoryCache.set(key, entry);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
if (!entry) {
|
if (!entry) {
|
||||||
return null;
|
return null;
|
||||||
@@ -78,7 +130,10 @@ export class UpstreamCache {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
// Entry is too old, remove it
|
// Entry is too old, remove it
|
||||||
this.cache.delete(key);
|
this.memoryCache.delete(key);
|
||||||
|
if (this.storage) {
|
||||||
|
await this.deleteFromStorage(key).catch(() => {});
|
||||||
|
}
|
||||||
return null;
|
return null;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -86,26 +141,27 @@ export class UpstreamCache {
|
|||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Store a response in the cache.
|
* Store a response in the cache (memory and optionally S3).
|
||||||
*/
|
*/
|
||||||
public set(
|
public async set(
|
||||||
context: IUpstreamFetchContext,
|
context: IUpstreamFetchContext,
|
||||||
data: Buffer,
|
data: Buffer,
|
||||||
contentType: string,
|
contentType: string,
|
||||||
headers: Record<string, string>,
|
headers: Record<string, string>,
|
||||||
upstreamId: string,
|
upstreamId: string,
|
||||||
|
upstreamUrl: string,
|
||||||
options?: ICacheSetOptions,
|
options?: ICacheSetOptions,
|
||||||
): void {
|
): Promise<void> {
|
||||||
if (!this.config.enabled) {
|
if (!this.config.enabled) {
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
// Enforce max entries limit
|
// Enforce max memory entries limit
|
||||||
if (this.cache.size >= this.maxEntries) {
|
if (this.memoryCache.size >= this.maxMemoryEntries) {
|
||||||
this.evictOldest();
|
this.evictOldest();
|
||||||
}
|
}
|
||||||
|
|
||||||
const key = this.buildCacheKey(context);
|
const key = this.buildCacheKey(context, upstreamUrl);
|
||||||
const now = new Date();
|
const now = new Date();
|
||||||
|
|
||||||
// Determine TTL based on content type
|
// Determine TTL based on content type
|
||||||
@@ -122,18 +178,24 @@ export class UpstreamCache {
|
|||||||
stale: false,
|
stale: false,
|
||||||
};
|
};
|
||||||
|
|
||||||
this.cache.set(key, entry);
|
// Store in memory
|
||||||
|
this.memoryCache.set(key, entry);
|
||||||
|
|
||||||
|
// Store in S3 if available
|
||||||
|
if (this.storage) {
|
||||||
|
await this.saveToStorage(key, entry, upstreamUrl).catch(() => {});
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Store a negative cache entry (404 response).
|
* Store a negative cache entry (404 response).
|
||||||
*/
|
*/
|
||||||
public setNegative(context: IUpstreamFetchContext, upstreamId: string): void {
|
public async setNegative(context: IUpstreamFetchContext, upstreamId: string, upstreamUrl: string): Promise<void> {
|
||||||
if (!this.config.enabled || this.config.negativeCacheTtlSeconds <= 0) {
|
if (!this.config.enabled || this.config.negativeCacheTtlSeconds <= 0) {
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
const key = this.buildCacheKey(context);
|
const key = this.buildCacheKey(context, upstreamUrl);
|
||||||
const now = new Date();
|
const now = new Date();
|
||||||
|
|
||||||
const entry: ICacheEntry = {
|
const entry: ICacheEntry = {
|
||||||
@@ -146,34 +208,47 @@ export class UpstreamCache {
|
|||||||
stale: false,
|
stale: false,
|
||||||
};
|
};
|
||||||
|
|
||||||
this.cache.set(key, entry);
|
this.memoryCache.set(key, entry);
|
||||||
|
|
||||||
|
if (this.storage) {
|
||||||
|
await this.saveToStorage(key, entry, upstreamUrl).catch(() => {});
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Check if there's a negative cache entry for this context.
|
* Check if there's a negative cache entry for this context.
|
||||||
*/
|
*/
|
||||||
public hasNegative(context: IUpstreamFetchContext): boolean {
|
public async hasNegative(context: IUpstreamFetchContext, upstreamUrl?: string): Promise<boolean> {
|
||||||
const entry = this.get(context);
|
const entry = await this.get(context, upstreamUrl);
|
||||||
return entry !== null && entry.data.length === 0;
|
return entry !== null && entry.data.length === 0;
|
||||||
}
|
}
|
||||||
 
   /**
    * Invalidate a specific cache entry.
    */
-  public invalidate(context: IUpstreamFetchContext): boolean {
+  public async invalidate(context: IUpstreamFetchContext, upstreamUrl?: string): Promise<boolean> {
-    const key = this.buildCacheKey(context);
+    const key = this.buildCacheKey(context, upstreamUrl);
-    return this.cache.delete(key);
+    const deleted = this.memoryCache.delete(key);
+
+    if (this.storage) {
+      await this.deleteFromStorage(key).catch(() => {});
+    }
+
+    return deleted;
   }
 
   /**
    * Invalidate all entries matching a pattern.
    * Useful for invalidating all versions of a package.
    */
-  public invalidatePattern(pattern: RegExp): number {
+  public async invalidatePattern(pattern: RegExp): Promise<number> {
     let count = 0;
-    for (const key of this.cache.keys()) {
+    for (const key of this.memoryCache.keys()) {
       if (pattern.test(key)) {
-        this.cache.delete(key);
+        this.memoryCache.delete(key);
+        if (this.storage) {
+          await this.deleteFromStorage(key).catch(() => {});
+        }
         count++;
       }
     }
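Because cache keys embed the protocol, method, and path (see `buildCacheKey()` further down), `invalidatePattern()` can drop every cached response for a package in one call. A hypothetical helper, assuming npm-style paths:

```typescript
// Hypothetical helper (not part of the diff). Keys look roughly like
// "registry.npmjs.org/npm:GET:/lodash" or ".../npm:GET:/lodash/-/lodash-4.17.21.tgz",
// so a path-anchored RegExp catches the packument and all tarballs.
interface IPatternInvalidator {
  invalidatePattern(pattern: RegExp): Promise<number>;
}

async function invalidatePackage(cache: IPatternInvalidator, packageName: string): Promise<number> {
  const escaped = packageName.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); // escape RegExp metacharacters
  return cache.invalidatePattern(new RegExp(`:GET:/${escaped}(/|\\?|$)`));
}
```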
@@ -183,11 +258,14 @@ export class UpstreamCache {
 
   /**
    * Invalidate all entries from a specific upstream.
    */
-  public invalidateUpstream(upstreamId: string): number {
+  public async invalidateUpstream(upstreamId: string): Promise<number> {
     let count = 0;
-    for (const [key, entry] of this.cache.entries()) {
+    for (const [key, entry] of this.memoryCache.entries()) {
       if (entry.upstreamId === upstreamId) {
-        this.cache.delete(key);
+        this.memoryCache.delete(key);
+        if (this.storage) {
+          await this.deleteFromStorage(key).catch(() => {});
+        }
         count++;
       }
     }
@@ -195,10 +273,13 @@ export class UpstreamCache {
   }
 
   /**
-   * Clear all cache entries.
+   * Clear all cache entries (memory and S3).
    */
-  public clear(): void {
+  public async clear(): Promise<void> {
-    this.cache.clear();
+    this.memoryCache.clear();
+
+    // Note: S3 cleanup would require listing and deleting all cache/* objects
+    // This is left as a future enhancement for bulk cleanup
   }
 
   /**
@@ -211,7 +292,7 @@ export class UpstreamCache {
     let totalSize = 0;
     const now = new Date();
 
-    for (const entry of this.cache.values()) {
+    for (const entry of this.memoryCache.values()) {
       totalSize += entry.data.length;
 
       if (entry.data.length === 0) {
@@ -224,13 +305,14 @@ export class UpstreamCache {
     }
 
     return {
-      totalEntries: this.cache.size,
+      totalEntries: this.memoryCache.size,
       freshEntries: freshCount,
       staleEntries: staleCount,
       negativeEntries: negativeCount,
       totalSizeBytes: totalSize,
-      maxEntries: this.maxEntries,
+      maxEntries: this.maxMemoryEntries,
       enabled: this.config.enabled,
+      hasStorage: !!this.storage,
     };
   }
 
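The stats object now also reports whether an S3 tier is configured. A small illustrative consumer (the parameter type mirrors `ICacheStats` as extended at the bottom of this diff; nothing here is part of the diff itself):

```typescript
// Illustrative consumer only; ICacheStatsLike mirrors the extended ICacheStats shape.
interface ICacheStatsLike {
  totalEntries: number;
  freshEntries: number;
  staleEntries: number;
  negativeEntries: number;
  totalSizeBytes: number;
  maxEntries: number;
  enabled: boolean;
  hasStorage: boolean;
}

function formatCacheStats(stats: ICacheStatsLike): string {
  const tier = stats.hasStorage ? 'memory + S3' : 'memory only';
  return `${stats.totalEntries}/${stats.maxEntries} entries (${tier}), `
    + `${stats.freshEntries} fresh / ${stats.staleEntries} stale / ${stats.negativeEntries} negative, `
    + `${(stats.totalSizeBytes / 1024).toFixed(1)} KiB held in memory`;
}
```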
@@ -244,17 +326,136 @@ export class UpstreamCache {
     }
   }
 
+  // ========================================================================
+  // Storage Methods
+  // ========================================================================
+
+  /**
+   * Build storage path for a cache key.
+   * Escapes upstream URL for safe use in S3 paths.
+   */
+  private buildStoragePath(key: string): string {
+    return `cache/${key}`;
+  }
+
+  /**
+   * Build storage path for cache metadata.
+   */
+  private buildMetadataPath(key: string): string {
+    return `cache/${key}.meta`;
+  }
+
+  /**
+   * Load a cache entry from S3 storage.
+   */
+  private async loadFromStorage(key: string): Promise<ICacheEntry | null> {
+    if (!this.storage) return null;
+
+    try {
+      const dataPath = this.buildStoragePath(key);
+      const metaPath = this.buildMetadataPath(key);
+
+      // Load data and metadata in parallel
+      const [data, metaBuffer] = await Promise.all([
+        this.storage.getObject(dataPath),
+        this.storage.getObject(metaPath),
+      ]);
+
+      if (!data || !metaBuffer) {
+        return null;
+      }
+
+      const meta: ICacheMetadata = JSON.parse(metaBuffer.toString('utf-8'));
+
+      return {
+        data,
+        contentType: meta.contentType,
+        headers: meta.headers,
+        cachedAt: new Date(meta.cachedAt),
+        expiresAt: meta.expiresAt ? new Date(meta.expiresAt) : undefined,
+        etag: meta.etag,
+        upstreamId: meta.upstreamId,
+        stale: false,
+      };
+    } catch {
+      return null;
+    }
+  }
+
+  /**
+   * Save a cache entry to S3 storage.
+   */
+  private async saveToStorage(key: string, entry: ICacheEntry, upstreamUrl: string): Promise<void> {
+    if (!this.storage) return;
+
+    const dataPath = this.buildStoragePath(key);
+    const metaPath = this.buildMetadataPath(key);
+
+    const meta: ICacheMetadata = {
+      contentType: entry.contentType,
+      headers: entry.headers,
+      cachedAt: entry.cachedAt.toISOString(),
+      expiresAt: entry.expiresAt?.toISOString(),
+      etag: entry.etag,
+      upstreamId: entry.upstreamId,
+      upstreamUrl,
+    };
+
+    // Save data and metadata in parallel
+    await Promise.all([
+      this.storage.putObject(dataPath, entry.data),
+      this.storage.putObject(metaPath, Buffer.from(JSON.stringify(meta), 'utf-8')),
+    ]);
+  }
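Each entry is therefore persisted as two S3 objects: the raw body at `cache/<key>` and a JSON sidecar at `cache/<key>.meta`. A sketch of what one persisted entry might look like (the key, values, and upstream are invented for illustration; only the metadata field names mirror the object built in `saveToStorage()`):

```typescript
// Invented example of the resulting S3 layout for a single cached packument.
const key = 'registry.npmjs.org/npm:GET:/lodash';

const bodyObject = {
  path: `cache/${key}`,              // raw cached response body
  data: Buffer.from('{"name":"lodash"}'),
};

const metaObject = {
  path: `cache/${key}.meta`,         // JSON sidecar read back by loadFromStorage()
  data: Buffer.from(JSON.stringify({
    contentType: 'application/json',
    headers: { 'content-type': 'application/json' },
    cachedAt: '2025-11-27T12:00:00.000Z',
    expiresAt: '2025-11-27T12:05:00.000Z',
    etag: 'W/"abc123"',
    upstreamId: 'npmjs',
    upstreamUrl: 'https://registry.npmjs.org',
  }), 'utf-8'),
};
```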
+
+  /**
+   * Delete a cache entry from S3 storage.
+   */
+  private async deleteFromStorage(key: string): Promise<void> {
+    if (!this.storage) return;
+
+    const dataPath = this.buildStoragePath(key);
+    const metaPath = this.buildMetadataPath(key);
+
+    await Promise.all([
+      this.storage.deleteObject(dataPath).catch(() => {}),
+      this.storage.deleteObject(metaPath).catch(() => {}),
+    ]);
+  }
+
+  // ========================================================================
+  // Helper Methods
+  // ========================================================================
+
+  /**
+   * Escape a URL for safe use in storage paths.
+   */
+  private escapeUrl(url: string): string {
+    // Remove protocol prefix and escape special characters
+    return url
+      .replace(/^https?:\/\//, '')
+      .replace(/[\/\\:*?"<>|]/g, '_')
+      .replace(/__+/g, '_');
+  }
+
   /**
    * Build a unique cache key for a request context.
+   * Includes escaped upstream URL for multi-upstream support.
    */
-  private buildCacheKey(context: IUpstreamFetchContext): string {
+  private buildCacheKey(context: IUpstreamFetchContext, upstreamUrl?: string): string {
     // Include method, protocol, path, and sorted query params
     const queryString = Object.keys(context.query)
       .sort()
       .map(k => `${k}=${context.query[k]}`)
       .join('&');
 
-    return `${context.protocol}:${context.method}:${context.path}${queryString ? '?' + queryString : ''}`;
+    const baseKey = `${context.protocol}:${context.method}:${context.path}${queryString ? '?' + queryString : ''}`;
+
+    if (upstreamUrl) {
+      return `${this.escapeUrl(upstreamUrl)}/${baseKey}`;
+    }
+
+    return baseKey;
   }
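A standalone re-implementation of the key logic above, just to show the resulting key shapes (illustrative only; the real class uses `IUpstreamFetchContext` rather than loose parameters):

```typescript
// Worked example of the cache key format used by buildCacheKey() and escapeUrl().
function escapeUrl(url: string): string {
  return url.replace(/^https?:\/\//, '').replace(/[\/\\:*?"<>|]/g, '_').replace(/__+/g, '_');
}

function buildKey(
  protocol: string,
  method: string,
  path: string,
  query: Record<string, string>,
  upstreamUrl?: string,
): string {
  const queryString = Object.keys(query).sort().map(k => `${k}=${query[k]}`).join('&');
  const baseKey = `${protocol}:${method}:${path}${queryString ? '?' + queryString : ''}`;
  return upstreamUrl ? `${escapeUrl(upstreamUrl)}/${baseKey}` : baseKey;
}

// "registry.npmjs.org/npm:GET:/lodash"
console.log(buildKey('npm', 'GET', '/lodash', {}, 'https://registry.npmjs.org'));
// "pypi:GET:/simple/requests/?format=json" (no upstream prefix when upstreamUrl is omitted)
console.log(buildKey('pypi', 'GET', '/simple/requests/', { format: 'json' }));
```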
 
   /**
@@ -333,27 +534,27 @@ export class UpstreamCache {
    */
   private evictOldest(): void {
     // Evict 10% of max entries
-    const evictCount = Math.ceil(this.maxEntries * 0.1);
+    const evictCount = Math.ceil(this.maxMemoryEntries * 0.1);
     let evicted = 0;
 
     // First, try to evict stale entries
     const now = new Date();
-    for (const [key, entry] of this.cache.entries()) {
+    for (const [key, entry] of this.memoryCache.entries()) {
       if (evicted >= evictCount) break;
       if (entry.stale || (entry.expiresAt && entry.expiresAt < now)) {
-        this.cache.delete(key);
+        this.memoryCache.delete(key);
         evicted++;
       }
     }
 
     // If not enough evicted, evict oldest by cachedAt
     if (evicted < evictCount) {
-      const entries = Array.from(this.cache.entries())
+      const entries = Array.from(this.memoryCache.entries())
         .sort((a, b) => a[1].cachedAt.getTime() - b[1].cachedAt.getTime());
 
       for (const [key] of entries) {
         if (evicted >= evictCount) break;
-        this.cache.delete(key);
+        this.memoryCache.delete(key);
         evicted++;
       }
     }
@@ -375,17 +576,17 @@ export class UpstreamCache {
   }
 
   /**
-   * Remove all expired entries.
+   * Remove all expired entries from memory cache.
    */
   private cleanup(): void {
     const now = new Date();
     const staleDeadline = new Date(now.getTime() - this.config.staleMaxAgeSeconds * 1000);
 
-    for (const [key, entry] of this.cache.entries()) {
+    for (const [key, entry] of this.memoryCache.entries()) {
       if (entry.expiresAt) {
         // Remove if past stale deadline
         if (entry.expiresAt < staleDeadline) {
-          this.cache.delete(key);
+          this.memoryCache.delete(key);
         }
       }
     }
@@ -406,7 +607,7 @@ export interface ICacheSetOptions {
  * Cache statistics.
  */
 export interface ICacheStats {
-  /** Total number of cached entries */
+  /** Total number of cached entries in memory */
   totalEntries: number;
   /** Number of fresh (non-expired) entries */
   freshEntries: number;
@@ -414,10 +615,12 @@ export interface ICacheStats {
   staleEntries: number;
   /** Number of negative cache entries */
   negativeEntries: number;
-  /** Total size of cached data in bytes */
+  /** Total size of cached data in bytes (memory only) */
   totalSizeBytes: number;
-  /** Maximum allowed entries */
+  /** Maximum allowed memory entries */
   maxEntries: number;
   /** Whether caching is enabled */
   enabled: boolean;
+  /** Whether S3 storage is configured */
+  hasStorage: boolean;
 }