Compare commits

10 Commits

| SHA1 |
|---|
| ac51a94c8b |
| 9ca1e670ef |
| fb8d6897e3 |
| 81ae4f2d59 |
| 374469e37e |
| 9039613f7a |
| 4d13fac9f1 |
| 42209d235d |
| 80005af576 |
| 8d48627301 |
devcontainer.json

```diff
@@ -1,35 +1,5 @@
-// The Dev Container format allows you to configure your environment. At the heart of it
-// is a Docker image or Dockerfile which controls the tools available in your environment.
-//
 // See https://aka.ms/devcontainer.json for more information.
 {
-  "name": "Ona",
+  "name": "gitzone.universal",
-  // This universal image (~10GB) includes many development tools and languages,
-  // providing a convenient all-in-one development environment.
-  //
-  // This image is already available on remote runners for fast startup. On desktop
-  // and linux runners, it will need to be downloaded, which may take longer.
-  //
-  // For faster startup on desktop/linux, consider a smaller, language-specific image:
-  // • For Python: mcr.microsoft.com/devcontainers/python:3.13
-  // • For Node.js: mcr.microsoft.com/devcontainers/javascript-node:24
-  // • For Go: mcr.microsoft.com/devcontainers/go:1.24
-  // • For Java: mcr.microsoft.com/devcontainers/java:21
-  //
-  // Browse more options at: https://hub.docker.com/r/microsoft/devcontainers
-  // or build your own using the Dockerfile option below.
   "image": "mcr.microsoft.com/devcontainers/universal:4.0.1-noble"
-  // Use "build":
-  // instead of the image to use a Dockerfile to build an image.
-  // "build": {
-  //   "context": ".",
-  //   "dockerfile": "Dockerfile"
-  // }
-  // Features add additional features to your environment. See https://containers.dev/features
-  // Beware: features are not supported on all platforms and may have unintended side-effects.
-  // "features": {
-  //   "ghcr.io/devcontainers/features/docker-in-docker": {
-  //     "moby": false
-  //   }
-  // }
 }
```
changelog.md (40 lines changed)

```diff
@@ -1,5 +1,45 @@
 # Changelog
 
+## 2025-11-21 - 1.5.0 - feat(core)
+Add PyPI and RubyGems protocol support, Cargo token management, and storage helpers
+
+- Extend core protocol types to include 'pypi' and 'rubygems' and add protocol config entries for pypi and rubygems.
+- Add PyPI storage methods for metadata, Simple API HTML/JSON indexes, package files, version listing and deletion in RegistryStorage.
+- Add Cargo-specific storage helpers (index paths, crate storage) and ensure Cargo registry initialization and endpoints are wired into SmartRegistry.
+- Extend AuthManager with Cargo, PyPI and RubyGems token creation, validation and revocation methods; update unified validateToken to check these token types.
+- Update test helpers to create Cargo tokens and return cargoToken from registry setup.
+
+## 2025-11-21 - 1.4.1 - fix(devcontainer)
+Simplify devcontainer configuration and rename container image
+
+- Rename Dev Container name to 'gitzone.universal' and set image to mcr.microsoft.com/devcontainers/universal:4.0.1-noble
+- Remove large inline comments and example 'build'/'features' blocks to simplify the devcontainer.json
+
+## 2025-11-21 - 1.4.0 - feat(registrystorage)
+Add deleteMavenMetadata to RegistryStorage and update Maven DELETE test to expect 204 No Content
+
+- Add deleteMavenMetadata(groupId, artifactId) to RegistryStorage to remove maven-metadata.xml.
+- Update Maven test to assert 204 No Content for DELETE responses (previously expected 200).
+
+## 2025-11-21 - 1.3.1 - fix(maven)
+Pass request path to Maven checksum handler so checksum files are resolved correctly
+
+- Call handleChecksumRequest with the full request path from MavenRegistry.handleRequest
+- Allows getChecksum to extract the checksum filename from the URL and fetch the correct checksum file from storage
+- Fixes 404s when requesting artifact checksum files (md5, sha1, sha256, sha512)
+
+## 2025-11-21 - 1.3.0 - feat(core)
+Add Cargo and Composer registries with storage, auth and helpers
+
+- Add Cargo registry implementation (ts/cargo) including index, publish, download, yank/unyank and search handlers
+- Add Composer registry implementation (ts/composer) including package upload/download, metadata, packages.json and helpers
+- Extend RegistryStorage with Cargo and Composer-specific storage helpers and path conventions
+- Extend AuthManager with Composer token creation/validation and unified token validation support
+- Wire SmartRegistry to initialize and route requests to cargo and composer handlers
+- Add adm-zip dependency and Composer ZIP parsing helpers (extractComposerJsonFromZip, sha1 calculation, version sorting)
+- Add tests for Cargo index path calculation and config handling
+- Export new modules from ts/index.ts and add module entry files for composer and cargo
+
 ## 2025-11-21 - 1.2.0 - feat(maven)
 Add Maven registry protocol support (storage, auth, routing, interfaces, and exports)
```
package.json

```diff
@@ -1,6 +1,6 @@
 {
   "name": "@push.rocks/smartregistry",
-  "version": "1.2.0",
+  "version": "1.5.0",
   "private": false,
   "description": "a registry for npm modules and oci images",
   "main": "dist_ts/index.js",
@@ -47,7 +47,8 @@
     "@push.rocks/qenv": "^6.1.3",
     "@push.rocks/smartbucket": "^4.3.0",
     "@push.rocks/smartlog": "^3.1.10",
-    "@push.rocks/smartpath": "^6.0.0"
+    "@push.rocks/smartpath": "^6.0.0",
+    "adm-zip": "^0.5.10"
   },
   "packageManager": "pnpm@10.18.1+sha512.77a884a165cbba2d8d1c19e3b4880eee6d2fcabd0d879121e282196b80042351d5eb3ca0935fa599da1dc51265cc68816ad2bddd2a2de5ea9fdf92adbec7cd34"
 }
```
pnpm-lock.yaml (9 lines changed, generated)

```diff
@@ -20,6 +20,9 @@ importers:
       '@push.rocks/smartpath':
         specifier: ^6.0.0
         version: 6.0.0
+      adm-zip:
+        specifier: ^0.5.10
+        version: 0.5.16
     devDependencies:
       '@git.zone/tsbuild':
         specifier: ^3.1.0
@@ -1507,6 +1510,10 @@ packages:
     resolution: {integrity: sha512-mORqg60S8iML6XSmVjqjGHJkINrCGLMj2QvDmFzI9vIlv1RGlyjmw3nrzaINJjkNsYXC41XhhD5pfy7CtuGcbA==}
     engines: {node: '>= 16'}
 
+  adm-zip@0.5.16:
+    resolution: {integrity: sha512-TGw5yVi4saajsSEgz25grObGHEUaDrniwvA2qwSC060KfqGPdglhvPMA2lPIoxs3PQIItj2iag35fONcQqgUaQ==}
+    engines: {node: '>=12.0'}
+
   agent-base@7.1.4:
     resolution: {integrity: sha512-MnA+YT8fwfJPgBx3m60MNqakm30XOkyIoH1y6huTQvC0PwZG7ki8NacLBcrPbNoo8vEZy7Jpuk7+jMO+CUovTQ==}
     engines: {node: '>= 14'}
@@ -6557,6 +6564,8 @@ snapshots:
     transitivePeerDependencies:
       - supports-color
 
+  adm-zip@0.5.16: {}
+
   agent-base@7.1.4: {}
 
   agentkeepalive@4.6.0:
```
readme.hints.md (334 lines changed)

# Project Readme Hints

## Python (PyPI) Protocol Implementation Notes

### PEP 503: Simple Repository API (HTML-based)

**URL Structure:**
- Root: `/<base>/` - Lists all projects
- Project: `/<base>/<project>/` - Lists all files for a project
- All URLs MUST end with `/` (redirect if missing)

**Package Name Normalization:**
- Lowercase all characters
- Replace runs of `.`, `-`, `_` with a single `-`
- Implementation: `re.sub(r"[-_.]+", "-", name).lower()`
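Since this project is TypeScript, the same normalization can be sketched there (hypothetical helper name, mirroring the Python one-liner above):

```typescript
// PEP 503 normalization: lowercase and collapse runs of '.', '-', '_'
// into a single '-'. Equivalent to re.sub(r"[-_.]+", "-", name).lower().
export function normalizePypiName(name: string): string {
  return name.replace(/[-_.]+/g, '-').toLowerCase();
}
```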
**HTML Format:**
- Root: One anchor per project
- Project: One anchor per file
- Anchor text must match final filename
- Anchor href links to download URL

**Hash Fragments:**
- Format: `#<hashname>=<hashvalue>`
- hashname: lowercase hash function name (recommend `sha256`)
- hashvalue: hex-encoded digest

**Data Attributes:**
- `data-gpg-sig`: `true`/`false` for GPG signature presence
- `data-requires-python`: PEP 345 requirement string (HTML-encode `<` as `&lt;`, `>` as `&gt;`)

### PEP 691: JSON-based Simple API

**Content Types:**
- `application/vnd.pypi.simple.v1+json` - JSON format
- `application/vnd.pypi.simple.v1+html` - HTML format
- `text/html` - Alias for HTML (backwards compat)

**Root Endpoint JSON:**

```json
{
  "meta": {"api-version": "1.0"},
  "projects": [{"name": "ProjectName"}]
}
```

**Project Endpoint JSON:**

```json
{
  "name": "normalized-name",
  "meta": {"api-version": "1.0"},
  "files": [
    {
      "filename": "package-1.0-py3-none-any.whl",
      "url": "https://example.com/path/to/file",
      "hashes": {"sha256": "..."},
      "requires-python": ">=3.7",
      "dist-info-metadata": true | {"sha256": "..."},
      "gpg-sig": true,
      "yanked": false | "reason string"
    }
  ]
}
```

**Content Negotiation:**
- Use `Accept` header for format selection
- Server responds with `Content-Type` header
- Support both JSON and HTML formats
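A minimal sketch of that negotiation in TypeScript (simplified: a real implementation would honor q-values; the helper name is illustrative):

```typescript
type SimpleFormat = 'json' | 'html';

// Choose a PEP 691 response format from the Accept header; HTML is the
// backwards-compatible default (text/html alias).
export function negotiateSimpleFormat(accept: string | undefined): SimpleFormat {
  const value = (accept ?? '').toLowerCase();
  return value.includes('application/vnd.pypi.simple.v1+json') ? 'json' : 'html';
}
```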
### PyPI Upload API (Legacy /legacy/)

**Endpoint:**
- URL: `https://upload.pypi.org/legacy/`
- Method: `POST`
- Content-Type: `multipart/form-data`

**Required Form Fields:**
- `:action` = `file_upload`
- `protocol_version` = `1`
- `content` = Binary file data with filename
- `filetype` = `bdist_wheel` | `sdist`
- `pyversion` = Python tag (e.g., `py3`, `py2.py3`) or `source` for sdist
- `metadata_version` = Metadata standard version
- `name` = Package name
- `version` = Version string

**Hash Digest (one required):**
- `md5_digest`: urlsafe base64 without padding
- `sha256_digest`: hexadecimal
- `blake2_256_digest`: hexadecimal

**Optional Fields:**
- `attestations`: JSON array of attestation objects
- Any Core Metadata fields (lowercase, hyphens → underscores)
- Example: `Description-Content-Type` → `description_content_type`

**Authentication:**
- Username/password or API token in HTTP Basic Auth
- API tokens: username = `__token__`, password = token value

**Behavior:**
- First file uploaded creates the release
- Multiple files uploaded sequentially for same version
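The digest fields can be computed with Node's crypto (sketch; `blake2_256_digest` is omitted because Node only exposes `blake2b512` natively):

```typescript
import { createHash } from 'node:crypto';

// md5_digest must be urlsafe base64 without padding ('base64url' in Node);
// sha256_digest is plain hex.
export function uploadDigests(content: Buffer): {
  md5_digest: string;
  sha256_digest: string;
} {
  return {
    md5_digest: createHash('md5').update(content).digest('base64url'),
    sha256_digest: createHash('sha256').update(content).digest('hex'),
  };
}
```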
### PEP 694: Upload 2.0 API

**Status:** Draft (not yet required, legacy API still supported)
- Multi-step workflow with sessions
- Async upload support with resumption
- JSON-based API
- Standard HTTP auth (RFC 7235)
- Not implementing initially (legacy API sufficient)

---

## Ruby (RubyGems) Protocol Implementation Notes

### Compact Index Format

**Endpoints:**
- `/versions` - Master list of all gems and versions
- `/info/<RUBYGEM>` - Detailed info for specific gem
- `/names` - Simple list of gem names

**Authentication:**
- UUID tokens similar to NPM pattern
- API key in `Authorization` header
- Scope format: `rubygems:gem:{name}:{read|write|yank}`

### `/versions` File Format

**Structure:**

```
created_at: 2024-04-01T00:00:05Z
---
RUBYGEM [-]VERSION_PLATFORM[,VERSION_PLATFORM,...] MD5
```

**Details:**
- Metadata lines before `---` delimiter
- One line per gem with comma-separated versions
- `[-]` prefix indicates a yanked version
- `MD5`: Checksum of the corresponding `/info/<RUBYGEM>` file
- Append-only during the month, recalculated monthly
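Rendering one `/versions` data line can be sketched as follows (hypothetical helper; since the real file is append-only, an update appends a fresh line for the gem rather than rewriting earlier ones):

```typescript
interface IGemVersion {
  version: string; // e.g. "3.2.1" or "3.2.1-arm64-darwin"
  yanked?: boolean;
}

// One data line: "NAME v1,v2,... MD5", where MD5 is the checksum of the
// gem's current /info file and yanked versions carry a '-' prefix.
export function renderVersionsLine(
  gem: string,
  versions: IGemVersion[],
  infoMd5: string,
): string {
  const list = versions
    .map((v) => (v.yanked ? `-${v.version}` : v.version))
    .join(',');
  return `${gem} ${list} ${infoMd5}`;
}
```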
### `/info/<RUBYGEM>` File Format

**Structure:**

```
---
VERSION[-PLATFORM] [DEPENDENCY[,DEPENDENCY,...]]|REQUIREMENT[,REQUIREMENT,...]
```

**Dependency Format:**

```
GEM:CONSTRAINT[&CONSTRAINT]
```

- Examples: `actionmailer:= 2.2.2`, `parser:>= 3.2.2.3`
- Operators: `=`, `>`, `<`, `>=`, `<=`, `~>`, `!=`
- Multiple constraints: `unicode-display_width:< 3.0&>= 2.4.0`

**Requirement Format:**

```
checksum:SHA256_HEX
ruby:CONSTRAINT
rubygems:CONSTRAINT
```

**Platform:**
- Default platform is `ruby`
- Non-default platforms: `VERSION-PLATFORM` (e.g., `3.2.1-arm64-darwin`)

**Yanked Gems:**
- Listed with `-` prefix in `/versions`
- Excluded entirely from the `/info/<RUBYGEM>` file
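A parser for one data line under the grammar above (sketch only; it assumes a well-formed line and does not validate constraint operators):

```typescript
export interface IInfoLine {
  version: string;
  dependencies: { gem: string; constraints: string[] }[];
  requirements: Record<string, string>;
}

// Parse "VERSION[-PLATFORM] [DEP[,DEP...]]|REQ[,REQ...]". Only the first
// space separates version from dependencies; constraints themselves may
// contain spaces (e.g. "parser:>= 3.2.2.3").
export function parseInfoLine(line: string): IInfoLine {
  const [head, reqPart = ''] = line.split('|');
  const spaceIdx = head.indexOf(' ');
  const version = spaceIdx === -1 ? head.trim() : head.slice(0, spaceIdx);
  const depPart = spaceIdx === -1 ? '' : head.slice(spaceIdx + 1).trim();

  const dependencies = depPart
    ? depPart.split(',').map((d) => {
        const i = d.indexOf(':');
        return { gem: d.slice(0, i), constraints: d.slice(i + 1).split('&') };
      })
    : [];

  const requirements: Record<string, string> = {};
  for (const r of reqPart.split(',').filter(Boolean)) {
    const i = r.indexOf(':');
    requirements[r.slice(0, i)] = r.slice(i + 1);
  }
  return { version, dependencies, requirements };
}
```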
### `/names` File Format

```
---
gemname1
gemname2
gemname3
```

### HTTP Range Support

**Headers:**
- `Range: bytes=#{start}-`: Request from byte position
- `If-None-Match`: ETag conditional request
- `Repr-Digest`: SHA256 checksum in response

**Caching Strategy:**
1. Store file with last byte position
2. Request range from last position
3. Append response to existing file
4. Verify SHA256 against `Repr-Digest`
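The pure core of that strategy can be sketched as (assumed helper; fetching, ETag handling, and decoding the structured `Repr-Digest` field are left to the caller):

```typescript
// Combine cached bytes with a Range response: 206 appends the new tail,
// 200 means the server ignored the Range header and sent the whole file.
export function applyRangeResponse(
  cached: Buffer,
  status: number,
  body: Buffer,
): Buffer {
  if (status === 206) return Buffer.concat([cached, body]);
  if (status === 200) return body;
  throw new Error(`unexpected status ${status}`);
}
```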
### RubyGems Upload/Management API

**Upload Gem:**
- `POST /api/v1/gems`
- Binary `.gem` file in request body
- `Authorization` header with API key

**Yank Version:**
- `DELETE /api/v1/gems/yank`
- Parameters: `gem_name`, `version`

**Unyank Version:**
- `PUT /api/v1/gems/unyank`
- Parameters: `gem_name`, `version`

**Version Metadata:**
- `GET /api/v1/versions/<gem>.json`
- Returns JSON array of versions

**Dependencies:**
- `GET /api/v1/dependencies?gems=<comma-list>`
- Returns dependency information for resolution

---

## Implementation Strategy

### Storage Paths

**PyPI:**

```
pypi/
├── simple/                      # PEP 503 HTML files
│   ├── index.html               # All packages list
│   └── {package}/index.html     # Package versions list
├── packages/
│   └── {package}/               # .whl and .tar.gz files
└── metadata/
    └── {package}/metadata.json  # Package metadata
```

**RubyGems:**

```
rubygems/
├── versions                     # Master versions file
├── info/{gemname}               # Per-gem info files
├── names                        # All gem names
└── gems/{gemname}-{version}.gem # .gem files
```
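Path builders matching these layouts might look like the following (illustrative names, not the actual RegistryStorage API):

```typescript
// Object-storage key builders for the PyPI and RubyGems layouts above.
export const pypiPaths = {
  simpleRoot: () => 'pypi/simple/index.html',
  simpleProject: (pkg: string) => `pypi/simple/${pkg}/index.html`,
  packageFile: (pkg: string, filename: string) => `pypi/packages/${pkg}/${filename}`,
  metadata: (pkg: string) => `pypi/metadata/${pkg}/metadata.json`,
};

export const rubygemsPaths = {
  versions: () => 'rubygems/versions',
  info: (gem: string) => `rubygems/info/${gem}`,
  names: () => 'rubygems/names',
  gemFile: (gem: string, version: string) => `rubygems/gems/${gem}-${version}.gem`,
};
```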
### Authentication Pattern

Both protocols should follow the existing UUID token pattern used by NPM, Maven, Cargo, Composer:

```typescript
// AuthManager additions
createPypiToken(userId: string, readonly: boolean): string
validatePypiToken(token: string): ITokenInfo | null
revokePypiToken(token: string): boolean

createRubyGemsToken(userId: string, readonly: boolean): string
validateRubyGemsToken(token: string): ITokenInfo | null
revokeRubyGemsToken(token: string): boolean
```

### Scope Format

```
pypi:package:{name}:{read|write}
rubygems:gem:{name}:{read|write|yank}
```
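A scope check over this format could be sketched as follows (assumption: exact-match semantics; whether `write` implies `read` is up to AuthManager):

```typescript
// Parse "protocol:kind:name:action" (e.g. "pypi:package:requests:write",
// "rubygems:gem:rails:yank") and test it against a required permission.
export function scopeAllows(
  scope: string,
  protocol: string,
  name: string,
  action: string,
): boolean {
  const parts = scope.split(':');
  if (parts.length !== 4) return false;
  return parts[0] === protocol && parts[2] === name && parts[3] === action;
}
```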
### Common Patterns

1. **Package name normalization** - Critical for PyPI
2. **Checksum calculation** - SHA256 for both protocols
3. **Append-only files** - RubyGems compact index
4. **Content negotiation** - PyPI JSON vs HTML
5. **Multipart upload parsing** - PyPI file uploads
6. **Binary file handling** - Both protocols (.whl, .tar.gz, .gem)

---

## Key Differences from Existing Protocols

**PyPI vs NPM:**
- PyPI uses Simple API (HTML) + JSON API
- PyPI requires package name normalization
- PyPI uses multipart form data for uploads (not JSON)
- PyPI supports multiple file types per release (wheel + sdist)

**RubyGems vs Cargo:**
- RubyGems uses compact index (append-only text files)
- RubyGems uses checksums in index files (not just filenames)
- RubyGems has HTTP Range support for incremental updates
- RubyGems uses MD5 for index checksums, SHA256 for .gem files

---

## Testing Requirements

### PyPI Tests Must Cover:
- Package upload (wheel and sdist)
- Package name normalization
- Simple API HTML generation (PEP 503)
- JSON API responses (PEP 691)
- Content negotiation
- Hash calculation and verification
- Authentication (tokens)
- Multi-file releases
- Yanked packages

### RubyGems Tests Must Cover:
- Gem upload
- Compact index generation
- `/versions` file updates (append-only)
- `/info/<gem>` file generation
- `/names` file generation
- Checksum calculations (MD5 and SHA256)
- Platform-specific gems
- Yanking/unyanking
- HTTP Range requests
- Authentication (API keys)

---

## Security Considerations

1. **Package name validation** - Prevent path traversal
2. **File size limits** - Prevent DoS via large uploads
3. **Content-Type validation** - Verify file types
4. **Checksum verification** - Ensure file integrity
5. **Token scope enforcement** - Read vs write permissions
6. **HTML escaping** - Prevent XSS in generated HTML
7. **Metadata sanitization** - Clean user-provided strings
8. **Rate limiting** - Consider upload frequency limits
readme.md (302 lines changed)

````diff
@@ -1,18 +1,21 @@
 # @push.rocks/smartregistry
 
-> 🚀 A composable TypeScript library implementing both **OCI Distribution Specification v1.1** and **NPM Registry API** for building unified container and package registries.
+> 🚀 A composable TypeScript library implementing **OCI Distribution Specification v1.1**, **NPM Registry API**, **Maven Repository**, **Cargo/crates.io Registry**, and **Composer/Packagist** for building unified container and package registries.
 
 ## ✨ Features
 
-### 🔄 Dual Protocol Support
+### 🔄 Multi-Protocol Support
 - **OCI Distribution Spec v1.1**: Full container registry with manifest/blob operations
 - **NPM Registry API**: Complete package registry with publish/install/search
+- **Maven Repository**: Java/JVM artifact management with POM support
+- **Cargo/crates.io Registry**: Rust crate registry with sparse HTTP protocol
+- **Composer/Packagist**: PHP package registry with Composer v2 protocol
 
 ### 🏗️ Unified Architecture
 - **Composable Design**: Core infrastructure with protocol plugins
 - **Shared Storage**: Cloud-agnostic S3-compatible backend ([@push.rocks/smartbucket](https://www.npmjs.com/package/@push.rocks/smartbucket))
-- **Unified Authentication**: Scope-based permissions across both protocols
+- **Unified Authentication**: Scope-based permissions across all protocols
-- **Path-based Routing**: `/oci/*` for containers, `/npm/*` for packages
+- **Path-based Routing**: `/oci/*` for containers, `/npm/*` for packages, `/maven/*` for Java artifacts, `/cargo/*` for Rust crates, `/composer/*` for PHP packages
 
 ### 🔐 Authentication & Authorization
 - NPM UUID tokens for package operations
@@ -35,6 +38,27 @@
 - ✅ Dist-tag management
 - ✅ Token management
 
+**Maven Features:**
+- ✅ Artifact upload/download
+- ✅ POM and metadata management
+- ✅ Snapshot and release versions
+- ✅ Checksum verification (MD5, SHA1)
+
+**Cargo Features:**
+- ✅ Crate publish (.crate files)
+- ✅ Sparse HTTP protocol (modern index)
+- ✅ Version yank/unyank
+- ✅ Dependency resolution
+- ✅ Search functionality
+
+**Composer Features:**
+- ✅ Package publish/download (ZIP format)
+- ✅ Composer v2 repository API
+- ✅ Package metadata (packages.json)
+- ✅ Version management
+- ✅ Dependency resolution
+- ✅ PSR-4/PSR-0 autoloading support
+
 ## 📥 Installation
 
 ```bash
@@ -78,6 +102,18 @@ const config: IRegistryConfig = {
     enabled: true,
     basePath: '/npm',
   },
+  maven: {
+    enabled: true,
+    basePath: '/maven',
+  },
+  cargo: {
+    enabled: true,
+    basePath: '/cargo',
+  },
+  composer: {
+    enabled: true,
+    basePath: '/composer',
+  },
 };
 
 const registry = new SmartRegistry(config);
````
````diff
@@ -212,6 +248,167 @@ const searchResults = await registry.handleRequest({
 });
 ```
 
+### 🦀 Cargo Registry (Rust Crates)
+
+```typescript
+// Get config.json (required for Cargo)
+const config = await registry.handleRequest({
+  method: 'GET',
+  path: '/cargo/config.json',
+  headers: {},
+  query: {},
+});
+
+// Get index file for a crate
+const index = await registry.handleRequest({
+  method: 'GET',
+  path: '/cargo/se/rd/serde', // Path based on crate name length
+  headers: {},
+  query: {},
+});
+
+// Download a crate file
+const crateFile = await registry.handleRequest({
+  method: 'GET',
+  path: '/cargo/api/v1/crates/serde/1.0.0/download',
+  headers: {},
+  query: {},
+});
+
+// Publish a crate (binary format: [4 bytes JSON len][JSON][4 bytes crate len][.crate])
+const publishResponse = await registry.handleRequest({
+  method: 'PUT',
+  path: '/cargo/api/v1/crates/new',
+  headers: { 'Authorization': '<cargo-token>' }, // No "Bearer" prefix
+  query: {},
+  body: binaryPublishData, // Length-prefixed binary format
+});
+
+// Yank a version (deprecate without deleting)
+const yankResponse = await registry.handleRequest({
+  method: 'DELETE',
+  path: '/cargo/api/v1/crates/my-crate/0.1.0/yank',
+  headers: { 'Authorization': '<cargo-token>' },
+  query: {},
+});
+
+// Unyank a version
+const unyankResponse = await registry.handleRequest({
+  method: 'PUT',
+  path: '/cargo/api/v1/crates/my-crate/0.1.0/unyank',
+  headers: { 'Authorization': '<cargo-token>' },
+  query: {},
+});
+
+// Search crates
+const search = await registry.handleRequest({
+  method: 'GET',
+  path: '/cargo/api/v1/crates',
+  headers: {},
+  query: { q: 'serde', per_page: '10' },
+});
+```
````
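The length-prefixed publish body mentioned above can be assembled like this (a sketch; `buildCargoPublishBody` is an illustrative helper, with little-endian u32 lengths as in the crates.io publish API):

```typescript
// Assemble [4-byte LE JSON length][metadata JSON][4-byte LE crate length][.crate bytes].
export function buildCargoPublishBody(
  metadata: object,
  crateFile: Buffer,
): Buffer {
  const json = Buffer.from(JSON.stringify(metadata), 'utf8');
  const out = Buffer.alloc(4 + json.length + 4 + crateFile.length);
  let offset = 0;
  offset = out.writeUInt32LE(json.length, offset); // returns the next offset
  offset += json.copy(out, offset);
  offset = out.writeUInt32LE(crateFile.length, offset);
  crateFile.copy(out, offset);
  return out;
}
```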
````diff
+**Using with Cargo CLI:**
+
+```toml
+# .cargo/config.toml
+[registries.myregistry]
+index = "sparse+https://registry.example.com/cargo/"
+
+[registries.myregistry.credential-provider]
+# Or use credentials directly:
+# [registries.myregistry]
+# token = "your-api-token"
+```
+
+```bash
+# Publish to custom registry
+cargo publish --registry=myregistry
+
+# Install from custom registry
+cargo install --registry=myregistry my-crate
+
+# Search custom registry
+cargo search --registry=myregistry tokio
+```
+
+### 🎼 Composer Registry (PHP Packages)
+
+```typescript
+// Get repository root (packages.json)
+const packagesJson = await registry.handleRequest({
+  method: 'GET',
+  path: '/composer/packages.json',
+  headers: {},
+  query: {},
+});
+
+// Get package metadata
+const metadata = await registry.handleRequest({
+  method: 'GET',
+  path: '/composer/p2/vendor/package.json',
+  headers: {},
+  query: {},
+});
+
+// Upload a package (ZIP with composer.json)
+const zipBuffer = await readFile('package.zip');
+const uploadResponse = await registry.handleRequest({
+  method: 'PUT',
+  path: '/composer/packages/vendor/package',
+  headers: { 'Authorization': `Bearer <composer-token>` },
+  query: {},
+  body: zipBuffer,
+});
+
+// Download package ZIP
+const download = await registry.handleRequest({
+  method: 'GET',
+  path: '/composer/dists/vendor/package/ref123.zip',
+  headers: {},
+  query: {},
+});
+
+// List all packages
+const list = await registry.handleRequest({
+  method: 'GET',
+  path: '/composer/packages/list.json',
+  headers: {},
+  query: {},
+});
+
+// Delete a specific version
+const deleteVersion = await registry.handleRequest({
+  method: 'DELETE',
+  path: '/composer/packages/vendor/package/1.0.0',
+  headers: { 'Authorization': `Bearer <composer-token>` },
+  query: {},
+});
+```
+
+**Using with Composer CLI:**
+
+```json
+// composer.json
+{
+  "repositories": [
+    {
+      "type": "composer",
+      "url": "https://registry.example.com/composer"
+    }
+  ]
+}
+```
+
+```bash
+# Install from custom registry
+composer require vendor/package
+
+# Update packages
+composer update
+```
+
 ### 🔐 Authentication
 
 ```typescript
````
@@ -374,6 +571,48 @@ NPM registry API compliant implementation.
|
|||||||
- `POST /-/npm/v1/tokens` - Create token
|
- `POST /-/npm/v1/tokens` - Create token
|
||||||
- `PUT /-/package/{pkg}/dist-tags/{tag}` - Update tag
|
- `PUT /-/package/{pkg}/dist-tags/{tag}` - Update tag
|
||||||

#### CargoRegistry

Cargo/crates.io registry with sparse HTTP protocol support.

**Endpoints:**

- `GET /config.json` - Registry configuration (sparse protocol)
- `GET /index/{path}` - Index files (hierarchical structure)
  - `/1/{name}` - 1-character crate names
  - `/2/{name}` - 2-character crate names
  - `/3/{c}/{name}` - 3-character crate names
  - `/{p1}/{p2}/{name}` - 4+ character crate names
- `PUT /api/v1/crates/new` - Publish crate (binary format)
- `GET /api/v1/crates/{crate}/{version}/download` - Download .crate file
- `DELETE /api/v1/crates/{crate}/{version}/yank` - Yank (deprecate) version
- `PUT /api/v1/crates/{crate}/{version}/unyank` - Unyank version
- `GET /api/v1/crates?q={query}` - Search crates

**Index Format:**

- Newline-delimited JSON (one line per version)
- SHA256 checksums for .crate files
- Yanked flag (keep files, mark unavailable)
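The hierarchical index layout above can be sketched as a small path helper. This is a hypothetical illustration of the 1/2/3/4+ character rules, not the registry's actual implementation:

```typescript
// Sketch of the sparse-index path rules listed above (hypothetical helper).
function cargoIndexPath(name: string): string {
  if (name.length === 1) return `1/${name}`;
  if (name.length === 2) return `2/${name}`;
  if (name.length === 3) return `3/${name[0]}/${name}`;
  // 4+ characters: first two chars, then next two chars, then the full name
  return `${name.slice(0, 2)}/${name.slice(2, 4)}/${name}`;
}

console.log(cargoIndexPath('a'));     // 1/a
console.log(cargoIndexPath('io'));    // 2/io
console.log(cargoIndexPath('axo'));   // 3/a/axo
console.log(cargoIndexPath('serde')); // se/rd/serde
```

Each file under such a path then holds one JSON object per version, appended line by line on publish, as described under Index Format above.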

#### ComposerRegistry

Composer v2 repository API compliant implementation.

**Endpoints:**

- `GET /packages.json` - Repository metadata and configuration
- `GET /p2/{vendor}/{package}.json` - Package version metadata
- `GET /p2/{vendor}/{package}~dev.json` - Dev versions metadata
- `GET /packages/list.json` - List all packages
- `GET /dists/{vendor}/{package}/{ref}.zip` - Download package ZIP
- `PUT /packages/{vendor}/{package}` - Upload package (requires auth)
- `DELETE /packages/{vendor}/{package}` - Delete entire package
- `DELETE /packages/{vendor}/{package}/{version}` - Delete specific version

**Package Format:**

- ZIP archives with composer.json in root
- SHA-1 checksums for verification
- Version normalization (1.0.0 → 1.0.0.0)
- PSR-4/PSR-0 autoloading configuration
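The version normalization rule above (1.0.0 → 1.0.0.0) can be sketched as follows. This is a simplified, hypothetical helper covering only plain numeric versions; it is not the full normalization Composer itself performs:

```typescript
// Simplified sketch of Composer-style version normalization (hypothetical).
// Pads a plain numeric version to four components; real Composer also
// handles stability suffixes such as -dev, -alpha, -beta, and -RC.
function normalizeComposerVersion(version: string): string {
  const parts = version.replace(/^v/i, '').split('.');
  while (parts.length < 4) {
    parts.push('0');
  }
  return parts.join('.');
}

console.log(normalizeComposerVersion('1.0.0')); // 1.0.0.0
console.log(normalizeComposerVersion('2.1'));   // 2.1.0.0
```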

## 🗄️ Storage Structure

@@ -385,16 +624,38 @@ bucket/
```
│   │   └── {repository}/{digest}
│   └── tags/
│       └── {repository}/tags.json
├── npm/
│   ├── packages/
│   │   ├── {name}/
│   │   │   ├── index.json              # Packument
│   │   │   └── {name}-{ver}.tgz        # Tarball
│   │   └── @{scope}/{name}/
│   │       ├── index.json
│   │       └── {name}-{ver}.tgz
│   └── users/
│       └── {username}.json
├── maven/
│   ├── artifacts/
│   │   └── {group-path}/{artifact}/{version}/
│   │       ├── {artifact}-{version}.jar
│   │       ├── {artifact}-{version}.pom
│   │       └── {artifact}-{version}.{ext}
│   └── metadata/
│       └── {group-path}/{artifact}/maven-metadata.xml
├── cargo/
│   ├── config.json                     # Registry configuration (sparse protocol)
│   ├── index/                          # Hierarchical index structure
│   │   ├── 1/{name}                    # 1-char crate names (e.g., "a")
│   │   ├── 2/{name}                    # 2-char crate names (e.g., "io")
│   │   ├── 3/{c}/{name}                # 3-char crate names (e.g., "3/a/axo")
│   │   └── {p1}/{p2}/{name}            # 4+ char (e.g., "se/rd/serde")
│   └── crates/
│       └── {name}/{name}-{version}.crate   # Gzipped tar archives
└── composer/
    └── packages/
        └── {vendor}/{package}/
            ├── metadata.json           # All versions metadata
            └── {reference}.zip         # Package ZIP files
```
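The npm branch of the tree above can be expressed as key-building helpers. These are hypothetical functions that follow the diagram, not necessarily the registry's exact code:

```typescript
// Hypothetical storage-key builders following the diagram above.
function npmPackumentKey(name: string): string {
  return `npm/packages/${name}/index.json`;
}

function npmTarballKey(name: string, version: string): string {
  // Scoped packages keep the @scope/ prefix in the directory,
  // while the tarball filename uses the bare package name.
  const bareName = name.includes('/') ? name.split('/')[1] : name;
  return `npm/packages/${name}/${bareName}-${version}.tgz`;
}

console.log(npmPackumentKey('express'));           // npm/packages/express/index.json
console.log(npmTarballKey('@scope/pkg', '1.0.0')); // npm/packages/@scope/pkg/pkg-1.0.0.tgz
```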

## 🎯 Scope Format

@@ -408,9 +669,22 @@ Examples:
```
npm:package:express:read              # Read express package
npm:package:*:write                   # Write any package
npm:*:*:*                             # Full NPM access

oci:repository:nginx:pull             # Pull nginx image
oci:repository:*:push                 # Push any image
oci:*:*:*                             # Full OCI access

maven:artifact:com.example:read       # Read Maven artifact
maven:artifact:*:write                # Write any artifact
maven:*:*:*                           # Full Maven access

cargo:crate:serde:write               # Write serde crate
cargo:crate:*:read                    # Read any crate
cargo:*:*:*                           # Full Cargo access

composer:package:vendor/package:read  # Read Composer package
composer:package:*:write              # Write any package
composer:*:*:*                        # Full Composer access
```
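A scope grant of this shape can be checked against a required permission segment by segment, with `*` matching any value. A minimal sketch, assuming exact four-segment scopes (hypothetical helper, not the registry's actual matcher):

```typescript
// Hypothetical segment-wise matcher for the scope format shown above.
function scopeMatches(granted: string, required: string): boolean {
  const grantedParts = granted.split(':');
  const requiredParts = required.split(':');
  if (grantedParts.length !== requiredParts.length) return false;
  return grantedParts.every((part, i) => part === '*' || part === requiredParts[i]);
}

console.log(scopeMatches('npm:package:*:write', 'npm:package:express:write')); // true
console.log(scopeMatches('oci:*:*:*', 'oci:repository:nginx:pull'));           // true
console.log(scopeMatches('cargo:crate:*:read', 'cargo:crate:serde:write'));    // false
```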

## 🔌 Integration Examples

test/cargo.test.node.ts (Normal file, 131 lines)
@@ -0,0 +1,131 @@
import { tap, expect } from '@git.zone/tstest';
import { RegistryStorage } from '../ts/core/classes.registrystorage.js';
import { CargoRegistry } from '../ts/cargo/classes.cargoregistry.js';
import { AuthManager } from '../ts/core/classes.authmanager.js';

// Test index path calculation
tap.test('should calculate correct index paths for different crate names', async () => {
  const storage = new RegistryStorage({
    accessKey: 'test',
    accessSecret: 'test',
    endpoint: 's3.test.com',
    bucketName: 'test-bucket',
  });

  // Access private method for testing
  const getPath = (storage as any).getCargoIndexPath.bind(storage);

  // 1-character names
  expect(getPath('a')).to.equal('cargo/index/1/a');
  expect(getPath('z')).to.equal('cargo/index/1/z');

  // 2-character names
  expect(getPath('io')).to.equal('cargo/index/2/io');
  expect(getPath('ab')).to.equal('cargo/index/2/ab');

  // 3-character names
  expect(getPath('axo')).to.equal('cargo/index/3/a/axo');
  expect(getPath('foo')).to.equal('cargo/index/3/f/foo');

  // 4+ character names
  expect(getPath('serde')).to.equal('cargo/index/se/rd/serde');
  expect(getPath('tokio')).to.equal('cargo/index/to/ki/tokio');
  expect(getPath('my-crate')).to.equal('cargo/index/my/--/my-crate');
});

// Test crate file path calculation
tap.test('should calculate correct crate file paths', async () => {
  const storage = new RegistryStorage({
    accessKey: 'test',
    accessSecret: 'test',
    endpoint: 's3.test.com',
    bucketName: 'test-bucket',
  });

  // Access private method for testing
  const getPath = (storage as any).getCargoCratePath.bind(storage);

  expect(getPath('serde', '1.0.0')).to.equal('cargo/crates/serde/serde-1.0.0.crate');
  expect(getPath('tokio', '1.28.0')).to.equal('cargo/crates/tokio/tokio-1.28.0.crate');
  expect(getPath('my-crate', '0.1.0')).to.equal('cargo/crates/my-crate/my-crate-0.1.0.crate');
});

// Test crate name validation
tap.test('should validate crate names correctly', async () => {
  const storage = new RegistryStorage({
    accessKey: 'test',
    accessSecret: 'test',
    endpoint: 's3.test.com',
    bucketName: 'test-bucket',
  });

  const authManager = new AuthManager({
    jwtSecret: 'test-secret',
    tokenStore: 'memory',
    npmTokens: { enabled: true },
    ociTokens: { enabled: false, realm: '', service: '' },
  });

  const registry = new CargoRegistry(storage, authManager, '/cargo', 'http://localhost:5000/cargo');

  // Access private method for testing
  const validate = (registry as any).validateCrateName.bind(registry);

  // Valid names
  expect(validate('serde')).to.be.true;
  expect(validate('tokio')).to.be.true;
  expect(validate('my-crate')).to.be.true;
  expect(validate('my_crate')).to.be.true;
  expect(validate('crate123')).to.be.true;
  expect(validate('a')).to.be.true;

  // Invalid names (uppercase not allowed)
  expect(validate('Serde')).to.be.false;
  expect(validate('MyCreate')).to.be.false;

  // Invalid names (special characters)
  expect(validate('my.crate')).to.be.false;
  expect(validate('my@crate')).to.be.false;
  expect(validate('my crate')).to.be.false;

  // Invalid names (too long)
  const longName = 'a'.repeat(65);
  expect(validate(longName)).to.be.false;

  // Invalid names (empty)
  expect(validate('')).to.be.false;
});

// Test config.json response
tap.test('should return valid config.json', async () => {
  const storage = new RegistryStorage({
    accessKey: 'test',
    accessSecret: 'test',
    endpoint: 's3.test.com',
    bucketName: 'test-bucket',
  });

  const authManager = new AuthManager({
    jwtSecret: 'test-secret',
    tokenStore: 'memory',
    npmTokens: { enabled: true },
    ociTokens: { enabled: false, realm: '', service: '' },
  });

  const registry = new CargoRegistry(storage, authManager, '/cargo', 'http://localhost:5000/cargo');

  const response = await registry.handleRequest({
    method: 'GET',
    path: '/cargo/config.json',
    headers: {},
    query: {},
  });

  expect(response.status).to.equal(200);
  expect(response.headers['Content-Type']).to.equal('application/json');
  expect(response.body).to.be.an('object');
  expect(response.body.dl).to.include('/api/v1/crates/{crate}/{version}/download');
  expect(response.body.api).to.equal('http://localhost:5000/cargo');
});

export default tap.start();

@@ -6,7 +6,7 @@ import type { IRegistryConfig } from '../../ts/core/interfaces.core.js';
const testQenv = new qenv.Qenv('./', './.nogit');

/**
 * Create a test SmartRegistry instance with OCI, NPM, Maven, and Composer enabled
 */
export async function createTestRegistry(): Promise<SmartRegistry> {
  // Read S3 config from env.json
@@ -49,6 +49,14 @@ export async function createTestRegistry(): Promise<SmartRegistry> {
      enabled: true,
      basePath: '/maven',
    },
    composer: {
      enabled: true,
      basePath: '/composer',
    },
    cargo: {
      enabled: true,
      basePath: '/cargo',
    },
  };

  const registry = new SmartRegistry(config);
@@ -86,7 +94,13 @@ export async function createTestTokens(registry: SmartRegistry) {
  // Create Maven token with full access
  const mavenToken = await authManager.createMavenToken(userId, false);

  // Create Composer token with full access
  const composerToken = await authManager.createComposerToken(userId, false);

  // Create Cargo token with full access
  const cargoToken = await authManager.createCargoToken(userId, false);

  return { npmToken, ociToken, mavenToken, composerToken, cargoToken, userId };
}

/**
@@ -205,3 +219,61 @@ export function calculateMavenChecksums(data: Buffer) {
    sha512: crypto.createHash('sha512').update(data).digest('hex'),
  };
}

/**
 * Helper to create a Composer package ZIP
 */
export async function createComposerZip(
  vendorPackage: string,
  version: string,
  options?: {
    description?: string;
    license?: string[];
    authors?: Array<{ name: string; email?: string }>;
  }
): Promise<Buffer> {
  const AdmZip = (await import('adm-zip')).default;
  const zip = new AdmZip();

  const composerJson = {
    name: vendorPackage,
    version: version,
    type: 'library',
    description: options?.description || 'Test Composer package',
    license: options?.license || ['MIT'],
    authors: options?.authors || [{ name: 'Test Author', email: 'test@example.com' }],
    require: {
      php: '>=7.4',
    },
    autoload: {
      'psr-4': {
        'Vendor\\TestPackage\\': 'src/',
      },
    },
  };

  // Add composer.json
  zip.addFile('composer.json', Buffer.from(JSON.stringify(composerJson, null, 2), 'utf-8'));

  // Add a test PHP file
  const [vendor, pkg] = vendorPackage.split('/');
  const namespace = `${vendor.charAt(0).toUpperCase() + vendor.slice(1)}\\${pkg.charAt(0).toUpperCase() + pkg.slice(1).replace(/-/g, '')}`;
  const testPhpContent = `<?php
namespace ${namespace};

class TestClass
{
    public function greet(): string
    {
        return "Hello from ${vendorPackage}!";
    }
}
`;

  zip.addFile('src/TestClass.php', Buffer.from(testPhpContent, 'utf-8'));

  // Add README
  zip.addFile('README.md', Buffer.from(`# ${vendorPackage}\n\nTest package`, 'utf-8'));

  return zip.toBuffer();
}

test/test.cargo.nativecli.node.ts (Normal file, 475 lines)
@@ -0,0 +1,475 @@
/**
 * Native cargo CLI Testing
 * Tests the Cargo registry implementation using the actual cargo CLI
 */

import { expect, tap } from '@git.zone/tstest/tapbundle';
import { tapNodeTools } from '@git.zone/tstest/tapbundle_serverside';
import { SmartRegistry } from '../ts/index.js';
import { createTestRegistry, createTestTokens } from './helpers/registry.js';
import type { IRequestContext, IResponse } from '../ts/core/interfaces.core.js';
import * as http from 'http';
import * as url from 'url';
import * as fs from 'fs';
import * as path from 'path';

// Test context
let registry: SmartRegistry;
let server: http.Server;
let registryUrl: string;
let registryPort: number;
let cargoToken: string;
let testDir: string;
let cargoHome: string;

/**
 * Create HTTP server wrapper around SmartRegistry
 */
async function createHttpServer(
  registryInstance: SmartRegistry,
  port: number
): Promise<{ server: http.Server; url: string }> {
  return new Promise((resolve, reject) => {
    const httpServer = http.createServer(async (req, res) => {
      try {
        // Parse request
        const parsedUrl = url.parse(req.url || '', true);
        const pathname = parsedUrl.pathname || '/';
        const query = parsedUrl.query;

        // Read body
        const chunks: Buffer[] = [];
        for await (const chunk of req) {
          chunks.push(chunk);
        }
        const bodyBuffer = Buffer.concat(chunks);

        // Parse body based on content type
        let body: any;
        if (bodyBuffer.length > 0) {
          const contentType = req.headers['content-type'] || '';
          if (contentType.includes('application/json')) {
            try {
              body = JSON.parse(bodyBuffer.toString('utf-8'));
            } catch (error) {
              body = bodyBuffer;
            }
          } else {
            body = bodyBuffer;
          }
        }

        // Convert to IRequestContext
        const context: IRequestContext = {
          method: req.method || 'GET',
          path: pathname,
          headers: req.headers as Record<string, string>,
          query: query as Record<string, string>,
          body: body,
        };

        // Handle request
        const response: IResponse = await registryInstance.handleRequest(context);

        // Convert IResponse to HTTP response
        res.statusCode = response.status;

        // Set headers
        for (const [key, value] of Object.entries(response.headers || {})) {
          res.setHeader(key, value);
        }

        // Send body
        if (response.body) {
          if (Buffer.isBuffer(response.body)) {
            res.end(response.body);
          } else if (typeof response.body === 'string') {
            res.end(response.body);
          } else {
            res.setHeader('Content-Type', 'application/json');
            res.end(JSON.stringify(response.body));
          }
        } else {
          res.end();
        }
      } catch (error) {
        console.error('Server error:', error);
        res.statusCode = 500;
        res.setHeader('Content-Type', 'application/json');
        res.end(JSON.stringify({ error: 'INTERNAL_ERROR', message: String(error) }));
      }
    });

    httpServer.listen(port, () => {
      const serverUrl = `http://localhost:${port}`;
      resolve({ server: httpServer, url: serverUrl });
    });

    httpServer.on('error', reject);
  });
}

/**
 * Setup Cargo configuration
 */
function setupCargoConfig(registryUrlArg: string, token: string, cargoHomeArg: string): void {
  const cargoConfigDir = path.join(cargoHomeArg, '.cargo');
  fs.mkdirSync(cargoConfigDir, { recursive: true });

  // Create config.toml with sparse protocol
  const configContent = `[registries.test-registry]
index = "sparse+${registryUrlArg}/cargo/"

[source.crates-io]
replace-with = "test-registry"

[net]
retry = 0
`;

  fs.writeFileSync(path.join(cargoConfigDir, 'config.toml'), configContent, 'utf-8');

  // Create credentials.toml (Cargo uses plain token, no "Bearer" prefix)
  const credentialsContent = `[registries.test-registry]
token = "${token}"
`;

  fs.writeFileSync(path.join(cargoConfigDir, 'credentials.toml'), credentialsContent, 'utf-8');
}

/**
 * Create a test Cargo crate
 */
function createTestCrate(
  crateName: string,
  version: string,
  targetDir: string
): string {
  const crateDir = path.join(targetDir, crateName);
  fs.mkdirSync(crateDir, { recursive: true });

  // Create Cargo.toml
  const cargoToml = `[package]
name = "${crateName}"
version = "${version}"
edition = "2021"
description = "Test crate ${crateName}"
license = "MIT"
authors = ["Test Author <test@example.com>"]

[dependencies]
`;

  fs.writeFileSync(path.join(crateDir, 'Cargo.toml'), cargoToml, 'utf-8');

  // Create src directory
  const srcDir = path.join(crateDir, 'src');
  fs.mkdirSync(srcDir, { recursive: true });

  // Create lib.rs
  const libRs = `//! Test crate ${crateName}

/// Returns a greeting message
pub fn greet() -> String {
    format!("Hello from {}@{}", "${crateName}", "${version}")
}

#[cfg(test)]
mod tests {
    use super::*;

    #[test]
    fn test_greet() {
        let greeting = greet();
        assert!(greeting.contains("${crateName}"));
    }
}
`;

  fs.writeFileSync(path.join(srcDir, 'lib.rs'), libRs, 'utf-8');

  // Create README.md
  const readme = `# ${crateName}

Test crate for SmartRegistry.

Version: ${version}
`;

  fs.writeFileSync(path.join(crateDir, 'README.md'), readme, 'utf-8');

  return crateDir;
}

/**
 * Run cargo command with proper environment
 */
async function runCargoCommand(
  command: string,
  cwd: string,
  includeToken: boolean = true
): Promise<{ stdout: string; stderr: string; exitCode: number }> {
  // Prepare environment variables
  // NOTE: Cargo converts registry name "test-registry" to "TEST_REGISTRY" for env vars
  const envVars = [
    `CARGO_HOME="${cargoHome}"`,
    `CARGO_REGISTRIES_TEST_REGISTRY_INDEX="sparse+${registryUrl}/cargo/"`,
    includeToken ? `CARGO_REGISTRIES_TEST_REGISTRY_TOKEN="${cargoToken}"` : '',
    `CARGO_NET_RETRY="0"`,
  ].filter(Boolean).join(' ');

  // Build command with cd to correct directory and environment variables
  const fullCommand = `cd "${cwd}" && ${envVars} ${command}`;

  try {
    const result = await tapNodeTools.runCommand(fullCommand);
    return {
      stdout: result.stdout || '',
      stderr: result.stderr || '',
      exitCode: result.exitCode || 0,
    };
  } catch (error: any) {
    return {
      stdout: error.stdout || '',
      stderr: error.stderr || String(error),
      exitCode: error.exitCode || 1,
    };
  }
}

/**
 * Cleanup test directory
 */
function cleanupTestDir(dir: string): void {
  if (fs.existsSync(dir)) {
    fs.rmSync(dir, { recursive: true, force: true });
  }
}

// ========================================================================
// TESTS
// ========================================================================

tap.test('Cargo CLI: should setup registry and HTTP server', async () => {
  // Create registry
  registry = await createTestRegistry();
  const tokens = await createTestTokens(registry);
  cargoToken = tokens.cargoToken;

  expect(registry).toBeInstanceOf(SmartRegistry);
  expect(cargoToken).toBeTypeOf('string');

  // Clean up any existing index from previous test runs
  const storage = registry.getStorage();
  try {
    await storage.putCargoIndex('test-crate-cli', []);
  } catch (error) {
    // Ignore error if operation fails
  }

  // Use port 5000 (hardcoded in CargoRegistry default config)
  // TODO: Once registryUrl is configurable, use dynamic port like npm test (35001)
  registryPort = 5000;
  const serverSetup = await createHttpServer(registry, registryPort);
  server = serverSetup.server;
  registryUrl = serverSetup.url;

  expect(server).toBeDefined();
  expect(registryUrl).toEqual(`http://localhost:${registryPort}`);

  // Setup test directory
  testDir = path.join(process.cwd(), '.nogit', 'test-cargo-cli');
  cleanupTestDir(testDir);
  fs.mkdirSync(testDir, { recursive: true });

  // Setup CARGO_HOME
  cargoHome = path.join(testDir, '.cargo-home');
  fs.mkdirSync(cargoHome, { recursive: true });

  // Setup Cargo config
  setupCargoConfig(registryUrl, cargoToken, cargoHome);
  expect(fs.existsSync(path.join(cargoHome, '.cargo', 'config.toml'))).toEqual(true);
  expect(fs.existsSync(path.join(cargoHome, '.cargo', 'credentials.toml'))).toEqual(true);
});

tap.test('Cargo CLI: should verify server is responding', async () => {
  // Check server is up by doing a direct HTTP request to the cargo index
  const response = await fetch(`${registryUrl}/cargo/`);
  expect(response.status).toBeGreaterThanOrEqual(200);
  expect(response.status).toBeLessThan(500);
});

tap.test('Cargo CLI: should publish a crate', async () => {
  const crateName = 'test-crate-cli';
  const version = '0.1.0';
  const crateDir = createTestCrate(crateName, version, testDir);

  const result = await runCargoCommand('cargo publish --registry test-registry --allow-dirty', crateDir);
  console.log('cargo publish output:', result.stdout);
  console.log('cargo publish stderr:', result.stderr);

  expect(result.exitCode).toEqual(0);
  expect(result.stdout || result.stderr).toContain(crateName);
});

tap.test('Cargo CLI: should verify crate in index', async () => {
  const crateName = 'test-crate-cli';

  // Cargo uses a specific index structure
  // For crate "test-crate-cli", the index path is based on the first characters
  // 1 char: <name>
  // 2 char: 2/<name>
  // 3 char: 3/<first-char>/<name>
  // 4+ char: <first-2-chars>/<second-2-chars>/<name>

  // "test-crate-cli" is 14 chars, so it should be at: te/st/test-crate-cli
  const indexPath = `/cargo/te/st/${crateName}`;

  const response = await fetch(`${registryUrl}${indexPath}`);
  expect(response.status).toEqual(200);

  const indexData = await response.text();
  console.log('Index data:', indexData);

  // Index should contain JSON line with crate info
  expect(indexData).toContain(crateName);
  expect(indexData).toContain('0.1.0');
});

tap.test('Cargo CLI: should download published crate', async () => {
  const crateName = 'test-crate-cli';
  const version = '0.1.0';

  // Cargo downloads crates from /cargo/api/v1/crates/{name}/{version}/download
  const downloadPath = `/cargo/api/v1/crates/${crateName}/${version}/download`;

  const response = await fetch(`${registryUrl}${downloadPath}`);
  expect(response.status).toEqual(200);

  const crateData = await response.arrayBuffer();
  expect(crateData.byteLength).toBeGreaterThan(0);
});

tap.test('Cargo CLI: should publish second version', async () => {
  const crateName = 'test-crate-cli';
  const version = '0.2.0';
  const crateDir = createTestCrate(crateName, version, testDir);

  const result = await runCargoCommand('cargo publish --registry test-registry --allow-dirty', crateDir);
  console.log('cargo publish v0.2.0 output:', result.stdout);

  expect(result.exitCode).toEqual(0);
});

tap.test('Cargo CLI: should list versions in index', async () => {
  const crateName = 'test-crate-cli';
  const indexPath = `/cargo/te/st/${crateName}`;

  const response = await fetch(`${registryUrl}${indexPath}`);
  expect(response.status).toEqual(200);

  const indexData = await response.text();
  const lines = indexData.trim().split('\n');

  // Should have 2 lines (2 versions)
  expect(lines.length).toEqual(2);

  // Parse JSON lines
  const version1 = JSON.parse(lines[0]);
  const version2 = JSON.parse(lines[1]);

  expect(version1.vers).toEqual('0.1.0');
  expect(version2.vers).toEqual('0.2.0');
});

tap.test('Cargo CLI: should search for crate', async () => {
  const crateName = 'test-crate-cli';

  // Cargo search endpoint: /cargo/api/v1/crates?q={query}
  const response = await fetch(`${registryUrl}/cargo/api/v1/crates?q=${crateName}`);
  expect(response.status).toEqual(200);

  const searchResults = await response.json();
  console.log('Search results:', searchResults);

  expect(searchResults).toHaveProperty('crates');
  expect(searchResults.crates).toBeInstanceOf(Array);
  expect(searchResults.crates.length).toBeGreaterThan(0);
  expect(searchResults.crates[0].name).toEqual(crateName);
});

tap.test('Cargo CLI: should yank a version', async () => {
  const crateName = 'test-crate-cli';
  const crateDir = path.join(testDir, crateName);

  const result = await runCargoCommand('cargo yank --registry test-registry --vers 0.1.0', crateDir);
  console.log('cargo yank output:', result.stdout);
  console.log('cargo yank stderr:', result.stderr);

  expect(result.exitCode).toEqual(0);

  // Verify version is yanked in index
  const indexPath = `/cargo/te/st/${crateName}`;
  const response = await fetch(`${registryUrl}${indexPath}`);
  const indexData = await response.text();
  const lines = indexData.trim().split('\n');
  const version1 = JSON.parse(lines[0]);

  expect(version1.yanked).toEqual(true);
});

tap.test('Cargo CLI: should unyank a version', async () => {
  const crateName = 'test-crate-cli';
  const crateDir = path.join(testDir, crateName);

  const result = await runCargoCommand('cargo yank --registry test-registry --vers 0.1.0 --undo', crateDir);
  console.log('cargo unyank output:', result.stdout);
  console.log('cargo unyank stderr:', result.stderr);

  expect(result.exitCode).toEqual(0);

  // Verify version is not yanked in index
  const indexPath = `/cargo/te/st/${crateName}`;
  const response = await fetch(`${registryUrl}${indexPath}`);
  const indexData = await response.text();
  const lines = indexData.trim().split('\n');
  const version1 = JSON.parse(lines[0]);

  expect(version1.yanked).toEqual(false);
});

tap.test('Cargo CLI: should fail to publish without auth', async () => {
  const crateName = 'unauth-crate';
  const version = '0.1.0';
  const crateDir = createTestCrate(crateName, version, testDir);

  // Run without token (includeToken: false)
  const result = await runCargoCommand('cargo publish --registry test-registry --allow-dirty', crateDir, false);
  console.log('cargo publish unauth output:', result.stdout);
  console.log('cargo publish unauth stderr:', result.stderr);

  // Should fail with auth error
  expect(result.exitCode).not.toEqual(0);
  expect(result.stderr).toContain('token');
});

tap.postTask('cleanup cargo cli tests', async () => {
  // Stop server
  if (server) {
    await new Promise<void>((resolve) => {
      server.close(() => resolve());
    });
  }

  // Cleanup test directory
  if (testDir) {
    cleanupTestDir(testDir);
  }

  // Destroy registry
  if (registry) {
    registry.destroy();
  }
});

export default tap.start();
296 test/test.composer.ts Normal file
@@ -0,0 +1,296 @@
import { expect, tap } from '@git.zone/tstest/tapbundle';
import { SmartRegistry } from '../ts/index.js';
import { createTestRegistry, createTestTokens, createComposerZip } from './helpers/registry.js';

let registry: SmartRegistry;
let composerToken: string;
let userId: string;

// Test data
const testPackageName = 'vendor/test-package';
const testVersion = '1.0.0';
let testZipData: Buffer;

tap.test('Composer: should create registry instance', async () => {
  registry = await createTestRegistry();
  const tokens = await createTestTokens(registry);
  composerToken = tokens.composerToken;
  userId = tokens.userId;

  expect(registry).toBeInstanceOf(SmartRegistry);
  expect(composerToken).toBeTypeOf('string');
});

tap.test('Composer: should create test ZIP package', async () => {
  testZipData = await createComposerZip(testPackageName, testVersion, {
    description: 'Test Composer package for registry',
    license: ['MIT'],
    authors: [{ name: 'Test Author', email: 'test@example.com' }],
  });

  expect(testZipData).toBeInstanceOf(Buffer);
  expect(testZipData.length).toBeGreaterThan(0);
});

tap.test('Composer: should return packages.json (GET /packages.json)', async () => {
  const response = await registry.handleRequest({
    method: 'GET',
    path: '/composer/packages.json',
    headers: {},
    query: {},
  });

  expect(response.status).toEqual(200);
  expect(response.body).toHaveProperty('metadata-url');
  expect(response.body).toHaveProperty('available-packages');
  expect(response.body['available-packages']).toBeInstanceOf(Array);
});

tap.test('Composer: should upload a package (PUT /packages/{vendor/package})', async () => {
  const response = await registry.handleRequest({
    method: 'PUT',
    path: `/composer/packages/${testPackageName}`,
    headers: {
      Authorization: `Bearer ${composerToken}`,
      'Content-Type': 'application/zip',
    },
    query: {},
    body: testZipData,
  });

  expect(response.status).toEqual(201);
  expect(response.body.status).toEqual('success');
  expect(response.body.package).toEqual(testPackageName);
  expect(response.body.version).toEqual(testVersion);
});

tap.test('Composer: should retrieve package metadata (GET /p2/{vendor/package}.json)', async () => {
  const response = await registry.handleRequest({
    method: 'GET',
    path: `/composer/p2/${testPackageName}.json`,
    headers: {},
    query: {},
  });

  expect(response.status).toEqual(200);
  expect(response.body).toHaveProperty('packages');
  expect(response.body.packages[testPackageName]).toBeInstanceOf(Array);
  expect(response.body.packages[testPackageName].length).toEqual(1);

  const packageData = response.body.packages[testPackageName][0];
  expect(packageData.name).toEqual(testPackageName);
  expect(packageData.version).toEqual(testVersion);
  expect(packageData.version_normalized).toEqual('1.0.0.0');
  expect(packageData).toHaveProperty('dist');
  expect(packageData.dist.type).toEqual('zip');
  expect(packageData.dist).toHaveProperty('url');
  expect(packageData.dist).toHaveProperty('shasum');
  expect(packageData.dist).toHaveProperty('reference');
});

tap.test('Composer: should download package ZIP (GET /dists/{vendor/package}/{ref}.zip)', async () => {
  // First get metadata to find reference
  const metadataResponse = await registry.handleRequest({
    method: 'GET',
    path: `/composer/p2/${testPackageName}.json`,
    headers: {},
    query: {},
  });

  const reference = metadataResponse.body.packages[testPackageName][0].dist.reference;

  const response = await registry.handleRequest({
    method: 'GET',
    path: `/composer/dists/${testPackageName}/${reference}.zip`,
    headers: {},
    query: {},
  });

  expect(response.status).toEqual(200);
  expect(response.body).toBeInstanceOf(Buffer);
  expect(response.headers['Content-Type']).toEqual('application/zip');
  expect(response.headers['Content-Disposition']).toContain('attachment');
});

tap.test('Composer: should list packages (GET /packages/list.json)', async () => {
  const response = await registry.handleRequest({
    method: 'GET',
    path: '/composer/packages/list.json',
    headers: {},
    query: {},
  });

  expect(response.status).toEqual(200);
  expect(response.body).toHaveProperty('packageNames');
  expect(response.body.packageNames).toBeInstanceOf(Array);
  expect(response.body.packageNames).toContain(testPackageName);
});

tap.test('Composer: should filter package list (GET /packages/list.json?filter=vendor/*)', async () => {
  const response = await registry.handleRequest({
    method: 'GET',
    path: '/composer/packages/list.json',
    headers: {},
    query: { filter: 'vendor/*' },
  });

  expect(response.status).toEqual(200);
  expect(response.body.packageNames).toBeInstanceOf(Array);
  expect(response.body.packageNames).toContain(testPackageName);
});

tap.test('Composer: should prevent duplicate version upload', async () => {
  const response = await registry.handleRequest({
    method: 'PUT',
    path: `/composer/packages/${testPackageName}`,
    headers: {
      Authorization: `Bearer ${composerToken}`,
      'Content-Type': 'application/zip',
    },
    query: {},
    body: testZipData,
  });

  expect(response.status).toEqual(409);
  expect(response.body.status).toEqual('error');
  expect(response.body.message).toContain('already exists');
});

tap.test('Composer: should upload a second version', async () => {
  const testVersion2 = '1.1.0';
  const testZipData2 = await createComposerZip(testPackageName, testVersion2);

  const response = await registry.handleRequest({
    method: 'PUT',
    path: `/composer/packages/${testPackageName}`,
    headers: {
      Authorization: `Bearer ${composerToken}`,
      'Content-Type': 'application/zip',
    },
    query: {},
    body: testZipData2,
  });

  expect(response.status).toEqual(201);
  expect(response.body.status).toEqual('success');
  expect(response.body.version).toEqual(testVersion2);
});

tap.test('Composer: should return multiple versions in metadata', async () => {
  const response = await registry.handleRequest({
    method: 'GET',
    path: `/composer/p2/${testPackageName}.json`,
    headers: {},
    query: {},
  });

  expect(response.status).toEqual(200);
  expect(response.body.packages[testPackageName]).toBeInstanceOf(Array);
  expect(response.body.packages[testPackageName].length).toEqual(2);

  const versions = response.body.packages[testPackageName].map((p: any) => p.version);
  expect(versions).toContain('1.0.0');
  expect(versions).toContain('1.1.0');
});

tap.test('Composer: should delete a specific version (DELETE /packages/{vendor/package}/{version})', async () => {
  const response = await registry.handleRequest({
    method: 'DELETE',
    path: `/composer/packages/${testPackageName}/1.0.0`,
    headers: {
      Authorization: `Bearer ${composerToken}`,
    },
    query: {},
  });

  expect(response.status).toEqual(204);

  // Verify version was removed
  const metadataResponse = await registry.handleRequest({
    method: 'GET',
    path: `/composer/p2/${testPackageName}.json`,
    headers: {},
    query: {},
  });

  expect(metadataResponse.body.packages[testPackageName].length).toEqual(1);
  expect(metadataResponse.body.packages[testPackageName][0].version).toEqual('1.1.0');
});

tap.test('Composer: should require auth for package upload', async () => {
  const testZipData3 = await createComposerZip('vendor/unauth-package', '1.0.0');

  const response = await registry.handleRequest({
    method: 'PUT',
    path: '/composer/packages/vendor/unauth-package',
    headers: {
      'Content-Type': 'application/zip',
    },
    query: {},
    body: testZipData3,
  });

  expect(response.status).toEqual(401);
  expect(response.body.status).toEqual('error');
});

tap.test('Composer: should reject invalid ZIP (no composer.json)', async () => {
  const invalidZip = Buffer.from('invalid zip content');

  const response = await registry.handleRequest({
    method: 'PUT',
    path: `/composer/packages/${testPackageName}`,
    headers: {
      Authorization: `Bearer ${composerToken}`,
      'Content-Type': 'application/zip',
    },
    query: {},
    body: invalidZip,
  });

  expect(response.status).toEqual(400);
  expect(response.body.status).toEqual('error');
  expect(response.body.message).toContain('composer.json');
});

tap.test('Composer: should delete entire package (DELETE /packages/{vendor/package})', async () => {
  const response = await registry.handleRequest({
    method: 'DELETE',
    path: `/composer/packages/${testPackageName}`,
    headers: {
      Authorization: `Bearer ${composerToken}`,
    },
    query: {},
  });

  expect(response.status).toEqual(204);

  // Verify package was removed
  const metadataResponse = await registry.handleRequest({
    method: 'GET',
    path: `/composer/p2/${testPackageName}.json`,
    headers: {},
    query: {},
  });

  expect(metadataResponse.status).toEqual(404);
});

tap.test('Composer: should return 404 for non-existent package', async () => {
  const response = await registry.handleRequest({
    method: 'GET',
    path: '/composer/p2/non/existent.json',
    headers: {},
    query: {},
  });

  expect(response.status).toEqual(404);
});

tap.postTask('cleanup registry', async () => {
  if (registry) {
    registry.destroy();
  }
});

export default tap.start();
@@ -30,6 +30,14 @@ tap.test('Maven: should create registry instance', async () => {
   expect(registry).toBeInstanceOf(SmartRegistry);
   expect(mavenToken).toBeTypeOf('string');
+
+  // Clean up any existing metadata from previous test runs
+  const storage = registry.getStorage();
+  try {
+    await storage.deleteMavenMetadata(testGroupId, testArtifactId);
+  } catch (error) {
+    // Ignore error if metadata doesn't exist
+  }
 });

 tap.test('Maven: should upload POM file (PUT /{groupPath}/{artifactId}/{version}/*.pom)', async () => {
@@ -336,7 +344,7 @@ tap.test('Maven: should delete an artifact (DELETE)', async () => {
     query: {},
   });

-  expect(response.status).toEqual(200);
+  expect(response.status).toEqual(204); // 204 No Content is correct for DELETE

   // Verify artifact was deleted
   const getResponse = await registry.handleRequest({
412 test/test.npm.nativecli.node.ts Normal file
@@ -0,0 +1,412 @@
/**
 * Native npm CLI Testing
 * Tests the NPM registry implementation using the actual npm CLI
 */

import { expect, tap } from '@git.zone/tstest/tapbundle';
import { tapNodeTools } from '@git.zone/tstest/tapbundle_serverside';
import { SmartRegistry } from '../ts/index.js';
import { createTestRegistry, createTestTokens } from './helpers/registry.js';
import type { IRequestContext, IResponse } from '../ts/core/interfaces.core.js';
import * as http from 'http';
import * as url from 'url';
import * as fs from 'fs';
import * as path from 'path';

// Test context
let registry: SmartRegistry;
let server: http.Server;
let registryUrl: string;
let registryPort: number;
let npmToken: string;
let testDir: string;
let npmrcPath: string;

/**
 * Create HTTP server wrapper around SmartRegistry
 */
async function createHttpServer(
  registryInstance: SmartRegistry,
  port: number
): Promise<{ server: http.Server; url: string }> {
  return new Promise((resolve, reject) => {
    const httpServer = http.createServer(async (req, res) => {
      try {
        // Parse request
        const parsedUrl = url.parse(req.url || '', true);
        const pathname = parsedUrl.pathname || '/';
        const query = parsedUrl.query;

        // Read body
        const chunks: Buffer[] = [];
        for await (const chunk of req) {
          chunks.push(chunk);
        }
        const bodyBuffer = Buffer.concat(chunks);

        // Parse body based on content type
        let body: any;
        if (bodyBuffer.length > 0) {
          const contentType = req.headers['content-type'] || '';
          if (contentType.includes('application/json')) {
            try {
              body = JSON.parse(bodyBuffer.toString('utf-8'));
            } catch (error) {
              body = bodyBuffer;
            }
          } else {
            body = bodyBuffer;
          }
        }

        // Convert to IRequestContext
        const context: IRequestContext = {
          method: req.method || 'GET',
          path: pathname,
          headers: req.headers as Record<string, string>,
          query: query as Record<string, string>,
          body: body,
        };

        // Handle request
        const response: IResponse = await registryInstance.handleRequest(context);

        // Convert IResponse to HTTP response
        res.statusCode = response.status;

        // Set headers
        for (const [key, value] of Object.entries(response.headers || {})) {
          res.setHeader(key, value);
        }

        // Send body
        if (response.body) {
          if (Buffer.isBuffer(response.body)) {
            res.end(response.body);
          } else if (typeof response.body === 'string') {
            res.end(response.body);
          } else {
            res.setHeader('Content-Type', 'application/json');
            res.end(JSON.stringify(response.body));
          }
        } else {
          res.end();
        }
      } catch (error) {
        console.error('Server error:', error);
        res.statusCode = 500;
        res.setHeader('Content-Type', 'application/json');
        res.end(JSON.stringify({ error: 'INTERNAL_ERROR', message: String(error) }));
      }
    });

    httpServer.listen(port, () => {
      const serverUrl = `http://localhost:${port}`;
      resolve({ server: httpServer, url: serverUrl });
    });

    httpServer.on('error', reject);
  });
}

/**
 * Setup .npmrc configuration
 */
function setupNpmrc(registryUrlArg: string, token: string, testDirArg: string): string {
  const npmrcContent = `registry=${registryUrlArg}/npm/
//localhost:${registryPort}/npm/:_authToken=${token}
`;

  const npmrcFilePath = path.join(testDirArg, '.npmrc');
  fs.writeFileSync(npmrcFilePath, npmrcContent, 'utf-8');
  return npmrcFilePath;
}

/**
 * Create a test package
 */
function createTestPackage(
  packageName: string,
  version: string,
  targetDir: string
): string {
  const packageDir = path.join(targetDir, packageName);
  fs.mkdirSync(packageDir, { recursive: true });

  // Create package.json
  const packageJson = {
    name: packageName,
    version: version,
    description: `Test package ${packageName}`,
    main: 'index.js',
    scripts: {
      test: 'echo "Test passed"',
    },
    keywords: ['test'],
    author: 'Test Author',
    license: 'MIT',
  };

  fs.writeFileSync(
    path.join(packageDir, 'package.json'),
    JSON.stringify(packageJson, null, 2),
    'utf-8'
  );

  // Create index.js
  const indexJs = `module.exports = {
  name: '${packageName}',
  version: '${version}',
  message: 'Hello from ${packageName}@${version}'
};
`;

  fs.writeFileSync(path.join(packageDir, 'index.js'), indexJs, 'utf-8');

  // Create README.md
  const readme = `# ${packageName}

Test package for SmartRegistry.

Version: ${version}
`;

  fs.writeFileSync(path.join(packageDir, 'README.md'), readme, 'utf-8');

  return packageDir;
}

/**
 * Run npm command with proper environment
 */
async function runNpmCommand(
  command: string,
  cwd: string
): Promise<{ stdout: string; stderr: string; exitCode: number }> {
  // Prepare environment variables
  const envVars = [
    `NPM_CONFIG_USERCONFIG="${npmrcPath}"`,
    `NPM_CONFIG_CACHE="${path.join(testDir, '.npm-cache')}"`,
    `NPM_CONFIG_PREFIX="${path.join(testDir, '.npm-global')}"`,
    `NPM_CONFIG_REGISTRY="${registryUrl}/npm/"`,
  ].join(' ');

  // Build command with cd to correct directory and environment variables
  const fullCommand = `cd "${cwd}" && ${envVars} ${command}`;

  try {
    const result = await tapNodeTools.runCommand(fullCommand);
    return {
      stdout: result.stdout || '',
      stderr: result.stderr || '',
      exitCode: result.exitCode || 0,
    };
  } catch (error: any) {
    return {
      stdout: error.stdout || '',
      stderr: error.stderr || String(error),
      exitCode: error.exitCode || 1,
    };
  }
}

/**
 * Cleanup test directory
 */
function cleanupTestDir(dir: string): void {
  if (fs.existsSync(dir)) {
    fs.rmSync(dir, { recursive: true, force: true });
  }
}

// ========================================================================
// TESTS
// ========================================================================

tap.test('NPM CLI: should setup registry and HTTP server', async () => {
  // Create registry
  registry = await createTestRegistry();
  const tokens = await createTestTokens(registry);
  npmToken = tokens.npmToken;

  expect(registry).toBeInstanceOf(SmartRegistry);
  expect(npmToken).toBeTypeOf('string');

  // Find available port
  registryPort = 35000;
  const serverSetup = await createHttpServer(registry, registryPort);
  server = serverSetup.server;
  registryUrl = serverSetup.url;

  expect(server).toBeDefined();
  expect(registryUrl).toEqual(`http://localhost:${registryPort}`);

  // Setup test directory
  testDir = path.join(process.cwd(), '.nogit', 'test-npm-cli');
  cleanupTestDir(testDir);
  fs.mkdirSync(testDir, { recursive: true });

  // Setup .npmrc
  npmrcPath = setupNpmrc(registryUrl, npmToken, testDir);
  expect(fs.existsSync(npmrcPath)).toEqual(true);
});

tap.test('NPM CLI: should verify server is responding', async () => {
  const result = await runNpmCommand('npm ping', testDir);
  console.log('npm ping output:', result.stdout, result.stderr);

  // npm ping may not work with custom registries, so just check server is up
  // by doing a direct HTTP request
  const response = await fetch(`${registryUrl}/npm/`);
  expect(response.status).toBeGreaterThanOrEqual(200);
  expect(response.status).toBeLessThan(500);
});

tap.test('NPM CLI: should publish a package', async () => {
  const packageName = 'test-package-cli';
  const version = '1.0.0';
  const packageDir = createTestPackage(packageName, version, testDir);

  const result = await runNpmCommand('npm publish', packageDir);
  console.log('npm publish output:', result.stdout);
  console.log('npm publish stderr:', result.stderr);

  expect(result.exitCode).toEqual(0);
  expect(result.stdout || result.stderr).toContain(packageName);
});

tap.test('NPM CLI: should view published package', async () => {
  const packageName = 'test-package-cli';

  const result = await runNpmCommand(`npm view ${packageName}`, testDir);
  console.log('npm view output:', result.stdout);

  expect(result.exitCode).toEqual(0);
  expect(result.stdout).toContain(packageName);
  expect(result.stdout).toContain('1.0.0');
});

tap.test('NPM CLI: should install published package', async () => {
  const packageName = 'test-package-cli';
  const installDir = path.join(testDir, 'install-test');
  fs.mkdirSync(installDir, { recursive: true });

  // Create package.json for installation
  const packageJson = {
    name: 'install-test',
    version: '1.0.0',
    dependencies: {
      [packageName]: '1.0.0',
    },
  };

  fs.writeFileSync(
    path.join(installDir, 'package.json'),
    JSON.stringify(packageJson, null, 2),
    'utf-8'
  );

  const result = await runNpmCommand('npm install', installDir);
  console.log('npm install output:', result.stdout);
  console.log('npm install stderr:', result.stderr);

  expect(result.exitCode).toEqual(0);

  // Verify package was installed
  const nodeModulesPath = path.join(installDir, 'node_modules', packageName);
  expect(fs.existsSync(nodeModulesPath)).toEqual(true);
  expect(fs.existsSync(path.join(nodeModulesPath, 'package.json'))).toEqual(true);
  expect(fs.existsSync(path.join(nodeModulesPath, 'index.js'))).toEqual(true);

  // Verify package contents
  const installedPackageJson = JSON.parse(
    fs.readFileSync(path.join(nodeModulesPath, 'package.json'), 'utf-8')
  );
  expect(installedPackageJson.name).toEqual(packageName);
  expect(installedPackageJson.version).toEqual('1.0.0');
});

tap.test('NPM CLI: should publish second version', async () => {
  const packageName = 'test-package-cli';
  const version = '1.1.0';
  const packageDir = createTestPackage(packageName, version, testDir);

  const result = await runNpmCommand('npm publish', packageDir);
  console.log('npm publish v1.1.0 output:', result.stdout);

  expect(result.exitCode).toEqual(0);
});

tap.test('NPM CLI: should list versions', async () => {
  const packageName = 'test-package-cli';

  const result = await runNpmCommand(`npm view ${packageName} versions`, testDir);
  console.log('npm view versions output:', result.stdout);

  expect(result.exitCode).toEqual(0);
  expect(result.stdout).toContain('1.0.0');
  expect(result.stdout).toContain('1.1.0');
});

tap.test('NPM CLI: should publish scoped package', async () => {
  const packageName = '@testscope/scoped-package';
  const version = '1.0.0';
  const packageDir = createTestPackage(packageName, version, testDir);

  const result = await runNpmCommand('npm publish --access public', packageDir);
  console.log('npm publish scoped output:', result.stdout);
  console.log('npm publish scoped stderr:', result.stderr);

  expect(result.exitCode).toEqual(0);
});

tap.test('NPM CLI: should view scoped package', async () => {
  const packageName = '@testscope/scoped-package';

  const result = await runNpmCommand(`npm view ${packageName}`, testDir);
  console.log('npm view scoped output:', result.stdout);

  expect(result.exitCode).toEqual(0);
  expect(result.stdout).toContain('scoped-package');
});

tap.test('NPM CLI: should fail to publish without auth', async () => {
  const packageName = 'unauth-package';
  const version = '1.0.0';
  const packageDir = createTestPackage(packageName, version, testDir);

  // Temporarily remove .npmrc
  const npmrcBackup = fs.readFileSync(npmrcPath, 'utf-8');
  fs.writeFileSync(npmrcPath, 'registry=' + registryUrl + '/npm/\n', 'utf-8');

  const result = await runNpmCommand('npm publish', packageDir);
  console.log('npm publish unauth output:', result.stdout);
  console.log('npm publish unauth stderr:', result.stderr);

  // Restore .npmrc
  fs.writeFileSync(npmrcPath, npmrcBackup, 'utf-8');

  // Should fail with auth error
  expect(result.exitCode).not.toEqual(0);
});

tap.postTask('cleanup npm cli tests', async () => {
  // Stop server
  if (server) {
    await new Promise<void>((resolve) => {
      server.close(() => resolve());
    });
  }

  // Cleanup test directory
  if (testDir) {
    cleanupTestDir(testDir);
  }

  // Destroy registry
  if (registry) {
    registry.destroy();
  }
});

export default tap.start();
@@ -3,6 +3,6 @@
  */
 export const commitinfo = {
   name: '@push.rocks/smartregistry',
-  version: '1.2.0',
+  version: '1.5.0',
   description: 'a registry for npm modules and oci images'
 }
@@ -5,10 +5,12 @@ import type { IRegistryConfig, IRequestContext, IResponse } from './core/interfa
|
|||||||
import { OciRegistry } from './oci/classes.ociregistry.js';
|
import { OciRegistry } from './oci/classes.ociregistry.js';
|
||||||
import { NpmRegistry } from './npm/classes.npmregistry.js';
|
import { NpmRegistry } from './npm/classes.npmregistry.js';
|
||||||
import { MavenRegistry } from './maven/classes.mavenregistry.js';
|
import { MavenRegistry } from './maven/classes.mavenregistry.js';
|
||||||
|
import { CargoRegistry } from './cargo/classes.cargoregistry.js';
|
||||||
|
import { ComposerRegistry } from './composer/classes.composerregistry.js';
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Main registry orchestrator
|
* Main registry orchestrator
|
||||||
* Routes requests to appropriate protocol handlers (OCI, NPM, or Maven)
|
* Routes requests to appropriate protocol handlers (OCI, NPM, Maven, Cargo, or Composer)
|
||||||
*/
|
*/
|
||||||
export class SmartRegistry {
|
export class SmartRegistry {
|
||||||
private storage: RegistryStorage;
|
private storage: RegistryStorage;
|
||||||
@@ -61,6 +63,24 @@ export class SmartRegistry {
       this.registries.set('maven', mavenRegistry);
     }
 
+    // Initialize Cargo registry if enabled
+    if (this.config.cargo?.enabled) {
+      const cargoBasePath = this.config.cargo.basePath || '/cargo';
+      const registryUrl = `http://localhost:5000${cargoBasePath}`; // TODO: Make configurable
+      const cargoRegistry = new CargoRegistry(this.storage, this.authManager, cargoBasePath, registryUrl);
+      await cargoRegistry.init();
+      this.registries.set('cargo', cargoRegistry);
+    }
+
+    // Initialize Composer registry if enabled
+    if (this.config.composer?.enabled) {
+      const composerBasePath = this.config.composer.basePath || '/composer';
+      const registryUrl = `http://localhost:5000${composerBasePath}`; // TODO: Make configurable
+      const composerRegistry = new ComposerRegistry(this.storage, this.authManager, composerBasePath, registryUrl);
+      await composerRegistry.init();
+      this.registries.set('composer', composerRegistry);
+    }
+
     this.initialized = true;
   }
 
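The per-protocol setup above follows one pattern: skip disabled protocols, fall back to a default base path, and derive the registry URL from a hard-coded host (marked TODO in the diff). A minimal standalone sketch of that derivation — the `IProtocolConfig` shape here is a simplification, not the project's real `IRegistryConfig`:

```typescript
// Simplified stand-in for the cargo/composer entries in the registry config.
interface IProtocolConfig {
  enabled: boolean;
  basePath?: string;
}

// Mirrors the init logic: disabled protocols yield no URL; otherwise the
// base path (or its default) is appended to the hard-coded localhost host.
function deriveRegistryUrl(cfg: IProtocolConfig, defaultBasePath: string): string | null {
  if (!cfg.enabled) {
    return null;
  }
  const basePath = cfg.basePath || defaultBasePath;
  return `http://localhost:5000${basePath}`;
}
```

With `{ enabled: true }` and the `/cargo` default this yields `http://localhost:5000/cargo`, matching the string built in `init()`.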
@@ -95,6 +115,22 @@ export class SmartRegistry {
       }
     }
 
+    // Route to Cargo registry
+    if (this.config.cargo?.enabled && path.startsWith(this.config.cargo.basePath)) {
+      const cargoRegistry = this.registries.get('cargo');
+      if (cargoRegistry) {
+        return cargoRegistry.handleRequest(context);
+      }
+    }
+
+    // Route to Composer registry
+    if (this.config.composer?.enabled && path.startsWith(this.config.composer.basePath)) {
+      const composerRegistry = this.registries.get('composer');
+      if (composerRegistry) {
+        return composerRegistry.handleRequest(context);
+      }
+    }
+
     // No matching registry
     return {
       status: 404,
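The routing added above is plain prefix matching: each enabled protocol claims a base path, and the first `startsWith` hit wins, with a 404 as the fallthrough. A sketch of that dispatch in isolation — the two-entry protocol table is illustrative, not the registry's full set:

```typescript
// First-match prefix routing, as in handleRequest. The table is a
// hypothetical subset covering only the two protocols added in this diff.
function routeProtocol(path: string): string | null {
  const prefixes: Array<[string, string]> = [
    ['cargo', '/cargo'],
    ['composer', '/composer'],
  ];
  for (const [protocol, basePath] of prefixes) {
    if (path.startsWith(basePath)) {
      return protocol;
    }
  }
  return null; // corresponds to the 404 "no matching registry" response
}
```

Note that pure `startsWith` routing makes ordering matter if one base path is a prefix of another (e.g. `/cargo` vs `/cargo-mirror`), which is worth keeping in mind when base paths become configurable.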
@@ -123,7 +159,7 @@ export class SmartRegistry {
   /**
    * Get a specific registry handler
    */
-  public getRegistry(protocol: 'oci' | 'npm' | 'maven'): BaseRegistry | undefined {
+  public getRegistry(protocol: 'oci' | 'npm' | 'maven' | 'cargo' | 'composer'): BaseRegistry | undefined {
     return this.registries.get(protocol);
   }
 
459 ts/composer/classes.composerregistry.ts Normal file
@@ -0,0 +1,459 @@
+/**
+ * Composer Registry Implementation
+ * Compliant with Composer v2 repository API
+ */
+
+import { BaseRegistry } from '../core/classes.baseregistry.js';
+import type { RegistryStorage } from '../core/classes.registrystorage.js';
+import type { AuthManager } from '../core/classes.authmanager.js';
+import type { IRequestContext, IResponse, IAuthToken } from '../core/interfaces.core.js';
+import type {
+  IComposerPackage,
+  IComposerPackageMetadata,
+  IComposerRepository,
+} from './interfaces.composer.js';
+import {
+  normalizeVersion,
+  validateComposerJson,
+  extractComposerJsonFromZip,
+  calculateSha1,
+  parseVendorPackage,
+  generatePackagesJson,
+  sortVersions,
+} from './helpers.composer.js';
+
+export class ComposerRegistry extends BaseRegistry {
+  private storage: RegistryStorage;
+  private authManager: AuthManager;
+  private basePath: string = '/composer';
+  private registryUrl: string;
+
+  constructor(
+    storage: RegistryStorage,
+    authManager: AuthManager,
+    basePath: string = '/composer',
+    registryUrl: string = 'http://localhost:5000/composer'
+  ) {
+    super();
+    this.storage = storage;
+    this.authManager = authManager;
+    this.basePath = basePath;
+    this.registryUrl = registryUrl;
+  }
+
+  public async init(): Promise<void> {
+    // Composer registry initialization
+  }
+
+  public getBasePath(): string {
+    return this.basePath;
+  }
+
+  public async handleRequest(context: IRequestContext): Promise<IResponse> {
+    const path = context.path.replace(this.basePath, '');
+
+    // Extract token from Authorization header
+    const authHeader = context.headers['authorization'] || context.headers['Authorization'];
+    let token: IAuthToken | null = null;
+
+    if (authHeader) {
+      if (authHeader.startsWith('Bearer ')) {
+        const tokenString = authHeader.replace(/^Bearer\s+/i, '');
+        token = await this.authManager.validateToken(tokenString, 'composer');
+      } else if (authHeader.startsWith('Basic ')) {
+        // Handle HTTP Basic Auth
+        const credentials = Buffer.from(authHeader.replace(/^Basic\s+/i, ''), 'base64').toString('utf-8');
+        const [username, password] = credentials.split(':');
+        const userId = await this.authManager.authenticate({ username, password });
+        if (userId) {
+          // Create temporary token for this request
+          token = {
+            type: 'composer',
+            userId,
+            scopes: ['composer:*:*:read'],
+            readonly: true,
+          };
+        }
+      }
+    }
+
+    // Root packages.json
+    if (path === '/packages.json' || path === '' || path === '/') {
+      return this.handlePackagesJson();
+    }
+
+    // Package metadata: /p2/{vendor}/{package}.json or /p2/{vendor}/{package}~dev.json
+    const metadataMatch = path.match(/^\/p2\/([^\/]+\/[^\/]+?)(~dev)?\.json$/);
+    if (metadataMatch) {
+      const [, vendorPackage, devSuffix] = metadataMatch;
+      const includeDev = !!devSuffix;
+      return this.handlePackageMetadata(vendorPackage, includeDev, token);
+    }
+
+    // Package list: /packages/list.json?filter=vendor/*
+    if (path.startsWith('/packages/list.json')) {
+      const filter = context.query['filter'];
+      return this.handlePackageList(filter, token);
+    }
+
+    // Package ZIP download: /dists/{vendor}/{package}/{reference}.zip
+    const distMatch = path.match(/^\/dists\/([^\/]+\/[^\/]+)\/([^\/]+)\.zip$/);
+    if (distMatch) {
+      const [, vendorPackage, reference] = distMatch;
+      return this.handlePackageDownload(vendorPackage, reference, token);
+    }
+
+    // Package upload: PUT /packages/{vendor}/{package}
+    const uploadMatch = path.match(/^\/packages\/([^\/]+\/[^\/]+)$/);
+    if (uploadMatch && context.method === 'PUT') {
+      const vendorPackage = uploadMatch[1];
+      return this.handlePackageUpload(vendorPackage, context.body, token);
+    }
+
+    // Package delete: DELETE /packages/{vendor}/{package}
+    if (uploadMatch && context.method === 'DELETE') {
+      const vendorPackage = uploadMatch[1];
+      return this.handlePackageDelete(vendorPackage, token);
+    }
+
+    // Version delete: DELETE /packages/{vendor}/{package}/{version}
+    const versionDeleteMatch = path.match(/^\/packages\/([^\/]+\/[^\/]+)\/(.+)$/);
+    if (versionDeleteMatch && context.method === 'DELETE') {
+      const [, vendorPackage, version] = versionDeleteMatch;
+      return this.handleVersionDelete(vendorPackage, version, token);
+    }
+
+    return {
+      status: 404,
+      headers: { 'Content-Type': 'application/json' },
+      body: { status: 'error', message: 'Not found' },
+    };
+  }
+
+  protected async checkPermission(
+    token: IAuthToken | null,
+    resource: string,
+    action: string
+  ): Promise<boolean> {
+    if (!token) return false;
+    return this.authManager.authorize(token, `composer:package:${resource}`, action);
+  }
+
+  // ========================================================================
+  // REQUEST HANDLERS
+  // ========================================================================
+
+  private async handlePackagesJson(): Promise<IResponse> {
+    const availablePackages = await this.storage.listComposerPackages();
+    const packagesJson = generatePackagesJson(this.registryUrl, availablePackages);
+
+    return {
+      status: 200,
+      headers: { 'Content-Type': 'application/json' },
+      body: packagesJson,
+    };
+  }
+
+  private async handlePackageMetadata(
+    vendorPackage: string,
+    includeDev: boolean,
+    token: IAuthToken | null
+  ): Promise<IResponse> {
+    // Read operations are public, no authentication required
+    const metadata = await this.storage.getComposerPackageMetadata(vendorPackage);
+
+    if (!metadata) {
+      return {
+        status: 404,
+        headers: { 'Content-Type': 'application/json' },
+        body: { status: 'error', message: 'Package not found' },
+      };
+    }
+
+    // Filter dev versions if needed
+    let packages = metadata.packages[vendorPackage] || [];
+    if (!includeDev) {
+      packages = packages.filter((pkg: IComposerPackage) =>
+        !pkg.version.includes('dev') && !pkg.version.includes('alpha') && !pkg.version.includes('beta')
+      );
+    }
+
+    const response: IComposerPackageMetadata = {
+      minified: 'composer/2.0',
+      packages: {
+        [vendorPackage]: packages,
+      },
+    };
+
+    return {
+      status: 200,
+      headers: {
+        'Content-Type': 'application/json',
+        'Last-Modified': metadata.lastModified || new Date().toUTCString(),
+      },
+      body: response,
+    };
+  }
+
+  private async handlePackageList(
+    filter: string | undefined,
+    token: IAuthToken | null
+  ): Promise<IResponse> {
+    let packages = await this.storage.listComposerPackages();
+
+    // Apply filter if provided
+    if (filter) {
+      const regex = new RegExp('^' + filter.replace(/\*/g, '.*') + '$');
+      packages = packages.filter(pkg => regex.test(pkg));
+    }
+
+    return {
+      status: 200,
+      headers: { 'Content-Type': 'application/json' },
+      body: { packageNames: packages },
+    };
+  }
+
+  private async handlePackageDownload(
+    vendorPackage: string,
+    reference: string,
+    token: IAuthToken | null
+  ): Promise<IResponse> {
+    // Read operations are public, no authentication required
+    const zipData = await this.storage.getComposerPackageZip(vendorPackage, reference);
+
+    if (!zipData) {
+      return {
+        status: 404,
+        headers: {},
+        body: { status: 'error', message: 'Package file not found' },
+      };
+    }
+
+    return {
+      status: 200,
+      headers: {
+        'Content-Type': 'application/zip',
+        'Content-Length': zipData.length.toString(),
+        'Content-Disposition': `attachment; filename="${reference}.zip"`,
+      },
+      body: zipData,
+    };
+  }
+
+  private async handlePackageUpload(
+    vendorPackage: string,
+    body: any,
+    token: IAuthToken | null
+  ): Promise<IResponse> {
+    // Check write permission
+    if (!await this.checkPermission(token, vendorPackage, 'write')) {
+      return {
+        status: 401,
+        headers: {},
+        body: { status: 'error', message: 'Write permission required' },
+      };
+    }
+
+    if (!body || !Buffer.isBuffer(body)) {
+      return {
+        status: 400,
+        headers: {},
+        body: { status: 'error', message: 'ZIP file required' },
+      };
+    }
+
+    // Extract and validate composer.json from ZIP
+    const composerJson = await extractComposerJsonFromZip(body);
+    if (!composerJson || !validateComposerJson(composerJson)) {
+      return {
+        status: 400,
+        headers: {},
+        body: { status: 'error', message: 'Invalid composer.json in ZIP' },
+      };
+    }
+
+    // Verify package name matches
+    if (composerJson.name !== vendorPackage) {
+      return {
+        status: 400,
+        headers: {},
+        body: { status: 'error', message: 'Package name mismatch' },
+      };
+    }
+
+    const version = composerJson.version;
+    if (!version) {
+      return {
+        status: 400,
+        headers: {},
+        body: { status: 'error', message: 'Version required in composer.json' },
+      };
+    }
+
+    // Calculate SHA-1 hash
+    const shasum = await calculateSha1(body);
+
+    // Generate reference (use version or commit hash)
+    const reference = composerJson.source?.reference || version.replace(/[^a-zA-Z0-9.-]/g, '-');
+
+    // Store ZIP file
+    await this.storage.putComposerPackageZip(vendorPackage, reference, body);
+
+    // Get or create metadata
+    let metadata = await this.storage.getComposerPackageMetadata(vendorPackage);
+    if (!metadata) {
+      metadata = {
+        packages: {
+          [vendorPackage]: [],
+        },
+        lastModified: new Date().toUTCString(),
+      };
+    }
+
+    // Build package entry
+    const packageEntry: IComposerPackage = {
+      ...composerJson,
+      version_normalized: normalizeVersion(version),
+      dist: {
+        type: 'zip',
+        url: `${this.registryUrl}/dists/${vendorPackage}/${reference}.zip`,
+        reference,
+        shasum,
+      },
+      time: new Date().toISOString(),
+    };
+
+    // Add to metadata (check if version already exists)
+    const packages = metadata.packages[vendorPackage] || [];
+    const existingIndex = packages.findIndex((p: IComposerPackage) => p.version === version);
+
+    if (existingIndex >= 0) {
+      return {
+        status: 409,
+        headers: {},
+        body: { status: 'error', message: 'Version already exists' },
+      };
+    }
+
+    packages.push(packageEntry);
+
+    // Sort by version
+    const sortedVersions = sortVersions(packages.map((p: IComposerPackage) => p.version));
+    packages.sort((a: IComposerPackage, b: IComposerPackage) => {
+      return sortedVersions.indexOf(a.version) - sortedVersions.indexOf(b.version);
+    });
+
+    metadata.packages[vendorPackage] = packages;
+    metadata.lastModified = new Date().toUTCString();
+
+    // Store updated metadata
+    await this.storage.putComposerPackageMetadata(vendorPackage, metadata);
+
+    return {
+      status: 201,
+      headers: {},
+      body: {
+        status: 'success',
+        message: 'Package uploaded successfully',
+        package: vendorPackage,
+        version,
+      },
+    };
+  }
+
+  private async handlePackageDelete(
+    vendorPackage: string,
+    token: IAuthToken | null
+  ): Promise<IResponse> {
+    // Check delete permission
+    if (!await this.checkPermission(token, vendorPackage, 'delete')) {
+      return {
+        status: 401,
+        headers: {},
+        body: { status: 'error', message: 'Delete permission required' },
+      };
+    }
+
+    const metadata = await this.storage.getComposerPackageMetadata(vendorPackage);
+    if (!metadata) {
+      return {
+        status: 404,
+        headers: {},
+        body: { status: 'error', message: 'Package not found' },
+      };
+    }
+
+    // Delete all ZIP files
+    const packages = metadata.packages[vendorPackage] || [];
+    for (const pkg of packages) {
+      if (pkg.dist?.reference) {
+        await this.storage.deleteComposerPackageZip(vendorPackage, pkg.dist.reference);
+      }
+    }
+
+    // Delete metadata
+    await this.storage.deleteComposerPackageMetadata(vendorPackage);
+
+    return {
+      status: 204,
+      headers: {},
+      body: null,
+    };
+  }
+
+  private async handleVersionDelete(
+    vendorPackage: string,
+    version: string,
+    token: IAuthToken | null
+  ): Promise<IResponse> {
+    // Check delete permission
+    if (!await this.checkPermission(token, vendorPackage, 'delete')) {
+      return {
+        status: 401,
+        headers: {},
+        body: { status: 'error', message: 'Delete permission required' },
+      };
+    }
+
+    const metadata = await this.storage.getComposerPackageMetadata(vendorPackage);
+    if (!metadata) {
+      return {
+        status: 404,
+        headers: {},
+        body: { status: 'error', message: 'Package not found' },
+      };
+    }
+
+    const packages = metadata.packages[vendorPackage] || [];
+    const versionIndex = packages.findIndex((p: IComposerPackage) => p.version === version);
+
+    if (versionIndex === -1) {
+      return {
+        status: 404,
+        headers: {},
+        body: { status: 'error', message: 'Version not found' },
+      };
+    }
+
+    // Delete ZIP file
+    const pkg = packages[versionIndex];
+    if (pkg.dist?.reference) {
+      await this.storage.deleteComposerPackageZip(vendorPackage, pkg.dist.reference);
+    }
+
+    // Remove from metadata
+    packages.splice(versionIndex, 1);
+    metadata.packages[vendorPackage] = packages;
+    metadata.lastModified = new Date().toUTCString();
+
+    // Save updated metadata
+    await this.storage.putComposerPackageMetadata(vendorPackage, metadata);
+
+    return {
+      status: 204,
+      headers: {},
+      body: null,
+    };
+  }
+}
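The metadata route in `handleRequest` relies on a lazy group plus an optional `~dev` suffix to split `/p2/{vendor}/{package}.json` into package name and dev flag. That regex can be exercised on its own (same pattern as the file above; the helper name and sample paths here are illustrative):

```typescript
// Same regex as the metadata route in ComposerRegistry.handleRequest.
const metadataRe = /^\/p2\/([^\/]+\/[^\/]+?)(~dev)?\.json$/;

// Hypothetical standalone helper mirroring how the handler consumes the match.
function parseMetadataPath(path: string): { vendorPackage: string; includeDev: boolean } | null {
  const m = path.match(metadataRe);
  if (!m) {
    return null;
  }
  // m[1] is the lazy vendor/package group; m[2] is the optional "~dev" suffix.
  return { vendorPackage: m[1], includeDev: !!m[2] };
}

// parseMetadataPath('/p2/acme/widgets.json')     -> { vendorPackage: 'acme/widgets', includeDev: false }
// parseMetadataPath('/p2/acme/widgets~dev.json') -> { vendorPackage: 'acme/widgets', includeDev: true }
```

The lazy `[^\/]+?` matters: it lets the optional `(~dev)?` group claim the suffix instead of the package-name group swallowing it.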
139 ts/composer/helpers.composer.ts Normal file
@@ -0,0 +1,139 @@
+/**
+ * Composer Registry Helper Functions
+ */
+
+import type { IComposerPackage } from './interfaces.composer.js';
+
+/**
+ * Normalize version string to Composer format
+ * Example: "1.0.0" -> "1.0.0.0", "v2.3.1" -> "2.3.1.0"
+ */
+export function normalizeVersion(version: string): string {
+  // Remove 'v' prefix if present
+  let normalized = version.replace(/^v/i, '');
+
+  // Handle special versions (dev, alpha, beta, rc)
+  if (normalized.includes('dev') || normalized.includes('alpha') || normalized.includes('beta') || normalized.includes('RC')) {
+    // For dev versions, just return as-is with .0 appended if needed
+    const parts = normalized.split(/[-+]/)[0].split('.');
+    while (parts.length < 4) {
+      parts.push('0');
+    }
+    return parts.slice(0, 4).join('.');
+  }
+
+  // Split by dots
+  const parts = normalized.split('.');
+
+  // Ensure 4 parts (major.minor.patch.build)
+  while (parts.length < 4) {
+    parts.push('0');
+  }
+
+  return parts.slice(0, 4).join('.');
+}
+
+/**
+ * Validate composer.json structure
+ */
+export function validateComposerJson(composerJson: any): boolean {
+  return !!(
+    composerJson &&
+    typeof composerJson.name === 'string' &&
+    composerJson.name.includes('/') &&
+    (composerJson.version || composerJson.require)
+  );
+}
+
+/**
+ * Extract composer.json from ZIP buffer
+ */
+export async function extractComposerJsonFromZip(zipBuffer: Buffer): Promise<any | null> {
+  try {
+    const AdmZip = (await import('adm-zip')).default;
+    const zip = new AdmZip(zipBuffer);
+    const entries = zip.getEntries();
+
+    // Look for composer.json in root or first-level directory
+    for (const entry of entries) {
+      if (entry.entryName.endsWith('composer.json')) {
+        const parts = entry.entryName.split('/');
+        if (parts.length <= 2) { // Root or first-level dir
+          const content = entry.getData().toString('utf-8');
+          return JSON.parse(content);
+        }
+      }
+    }
+
+    return null;
+  } catch (error) {
+    return null;
+  }
+}
+
+/**
+ * Calculate SHA-1 hash for ZIP file
+ */
+export async function calculateSha1(data: Buffer): Promise<string> {
+  const crypto = await import('crypto');
+  return crypto.createHash('sha1').update(data).digest('hex');
+}
+
+/**
+ * Parse vendor/package format
+ */
+export function parseVendorPackage(name: string): { vendor: string; package: string } | null {
+  const parts = name.split('/');
+  if (parts.length !== 2) {
+    return null;
+  }
+  return { vendor: parts[0], package: parts[1] };
+}
+
+/**
+ * Generate packages.json root repository file
+ */
+export function generatePackagesJson(
+  registryUrl: string,
+  availablePackages: string[]
+): any {
+  return {
+    'metadata-url': `${registryUrl}/p2/%package%.json`,
+    'available-packages': availablePackages,
+  };
+}
+
+/**
+ * Sort versions in semantic version order
+ */
+export function sortVersions(versions: string[]): string[] {
+  return versions.sort((a, b) => {
+    const aParts = a.replace(/^v/i, '').split(/[.-]/).map(part => {
+      const num = parseInt(part, 10);
+      return isNaN(num) ? part : num;
+    });
+    const bParts = b.replace(/^v/i, '').split(/[.-]/).map(part => {
+      const num = parseInt(part, 10);
+      return isNaN(num) ? part : num;
+    });
+
+    for (let i = 0; i < Math.max(aParts.length, bParts.length); i++) {
+      const aPart = aParts[i] ?? 0;
+      const bPart = bParts[i] ?? 0;
+
+      // Compare numbers numerically, strings lexicographically
+      if (typeof aPart === 'number' && typeof bPart === 'number') {
+        if (aPart !== bPart) {
+          return aPart - bPart;
+        }
+      } else {
+        const aStr = String(aPart);
+        const bStr = String(bPart);
+        if (aStr !== bStr) {
+          return aStr.localeCompare(bStr);
+        }
+      }
+    }
+    return 0;
+  });
+}
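The `normalizeVersion` padding logic above can be checked in isolation; this is the same function reproduced standalone so its behaviour on plain, `v`-prefixed, and pre-release versions is visible at a glance:

```typescript
// normalizeVersion from helpers.composer.ts, reproduced verbatim for testing.
function normalizeVersion(version: string): string {
  // Remove 'v' prefix if present
  let normalized = version.replace(/^v/i, '');

  // Pre-release versions: pad the numeric part before the -/+ separator
  if (normalized.includes('dev') || normalized.includes('alpha') || normalized.includes('beta') || normalized.includes('RC')) {
    const parts = normalized.split(/[-+]/)[0].split('.');
    while (parts.length < 4) {
      parts.push('0');
    }
    return parts.slice(0, 4).join('.');
  }

  // Stable versions: pad to major.minor.patch.build
  const parts = normalized.split('.');
  while (parts.length < 4) {
    parts.push('0');
  }
  return parts.slice(0, 4).join('.');
}

// normalizeVersion('1.0.0')       -> '1.0.0.0'
// normalizeVersion('v2.3.1')      -> '2.3.1.0'
// normalizeVersion('1.0.0-beta1') -> '1.0.0.0'
```

Note the pre-release branch drops the suffix entirely, so `1.0.0-beta1` normalizes the same as `1.0.0`; Composer proper keeps suffix information in its normalized form, so this is a simplification of that scheme.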
8 ts/composer/index.ts Normal file
@@ -0,0 +1,8 @@
+/**
+ * Composer Registry Module
+ * Export all public interfaces, classes, and helpers
+ */
+
+export { ComposerRegistry } from './classes.composerregistry.js';
+export * from './interfaces.composer.js';
+export * from './helpers.composer.js';
111 ts/composer/interfaces.composer.ts Normal file
@@ -0,0 +1,111 @@
+/**
+ * Composer Registry Type Definitions
+ * Compliant with Composer v2 repository API
+ */
+
+/**
+ * Composer package metadata
+ */
+export interface IComposerPackage {
+  name: string; // vendor/package-name
+  version: string; // 1.0.0
+  version_normalized: string; // 1.0.0.0
+  type?: string; // library, project, metapackage
+  description?: string;
+  keywords?: string[];
+  homepage?: string;
+  license?: string[];
+  authors?: IComposerAuthor[];
+  require?: Record<string, string>;
+  'require-dev'?: Record<string, string>;
+  suggest?: Record<string, string>;
+  provide?: Record<string, string>;
+  conflict?: Record<string, string>;
+  replace?: Record<string, string>;
+  autoload?: IComposerAutoload;
+  'autoload-dev'?: IComposerAutoload;
+  dist?: IComposerDist;
+  source?: IComposerSource;
+  time?: string; // ISO 8601 timestamp
+  support?: Record<string, string>;
+  funding?: IComposerFunding[];
+  extra?: Record<string, any>;
+}
+
+/**
+ * Author information
+ */
+export interface IComposerAuthor {
+  name: string;
+  email?: string;
+  homepage?: string;
+  role?: string;
+}
+
+/**
+ * PSR-4/PSR-0 autoloading configuration
+ */
+export interface IComposerAutoload {
+  'psr-4'?: Record<string, string | string[]>;
+  'psr-0'?: Record<string, string | string[]>;
+  classmap?: string[];
+  files?: string[];
+  'exclude-from-classmap'?: string[];
+}
+
+/**
+ * Distribution information (ZIP download)
+ */
+export interface IComposerDist {
+  type: 'zip' | 'tar' | 'phar';
+  url: string;
+  reference?: string; // commit hash or tag
+  shasum?: string; // SHA-1 hash
+}
+
+/**
+ * Source repository information
+ */
+export interface IComposerSource {
+  type: 'git' | 'svn' | 'hg';
+  url: string;
+  reference: string; // commit hash, branch, or tag
+}
+
+/**
+ * Funding information
+ */
+export interface IComposerFunding {
+  type: string; // github, patreon, etc.
+  url: string;
+}
+
+/**
+ * Repository metadata (packages.json)
+ */
+export interface IComposerRepository {
+  packages?: Record<string, Record<string, IComposerPackage>>;
+  'metadata-url'?: string; // /p2/%package%.json
+  'available-packages'?: string[];
+  'available-package-patterns'?: string[];
+  'providers-url'?: string;
+  'notify-batch'?: string;
+  minified?: string; // "composer/2.0"
+}
+
+/**
+ * Package metadata response (/p2/vendor/package.json)
+ */
+export interface IComposerPackageMetadata {
+  packages: Record<string, IComposerPackage[]>;
+  minified?: string;
+  lastModified?: string;
+}
+
+/**
+ * Error structure
+ */
+export interface IComposerError {
+  status: string;
+  message: string;
+}
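For orientation, here is a hand-written example of the root repository document those interfaces describe, shaped like the output of `generatePackagesJson` (the host and package names are made up):

```typescript
// Example IComposerRepository-shaped document, matching the two fields
// generatePackagesJson emits; registry URL and packages are hypothetical.
const packagesJson = {
  'metadata-url': 'https://registry.example.com/composer/p2/%package%.json',
  'available-packages': ['acme/widgets', 'acme/gadgets'],
};

// A Composer v2 client substitutes the package name into metadata-url, e.g.
// acme/widgets -> https://registry.example.com/composer/p2/acme/widgets.json
const metadataUrlFor = (pkg: string) =>
  packagesJson['metadata-url'].replace('%package%', pkg);
```

Listing `available-packages` explicitly lets the client skip metadata requests for packages the registry does not host at all.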
@@ -270,12 +270,200 @@ export class AuthManager {
     this.tokenStore.delete(token);
   }
 
+  // ========================================================================
+  // COMPOSER TOKEN MANAGEMENT
+  // ========================================================================
+
+  /**
+   * Create a Composer token
+   * @param userId - User ID
+   * @param readonly - Whether the token is readonly
+   * @returns Composer UUID token
+   */
+  public async createComposerToken(userId: string, readonly: boolean = false): Promise<string> {
+    const scopes = readonly ? ['composer:*:*:read'] : ['composer:*:*:*'];
+    return this.createUuidToken(userId, 'composer', scopes, readonly);
+  }
+
+  /**
+   * Validate a Composer token
+   * @param token - Composer UUID token
+   * @returns Auth token object or null
+   */
+  public async validateComposerToken(token: string): Promise<IAuthToken | null> {
+    if (!this.isValidUuid(token)) {
+      return null;
+    }
+
+    const authToken = this.tokenStore.get(token);
+    if (!authToken || authToken.type !== 'composer') {
+      return null;
+    }
+
+    // Check expiration if set
+    if (authToken.expiresAt && authToken.expiresAt < new Date()) {
+      this.tokenStore.delete(token);
+      return null;
+    }
+
+    return authToken;
+  }
+
+  /**
+   * Revoke a Composer token
+   * @param token - Composer UUID token
+   */
+  public async revokeComposerToken(token: string): Promise<void> {
+    this.tokenStore.delete(token);
+  }
+
+  // ========================================================================
+  // CARGO TOKEN MANAGEMENT
+  // ========================================================================
+
+  /**
+   * Create a Cargo token
+   * @param userId - User ID
+   * @param readonly - Whether the token is readonly
+   * @returns Cargo UUID token
+   */
+  public async createCargoToken(userId: string, readonly: boolean = false): Promise<string> {
+    const scopes = readonly ? ['cargo:*:*:read'] : ['cargo:*:*:*'];
+    return this.createUuidToken(userId, 'cargo', scopes, readonly);
+  }
+
+  /**
+   * Validate a Cargo token
+   * @param token - Cargo UUID token
+   * @returns Auth token object or null
+   */
+  public async validateCargoToken(token: string): Promise<IAuthToken | null> {
+    if (!this.isValidUuid(token)) {
+      return null;
+    }
+
+    const authToken = this.tokenStore.get(token);
+    if (!authToken || authToken.type !== 'cargo') {
+      return null;
+    }
+
+    // Check expiration if set
+    if (authToken.expiresAt && authToken.expiresAt < new Date()) {
+      this.tokenStore.delete(token);
+      return null;
+    }
+
+    return authToken;
+  }
+
+  /**
+   * Revoke a Cargo token
+   * @param token - Cargo UUID token
+   */
+  public async revokeCargoToken(token: string): Promise<void> {
+    this.tokenStore.delete(token);
+  }
+
+  // ========================================================================
+  // PYPI AUTHENTICATION
+  // ========================================================================
+
+  /**
+   * Create a PyPI token
+   * @param userId - User ID
+   * @param readonly - Whether the token is readonly
+   * @returns PyPI UUID token
+   */
+  public async createPypiToken(userId: string, readonly: boolean = false): Promise<string> {
+    const scopes = readonly ? ['pypi:*:*:read'] : ['pypi:*:*:*'];
+    return this.createUuidToken(userId, 'pypi', scopes, readonly);
+  }
+
+  /**
+   * Validate a PyPI token
+   * @param token - PyPI UUID token
+   * @returns Auth token object or null
+   */
+  public async validatePypiToken(token: string): Promise<IAuthToken | null> {
+    if (!this.isValidUuid(token)) {
+      return null;
+    }
+
+    const authToken = this.tokenStore.get(token);
+    if (!authToken || authToken.type !== 'pypi') {
+      return null;
+    }
+
+    // Check expiration if set
+    if (authToken.expiresAt && authToken.expiresAt < new Date()) {
+      this.tokenStore.delete(token);
+      return null;
+    }
+
+    return authToken;
+  }
+
+  /**
+   * Revoke a PyPI token
+   * @param token - PyPI UUID token
+   */
+  public async revokePypiToken(token: string): Promise<void> {
+    this.tokenStore.delete(token);
+  }
+
+  // ========================================================================
+  // RUBYGEMS AUTHENTICATION
+  // ========================================================================
+
+  /**
+   * Create a RubyGems token
+   * @param userId - User ID
+   * @param readonly - Whether the token is readonly
+   * @returns RubyGems UUID token
|
||||||
|
*/
|
||||||
|
public async createRubyGemsToken(userId: string, readonly: boolean = false): Promise<string> {
|
||||||
|
const scopes = readonly ? ['rubygems:*:*:read'] : ['rubygems:*:*:*'];
|
||||||
|
return this.createUuidToken(userId, 'rubygems', scopes, readonly);
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Validate a RubyGems token
|
||||||
|
* @param token - RubyGems UUID token
|
||||||
|
* @returns Auth token object or null
|
||||||
|
*/
|
||||||
|
public async validateRubyGemsToken(token: string): Promise<IAuthToken | null> {
|
||||||
|
if (!this.isValidUuid(token)) {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
const authToken = this.tokenStore.get(token);
|
||||||
|
if (!authToken || authToken.type !== 'rubygems') {
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Check expiration if set
|
||||||
|
if (authToken.expiresAt && authToken.expiresAt < new Date()) {
|
||||||
|
this.tokenStore.delete(token);
|
||||||
|
return null;
|
||||||
|
}
|
||||||
|
|
||||||
|
return authToken;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Revoke a RubyGems token
|
||||||
|
* @param token - RubyGems UUID token
|
||||||
|
*/
|
||||||
|
public async revokeRubyGemsToken(token: string): Promise<void> {
|
||||||
|
this.tokenStore.delete(token);
|
||||||
|
}
|
||||||
|
|
||||||
// ========================================================================
|
// ========================================================================
|
||||||
// UNIFIED AUTHENTICATION
|
// UNIFIED AUTHENTICATION
|
||||||
// ========================================================================
|
// ========================================================================
|
||||||
|
|
||||||
/**
|
/**
|
||||||
* Validate any token (NPM, Maven, or OCI)
|
* Validate any token (NPM, Maven, OCI, PyPI, RubyGems, Composer, Cargo)
|
||||||
* @param tokenString - Token string (UUID or JWT)
|
* @param tokenString - Token string (UUID or JWT)
|
||||||
* @param protocol - Expected protocol type
|
* @param protocol - Expected protocol type
|
||||||
* @returns Auth token object or null
|
* @returns Auth token object or null
|
||||||
@@ -284,7 +472,7 @@ export class AuthManager {
|
|||||||
tokenString: string,
|
tokenString: string,
|
||||||
protocol?: TRegistryProtocol
|
protocol?: TRegistryProtocol
|
||||||
): Promise<IAuthToken | null> {
|
): Promise<IAuthToken | null> {
|
||||||
// Try UUID-based tokens (NPM, Maven)
|
// Try UUID-based tokens (NPM, Maven, Composer, Cargo, PyPI, RubyGems)
|
||||||
if (this.isValidUuid(tokenString)) {
|
if (this.isValidUuid(tokenString)) {
|
||||||
// Try NPM token
|
// Try NPM token
|
||||||
const npmToken = await this.validateNpmToken(tokenString);
|
const npmToken = await this.validateNpmToken(tokenString);
|
||||||
@@ -297,6 +485,30 @@ export class AuthManager {
|
|||||||
if (mavenToken && (!protocol || protocol === 'maven')) {
|
if (mavenToken && (!protocol || protocol === 'maven')) {
|
||||||
return mavenToken;
|
return mavenToken;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Try Composer token
|
||||||
|
const composerToken = await this.validateComposerToken(tokenString);
|
||||||
|
if (composerToken && (!protocol || protocol === 'composer')) {
|
||||||
|
return composerToken;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try Cargo token
|
||||||
|
const cargoToken = await this.validateCargoToken(tokenString);
|
||||||
|
if (cargoToken && (!protocol || protocol === 'cargo')) {
|
||||||
|
return cargoToken;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try PyPI token
|
||||||
|
const pypiToken = await this.validatePypiToken(tokenString);
|
||||||
|
if (pypiToken && (!protocol || protocol === 'pypi')) {
|
||||||
|
return pypiToken;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Try RubyGems token
|
||||||
|
const rubygemsToken = await this.validateRubyGemsToken(tokenString);
|
||||||
|
if (rubygemsToken && (!protocol || protocol === 'rubygems')) {
|
||||||
|
return rubygemsToken;
|
||||||
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
// Try OCI JWT
|
// Try OCI JWT
|
||||||
|
|||||||
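The fall-through order in the `validateToken()` hunk above can be sketched standalone. This is an editor's illustration with assumed shapes (`Protocol`, `StoredToken`, `validate` are illustrative names, not the library's exports): every UUID-based protocol shares one token store, and an entry matches only when its stored type agrees with the optional expected protocol.

```typescript
// Standalone sketch of the UUID-token fall-through used by validateToken().
// Names here are illustrative, not the actual AuthManager API.
type Protocol = 'npm' | 'maven' | 'composer' | 'cargo' | 'pypi' | 'rubygems';
interface StoredToken { userId: string; type: Protocol; expiresAt?: Date }

const tokenStore = new Map<string, StoredToken>();
const UUID_RE = /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/i;

function validate(tokenString: string, protocol?: Protocol): StoredToken | null {
  if (!UUID_RE.test(tokenString)) return null;           // not a UUID-based token
  const entry = tokenStore.get(tokenString);
  if (!entry) return null;
  if (entry.expiresAt && entry.expiresAt < new Date()) { // expired: drop it
    tokenStore.delete(tokenString);
    return null;
  }
  // Without a protocol hint any stored type matches; with one, types must agree.
  return !protocol || entry.type === protocol ? entry : null;
}

const id = '123e4567-e89b-42d3-a456-426614174000';
tokenStore.set(id, { userId: 'u1', type: 'cargo' });
```

A `'cargo'`-typed entry then validates with a `cargo` hint (or with no hint at all) but not with a `pypi` hint, mirroring the per-protocol `authToken.type !== '...'` guards in the diff.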
@@ -348,6 +348,17 @@ export class RegistryStorage implements IStorageBackend {
     return this.putObject(path, data);
   }

+  /**
+   * Delete Maven metadata (maven-metadata.xml)
+   */
+  public async deleteMavenMetadata(
+    groupId: string,
+    artifactId: string
+  ): Promise<void> {
+    const path = this.getMavenMetadataPath(groupId, artifactId);
+    return this.deleteObject(path);
+  }
+
   /**
    * List Maven versions for an artifact
    * Returns all version directories under the artifact path
@@ -392,4 +403,429 @@ export class RegistryStorage implements IStorageBackend {
     const groupPath = groupId.replace(/\./g, '/');
     return `maven/metadata/${groupPath}/${artifactId}/maven-metadata.xml`;
   }
+
+  // ========================================================================
+  // CARGO-SPECIFIC HELPERS
+  // ========================================================================
+
+  /**
+   * Get Cargo config.json
+   */
+  public async getCargoConfig(): Promise<any | null> {
+    const data = await this.getObject('cargo/config.json');
+    return data ? JSON.parse(data.toString('utf-8')) : null;
+  }
+
+  /**
+   * Store Cargo config.json
+   */
+  public async putCargoConfig(config: any): Promise<void> {
+    const data = Buffer.from(JSON.stringify(config, null, 2), 'utf-8');
+    return this.putObject('cargo/config.json', data, { 'Content-Type': 'application/json' });
+  }
+
+  /**
+   * Get Cargo index file (newline-delimited JSON)
+   */
+  public async getCargoIndex(crateName: string): Promise<any[] | null> {
+    const path = this.getCargoIndexPath(crateName);
+    const data = await this.getObject(path);
+    if (!data) return null;
+
+    // Parse newline-delimited JSON
+    const lines = data.toString('utf-8').split('\n').filter(line => line.trim());
+    return lines.map(line => JSON.parse(line));
+  }
+
+  /**
+   * Store Cargo index file
+   */
+  public async putCargoIndex(crateName: string, entries: any[]): Promise<void> {
+    const path = this.getCargoIndexPath(crateName);
+    // Convert to newline-delimited JSON
+    const data = Buffer.from(entries.map(e => JSON.stringify(e)).join('\n') + '\n', 'utf-8');
+    return this.putObject(path, data, { 'Content-Type': 'text/plain' });
+  }
+
+  /**
+   * Get Cargo .crate file
+   */
+  public async getCargoCrate(crateName: string, version: string): Promise<Buffer | null> {
+    const path = this.getCargoCratePath(crateName, version);
+    return this.getObject(path);
+  }
+
+  /**
+   * Store Cargo .crate file
+   */
+  public async putCargoCrate(
+    crateName: string,
+    version: string,
+    crateFile: Buffer
+  ): Promise<void> {
+    const path = this.getCargoCratePath(crateName, version);
+    return this.putObject(path, crateFile, { 'Content-Type': 'application/gzip' });
+  }
+
+  /**
+   * Check if Cargo crate exists
+   */
+  public async cargoCrateExists(crateName: string, version: string): Promise<boolean> {
+    const path = this.getCargoCratePath(crateName, version);
+    return this.objectExists(path);
+  }
+
+  /**
+   * Delete Cargo crate (for cleanup, not for unpublishing)
+   */
+  public async deleteCargoCrate(crateName: string, version: string): Promise<void> {
+    const path = this.getCargoCratePath(crateName, version);
+    return this.deleteObject(path);
+  }
+
+  // ========================================================================
+  // CARGO PATH HELPERS
+  // ========================================================================
+
+  private getCargoIndexPath(crateName: string): string {
+    const lower = crateName.toLowerCase();
+    const len = lower.length;
+
+    if (len === 1) {
+      return `cargo/index/1/${lower}`;
+    } else if (len === 2) {
+      return `cargo/index/2/${lower}`;
+    } else if (len === 3) {
+      return `cargo/index/3/${lower.charAt(0)}/${lower}`;
+    } else {
+      // 4+ characters: {first-two}/{second-two}/{name}
+      const prefix1 = lower.substring(0, 2);
+      const prefix2 = lower.substring(2, 4);
+      return `cargo/index/${prefix1}/${prefix2}/${lower}`;
+    }
+  }
+
+  private getCargoCratePath(crateName: string, version: string): string {
+    return `cargo/crates/${crateName}/${crateName}-${version}.crate`;
+  }
+
+  // ========================================================================
+  // COMPOSER-SPECIFIC HELPERS
+  // ========================================================================
+
+  /**
+   * Get Composer package metadata
+   */
+  public async getComposerPackageMetadata(vendorPackage: string): Promise<any | null> {
+    const path = this.getComposerMetadataPath(vendorPackage);
+    const data = await this.getObject(path);
+    return data ? JSON.parse(data.toString('utf-8')) : null;
+  }
+
+  /**
+   * Store Composer package metadata
+   */
+  public async putComposerPackageMetadata(vendorPackage: string, metadata: any): Promise<void> {
+    const path = this.getComposerMetadataPath(vendorPackage);
+    const data = Buffer.from(JSON.stringify(metadata, null, 2), 'utf-8');
+    return this.putObject(path, data, { 'Content-Type': 'application/json' });
+  }
+
+  /**
+   * Get Composer package ZIP
+   */
+  public async getComposerPackageZip(vendorPackage: string, reference: string): Promise<Buffer | null> {
+    const path = this.getComposerZipPath(vendorPackage, reference);
+    return this.getObject(path);
+  }
+
+  /**
+   * Store Composer package ZIP
+   */
+  public async putComposerPackageZip(vendorPackage: string, reference: string, zipData: Buffer): Promise<void> {
+    const path = this.getComposerZipPath(vendorPackage, reference);
+    return this.putObject(path, zipData, { 'Content-Type': 'application/zip' });
+  }
+
+  /**
+   * Check if Composer package metadata exists
+   */
+  public async composerPackageMetadataExists(vendorPackage: string): Promise<boolean> {
+    const path = this.getComposerMetadataPath(vendorPackage);
+    return this.objectExists(path);
+  }
+
+  /**
+   * Delete Composer package metadata
+   */
+  public async deleteComposerPackageMetadata(vendorPackage: string): Promise<void> {
+    const path = this.getComposerMetadataPath(vendorPackage);
+    return this.deleteObject(path);
+  }
+
+  /**
+   * Delete Composer package ZIP
+   */
+  public async deleteComposerPackageZip(vendorPackage: string, reference: string): Promise<void> {
+    const path = this.getComposerZipPath(vendorPackage, reference);
+    return this.deleteObject(path);
+  }
+
+  /**
+   * List all Composer packages
+   */
+  public async listComposerPackages(): Promise<string[]> {
+    const prefix = 'composer/packages/';
+    const objects = await this.listObjects(prefix);
+    const packages = new Set<string>();
+
+    // Extract vendor/package from paths like: composer/packages/vendor/package/metadata.json
+    for (const obj of objects) {
+      const match = obj.match(/^composer\/packages\/([^\/]+\/[^\/]+)\/metadata\.json$/);
+      if (match) {
+        packages.add(match[1]);
+      }
+    }
+
+    return Array.from(packages).sort();
+  }
+
+  // ========================================================================
+  // COMPOSER PATH HELPERS
+  // ========================================================================
+
+  private getComposerMetadataPath(vendorPackage: string): string {
+    return `composer/packages/${vendorPackage}/metadata.json`;
+  }
+
+  private getComposerZipPath(vendorPackage: string, reference: string): string {
+    return `composer/packages/${vendorPackage}/${reference}.zip`;
+  }
+
+  // ========================================================================
+  // PYPI STORAGE METHODS
+  // ========================================================================
+
+  /**
+   * Get PyPI package metadata
+   */
+  public async getPypiPackageMetadata(packageName: string): Promise<any | null> {
+    const path = this.getPypiMetadataPath(packageName);
+    const data = await this.getObject(path);
+    return data ? JSON.parse(data.toString('utf-8')) : null;
+  }
+
+  /**
+   * Store PyPI package metadata
+   */
+  public async putPypiPackageMetadata(packageName: string, metadata: any): Promise<void> {
+    const path = this.getPypiMetadataPath(packageName);
+    const data = Buffer.from(JSON.stringify(metadata, null, 2), 'utf-8');
+    return this.putObject(path, data, { 'Content-Type': 'application/json' });
+  }
+
+  /**
+   * Check if PyPI package metadata exists
+   */
+  public async pypiPackageMetadataExists(packageName: string): Promise<boolean> {
+    const path = this.getPypiMetadataPath(packageName);
+    return this.objectExists(path);
+  }
+
+  /**
+   * Delete PyPI package metadata
+   */
+  public async deletePypiPackageMetadata(packageName: string): Promise<void> {
+    const path = this.getPypiMetadataPath(packageName);
+    return this.deleteObject(path);
+  }
+
+  /**
+   * Get PyPI Simple API index (HTML)
+   */
+  public async getPypiSimpleIndex(packageName: string): Promise<string | null> {
+    const path = this.getPypiSimpleIndexPath(packageName);
+    const data = await this.getObject(path);
+    return data ? data.toString('utf-8') : null;
+  }
+
+  /**
+   * Store PyPI Simple API index (HTML)
+   */
+  public async putPypiSimpleIndex(packageName: string, html: string): Promise<void> {
+    const path = this.getPypiSimpleIndexPath(packageName);
+    const data = Buffer.from(html, 'utf-8');
+    return this.putObject(path, data, { 'Content-Type': 'text/html; charset=utf-8' });
+  }
+
+  /**
+   * Get PyPI root Simple API index (HTML)
+   */
+  public async getPypiSimpleRootIndex(): Promise<string | null> {
+    const path = this.getPypiSimpleRootIndexPath();
+    const data = await this.getObject(path);
+    return data ? data.toString('utf-8') : null;
+  }
+
+  /**
+   * Store PyPI root Simple API index (HTML)
+   */
+  public async putPypiSimpleRootIndex(html: string): Promise<void> {
+    const path = this.getPypiSimpleRootIndexPath();
+    const data = Buffer.from(html, 'utf-8');
+    return this.putObject(path, data, { 'Content-Type': 'text/html; charset=utf-8' });
+  }
+
+  /**
+   * Get PyPI package file (wheel, sdist)
+   */
+  public async getPypiPackageFile(packageName: string, filename: string): Promise<Buffer | null> {
+    const path = this.getPypiPackageFilePath(packageName, filename);
+    return this.getObject(path);
+  }
+
+  /**
+   * Store PyPI package file (wheel, sdist)
+   */
+  public async putPypiPackageFile(
+    packageName: string,
+    filename: string,
+    data: Buffer
+  ): Promise<void> {
+    const path = this.getPypiPackageFilePath(packageName, filename);
+    return this.putObject(path, data, { 'Content-Type': 'application/octet-stream' });
+  }
+
+  /**
+   * Check if PyPI package file exists
+   */
+  public async pypiPackageFileExists(packageName: string, filename: string): Promise<boolean> {
+    const path = this.getPypiPackageFilePath(packageName, filename);
+    return this.objectExists(path);
+  }
+
+  /**
+   * Delete PyPI package file
+   */
+  public async deletePypiPackageFile(packageName: string, filename: string): Promise<void> {
+    const path = this.getPypiPackageFilePath(packageName, filename);
+    return this.deleteObject(path);
+  }
+
+  /**
+   * List all PyPI packages
+   */
+  public async listPypiPackages(): Promise<string[]> {
+    const prefix = 'pypi/metadata/';
+    const objects = await this.listObjects(prefix);
+    const packages = new Set<string>();
+
+    // Extract package names from paths like: pypi/metadata/package-name/metadata.json
+    for (const obj of objects) {
+      const match = obj.match(/^pypi\/metadata\/([^\/]+)\/metadata\.json$/);
+      if (match) {
+        packages.add(match[1]);
+      }
+    }
+
+    return Array.from(packages).sort();
+  }
+
+  /**
+   * List all versions of a PyPI package
+   */
+  public async listPypiPackageVersions(packageName: string): Promise<string[]> {
+    const prefix = `pypi/packages/${packageName}/`;
+    const objects = await this.listObjects(prefix);
+    const versions = new Set<string>();
+
+    // Extract versions from filenames
+    for (const obj of objects) {
+      const filename = obj.split('/').pop();
+      if (!filename) continue;
+
+      // Extract version from wheel filename: package-1.0.0-py3-none-any.whl
+      // or sdist filename: package-1.0.0.tar.gz
+      const wheelMatch = filename.match(/^[^-]+-([^-]+)-.*\.whl$/);
+      const sdistMatch = filename.match(/^[^-]+-(.+)\.(tar\.gz|zip)$/);
+
+      if (wheelMatch) versions.add(wheelMatch[1]);
+      else if (sdistMatch) versions.add(sdistMatch[1]);
+    }
+
+    return Array.from(versions).sort();
+  }
+
+  /**
+   * Delete entire PyPI package (all versions and files)
+   */
+  public async deletePypiPackage(packageName: string): Promise<void> {
+    // Delete metadata
+    await this.deletePypiPackageMetadata(packageName);
+
+    // Delete Simple API index
+    const simpleIndexPath = this.getPypiSimpleIndexPath(packageName);
+    try {
+      await this.deleteObject(simpleIndexPath);
+    } catch (error) {
+      // Ignore if doesn't exist
+    }
+
+    // Delete all package files
+    const prefix = `pypi/packages/${packageName}/`;
+    const objects = await this.listObjects(prefix);
+    for (const obj of objects) {
+      await this.deleteObject(obj);
+    }
+  }
+
+  /**
+   * Delete specific version of a PyPI package
+   */
+  public async deletePypiPackageVersion(packageName: string, version: string): Promise<void> {
+    const prefix = `pypi/packages/${packageName}/`;
+    const objects = await this.listObjects(prefix);
+
+    // Delete all files matching this version
+    for (const obj of objects) {
+      const filename = obj.split('/').pop();
+      if (!filename) continue;
+
+      // Check if filename contains this version
+      const wheelMatch = filename.match(/^[^-]+-([^-]+)-.*\.whl$/);
+      const sdistMatch = filename.match(/^[^-]+-(.+)\.(tar\.gz|zip)$/);
+
+      const fileVersion = wheelMatch?.[1] || sdistMatch?.[1];
+      if (fileVersion === version) {
+        await this.deleteObject(obj);
+      }
+    }
+
+    // Update metadata to remove this version
+    const metadata = await this.getPypiPackageMetadata(packageName);
+    if (metadata && metadata.versions) {
+      delete metadata.versions[version];
+      await this.putPypiPackageMetadata(packageName, metadata);
+    }
+  }
+
+  // ========================================================================
+  // PYPI PATH HELPERS
+  // ========================================================================
+
+  private getPypiMetadataPath(packageName: string): string {
+    return `pypi/metadata/${packageName}/metadata.json`;
+  }
+
+  private getPypiSimpleIndexPath(packageName: string): string {
+    return `pypi/simple/${packageName}/index.html`;
+  }
+
+  private getPypiSimpleRootIndexPath(): string {
+    return `pypi/simple/index.html`;
+  }
+
+  private getPypiPackageFilePath(packageName: string, filename: string): string {
+    return `pypi/packages/${packageName}/${filename}`;
+  }
 }
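The `getCargoIndexPath` helper above follows the crates.io index sharding scheme (`1/`, `2/`, `3/{first-char}/`, then `{first-two}/{second-two}/{name}`). A minimal standalone sketch of that layout, with the `cargo/index/` prefix taken from the diff and the function name being the editor's:

```typescript
// Illustrative re-implementation of the crates.io-style index sharding used by
// getCargoIndexPath above; crate names are lowercased before sharding.
function cargoIndexPath(crateName: string): string {
  const lower = crateName.toLowerCase();
  if (lower.length === 1) return `cargo/index/1/${lower}`;
  if (lower.length === 2) return `cargo/index/2/${lower}`;
  if (lower.length === 3) return `cargo/index/3/${lower.charAt(0)}/${lower}`;
  // 4+ characters: {first-two}/{second-two}/{name}
  return `cargo/index/${lower.substring(0, 2)}/${lower.substring(2, 4)}/${lower}`;
}
```

For example, `serde` shards to `cargo/index/se/rd/serde`, which keeps any single index directory from accumulating every crate.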
@@ -5,7 +5,7 @@
 /**
  * Registry protocol types
  */
-export type TRegistryProtocol = 'oci' | 'npm' | 'maven';
+export type TRegistryProtocol = 'oci' | 'npm' | 'maven' | 'cargo' | 'composer' | 'pypi' | 'rubygems';

 /**
  * Unified action types across protocols
@@ -70,6 +70,16 @@ export interface IAuthConfig {
     realm: string;
     service: string;
   };
+  /** PyPI token settings */
+  pypiTokens?: {
+    enabled: boolean;
+    defaultReadonly?: boolean;
+  };
+  /** RubyGems token settings */
+  rubygemsTokens?: {
+    enabled: boolean;
+    defaultReadonly?: boolean;
+  };
 }

 /**
@@ -90,6 +100,10 @@ export interface IRegistryConfig {
   oci?: IProtocolConfig;
   npm?: IProtocolConfig;
   maven?: IProtocolConfig;
+  cargo?: IProtocolConfig;
+  composer?: IProtocolConfig;
+  pypi?: IProtocolConfig;
+  rubygems?: IProtocolConfig;
 }

 /**
@@ -1,6 +1,6 @@
 /**
  * @push.rocks/smartregistry
- * Composable registry supporting OCI, NPM, and Maven protocols
+ * Composable registry supporting OCI, NPM, Maven, Cargo, and Composer protocols
 */

 // Main orchestrator
@@ -17,3 +17,9 @@ export * from './npm/index.js';

 // Maven Registry
 export * from './maven/index.js';
+
+// Cargo Registry
+export * from './cargo/index.js';
+
+// Composer Registry
+export * from './composer/index.js';
@@ -85,7 +85,7 @@ export class MavenRegistry extends BaseRegistry {
     // Check if it's a checksum file
     if (coordinate.extension === 'md5' || coordinate.extension === 'sha1' ||
         coordinate.extension === 'sha256' || coordinate.extension === 'sha512') {
-      return this.handleChecksumRequest(context.method, coordinate, token);
+      return this.handleChecksumRequest(context.method, coordinate, token, path);
     }

     // Handle artifact requests (JAR, POM, WAR, etc.)
@@ -118,17 +118,7 @@ export class MavenRegistry extends BaseRegistry {
     switch (method) {
       case 'GET':
       case 'HEAD':
-        // Read permission required
-        if (!await this.checkPermission(token, resource, 'read')) {
-          return {
-            status: 401,
-            headers: {
-              'WWW-Authenticate': `Bearer realm="${this.basePath}",service="maven-registry"`,
-            },
-            body: { error: 'UNAUTHORIZED', message: 'Authentication required' },
-          };
-        }
-
+        // Maven repositories typically allow anonymous reads
         return method === 'GET'
           ? this.getArtifact(groupId, artifactId, version, filename)
           : this.headArtifact(groupId, artifactId, version, filename);
@@ -181,24 +171,15 @@ export class MavenRegistry extends BaseRegistry {
   private async handleChecksumRequest(
     method: string,
     coordinate: IMavenCoordinate,
-    token: IAuthToken | null
+    token: IAuthToken | null,
+    path: string
   ): Promise<IResponse> {
     const { groupId, artifactId, version, extension } = coordinate;
     const resource = `${groupId}:${artifactId}`;

-    // Checksums follow the same permissions as their artifacts
+    // Checksums follow the same permissions as their artifacts (public read)
     if (method === 'GET' || method === 'HEAD') {
-      if (!await this.checkPermission(token, resource, 'read')) {
-        return {
-          status: 401,
-          headers: {
-            'WWW-Authenticate': `Bearer realm="${this.basePath}",service="maven-registry"`,
-          },
-          body: { error: 'UNAUTHORIZED', message: 'Authentication required' },
-        };
-      }
-
-      return this.getChecksum(groupId, artifactId, version, coordinate);
+      return this.getChecksum(groupId, artifactId, version, coordinate, path);
     }

     return {
@@ -402,9 +383,14 @@ export class MavenRegistry extends BaseRegistry {
     groupId: string,
     artifactId: string,
     version: string,
-    coordinate: IMavenCoordinate
+    coordinate: IMavenCoordinate,
+    fullPath: string
   ): Promise<IResponse> {
-    const checksumFilename = buildFilename(coordinate);
+    // Extract the filename from the full path (last component)
+    // The fullPath might be something like /com/example/test/test-artifact/1.0.0/test-artifact-1.0.0.jar.md5
+    const pathParts = fullPath.split('/');
+    const checksumFilename = pathParts[pathParts.length - 1];
+
     const data = await this.storage.getMavenArtifact(groupId, artifactId, version, checksumFilename);

     if (!data) {
@@ -567,10 +553,8 @@ export class MavenRegistry extends BaseRegistry {
     const xml = generateMetadataXml(metadata);
     await this.storage.putMavenMetadata(groupId, artifactId, Buffer.from(xml, 'utf-8'));

-    // Also store checksums for metadata
-    const checksums = await calculateChecksums(Buffer.from(xml, 'utf-8'));
-    const metadataFilename = 'maven-metadata.xml';
-    await this.storeChecksums(groupId, artifactId, '', metadataFilename, checksums);
+    // Note: Checksums for maven-metadata.xml are optional and not critical
+    // They would need special handling since metadata uses a different storage path
   }

   // ========================================================================
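The checksum handling changed above can be sketched in isolation: the checksum file's name is simply the last component of the request path, and a trailing `.md5`/`.sha1`/`.sha256`/`.sha512` marks a file as a checksum rather than an artifact. Helper names here are the editor's, not the registry's:

```typescript
// Standalone sketch of the checksum-path handling from the MavenRegistry diff.
const CHECKSUM_EXTENSIONS = ['md5', 'sha1', 'sha256', 'sha512'];

// The checksum filename is the last path component, e.g.
// /com/example/test-artifact/1.0.0/test-artifact-1.0.0.jar.md5
//   -> test-artifact-1.0.0.jar.md5
function checksumFilenameFromPath(fullPath: string): string {
  const parts = fullPath.split('/');
  return parts[parts.length - 1];
}

// A filename is a checksum when its final (double) extension is in the list,
// mirroring the parseFilename() change below.
function isChecksumFilename(filename: string): boolean {
  const ext = filename.substring(filename.lastIndexOf('.') + 1);
  return CHECKSUM_EXTENSIONS.includes(ext);
}
```

This is why `handleChecksumRequest` now receives the full request path: `buildFilename(coordinate)` cannot reconstruct the double extension, but the path already carries it.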
@@ -65,22 +65,35 @@ export function pathToGAV(path: string): IMavenCoordinate | null {
 /**
  * Parse Maven artifact filename
  * Example: my-lib-1.0.0-sources.jar → {classifier: 'sources', extension: 'jar'}
+ * Example: my-lib-1.0.0.jar.md5 → {extension: 'md5'}
  */
 export function parseFilename(
   filename: string,
   artifactId: string,
   version: string
 ): { classifier?: string; extension: string } | null {
-  // Expected format: {artifactId}-{version}[-{classifier}].{extension}
+  // Expected format: {artifactId}-{version}[-{classifier}].{extension}[.checksum]
   const prefix = `${artifactId}-${version}`;

   if (!filename.startsWith(prefix)) {
     return null;
   }

-  const remainder = filename.substring(prefix.length);
+  let remainder = filename.substring(prefix.length);

-  // Check for classifier
+  // Check if this is a checksum file (double extension like .jar.md5)
+  const checksumExtensions = ['md5', 'sha1', 'sha256', 'sha512'];
+  const lastDotIndex = remainder.lastIndexOf('.');
+  if (lastDotIndex !== -1) {
+    const possibleChecksum = remainder.substring(lastDotIndex + 1);
+    if (checksumExtensions.includes(possibleChecksum)) {
+      // This is a checksum file - just return the checksum extension
+      // The base artifact extension doesn't matter for checksum retrieval
+      return { extension: possibleChecksum };
+    }
+  }
+
+  // Regular artifact file parsing
   const dotIndex = remainder.lastIndexOf('.');
   if (dotIndex === -1) {
     return null; // No extension
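The checksum-detection branch added in the hunk above can be exercised on its own. The sketch below is a standalone re-implementation of just that branch (the `detectChecksumExtension` name is ours, not an export of the module):

```typescript
// Mirrors the checksum-file detection added to parseFilename above:
// a filename ending in .md5/.sha1/.sha256/.sha512 is treated as a checksum
// request, and only the checksum extension is reported.
const checksumExtensions = ['md5', 'sha1', 'sha256', 'sha512'];

function detectChecksumExtension(
  filename: string,
  artifactId: string,
  version: string
): string | null {
  const prefix = `${artifactId}-${version}`;
  if (!filename.startsWith(prefix)) return null;
  const remainder = filename.substring(prefix.length);
  const lastDotIndex = remainder.lastIndexOf('.');
  if (lastDotIndex === -1) return null;
  const possibleChecksum = remainder.substring(lastDotIndex + 1);
  return checksumExtensions.includes(possibleChecksum) ? possibleChecksum : null;
}

console.log(detectChecksumExtension('my-lib-1.0.0.jar.md5', 'my-lib', '1.0.0')); // 'md5'
console.log(detectChecksumExtension('my-lib-1.0.0.jar', 'my-lib', '1.0.0'));     // null
```

Note that for a classifier artifact such as `my-lib-1.0.0-sources.jar.sha1` this still returns just `sha1`, which is the behavior the new comments call out: the base extension is irrelevant when serving a checksum.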
564	ts/pypi/classes.pypiregistry.ts	Normal file
@@ -0,0 +1,564 @@
import { Smartlog } from '@push.rocks/smartlog';
import { BaseRegistry } from '../core/classes.baseregistry.js';
import { RegistryStorage } from '../core/classes.registrystorage.js';
import { AuthManager } from '../core/classes.authmanager.js';
import type { IRequestContext, IResponse, IAuthToken } from '../core/interfaces.core.js';
import type {
  IPypiPackageMetadata,
  IPypiFile,
  IPypiError,
  IPypiUploadResponse,
} from './interfaces.pypi.js';
import * as helpers from './helpers.pypi.js';

/**
 * PyPI registry implementation
 * Implements PEP 503 (Simple API), PEP 691 (JSON API), and the legacy upload API
 */
export class PypiRegistry extends BaseRegistry {
  private storage: RegistryStorage;
  private authManager: AuthManager;
  private basePath: string = '/pypi';
  private registryUrl: string;
  private logger: Smartlog;

  constructor(
    storage: RegistryStorage,
    authManager: AuthManager,
    basePath: string = '/pypi',
    registryUrl: string = 'http://localhost:5000'
  ) {
    super();
    this.storage = storage;
    this.authManager = authManager;
    this.basePath = basePath;
    this.registryUrl = registryUrl;

    // Initialize logger
    this.logger = new Smartlog({
      logContext: {
        company: 'push.rocks',
        companyunit: 'smartregistry',
        containerName: 'pypi-registry',
        environment: (process.env.NODE_ENV as any) || 'development',
        runtime: 'node',
        zone: 'pypi'
      }
    });
    this.logger.enableConsole();
  }

  public async init(): Promise<void> {
    // Initialize the root Simple API index if it doesn't exist yet
    const existingIndex = await this.storage.getPypiSimpleRootIndex();
    if (!existingIndex) {
      const html = helpers.generateSimpleRootHtml([]);
      await this.storage.putPypiSimpleRootIndex(html);
      this.logger.log('info', 'Initialized PyPI root index');
    }
  }

  public getBasePath(): string {
    return this.basePath;
  }

  public async handleRequest(context: IRequestContext): Promise<IResponse> {
    let path = context.path.replace(this.basePath, '');

    // Also handle the /simple path prefix
    if (path.startsWith('/simple')) {
      path = path.replace('/simple', '');
      return this.handleSimpleRequest(path, context);
    }

    // Extract token (Basic Auth or Bearer)
    const token = await this.extractToken(context);

    this.logger.log('debug', `handleRequest: ${context.method} ${path}`, {
      method: context.method,
      path,
      hasAuth: !!token
    });

    // Root upload endpoint (POST /)
    if ((path === '/' || path === '') && context.method === 'POST') {
      return this.handleUpload(context, token);
    }

    // Package metadata JSON API: GET /pypi/{package}/json
    const jsonMatch = path.match(/^\/pypi\/([^\/]+)\/json$/);
    if (jsonMatch && context.method === 'GET') {
      return this.handlePackageJson(jsonMatch[1]);
    }

    // Version-specific JSON API: GET /pypi/{package}/{version}/json
    const versionJsonMatch = path.match(/^\/pypi\/([^\/]+)\/([^\/]+)\/json$/);
    if (versionJsonMatch && context.method === 'GET') {
      return this.handleVersionJson(versionJsonMatch[1], versionJsonMatch[2]);
    }

    // Package file download: GET /packages/{package}/{filename}
    const downloadMatch = path.match(/^\/packages\/([^\/]+)\/(.+)$/);
    if (downloadMatch && context.method === 'GET') {
      return this.handleDownload(downloadMatch[1], downloadMatch[2]);
    }

    // Delete package: DELETE /packages/{package}
    if (path.match(/^\/packages\/([^\/]+)$/) && context.method === 'DELETE') {
      const packageName = path.match(/^\/packages\/([^\/]+)$/)?.[1];
      return this.handleDeletePackage(packageName!, token);
    }

    // Delete version: DELETE /packages/{package}/{version}
    const deleteVersionMatch = path.match(/^\/packages\/([^\/]+)\/([^\/]+)$/);
    if (deleteVersionMatch && context.method === 'DELETE') {
      return this.handleDeleteVersion(deleteVersionMatch[1], deleteVersionMatch[2], token);
    }

    return {
      status: 404,
      headers: { 'Content-Type': 'application/json' },
      body: Buffer.from(JSON.stringify({ message: 'Not Found' })),
    };
  }

  /**
   * Check if the token has permission for the resource
   */
  protected async checkPermission(
    token: IAuthToken | null,
    resource: string,
    action: string
  ): Promise<boolean> {
    if (!token) return false;
    return this.authManager.authorize(token, `pypi:package:${resource}`, action);
  }

  /**
   * Handle Simple API requests (PEP 503 HTML or PEP 691 JSON)
   */
  private async handleSimpleRequest(path: string, context: IRequestContext): Promise<IResponse> {
    // Ensure the path ends with / (PEP 503 requirement)
    if (!path.endsWith('/') && !path.includes('.')) {
      return {
        status: 301,
        headers: { 'Location': `${this.basePath}/simple${path}/` },
        body: Buffer.from(''),
      };
    }

    // Root index: /simple/
    if (path === '/' || path === '') {
      return this.handleSimpleRoot(context);
    }

    // Package index: /simple/{package}/
    const packageMatch = path.match(/^\/([^\/]+)\/$/);
    if (packageMatch) {
      return this.handleSimplePackage(packageMatch[1], context);
    }

    return {
      status: 404,
      headers: { 'Content-Type': 'text/html; charset=utf-8' },
      body: Buffer.from('<html><body><h1>404 Not Found</h1></body></html>'),
    };
  }

  /**
   * Handle the Simple API root index
   * Returns HTML (PEP 503) or JSON (PEP 691) based on the Accept header
   */
  private async handleSimpleRoot(context: IRequestContext): Promise<IResponse> {
    const acceptHeader = context.headers['accept'] || context.headers['Accept'] || '';
    const preferJson = acceptHeader.includes('application/vnd.pypi.simple') &&
      acceptHeader.includes('json');

    const packages = await this.storage.listPypiPackages();

    if (preferJson) {
      // PEP 691: JSON response
      const response = helpers.generateJsonRootResponse(packages);
      return {
        status: 200,
        headers: {
          'Content-Type': 'application/vnd.pypi.simple.v1+json',
          'Cache-Control': 'public, max-age=600'
        },
        body: Buffer.from(JSON.stringify(response)),
      };
    } else {
      // PEP 503: HTML response
      const html = helpers.generateSimpleRootHtml(packages);

      // Update the stored index
      await this.storage.putPypiSimpleRootIndex(html);

      return {
        status: 200,
        headers: {
          'Content-Type': 'text/html; charset=utf-8',
          'Cache-Control': 'public, max-age=600'
        },
        body: Buffer.from(html),
      };
    }
  }

  /**
   * Handle the Simple API package index
   * Returns HTML (PEP 503) or JSON (PEP 691) based on the Accept header
   */
  private async handleSimplePackage(packageName: string, context: IRequestContext): Promise<IResponse> {
    const normalized = helpers.normalizePypiPackageName(packageName);

    // Get package metadata
    const metadata = await this.storage.getPypiPackageMetadata(normalized);
    if (!metadata) {
      return {
        status: 404,
        headers: { 'Content-Type': 'text/html; charset=utf-8' },
        body: Buffer.from('<html><body><h1>404 Not Found</h1></body></html>'),
      };
    }

    // Build the file list from all versions
    const files: IPypiFile[] = [];
    for (const [version, versionMeta] of Object.entries(metadata.versions || {})) {
      for (const file of (versionMeta as any).files || []) {
        files.push({
          filename: file.filename,
          url: `${this.registryUrl}/pypi/packages/${normalized}/${file.filename}`,
          hashes: file.hashes,
          'requires-python': file['requires-python'],
          yanked: file.yanked || (versionMeta as any).yanked,
          size: file.size,
          'upload-time': file['upload-time'],
        });
      }
    }

    const acceptHeader = context.headers['accept'] || context.headers['Accept'] || '';
    const preferJson = acceptHeader.includes('application/vnd.pypi.simple') &&
      acceptHeader.includes('json');

    if (preferJson) {
      // PEP 691: JSON response
      const response = helpers.generateJsonPackageResponse(normalized, files);
      return {
        status: 200,
        headers: {
          'Content-Type': 'application/vnd.pypi.simple.v1+json',
          'Cache-Control': 'public, max-age=300'
        },
        body: Buffer.from(JSON.stringify(response)),
      };
    } else {
      // PEP 503: HTML response
      const html = helpers.generateSimplePackageHtml(normalized, files, this.registryUrl);

      // Update the stored index
      await this.storage.putPypiSimpleIndex(normalized, html);

      return {
        status: 200,
        headers: {
          'Content-Type': 'text/html; charset=utf-8',
          'Cache-Control': 'public, max-age=300'
        },
        body: Buffer.from(html),
      };
    }
  }

  /**
   * Extract the authentication token from the request
   */
  private async extractToken(context: IRequestContext): Promise<IAuthToken | null> {
    const authHeader = context.headers['authorization'] || context.headers['Authorization'];
    if (!authHeader) return null;

    // Handle Basic Auth (username:password or __token__:token)
    if (authHeader.startsWith('Basic ')) {
      const base64 = authHeader.substring(6);
      const decoded = Buffer.from(base64, 'base64').toString('utf-8');
      const [username, password] = decoded.split(':');

      // PyPI token authentication: username = __token__
      if (username === '__token__') {
        return this.authManager.validateToken(password, 'pypi');
      }

      // Username/password authentication would need a user lookup;
      // not implemented for now
      return null;
    }

    // Handle Bearer token
    if (authHeader.startsWith('Bearer ')) {
      const token = authHeader.substring(7);
      return this.authManager.validateToken(token, 'pypi');
    }

    return null;
  }

  /**
   * Handle package upload (multipart/form-data)
   * POST / with :action=file_upload
   */
  private async handleUpload(context: IRequestContext, token: IAuthToken | null): Promise<IResponse> {
    if (!token) {
      return {
        status: 401,
        headers: {
          'Content-Type': 'application/json',
          'WWW-Authenticate': 'Basic realm="PyPI"'
        },
        body: Buffer.from(JSON.stringify({ message: 'Authentication required' })),
      };
    }

    try {
      // Parse multipart form data (context.body should already be parsed by the server)
      const formData = context.body as any; // Assuming parsed multipart data

      if (!formData || formData[':action'] !== 'file_upload') {
        return this.errorResponse(400, 'Invalid upload request');
      }

      // Extract required fields
      const packageName = formData.name;
      const version = formData.version;
      const filename = formData.content?.filename;
      const fileData = formData.content?.data as Buffer;
      const filetype = formData.filetype; // 'bdist_wheel' or 'sdist'
      const pyversion = formData.pyversion;

      if (!packageName || !version || !filename || !fileData) {
        return this.errorResponse(400, 'Missing required fields');
      }

      // Validate the package name
      if (!helpers.isValidPackageName(packageName)) {
        return this.errorResponse(400, 'Invalid package name');
      }

      const normalized = helpers.normalizePypiPackageName(packageName);

      // Check permission
      if (!(await this.checkPermission(token, normalized, 'write'))) {
        return this.errorResponse(403, 'Insufficient permissions');
      }

      // Calculate hashes
      const hashes: Record<string, string> = {};

      if (formData.sha256_digest) {
        hashes.sha256 = formData.sha256_digest;
      } else {
        hashes.sha256 = await helpers.calculateHash(fileData, 'sha256');
      }

      if (formData.md5_digest) {
        // PyPI sends md5_digest urlsafe-base64 encoded; recompute a hex digest from the data
        hashes.md5 = await helpers.calculateHash(fileData, 'md5');
      }

      if (formData.blake2_256_digest) {
        hashes.blake2b = formData.blake2_256_digest;
      }

      // Store the file
      await this.storage.putPypiPackageFile(normalized, filename, fileData);

      // Update metadata
      let metadata = await this.storage.getPypiPackageMetadata(normalized);
      if (!metadata) {
        metadata = {
          name: normalized,
          versions: {},
        };
      }

      if (!metadata.versions[version]) {
        metadata.versions[version] = {
          version,
          files: [],
        };
      }

      // Add the file to the version
      metadata.versions[version].files.push({
        filename,
        path: `pypi/packages/${normalized}/${filename}`,
        filetype,
        python_version: pyversion,
        hashes,
        size: fileData.length,
        'requires-python': formData.requires_python,
        'upload-time': new Date().toISOString(),
        'uploaded-by': token.userId,
      });

      // Store core metadata if provided
      if (formData.summary || formData.description) {
        metadata.versions[version].metadata = helpers.extractCoreMetadata(formData);
      }

      metadata['last-modified'] = new Date().toISOString();
      await this.storage.putPypiPackageMetadata(normalized, metadata);

      this.logger.log('info', `Package uploaded: ${normalized} ${version}`, {
        filename,
        size: fileData.length
      });

      return {
        status: 200,
        headers: { 'Content-Type': 'application/json' },
        body: Buffer.from(JSON.stringify({
          message: 'Package uploaded successfully',
          url: `${this.registryUrl}/pypi/packages/${normalized}/${filename}`
        })),
      };
    } catch (error) {
      this.logger.log('error', 'Upload failed', { error: (error as Error).message });
      return this.errorResponse(500, 'Upload failed: ' + (error as Error).message);
    }
  }

  /**
   * Handle package download
   */
  private async handleDownload(packageName: string, filename: string): Promise<IResponse> {
    const normalized = helpers.normalizePypiPackageName(packageName);
    const fileData = await this.storage.getPypiPackageFile(normalized, filename);

    if (!fileData) {
      return {
        status: 404,
        headers: { 'Content-Type': 'application/json' },
        body: Buffer.from(JSON.stringify({ message: 'File not found' })),
      };
    }

    return {
      status: 200,
      headers: {
        'Content-Type': 'application/octet-stream',
        'Content-Disposition': `attachment; filename="${filename}"`,
        'Content-Length': fileData.length.toString()
      },
      body: fileData,
    };
  }

  /**
   * Handle the package JSON API (all versions)
   */
  private async handlePackageJson(packageName: string): Promise<IResponse> {
    const normalized = helpers.normalizePypiPackageName(packageName);
    const metadata = await this.storage.getPypiPackageMetadata(normalized);

    if (!metadata) {
      return this.errorResponse(404, 'Package not found');
    }

    return {
      status: 200,
      headers: {
        'Content-Type': 'application/json',
        'Cache-Control': 'public, max-age=300'
      },
      body: Buffer.from(JSON.stringify(metadata)),
    };
  }

  /**
   * Handle the version-specific JSON API
   */
  private async handleVersionJson(packageName: string, version: string): Promise<IResponse> {
    const normalized = helpers.normalizePypiPackageName(packageName);
    const metadata = await this.storage.getPypiPackageMetadata(normalized);

    if (!metadata || !metadata.versions[version]) {
      return this.errorResponse(404, 'Version not found');
    }

    return {
      status: 200,
      headers: {
        'Content-Type': 'application/json',
        'Cache-Control': 'public, max-age=300'
      },
      body: Buffer.from(JSON.stringify(metadata.versions[version])),
    };
  }

  /**
   * Handle package deletion
   */
  private async handleDeletePackage(packageName: string, token: IAuthToken | null): Promise<IResponse> {
    if (!token) {
      return this.errorResponse(401, 'Authentication required');
    }

    const normalized = helpers.normalizePypiPackageName(packageName);

    if (!(await this.checkPermission(token, normalized, 'delete'))) {
      return this.errorResponse(403, 'Insufficient permissions');
    }

    await this.storage.deletePypiPackage(normalized);

    this.logger.log('info', `Package deleted: ${normalized}`);

    return {
      status: 204,
      headers: {},
      body: Buffer.from(''),
    };
  }

  /**
   * Handle version deletion
   */
  private async handleDeleteVersion(
    packageName: string,
    version: string,
    token: IAuthToken | null
  ): Promise<IResponse> {
    if (!token) {
      return this.errorResponse(401, 'Authentication required');
    }

    const normalized = helpers.normalizePypiPackageName(packageName);

    if (!(await this.checkPermission(token, normalized, 'delete'))) {
      return this.errorResponse(403, 'Insufficient permissions');
    }

    await this.storage.deletePypiPackageVersion(normalized, version);

    this.logger.log('info', `Version deleted: ${normalized} ${version}`);

    return {
      status: 204,
      headers: {},
      body: Buffer.from(''),
    };
  }

  /**
   * Helper: Create an error response
   */
  private errorResponse(status: number, message: string): IResponse {
    const error: IPypiError = { message, status };
    return {
      status,
      headers: { 'Content-Type': 'application/json' },
      body: Buffer.from(JSON.stringify(error)),
    };
  }
}
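The content negotiation that `handleSimpleRoot` and `handleSimplePackage` repeat above can be reduced to a small predicate. This is a sketch of that check only (the `prefersJson` name is ours): JSON is served when the client's Accept header asks for the `application/vnd.pypi.simple` family in a `+json` flavor, otherwise the PEP 503 HTML form is returned.

```typescript
// PEP 691 vs PEP 503 selection, as done in the registry class above:
// a substring check on the Accept header for the simple-API media type + json.
function prefersJson(acceptHeader: string): boolean {
  return (
    acceptHeader.includes('application/vnd.pypi.simple') &&
    acceptHeader.includes('json')
  );
}

console.log(prefersJson('application/vnd.pypi.simple.v1+json')); // true
console.log(prefersJson('text/html'));                           // false
```

A substring check is deliberately loose (it ignores quality values and ordering in the Accept header), but it matches what pip actually sends and keeps the handler simple.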
299	ts/pypi/helpers.pypi.ts	Normal file
@@ -0,0 +1,299 @@
/**
 * Helper functions for PyPI registry
 * Package name normalization, HTML generation, etc.
 */

import type { IPypiFile, IPypiPackageMetadata } from './interfaces.pypi.js';

/**
 * Normalize package name according to PEP 503
 * Lowercase and replace runs of [._-] with a single dash
 * @param name - Package name
 * @returns Normalized name
 */
export function normalizePypiPackageName(name: string): string {
  return name
    .toLowerCase()
    .replace(/[-_.]+/g, '-');
}
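The PEP 503 rule implemented above is worth seeing on concrete names: the normalized forms of `Friendly_Bard`, `friendly.bard`, and even mixed runs of separators all collapse to the same key. A standalone copy of the same one-liner:

```typescript
// PEP 503 normalization as implemented in normalizePypiPackageName above:
// lowercase, then collapse every run of '.', '_' and '-' into one hyphen.
function normalize(name: string): string {
  return name.toLowerCase().replace(/[-_.]+/g, '-');
}

console.log(normalize('Friendly_Bard'));   // 'friendly-bard'
console.log(normalize('friendly.bard'));   // 'friendly-bard'
console.log(normalize('FRIENDLY-.-BARD')); // 'friendly-bard'
```

This is why the registry stores and looks up everything under the normalized name: all spellings a client might use resolve to one storage key.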
/**
 * Escape HTML special characters to prevent XSS
 * @param str - String to escape
 * @returns Escaped string
 */
export function escapeHtml(str: string): string {
  return str
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}

/**
 * Generate PEP 503 compliant HTML for the root index (all packages)
 * @param packages - List of package names
 * @returns HTML string
 */
export function generateSimpleRootHtml(packages: string[]): string {
  const links = packages
    .map(pkg => {
      const normalized = normalizePypiPackageName(pkg);
      return `  <a href="${escapeHtml(normalized)}/">${escapeHtml(pkg)}</a>`;
    })
    .join('\n');

  return `<!DOCTYPE html>
<html>
<head>
<meta name="pypi:repository-version" content="1.0">
<title>Simple Index</title>
</head>
<body>
<h1>Simple Index</h1>
${links}
</body>
</html>`;
}

/**
 * Generate PEP 503 compliant HTML for a package index (file list)
 * @param packageName - Package name (normalized)
 * @param files - List of files
 * @param baseUrl - Base URL for downloads
 * @returns HTML string
 */
export function generateSimplePackageHtml(
  packageName: string,
  files: IPypiFile[],
  baseUrl: string
): string {
  const links = files
    .map(file => {
      // Build the URL
      let url = file.url;
      if (!url.startsWith('http://') && !url.startsWith('https://')) {
        // Relative URL - make it absolute
        url = `${baseUrl}/packages/${packageName}/${file.filename}`;
      }

      // Add the hash fragment
      const hashName = Object.keys(file.hashes)[0];
      const hashValue = file.hashes[hashName];
      const fragment = hashName && hashValue ? `#${hashName}=${hashValue}` : '';

      // Build data attributes
      const dataAttrs: string[] = [];

      if (file['requires-python']) {
        const escaped = escapeHtml(file['requires-python']);
        dataAttrs.push(`data-requires-python="${escaped}"`);
      }

      if (file['gpg-sig'] !== undefined) {
        dataAttrs.push(`data-gpg-sig="${file['gpg-sig'] ? 'true' : 'false'}"`);
      }

      if (file.yanked) {
        const reason = typeof file.yanked === 'string' ? file.yanked : '';
        if (reason) {
          dataAttrs.push(`data-yanked="${escapeHtml(reason)}"`);
        } else {
          dataAttrs.push(`data-yanked=""`);
        }
      }

      const dataAttrStr = dataAttrs.length > 0 ? ' ' + dataAttrs.join(' ') : '';

      return `  <a href="${escapeHtml(url)}${fragment}"${dataAttrStr}>${escapeHtml(file.filename)}</a>`;
    })
    .join('\n');

  return `<!DOCTYPE html>
<html>
<head>
<meta name="pypi:repository-version" content="1.0">
<title>Links for ${escapeHtml(packageName)}</title>
</head>
<body>
<h1>Links for ${escapeHtml(packageName)}</h1>
${links}
</body>
</html>`;
}

/**
 * Parse a filename to extract package info
 * Supports wheel and sdist formats
 * @param filename - Package filename
 * @returns Parsed info or null
 */
export function parsePackageFilename(filename: string): {
  name: string;
  version: string;
  filetype: 'bdist_wheel' | 'sdist';
  pythonVersion?: string;
} | null {
  // Wheel format: {distribution}-{version}(-{build tag})?-{python tag}-{abi tag}-{platform tag}.whl
  const wheelMatch = filename.match(/^([a-zA-Z0-9_.-]+?)-([a-zA-Z0-9_.]+?)(?:-(\d+))?-([^-]+)-([^-]+)-([^-]+)\.whl$/);
  if (wheelMatch) {
    return {
      name: wheelMatch[1],
      version: wheelMatch[2],
      filetype: 'bdist_wheel',
      pythonVersion: wheelMatch[4],
    };
  }

  // Sdist tar.gz format: {name}-{version}.tar.gz
  const sdistTarMatch = filename.match(/^([a-zA-Z0-9_.-]+?)-([a-zA-Z0-9_.]+)\.tar\.gz$/);
  if (sdistTarMatch) {
    return {
      name: sdistTarMatch[1],
      version: sdistTarMatch[2],
      filetype: 'sdist',
      pythonVersion: 'source',
    };
  }

  // Sdist zip format: {name}-{version}.zip
  const sdistZipMatch = filename.match(/^([a-zA-Z0-9_.-]+?)-([a-zA-Z0-9_.]+)\.zip$/);
  if (sdistZipMatch) {
    return {
      name: sdistZipMatch[1],
      version: sdistZipMatch[2],
      filetype: 'sdist',
      pythonVersion: 'source',
    };
  }

  return null;
}
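The wheel branch of `parsePackageFilename` is the trickiest of the three, because lazy quantifiers and the optional build tag interact. Exercising the same regex (copied verbatim from the function above) on a typical wheel name shows which capture groups hold what:

```typescript
// The wheel-filename regex from parsePackageFilename above, copied verbatim.
// Groups: 1 = distribution, 2 = version, 3 = optional build tag,
// 4 = python tag, 5 = abi tag, 6 = platform tag.
const wheelRe = /^([a-zA-Z0-9_.-]+?)-([a-zA-Z0-9_.]+?)(?:-(\d+))?-([^-]+)-([^-]+)-([^-]+)\.whl$/;

const m = 'requests-2.31.0-py3-none-any.whl'.match(wheelRe);
console.log(m?.[1]); // 'requests'
console.log(m?.[2]); // '2.31.0'
console.log(m?.[4]); // 'py3'
```

Because the distribution and version groups are lazy while the last three tags are anchored at the end, the engine pins the python/abi/platform tags first and the leading groups absorb the rest, which is what keeps dotted versions like `2.31.0` intact.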
|
|
||||||
|
/**
|
||||||
|
* Calculate hash digest for a buffer
|
||||||
|
* @param data - Data to hash
|
||||||
|
* @param algorithm - Hash algorithm (sha256, md5, blake2b)
|
||||||
|
* @returns Hex-encoded hash
|
||||||
|
*/
|
||||||
|
export async function calculateHash(data: Buffer, algorithm: 'sha256' | 'md5' | 'blake2b'): Promise<string> {
|
||||||
|
const crypto = await import('crypto');
|
||||||
|
|
||||||
|
let hash: any;
|
||||||
|
if (algorithm === 'blake2b') {
|
||||||
|
// Node.js uses 'blake2b512' for blake2b
|
||||||
|
hash = crypto.createHash('blake2b512');
|
||||||
|
} else {
|
||||||
|
hash = crypto.createHash(algorithm);
|
||||||
|
}
|
||||||
|
|
||||||
|
hash.update(data);
|
||||||
|
return hash.digest('hex');
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
 * Validate package name
 * Must contain only ASCII letters, numbers, ., -, and _
 * @param name - Package name
 * @returns true if valid
 */
export function isValidPackageName(name: string): boolean {
  return /^[a-zA-Z0-9._-]+$/.test(name);
}

/**
 * Validate version string (basic check)
 * @param version - Version string
 * @returns true if valid
 */
export function isValidVersion(version: string): boolean {
  // Basic check - allows numbers, letters, dots, hyphens, underscores
  // More strict validation would follow PEP 440
  return /^[a-zA-Z0-9._-]+$/.test(version);
}

/**
 * Extract metadata from package metadata
 * Filters and normalizes metadata fields
 * @param metadata - Raw metadata object
 * @returns Filtered metadata
 */
export function extractCoreMetadata(metadata: Record<string, any>): Record<string, any> {
  const coreFields = [
    'metadata-version',
    'name',
    'version',
    'platform',
    'supported-platform',
    'summary',
    'description',
    'description-content-type',
    'keywords',
    'home-page',
    'download-url',
    'author',
    'author-email',
    'maintainer',
    'maintainer-email',
    'license',
    'classifier',
    'requires-python',
    'requires-dist',
    'requires-external',
    'provides-dist',
    'project-url',
    'provides-extra',
  ];

  const result: Record<string, any> = {};

  for (const [key, value] of Object.entries(metadata)) {
    const normalizedKey = key.toLowerCase().replace(/_/g, '-');
    if (coreFields.includes(normalizedKey)) {
      result[normalizedKey] = value;
    }
  }

  return result;
}

/**
 * Generate JSON API response for package list (PEP 691)
 * @param packages - List of package names
 * @returns JSON object
 */
export function generateJsonRootResponse(packages: string[]): any {
  return {
    meta: {
      'api-version': '1.0',
    },
    projects: packages.map(name => ({ name })),
  };
}

/**
 * Generate JSON API response for package files (PEP 691)
 * @param packageName - Package name (normalized)
 * @param files - List of files
 * @returns JSON object
 */
export function generateJsonPackageResponse(packageName: string, files: IPypiFile[]): any {
  return {
    meta: {
      'api-version': '1.0',
    },
    name: packageName,
    files: files.map(file => ({
      filename: file.filename,
      url: file.url,
      hashes: file.hashes,
      'requires-python': file['requires-python'],
      'dist-info-metadata': file['dist-info-metadata'],
      'gpg-sig': file['gpg-sig'],
      yanked: file.yanked,
      size: file.size,
      'upload-time': file['upload-time'],
    })),
  };
}
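Two behaviors in the helpers above are easy to get wrong when re-implementing a client: the key normalization in `extractCoreMetadata` (lower-case, underscores to hyphens) and the hex digest produced by `calculateHash`. A minimal self-contained sketch of both, with the logic inlined so it runs standalone; the field names and values are hypothetical:

```typescript
import { createHash } from 'crypto';

// Hypothetical upload metadata: mixed-case keys with underscores, as they
// arrive from a multipart form; core metadata uses hyphenated lower-case keys.
const rawMetadata: Record<string, any> = {
  Name: 'demo-package',
  Version: '1.0.0',
  Requires_Python: '>=3.9',
  'X-Custom-Header': 'ignored', // not a core field, so it is filtered out
};

// Same normalization rule as extractCoreMetadata (field list trimmed here).
const coreFields = ['name', 'version', 'requires-python'];
const result: Record<string, any> = {};
for (const [key, value] of Object.entries(rawMetadata)) {
  const normalizedKey = key.toLowerCase().replace(/_/g, '-');
  if (coreFields.includes(normalizedKey)) {
    result[normalizedKey] = value;
  }
}

// Hex digest as calculateHash would return for algorithm 'sha256':
// 64 hex characters for a 256-bit hash.
const digest = createHash('sha256').update(Buffer.from('file-bytes')).digest('hex');

console.log(result['requires-python']); // '>=3.9'
console.log(digest.length); // 64
```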
8	ts/pypi/index.ts	Normal file
@@ -0,0 +1,8 @@
/**
 * PyPI Registry Module
 * Python Package Index implementation
 */

export * from './interfaces.pypi.js';
export * from './classes.pypiregistry.js';
export * as pypiHelpers from './helpers.pypi.js';
316	ts/pypi/interfaces.pypi.ts	Normal file
@@ -0,0 +1,316 @@
/**
 * PyPI Registry Type Definitions
 * Compliant with PEP 503 (Simple API), PEP 691 (JSON API), and PyPI upload API
 */

/**
 * File information for a package distribution
 * Used in both PEP 503 HTML and PEP 691 JSON responses
 */
export interface IPypiFile {
  /** Filename (e.g., "package-1.0.0-py3-none-any.whl") */
  filename: string;
  /** Download URL (absolute or relative) */
  url: string;
  /** Hash digests (multiple algorithms supported in JSON) */
  hashes: Record<string, string>;
  /** Python version requirement (PEP 345 format) */
  'requires-python'?: string;
  /** Whether distribution info metadata is available (PEP 658) */
  'dist-info-metadata'?: boolean | { sha256: string };
  /** Whether GPG signature is available */
  'gpg-sig'?: boolean;
  /** Yank status: false or reason string */
  yanked?: boolean | string;
  /** File size in bytes */
  size?: number;
  /** Upload timestamp */
  'upload-time'?: string;
}

/**
 * Package metadata stored internally
 * Consolidated from multiple file uploads
 */
export interface IPypiPackageMetadata {
  /** Normalized package name */
  name: string;
  /** Map of version to file list */
  versions: Record<string, IPypiVersionMetadata>;
  /** Timestamp of last update */
  'last-modified'?: string;
}

/**
 * Metadata for a specific version
 */
export interface IPypiVersionMetadata {
  /** Version string */
  version: string;
  /** Files for this version (wheels, sdists) */
  files: IPypiFileMetadata[];
  /** Core metadata fields */
  metadata?: IPypiCoreMetadata;
  /** Whether entire version is yanked */
  yanked?: boolean | string;
  /** Upload timestamp */
  'upload-time'?: string;
}

/**
 * Internal file metadata
 */
export interface IPypiFileMetadata {
  filename: string;
  /** Storage key/path */
  path: string;
  /** File type: bdist_wheel or sdist */
  filetype: 'bdist_wheel' | 'sdist';
  /** Python version tag */
  python_version: string;
  /** Hash digests */
  hashes: Record<string, string>;
  /** File size in bytes */
  size: number;
  /** Python version requirement */
  'requires-python'?: string;
  /** Whether this file is yanked */
  yanked?: boolean | string;
  /** Upload timestamp */
  'upload-time': string;
  /** Uploader user ID */
  'uploaded-by': string;
}

/**
 * Core metadata fields (subset of PEP 566)
 * These are extracted from package uploads
 */
export interface IPypiCoreMetadata {
  /** Metadata version */
  'metadata-version': string;
  /** Package name */
  name: string;
  /** Version string */
  version: string;
  /** Platform compatibility */
  platform?: string;
  /** Supported platforms */
  'supported-platform'?: string;
  /** Summary/description */
  summary?: string;
  /** Long description */
  description?: string;
  /** Description content type (text/plain, text/markdown, text/x-rst) */
  'description-content-type'?: string;
  /** Keywords */
  keywords?: string;
  /** Homepage URL */
  'home-page'?: string;
  /** Download URL */
  'download-url'?: string;
  /** Author name */
  author?: string;
  /** Author email */
  'author-email'?: string;
  /** Maintainer name */
  maintainer?: string;
  /** Maintainer email */
  'maintainer-email'?: string;
  /** License */
  license?: string;
  /** Classifiers (Trove classifiers) */
  classifier?: string[];
  /** Python version requirement */
  'requires-python'?: string;
  /** Dist name requirement */
  'requires-dist'?: string[];
  /** External requirement */
  'requires-external'?: string[];
  /** Provides dist */
  'provides-dist'?: string[];
  /** Project URLs */
  'project-url'?: string[];
  /** Provides extra */
  'provides-extra'?: string[];
}

/**
 * PEP 503: Simple API root response (project list)
 */
export interface IPypiSimpleRootHtml {
  /** List of project names */
  projects: string[];
}

/**
 * PEP 503: Simple API project response (file list)
 */
export interface IPypiSimpleProjectHtml {
  /** Normalized project name */
  name: string;
  /** List of files */
  files: IPypiFile[];
}

/**
 * PEP 691: JSON API root response
 */
export interface IPypiJsonRoot {
  /** API metadata */
  meta: {
    /** API version (e.g., "1.0") */
    'api-version': string;
  };
  /** List of projects */
  projects: Array<{
    /** Project name */
    name: string;
  }>;
}

/**
 * PEP 691: JSON API project response
 */
export interface IPypiJsonProject {
  /** Normalized project name */
  name: string;
  /** API metadata */
  meta: {
    /** API version (e.g., "1.0") */
    'api-version': string;
  };
  /** List of files */
  files: IPypiFile[];
}

/**
 * Upload form data (multipart/form-data fields)
 * Based on PyPI legacy upload API
 */
export interface IPypiUploadForm {
  /** Action type (always "file_upload") */
  ':action': 'file_upload';
  /** Protocol version (always "1") */
  protocol_version: '1';
  /** File content (binary) */
  content: Buffer;
  /** File type */
  filetype: 'bdist_wheel' | 'sdist';
  /** Python version tag */
  pyversion: string;
  /** Package name */
  name: string;
  /** Version string */
  version: string;
  /** Metadata version */
  metadata_version: string;
  /** Hash digests (at least one required) */
  md5_digest?: string;
  sha256_digest?: string;
  blake2_256_digest?: string;
  /** Optional attestations */
  attestations?: string; // JSON array
  /** Optional core metadata fields */
  summary?: string;
  description?: string;
  description_content_type?: string;
  author?: string;
  author_email?: string;
  maintainer?: string;
  maintainer_email?: string;
  license?: string;
  keywords?: string;
  home_page?: string;
  download_url?: string;
  requires_python?: string;
  classifiers?: string[];
  platform?: string;
  [key: string]: any; // Allow additional metadata fields
}

/**
 * JSON API upload response
 */
export interface IPypiUploadResponse {
  /** Success message */
  message?: string;
  /** URL of uploaded file */
  url?: string;
}

/**
 * Error response structure
 */
export interface IPypiError {
  /** Error message */
  message: string;
  /** HTTP status code */
  status?: number;
  /** Additional error details */
  details?: string[];
}

/**
 * Search query parameters
 */
export interface IPypiSearchQuery {
  /** Search term */
  q?: string;
  /** Page number */
  page?: number;
  /** Results per page */
  per_page?: number;
}

/**
 * Search result for a single package
 */
export interface IPypiSearchResult {
  /** Package name */
  name: string;
  /** Latest version */
  version: string;
  /** Summary */
  summary: string;
  /** Description */
  description?: string;
}

/**
 * Search response structure
 */
export interface IPypiSearchResponse {
  /** Search results */
  results: IPypiSearchResult[];
  /** Result count */
  count: number;
  /** Current page */
  page: number;
  /** Total pages */
  pages: number;
}

/**
 * Yank request
 */
export interface IPypiYankRequest {
  /** Package name */
  name: string;
  /** Version to yank */
  version: string;
  /** Optional filename (specific file) */
  filename?: string;
  /** Reason for yanking */
  reason?: string;
}

/**
 * Yank response
 */
export interface IPypiYankResponse {
  /** Success indicator */
  success: boolean;
  /** Message */
  message?: string;
}
||||||
|
filename: string;
|
||||||
|
/** Storage key/path */
|
||||||
|
path: string;
|
||||||
|
/** File type: bdist_wheel or sdist */
|
||||||
|
filetype: 'bdist_wheel' | 'sdist';
|
||||||
|
/** Python version tag */
|
||||||
|
python_version: string;
|
||||||
|
/** Hash digests */
|
||||||
|
hashes: Record<string, string>;
|
||||||
|
/** File size in bytes */
|
||||||
|
size: number;
|
||||||
|
/** Python version requirement */
|
||||||
|
'requires-python'?: string;
|
||||||
|
/** Whether this file is yanked */
|
||||||
|
yanked?: boolean | string;
|
||||||
|
/** Upload timestamp */
|
||||||
|
'upload-time': string;
|
||||||
|
/** Uploader user ID */
|
||||||
|
'uploaded-by': string;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Core metadata fields (subset of PEP 566)
|
||||||
|
* These are extracted from package uploads
|
||||||
|
*/
|
||||||
|
export interface IPypiCoreMetadata {
|
||||||
|
/** Metadata version */
|
||||||
|
'metadata-version': string;
|
||||||
|
/** Package name */
|
||||||
|
name: string;
|
||||||
|
/** Version string */
|
||||||
|
version: string;
|
||||||
|
/** Platform compatibility */
|
||||||
|
platform?: string;
|
||||||
|
/** Supported platforms */
|
||||||
|
'supported-platform'?: string;
|
||||||
|
/** Summary/description */
|
||||||
|
summary?: string;
|
||||||
|
/** Long description */
|
||||||
|
description?: string;
|
||||||
|
/** Description content type (text/plain, text/markdown, text/x-rst) */
|
||||||
|
'description-content-type'?: string;
|
||||||
|
/** Keywords */
|
||||||
|
keywords?: string;
|
||||||
|
/** Homepage URL */
|
||||||
|
'home-page'?: string;
|
||||||
|
/** Download URL */
|
||||||
|
'download-url'?: string;
|
||||||
|
/** Author name */
|
||||||
|
author?: string;
|
||||||
|
/** Author email */
|
||||||
|
'author-email'?: string;
|
||||||
|
/** Maintainer name */
|
||||||
|
maintainer?: string;
|
||||||
|
/** Maintainer email */
|
||||||
|
'maintainer-email'?: string;
|
||||||
|
/** License */
|
||||||
|
license?: string;
|
||||||
|
/** Classifiers (Trove classifiers) */
|
||||||
|
classifier?: string[];
|
||||||
|
/** Python version requirement */
|
||||||
|
'requires-python'?: string;
|
||||||
|
/** Dist name requirement */
|
||||||
|
'requires-dist'?: string[];
|
||||||
|
/** External requirement */
|
||||||
|
'requires-external'?: string[];
|
||||||
|
/** Provides dist */
|
||||||
|
'provides-dist'?: string[];
|
||||||
|
/** Project URLs */
|
||||||
|
'project-url'?: string[];
|
||||||
|
/** Provides extra */
|
||||||
|
'provides-extra'?: string[];
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* PEP 503: Simple API root response (project list)
|
||||||
|
*/
|
||||||
|
export interface IPypiSimpleRootHtml {
|
||||||
|
/** List of project names */
|
||||||
|
projects: string[];
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* PEP 503: Simple API project response (file list)
|
||||||
|
*/
|
||||||
|
export interface IPypiSimpleProjectHtml {
|
||||||
|
/** Normalized project name */
|
||||||
|
name: string;
|
||||||
|
/** List of files */
|
||||||
|
files: IPypiFile[];
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* PEP 691: JSON API root response
|
||||||
|
*/
|
||||||
|
export interface IPypiJsonRoot {
|
||||||
|
/** API metadata */
|
||||||
|
meta: {
|
||||||
|
/** API version (e.g., "1.0") */
|
||||||
|
'api-version': string;
|
||||||
|
};
|
||||||
|
/** List of projects */
|
||||||
|
projects: Array<{
|
||||||
|
/** Project name */
|
||||||
|
name: string;
|
||||||
|
}>;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* PEP 691: JSON API project response
|
||||||
|
*/
|
||||||
|
export interface IPypiJsonProject {
|
||||||
|
/** Normalized project name */
|
||||||
|
name: string;
|
||||||
|
/** API metadata */
|
||||||
|
meta: {
|
||||||
|
/** API version (e.g., "1.0") */
|
||||||
|
'api-version': string;
|
||||||
|
};
|
||||||
|
/** List of files */
|
||||||
|
files: IPypiFile[];
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Upload form data (multipart/form-data fields)
|
||||||
|
* Based on PyPI legacy upload API
|
||||||
|
*/
|
||||||
|
export interface IPypiUploadForm {
|
||||||
|
/** Action type (always "file_upload") */
|
||||||
|
':action': 'file_upload';
|
||||||
|
/** Protocol version (always "1") */
|
||||||
|
protocol_version: '1';
|
||||||
|
/** File content (binary) */
|
||||||
|
content: Buffer;
|
||||||
|
/** File type */
|
||||||
|
filetype: 'bdist_wheel' | 'sdist';
|
||||||
|
/** Python version tag */
|
||||||
|
pyversion: string;
|
||||||
|
/** Package name */
|
||||||
|
name: string;
|
||||||
|
/** Version string */
|
||||||
|
version: string;
|
||||||
|
/** Metadata version */
|
||||||
|
metadata_version: string;
|
||||||
|
/** Hash digests (at least one required) */
|
||||||
|
md5_digest?: string;
|
||||||
|
sha256_digest?: string;
|
||||||
|
blake2_256_digest?: string;
|
||||||
|
/** Optional attestations */
|
||||||
|
attestations?: string; // JSON array
|
||||||
|
/** Optional core metadata fields */
|
||||||
|
summary?: string;
|
||||||
|
description?: string;
|
||||||
|
description_content_type?: string;
|
||||||
|
author?: string;
|
||||||
|
author_email?: string;
|
||||||
|
maintainer?: string;
|
||||||
|
maintainer_email?: string;
|
||||||
|
license?: string;
|
||||||
|
keywords?: string;
|
||||||
|
home_page?: string;
|
||||||
|
download_url?: string;
|
||||||
|
requires_python?: string;
|
||||||
|
classifiers?: string[];
|
||||||
|
platform?: string;
|
||||||
|
[key: string]: any; // Allow additional metadata fields
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* JSON API upload response
|
||||||
|
*/
|
||||||
|
export interface IPypiUploadResponse {
|
||||||
|
/** Success message */
|
||||||
|
message?: string;
|
||||||
|
/** URL of uploaded file */
|
||||||
|
url?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Error response structure
|
||||||
|
*/
|
||||||
|
export interface IPypiError {
|
||||||
|
/** Error message */
|
||||||
|
message: string;
|
||||||
|
/** HTTP status code */
|
||||||
|
status?: number;
|
||||||
|
/** Additional error details */
|
||||||
|
details?: string[];
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Search query parameters
|
||||||
|
*/
|
||||||
|
export interface IPypiSearchQuery {
|
||||||
|
/** Search term */
|
||||||
|
q?: string;
|
||||||
|
/** Page number */
|
||||||
|
page?: number;
|
||||||
|
/** Results per page */
|
||||||
|
per_page?: number;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Search result for a single package
|
||||||
|
*/
|
||||||
|
export interface IPypiSearchResult {
|
||||||
|
/** Package name */
|
||||||
|
name: string;
|
||||||
|
/** Latest version */
|
||||||
|
version: string;
|
||||||
|
/** Summary */
|
||||||
|
summary: string;
|
||||||
|
/** Description */
|
||||||
|
description?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Search response structure
|
||||||
|
*/
|
||||||
|
export interface IPypiSearchResponse {
|
||||||
|
/** Search results */
|
||||||
|
results: IPypiSearchResult[];
|
||||||
|
/** Result count */
|
||||||
|
count: number;
|
||||||
|
/** Current page */
|
||||||
|
page: number;
|
||||||
|
/** Total pages */
|
||||||
|
pages: number;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Yank request
|
||||||
|
*/
|
||||||
|
export interface IPypiYankRequest {
|
||||||
|
/** Package name */
|
||||||
|
name: string;
|
||||||
|
/** Version to yank */
|
||||||
|
version: string;
|
||||||
|
/** Optional filename (specific file) */
|
||||||
|
filename?: string;
|
||||||
|
/** Reason for yanking */
|
||||||
|
reason?: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
/**
|
||||||
|
* Yank response
|
||||||
|
*/
|
||||||
|
export interface IPypiYankResponse {
|
||||||
|
/** Success indicator */
|
||||||
|
success: boolean;
|
||||||
|
/** Message */
|
||||||
|
message?: string;
|
||||||
|
}
|
||||||
Reference in New Issue
Block a user