Compare commits

4 Commits

| Author | SHA1 | Date |
|---|---|---|
| | e147a077f3 | |
| | 5889396134 | |
| | 0c631383e1 | |
| | d852d8c85b | |
changelog.md (110 changed lines)

@@ -1,5 +1,115 @@
 # Changelog
 
+## 2025-11-20 - 4.0.0 - BREAKING CHANGE(core)
+Make API strict-by-default: remove *Strict variants, throw on not-found/exists conflicts, add explicit exists() methods, update docs/tests and bump deps
+
+- Breaking: Core API methods are strict by default and now throw errors instead of returning null when targets are missing or already exist (e.g. getBucketByName, getFile, getSubDirectoryByName, fastPut, fastPutStream).
+- Removed *Strict variants: fastPutStrict, getBucketByNameStrict, getFileStrict, getSubDirectoryByNameStrict — use the base methods which are now strict.
+- Added explicit existence checks: bucketExists (SmartBucket.bucketExists), fileExists (Directory.fileExists), directoryExists (Directory.directoryExists), and fastExists (Bucket.fastExists) to allow non-throwing checks before operations.
+- Return type updates: fastPut now returns Promise<File> (no null), getBucketByName/getFile/getSubDirectoryByName now return the respective objects or throw.
+- Improved error messages to guide callers (e.g. suggest setting overwrite:true on fastPut when the object already exists).
+- Updated README, changelog and tests to reflect the new strict semantics and usage patterns.
+- Developer/runtime dependency bumps: @git.zone/tsbuild, @git.zone/tsrun, @git.zone/tstest, @aws-sdk/client-s3, @push.rocks/smartstring, @tsclass/tsclass (version bumps recorded in package.json).
+- Major version bump to 4.0.0 to reflect breaking API changes.
+
+## 2025-11-20 - 4.0.0 - BREAKING: Strict by default + exists methods
+Complete API overhaul: all methods throw by default, removed all *Strict variants, added dedicated exists methods
+
+**Breaking Changes:**
+
+**Putters (Write Operations):**
+- `fastPut`: Return type `Promise<File | null>` → `Promise<File>`, throws when file exists and overwrite is false
+- `fastPutStream`: Now throws when file exists and overwrite is false (previously returned silently)
+- `fastPutStrict`: **Removed** - use `fastPut` directly
+
+**Getters (Read Operations):**
+- `getBucketByName`: Return type `Promise<Bucket | null>` → `Promise<Bucket>`, throws when bucket not found
+- `getBucketByNameStrict`: **Removed** - use `getBucketByName` directly
+- `getFile`: Return type `Promise<File | null>` → `Promise<File>`, throws when file not found
+- `getFileStrict`: **Removed** - use `getFile` directly
+- `getSubDirectoryByName`: Return type `Promise<Directory | null>` → `Promise<Directory>`, throws when directory not found
+- `getSubDirectoryByNameStrict`: **Removed** - use `getSubDirectoryByName` directly
+
+**New Methods (Existence Checks):**
+- `bucket.fastExists({ path })` - ✅ Already existed
+- `directory.fileExists({ path })` - **NEW** - Check if file exists
+- `directory.directoryExists(name)` - **NEW** - Check if subdirectory exists
+- `smartBucket.bucketExists(name)` - **NEW** - Check if bucket exists
+
+**Benefits:**
+- ✅ **Simpler API**: Removed 4 redundant *Strict methods
+- ✅ **Type-safe**: No nullable returns - `Promise<T>` not `Promise<T | null>`
+- ✅ **Fail-fast**: Errors throw immediately with precise stack traces
+- ✅ **Consistent**: All methods behave the same way
+- ✅ **Explicit**: Use exists() to check, then get() to retrieve
+- ✅ **Better debugging**: Error location is always precise
+
+**Migration Guide:**
+
+```typescript
+// ============================================
+// Pattern 1: Check then Get (Recommended)
+// ============================================
+
+// Before (v3.x):
+const bucket = await smartBucket.getBucketByName('my-bucket');
+if (bucket) {
+  // use bucket
+}
+
+// After (v4.0):
+if (await smartBucket.bucketExists('my-bucket')) {
+  const bucket = await smartBucket.getBucketByName('my-bucket'); // guaranteed to exist
+  // use bucket
+}
+
+// ============================================
+// Pattern 2: Try/Catch
+// ============================================
+
+// Before (v3.x):
+const file = await directory.getFile({ path: 'file.txt' });
+if (!file) {
+  // Handle not found
+}
+
+// After (v4.0):
+try {
+  const file = await directory.getFile({ path: 'file.txt' });
+  // use file
+} catch (error) {
+  // Handle not found
+}
+
+// ============================================
+// Pattern 3: Remove *Strict calls
+// ============================================
+
+// Before (v3.x):
+const file = await directory.getFileStrict({ path: 'file.txt' });
+
+// After (v4.0):
+const file = await directory.getFile({ path: 'file.txt' }); // already strict
+
+// ============================================
+// Pattern 4: Write Operations
+// ============================================
+
+// Before (v3.x):
+const file = await bucket.fastPutStrict({ path: 'file.txt', contents: 'data' });
+
+// After (v4.0):
+const file = await bucket.fastPut({ path: 'file.txt', contents: 'data' }); // already strict
+```
+
+## 2025-08-18 - 3.3.10 - fix(helpers)
+Normalize and robustly parse S3 endpoint configuration; use normalized descriptor in SmartBucket and update dev tooling
+
+- Add normalizeS3Descriptor to ts/helpers.ts: robust endpoint parsing, coercion of useSsl/port, sanitization, warnings for dropped URL parts, and canonical endpoint URL output.
+- Update SmartBucket (ts/classes.smartbucket.ts) to use the normalized endpoint, region, credentials and forcePathStyle from normalizeS3Descriptor.
+- Adjust dev tooling: bump @git.zone/tsbuild -> ^2.6.7, @git.zone/tstest -> ^2.3.4, @push.rocks/qenv -> ^6.1.3 and update test script to run tstest with --verbose --logfile --timeout 60.
+- Add .claude/settings.local.json containing local assistant/CI permission settings (local config only).
+
 ## 2025-08-15 - 3.3.9 - fix(docs)
 Revise README with detailed usage examples and add local Claude settings
 
package-lock.json (generated, 4 changed lines)

@@ -1,12 +1,12 @@
 {
   "name": "@push.rocks/smartbucket",
-  "version": "3.3.9",
+  "version": "3.3.10",
   "lockfileVersion": 3,
   "requires": true,
   "packages": {
     "": {
       "name": "@push.rocks/smartbucket",
-      "version": "3.3.9",
+      "version": "3.3.10",
       "license": "UNLICENSED",
       "dependencies": {
         "@push.rocks/smartpath": "^5.0.18",
package.json (18 changed lines)

@@ -1,6 +1,6 @@
 {
   "name": "@push.rocks/smartbucket",
-  "version": "3.3.9",
+  "version": "4.0.0",
   "description": "A TypeScript library providing a cloud-agnostic interface for managing object storage with functionalities like bucket management, file and directory operations, and advanced features such as metadata handling and file locking.",
   "main": "dist_ts/index.js",
   "typings": "dist_ts/index.d.ts",
@@ -8,26 +8,26 @@
   "author": "Task Venture Capital GmbH",
   "license": "MIT",
   "scripts": {
-    "test": "(tstest test/)",
+    "test": "(tstest test/ --verbose --logfile --timeout 60)",
     "build": "(tsbuild --web --allowimplicitany)"
   },
   "devDependencies": {
-    "@git.zone/tsbuild": "^2.6.4",
-    "@git.zone/tsrun": "^1.2.49",
-    "@git.zone/tstest": "^2.3.2",
-    "@push.rocks/qenv": "^6.1.2",
+    "@git.zone/tsbuild": "^3.1.0",
+    "@git.zone/tsrun": "^2.0.0",
+    "@git.zone/tstest": "^3.0.1",
+    "@push.rocks/qenv": "^6.1.3",
     "@push.rocks/tapbundle": "^6.0.3"
   },
   "dependencies": {
-    "@aws-sdk/client-s3": "^3.864.0",
+    "@aws-sdk/client-s3": "^3.936.0",
     "@push.rocks/smartmime": "^2.0.4",
     "@push.rocks/smartpath": "^6.0.0",
     "@push.rocks/smartpromise": "^4.2.3",
     "@push.rocks/smartrx": "^3.0.10",
     "@push.rocks/smartstream": "^3.2.5",
-    "@push.rocks/smartstring": "^4.0.15",
+    "@push.rocks/smartstring": "^4.1.0",
     "@push.rocks/smartunique": "^3.0.9",
-    "@tsclass/tsclass": "^9.2.0"
+    "@tsclass/tsclass": "^9.3.0"
   },
   "private": false,
   "files": [
pnpm-lock.yaml (generated, 4556 changed lines)

File diff suppressed because it is too large.
@@ -1,3 +1,5 @@
 * The project uses the official s3 client, not the minio client.
-* notice the difference between *Strict methods and the normal methods.
+* **All methods throw by default** (strict mode): - Put operations: `fastPut`, `fastPutStream` throw when file exists and overwrite is false - Get operations: `getBucketByName`, `getFile`, `getSubDirectoryByName` throw when not found
+* **Use exists() methods to check before getting**: `bucketExists`, `fileExists`, `directoryExists`, `fastExists`
+* **No *Strict methods**: All removed (fastPutStrict, getBucketByNameStrict, getFileStrict, getSubDirectoryByNameStrict)
 * metadata is handled through the MetaData class. Important!
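The hunk above documents the check-before-get convention; the sketch below shows roughly how the existence checks combine with the now-strict getters. It is a minimal sketch, not official documentation: the paths and names are placeholders, and `myBucket` is assumed to be a `Bucket` obtained elsewhere (e.g. via `smartBucket.getBucketByName`).

```typescript
// Minimal sketch of the v4 check-before-get pattern (placeholder paths/names).
const baseDirectory = await myBucket.getBaseDirectory();

// Files: fileExists() is a non-throwing check; getFile() throws when the file is missing.
if (await baseDirectory.fileExists({ path: 'docs/readme.txt' })) {
  const file = await baseDirectory.getFile({ path: 'docs/readme.txt' });
  // ... work with file
}

// Subdirectories: directoryExists() + getSubDirectoryByName() follow the same shape.
if (await baseDirectory.directoryExists('docs')) {
  const docsDir = await baseDirectory.getSubDirectoryByName('docs');
  // ... work with docsDir
}
```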
readme.md (21 changed lines)

@@ -88,8 +88,8 @@ console.log('🗑️ Bucket removed');
 ```typescript
 const bucket = await smartBucket.getBucketByName('my-bucket');
 
-// Simple file upload
-await bucket.fastPut({
+// Simple file upload (returns File object)
+const file = await bucket.fastPut({
   path: 'documents/report.pdf',
   contents: Buffer.from('Your file content here')
 });
@@ -100,12 +100,23 @@ await bucket.fastPut({
   contents: 'Buy milk\nCall mom\nRule the world'
 });
 
-// Strict upload (returns File object)
-const uploadedFile = await bucket.fastPutStrict({
+// Upload with overwrite control
+const uploadedFile = await bucket.fastPut({
   path: 'images/logo.png',
   contents: imageBuffer,
-  overwrite: true // Optional: control overwrite behavior
+  overwrite: true // Set to true to replace existing files
 });
+
+// Error handling: fastPut throws if file exists and overwrite is false
+try {
+  await bucket.fastPut({
+    path: 'existing-file.txt',
+    contents: 'new content'
+  });
+} catch (error) {
+  console.error('Upload failed:', error.message);
+  // Error: Object already exists at path 'existing-file.txt' in bucket 'my-bucket'. Set overwrite:true to replace it.
+}
 ```
 
 #### Download Files
@@ -16,7 +16,7 @@ tap.test('should create a valid smartbucket', async () => {
     endpoint: await testQenv.getEnvVarOnDemandStrict('S3_ENDPOINT'),
   });
   expect(testSmartbucket).toBeInstanceOf(smartbucket.SmartBucket);
-  myBucket = await testSmartbucket.getBucketByNameStrict(await testQenv.getEnvVarOnDemandStrict('S3_BUCKET'),);
+  myBucket = await testSmartbucket.getBucketByName(await testQenv.getEnvVarOnDemandStrict('S3_BUCKET'),);
   expect(myBucket).toBeInstanceOf(smartbucket.Bucket);
   expect(myBucket.name).toEqual('test-pushrocks-smartbucket');
 });
@@ -17,7 +17,7 @@ tap.test('should create a valid smartbucket', async () => {
     endpoint: await testQenv.getEnvVarOnDemandStrict('S3_ENDPOINT'),
   });
   expect(testSmartbucket).toBeInstanceOf(smartbucket.SmartBucket);
-  myBucket = await testSmartbucket.getBucketByNameStrict(await testQenv.getEnvVarOnDemandStrict('S3_BUCKET'),);
+  myBucket = await testSmartbucket.getBucketByName(await testQenv.getEnvVarOnDemandStrict('S3_BUCKET'),);
   expect(myBucket).toBeInstanceOf(smartbucket.Bucket);
   expect(myBucket.name).toEqual('test-pushrocks-smartbucket');
 });
@@ -30,7 +30,7 @@ tap.test('should clean all contents', async () => {
 
 tap.test('should delete a file into the normally', async () => {
   const path = 'trashtest/trashme.txt';
-  const file = await myBucket.fastPutStrict({
+  const file = await myBucket.fastPut({
     path,
     contents: 'I\'m in the trash test content!',
   });
@@ -44,7 +44,7 @@ tap.test('should delete a file into the normally', async () => {
 
 tap.test('should put a file into the trash', async () => {
   const path = 'trashtest/trashme.txt';
-  const file = await myBucket.fastPutStrict({
+  const file = await myBucket.fastPut({
     path,
     contents: 'I\'m in the trash test content!',
   });
@@ -76,7 +76,7 @@ tap.test('should put a file into the trash', async () => {
 
 tap.test('should restore a file from trash', async () => {
   const baseDirectory = await myBucket.getBaseDirectory();
-  const file = await baseDirectory.getFileStrict({
+  const file = await baseDirectory.getFile({
     path: 'trashtest/trashme.txt',
     getFromTrash: true
   });
@@ -3,6 +3,6 @@
  */
 export const commitinfo = {
   name: '@push.rocks/smartbucket',
-  version: '3.3.9',
+  version: '4.0.0',
   description: 'A TypeScript library providing a cloud-agnostic interface for managing object storage with functionalities like bucket management, file and directory operations, and advanced features such as metadata handling and file locking.'
 }
@@ -14,7 +14,7 @@ import { Trash } from './classes.trash.js';
  * operate in S3 basic fashion on blobs of data.
  */
 export class Bucket {
-  public static async getBucketByName(smartbucketRef: SmartBucket, bucketNameArg: string) {
+  public static async getBucketByName(smartbucketRef: SmartBucket, bucketNameArg: string): Promise<Bucket> {
     const command = new plugins.s3.ListBucketsCommand({});
     const buckets = await smartbucketRef.s3Client.send(command);
     const foundBucket = buckets.Buckets!.find((bucket) => bucket.Name === bucketNameArg);
@@ -24,8 +24,7 @@ export class Bucket {
       console.log(`Taking this as base for new Bucket instance`);
       return new this(smartbucketRef, bucketNameArg);
     } else {
-      console.log(`did not find bucket by name: ${bucketNameArg}`);
-      return null;
+      throw new Error(`Bucket '${bucketNameArg}' not found.`);
     }
   }
 
@@ -71,7 +70,7 @@ export class Bucket {
     }
     const checkPath = await helpers.reducePathDescriptorToPath(pathDescriptorArg);
     const baseDirectory = await this.getBaseDirectory();
-    return await baseDirectory.getSubDirectoryByNameStrict(checkPath, {
+    return await baseDirectory.getSubDirectoryByName(checkPath, {
       getEmptyDirectory: true,
     });
   }
@@ -88,15 +87,16 @@ export class Bucket {
       contents: string | Buffer;
       overwrite?: boolean;
     }
-  ): Promise<File | null> {
+  ): Promise<File> {
     try {
       const reducedPath = await helpers.reducePathDescriptorToPath(optionsArg);
       const exists = await this.fastExists({ path: reducedPath });
 
       if (exists && !optionsArg.overwrite) {
-        const errorText = `Object already exists at path '${reducedPath}' in bucket '${this.name}'.`;
-        console.error(errorText);
-        return null;
+        throw new Error(
+          `Object already exists at path '${reducedPath}' in bucket '${this.name}'. ` +
+          `Set overwrite:true to replace it.`
+        );
       } else if (exists && optionsArg.overwrite) {
         console.log(
           `Overwriting existing object at path '${reducedPath}' in bucket '${this.name}'.`
@@ -129,13 +129,6 @@ export class Bucket {
     }
   }
 
-  public async fastPutStrict(...args: Parameters<Bucket['fastPut']>) {
-    const file = await this.fastPut(...args);
-    if (!file) {
-      throw new Error(`File not stored at path '${args[0].path}'`);
-    }
-    return file;
-  }
 
   /**
    * get file
@@ -259,10 +252,10 @@ export class Bucket {
       const exists = await this.fastExists({ path: optionsArg.path });
 
       if (exists && !optionsArg.overwrite) {
-        console.error(
-          `Object already exists at path '${optionsArg.path}' in bucket '${this.name}'.`
+        throw new Error(
+          `Object already exists at path '${optionsArg.path}' in bucket '${this.name}'. ` +
+          `Set overwrite:true to replace it.`
         );
-        return;
       } else if (exists && optionsArg.overwrite) {
         console.log(
           `Overwriting existing object at path '${optionsArg.path}' in bucket '${this.name}'.`
@@ -460,7 +453,7 @@ export class Bucket {
       Range: `bytes=0-${optionsArg.length - 1}`,
     });
     const response = await this.smartbucketRef.s3Client.send(command);
-    const chunks = [];
+    const chunks: Buffer[] = [];
     const stream = response.Body as any; // SdkStreamMixin includes readable stream
 
     for await (const chunk of stream) {
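With `fastPut` now throwing on conflicts instead of returning null, a caller that actually wants to replace an existing object can catch the conflict and retry with `overwrite: true`. A minimal sketch (the path and contents are placeholders; `bucket` is assumed to be an existing `Bucket` instance):

```typescript
// Sketch: reacting to the new 'already exists' error from fastPut (placeholder values).
try {
  await bucket.fastPut({ path: 'reports/latest.json', contents: '{}' });
} catch (error) {
  // v4 message: "Object already exists at path '...' in bucket '...'. Set overwrite:true to replace it."
  if ((error as Error).message.includes('already exists')) {
    // Deliberately replace the existing object:
    await bucket.fastPut({ path: 'reports/latest.json', contents: '{}', overwrite: true });
  } else {
    throw error;
  }
}
```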
@@ -69,7 +69,7 @@ export class Directory {
     path: string;
     createWithContents?: string | Buffer;
     getFromTrash?: boolean;
-  }): Promise<File | null> {
+  }): Promise<File> {
     const pathDescriptor = {
       directory: this,
       path: optionsArg.path,
@@ -83,7 +83,7 @@ export class Directory {
       return trashedFile;
     }
     if (!exists && !optionsArg.createWithContents) {
-      return null;
+      throw new Error(`File not found at path '${optionsArg.path}'`);
     }
     if (!exists && optionsArg.createWithContents) {
       await File.create({
@@ -98,17 +98,26 @@ export class Directory {
     });
   }
 
-  /**
-   * gets a file strictly
-   * @param args
-   * @returns
-   */
-  public async getFileStrict(...args: Parameters<Directory['getFile']>) {
-    const file = await this.getFile(...args);
-    if (!file) {
-      throw new Error(`File not found at path '${args[0].path}'`);
-    }
-    return file;
+  /**
+   * Check if a file exists in this directory
+   */
+  public async fileExists(optionsArg: { path: string }): Promise<boolean> {
+    const pathDescriptor = {
+      directory: this,
+      path: optionsArg.path,
+    };
+    return this.bucketRef.fastExists({
+      path: await helpers.reducePathDescriptorToPath(pathDescriptor),
+    });
+  }
+
+  /**
+   * Check if a subdirectory exists
+   */
+  public async directoryExists(dirNameArg: string): Promise<boolean> {
+    const directories = await this.listDirectories();
+    return directories.some(dir => dir.name === dirNameArg);
   }
 
   /**
@@ -206,7 +215,7 @@ export class Directory {
      * if the path is a file path, it will be treated as a file and the parent directory will be returned
      */
     couldBeFilePath?: boolean;
-  } = {}): Promise<Directory | null> {
+  } = {}): Promise<Directory> {
 
     const dirNameArray = dirNameArg.split('/').filter(str => str.trim() !== "");
 
@@ -253,16 +262,12 @@ export class Directory {
       wantedDirectory = await getDirectory(directoryToSearchIn, dirNameToSearch, counter === dirNameArray.length);
     }
 
-    return wantedDirectory || null;
+    if (!wantedDirectory) {
+      throw new Error(`Directory not found at path '${dirNameArg}'`);
+    }
+    return wantedDirectory;
   }
 
-  public async getSubDirectoryByNameStrict(...args: Parameters<Directory['getSubDirectoryByName']>) {
-    const directory = await this.getSubDirectoryByName(...args);
-    if (!directory) {
-      throw new Error(`Directory not found at path '${args[0]}'`);
-    }
-    return directory;
-  }
 
   /**
    * moves the directory
@@ -360,7 +365,7 @@ export class Directory {
      */
     mode?: 'permanent' | 'trash';
   }) {
-    const file = await this.getFileStrict({
+    const file = await this.getFile({
       path: optionsArg.path,
     });
     await file.delete({
@@ -245,7 +245,7 @@ export class File {
 
     // lets update references of this
     const baseDirectory = await this.parentDirectoryRef.bucketRef.getBaseDirectory();
-    this.parentDirectoryRef = await baseDirectory.getSubDirectoryByNameStrict(
+    this.parentDirectoryRef = await baseDirectory.getSubDirectoryByName(
       await helpers.reducePathDescriptorToPath(pathDescriptorArg),
       {
         couldBeFilePath: true,
@@ -17,7 +17,7 @@ export class MetaData {
     metaData.fileRef = optionsArg.file;
 
     // lets find the existing metadata file
-    metaData.metadataFile = await metaData.fileRef.parentDirectoryRef.getFileStrict({
+    metaData.metadataFile = await metaData.fileRef.parentDirectoryRef.getFile({
      path: metaData.fileRef.name + '.metadata',
      createWithContents: '{}',
    });
@@ -2,6 +2,7 @@
 
 import * as plugins from './plugins.js';
 import { Bucket } from './classes.bucket.js';
+import { normalizeS3Descriptor } from './helpers.js';
 
 export class SmartBucket {
   public config: plugins.tsclass.storage.IS3Descriptor;
@@ -17,18 +18,14 @@ export class SmartBucket {
   constructor(configArg: plugins.tsclass.storage.IS3Descriptor) {
     this.config = configArg;
 
-    const protocol = configArg.useSsl === false ? 'http' : 'https';
-    const port = configArg.port ? `:${configArg.port}` : '';
-    const endpoint = `${protocol}://${configArg.endpoint}${port}`;
+    // Use the normalizer to handle various endpoint formats
+    const { normalized } = normalizeS3Descriptor(configArg);
 
     this.s3Client = new plugins.s3.S3Client({
-      endpoint,
-      region: configArg.region || 'us-east-1',
-      credentials: {
-        accessKeyId: configArg.accessKey,
-        secretAccessKey: configArg.accessSecret,
-      },
-      forcePathStyle: true, // Necessary for S3-compatible storage like MinIO or Wasabi
+      endpoint: normalized.endpointUrl,
+      region: normalized.region,
+      credentials: normalized.credentials,
+      forcePathStyle: normalized.forcePathStyle, // Necessary for S3-compatible storage like MinIO or Wasabi
     });
   }
 
@@ -45,11 +42,12 @@ export class SmartBucket {
     return Bucket.getBucketByName(this, bucketNameArg);
   }
 
-  public async getBucketByNameStrict(...args: Parameters<SmartBucket['getBucketByName']>) {
-    const bucket = await this.getBucketByName(...args);
-    if (!bucket) {
-      throw new Error(`Bucket ${args[0]} does not exist.`);
-    }
-    return bucket;
+  /**
+   * Check if a bucket exists
+   */
+  public async bucketExists(bucketNameArg: string): Promise<boolean> {
+    const command = new plugins.s3.ListBucketsCommand({});
+    const buckets = await this.s3Client.send(command);
+    return buckets.Buckets?.some(bucket => bucket.Name === bucketNameArg) ?? false;
   }
 }
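Taken together with the helpers change further down, the constructor now accepts endpoint values in several formats. A rough usage sketch, not official documentation: the credentials, endpoint and bucket name are placeholders, and depending on your setup the descriptor may also need `port`, `useSsl` or `region`.

```typescript
import { SmartBucket } from '@push.rocks/smartbucket';

// Sketch: the constructor routes the descriptor through normalizeS3Descriptor,
// so a full URL endpoint is reduced to a canonical origin for the S3 client.
const smartBucket = new SmartBucket({
  accessKey: 'myAccessKey',       // placeholder
  accessSecret: 'myAccessSecret', // placeholder
  endpoint: 'https://minio.example.com:9000', // scheme and port are parsed out
});

// New in v4: non-throwing existence check before the strict getter.
if (await smartBucket.bucketExists('my-bucket')) {
  const bucket = await smartBucket.getBucketByName('my-bucket'); // throws if missing
}
```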
@@ -21,7 +21,7 @@ export class Trash {
     const trashDir = await this.getTrashDir();
     const originalPath = await helpers.reducePathDescriptorToPath(pathDescriptor);
     const trashKey = await this.getTrashKeyByOriginalBasePath(originalPath);
-    return trashDir.getFileStrict({ path: trashKey });
+    return trashDir.getFile({ path: trashKey });
   }
 
   public async getTrashKeyByOriginalBasePath (originalPath: string): Promise<string> {
ts/helpers.ts (232 changed lines)

@@ -20,3 +20,235 @@ export const reducePathDescriptorToPath = async (pathDescriptorArg: interfaces.I
   }
   return returnPath;
 }
+
+// S3 Descriptor Normalization
+export interface IS3Warning {
+  code: string;
+  message: string;
+}
+
+export interface INormalizedS3Config {
+  endpointUrl: string;
+  host: string;
+  protocol: 'http' | 'https';
+  port?: number;
+  region: string;
+  credentials: {
+    accessKeyId: string;
+    secretAccessKey: string;
+  };
+  forcePathStyle: boolean;
+}
+
+function coerceBooleanMaybe(value: unknown): { value: boolean | undefined; warning?: IS3Warning } {
+  if (typeof value === 'boolean') return { value };
+  if (typeof value === 'string') {
+    const v = value.trim().toLowerCase();
+    if (v === 'true' || v === '1') {
+      return {
+        value: true,
+        warning: {
+          code: 'SBK_S3_COERCED_USESSL',
+          message: `Coerced useSsl='${value}' (string) to boolean true.`
+        }
+      };
+    }
+    if (v === 'false' || v === '0') {
+      return {
+        value: false,
+        warning: {
+          code: 'SBK_S3_COERCED_USESSL',
+          message: `Coerced useSsl='${value}' (string) to boolean false.`
+        }
+      };
+    }
+  }
+  return { value: undefined };
+}
+
+function coercePortMaybe(port: unknown): { value: number | undefined; warning?: IS3Warning } {
+  if (port === undefined || port === null || port === '') return { value: undefined };
+  const n = typeof port === 'number' ? port : Number(String(port).trim());
+  if (!Number.isFinite(n) || !Number.isInteger(n) || n <= 0 || n > 65535) {
+    return {
+      value: undefined,
+      warning: {
+        code: 'SBK_S3_INVALID_PORT',
+        message: `Invalid port '${String(port)}' - expected integer in [1..65535].`
+      }
+    };
+  }
+  return { value: n };
+}
+
+function sanitizeEndpointString(raw: unknown): { value: string; warnings: IS3Warning[] } {
+  const warnings: IS3Warning[] = [];
+  let s = String(raw ?? '').trim();
+  if (s !== String(raw ?? '')) {
+    warnings.push({
+      code: 'SBK_S3_TRIMMED_ENDPOINT',
+      message: 'Trimmed surrounding whitespace from endpoint.'
+    });
+  }
+  return { value: s, warnings };
+}
+
+function parseEndpointHostPort(
+  endpoint: string,
+  provisionalProtocol: 'http' | 'https'
+): {
+  hadScheme: boolean;
+  host: string;
+  port?: number;
+  extras: {
+    droppedPath?: boolean;
+    droppedQuery?: boolean;
+    droppedCreds?: boolean
+  }
+} {
+  let url: URL | undefined;
+  const extras: { droppedPath?: boolean; droppedQuery?: boolean; droppedCreds?: boolean } = {};
+
+  // Check if endpoint already has a scheme
+  const hasScheme = /^https?:\/\//i.test(endpoint);
+
+  // Try parsing as full URL first
+  try {
+    if (hasScheme) {
+      url = new URL(endpoint);
+    } else {
+      // Not a full URL; try host[:port] by attaching provisional scheme
+      // Remove anything after first '/' for safety
+      const cleanEndpoint = endpoint.replace(/\/.*/, '');
+      url = new URL(`${provisionalProtocol}://${cleanEndpoint}`);
+    }
+  } catch (e) {
+    throw new Error(`Unable to parse endpoint '${endpoint}'.`);
+  }
+
+  // Check for dropped components
+  if (url.username || url.password) extras.droppedCreds = true;
+  if (url.pathname && url.pathname !== '/') extras.droppedPath = true;
+  if (url.search) extras.droppedQuery = true;
+
+  const hadScheme = hasScheme;
+  const host = url.hostname; // hostnames lowercased by URL; IPs preserved
+  const port = url.port ? Number(url.port) : undefined;
+
+  return { hadScheme, host, port, extras };
+}
+
+export function normalizeS3Descriptor(
+  input: plugins.tsclass.storage.IS3Descriptor,
+  logger?: { warn: (msg: string) => void }
+): { normalized: INormalizedS3Config; warnings: IS3Warning[] } {
+  const warnings: IS3Warning[] = [];
+  const logWarn = (w: IS3Warning) => {
+    warnings.push(w);
+    if (logger) {
+      logger.warn(`[SmartBucket S3] ${w.code}: ${w.message}`);
+    } else {
+      console.warn(`[SmartBucket S3] ${w.code}: ${w.message}`);
+    }
+  };
+
+  // Coerce and sanitize inputs
+  const { value: coercedUseSsl, warning: useSslWarn } = coerceBooleanMaybe((input as any).useSsl);
+  if (useSslWarn) logWarn(useSslWarn);
+
+  const { value: coercedPort, warning: portWarn } = coercePortMaybe((input as any).port);
+  if (portWarn) logWarn(portWarn);
+
+  const { value: endpointStr, warnings: endpointSanWarnings } = sanitizeEndpointString((input as any).endpoint);
+  endpointSanWarnings.forEach(logWarn);
+
+  if (!endpointStr) {
+    throw new Error('S3 endpoint is required (got empty string). Provide hostname or URL.');
+  }
+
+  // Provisional protocol selection for parsing host:port forms
+  const provisionalProtocol: 'http' | 'https' = coercedUseSsl === false ? 'http' : 'https';
+
+  const { hadScheme, host, port: epPort, extras } = parseEndpointHostPort(endpointStr, provisionalProtocol);
+
+  if (extras.droppedCreds) {
+    logWarn({
+      code: 'SBK_S3_DROPPED_CREDENTIALS',
+      message: 'Ignored credentials in endpoint URL.'
+    });
+  }
+  if (extras.droppedPath) {
+    logWarn({
+      code: 'SBK_S3_DROPPED_PATH',
+      message: 'Removed path segment from endpoint URL; S3 endpoint should be host[:port] only.'
+    });
+  }
+  if (extras.droppedQuery) {
+    logWarn({
+      code: 'SBK_S3_DROPPED_QUERY',
+      message: 'Removed query string from endpoint URL; S3 endpoint should be host[:port] only.'
+    });
+  }
+
+  // Final protocol decision
+  let finalProtocol: 'http' | 'https';
+  if (hadScheme) {
+    // Scheme from endpoint wins
+    const schemeFromEndpoint = endpointStr.trim().toLowerCase().startsWith('http://') ? 'http' : 'https';
+    finalProtocol = schemeFromEndpoint;
+    if (typeof coercedUseSsl === 'boolean') {
+      const expected = coercedUseSsl ? 'https' : 'http';
+      if (expected !== finalProtocol) {
+        logWarn({
+          code: 'SBK_S3_SCHEME_CONFLICT',
+          message: `useSsl=${String(coercedUseSsl)} conflicts with endpoint scheme '${finalProtocol}'; using endpoint scheme.`
+        });
+      }
+    }
+  } else {
+    if (typeof coercedUseSsl === 'boolean') {
+      finalProtocol = coercedUseSsl ? 'https' : 'http';
+    } else {
+      finalProtocol = 'https';
+      logWarn({
+        code: 'SBK_S3_GUESSED_PROTOCOL',
+        message: "No scheme in endpoint and useSsl not provided; defaulting to 'https'."
+      });
+    }
+  }
+
+  // Final port decision
+  let finalPort: number | undefined = undefined;
+  if (coercedPort !== undefined && epPort !== undefined && coercedPort !== epPort) {
+    logWarn({
+      code: 'SBK_S3_PORT_CONFLICT',
+      message: `Port in config (${coercedPort}) conflicts with endpoint port (${epPort}); using config port.`
+    });
+    finalPort = coercedPort;
+  } else {
+    finalPort = (coercedPort !== undefined) ? coercedPort : epPort;
+  }
+
+  // Build canonical endpoint URL (origin only, no trailing slash)
+  const url = new URL(`${finalProtocol}://${host}`);
+  if (finalPort !== undefined) url.port = String(finalPort);
+  const endpointUrl = url.origin;
+
+  const region = input.region || 'us-east-1';
+
+  return {
+    normalized: {
+      endpointUrl,
+      host,
+      protocol: finalProtocol,
+      port: finalPort,
+      region,
+      credentials: {
+        accessKeyId: input.accessKey,
+        secretAccessKey: input.accessSecret,
+      },
+      forcePathStyle: true,
+    },
+    warnings,
+  };
+}
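For orientation, here is a rough sketch of what the new normalizer produces for a slightly messy descriptor. The credentials and endpoint are placeholders, and the import path assumes code living next to ts/helpers.ts inside this package.

```typescript
import { normalizeS3Descriptor } from './helpers.js';

// Sketch: run a messy descriptor through the normalizer and inspect the outcome.
const { normalized, warnings } = normalizeS3Descriptor({
  accessKey: 'key',       // placeholder
  accessSecret: 'secret', // placeholder
  endpoint: ' http://localhost:9000/some/path ', // whitespace and path get sanitized away
} as any);

console.log(normalized.endpointUrl);    // 'http://localhost:9000'
console.log(normalized.protocol);       // 'http' (scheme in the endpoint wins over useSsl)
console.log(normalized.forcePathStyle); // true
console.log(warnings.map((w) => w.code)); // ['SBK_S3_TRIMMED_ENDPOINT', 'SBK_S3_DROPPED_PATH']
```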