fix(docs): Revise README with detailed usage examples and add local Claude settings
@@ -1,5 +1,14 @@
 # Changelog
 
+## 2025-08-15 - 3.3.9 - fix(docs)
+Revise README with detailed usage examples and add local Claude settings
+
+- Revamped README: reorganized content, added emojis and clearer headings for install, getting started, bucket/file/directory operations, streaming, metadata, trash/recovery, locking, and advanced configuration.
+- Added many concrete code examples for SmartBucket, Bucket, Directory, File, streaming (node/web), RxJS replay subjects, metadata handling, trash workflow, file locking, magic-bytes detection, JSON operations, and cleaning bucket contents.
+- Included testing instructions (pnpm test) and a Best Practices section with recommendations for strict mode, streaming, metadata, trash usage, and locking.
+- Added .claude/settings.local.json to include local Claude configuration and tool permissions.
+- No source code or public API changes; documentation and local tooling config only.
+
 ## 2025-08-15 - 3.3.8 - fix(tests)
 Update tests to use @git.zone/tstest, upgrade dependencies, remove GitLab CI and add local CI/workspace config

readme.md (553 changed lines)
@@ -1,280 +1,457 @@
-```markdown
-# @push.rocks/smartbucket
+# @push.rocks/smartbucket 🪣
 
-A comprehensive TypeScript library for cloud-agnostic object storage offering bucket management, file operations, and advanced data streaming.
+> A powerful, cloud-agnostic TypeScript library for object storage with advanced features like file locking, metadata management, and intelligent trash handling.
 
-## Install
+## Install 📦
 
-To install `@push.rocks/smartbucket`, ensure you have Node.js and npm installed. Then, run the following command in your project directory:
+To install `@push.rocks/smartbucket`, run:
 
 ```bash
 npm install @push.rocks/smartbucket --save
 ```
 
-This command will add `@push.rocks/smartbucket` to your project's dependencies and install it along with its requirements in the `node_modules` directory.
-
-## Usage
+Or if you're using pnpm (recommended):
+
+```bash
+pnpm add @push.rocks/smartbucket
+```
+
+## Usage 🚀
 
 ### Introduction
 
-`@push.rocks/smartbucket` provides a robust set of features to manage cloud storage operations in a cloud-agnostic manner. By leveraging this library, you can seamlessly interact with object storage services like AWS S3, without being tied to any vendor-specific implementations. This library not only abstracts basic file operations but also integrates advanced capabilities such as metadata management, data streaming, file locking, and bucket policies, all through a simplified API.
+`@push.rocks/smartbucket` provides a unified, cloud-agnostic API for object storage operations across major providers like AWS S3, Google Cloud Storage, MinIO, and more. It abstracts away provider-specific complexities while offering advanced features like metadata management, file locking, streaming operations, and intelligent trash management.
 
 ### Table of Contents
 
-1. [Setting Up](#setting-up)
-2. [Working with Buckets](#working-with-buckets)
-   - [Creating a New Bucket](#creating-a-new-bucket)
-   - [Listing Buckets](#listing-buckets)
-   - [Deleting Buckets](#deleting-buckets)
-3. [File Operations in Buckets](#file-operations-in-buckets)
-   - [Uploading Files](#uploading-files)
-   - [Downloading Files](#downloading-files)
-   - [Streaming Files](#streaming-files)
-   - [Deleting Files](#deleting-files)
-4. [Directory Operations](#directory-operations)
-   - [Listing Directories and Files](#listing-directories-and-files)
-   - [Managing Files in Directories](#managing-files-in-directories)
-5. [Advanced Features](#advanced-features)
-   - [Bucket Policies](#bucket-policies)
-   - [Metadata Management](#metadata-management)
-   - [File Locking](#file-locking)
-   - [Trash Management](#trash-management)
-6. [Cloud Agnosticism](#cloud-agnosticism)
+1. [🏁 Getting Started](#-getting-started)
+2. [🗂️ Working with Buckets](#️-working-with-buckets)
+3. [📁 File Operations](#-file-operations)
+4. [📂 Directory Management](#-directory-management)
+5. [🌊 Streaming Operations](#-streaming-operations)
+6. [🔒 File Locking](#-file-locking)
+7. [🏷️ Metadata Management](#️-metadata-management)
+8. [🗑️ Trash & Recovery](#️-trash--recovery)
+9. [⚡ Advanced Features](#-advanced-features)
+10. [☁️ Cloud Provider Support](#️-cloud-provider-support)
 
-### Setting Up
+### 🏁 Getting Started
 
-Begin by importing the necessary classes from the `@push.rocks/smartbucket` package into your TypeScript file. Create an instance of `SmartBucket` with your storage configuration:
+First, set up your storage connection:
 
 ```typescript
-import {
-  SmartBucket,
-  Bucket,
-  Directory,
-  File
-} from '@push.rocks/smartbucket';
+import { SmartBucket } from '@push.rocks/smartbucket';
 
-const mySmartBucket = new SmartBucket({
-  accessKey: "yourAccessKey",
-  accessSecret: "yourSecretKey",
-  endpoint: "yourEndpointURL",
+// Initialize with your cloud storage credentials
+const smartBucket = new SmartBucket({
+  accessKey: 'your-access-key',
+  accessSecret: 'your-secret-key',
+  endpoint: 's3.amazonaws.com', // Or your provider's endpoint
   port: 443,
-  useSsl: true
+  useSsl: true,
+  region: 'us-east-1' // Optional, defaults to 'us-east-1'
 });
 ```
 
-Replace `"yourAccessKey"`, `"yourSecretKey"`, and `"yourEndpointURL"` with actual data specific to your cloud provider.
-
-### Working with Buckets
-
-#### Creating a New Bucket
-
-Creating a bucket involves invoking the `createBucket` method. Note that bucket names are unique and follow the rules of the cloud provider:
+### 🗂️ Working with Buckets
+
+#### Creating Buckets
 
 ```typescript
-async function createBucket(bucketName: string) {
-  try {
-    const newBucket: Bucket = await mySmartBucket.createBucket(bucketName);
-    console.log(`Bucket ${bucketName} created successfully.`);
-  } catch (error) {
-    console.error("Error creating bucket:", error);
+// Create a new bucket
+const myBucket = await smartBucket.createBucket('my-awesome-bucket');
+console.log(`✅ Bucket created: ${myBucket.name}`);
+```
+
+#### Getting Existing Buckets
+
+```typescript
+// Get a bucket reference
+const existingBucket = await smartBucket.getBucketByName('existing-bucket');
+
+// Or use strict mode (throws if bucket doesn't exist)
+const bucketStrict = await smartBucket.getBucketByNameStrict('must-exist-bucket');
+```
+
+#### Removing Buckets
+
+```typescript
+// Delete a bucket (must be empty)
+await smartBucket.removeBucket('old-bucket');
+console.log('🗑️ Bucket removed');
+```
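If `removeBucket` is called on a bucket that still contains objects, S3-compatible backends will normally refuse the operation. A minimal sketch of emptying a bucket first, assuming `cleanAllContents()` behaves as shown in the "Clean Bucket Contents" section further below:

```typescript
// Sketch: empty a bucket before removing it.
const staleBucket = await smartBucket.getBucketByName('old-bucket');
await staleBucket.cleanAllContents(); // delete every file and directory first
await smartBucket.removeBucket('old-bucket');
```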
+
+### 📁 File Operations
+
+#### Upload Files
+
+```typescript
+const bucket = await smartBucket.getBucketByName('my-bucket');
+
+// Simple file upload
+await bucket.fastPut({
+  path: 'documents/report.pdf',
+  contents: Buffer.from('Your file content here')
+});
+
+// Upload with string content
+await bucket.fastPut({
+  path: 'notes/todo.txt',
+  contents: 'Buy milk\nCall mom\nRule the world'
+});
+
+// Strict upload (returns File object)
+const uploadedFile = await bucket.fastPutStrict({
+  path: 'images/logo.png',
+  contents: imageBuffer,
+  overwrite: true // Optional: control overwrite behavior
+});
+```
+
+#### Download Files
+
+```typescript
+// Get file as Buffer
+const fileContent = await bucket.fastGet({
+  path: 'documents/report.pdf'
+});
+console.log(`📄 File size: ${fileContent.length} bytes`);
+
+// Get file as string
+const textContent = fileContent.toString('utf-8');
+```
+
+#### Check File Existence
+
+```typescript
+const exists = await bucket.fastExists({
+  path: 'documents/report.pdf'
+});
+console.log(`File exists: ${exists ? '✅' : '❌'}`);
+```
+
+#### Delete Files
+
+```typescript
+// Permanent deletion
+await bucket.fastRemove({
+  path: 'old-file.txt'
+});
+```
+
+#### Copy & Move Files
+
+```typescript
+// Copy file within bucket
+await bucket.fastCopy({
+  sourcePath: 'original/file.txt',
+  destinationPath: 'backup/file-copy.txt'
+});
+
+// Move file (copy + delete original)
+await bucket.fastMove({
+  sourcePath: 'temp/draft.txt',
+  destinationPath: 'final/document.txt'
+});
+```
+
+### 📂 Directory Management
+
+SmartBucket provides powerful directory-like operations for organizing your files:
+
+```typescript
+// Get base directory
+const baseDir = await bucket.getBaseDirectory();
+
+// List directories and files
+const directories = await baseDir.listDirectories();
+const files = await baseDir.listFiles();
+
+console.log(`📁 Found ${directories.length} directories`);
+console.log(`📄 Found ${files.length} files`);
+
+// Navigate subdirectories
+const subDir = await baseDir.getSubDirectoryByName('projects/2024');
+
+// Create nested file
+await subDir.fastPut({
+  path: 'report.pdf',
+  contents: reportBuffer
+});
+
+// Get directory tree structure
+const tree = await subDir.getTreeArray();
+console.log('🌳 Directory tree:', tree);
+
+// Create empty file as placeholder
+await subDir.createEmptyFile('placeholder.txt');
+```
+
+### 🌊 Streaming Operations
+
+Handle large files efficiently with streaming:
+
+#### Download Streams
+
+```typescript
+// Node.js stream
+const nodeStream = await bucket.fastGetStream(
+  { path: 'large-video.mp4' },
+  'nodestream'
+);
+nodeStream.pipe(fs.createWriteStream('local-video.mp4'));
+
+// Web stream (for modern environments)
+const webStream = await bucket.fastGetStream(
+  { path: 'large-file.zip' },
+  'webstream'
+);
+```
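The Node stream above is piped straight to disk; the web stream can be consumed with the standard Web Streams reader API. A minimal sketch, assuming the returned stream is a `ReadableStream<Uint8Array>`:

```typescript
// Consume the web stream chunk by chunk via the standard reader API.
const reader = webStream.getReader();
let received = 0;
while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  received += value.length; // value is assumed to be a Uint8Array chunk
}
console.log(`Downloaded ${received} bytes`);
```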
+
+#### Upload Streams
+
+```typescript
+// Stream upload from file
+const readStream = fs.createReadStream('big-data.csv');
+await bucket.fastPutStream({
+  path: 'uploads/big-data.csv',
+  stream: readStream,
+  metadata: {
+    contentType: 'text/csv',
+    userMetadata: {
+      uploadedBy: 'data-team',
+      version: '2.0'
+    }
+  }
-  }
-}
-
-createBucket("myNewBucket");
+});
 ```
 
-#### Listing Buckets
-
-While the library uses cloud-provider capabilities like AWS SDK to list existing buckets, `smartbucket` is aimed at simplifying content management within them.
-
-#### Deleting Buckets
-
-To delete a bucket, simply call the `removeBucket` function:
+#### Reactive Streams with RxJS
 
 ```typescript
-async function deleteBucket(bucketName: string) {
-  try {
-    await mySmartBucket.removeBucket(bucketName);
-    console.log(`Bucket ${bucketName} deleted successfully.`);
-  } catch (error) {
-    console.error("Error deleting bucket:", error);
-  }
-}
-
-deleteBucket("anotherBucketName");
+// Get file as ReplaySubject for reactive programming
+const replaySubject = await bucket.fastGetReplaySubject({
+  path: 'data/sensor-readings.json',
+  chunkSize: 1024
+});
+
+replaySubject.subscribe({
+  next: (chunk) => processChunk(chunk),
+  complete: () => console.log('✅ Stream complete')
+});
 ```
 
-### File Operations in Buckets
-
-SmartBucket offers a unified API to execute file-based operations efficiently.
-
-#### Uploading Files
-
-Upload a file using the `fastPut` method, specifying the bucket name, file path, and content:
+### 🔒 File Locking
+
+Prevent accidental modifications with file locking:
 
 ```typescript
-async function uploadFile(bucketName: string, filePath: string, fileContent: Buffer | string) {
-  const bucket: Bucket = await mySmartBucket.getBucketByName(bucketName);
-  await bucket.fastPut({ path: filePath, contents: fileContent });
-  console.log(`File uploaded to ${filePath}`);
-}
-
-uploadFile("myBucket", "example.txt", "This is a sample file content.");
+const file = await bucket.getBaseDirectory()
+  .getFileStrict({ path: 'important-config.json' });
+
+// Lock file for 10 minutes
+await file.lock({ timeoutMillis: 600000 });
+console.log('🔒 File locked');
+
+// Try to modify locked file (will throw error)
+try {
+  await file.delete();
+} catch (error) {
+  console.log('❌ Cannot delete locked file');
+}
+
+// Unlock when done
+await file.unlock();
+console.log('🔓 File unlocked');
 ```
 
-#### Downloading Files
-
-Download files using `fastGet`. It retrieves the file content as a buffer:
+### 🏷️ Metadata Management
+
+Attach and manage metadata for your files:
 
 ```typescript
-async function downloadFile(bucketName: string, filePath: string) {
-  const bucket: Bucket = await mySmartBucket.getBucketByName(bucketName);
-  const content: Buffer = await bucket.fastGet({ path: filePath });
-  console.log("Downloaded content:", content.toString());
-}
-
-downloadFile("myBucket", "example.txt");
+const file = await bucket.getBaseDirectory()
+  .getFileStrict({ path: 'document.pdf' });
+
+// Get metadata handler
+const metadata = await file.getMetaData();
+
+// Set custom metadata
+await metadata.setCustomMetaData({
+  key: 'author',
+  value: 'John Doe'
+});
+
+await metadata.setCustomMetaData({
+  key: 'department',
+  value: 'Engineering'
+});
+
+// Retrieve metadata
+const author = await metadata.getCustomMetaData({ key: 'author' });
+console.log(`📝 Author: ${author}`);
+
+// Get all metadata
+const allMeta = await metadata.getAllCustomMetaData();
+console.log('📋 All metadata:', allMeta);
 ```
 
-#### Streaming Files
-
-For large-scale applications, stream files without loading them fully into memory:
+### 🗑️ Trash & Recovery
+
+SmartBucket includes an intelligent trash system for safe file deletion:
 
 ```typescript
-async function streamFile(bucketName: string, filePath: string) {
-  const bucket: Bucket = await mySmartBucket.getBucketByName(bucketName);
-  const stream = await bucket.fastGetStream({ path: filePath }, "nodestream");
-  stream.on('data', chunk => console.log("Chunk:", chunk.toString()));
-  stream.on('end', () => console.log("Download completed."));
-}
-
-streamFile("myBucket", "largefile.txt");
+const file = await bucket.getBaseDirectory()
+  .getFileStrict({ path: 'important-data.xlsx' });
+
+// Move to trash instead of permanent deletion
+await file.delete({ mode: 'trash' });
+console.log('🗑️ File moved to trash');
+
+// Access trash
+const trash = await bucket.getTrash();
+const trashDir = await trash.getTrashDir();
+const trashedFiles = await trashDir.listFiles();
+console.log(`📦 ${trashedFiles.length} files in trash`);
+
+// Restore from trash
+const trashedFile = await bucket.getBaseDirectory()
+  .getFileStrict({
+    path: 'important-data.xlsx',
+    getFromTrash: true
+  });
+
+await trashedFile.restore({ useOriginalPath: true });
+console.log('♻️ File restored successfully');
+
+// Permanent deletion from trash
+await trash.emptyTrash();
+console.log('🧹 Trash emptied');
 ```
 
-#### Deleting Files
-
-Delete files with precision using `fastRemove`:
+### ⚡ Advanced Features
+
+#### File Statistics
 
 ```typescript
-async function deleteFile(bucketName: string, filePath: string) {
-  const bucket: Bucket = await mySmartBucket.getBucketByName(bucketName);
-  await bucket.fastRemove({ path: filePath });
-  console.log(`File ${filePath} deleted.`);
-}
-
-deleteFile("myBucket", "example.txt");
+// Get detailed file statistics
+const stats = await bucket.fastStat({ path: 'document.pdf' });
+console.log(`📊 Size: ${stats.size} bytes`);
+console.log(`📅 Last modified: ${stats.lastModified}`);
+console.log(`🏷️ ETag: ${stats.etag}`);
 ```
 
-### Directory Operations
-
-Leverage directory functionalities to better organize and manage files within buckets.
-
-#### Listing Directories and Files
-
-Listing contents showcases a directory’s structure and file contents:
+#### Magic Bytes Detection
 
 ```typescript
-async function listDirectory(bucketName: string, directoryPath: string) {
-  const bucket: Bucket = await mySmartBucket.getBucketByName(bucketName);
-  const baseDirectory: Directory = await bucket.getBaseDirectory();
-  const targetDirectory = await baseDirectory.getSubDirectoryByName(directoryPath);
-
-  console.log('Directories:');
-  (await targetDirectory.listDirectories()).forEach(dir => console.log(dir.name));
-
-  console.log('Files:');
-  (await targetDirectory.listFiles()).forEach(file => console.log(file.name));
-}
-
-listDirectory("myBucket", "path/to/directory");
+// Read first bytes for file type detection
+const magicBytes = await bucket.getMagicBytes({
+  path: 'mystery-file',
+  length: 16
+});
+
+// Or from a File object
+const file = await bucket.getBaseDirectory()
+  .getFileStrict({ path: 'image.jpg' });
+const magic = await file.getMagicBytes({ length: 4 });
+console.log(`🔮 Magic bytes: ${magic.toString('hex')}`);
 ```
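The raw bytes still have to be matched against known signatures. A small illustrative check (the PNG and JPEG signature constants are standard; the surrounding logic is just a sketch):

```typescript
// Compare the leading bytes against well-known file signatures.
const hex = magic.toString('hex');
if (hex.startsWith('89504e47')) {
  console.log('🖼️ Looks like a PNG');
} else if (hex.startsWith('ffd8ff')) {
  console.log('📷 Looks like a JPEG');
} else {
  console.log('❓ Unknown file type');
}
```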
 
-#### Managing Files in Directories
-
-Additional functionalities allow file management, inclusive of handling sub-directories:
+#### JSON Data Operations
 
 ```typescript
-async function manageFilesInDirectory(bucketName: string, directoryPath: string, fileName: string, content: string) {
-  const bucket: Bucket = await mySmartBucket.getBucketByName(bucketName);
-  const baseDirectory: Directory = await bucket.getBaseDirectory();
-  const directory = await baseDirectory.getSubDirectoryByName(directoryPath) ?? baseDirectory;
-
-  await directory.fastPut({ path: fileName, contents: content });
-  console.log(`File ${fileName} created in ${directoryPath}`);
-
-  const fileContent = await directory.fastGet({ path: fileName });
-  console.log(`Content of ${fileName}: ${fileContent.toString()}`);
-}
-
-manageFilesInDirectory("myBucket", "myDir", "example.txt", "File content here");
+const file = await bucket.getBaseDirectory()
+  .getFileStrict({ path: 'config.json' });
+
+// Read JSON data
+const config = await file.getJsonData();
+console.log('⚙️ Config loaded:', config);
+
+// Update JSON data
+config.version = '2.0';
+config.updated = new Date().toISOString();
+await file.writeJsonData(config);
+console.log('💾 Config updated');
 ```
 
-### Advanced Features
-
-The library’s advanced features streamline intricate cloud storage workflows.
-
-#### Bucket Policies
-
-The library offers tools for maintaining consistent bucket policies across storage providers, assisting in defining access roles and permissions.
-
-#### Metadata Management
-
-Easily manage and store metadata by using the `MetaData` utility:
+#### Directory & File Type Detection
 
 ```typescript
-async function handleMetadata(bucketName: string, filePath: string) {
-  const bucket: Bucket = await mySmartBucket.getBucketByName(bucketName);
-  const meta = await bucket.fastStat({ path: filePath });
-  console.log("Metadata:", meta.Metadata);
-}
-
-handleMetadata("myBucket", "example.txt");
+// Check if path is a directory
+const isDir = await bucket.isDirectory({ path: 'uploads/' });
+
+// Check if path is a file
+const isFile = await bucket.isFile({ path: 'uploads/document.pdf' });
+
+console.log(`Is directory: ${isDir ? '📁' : '❌'}`);
+console.log(`Is file: ${isFile ? '📄' : '❌'}`);
 ```
 
-#### File Locking
-
-Prevent accidental writes by locking files:
+#### Clean Bucket Contents
 
 ```typescript
-async function lockFile(bucketName: string, filePath: string) {
-  const bucket: Bucket = await mySmartBucket.getBucketByName(bucketName);
-  const file: File = await bucket.getBaseDirectory().getFileStrict({ path: filePath });
-  await file.lock({ timeoutMillis: 600000 }); // Lock for 10 minutes
-  console.log(`File ${filePath} locked.`);
-}
-
-lockFile("myBucket", "example.txt");
+// Remove all files and directories (use with caution!)
+await bucket.cleanAllContents();
+console.log('🧹 Bucket cleaned');
 ```
 
-#### Trash Management
-
-SmartBucket enables a safe deletion mode where files can be moved to a recycling bin, allowing for restoration:
+### ☁️ Cloud Provider Support
+
+SmartBucket works seamlessly with:
+
+- ✅ **AWS S3** - Full compatibility with S3 API
+- ✅ **Google Cloud Storage** - Via S3-compatible API
+- ✅ **MinIO** - Self-hosted S3-compatible storage
+- ✅ **DigitalOcean Spaces** - S3-compatible object storage
+- ✅ **Backblaze B2** - Cost-effective cloud storage
+- ✅ **Wasabi** - High-performance S3-compatible storage
+- ✅ **Any S3-compatible provider**
+
+The library automatically handles provider quirks and optimizes operations for each platform while maintaining a consistent API.
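In practice only the connection options differ between providers. A hypothetical local MinIO setup, for example (the endpoint, port, and credentials below are placeholders; the option names mirror the configuration shown in Getting Started):

```typescript
import { SmartBucket } from '@push.rocks/smartbucket';

// Hypothetical local MinIO instance on its default port, without TLS.
const localSmartBucket = new SmartBucket({
  accessKey: 'minioadmin',
  accessSecret: 'minioadmin',
  endpoint: 'localhost',
  port: 9000,
  useSsl: false,
});
```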
 
+### 🔧 Advanced Configuration
 
 ```typescript
-async function trashAndRestoreFile(bucketName: string, filePath: string) {
-  const bucket: Bucket = await mySmartBucket.getBucketByName(bucketName);
-  const file: File = await bucket.getBaseDirectory().getFileStrict({ path: filePath });
-
-  // Move the file to trash
-  await file.delete({ mode: 'trash' });
-  console.log(`File ${filePath} moved to trash.`);
-
-  // Retrieve the file from the trash
-  const trashFile = await bucket.getTrash().getTrashedFileByOriginalName({ path: filePath });
-  await trashFile.restore();
-  console.log(`File ${filePath} restored from trash.`);
-}
-
-trashAndRestoreFile("myBucket", "example.txt");
+// Configure with custom options
+const smartBucket = new SmartBucket({
+  accessKey: process.env.S3_ACCESS_KEY,
+  accessSecret: process.env.S3_SECRET_KEY,
+  endpoint: process.env.S3_ENDPOINT,
+  port: 443,
+  useSsl: true,
+  region: 'eu-central-1',
+  // Additional S3 client options can be passed through
+});
+
+// Environment-based configuration
+import { Qenv } from '@push.rocks/qenv';
+const qenv = new Qenv('./', './.nogit/');
+
+const smartBucket = new SmartBucket({
+  accessKey: await qenv.getEnvVarOnDemandStrict('S3_ACCESS_KEY'),
+  accessSecret: await qenv.getEnvVarOnDemandStrict('S3_SECRET'),
+  endpoint: await qenv.getEnvVarOnDemandStrict('S3_ENDPOINT'),
+});
 ```
 
-### Cloud Agnosticism
-
-`@push.rocks/smartbucket` supports a multitude of cloud providers, enhancing flexibility in adopting different cloud strategies without the need for extensive code rewrite. It offers a uniform interface allowing to perform operations seamlessly between different storage solutions such as AWS S3, Google Cloud Storage, and more. This aspect empowers organizations to align their storage decisions with business needs rather than technical constraints.
-
-By following this guide, you should be well-equipped to handle cloud storage operations using the `@push.rocks/smartbucket` library. Diligently constructed code examples elucidate the extensive functionalities offered by the library, aligned with best practices in cloud storage. For a deeper dive into any specific feature, refer to the comprehensive documentation provided with the library and the official documentation of the cloud providers you are integrating with.
+### 🧪 Testing
+
+SmartBucket is thoroughly tested. Run tests with:
+
+```bash
+pnpm test
+```
+
+### 🤝 Best Practices
+
+1. **Always use strict mode** for critical operations to catch errors early
+2. **Implement proper error handling** for network and permission issues
+3. **Use streaming** for large files to optimize memory usage
+4. **Leverage metadata** for organizing and searching files
+5. **Enable trash mode** for important data to prevent accidental loss
+6. **Lock files** during critical operations to prevent race conditions
+7. **Clean up resources** properly when done
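Several of these practices can be combined in ordinary request handling. A minimal, non-authoritative sketch using only calls shown earlier in this README (the bucket name, path, and timeout are examples):

```typescript
try {
  // Strict lookups fail fast if the bucket or file is missing.
  const bucket = await smartBucket.getBucketByNameStrict('critical-data');
  const file = await bucket.getBaseDirectory()
    .getFileStrict({ path: 'reports/q3.xlsx' });

  // Guard the critical section against concurrent writers.
  await file.lock({ timeoutMillis: 60000 });
  // ... perform updates here ...
  await file.unlock();

  // Prefer recoverable deletion over permanent removal.
  await file.delete({ mode: 'trash' });
} catch (error) {
  console.error('Storage operation failed:', error);
}
```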
 
 ## License and Legal Information
 
 This repository contains open-source code that is licensed under the MIT License. A copy of the MIT License can be found in the [license](license) file within this repository.
 
@@ -292,4 +469,4 @@ Registered at District court Bremen HRB 35230 HB, Germany
 
 For any legal inquiries or if you require further information, please contact us via email at hello@task.vc.
 
-By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.
+By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.

@@ -3,6 +3,6 @@
  */
 export const commitinfo = {
   name: '@push.rocks/smartbucket',
-  version: '3.3.8',
+  version: '3.3.9',
   description: 'A TypeScript library providing a cloud-agnostic interface for managing object storage with functionalities like bucket management, file and directory operations, and advanced features such as metadata handling and file locking.'
 }