fix(ziptools,gziptools): Use fflate synchronous APIs for ZIP and GZIP operations for Deno compatibility; add TEntryFilter type and small docs/tests cleanup
# @push.rocks/smartarchive 📦

A powerful, streaming-first archive manipulation library with a fluent builder API. Works seamlessly in Node.js and Deno.

## Issue Reporting and Security

For reporting bugs, issues, or security vulnerabilities, please visit [community.foss.global/](https://community.foss.global/). This is the central community hub for all issue reporting.

## Features 🚀

- 📁 **Multi-format support** – Handle `.zip`, `.tar`, `.tar.gz`, `.tgz`, `.gz`, and `.bz2` archives
- 🌊 **Streaming-first architecture** – Process large archives without memory constraints
- 🔄 **Unified API** – Consistent interface across different archive formats
- ✨ **Fluent builder API** – Chain methods for readable, expressive code
- 🎯 **Smart detection** – Automatically identifies archive types via magic bytes
- ⚡ **High performance** – Built on `tar-stream` and `fflate` for speed
- 🔧 **Flexible I/O** – Work with files, URLs, streams, and buffers seamlessly
- 🛠️ **Modern TypeScript** – Full type safety and excellent IDE support
- 🔄 **Dual-mode operation** – Extract existing archives OR create new ones
- 🦕 **Cross-runtime** – Works in both Node.js and Deno environments

## Installation 📥

```bash
yarn add @push.rocks/smartarchive
```

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

// Extract a .tar.gz archive from a URL directly to the filesystem
await SmartArchive.create()
  .url('https://registry.npmjs.org/some-package/-/some-package-1.0.0.tgz')
  .extract('./extracted');
```

### Create an archive from entries

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

// Create a tar.gz archive with files
await SmartArchive.create()
  .format('tar.gz')
  .compression(6)
  .entry('config.json', JSON.stringify({ name: 'myapp' }))
  .entry('readme.txt', 'Hello World!')
  .toFile('./backup.tar.gz');
```

### Extract with filtering and path manipulation

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

// Extract only JSON files, stripping the first path component
await SmartArchive.create()
  .url('https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz')
  .stripComponents(1) // Remove 'package/' prefix
  .include(/\.json$/) // Only extract JSON files
  .extract('./node_modules/lodash');
```

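Conceptually, `stripComponents(n)` drops the first *n* path segments of each entry and skips entries with nothing left. A standalone sketch of that semantics (illustrative only, not the library's actual implementation):

```typescript
// Illustrative sketch of stripComponents semantics -- not the library's code.
// Drops the first `n` path segments; returns null when nothing remains,
// meaning the entry would be skipped entirely.
function stripComponents(entryPath: string, n: number): string | null {
  const segments = entryPath.split('/').filter((s) => s.length > 0);
  if (segments.length <= n) {
    return null; // entry is consumed entirely by the stripped prefix
  }
  return segments.slice(n).join('/');
}

// 'package/package.json' becomes 'package.json' with stripComponents(1)
console.log(stripComponents('package/package.json', 1)); // 'package.json'
```

This is why `stripComponents(1)` above turns npm's `package/` prefix into clean top-level paths.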
## Core Concepts 💡

### Fluent Builder Pattern

`SmartArchive` uses a fluent builder pattern where you chain methods to configure the operation:

```typescript
SmartArchive.create() // Start a new builder
  .source(...)        // Configure source (extraction mode)
  .options(...)       // Set options
  .terminal()         // Execute the operation
```

### Two Operating Modes

**Extraction Mode** - Load an existing archive and extract/analyze it:

```typescript
SmartArchive.create()
  .url('...')       // or .file(), .stream(), .buffer()
  .extract('./out') // or .toSmartFiles(), .list(), etc.
```

**Creation Mode** - Build a new archive from entries:

```typescript
SmartArchive.create()
  .format('tar.gz')       // Set output format
  .entry(...)             // Add files
  .toFile('./out.tar.gz') // or .toBuffer(), .toStream()
```

> ⚠️ **Note:** You cannot mix extraction and creation methods in the same chain.

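The chaining and mode-exclusivity rule can be sketched with a minimal standalone builder (an illustration of the pattern only, not the library's implementation): each chainable method returns `this`, and the first source or format call locks the builder into one mode.

```typescript
// Minimal fluent-builder sketch -- illustrative only, not smartarchive's code.
type TMode = 'extraction' | 'creation' | null;

class MiniBuilder {
  private mode: TMode = null;

  // First mode-specific call locks the mode; later conflicting calls throw.
  private setMode(mode: Exclude<TMode, null>): void {
    if (this.mode && this.mode !== mode) {
      throw new Error(`Cannot use ${mode} methods in ${this.mode} mode`);
    }
    this.mode = mode;
  }

  public url(_u: string): this {
    this.setMode('extraction');
    return this;
  }

  public format(_f: string): this {
    this.setMode('creation');
    return this;
  }

  public getMode(): TMode {
    return this.mode;
  }
}

const builder = new MiniBuilder().url('https://example.com/a.zip');
console.log(builder.getMode()); // 'extraction'
```

Returning `this` from every configuration method is what makes the chains in this README read linearly while still failing fast on mixed-mode chains.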
## API Reference 📚

### Source Methods (Extraction Mode)

| Method | Description |
|--------|-------------|
| `.url(url)` | Load archive from a URL |
| `.file(path)` | Load archive from local filesystem |
| `.stream(readable)` | Load archive from any Node.js readable stream |
| `.buffer(buffer)` | Load archive from an in-memory Buffer |

### Creation Methods (Creation Mode)

| Method | Description |
|--------|-------------|
| `.format(fmt)` | Set output format: `'tar'`, `'tar.gz'`, `'tgz'`, `'zip'`, `'gz'` |
| `.compression(level)` | Set compression level (0-9, default: 6) |
| `.entry(path, content)` | Add a file entry (string or Buffer content) |
| `.entries(array)` | Add multiple entries at once |
| `.directory(path, archiveBase?)` | Add entire directory contents |
| `.addSmartFile(file, path?)` | Add a SmartFile instance |
| `.addStreamFile(file, path?)` | Add a StreamFile instance |

### Filter Methods (Both Modes)

| Method | Description |
|--------|-------------|
| `.filter(predicate)` | Filter entries with custom function |
| `.include(pattern)` | Only include entries matching regex/string pattern |
| `.exclude(pattern)` | Exclude entries matching regex/string pattern |

### Extraction Options

| Method | Description |
|--------|-------------|
| `.stripComponents(n)` | Strip N leading path components |
| `.overwrite(bool)` | Overwrite existing files (default: false) |
| `.fileName(name)` | Set output filename for single-file archives (gz, bz2) |

### Terminal Methods (Extraction)

| Method | Returns | Description |
|--------|---------|-------------|
| `.extract(targetDir)` | `Promise<void>` | Extract to filesystem directory |
| `.toStreamFiles()` | `Promise<StreamIntake<StreamFile>>` | Get stream of StreamFile objects |
| `.toSmartFiles()` | `Promise<SmartFile[]>` | Get in-memory SmartFile array |
| `.extractFile(path)` | `Promise<SmartFile \| null>` | Extract single file by path |
| `.list()` | `Promise<IArchiveEntryInfo[]>` | List all entries |
| `.analyze()` | `Promise<IArchiveInfo>` | Get archive metadata |
| `.hasFile(path)` | `Promise<boolean>` | Check if file exists |

### Terminal Methods (Creation)

| Method | Returns | Description |
|--------|---------|-------------|
| `.build()` | `Promise<SmartArchive>` | Build the archive (implicit in other terminals) |
| `.toBuffer()` | `Promise<Buffer>` | Get archive as Buffer |
| `.toFile(path)` | `Promise<void>` | Write archive to disk |
| `.toStream()` | `Promise<Readable>` | Get raw archive stream |

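To make the filter methods concrete, here is a standalone sketch of plausible `.include()`/`.exclude()` semantics. The string-pattern behavior is an assumption (shown here as substring matching, which may differ from the library's actual behavior); regexes are tested directly.

```typescript
// Illustrative filter semantics -- assumptions, not the library's code.
// A string pattern is treated as a substring match; a RegExp is tested directly.
function matches(entryPath: string, pattern: RegExp | string): boolean {
  return typeof pattern === 'string'
    ? entryPath.includes(pattern)
    : pattern.test(entryPath);
}

// Combine include/exclude lists into a single predicate, the way chained
// .include()/.exclude() calls would: pass all includes (or none given), fail no exclude.
function buildFilter(
  include: Array<RegExp | string>,
  exclude: Array<RegExp | string>,
): (entryPath: string) => boolean {
  return (entryPath) =>
    (include.length === 0 || include.some((p) => matches(entryPath, p))) &&
    !exclude.some((p) => matches(entryPath, p));
}

const keep = buildFilter([/\.ts$/], [/node_modules/]);
console.log(keep('src/index.ts')); // true
console.log(keep('node_modules/x/index.ts')); // false
```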
## Usage Examples 🔨

### Download and extract npm packages

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

const pkg = await SmartArchive.create()
  .url('https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz');

// Quick inspection of package.json
const pkgJson = await pkg.extractFile('package/package.json');
if (pkgJson) {
  const metadata = JSON.parse(pkgJson.contents.toString());
  console.log(`Package: ${metadata.name}@${metadata.version}`);
}

// Full extraction with path normalization
await SmartArchive.create()
  .url('https://registry.npmjs.org/lodash/-/lodash-4.17.21.tgz')
  .stripComponents(1)
  .extract('./node_modules/lodash');
```

### Create ZIP archive

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

await SmartArchive.create()
  .format('zip')
  .compression(9)
  .entry('report.txt', 'Monthly sales report...')
  .entry('data/figures.json', JSON.stringify({ revenue: 10000 }))
  .entry('images/logo.png', pngBuffer)
  .toFile('./report-bundle.zip');
```

### Create TAR.GZ from directory

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

await SmartArchive.create()
  .format('tar.gz')
  .compression(9)
  .directory('./src', 'source') // Archive ./src as 'source/' in archive
  .toFile('./project-backup.tar.gz');
```

### Stream-based extraction

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

const fileStream = await SmartArchive.create()
  .file('./large-archive.tar.gz')
  .toStreamFiles();

fileStream.on('data', async (streamFile) => {
  console.log(`Processing: ${streamFile.relativeFilePath}`);

  if (streamFile.relativeFilePath.endsWith('.json')) {
    const content = await streamFile.getContentAsBuffer();
    const data = JSON.parse(content.toString());
    // Process JSON data...
  }
});

fileStream.on('end', () => {
  console.log('Extraction complete');
});
```

### Filter specific file types

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

// Extract only TypeScript files
const tsFiles = await SmartArchive.create()
  .url('https://example.com/project.tar.gz')
  .include(/\.ts$/)
  .exclude(/node_modules/)
  .toSmartFiles();

for (const file of tsFiles) {
  console.log(`${file.relative}: ${file.contents.length} bytes`);
}
```

### Analyze archive without extraction

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

const archive = SmartArchive.create()
  .file('./unknown-archive.tar.gz');

// Get format info
const info = await archive.analyze();
console.log(`Format: ${info.format}`);
console.log(`Compressed: ${info.isCompressed}`);

// List contents
const entries = await archive.list();
for (const entry of entries) {
  console.log(`${entry.path} (${entry.isDirectory ? 'dir' : 'file'})`);
}

// Check for specific file
if (await archive.hasFile('package.json')) {
  const pkgFile = await archive.extractFile('package.json');
  console.log(pkgFile?.contents.toString());
}
```

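The "smart detection" behind `analyze()` works by inspecting well-known magic bytes at the start of the data. A standalone sketch using the standard signatures (gzip `1f 8b`, zip `PK\x03\x04`, bzip2 `BZh`, and the `ustar` marker at offset 257 for tar) — the real `ArchiveAnalyzer` may differ in detail:

```typescript
// Magic-byte detection sketch -- illustrative, not the library's ArchiveAnalyzer.
function detectFormat(buf: Buffer): 'gz' | 'zip' | 'bz2' | 'tar' | null {
  // gzip members start with the two-byte signature 0x1f 0x8b
  if (buf.length >= 2 && buf[0] === 0x1f && buf[1] === 0x8b) return 'gz';
  // zip local file headers start with 'PK\x03\x04'
  if (buf.length >= 4 && buf[0] === 0x50 && buf[1] === 0x4b && buf[2] === 0x03 && buf[3] === 0x04) return 'zip';
  // bzip2 streams start with the ASCII marker 'BZh'
  if (buf.length >= 3 && buf.toString('latin1', 0, 3) === 'BZh') return 'bz2';
  // POSIX tar stores the string 'ustar' at byte offset 257 of the first header block
  if (buf.length >= 262 && buf.toString('latin1', 257, 262) === 'ustar') return 'tar';
  return null;
}

console.log(detectFormat(Buffer.from([0x1f, 0x8b, 0x08]))); // 'gz'
```

Note that a `.tar.gz` file is detected as gzip first; the tar layer only becomes visible after decompression, which is why the library chains detectors.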
### Working with GZIP files

```typescript
import { SmartArchive, GzipTools } from '@push.rocks/smartarchive';
import { createReadStream, createWriteStream } from 'fs';

// Decompress a .gz file
await SmartArchive.create()
  .file('./data.json.gz')
  .fileName('data.json') // Specify output name (gzip doesn't store filename)
  .extract('./decompressed');

// Use GzipTools directly for compression/decompression
const gzipTools = new GzipTools();

// Compress a buffer
const compressed = await gzipTools.compress(Buffer.from('Hello World'), 9);
const decompressed = await gzipTools.decompress(compressed);

// Synchronous operations
const compressedSync = gzipTools.compressSync(Buffer.from('Hello World'), 6);
const decompressedSync = gzipTools.decompressSync(compressedSync);

// Streaming
const compressStream = gzipTools.getCompressionStream(6);
const decompressStream = gzipTools.getDecompressionStream();

createReadStream('./input.txt')
  .pipe(compressStream)
  .pipe(createWriteStream('./output.gz'));
```

### Working with TAR archives directly

```typescript
import { TarTools } from '@push.rocks/smartarchive';
import { createWriteStream } from 'fs';

const tarTools = new TarTools();

// Create a TAR archive manually
const pack = await tarTools.getPackStream();

// Add files to the pack (for stream content, also pass byteLength for better performance)
await tarTools.addFileToPack(pack, {
  fileName: 'hello.txt',
  content: 'Hello, World!'
});

await tarTools.addFileToPack(pack, {
  fileName: 'data.json', // illustrative name; the original is elided in the diff
  content: Buffer.from(JSON.stringify({ foo: 'bar' }))
});

// Finalize and pipe to destination
pack.finalize();
pack.pipe(createWriteStream('./output.tar'));

// Pack a directory to a TAR.GZ buffer
const tgzBuffer = await tarTools.packDirectoryToTarGz('./src', 6);

// Pack a directory to a TAR.GZ stream
const tgzStream = await tarTools.packDirectoryToTarGzStream('./src');
```

### Working with BZIP2 files

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

// Handle .bz2 files
await SmartArchive.create()
  .url('https://example.com/data.bz2')
  .fileName('data.txt') // bzip2 doesn't store the original filename
  .extract('./extracted');
```

### Nested archive handling (e.g., .tar.gz)

The library automatically handles nested compression. A `.tar.gz` file is:
1. First decompressed from gzip
2. Then unpacked from tar

This happens transparently:

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

// Automatically handles the gzip → tar extraction chain
await SmartArchive.create()
  .file('./package.tar.gz')
  .extract('./extracted');
```
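To make the two layers concrete, here is a self-contained sketch that builds a one-file tar archive by hand, gzips it with Node's built-in `zlib`, then reverses both layers. The library does all of this for you; the header math below follows the POSIX ustar layout and is purely illustrative.

```typescript
import { gzipSync, gunzipSync } from 'zlib';

// Build a single-file tar by hand: one 512-byte header, data padded to a
// 512-byte boundary, then two zero blocks as the end-of-archive marker.
function makeTar(name: string, content: Buffer): Buffer {
  const header = Buffer.alloc(512);
  header.write(name, 0, 'latin1');                  // file name
  header.write('0000644\0', 100, 'latin1');         // mode
  header.write('0000000\0', 108, 'latin1');         // uid
  header.write('0000000\0', 116, 'latin1');         // gid
  header.write(content.length.toString(8).padStart(11, '0') + '\0', 124, 'latin1'); // size, octal
  header.write('00000000000\0', 136, 'latin1');     // mtime
  header.write('        ', 148, 'latin1');          // checksum field counts as spaces
  header.write('0', 156, 'latin1');                 // typeflag: regular file
  let sum = 0;
  for (const byte of header) sum += byte;           // checksum = sum of header bytes
  header.write(sum.toString(8).padStart(6, '0') + '\0 ', 148, 'latin1');
  const padded = Buffer.alloc(Math.ceil(content.length / 512) * 512);
  content.copy(padded);
  return Buffer.concat([header, padded, Buffer.alloc(1024)]);
}

// Layer 1: gzip wraps the raw tar bytes.
const tgz = gzipSync(makeTar('hello.txt', Buffer.from('Hello')));

// Reversing: gunzip first, then read the tar header fields back out.
const tar = gunzipSync(tgz);
const entryName = tar.toString('latin1', 0, tar.indexOf(0));
const entrySize = parseInt(tar.toString('latin1', 124, 135), 8);
const data = tar.toString('utf8', 512, 512 + entrySize);
console.log(entryName, entrySize, data); // hello.txt 5 Hello
```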
### Working with ZIP archives directly

```typescript
import { ZipTools } from '@push.rocks/smartarchive';

const zipTools = new ZipTools();

// Create a ZIP archive from entries
const zipBuffer = await zipTools.createZip([
  { archivePath: 'readme.txt', content: 'Hello!' },
  { archivePath: 'data.bin', content: Buffer.from([0x00, 0x01, 0x02]) }
], 6);

// Extract a ZIP buffer
const entries = await zipTools.extractZip(zipBuffer);
for (const entry of entries) {
  console.log(`${entry.path}: ${entry.content.length} bytes`);
}
```
### In-memory round-trip

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

// Create archive in memory
const archive = await SmartArchive.create()
  .format('tar.gz')
  .entry('config.json', JSON.stringify({ version: '1.0.0' }))
  .build();

const buffer = await archive.toBuffer();

// Extract from buffer
const files = await SmartArchive.create()
  .buffer(buffer)
  .toSmartFiles();

for (const file of files) {
  console.log(`${file.relative}: ${file.contents.toString()}`);
}
```

### CI/CD: Download & Extract Build Artifacts

```typescript
await SmartArchive.create()
  .url(`${CI_SERVER}/artifacts/build-${BUILD_ID}.zip`)
  .stripComponents(1)
  .extract('./dist');
```

### Backup System

```typescript
// Create backup
await SmartArchive.create()
  .format('tar.gz')
  .compression(9)
  .directory('./data')
  .toFile(`./backups/backup-${Date.now()}.tar.gz`);

// Restore backup
await SmartArchive.create()
  .file('./backups/backup-latest.tar.gz')
  .extract('/restore/location');
```

### Bundle files for HTTP download

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

// Express/Fastify handler
app.get('/download-bundle', async (req, res) => {
  const buffer = await SmartArchive.create()
    .format('zip')
    .entry('report.pdf', pdfBuffer)
    .entry('data.xlsx', excelBuffer)
    .entry('images/chart.png', chartBuffer)
    .toBuffer();

  res.setHeader('Content-Type', 'application/zip');
  res.setHeader('Content-Disposition', 'attachment; filename=report-bundle.zip');
  res.send(buffer);
});
```

### Data Pipeline: Process Compressed Datasets

```typescript
const fileStream = await SmartArchive.create()
  .url('https://data.source/dataset.tar.gz')
  .toStreamFiles();

fileStream.on('data', async (file) => {
  if (file.relativeFilePath.endsWith('.csv')) {
    const content = await file.getContentAsBuffer();
    // Stream CSV processing...
  }
});
```

## Supported Formats 📋

| Format | Extension(s) | Extract | Create |
|--------|--------------|---------|--------|
| TAR | `.tar` | ✅ | ✅ |
| TAR.GZ / TGZ | `.tar.gz`, `.tgz` | ✅ | ✅ |
| ZIP | `.zip` | ✅ | ✅ |
| GZIP | `.gz` | ✅ | ✅ |
| BZIP2 | `.bz2` | ✅ | ❌ |

## Type Definitions

```typescript
// Supported archive formats
type TArchiveFormat = 'tar' | 'tar.gz' | 'tgz' | 'zip' | 'gz' | 'bz2';

// Compression level (0 = none, 9 = maximum)
type TCompressionLevel = 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9;

// Entry for creating archives
interface IArchiveEntry {
  archivePath: string;
  content: string | Buffer | Readable | SmartFile | StreamFile;
  size?: number;
  mode?: number;
  mtime?: Date;
}

// Information about an archive entry
interface IArchiveEntryInfo {
  path: string;
  size: number;
  isDirectory: boolean;
  isFile: boolean;
  mtime?: Date;
  mode?: number;
}

// Archive analysis result
interface IArchiveInfo {
  format: TArchiveFormat | null;
  isCompressed: boolean;
  isArchive: boolean;
  entries?: IArchiveEntryInfo[];
}
```
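The commit title also mentions a new `TEntryFilter` type, whose definition isn't shown in this diff. A plausible shape (pure assumption) is a predicate over `IArchiveEntryInfo`, which is what `.filter()` would accept. A standalone sketch with the interface re-declared locally:

```typescript
// Local re-declaration of the documented entry-info shape, so this sketch is standalone.
interface IArchiveEntryInfo {
  path: string;
  size: number;
  isDirectory: boolean;
  isFile: boolean;
}

// Assumed shape of TEntryFilter -- the diff only names the type, not its definition.
type TEntryFilter = (entry: IArchiveEntryInfo) => boolean;

const onlySmallFiles: TEntryFilter = (entry) =>
  entry.isFile && entry.size <= 1024 * 1024;

const sampleEntries: IArchiveEntryInfo[] = [
  { path: 'readme.md', size: 2048, isDirectory: false, isFile: true },
  { path: 'assets/', size: 0, isDirectory: true, isFile: false },
  { path: 'video.mp4', size: 50_000_000, isDirectory: false, isFile: true },
];

const kept = sampleEntries.filter(onlySmallFiles).map((e) => e.path);
console.log(kept); // [ 'readme.md' ]
```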
## Performance Tips 🏎️

1. **Use streaming for large files** – `.toStreamFiles()` processes entries one at a time without loading the entire archive
2. **Provide byte lengths when known** – When using TarTools directly, provide `byteLength` for better performance
3. **Choose appropriate compression** – Use 1-3 for speed, 6 (default) for balance, 9 for maximum compression
4. **Filter early** – Use `.include()`/`.exclude()` to skip unwanted entries before processing
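The size/speed tradeoff in tip 3 is easy to observe with Node's built-in `zlib` (used here instead of the library so the snippet is standalone; level 0 stores the data uncompressed plus gzip framing overhead):

```typescript
import { gzipSync } from 'zlib';

// Highly repetitive input compresses well, making level differences visible.
const input = Buffer.from('smartarchive '.repeat(2000));

const stored = gzipSync(input, { level: 0 }); // no compression: slightly larger than input
const fast = gzipSync(input, { level: 1 });   // fastest, larger output
const best = gzipSync(input, { level: 9 });   // slowest, smallest output

console.log(input.length, stored.length, fast.length, best.length);
```

On typical data the jump from level 1 to 6 buys most of the size reduction; 6 to 9 costs noticeably more CPU for a small additional gain, which is why 6 is the default.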
## Error Handling 🛡️

```typescript
import { SmartArchive } from '@push.rocks/smartarchive';

try {
  await SmartArchive.create()
    .url('https://example.com/file.zip')
    .extract('./output');
} catch (error) {
  if (error.message.includes('No source configured')) {
    console.error('Forgot to specify source');
  } else if (error.message.includes('No format specified')) {
    console.error('Forgot to set format for creation');
  } else if (error.message.includes('extraction mode')) {
    console.error('Cannot mix extraction and creation methods');
  } else {
    console.error('Archive operation failed:', error.message);
  }
}
```

## License and Legal Information

This repository contains open-source code that is licensed under the MIT License. A copy of the MIT License can be found in the [license](license) file within this repository.

This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH and are not included within the scope of the MIT license granted herein. Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines, and any usage must be approved in writing by Task Venture Capital GmbH.

### Issue Reporting and Security

For reporting bugs, issues, or security vulnerabilities, please visit [community.foss.global/](https://community.foss.global/). This is the central community hub for all issue reporting. Developers who sign and comply with our contribution agreement and go through identification can also get a [code.foss.global/](https://code.foss.global/) account to submit Pull Requests directly.

### Company Information

Task Venture Capital GmbH