@push.rocks/smartarchive

@push.rocks/smartarchive is a powerful library designed for managing archive files. It provides utilities for compressing and decompressing data in various formats such as zip, tar, gzip, and bzip2. This library aims to simplify the process of handling archive files, making it an ideal choice for projects that require manipulation of archived data.

Install

To install @push.rocks/smartarchive, you can use either npm or yarn. Run one of the following commands in your project directory:

npm install @push.rocks/smartarchive --save

or if you prefer yarn:

yarn add @push.rocks/smartarchive

This will add @push.rocks/smartarchive to your project's dependencies.

Usage

@push.rocks/smartarchive provides an easy-to-use API for extracting, creating, and analyzing archive files. Below, we'll cover how to get started and explore various features of the module.

Importing SmartArchive

First, import SmartArchive from @push.rocks/smartarchive using ESM syntax:

import { SmartArchive } from '@push.rocks/smartarchive';

Extracting Archive Files

You can extract archive files from different sources using SmartArchive.fromArchiveUrl, SmartArchive.fromArchiveFile, and SmartArchive.fromArchiveStream. Here's an example of extracting an archive from a URL:

import { SmartArchive } from '@push.rocks/smartarchive';

async function extractArchiveFromURL() {
  const url = 'https://example.com/archive.zip';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveUrl(url);
  await archive.exportToFs(targetDir);

  console.log('Archive extracted successfully.');
}

extractArchiveFromURL();

Extracting an Archive from a File

Similarly, you can extract an archive from a local file:

import { SmartArchive } from '@push.rocks/smartarchive';

async function extractArchiveFromFile() {
  const filePath = '/path/to/archive.zip';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveFile(filePath);
  await archive.exportToFs(targetDir);

  console.log('Archive extracted successfully.');
}

extractArchiveFromFile();

Stream-Based Extraction

For larger files, you might prefer a streaming approach to prevent high memory consumption. Here's an example:

import { SmartArchive } from '@push.rocks/smartarchive';
import { createReadStream, createWriteStream } from 'fs';

async function extractArchiveUsingStream() {
  const archiveStream = createReadStream('/path/to/archive.zip');
  const archive = await SmartArchive.fromArchiveStream(archiveStream);
  const extractionStream = await archive.exportToStreamOfStreamFiles();
  
  extractionStream.pipe(createWriteStream('/path/to/destination'));
}

extractArchiveUsingStream();

Analyzing Archive Files

Sometimes, you may need to inspect the contents of an archive before extracting it. The following example shows how to analyze an archive:

import { SmartArchive } from '@push.rocks/smartarchive';

async function analyzeArchive() {
  const filePath = '/path/to/archive.zip';
  
  const archive = await SmartArchive.fromArchiveFile(filePath);
  const analysisResult = await archive.analyzeContent();
  
  console.log(analysisResult); // Outputs details about the archive content
}

analyzeArchive();
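
If you want to gate extraction on the analysis, a pattern like the following can work. Note that the fields read from analysisResult below (isArchive, mimeType) are illustrative assumptions, not a documented shape; inspect the object returned by analyzeContent() in your project before relying on them.

import { SmartArchive } from '@push.rocks/smartarchive';

async function analyzeThenExtract() {
  const filePath = '/path/to/archive.zip';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveFile(filePath);
  const analysisResult: any = await archive.analyzeContent();

  // NOTE: isArchive and mimeType are hypothetical field names used for illustration only;
  // check the actual analysis result before relying on them.
  if (analysisResult && analysisResult.isArchive) {
    await archive.exportToFs(targetDir);
    console.log(`Detected ${analysisResult.mimeType}; extracted to ${targetDir}`);
  } else {
    console.log('The file does not look like a supported archive; skipping extraction.');
  }
}

analyzeThenExtract();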

Creating Archive Files

Creating an archive file is straightforward. Here we demonstrate creating a tar.gz archive:

import { SmartArchive } from '@push.rocks/smartarchive';
import { createWriteStream } from 'fs';

async function createTarGzArchive() {
  const archive = new SmartArchive();
  
  // Add directories and files
  archive.addedDirectories.push('/path/to/directory1');
  archive.addedFiles.push('/path/to/file1.txt');
  
  // Export as tar.gz
  const tarGzStream = await archive.exportToTarGzStream();
  
  // Save to filesystem or handle as needed
  tarGzStream.pipe(createWriteStream('/path/to/destination.tar.gz'));
}

createTarGzArchive();

Advanced Decompression Usage

smartarchive supports multiple compression formats and gives you detailed control over the decompression process:

  • For ZIP files, ZipTools handles decompression using the fflate library.
  • For TAR files, TarTools uses tar-stream.
  • For GZIP files, GzipTools provides a CompressGunzipTransform and DecompressGunzipTransform.
  • For BZIP2 files, Bzip2Tools utilizes custom streaming decompression.

Example: Working with a GZIP-compressed archive:

import { createReadStream, createWriteStream } from 'fs';
import { SmartArchive } from '@push.rocks/smartarchive';

async function decompressGzipArchive() {
  const filePath = '/path/to/archive.gz';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveFile(filePath);
  await archive.exportToFs(targetDir);

  console.log('GZIP archive decompressed successfully.');
}

decompressGzipArchive();
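
Bzip2 archives can be handled with the same high-level calls; the format-specific decompression is done internally by Bzip2Tools. A minimal sketch, assuming a .tar.bz2 file at the given path:

import { SmartArchive } from '@push.rocks/smartarchive';

async function decompressBzip2Archive() {
  const filePath = '/path/to/archive.tar.bz2';
  const targetDir = '/path/to/extract';

  // Format detection and decompression are handled inside SmartArchive.
  const archive = await SmartArchive.fromArchiveFile(filePath);
  await archive.exportToFs(targetDir);

  console.log('BZIP2 archive decompressed successfully.');
}

decompressBzip2Archive();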

Custom Decompression Streams

You can inject custom decompression streams where needed:

import { createReadStream, createWriteStream } from 'fs';
import { SmartArchive, GzipTools } from '@push.rocks/smartarchive';

async function customDecompression() {
  const filePath = '/path/to/archive.gz';
  const targetFile = '/path/to/decompressed-output';

  const archive = await SmartArchive.fromArchiveFile(filePath);
  const gzipTools = new GzipTools();
  const decompressionStream = gzipTools.getDecompressionStream();

  const archiveStream = await archive.getArchiveStream();
  archiveStream.pipe(decompressionStream).pipe(createWriteStream(targetFile));

  console.log('Custom GZIP decompression successful.');
}

customDecompression();

Custom Pack and Unpack Tar

When dealing with tar archives, you may need to perform custom packing and unpacking:

import { SmartArchive, TarTools } from '@push.rocks/smartarchive';
import { createReadStream, createWriteStream } from 'fs';

async function customTarOperations() {
  const tarTools = new TarTools();

  // Packing a directory into a tar stream
  const packStream = await tarTools.packDirectory('/path/to/directory');
  packStream.pipe(createWriteStream('/path/to/archive.tar'));

  // Extracting files from a tar stream
  const extractStream = tarTools.getDecompressionStream();
  createReadStream('/path/to/archive.tar').pipe(extractStream).on('entry', (header, stream, next) => {
    const writeStream = createWriteStream(`/path/to/extract/${header.name}`);
    stream.pipe(writeStream);
    stream.on('end', next);
  });
}

customTarOperations();

Extract and Analyze All-in-One

To extract and simultaneously analyze archive content:

import { createReadStream, createWriteStream } from 'fs';
import { SmartArchive } from '@push.rocks/smartarchive';

async function extractAndAnalyze() {
  const filePath = '/path/to/archive.zip';
  const targetDir = '/path/to/extract';

  const archive = await SmartArchive.fromArchiveFile(filePath);
  const analyzedStream = archive.archiveAnalyzer.getAnalyzedStream();
  const extractionStream = await archive.exportToStreamOfStreamFiles();

  analyzedStream.pipe(extractionStream).pipe(createWriteStream(targetDir));

  analyzedStream.on('data', (chunk) => {
    console.log(JSON.stringify(chunk, null, 2));
  });
}

extractAndAnalyze();

Final Words

These examples demonstrate various use cases for @push.rocks/smartarchive. Depending on your specific project requirements, you can adapt these examples to suit your needs. Always refer to the latest documentation for the most current information and methods available in @push.rocks/smartarchive.

For more information and API references, check the official @push.rocks/smartarchive GitHub repository.

This repository contains open-source code that is licensed under the MIT License. A copy of the MIT License can be found in the license file within this repository.

Please note: The MIT License does not grant permission to use the trade names, trademarks, service marks, or product names of the project, except as required for reasonable and customary use in describing the origin of the work and reproducing the content of the NOTICE file.

Trademarks

This project is owned and maintained by Task Venture Capital GmbH. The names and logos associated with Task Venture Capital GmbH and any related products or services are trademarks of Task Venture Capital GmbH and are not included within the scope of the MIT license granted herein. Use of these trademarks must comply with Task Venture Capital GmbH's Trademark Guidelines, and any usage must be approved in writing by Task Venture Capital GmbH.

Company Information

Task Venture Capital GmbH
Registered at the District Court Bremen, HRB 35230 HB, Germany

For any legal inquiries or if you require further information, please contact us via email at hello@task.vc.

By using this repository, you acknowledge that you have read this section, agree to comply with its terms, and understand that the licensing of the code does not imply endorsement by Task Venture Capital GmbH of any derivative works.