v1.3.1

commit e993f6deb9 (parent 6a7f7496ea)
2025-12-16 10:10:06 +00:00
4 changed files with 56 additions and 2 deletions


@@ -135,6 +135,53 @@ await fs.directory('/path/to/dir')
const exists = await fs.directory('/path/to/dir').exists();
```
### 📁 Directory Copy & Move
Copy or move entire directory trees with fine-grained control:
```typescript
// Basic copy - copies all files recursively
await fs.directory('/source').copy('/destination');

// Basic move - moves directory to new location
await fs.directory('/old-location').move('/new-location');

// Copy with options
await fs.directory('/source')
  .filter(/\.ts$/)          // Only copy TypeScript files
  .overwrite(true)          // Overwrite existing files
  .preserveTimestamps(true) // Keep original timestamps
  .copy('/destination');

// Copy all files (ignore filter setting)
await fs.directory('/source')
  .filter('*.ts')
  .applyFilter(false) // Ignore filter, copy everything
  .copy('/destination');

// Handle target directory conflicts
await fs.directory('/source')
  .onConflict('merge') // Default: merge contents
  .copy('/destination');

await fs.directory('/source')
  .onConflict('error') // Throw if target exists
  .copy('/destination');

await fs.directory('/source')
  .onConflict('replace') // Delete target first, then copy
  .copy('/destination');
```
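The same chained options also apply to `move()` (see the table below); a minimal sketch, assuming the builder semantics match `copy()`:
```typescript
// Move only Markdown files into an archive, overwriting duplicates.
// Sketch only: assumes move() honors the same filter/overwrite chain as copy().
await fs.directory('/notes')
  .filter('*.md')
  .overwrite(true)
  .move('/archive');
```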
**Configuration Options:**
| Method | Default | Description |
|--------|---------|-------------|
| `filter(pattern)` | none | Filter files by glob, regex, or function |
| `applyFilter(bool)` | `true` | Whether to apply filter during copy/move |
| `overwrite(bool)` | `false` | Overwrite existing files at destination |
| `preserveTimestamps(bool)` | `false` | Preserve original file timestamps |
| `onConflict(mode)` | `'merge'` | `'merge'`, `'error'`, or `'replace'` |
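The function form of `filter` is not shown above; a brief sketch, assuming the callback receives each file's path and returns `true` to include it:
```typescript
// Hypothetical predicate filter: the exact callback signature is an
// assumption; the table only states that a function form is accepted.
await fs.directory('/source')
  .filter((path: string) => !path.includes('node_modules'))
  .copy('/destination');
```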
### 🔐 Tree Hashing (Cache-Busting)
Compute a deterministic hash of all files in a directory, perfect for cache invalidation: