Compare commits

20 commits:

7fab4e5dd0
0dbaa1bc5d
8b37ebc8f9
5d757207c8
c80df05fdf
9be43a85ef
bf66209d3e
cdd1ae2c9b
f4290ae7f7
e58c0fd215
a91fac450a
5cb043009c
4a1f11b885
43f9033ccc
e7c0951786
efc107907c
2b8b0e5bdd
3ae2a7fcf5
0806d3749b
f5d5e20a97
BIN .serena/cache/typescript/document_symbols_cache_v23-06-25.pkl (vendored, new file)
Binary file not shown.
44 .serena/memories/code_style_conventions.md (new file)
@@ -0,0 +1,44 @@
# Code Style & Conventions

## TypeScript Standards
- **Target**: ES2022
- **Module System**: ESM with NodeNext resolution
- **Decorators**: Experimental decorators enabled
- **Strict Mode**: Implied through TypeScript configuration

## Naming Conventions
- **Interfaces**: Prefix with `I` (e.g., `IUserData`, `IConfig`)
- **Types**: Prefix with `T` (e.g., `TResponseType`, `TQueryResult`)
- **Classes**: PascalCase (e.g., `SmartdataDb`, `SmartDataDbDoc`)
- **Files**: All lowercase (e.g., `classes.doc.ts`, `plugins.ts`)
- **Methods**: camelCase (e.g., `findOne`, `saveToDb`)

## Import Patterns
- All external dependencies imported in `ts/plugins.ts`
- Reference as `plugins.moduleName.method()`
- Use full import paths for internal modules
- Maintain ESM syntax throughout

## Class Structure
- Use decorators for MongoDB document definitions
- Extend base classes (SmartDataDbDoc, SmartDataDbCollection)
- Static methods for factory patterns
- Instance methods for document operations

## Async Patterns
- Preserve Promise-based patterns
- Use async/await for clarity
- Handle errors appropriately
- Return typed Promises

## MongoDB Specifics
- Use `@unify()` decorator for unique fields
- Use `@svDb()` decorator for database fields
- Implement proper serialization/deserialization
- Type-safe query construction with `DeepQuery<T>`

## Testing Patterns
- Import from `@git.zone/tstest/tapbundle`
- End test files with `export default tap.start()`
- Use descriptive test names
- Cover edge cases and error conditions
37 .serena/memories/project_overview.md (new file)
@@ -0,0 +1,37 @@
# Project Overview: @push.rocks/smartdata

## Purpose
An advanced TypeScript-first MongoDB wrapper library providing enterprise-grade features for distributed systems, real-time data synchronization, and easy data management.

## Tech Stack
- **Language**: TypeScript (ES2022 target)
- **Runtime**: Node.js >= 16.x
- **Database**: MongoDB >= 4.4
- **Build System**: tsbuild
- **Test Framework**: tstest with tapbundle
- **Package Manager**: pnpm (v10.7.0)
- **Module System**: ESM (ES Modules)

## Key Features
- Type-safe MongoDB integration with decorators
- Document management with automatic timestamps
- EasyStore for key-value storage
- Distributed coordination with leader election
- Real-time data sync with RxJS watchers
- Deep query type safety
- Enhanced cursor API
- Powerful search capabilities

## Project Structure
- **ts/**: Main TypeScript source code
  - Core classes for DB, Collections, Documents, Cursors
  - Distributed coordinator, EasyStore, Watchers
  - Lucene adapter for search functionality
- **test/**: Test files using tstest framework
- **dist_ts/**: Compiled JavaScript output

## Key Dependencies
- MongoDB driver (v6.18.0)
- @push.rocks ecosystem packages
- @tsclass/tsclass for decorators
- RxJS for reactive programming
35 .serena/memories/suggested_commands.md (new file)
@@ -0,0 +1,35 @@
# Suggested Commands for @push.rocks/smartdata

## Build & Development
- `pnpm build` - Build the TypeScript project with web support
- `pnpm buildDocs` - Generate documentation using tsdoc
- `tsbuild --web --allowimplicitany` - Direct build command

## Testing
- `pnpm test` - Run all tests in test/ directory
- `pnpm testSearch` - Run specific search test
- `tstest test/test.specific.ts --verbose` - Run specific test with verbose output
- `tsbuild check test/**/* --skiplibcheck` - Type check test files

## Package Management
- `pnpm install` - Install dependencies
- `pnpm install --save-dev <package>` - Add dev dependency
- `pnpm add <package>` - Add production dependency

## Version Control
- `git status` - Check current changes
- `git diff` - View uncommitted changes
- `git log --oneline -10` - View recent commits
- `git mv <old> <new>` - Move/rename files preserving history

## System Utilities (Linux)
- `ls -la` - List all files with details
- `grep -r "pattern" .` - Search for pattern in files
- `find . -name "*.ts"` - Find TypeScript files
- `ps aux | grep node` - Find Node.js processes
- `lsof -i :80` - Check process on port 80

## Debug & Development
- `tsx <script.ts>` - Run TypeScript file directly
- Store debug scripts in `.nogit/debug/`
- Curl endpoints for API testing
33 .serena/memories/task_completion_checklist.md (new file)
@@ -0,0 +1,33 @@
# Task Completion Checklist

When completing any coding task in this project, always:

## Before Committing
1. **Build the project**: Run `pnpm build` to ensure TypeScript compiles
2. **Run tests**: Execute `pnpm test` to verify nothing is broken
3. **Type check**: Verify types compile correctly
4. **Check for lint issues**: Look for any code style violations

## Code Quality Checks
- Verify all external dependencies are imported in `ts/plugins.ts`
- Check that interfaces are prefixed with `I`
- Check that types are prefixed with `T`
- Ensure filenames are lowercase
- Verify async patterns are preserved where needed
- Check that decorators are properly used for MongoDB documents

## Documentation
- Update relevant comments if functionality changed
- Ensure new exports are properly documented
- Update readme.md if new features are added (only if explicitly requested)

## Git Hygiene
- Make small, focused commits
- Write clear commit messages
- Use `git mv` for file operations
- Never commit sensitive data or keys

## Final Verification
- Test the specific functionality that was changed
- Ensure no unintended side effects
- Verify the change solves the original problem completely
68 .serena/project.yml (new file)
@@ -0,0 +1,68 @@
# language of the project (csharp, python, rust, java, typescript, go, cpp, or ruby)
# * For C, use cpp
# * For JavaScript, use typescript
# Special requirements:
# * csharp: Requires the presence of a .sln file in the project folder.
language: typescript

# whether to use the project's gitignore file to ignore files
# Added on 2025-04-07
ignore_all_files_in_gitignore: true
# list of additional paths to ignore
# same syntax as gitignore, so you can use * and **
# Was previously called `ignored_dirs`, please update your config if you are using that.
# Added (renamed) on 2025-04-07
ignored_paths: []

# whether the project is in read-only mode
# If set to true, all editing tools will be disabled and attempts to use them will result in an error
# Added on 2025-04-18
read_only: false

# list of tool names to exclude. We recommend not excluding any tools, see the readme for more details.
# Below is the complete list of tools for convenience.
# To make sure you have the latest list of tools, and to view their descriptions,
# execute `uv run scripts/print_tool_overview.py`.
#
# * `activate_project`: Activates a project by name.
# * `check_onboarding_performed`: Checks whether project onboarding was already performed.
# * `create_text_file`: Creates/overwrites a file in the project directory.
# * `delete_lines`: Deletes a range of lines within a file.
# * `delete_memory`: Deletes a memory from Serena's project-specific memory store.
# * `execute_shell_command`: Executes a shell command.
# * `find_referencing_code_snippets`: Finds code snippets in which the symbol at the given location is referenced.
# * `find_referencing_symbols`: Finds symbols that reference the symbol at the given location (optionally filtered by type).
# * `find_symbol`: Performs a global (or local) search for symbols with/containing a given name/substring (optionally filtered by type).
# * `get_current_config`: Prints the current configuration of the agent, including the active and available projects, tools, contexts, and modes.
# * `get_symbols_overview`: Gets an overview of the top-level symbols defined in a given file.
# * `initial_instructions`: Gets the initial instructions for the current project.
#   Should only be used in settings where the system prompt cannot be set,
#   e.g. in clients you have no control over, like Claude Desktop.
# * `insert_after_symbol`: Inserts content after the end of the definition of a given symbol.
# * `insert_at_line`: Inserts content at a given line in a file.
# * `insert_before_symbol`: Inserts content before the beginning of the definition of a given symbol.
# * `list_dir`: Lists files and directories in the given directory (optionally with recursion).
# * `list_memories`: Lists memories in Serena's project-specific memory store.
# * `onboarding`: Performs onboarding (identifying the project structure and essential tasks, e.g. for testing or building).
# * `prepare_for_new_conversation`: Provides instructions for preparing for a new conversation (in order to continue with the necessary context).
# * `read_file`: Reads a file within the project directory.
# * `read_memory`: Reads the memory with the given name from Serena's project-specific memory store.
# * `remove_project`: Removes a project from the Serena configuration.
# * `replace_lines`: Replaces a range of lines within a file with new content.
# * `replace_symbol_body`: Replaces the full definition of a symbol.
# * `restart_language_server`: Restarts the language server; may be necessary when edits happen outside of Serena.
# * `search_for_pattern`: Performs a search for a pattern in the project.
# * `summarize_changes`: Provides instructions for summarizing the changes made to the codebase.
# * `switch_modes`: Activates modes by providing a list of their names.
# * `think_about_collected_information`: Thinking tool for pondering the completeness of collected information.
# * `think_about_task_adherence`: Thinking tool for determining whether the agent is still on track with the current task.
# * `think_about_whether_you_are_done`: Thinking tool for determining whether the task is truly completed.
# * `write_memory`: Writes a named memory (for future reference) to Serena's project-specific memory store.
excluded_tools: []

# initial prompt for the project. It will always be given to the LLM upon activating the project
# (contrary to the memories, which are loaded on demand).
initial_prompt: ""

project_name: "smartdata"
73 changelog.md (modified)
@@ -1,5 +1,78 @@
# Changelog

## 2025-08-18 - 5.16.4 - fix(classes.doc (convertFilterForMongoDb))
Improve filter conversion: handle logical operators, merge operator objects, add nested filter tests and docs, and fix the test script

- Fix package.json test script: remove a stray dot in the tstest --verbose argument so tests run correctly
- Enhance convertFilterForMongoDb in ts/classes.doc.ts to properly handle logical operators ($and, $or, $nor, $not) and convert them recursively
- Merge operator objects for the same field path (e.g. combining $gte and $lte) to avoid overwriting operator clauses when object and dot-notation are mixed
- Add validation/guards for operator argument types (e.g. $in, $nin, $all must be arrays; $size must be numeric) and preserve existing behavior blocking $where for security
- Add comprehensive nested filter tests in test/test.filters.ts to cover deep nested object queries, $elemMatch, array size, $all, $in on nested fields and more
- Expand README filtering section with detailed examples for basic filtering, deep nested filters, comparison operators, array operations, logical and element operators, and advanced patterns
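The operator-merging behavior described in the 5.16.4 entry can be sketched as a small standalone helper (a simplified illustration, not the library's actual implementation):

```typescript
// Sketch: merge operator clauses targeting the same field path, so that
// { price: { $gte: 10 } } combined with { price: { $lte: 50 } } yields
// { price: { $gte: 10, $lte: 50 } } instead of the second clause
// overwriting the first.
type TOperatorClause = Record<string, unknown>;

function isOperatorObject(v: unknown): v is TOperatorClause {
  return (
    typeof v === 'object' &&
    v !== null &&
    Object.keys(v as object).every((k) => k.startsWith('$'))
  );
}

function mergeFieldOperators(
  target: Record<string, TOperatorClause>,
  path: string,
  clause: TOperatorClause,
): void {
  const existing = target[path];
  if (existing && isOperatorObject(existing) && isOperatorObject(clause)) {
    // both sides are pure operator objects: merge instead of overwrite
    target[path] = { ...existing, ...clause };
  } else {
    target[path] = clause;
  }
}
```

This matters when object notation and dot-notation are mixed in one filter, since both forms can resolve to the same field path.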
## 2025-08-18 - 5.16.3 - fix(docs)
Add local Claude settings and remove outdated codex.md

- Added .claude/settings.local.json to store local Claude/assistant permissions and configuration.
- Removed codex.md (project overview); documentation file deleted.
- No runtime/library code changes; documentation/configuration-only update; bump patch version.

## 2025-08-18 - 5.16.2 - fix(readme)
Update README: clarify examples, expand search/cursor/docs and add local Claude settings

- Refined README wording and structure: clearer Quick Start, improved examples and developer-focused phrasing
- Expanded documentation for search, cursors, change streams, distributed coordination, transactions and EasyStore with more concrete code examples
- Adjusted code examples to show safer defaults (ID generation, status/tags, connection pooling) and improved best-practices guidance
- Added .claude/settings.local.json to provide local assistant/CI permission configuration

## 2025-08-12 - 5.16.1 - fix(core)
Improve error handling and logging; enhance search query sanitization; update dependency versions and documentation

- Replaced console.log and console.warn with structured logger.log calls throughout the core modules
- Enhanced database initialization with try/catch and proper URI credential encoding
- Improved search query conversion by disallowing dangerous operators (e.g. $where) and securely escaping regex patterns
- Bumped dependency versions (smartlog, @tsclass/tsclass, mongodb, etc.) in package.json
- Added detailed project memories including code style, project overview, and suggested commands for developers
- Updated README with improved instructions, feature highlights, and quick start sections

## 2025-04-25 - 5.16.0 - feat(watcher)
Enhance change stream watchers with buffering and EventEmitter support; update dependency versions

- Bumped smartmongo from ^2.0.11 to ^2.0.12 and smartrx from ^3.0.7 to ^3.0.10
- Upgraded @tsclass/tsclass to ^9.0.0 and mongodb to ^6.16.0
- Refactored the watch API to accept additional options (bufferTimeMs, fullDocument) for improved change stream handling
- Modified SmartdataDbWatcher to extend EventEmitter and support event notifications
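The buffering behavior described in the 5.16.0 watcher entry can be sketched with Node's EventEmitter (a simplified stand-in for SmartdataDbWatcher; the class name and the exact flush semantics here are illustrative assumptions, only `bufferTimeMs` comes from the changelog):

```typescript
import { EventEmitter } from 'node:events';

// Sketch: collect raw change events and flush them in batches every
// `bufferTimeMs` milliseconds, emitting a single 'change' event per batch.
class BufferedWatcher<T> extends EventEmitter {
  private buffer: T[] = [];
  private timer: NodeJS.Timeout | null = null;

  constructor(private bufferTimeMs: number) {
    super();
  }

  public push(change: T): void {
    this.buffer.push(change);
    // start the flush timer on the first buffered event
    if (!this.timer) {
      this.timer = setTimeout(() => this.flush(), this.bufferTimeMs);
    }
  }

  private flush(): void {
    this.timer = null;
    const batch = this.buffer;
    this.buffer = [];
    this.emit('change', batch);
  }
}
```

Consumers subscribe with `watcher.on('change', batch => …)` and receive an array of changes per interval rather than one event per change.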
## 2025-04-24 - 5.15.1 - fix(cursor)
Improve cursor usage documentation and refactor the getCursor API to support native cursor modifiers

- Updated examples in readme.md to demonstrate manual iteration using cursor.next() and proper cursor closing.
- Refactored the getCursor method in classes.doc.ts to accept session and modifier options, consolidating cursor handling.
- Added new tests in test/test.cursor.ts to verify cursor operations, including limits, sorting, and skipping.

## 2025-04-24 - 5.15.0 - feat(svDb)
Enhance the svDb decorator to support custom serialization and deserialization options

- Added an optional options parameter to the svDb decorator to accept serialize/deserialize functions
- Updated instance creation logic (updateFromDb) to apply custom deserialization if provided
- Updated createSavableObject to use custom serialization when available
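The serialize/deserialize options from the 5.15.0 entry can be sketched as follows. This is a plain-function approximation of the decorator behavior (only the `serialize`/`deserialize` option names and the `createSavableObject`/`updateFromDb` function names come from the changelog; the registry mechanism is an invented stand-in):

```typescript
// Sketch: per-field options carrying custom (de)serialization functions,
// applied when building the savable object and when hydrating from the DB.
interface ISvDbOptions<T = unknown> {
  serialize?: (value: T) => unknown;
  deserialize?: (raw: unknown) => T;
}

// stand-in for what an @svDb(options) decorator would register per field
const fieldOptions = new Map<string, ISvDbOptions<any>>();
fieldOptions.set('createdAt', {
  serialize: (d: Date) => d.toISOString(),
  deserialize: (raw) => new Date(raw as string),
});

// apply custom serialization when available, pass other fields through
function createSavableObject(doc: Record<string, unknown>) {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(doc)) {
    const opts = fieldOptions.get(key);
    out[key] = opts?.serialize ? opts.serialize(value) : value;
  }
  return out;
}

// apply custom deserialization when hydrating from the DB
function updateFromDb(raw: Record<string, unknown>) {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(raw)) {
    const opts = fieldOptions.get(key);
    out[key] = opts?.deserialize ? opts.deserialize(value) : value;
  }
  return out;
}
```

A Date field round-trips through the store as an ISO string, while fields without options are stored unchanged.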
## 2025-04-23 - 5.14.1 - fix(db operations)
Update the transaction API to consistently pass optional session parameters across database operations

- Revised transaction support in the readme to use startSession without await and showcased session usage in getInstance and save calls
- Updated methods in classes.collection.ts to accept an optional session parameter for findOne, getCursor, findAll, insert, update, delete, and getCount
- Enhanced SmartDataDbDoc save and delete methods to propagate session parameters
- Improved overall consistency of transactional APIs across the library

## 2025-04-23 - 5.14.0 - feat(doc)
Implement support for beforeSave, afterSave, beforeDelete, and afterDelete lifecycle hooks in document save and delete operations, allowing custom logic to run at these critical moments.

- Calls the beforeSave hook, if defined, before performing an insert or update.
- Calls the afterSave hook after a document is saved.
- Calls the beforeDelete hook before deletion and the afterDelete hook afterward.
- Ensures the _updatedAt timestamp is refreshed during save operations.
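The hook flow in the 5.14.0 entry can be sketched with a minimal standalone class (the hook names and the _updatedAt refresh come from the changelog; the class name and the log-based stand-in for the DB write are invented for illustration):

```typescript
// Sketch: save() invoking optional lifecycle hooks around the persistence
// step, mirroring the beforeSave -> write -> afterSave flow described above.
class HookedDoc {
  public _updatedAt = new Date(0);
  public log: string[] = [];

  // optional hooks; real documents would define these on subclasses
  public beforeSave?: () => Promise<void>;
  public afterSave?: () => Promise<void>;

  public async save(): Promise<void> {
    if (this.beforeSave) await this.beforeSave();
    this._updatedAt = new Date(); // timestamp refreshed during save
    this.log.push('persisted');   // stand-in for the actual DB write
    if (this.afterSave) await this.afterSave();
  }
}
```

The same wrap-the-operation pattern applies to delete() with beforeDelete/afterDelete.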
## 2025-04-22 - 5.13.1 - fix(search)
Improve search query parsing for implicit AND queries by preserving quoted substrings and better handling free terms, quoted phrases, and field:value tokens.
77 codex.md (deleted)
@@ -1,77 +0,0 @@
# SmartData Project Overview

This document provides a high-level overview of the SmartData library (`@push.rocks/smartdata`), its architecture, core components, and key features—including recent enhancements to the search API.

## 1. Project Purpose
- A TypeScript-first wrapper around MongoDB that supplies:
  - Strongly-typed document & collection classes
  - Decorator-based schema definition (no external schema files)
  - Advanced search capabilities with Lucene-style queries
  - Built-in support for real-time data sync, distributed coordination, and key-value EasyStore

## 2. Core Concepts & Components
- **SmartDataDb**: Manages the MongoDB connection, pooling, and initialization of collections.
- **SmartDataDbDoc**: Base class for all document models; provides CRUD, upsert, and cursor APIs.
- **Decorators**:
  - `@Collection`: Associates a class with a MongoDB collection
  - `@svDb()`: Marks a field as persisted to the DB
  - `@unI()`: Marks a field as a unique index
  - `@index()`: Adds a regular index
  - `@searchable()`: Marks a field for inclusion in text searches or regex queries
- **SmartdataCollection**: Wraps a MongoDB collection; auto-creates indexes based on decorators.
- **Lucene Adapter**: Parses a Lucene query string into an AST and transforms it to a MongoDB filter object.
- **EasyStore**: A simple, schema-less key-value store built on top of MongoDB for sharing ephemeral data.
- **Distributed Coordinator**: Leader election and task-distribution API for building resilient, multi-instance systems.
- **Watcher**: Listens to change streams for real-time updates and integrates with RxJS.

## 3. Search API
SmartData provides a unified `.search(query[, opts])` method on all models with `@searchable()` fields:

- **Supported Syntax**:
  1. Exact field:value (e.g. `field:Value`)
  2. Quoted phrases (e.g. `"exact phrase"` or `'exact phrase'`)
  3. Wildcards: `*` (zero or more chars) and `?` (single char)
  4. Boolean operators: `AND`, `OR`, `NOT`
  5. Grouping: parentheses, e.g. `(A OR B) AND C`
  6. Range queries: `[num TO num]`, `{num TO num}`
  7. Multi-term unquoted: terms AND'd across all searchable fields
  8. Empty query returns all documents

- **Fallback Mechanisms**:
  1. Text-index-based `$text` search (if supported)
  2. Field-scoped and multi-field regex queries
  3. In-memory filtering for complex or unsupported cases

### New Security & Extensibility Hooks
The `.search(query, opts?)` signature now accepts a `SearchOptions<T>` object:

```ts
interface SearchOptions<T> {
  filter?: Record<string, any>;   // Additional MongoDB filter, AND-merged into the query
  validate?: (doc: T) => boolean; // Post-fetch hook to drop results
}
```

- **filter**: Enforces mandatory constraints (e.g. multi-tenant isolation) directly in the Mongo query.
- **validate**: A function that runs after fetching; return `false` to exclude a document.

## 4. Testing Strategy
- Unit tests in `test/test.search.ts` cover basic search functionality and the new options:
  - Exact, wildcard, phrase, boolean and grouping cases
  - Implicit AND and mixed free-term + field searches
  - Edge cases (non-searchable fields, quoted wildcards, no matches)
  - `filter` and `validate` tests ensure the security hooks work as intended
- Advanced search scenarios are covered in `test/test.search.advanced.ts`.

## 5. Usage Example
```ts
// Basic search
const prods = await Product.search('wireless earbuds');

// Scoped search (only your organization's items)
const myItems = await Product.search('book', { filter: { ownerId } });

// Post-search validation (only cheap items)
const cheapItems = await Product.search('', { validate: p => p.price < 50 });
```

---
Last updated: 2025-04-22
26 package.json (modified)
@@ -1,13 +1,13 @@
 {
   "name": "@push.rocks/smartdata",
-  "version": "5.13.1",
+  "version": "5.16.4",
   "private": false,
   "description": "An advanced library for NoSQL data organization and manipulation using TypeScript with support for MongoDB, data validation, collections, and custom data types.",
   "main": "dist_ts/index.js",
   "typings": "dist_ts/index.d.ts",
   "type": "module",
   "scripts": {
-    "test": "tstest test/",
+    "test": "tstest test/ --verbose --logfile --timeout 120",
     "testSearch": "tsx test/test.search.ts",
     "build": "tsbuild --web --allowimplicitany",
     "buildDocs": "tsdoc"
@@ -23,26 +23,26 @@
   },
   "homepage": "https://code.foss.global/push.rocks/smartdata#readme",
   "dependencies": {
-    "@push.rocks/lik": "^6.0.14",
+    "@push.rocks/lik": "^6.2.2",
     "@push.rocks/smartdelay": "^3.0.1",
-    "@push.rocks/smartlog": "^3.0.2",
-    "@push.rocks/smartmongo": "^2.0.11",
+    "@push.rocks/smartlog": "^3.1.8",
+    "@push.rocks/smartmongo": "^2.0.12",
     "@push.rocks/smartpromise": "^4.0.2",
-    "@push.rocks/smartrx": "^3.0.7",
+    "@push.rocks/smartrx": "^3.0.10",
     "@push.rocks/smartstring": "^4.0.15",
     "@push.rocks/smarttime": "^4.0.6",
     "@push.rocks/smartunique": "^3.0.8",
     "@push.rocks/taskbuffer": "^3.1.7",
-    "@tsclass/tsclass": "^8.2.0",
-    "mongodb": "^6.15.0"
+    "@tsclass/tsclass": "^9.2.0",
+    "mongodb": "^6.18.0"
   },
   "devDependencies": {
-    "@git.zone/tsbuild": "^2.3.2",
+    "@git.zone/tsbuild": "^2.6.7",
     "@git.zone/tsrun": "^1.2.44",
-    "@git.zone/tstest": "^1.0.77",
-    "@push.rocks/qenv": "^6.0.5",
-    "@push.rocks/tapbundle": "^5.6.2",
-    "@types/node": "^22.14.0"
+    "@git.zone/tstest": "^2.3.5",
+    "@push.rocks/qenv": "^6.1.3",
+    "@push.rocks/tapbundle": "^6.0.3",
+    "@types/node": "^22.15.2"
   },
   "files": [
     "ts/**/*",
4665 pnpm-lock.yaml (generated)
File diff suppressed because it is too large.
97 test/test.cursor.ts (new file)
@@ -0,0 +1,97 @@
```ts
import { tap, expect } from '@push.rocks/tapbundle';
import * as smartmongo from '@push.rocks/smartmongo';
import { smartunique } from '../ts/plugins.js';
import * as smartdata from '../ts/index.js';

// Set up database connection
let smartmongoInstance: smartmongo.SmartMongo;
let testDb: smartdata.SmartdataDb;

// Define a simple document model for cursor tests
@smartdata.Collection(() => testDb)
class CursorTest extends smartdata.SmartDataDbDoc<CursorTest, CursorTest> {
  @smartdata.unI()
  public id: string = smartunique.shortId();

  @smartdata.svDb()
  public name: string;

  @smartdata.svDb()
  public order: number;

  constructor(name: string, order: number) {
    super();
    this.name = name;
    this.order = order;
  }
}

// Initialize the in-memory MongoDB and SmartdataDb
tap.test('cursor init: start Mongo and SmartdataDb', async () => {
  smartmongoInstance = await smartmongo.SmartMongo.createAndStart();
  testDb = new smartdata.SmartdataDb(
    await smartmongoInstance.getMongoDescriptor(),
  );
  await testDb.init();
});

// Insert sample documents
tap.test('cursor insert: save 5 test documents', async () => {
  for (let i = 1; i <= 5; i++) {
    const doc = new CursorTest(`item${i}`, i);
    await doc.save();
  }
  const count = await CursorTest.getCount({});
  expect(count).toEqual(5);
});

// Test that toArray returns all documents
tap.test('cursor toArray: retrieves all documents', async () => {
  const cursor = await CursorTest.getCursor({});
  const all = await cursor.toArray();
  expect(all.length).toEqual(5);
});

// Test iteration via forEach
tap.test('cursor forEach: iterates through all documents', async () => {
  const names: string[] = [];
  const cursor = await CursorTest.getCursor({});
  await cursor.forEach(async (item) => {
    names.push(item.name);
  });
  expect(names.length).toEqual(5);
  expect(names).toContain('item3');
});

// Test native cursor modifiers: limit
tap.test('cursor modifier limit: only two documents', async () => {
  const cursor = await CursorTest.getCursor({}, { modifier: (c) => c.limit(2) });
  const limited = await cursor.toArray();
  expect(limited.length).toEqual(2);
});

// Test native cursor modifiers: sort and skip
tap.test('cursor modifier sort & skip: returns correct order', async () => {
  const cursor = await CursorTest.getCursor({}, {
    modifier: (c) => c.sort({ order: -1 }).skip(1),
  });
  const results = await cursor.toArray();
  // Skipped the first (order 5); next should be 4, 3, 2, 1
  expect(results.length).toEqual(4);
  expect(results[0].order).toEqual(4);
});

// Cleanup: drop database, close connections, stop Mongo
tap.test('cursor cleanup: drop DB and stop', async () => {
  await testDb.mongoDb.dropDatabase();
  await testDb.close();
  if (smartmongoInstance) {
    await smartmongoInstance.stopAndDumpToDir(
      `.nogit/dbdump/test.cursor.ts`,
    );
  }
  // Ensure process exits after cleanup
  setTimeout(() => process.exit(), 2000);
});

export default tap.start();
```
819
test/test.filters.ts
Normal file
819
test/test.filters.ts
Normal file
@@ -0,0 +1,819 @@
|
||||
import { tap, expect } from '@git.zone/tstest/tapbundle';
|
||||
import * as smartmongo from '@push.rocks/smartmongo';
|
||||
import * as smartunique from '@push.rocks/smartunique';
|
||||
import * as smartdata from '../ts/index.js';
|
||||
|
||||
const { SmartdataDb, Collection, svDb, unI, index } = smartdata;
|
||||
|
||||
let smartmongoInstance: smartmongo.SmartMongo;
|
||||
let testDb: smartdata.SmartdataDb;
|
||||
|
||||
// Define test document classes
|
||||
@Collection(() => testDb)
|
||||
class TestUser extends smartdata.SmartDataDbDoc<TestUser, TestUser> {
|
||||
@unI()
|
||||
public id: string = smartunique.shortId();
|
||||
|
||||
@svDb()
|
||||
public name: string;
|
||||
|
||||
@svDb()
|
||||
public age: number;
|
||||
|
||||
@svDb()
|
||||
public email: string;
|
||||
|
||||
@svDb()
|
||||
public roles: string[];
|
||||
|
||||
@svDb()
|
||||
public tags: string[];
|
||||
|
||||
@svDb()
|
||||
public status: 'active' | 'inactive' | 'pending';
|
||||
|
||||
@svDb()
|
||||
public metadata: {
|
||||
lastLogin?: Date;
|
||||
loginCount?: number;
|
||||
preferences?: Record<string, any>;
|
||||
};
|
||||
|
||||
@svDb()
|
||||
public scores: number[];
|
||||
|
||||
constructor(data: Partial<TestUser> = {}) {
|
||||
super();
|
||||
Object.assign(this, data);
|
||||
}
|
||||
}
|
||||
|
||||
@Collection(() => testDb)
|
||||
class TestOrder extends smartdata.SmartDataDbDoc<TestOrder, TestOrder> {
|
||||
@unI()
|
||||
public id: string = smartunique.shortId();
|
||||
|
||||
@svDb()
|
||||
public userId: string;
|
||||
|
||||
@svDb()
|
||||
public items: Array<{
|
||||
product: string;
|
||||
quantity: number;
|
||||
price: number;
|
||||
}>;
|
||||
|
||||
@svDb()
|
||||
public totalAmount: number;
|
||||
|
||||
@svDb()
|
||||
public status: string;
|
||||
|
||||
@svDb()
|
||||
public tags: string[];
|
||||
|
||||
constructor(data: Partial<TestOrder> = {}) {
|
||||
super();
|
||||
Object.assign(this, data);
|
||||
}
|
||||
}
|
||||
|
||||
// Setup and teardown
tap.test('should create a test database instance', async () => {
  smartmongoInstance = await smartmongo.SmartMongo.createAndStart();
  testDb = new smartdata.SmartdataDb(await smartmongoInstance.getMongoDescriptor());
  await testDb.init();
  expect(testDb).toBeInstanceOf(SmartdataDb);
});

tap.test('should create test data', async () => {
  // Create test users
  const users = [
    new TestUser({
      name: 'John Doe',
      age: 30,
      email: 'john@example.com',
      roles: ['admin', 'user'],
      tags: ['javascript', 'nodejs', 'mongodb'],
      status: 'active',
      metadata: { loginCount: 5, lastLogin: new Date() },
      scores: [85, 90, 78]
    }),
    new TestUser({
      name: 'Jane Smith',
      age: 25,
      email: 'jane@example.com',
      roles: ['user'],
      tags: ['python', 'mongodb'],
      status: 'active',
      metadata: { loginCount: 3 },
      scores: [92, 88, 95]
    }),
    new TestUser({
      name: 'Bob Johnson',
      age: 35,
      email: 'bob@example.com',
      roles: ['moderator', 'user'],
      tags: ['javascript', 'react', 'nodejs'],
      status: 'inactive',
      metadata: { loginCount: 0 },
      scores: [70, 75, 80]
    }),
    new TestUser({
      name: 'Alice Brown',
      age: 28,
      email: 'alice@example.com',
      roles: ['admin'],
      tags: ['typescript', 'angular', 'mongodb'],
      status: 'active',
      metadata: { loginCount: 10 },
      scores: [95, 98, 100]
    }),
    new TestUser({
      name: 'Charlie Wilson',
      age: 22,
      email: 'charlie@example.com',
      roles: ['user'],
      tags: ['golang', 'kubernetes'],
      status: 'pending',
      metadata: { loginCount: 1 },
      scores: [60, 65]
    })
  ];

  for (const user of users) {
    await user.save();
  }

  // Create test orders
  const orders = [
    new TestOrder({
      userId: users[0].id,
      items: [
        { product: 'laptop', quantity: 1, price: 1200 },
        { product: 'mouse', quantity: 2, price: 25 }
      ],
      totalAmount: 1250,
      status: 'completed',
      tags: ['electronics', 'priority']
    }),
    new TestOrder({
      userId: users[1].id,
      items: [
        { product: 'book', quantity: 3, price: 15 },
        { product: 'pen', quantity: 5, price: 2 }
      ],
      totalAmount: 55,
      status: 'pending',
      tags: ['stationery']
    }),
    new TestOrder({
      userId: users[0].id,
      items: [
        { product: 'laptop', quantity: 2, price: 1200 },
        { product: 'keyboard', quantity: 2, price: 80 }
      ],
      totalAmount: 2560,
      status: 'processing',
      tags: ['electronics', 'bulk']
    })
  ];

  for (const order of orders) {
    await order.save();
  }

  const savedUsers = await TestUser.getInstances({});
  const savedOrders = await TestOrder.getInstances({});
  expect(savedUsers.length).toEqual(5);
  expect(savedOrders.length).toEqual(3);
});

// ============= BASIC FILTER TESTS =============
tap.test('should filter by simple equality', async () => {
  const users = await TestUser.getInstances({ name: 'John Doe' });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

tap.test('should filter by multiple fields (implicit AND)', async () => {
  const users = await TestUser.getInstances({
    status: 'active',
    age: 30
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

tap.test('should filter by nested object fields', async () => {
  const users = await TestUser.getInstances({
    'metadata.loginCount': 5
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

// ============= COMPREHENSIVE NESTED FILTER TESTS =============
tap.test('should filter by nested object with direct object syntax', async () => {
  // Direct nested object matching (exact match)
  const users = await TestUser.getInstances({
    metadata: {
      loginCount: 5,
      lastLogin: (await TestUser.getInstances({}))[0].metadata.lastLogin // Get the exact date
    }
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

tap.test('should filter by partial nested object match', async () => {
  // When using object syntax, only specified fields must match
  const users = await TestUser.getInstances({
    metadata: { loginCount: 5 } // Only checks loginCount, ignores other fields
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

tap.test('should combine nested object and dot notation', async () => {
  const users = await TestUser.getInstances({
    metadata: { loginCount: { $gte: 3 } }, // Object syntax with operator
    'metadata.loginCount': { $lte: 10 } // Dot notation with operator
  });
  expect(users.length).toEqual(3); // Jane (3), John (5), and Alice (10) have loginCount between 3-10
});

tap.test('should filter nested fields with operators using dot notation', async () => {
  const users = await TestUser.getInstances({
    'metadata.loginCount': { $gte: 5 }
  });
  expect(users.length).toEqual(2); // John (5) and Alice (10)
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'John Doe']);
});

tap.test('should filter nested fields with multiple operators', async () => {
  const users = await TestUser.getInstances({
    'metadata.loginCount': { $gte: 3, $lt: 10 }
  });
  expect(users.length).toEqual(2); // Jane (3) and John (5)
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Jane Smith', 'John Doe']);
});

tap.test('should handle deeply nested object structures', async () => {
  // First, create a user with deep nesting in preferences
  const deepUser = new TestUser({
    name: 'Deep Nester',
    age: 40,
    email: 'deep@example.com',
    roles: ['admin'],
    tags: [],
    status: 'active',
    metadata: {
      loginCount: 1,
      preferences: {
        theme: {
          colors: {
            primary: '#000000',
            secondary: '#ffffff'
          },
          fonts: {
            heading: 'Arial',
            body: 'Helvetica'
          }
        },
        notifications: {
          email: true,
          push: false
        }
      }
    },
    scores: []
  });
  await deepUser.save();

  // Test deep nesting with dot notation
  const deepResults = await TestUser.getInstances({
    'metadata.preferences.theme.colors.primary': '#000000'
  });
  expect(deepResults.length).toEqual(1);
  expect(deepResults[0].name).toEqual('Deep Nester');

  // Test deep nesting with operators
  const boolResults = await TestUser.getInstances({
    'metadata.preferences.notifications.email': { $eq: true }
  });
  expect(boolResults.length).toEqual(1);
  expect(boolResults[0].name).toEqual('Deep Nester');

  // Clean up
  await deepUser.delete();
});

tap.test('should filter arrays of nested objects using $elemMatch', async () => {
  const orders = await TestOrder.getInstances({
    items: {
      $elemMatch: {
        product: 'laptop',
        price: { $gte: 1000 }
      }
    }
  });
  expect(orders.length).toEqual(2); // Both laptop orders have price >= 1000
});

tap.test('should filter nested arrays with dot notation', async () => {
  // Query for any order that has an item with specific product
  const orders = await TestOrder.getInstances({
    'items.product': 'laptop'
  });
  expect(orders.length).toEqual(2); // Two orders contain laptops
});

tap.test('should combine nested object filters with logical operators', async () => {
  const users = await TestUser.getInstances({
    $or: [
      { 'metadata.loginCount': { $gte: 10 } }, // Alice has 10
      {
        $and: [
          { 'metadata.loginCount': { $lt: 5 } }, // Jane has 3, Bob has 0, Charlie has 1
          { status: 'active' } // Jane is active, Bob is inactive, Charlie is pending
        ]
      }
    ]
  });
  expect(users.length).toEqual(2); // Alice (loginCount >= 10), Jane (loginCount < 5 AND active)
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'Jane Smith']);
});

tap.test('should handle null and undefined in nested fields', async () => {
  // Users without lastLogin
  const noLastLogin = await TestUser.getInstances({
    'metadata.lastLogin': { $exists: false }
  });
  expect(noLastLogin.length).toEqual(4); // Everyone except John

  // Users with preferences (none have it set)
  const withPreferences = await TestUser.getInstances({
    'metadata.preferences': { $exists: true }
  });
  expect(withPreferences.length).toEqual(0);
});

tap.test('should filter nested arrays by size', async () => {
  // Create an order with specific number of items
  const multiItemOrder = new TestOrder({
    userId: 'test-user',
    items: [
      { product: 'item1', quantity: 1, price: 10 },
      { product: 'item2', quantity: 2, price: 20 },
      { product: 'item3', quantity: 3, price: 30 },
      { product: 'item4', quantity: 4, price: 40 }
    ],
    totalAmount: 100,
    status: 'pending',
    tags: ['test']
  });
  await multiItemOrder.save();

  const fourItemOrders = await TestOrder.getInstances({
    items: { $size: 4 }
  });
  expect(fourItemOrders.length).toEqual(1);

  // Clean up
  await multiItemOrder.delete();
});

tap.test('should handle nested field comparison between documents', async () => {
  // Find users where loginCount equals their age divided by 6 (John: 30/6=5)
  const users = await TestUser.getInstances({
    $and: [
      { 'metadata.loginCount': 5 },
      { age: 30 }
    ]
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

tap.test('should filter using $in on nested fields', async () => {
  const users = await TestUser.getInstances({
    'metadata.loginCount': { $in: [0, 1, 5] }
  });
  expect(users.length).toEqual(3); // Bob (0), Charlie (1), John (5)
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Bob Johnson', 'Charlie Wilson', 'John Doe']);
});

tap.test('should filter nested arrays with $all', async () => {
  // Create an order with multiple tags
  const taggedOrder = new TestOrder({
    userId: 'test-user',
    items: [{ product: 'test', quantity: 1, price: 10 }],
    totalAmount: 10,
    status: 'completed',
    tags: ['urgent', 'priority', 'electronics']
  });
  await taggedOrder.save();

  const priorityElectronics = await TestOrder.getInstances({
    tags: { $all: ['priority', 'electronics'] }
  });
  expect(priorityElectronics.length).toEqual(2); // Original order and new one

  // Clean up
  await taggedOrder.delete();
});

// ============= COMPARISON OPERATOR TESTS =============
tap.test('should filter using $gt operator', async () => {
  const users = await TestUser.getInstances({
    age: { $gt: 30 }
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('Bob Johnson');
});

tap.test('should filter using $gte operator', async () => {
  const users = await TestUser.getInstances({
    age: { $gte: 30 }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Bob Johnson', 'John Doe']);
});

tap.test('should filter using $lt operator', async () => {
  const users = await TestUser.getInstances({
    age: { $lt: 25 }
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('Charlie Wilson');
});

tap.test('should filter using $lte operator', async () => {
  const users = await TestUser.getInstances({
    age: { $lte: 25 }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Charlie Wilson', 'Jane Smith']);
});

tap.test('should filter using $ne operator', async () => {
  const users = await TestUser.getInstances({
    status: { $ne: 'active' }
  });
  expect(users.length).toEqual(2);
  const statuses = users.map(u => u.status).sort();
  expect(statuses).toEqual(['inactive', 'pending']);
});

tap.test('should filter using multiple comparison operators', async () => {
  const users = await TestUser.getInstances({
    age: { $gte: 25, $lt: 30 }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'Jane Smith']);
});

// ============= ARRAY OPERATOR TESTS =============
tap.test('should filter using $in operator', async () => {
  const users = await TestUser.getInstances({
    status: { $in: ['active', 'pending'] }
  });
  expect(users.length).toEqual(4);
  expect(users.every(u => ['active', 'pending'].includes(u.status))).toEqual(true);
});

tap.test('should filter arrays using $in operator', async () => {
  const users = await TestUser.getInstances({
    roles: { $in: ['admin'] }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'John Doe']);
});

tap.test('should filter using $nin operator', async () => {
  const users = await TestUser.getInstances({
    status: { $nin: ['inactive', 'pending'] }
  });
  expect(users.length).toEqual(3);
  expect(users.every(u => u.status === 'active')).toEqual(true);
});

tap.test('should filter arrays using $all operator', async () => {
  const users = await TestUser.getInstances({
    tags: { $all: ['javascript', 'nodejs'] }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Bob Johnson', 'John Doe']);
});

tap.test('should filter arrays using $size operator', async () => {
  const users = await TestUser.getInstances({
    scores: { $size: 2 }
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('Charlie Wilson');
});

tap.test('should filter arrays using $elemMatch operator', async () => {
  const orders = await TestOrder.getInstances({
    items: {
      $elemMatch: {
        product: 'laptop',
        quantity: { $gte: 2 }
      }
    }
  });
  expect(orders.length).toEqual(1);
  expect(orders[0].totalAmount).toEqual(2560);
});

tap.test('should filter using $elemMatch with single condition', async () => {
  const orders = await TestOrder.getInstances({
    items: {
      $elemMatch: {
        price: { $gt: 100 }
      }
    }
  });
  expect(orders.length).toEqual(2);
  expect(orders.every(o => o.items.some(i => i.price > 100))).toEqual(true);
});

// ============= LOGICAL OPERATOR TESTS =============
tap.test('should filter using $or operator', async () => {
  const users = await TestUser.getInstances({
    $or: [
      { age: { $lt: 25 } },
      { status: 'inactive' }
    ]
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Bob Johnson', 'Charlie Wilson']);
});

tap.test('should filter using $and operator', async () => {
  const users = await TestUser.getInstances({
    $and: [
      { status: 'active' },
      { age: { $gte: 28 } }
    ]
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'John Doe']);
});

tap.test('should filter using $nor operator', async () => {
  const users = await TestUser.getInstances({
    $nor: [
      { status: 'inactive' },
      { age: { $lt: 25 } }
    ]
  });
  expect(users.length).toEqual(3);
  expect(users.every(u => u.status !== 'inactive' && u.age >= 25)).toEqual(true);
});

tap.test('should filter using nested logical operators', async () => {
  const users = await TestUser.getInstances({
    $or: [
      {
        $and: [
          { status: 'active' },
          { roles: { $in: ['admin'] } }
        ]
      },
      { age: { $lt: 23 } }
    ]
  });
  expect(users.length).toEqual(3);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'Charlie Wilson', 'John Doe']);
});

// ============= ELEMENT OPERATOR TESTS =============
tap.test('should filter using $exists operator', async () => {
  const users = await TestUser.getInstances({
    'metadata.lastLogin': { $exists: true }
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

tap.test('should filter using $exists false', async () => {
  const users = await TestUser.getInstances({
    'metadata.preferences': { $exists: false }
  });
  expect(users.length).toEqual(5);
});

// ============= COMPLEX FILTER TESTS =============
tap.test('should handle complex nested filters', async () => {
  const users = await TestUser.getInstances({
    $and: [
      { status: 'active' },
      {
        $or: [
          { age: { $gte: 30 } },
          { roles: { $all: ['admin'] } }
        ]
      },
      { tags: { $in: ['mongodb'] } }
    ]
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'John Doe']);
});

tap.test('should combine multiple operator types', async () => {
  const orders = await TestOrder.getInstances({
    $and: [
      { totalAmount: { $gte: 100 } },
      { status: { $in: ['completed', 'processing'] } },
      { tags: { $in: ['electronics'] } }
    ]
  });
  expect(orders.length).toEqual(2);
  expect(orders.every(o => o.totalAmount >= 100)).toEqual(true);
});

// ============= ERROR HANDLING TESTS =============
tap.test('should throw error for $where operator', async () => {
  let error: Error | null = null;
  try {
    await TestUser.getInstances({
      $where: 'this.age > 25'
    });
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/\$where.*not allowed/);
});

tap.test('should throw error for invalid $in value', async () => {
  let error: Error | null = null;
  try {
    await TestUser.getInstances({
      status: { $in: 'active' as any } // Should be an array
    });
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/\$in.*requires.*array/);
});

tap.test('should throw error for invalid $size value', async () => {
  let error: Error | null = null;
  try {
    await TestUser.getInstances({
      scores: { $size: '3' as any } // Should be a number
    });
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/\$size.*requires.*numeric/);
});

tap.test('should throw error for dots in field names', async () => {
  let error: Error | null = null;
  try {
    await TestUser.getInstances({
      'some.nested.field': { 'invalid.key': 'value' }
    });
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/keys cannot contain dots/);
});

// ============= EDGE CASE TESTS =============
tap.test('should handle empty filter (return all)', async () => {
  const users = await TestUser.getInstances({});
  expect(users.length).toEqual(5);
});

tap.test('should handle null values in filter', async () => {
  // First, create a user with null email
  const nullUser = new TestUser({
    name: 'Null User',
    age: 40,
    email: null as any,
    roles: ['user'],
    tags: [],
    status: 'active',
    metadata: {},
    scores: []
  });
  await nullUser.save();

  const users = await TestUser.getInstances({ email: null });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('Null User');

  // Clean up
  await nullUser.delete();
});

tap.test('should handle arrays as direct equality match', async () => {
  // This tests that arrays without operators are treated as equality matches
  const users = await TestUser.getInstances({
    roles: ['user'] // Exact match for array
  });
  expect(users.length).toEqual(2); // Both Jane and Charlie have exactly ['user']
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Charlie Wilson', 'Jane Smith']);
});

tap.test('should handle regex operator', async () => {
  const users = await TestUser.getInstances({
    name: { $regex: '^J', $options: 'i' }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Jane Smith', 'John Doe']);
});

tap.test('should handle unknown operators by letting MongoDB reject them', async () => {
  // Unknown operators should be passed through to MongoDB, which will reject them
  let error: Error | null = null;

  try {
    await TestUser.getInstances({
      age: { $unknownOp: 30 } as any
    });
  } catch (e) {
    error = e as Error;
  }

  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/unknown operator.*\$unknownOp/);
});

// ============= PERFORMANCE TESTS =============
tap.test('should efficiently filter large result sets', async () => {
  // Create many test documents
  const manyUsers = [];
  for (let i = 0; i < 100; i++) {
    manyUsers.push(new TestUser({
      name: `User ${i}`,
      age: 20 + (i % 40),
      email: `user${i}@example.com`,
      roles: i % 3 === 0 ? ['admin'] : ['user'],
      tags: i % 2 === 0 ? ['even', 'test'] : ['odd', 'test'],
      status: i % 4 === 0 ? 'inactive' : 'active',
      metadata: { loginCount: i },
      scores: [i, i + 10, i + 20]
    }));
  }

  // Save in batches for efficiency
  for (const user of manyUsers) {
    await user.save();
  }

  // Complex filter that should still be fast
  const startTime = Date.now();
  const filtered = await TestUser.getInstances({
    $and: [
      { age: { $gte: 30, $lt: 40 } },
      { status: 'active' },
      { tags: { $in: ['even'] } },
      { 'metadata.loginCount': { $gte: 20 } }
    ]
  });
  const duration = Date.now() - startTime;

  console.log(`Complex filter on 100+ documents took ${duration}ms`);
  expect(duration).toBeLessThan(1000); // Should complete in under 1 second
  expect(filtered.length).toBeGreaterThan(0);

  // Clean up
  for (const user of manyUsers) {
    await user.delete();
  }
});

// ============= CLEANUP =============
tap.test('should clean up test database', async () => {
  await testDb.mongoDb.dropDatabase();
  await testDb.close();
  await smartmongoInstance.stop();
});

export default tap.start();
@@ -60,6 +60,43 @@ tap.test('should watch a collection', async (toolsArg) => {
   await done.promise;
 });
 
+// ======= New tests for EventEmitter and buffering support =======
+tap.test('should emit change via EventEmitter', async (tools) => {
+  const done = tools.defer();
+  const watcher = await House.watch({});
+  watcher.on('change', async (houseArg) => {
+    // Expect a House instance
+    expect(houseArg).toBeDefined();
+    // Clean up
+    await watcher.stop();
+    done.resolve();
+  });
+  // Trigger an insert to generate a change event
+  const h = new House();
+  await h.save();
+  await done.promise;
+});
+
+tap.test('should buffer change events when bufferTimeMs is set', async (tools) => {
+  const done = tools.defer();
+  // bufferTimeMs collects events into arrays every 50ms
+  const watcher = await House.watch({}, { bufferTimeMs: 50 });
+  let received: House[];
+  watcher.changeSubject.subscribe(async (batch: House[]) => {
+    if (batch && batch.length > 0) {
+      received = batch;
+      await watcher.stop();
+      done.resolve();
+    }
+  });
+  // Rapidly insert multiple docs
+  const docs = [new House(), new House(), new House()];
+  for (const doc of docs) await doc.save();
+  await done.promise;
+  // All inserts should be in one buffered batch
+  expect(received.length).toEqual(docs.length);
+});
+
 // =======================================
 // close the database connection
 // =======================================
@@ -3,6 +3,6 @@
  */
 export const commitinfo = {
   name: '@push.rocks/smartdata',
-  version: '5.13.1',
+  version: '5.16.4',
   description: 'An advanced library for NoSQL data organization and manipulation using TypeScript with support for MongoDB, data validation, collections, and custom data types.'
 }
@@ -4,6 +4,7 @@ import { SmartdataDbCursor } from './classes.cursor.js';
 import { SmartDataDbDoc, type IIndexOptions } from './classes.doc.js';
 import { SmartdataDbWatcher } from './classes.watcher.js';
 import { CollectionFactory } from './classes.collectionfactory.js';
+import { logger } from './logging.js';
 
 export interface IFindOptions {
   limit?: number;
@@ -161,7 +162,7 @@ export class SmartdataCollection<T> {
     });
     if (!wantedCollection) {
       await this.smartdataDb.mongoDb.createCollection(this.collectionName);
-      console.log(`Successfully initiated Collection ${this.collectionName}`);
+      logger.log('info', `Successfully initiated Collection ${this.collectionName}`);
     }
     this.mongoDbCollection = this.smartdataDb.mongoDb.collection(this.collectionName);
     // Auto-create a compound text index on all searchable fields
@@ -182,10 +183,10 @@ export class SmartdataCollection<T> {
   /**
    * mark unique index
    */
-  public markUniqueIndexes(keyArrayArg: string[] = []) {
+  public async markUniqueIndexes(keyArrayArg: string[] = []) {
     for (const key of keyArrayArg) {
       if (!this.uniqueIndexes.includes(key)) {
-        this.mongoDbCollection.createIndex(key, {
+        await this.mongoDbCollection.createIndex({ [key]: 1 }, {
           unique: true,
         });
         // make sure we only call this once and not for every doc we create
@@ -197,12 +198,12 @@ export class SmartdataCollection<T> {
   /**
    * creates regular indexes for the collection
    */
-  public createRegularIndexes(indexesArg: Array<{field: string, options: IIndexOptions}> = []) {
+  public async createRegularIndexes(indexesArg: Array<{field: string, options: IIndexOptions}> = []) {
     for (const indexDef of indexesArg) {
       // Check if we've already created this index
       const indexKey = indexDef.field;
       if (!this.regularIndexes.some(i => i.field === indexKey)) {
-        this.mongoDbCollection.createIndex(
+        await this.mongoDbCollection.createIndex(
           { [indexDef.field]: 1 }, // Simple single-field index
           indexDef.options
         );
@@ -222,53 +223,74 @@ export class SmartdataCollection<T> {
|
||||
/**
|
||||
* finds an object in the DbCollection
|
||||
*/
|
||||
public async findOne(filterObject: any): Promise<any> {
|
||||
public async findOne(
|
||||
filterObject: any,
|
||||
opts?: { session?: plugins.mongodb.ClientSession }
|
||||
): Promise<any> {
|
||||
await this.init();
|
||||
const cursor = this.mongoDbCollection.find(filterObject);
|
||||
const result = await cursor.next();
|
||||
cursor.close();
|
||||
return result;
|
||||
// Use MongoDB driver's findOne with optional session
|
||||
return this.mongoDbCollection.findOne(filterObject, { session: opts?.session });
|
||||
}
|
||||
|
||||
public async getCursor(
|
||||
filterObjectArg: any,
|
||||
dbDocArg: typeof SmartDataDbDoc,
|
||||
opts?: { session?: plugins.mongodb.ClientSession }
|
||||
): Promise<SmartdataDbCursor<any>> {
|
||||
await this.init();
|
||||
const cursor = this.mongoDbCollection.find(filterObjectArg);
|
||||
const cursor = this.mongoDbCollection.find(filterObjectArg, { session: opts?.session });
|
||||
return new SmartdataDbCursor(cursor, dbDocArg);
|
||||
}
|
||||
|
||||
/**
|
||||
* finds an object in the DbCollection
|
||||
*/
|
||||
public async findAll(filterObject: any): Promise<any[]> {
|
||||
public async findAll(
|
||||
filterObject: any,
|
||||
opts?: { session?: plugins.mongodb.ClientSession }
|
||||
): Promise<any[]> {
|
||||
await this.init();
|
||||
const cursor = this.mongoDbCollection.find(filterObject);
|
||||
const cursor = this.mongoDbCollection.find(filterObject, { session: opts?.session });
|
||||
const result = await cursor.toArray();
|
||||
cursor.close();
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* watches the collection while applying a filter
|
||||
* Watches the collection, returning a SmartdataDbWatcher with RxJS and EventEmitter support.
|
||||
* @param filterObject match filter for change stream
|
||||
* @param opts optional MongoDB ChangeStreamOptions & { bufferTimeMs } to buffer events
|
||||
* @param smartdataDbDocArg document class for instance creation
|
||||
*/
|
||||
public async watch(
|
||||
filterObject: any,
|
||||
smartdataDbDocArg: typeof SmartDataDbDoc,
|
||||
opts: (plugins.mongodb.ChangeStreamOptions & { bufferTimeMs?: number }) = {},
|
||||
smartdataDbDocArg?: typeof SmartDataDbDoc,
|
||||
): Promise<SmartdataDbWatcher> {
|
||||
await this.init();
|
||||
// Extract bufferTimeMs from options
|
||||
const { bufferTimeMs, fullDocument, ...otherOptions } = opts || {};
|
||||
// Determine fullDocument behavior: default to 'updateLookup'
|
||||
const changeStreamOptions: plugins.mongodb.ChangeStreamOptions = {
|
||||
...otherOptions,
|
||||
fullDocument:
|
||||
fullDocument === undefined
|
||||
? 'updateLookup'
|
||||
: (fullDocument as any) === true
|
||||
? 'updateLookup'
|
||||
: fullDocument,
|
||||
} as any;
|
||||
// Build pipeline with match if provided
|
||||
const pipeline = filterObject ? [{ $match: filterObject }] : [];
|
||||
const changeStream = this.mongoDbCollection.watch(
|
||||
[
|
||||
{
|
||||
$match: filterObject,
|
||||
},
|
||||
],
|
||||
{
|
||||
fullDocument: 'updateLookup',
|
||||
},
|
||||
pipeline,
|
||||
changeStreamOptions,
|
||||
);
|
||||
const smartdataWatcher = new SmartdataDbWatcher(
|
||||
changeStream,
|
||||
smartdataDbDocArg,
|
||||
{ bufferTimeMs },
|
||||
);
|
||||
const smartdataWatcher = new SmartdataDbWatcher(changeStream, smartdataDbDocArg);
|
||||
await smartdataWatcher.readyDeferred.promise;
|
||||
return smartdataWatcher;
|
||||
}
|
||||
@@ -276,7 +298,10 @@ export class SmartdataCollection<T> {
|
||||
/**
|
||||
* create an object in the database
|
||||
*/
|
||||
public async insert(dbDocArg: T & SmartDataDbDoc<T, unknown>): Promise<any> {
|
||||
public async insert(
|
||||
dbDocArg: T & SmartDataDbDoc<T, unknown>,
|
||||
opts?: { session?: plugins.mongodb.ClientSession }
|
||||
): Promise<any> {
|
||||
await this.init();
|
||||
await this.checkDoc(dbDocArg);
|
||||
this.markUniqueIndexes(dbDocArg.uniqueIndexes);
|
||||
@@ -287,14 +312,17 @@ export class SmartdataCollection<T> {
|
||||
}
|
||||
|
||||
const saveableObject = await dbDocArg.createSavableObject();
|
||||
const result = await this.mongoDbCollection.insertOne(saveableObject);
|
||||
const result = await this.mongoDbCollection.insertOne(saveableObject, { session: opts?.session });
|
||||
return result;
|
||||
}
|
||||
|
||||
/**
|
||||
* inserts object into the DbCollection
|
||||
*/
|
||||
public async update(dbDocArg: T & SmartDataDbDoc<T, unknown>): Promise<any> {
|
||||
public async update(
|
||||
dbDocArg: T & SmartDataDbDoc<T, unknown>,
|
||||
opts?: { session?: plugins.mongodb.ClientSession }
|
||||
): Promise<any> {
|
||||
await this.init();
|
||||
await this.checkDoc(dbDocArg);
|
||||
const identifiableObject = await dbDocArg.createIdentifiableObject();
|
||||
@@ -309,21 +337,27 @@ export class SmartdataCollection<T> {
|
||||
const result = await this.mongoDbCollection.updateOne(
|
||||
identifiableObject,
|
||||
{ $set: updateableObject },
|
||||
{ upsert: true },
|
||||
{ upsert: true, session: opts?.session },
|
||||
);
|
||||
return result;
|
||||
}
|
||||
|
||||
public async delete(dbDocArg: T & SmartDataDbDoc<T, unknown>): Promise<any> {
|
||||
public async delete(
|
||||
dbDocArg: T & SmartDataDbDoc<T, unknown>,
|
||||
opts?: { session?: plugins.mongodb.ClientSession }
|
||||
): Promise<any> {
|
||||
await this.init();
|
||||
await this.checkDoc(dbDocArg);
|
||||
const identifiableObject = await dbDocArg.createIdentifiableObject();
|
||||
await this.mongoDbCollection.deleteOne(identifiableObject);
|
||||
await this.mongoDbCollection.deleteOne(identifiableObject, { session: opts?.session });
|
||||
}
|
||||
|
||||
public async getCount(filterObject: any) {
|
||||
public async getCount(
|
||||
filterObject: any,
|
||||
opts?: { session?: plugins.mongodb.ClientSession }
|
||||
) {
|
||||
await this.init();
|
||||
return this.mongoDbCollection.countDocuments(filterObject);
|
||||
return this.mongoDbCollection.countDocuments(filterObject, { session: opts?.session });
|
||||
}
|
||||
|
||||
/**
|
||||
|
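The hunks above all thread an optional `opts?.session` through to the underlying driver call, so the same method works with and without a transaction. A minimal sketch of that pattern, with `mockInsertOne` standing in for the real driver call (names here are illustrative, not the library's API):

```typescript
// Stand-in for a session handle; the real code uses plugins.mongodb.ClientSession.
type Session = { id: string };

// Stand-in for mongoDbCollection.insertOne: it only reports whether a session arrived.
function mockInsertOne(doc: object, options?: { session?: Session }): string {
  return options?.session ? `inserted in session ${options.session.id}` : 'inserted without session';
}

// The pattern from the diff: forward opts?.session unchanged;
// undefined simply means a plain, non-transactional write.
function insert(doc: object, opts?: { session?: Session }): string {
  return mockInsertOne(doc, { session: opts?.session });
}
```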
@@ -35,24 +35,43 @@ export class SmartdataDb {
    * connects to the database that was specified during instance creation
    */
   public async init(): Promise<any> {
-    const finalConnectionUrl = this.smartdataOptions.mongoDbUrl
-      .replace('<USERNAME>', this.smartdataOptions.mongoDbUser)
-      .replace('<username>', this.smartdataOptions.mongoDbUser)
-      .replace('<USER>', this.smartdataOptions.mongoDbUser)
-      .replace('<user>', this.smartdataOptions.mongoDbUser)
-      .replace('<PASSWORD>', this.smartdataOptions.mongoDbPass)
-      .replace('<password>', this.smartdataOptions.mongoDbPass)
-      .replace('<DBNAME>', this.smartdataOptions.mongoDbName)
-      .replace('<dbname>', this.smartdataOptions.mongoDbName);
+    try {
+      // Safely encode credentials to handle special characters
+      const encodedUser = this.smartdataOptions.mongoDbUser
+        ? encodeURIComponent(this.smartdataOptions.mongoDbUser)
+        : '';
+      const encodedPass = this.smartdataOptions.mongoDbPass
+        ? encodeURIComponent(this.smartdataOptions.mongoDbPass)
+        : '';
+
+      const finalConnectionUrl = this.smartdataOptions.mongoDbUrl
+        .replace('<USERNAME>', encodedUser)
+        .replace('<username>', encodedUser)
+        .replace('<USER>', encodedUser)
+        .replace('<user>', encodedUser)
+        .replace('<PASSWORD>', encodedPass)
+        .replace('<password>', encodedPass)
+        .replace('<DBNAME>', this.smartdataOptions.mongoDbName)
+        .replace('<dbname>', this.smartdataOptions.mongoDbName);

-    this.mongoDbClient = await plugins.mongodb.MongoClient.connect(finalConnectionUrl, {
-      maxPoolSize: 100,
-      maxIdleTimeMS: 10,
-    });
-    this.mongoDb = this.mongoDbClient.db(this.smartdataOptions.mongoDbName);
-    this.status = 'connected';
-    this.statusConnectedDeferred.resolve();
-    console.log(`Connected to database ${this.smartdataOptions.mongoDbName}`);
+      const clientOptions: plugins.mongodb.MongoClientOptions = {
+        maxPoolSize: (this.smartdataOptions as any).maxPoolSize ?? 100,
+        maxIdleTimeMS: (this.smartdataOptions as any).maxIdleTimeMS ?? 300000, // 5 minutes default
+        serverSelectionTimeoutMS: (this.smartdataOptions as any).serverSelectionTimeoutMS ?? 30000,
+        retryWrites: true,
+      };
+
+      this.mongoDbClient = await plugins.mongodb.MongoClient.connect(finalConnectionUrl, clientOptions);
+      this.mongoDb = this.mongoDbClient.db(this.smartdataOptions.mongoDbName);
+      this.status = 'connected';
+      this.statusConnectedDeferred.resolve();
+      logger.log('info', `Connected to database ${this.smartdataOptions.mongoDbName}`);
+    } catch (error) {
+      this.status = 'disconnected';
+      this.statusConnectedDeferred.reject(error);
+      logger.log('error', `Failed to connect to database ${this.smartdataOptions.mongoDbName}: ${error.message}`);
+      throw error;
+    }
   }

   /**
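The change above matters because characters like `@`, `/`, and `:` in a raw password would be parsed as URL structure. A small demonstration of the `encodeURIComponent` templating (the URL and credentials are made up for illustration):

```typescript
// Percent-encode credentials before substituting them into the connection string,
// exactly as the patched init() does.
const user = encodeURIComponent('admin');
const pass = encodeURIComponent('p@ss/w:rd'); // '@', '/', ':' would break the URL raw

const url = 'mongodb://<USERNAME>:<PASSWORD>@localhost:27017/<DBNAME>'
  .replace('<USERNAME>', user)
  .replace('<PASSWORD>', pass)
  .replace('<DBNAME>', 'testdb');
```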
@@ -63,6 +82,12 @@ export class SmartdataDb {
     this.status = 'disconnected';
     logger.log('info', `disconnected from database ${this.smartdataOptions.mongoDbName}`);
   }
+
+  /**
+   * Start a MongoDB client session for transactions
+   */
+  public startSession(): plugins.mongodb.ClientSession {
+    return this.mongoDbClient.startSession();
+  }

   // handle table to class distribution
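The new `startSession()` is presumably meant to compose with the session-aware CRUD methods via the driver's `withTransaction` (commit on success, abort on error). A synchronous mock of that commit/abort semantics, since the real `ClientSession.withTransaction` is async and needs a replica set:

```typescript
// MockSession stands in for mongodb's ClientSession; only the commit/abort
// outcome of withTransaction is modeled here.
class MockSession {
  public state: 'active' | 'committed' | 'aborted' = 'active';
  withTransaction(fn: () => void): void {
    try {
      fn(); // e.g. doc.save({ session }) calls would happen inside this callback
      this.state = 'committed';
    } catch {
      this.state = 'aborted';
    }
  }
}

const ok = new MockSession();
ok.withTransaction(() => { /* all writes succeed */ });

const bad = new MockSession();
bad.withTransaction(() => { throw new Error('write conflict'); });
```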
@@ -3,6 +3,7 @@ import { SmartdataDb } from './classes.db.js';
 import { managed, setDefaultManagerForDoc } from './classes.collection.js';
 import { SmartDataDbDoc, svDb, unI } from './classes.doc.js';
 import { SmartdataDbWatcher } from './classes.watcher.js';
+import { logger } from './logging.js';

 @managed()
 export class DistributedClass extends SmartDataDbDoc<DistributedClass, DistributedClass> {
@@ -63,11 +64,11 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
         this.ownInstance.data.elected = false;
       }
       if (this.ownInstance?.data.status === 'stopped') {
-        console.log(`stopping a distributed instance that has not been started yet.`);
+        logger.log('warn', `stopping a distributed instance that has not been started yet.`);
       }
       this.ownInstance.data.status = 'stopped';
       await this.ownInstance.save();
-      console.log(`stopped ${this.ownInstance.id}`);
+      logger.log('info', `stopped ${this.ownInstance.id}`);
     });
   }

@@ -83,17 +84,17 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
   public async sendHeartbeat() {
     await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
       if (this.ownInstance.data.status === 'stopped') {
-        console.log(`aborted sending heartbeat because status is stopped`);
+        logger.log('debug', `aborted sending heartbeat because status is stopped`);
         return;
       }
       await this.ownInstance.updateFromDb();
       this.ownInstance.data.lastUpdated = Date.now();
       await this.ownInstance.save();
-      console.log(`sent heartbeat for ${this.ownInstance.id}`);
+      logger.log('debug', `sent heartbeat for ${this.ownInstance.id}`);
       const allInstances = DistributedClass.getInstances({});
     });
     if (this.ownInstance.data.status === 'stopped') {
-      console.log(`aborted sending heartbeat because status is stopped`);
+      logger.log('info', `aborted sending heartbeat because status is stopped`);
       return;
     }
     const eligibleLeader = await this.getEligibleLeader();
@@ -120,7 +121,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
         await this.ownInstance.save();
       });
     } else {
-      console.warn(`distributed instance already initialized`);
+      logger.log('warn', `distributed instance already initialized`);
     }

     // lets enable the heartbeat
@@ -149,24 +150,24 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
   public async checkAndMaybeLead() {
     await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
       this.ownInstance.data.status = 'initializing';
-      this.ownInstance.save();
+      await this.ownInstance.save();
     });
     if (await this.getEligibleLeader()) {
       await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
         await this.ownInstance.updateFromDb();
         this.ownInstance.data.status = 'settled';
         await this.ownInstance.save();
-        console.log(`${this.ownInstance.id} settled as follower`);
+        logger.log('info', `${this.ownInstance.id} settled as follower`);
       });
       return;
     } else if (
       (await DistributedClass.getInstances({})).find((instanceArg) => {
-        instanceArg.data.status === 'bidding' &&
+        return instanceArg.data.status === 'bidding' &&
           instanceArg.data.biddingStartTime <= Date.now() - 4000 &&
           instanceArg.data.biddingStartTime >= Date.now() - 30000;
       })
     ) {
-      console.log('too late to the bidding party... waiting for next round.');
+      logger.log('info', 'too late to the bidding party... waiting for next round.');
       return;
     } else {
       await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
@@ -175,9 +176,9 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
         this.ownInstance.data.biddingStartTime = Date.now();
         this.ownInstance.data.biddingShortcode = plugins.smartunique.shortId();
         await this.ownInstance.save();
-        console.log('bidding code stored.');
+        logger.log('info', 'bidding code stored.');
       });
-      console.log(`bidding for leadership...`);
+      logger.log('info', `bidding for leadership...`);
       await plugins.smartdelay.delayFor(plugins.smarttime.getMilliSecondsFromUnits({ seconds: 5 }));
       await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
         let biddingInstances = await DistributedClass.getInstances({});
@@ -187,7 +188,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
             instanceArg.data.lastUpdated >=
               Date.now() - plugins.smarttime.getMilliSecondsFromUnits({ seconds: 10 }),
         );
-        console.log(`found ${biddingInstances.length} bidding instances...`);
+        logger.log('info', `found ${biddingInstances.length} bidding instances...`);
         this.ownInstance.data.elected = true;
         for (const biddingInstance of biddingInstances) {
           if (biddingInstance.data.biddingShortcode < this.ownInstance.data.biddingShortcode) {
@@ -195,7 +196,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
           }
         }
         await plugins.smartdelay.delayFor(5000);
-        console.log(`settling with status elected = ${this.ownInstance.data.elected}`);
+        logger.log('info', `settling with status elected = ${this.ownInstance.data.elected}`);
         this.ownInstance.data.status = 'settled';
         await this.ownInstance.save();
       });
@@ -226,11 +227,11 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
     this.distributedWatcher.changeSubject.subscribe({
       next: async (distributedDoc) => {
         if (!distributedDoc) {
-          console.log(`registered deletion of instance...`);
+          logger.log('info', `registered deletion of instance...`);
           return;
         }
-        console.log(distributedDoc);
-        console.log(`registered change for ${distributedDoc.id}`);
+        logger.log('info', distributedDoc);
+        logger.log('info', `registered change for ${distributedDoc.id}`);
         distributedDoc;
       },
     });
@@ -252,7 +253,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
   ): Promise<plugins.taskbuffer.distributedCoordination.IDistributedTaskRequestResult> {
     await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
       if (!this.ownInstance) {
-        console.error('instance need to be started first...');
+        logger.log('error', 'instance need to be started first...');
         return;
       }
       await this.ownInstance.updateFromDb();
@@ -268,7 +269,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
       return taskRequestResult;
     });
     if (!result) {
-      console.warn('no result found for task request...');
+      logger.log('warn', 'no result found for task request...');
       return null;
     }
     return result;
@@ -285,7 +286,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
       );
     });
     if (!existingInfoBasis) {
-      console.warn('trying to update a non existing task request... aborting!');
+      logger.log('warn', 'trying to update a non existing task request... aborting!');
       return;
     }
     Object.assign(existingInfoBasis, infoBasisArg);
@@ -293,8 +294,10 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
     plugins.smartdelay.delayFor(60000).then(() => {
       this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
         const indexToRemove = this.ownInstance.data.taskRequests.indexOf(existingInfoBasis);
-        this.ownInstance.data.taskRequests.splice(indexToRemove, indexToRemove);
-        await this.ownInstance.save();
+        if (indexToRemove >= 0) {
+          this.ownInstance.data.taskRequests.splice(indexToRemove, 1);
+          await this.ownInstance.save();
+        }
       });
     });
   });
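The `splice` fix in the last hunk above is easy to see in isolation: `splice(i, i)` deletes `i` elements starting at index `i`, while `splice(i, 1)` removes exactly the one entry found by `indexOf` (and the new `>= 0` guard avoids `splice(-1, 1)` silently removing the last element when the item is missing):

```typescript
// The bug: deleteCount accidentally set to the index itself.
const buggy = ['a', 'b', 'c', 'd'];
const i1 = buggy.indexOf('c'); // 2
buggy.splice(i1, i1);          // removes 2 elements starting at index 2: 'c' AND 'd'

// The fix: remove exactly one element, and only when it was actually found.
const fixed = ['a', 'b', 'c', 'd'];
const i2 = fixed.indexOf('c');
if (i2 >= 0) fixed.splice(i2, 1); // removes only 'c'
```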
@@ -1,6 +1,7 @@
 import * as plugins from './plugins.js';

 import { SmartdataDb } from './classes.db.js';
+import { logger } from './logging.js';
 import { SmartdataDbCursor } from './classes.cursor.js';
 import { type IManager, SmartdataCollection } from './classes.collection.js';
 import { SmartdataDbWatcher } from './classes.watcher.js';
@@ -11,8 +12,18 @@ import { SmartdataLuceneAdapter } from './classes.lucene.adapter.js';
  * - validate: post-fetch validator, return true to keep a doc
  */
 export interface SearchOptions<T> {
   /**
    * Additional MongoDB filter to AND‐merge into the query
    */
   filter?: Record<string, any>;
   /**
    * Post‐fetch validator; return true to keep each doc
    */
   validate?: (doc: T) => Promise<boolean> | boolean;
+  /**
+   * Optional MongoDB session for transactional operations
+   */
+  session?: plugins.mongodb.ClientSession;
 }

 export type TDocCreation = 'db' | 'new' | 'mixed';
@@ -21,7 +32,7 @@ export type TDocCreation = 'db' | 'new' | 'mixed';

 export function globalSvDb() {
   return (target: SmartDataDbDoc<unknown, unknown>, key: string) => {
-    console.log(`called svDb() on >${target.constructor.name}.${key}<`);
+    logger.log('debug', `called svDb() on >${target.constructor.name}.${key}<`);
     if (!target.globalSaveableProperties) {
       target.globalSaveableProperties = [];
     }
@@ -29,16 +40,34 @@ export function globalSvDb() {
   };
 }

+/**
+ * Options for custom serialization/deserialization of a field.
+ */
+export interface SvDbOptions {
+  /** Function to serialize the field value before saving to DB */
+  serialize?: (value: any) => any;
+  /** Function to deserialize the field value after reading from DB */
+  deserialize?: (value: any) => any;
+}

 /**
  * saveable - saveable decorator to be used on class properties
  */
-export function svDb() {
+export function svDb(options?: SvDbOptions) {
   return (target: SmartDataDbDoc<unknown, unknown>, key: string) => {
-    console.log(`called svDb() on >${target.constructor.name}.${key}<`);
+    logger.log('debug', `called svDb() on >${target.constructor.name}.${key}<`);
     if (!target.saveableProperties) {
       target.saveableProperties = [];
     }
     target.saveableProperties.push(key);
+    // attach custom serializer/deserializer options to the class constructor
+    const ctor = target.constructor as any;
+    if (!ctor._svDbOptions) {
+      ctor._svDbOptions = {};
+    }
+    if (options) {
+      ctor._svDbOptions[key] = options;
+    }
   };
 }
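The new `SvDbOptions` hook makes a field round-trip through user-supplied functions. A typical serializer pair of the kind it accepts — a `Date` stored as an ISO string — can be sketched like this (the `dateOptions` name and the Date use case are illustrative examples, not library defaults):

```typescript
// A serialize/deserialize pair matching the SvDbOptions shape from the diff.
const dateOptions = {
  serialize: (value: Date) => value.toISOString(), // what gets written to MongoDB
  deserialize: (value: string) => new Date(value), // what the doc instance gets back
};

const original = new Date('2024-01-02T03:04:05.000Z');
const stored = dateOptions.serialize(original);
const restored = dateOptions.deserialize(stored);
```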
@@ -66,7 +95,7 @@ function escapeForRegex(input: string): string {
  */
 export function unI() {
   return (target: SmartDataDbDoc<unknown, unknown>, key: string) => {
-    console.log(`called unI on >>${target.constructor.name}.${key}<<`);
+    logger.log('debug', `called unI on >>${target.constructor.name}.${key}<<`);

     // mark the index as unique
     if (!target.uniqueIndexes) {
@@ -98,7 +127,7 @@ export interface IIndexOptions {
  */
 export function index(options?: IIndexOptions) {
   return (target: SmartDataDbDoc<unknown, unknown>, key: string) => {
-    console.log(`called index() on >${target.constructor.name}.${key}<`);
+    logger.log('debug', `called index() on >${target.constructor.name}.${key}<`);

     // Initialize regular indexes array if it doesn't exist
     if (!target.regularIndexes) {
@@ -122,42 +151,181 @@ export function index(options?: IIndexOptions) {
   };
 }

+// Helper type to extract element type from arrays or return T itself
+type ElementOf<T> = T extends ReadonlyArray<infer U> ? U : T;
+
+// Type for $in/$nin values - arrays of the element type
+type InValues<T> = ReadonlyArray<ElementOf<T>>;
+
+// Type that allows MongoDB operators on leaf values while maintaining nested type safety
+export type MongoFilterCondition<T> = T | {
+  $eq?: T;
+  $ne?: T;
+  $gt?: T;
+  $gte?: T;
+  $lt?: T;
+  $lte?: T;
+  $in?: InValues<T>;
+  $nin?: InValues<T>;
+  $exists?: boolean;
+  $type?: string | number;
+  $regex?: string | RegExp;
+  $options?: string;
+  $all?: T extends ReadonlyArray<infer U> ? ReadonlyArray<U> : never;
+  $elemMatch?: T extends ReadonlyArray<infer U> ? MongoFilter<U> : never;
+  $size?: T extends ReadonlyArray<any> ? number : never;
+  $not?: MongoFilterCondition<T>;
+};
+
+export type MongoFilter<T> = {
+  [K in keyof T]?: T[K] extends object
+    ? T[K] extends any[]
+      ? MongoFilterCondition<T[K]> // Arrays can have operators
+      : MongoFilter<T[K]> | MongoFilterCondition<T[K]> // Objects can be nested or have operators
+    : MongoFilterCondition<T[K]>; // Primitives get operators
+} & {
+  // Logical operators
+  $and?: MongoFilter<T>[];
+  $or?: MongoFilter<T>[];
+  $nor?: MongoFilter<T>[];
+  $not?: MongoFilter<T>;
+  // Allow any string key for dot notation (we lose type safety here but maintain flexibility)
+  [key: string]: any;
+};

 export const convertFilterForMongoDb = (filterArg: { [key: string]: any }) => {
-  // Special case: detect MongoDB operators and pass them through directly
-  const topLevelOperators = ['$and', '$or', '$nor', '$not', '$text', '$where', '$regex'];
+  // SECURITY: Block $where to prevent server-side JS execution
+  if (filterArg.$where !== undefined) {
+    throw new Error('$where operator is not allowed for security reasons');
+  }
+
+  // Handle logical operators recursively
+  const logicalOperators = ['$and', '$or', '$nor', '$not'];
+  const processedFilter: { [key: string]: any } = {};
+
   for (const key of Object.keys(filterArg)) {
-    if (topLevelOperators.includes(key)) {
-      return filterArg; // Return the filter as-is for MongoDB operators
+    if (logicalOperators.includes(key)) {
+      if (key === '$not') {
+        processedFilter[key] = convertFilterForMongoDb(filterArg[key]);
+      } else if (Array.isArray(filterArg[key])) {
+        processedFilter[key] = filterArg[key].map((subFilter: any) => convertFilterForMongoDb(subFilter));
+      }
     }
   }

+  // If only logical operators, return them
+  const hasOnlyLogicalOperators = Object.keys(filterArg).every(key => logicalOperators.includes(key));
+  if (hasOnlyLogicalOperators) {
+    return processedFilter;
+  }

   // Original conversion logic for non-MongoDB query objects
   const convertedFilter: { [key: string]: any } = {};

+  // Helper to merge operator objects
+  const mergeIntoConverted = (path: string, value: any) => {
+    const existing = convertedFilter[path];
+    if (!existing) {
+      convertedFilter[path] = value;
+    } else if (
+      typeof existing === 'object' && !Array.isArray(existing) &&
+      typeof value === 'object' && !Array.isArray(value) &&
+      (Object.keys(existing).some(k => k.startsWith('$')) || Object.keys(value).some(k => k.startsWith('$')))
+    ) {
+      // Both have operators, merge them
+      convertedFilter[path] = { ...existing, ...value };
+    } else {
+      // Otherwise later wins
+      convertedFilter[path] = value;
+    }
+  };

   const convertFilterArgument = (keyPathArg2: string, filterArg2: any) => {
     if (Array.isArray(filterArg2)) {
-      // Directly assign arrays (they might be using operators like $in or $all)
-      convertFilterArgument(keyPathArg2, filterArg2[0]);
+      // Arrays are typically used as values for operators like $in or as direct equality matches
+      mergeIntoConverted(keyPathArg2, filterArg2);
       return;
     } else if (typeof filterArg2 === 'object' && filterArg2 !== null) {
-      for (const key of Object.keys(filterArg2)) {
-        if (key.startsWith('$')) {
-          convertedFilter[keyPathArg2] = filterArg2;
-          return;
-        } else if (key.includes('.')) {
+      // Check if this is an object with MongoDB operators
+      const keys = Object.keys(filterArg2);
+      const hasOperators = keys.some(key => key.startsWith('$'));
+
+      if (hasOperators) {
+        // This object contains MongoDB operators
+        // Validate and pass through allowed operators
+        const allowedOperators = [
+          // Comparison operators
+          '$eq', '$ne', '$gt', '$gte', '$lt', '$lte',
+          // Array operators
+          '$in', '$nin', '$all', '$elemMatch', '$size',
+          // Element operators
+          '$exists', '$type',
+          // Evaluation operators (safe ones only)
+          '$regex', '$options', '$text', '$mod',
+          // Logical operators (nested)
+          '$and', '$or', '$nor', '$not'
+        ];
+
+        // Check for dangerous operators
+        if (keys.includes('$where')) {
+          throw new Error('$where operator is not allowed for security reasons');
+        }
+
+        // Validate all operators are in the allowed list
+        const invalidOperators = keys.filter(key =>
+          key.startsWith('$') && !allowedOperators.includes(key)
+        );
+
+        if (invalidOperators.length > 0) {
+          console.warn(`Warning: Unknown MongoDB operators detected: ${invalidOperators.join(', ')}`);
+        }
+
+        // For array operators, ensure the values are appropriate
+        if (filterArg2.$in && !Array.isArray(filterArg2.$in)) {
+          throw new Error('$in operator requires an array value');
+        }
+        if (filterArg2.$nin && !Array.isArray(filterArg2.$nin)) {
+          throw new Error('$nin operator requires an array value');
+        }
+        if (filterArg2.$all && !Array.isArray(filterArg2.$all)) {
+          throw new Error('$all operator requires an array value');
+        }
+        if (filterArg2.$size && typeof filterArg2.$size !== 'number') {
+          throw new Error('$size operator requires a numeric value');
+        }
+
+        // Use merge helper to handle duplicate paths
+        mergeIntoConverted(keyPathArg2, filterArg2);
+        return;
+      }
+
+      // No operators, check for dots in keys
+      for (const key of keys) {
+        if (key.includes('.')) {
           throw new Error('keys cannot contain dots');
         }
       }
-      for (const key of Object.keys(filterArg2)) {

+      // Recursively process nested objects
+      for (const key of keys) {
         convertFilterArgument(`${keyPathArg2}.${key}`, filterArg2[key]);
       }
     } else {
-      convertedFilter[keyPathArg2] = filterArg2;
+      // Primitive values
+      mergeIntoConverted(keyPathArg2, filterArg2);
     }
   };

   for (const key of Object.keys(filterArg)) {
-    convertFilterArgument(key, filterArg[key]);
+    // Skip logical operators, they were already processed
+    if (!logicalOperators.includes(key)) {
+      convertFilterArgument(key, filterArg[key]);
+    }
   }

+  // Add back processed logical operators
+  Object.assign(convertedFilter, processedFilter);

   return convertedFilter;
 };
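The core behavior of the rewritten converter above — nested plain objects flatten to dot paths, objects carrying `$`-operators pass through as leaf values — can be captured in a stand-alone sketch. This is a simplified illustration of the technique only; it omits the real function's operator validation, `$where` blocking, and logical-operator handling:

```typescript
// Minimal dot-path flattening with operator pass-through and operator-object merging.
function flattenFilter(
  filter: Record<string, any>,
  prefix = '',
  out: Record<string, any> = {},
): Record<string, any> {
  for (const key of Object.keys(filter)) {
    const path = prefix ? `${prefix}.${key}` : key;
    const value = filter[key];
    const isOperatorObject =
      value !== null && typeof value === 'object' && !Array.isArray(value) &&
      Object.keys(value).some(k => k.startsWith('$'));
    if (isOperatorObject) {
      // Operator objects become leaf values; duplicate paths merge their operators.
      const existing = out[path];
      out[path] = existing && typeof existing === 'object' ? { ...existing, ...value } : value;
    } else if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
      // Plain nested objects recurse into dot notation.
      flattenFilter(value, path, out);
    } else {
      out[path] = value;
    }
  }
  return out;
}

const converted = flattenFilter({ profile: { age: { $gte: 18 }, city: 'Berlin' } });
```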
@@ -179,7 +347,12 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
|
||||
const newInstance = new this();
|
||||
(newInstance as any).creationStatus = 'db';
|
||||
for (const key of Object.keys(mongoDbNativeDocArg)) {
|
||||
newInstance[key] = mongoDbNativeDocArg[key];
|
||||
const rawValue = mongoDbNativeDocArg[key];
|
||||
const optionsMap = (this as any)._svDbOptions || {};
|
||||
const opts = optionsMap[key];
|
||||
newInstance[key] = opts && typeof opts.deserialize === 'function'
|
||||
? opts.deserialize(rawValue)
|
||||
: rawValue;
|
||||
}
|
||||
return newInstance;
|
||||
}
|
||||
@@ -187,14 +360,19 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
|
||||
/**
|
||||
* gets all instances as array
|
||||
* @param this
|
||||
* @param filterArg
|
||||
* @param filterArg - Type-safe MongoDB filter with nested object support and operators
|
||||
* @returns
|
||||
*/
|
||||
public static async getInstances<T>(
|
||||
this: plugins.tsclass.typeFest.Class<T>,
|
||||
filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
|
||||
filterArg: MongoFilter<T>,
|
||||
opts?: { session?: plugins.mongodb.ClientSession }
|
||||
): Promise<T[]> {
|
||||
const foundDocs = await (this as any).collection.findAll(convertFilterForMongoDb(filterArg));
|
||||
// Pass session through to findAll for transactional queries
|
||||
const foundDocs = await (this as any).collection.findAll(
|
||||
convertFilterForMongoDb(filterArg),
|
||||
{ session: opts?.session },
|
||||
);
|
||||
const returnArray = [];
|
||||
for (const foundDoc of foundDocs) {
|
||||
const newInstance: T = (this as any).createInstanceFromMongoDbNativeDoc(foundDoc);
|
||||
@@ -211,9 +389,14 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
|
||||
*/
|
||||
public static async getInstance<T>(
|
||||
this: plugins.tsclass.typeFest.Class<T>,
|
||||
filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
|
||||
filterArg: MongoFilter<T>,
|
||||
opts?: { session?: plugins.mongodb.ClientSession }
|
||||
): Promise<T> {
|
||||
const foundDoc = await (this as any).collection.findOne(convertFilterForMongoDb(filterArg));
|
||||
// Retrieve one document, with optional session for transactions
|
||||
const foundDoc = await (this as any).collection.findOne(
|
||||
convertFilterForMongoDb(filterArg),
|
||||
{ session: opts?.session },
|
||||
);
|
||||
if (foundDoc) {
|
||||
const newInstance: T = (this as any).createInstanceFromMongoDbNativeDoc(foundDoc);
|
||||
return newInstance;
|
||||
@@ -233,33 +416,27 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
|
||||
}
|
||||
|
||||
/**
|
||||
* get cursor
|
||||
* @returns
|
||||
* Get a cursor for streaming results, with optional session and native cursor modifiers.
|
||||
* @param filterArg Partial filter to apply
|
||||
* @param opts Optional session and modifier for the raw MongoDB cursor
|
||||
*/
|
||||
public static async getCursor<T>(
|
||||
this: plugins.tsclass.typeFest.Class<T>,
|
||||
filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
|
||||
) {
|
||||
const collection: SmartdataCollection<T> = (this as any).collection;
|
||||
const cursor: SmartdataDbCursor<T> = await collection.getCursor(
|
||||
convertFilterForMongoDb(filterArg),
|
||||
this as any as typeof SmartDataDbDoc,
|
||||
);
|
||||
return cursor;
|
||||
}
|
||||
|
||||
public static async getCursorExtended<T>(
|
||||
this: plugins.tsclass.typeFest.Class<T>,
|
||||
filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
|
||||
modifierFunction = (cursorArg: plugins.mongodb.FindCursor<plugins.mongodb.WithId<plugins.mongodb.BSON.Document>>) => cursorArg,
|
||||
filterArg: MongoFilter<T>,
|
||||
opts?: {
|
||||
session?: plugins.mongodb.ClientSession;
|
||||
modifier?: (cursorArg: plugins.mongodb.FindCursor<plugins.mongodb.WithId<plugins.mongodb.BSON.Document>>) => plugins.mongodb.FindCursor<plugins.mongodb.WithId<plugins.mongodb.BSON.Document>>;
|
||||
}
|
||||
): Promise<SmartdataDbCursor<T>> {
|
||||
const collection: SmartdataCollection<T> = (this as any).collection;
|
||||
const { session, modifier } = opts || {};
|
||||
await collection.init();
|
||||
let cursor: plugins.mongodb.FindCursor<any> = collection.mongoDbCollection.find(
|
||||
convertFilterForMongoDb(filterArg),
|
||||
);
|
||||
cursor = modifierFunction(cursor);
|
||||
return new SmartdataDbCursor<T>(cursor, this as any as typeof SmartDataDbDoc);
|
||||
let rawCursor: plugins.mongodb.FindCursor<any> =
|
||||
collection.mongoDbCollection.find(convertFilterForMongoDb(filterArg), { session });
|
||||
if (modifier) {
|
||||
rawCursor = modifier(rawCursor);
|
||||
}
|
||||
return new SmartdataDbCursor<T>(rawCursor, this as any as typeof SmartDataDbDoc);
|
||||
}
|
||||
|
||||
/**
|
||||
@@ -268,13 +445,20 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
|
||||
* @param filterArg
|
||||
* @param forEachFunction
|
||||
*/
|
||||
/**
|
||||
* Watch the collection for changes, with optional buffering and change stream options.
|
||||
* @param filterArg MongoDB filter to select which changes to observe
|
||||
* @param opts optional ChangeStreamOptions plus bufferTimeMs
|
||||
*/
|
||||
public static async watch<T>(
|
||||
this: plugins.tsclass.typeFest.Class<T>,
|
||||
filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
|
||||
) {
|
||||
filterArg: MongoFilter<T>,
|
||||
opts?: plugins.mongodb.ChangeStreamOptions & { bufferTimeMs?: number },
|
||||
): Promise<SmartdataDbWatcher<T>> {
|
||||
const collection: SmartdataCollection<T> = (this as any).collection;
|
||||
const watcher: SmartdataDbWatcher<T> = await collection.watch(
|
||||
convertFilterForMongoDb(filterArg),
|
||||
opts || {},
|
||||
this as any,
|
||||
);
|
||||
return watcher;
|
||||
@@ -286,7 +470,7 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
|
||||
*/
|
||||
public static async forEach<T>(
|
||||
this: plugins.tsclass.typeFest.Class<T>,
|
||||
filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
|
||||
filterArg: MongoFilter<T>,
|
||||
forEachFunction: (itemArg: T) => Promise<any>,
|
||||
) {
|
||||
const cursor: SmartdataDbCursor<T> = await (this as any).getCursor(filterArg);
|
||||
@@ -298,7 +482,7 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
|
||||
*/
|
||||
public static async getCount<T>(
|
||||
this: plugins.tsclass.typeFest.Class<T>,
|
||||
filterArg: plugins.tsclass.typeFest.PartialDeep<T> = {} as any,
|
||||
filterArg: MongoFilter<T> = {} as any,
|
||||
) {
|
||||
const collection: SmartdataCollection<T> = (this as any).collection;
|
||||
return await collection.getCount(filterArg);
|
||||
```diff
@@ -339,7 +523,9 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
     if (opts?.filter) {
       mongoFilter = { $and: [mongoFilter, opts.filter] };
     }
-    let docs: T[] = await (this as any).getInstances(mongoFilter);
+    // Fetch with optional session for transactions
+    // Fetch within optional session
+    let docs: T[] = await (this as any).getInstances(mongoFilter, { session: opts?.session });
     if (opts?.validate) {
       const out: T[] = [];
       for (const d of docs) {
```
```diff
@@ -546,35 +732,52 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
   constructor() {}

   /**
-   * saves this instance but not any connected items
-   * may lead to data inconsistencies, but is faster
+   * saves this instance (optionally within a transaction)
    */
-  public async save() {
+  public async save(opts?: { session?: plugins.mongodb.ClientSession }) {
+    // allow hook before saving
+    if (typeof (this as any).beforeSave === 'function') {
+      await (this as any).beforeSave();
+    }
     // tslint:disable-next-line: no-this-assignment
     const self: any = this;
     let dbResult: any;

+    // update timestamp
     this._updatedAt = new Date().toISOString();

+    // perform insert or update
     switch (this.creationStatus) {
       case 'db':
-        dbResult = await this.collection.update(self);
+        dbResult = await this.collection.update(self, { session: opts?.session });
         break;
       case 'new':
-        dbResult = await this.collection.insert(self);
+        dbResult = await this.collection.insert(self, { session: opts?.session });
         this.creationStatus = 'db';
         break;
       default:
-        console.error('neither new nor in db?');
+        logger.log('error', 'neither new nor in db?');
     }
+    // allow hook after saving
+    if (typeof (this as any).afterSave === 'function') {
+      await (this as any).afterSave();
+    }
     return dbResult;
   }

   /**
-   * deletes a document from the database
+   * deletes a document from the database (optionally within a transaction)
    */
-  public async delete() {
-    await this.collection.delete(this);
+  public async delete(opts?: { session?: plugins.mongodb.ClientSession }) {
+    // allow hook before deleting
+    if (typeof (this as any).beforeDelete === 'function') {
+      await (this as any).beforeDelete();
+    }
+    // perform deletion
+    const result = await this.collection.delete(this, { session: opts?.session });
+    // allow hook after delete
+    if (typeof (this as any).afterDelete === 'function') {
+      await (this as any).afterDelete();
+    }
+    return result;
   }

   /**
```
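The reworked `save()`/`delete()` invoke optional lifecycle hooks via duck typing: a subclass opts in simply by declaring `beforeSave`, `afterSave`, and friends. A minimal self-contained sketch of that pattern (hook names match the diff; the actual persistence is stubbed out here):

```typescript
// Duck-typed lifecycle hooks: the base class checks at runtime whether the
// subclass defines a hook and awaits it around the (stubbed) persistence step.
class BaseDoc {
  public saved = false;

  public async save(): Promise<void> {
    if (typeof (this as any).beforeSave === 'function') {
      await (this as any).beforeSave();
    }
    this.saved = true; // stand-in for the real insert/update
    if (typeof (this as any).afterSave === 'function') {
      await (this as any).afterSave();
    }
  }
}

class AuditedDoc extends BaseDoc {
  public log: string[] = [];
  public async beforeSave() { this.log.push('before'); }
  public async afterSave() { this.log.push('after'); }
}
```

A plain `BaseDoc` saves without any hook overhead, while `AuditedDoc` records the hook order; the `typeof … === 'function'` guard is what keeps the hooks optional.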
```diff
@@ -598,11 +801,20 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
   /**
    * updates an object from db
    */
-  public async updateFromDb() {
+  public async updateFromDb(): Promise<boolean> {
     const mongoDbNativeDoc = await this.collection.findOne(await this.createIdentifiableObject());
-    for (const key of Object.keys(mongoDbNativeDoc)) {
-      this[key] = mongoDbNativeDoc[key];
+    if (!mongoDbNativeDoc) {
+      return false; // Document not found in database
+    }
+    for (const key of Object.keys(mongoDbNativeDoc)) {
+      const rawValue = mongoDbNativeDoc[key];
+      const optionsMap = (this.constructor as any)._svDbOptions || {};
+      const opts = optionsMap[key];
+      this[key] = opts && typeof opts.deserialize === 'function'
+        ? opts.deserialize(rawValue)
+        : rawValue;
     }
+    return true;
   }

   /**
```
```diff
@@ -610,9 +822,17 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
    */
   public async createSavableObject(): Promise<TImplements> {
     const saveableObject: unknown = {}; // is not exposed to outside, so any is ok here
-    const saveableProperties = [...this.globalSaveableProperties, ...this.saveableProperties];
+    const globalProps = this.globalSaveableProperties || [];
+    const specificProps = this.saveableProperties || [];
+    const saveableProperties = [...globalProps, ...specificProps];
+    // apply custom serialization if configured
+    const optionsMap = (this.constructor as any)._svDbOptions || {};
     for (const propertyNameString of saveableProperties) {
-      saveableObject[propertyNameString] = this[propertyNameString];
+      const rawValue = (this as any)[propertyNameString];
+      const opts = optionsMap[propertyNameString];
+      (saveableObject as any)[propertyNameString] = opts && typeof opts.serialize === 'function'
+        ? opts.serialize(rawValue)
+        : rawValue;
     }
     return saveableObject as TImplements;
   }
```
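Both `createSavableObject()` and `updateFromDb()` consult a per-property `_svDbOptions` map for custom `serialize`/`deserialize` functions. A self-contained sketch of that round-trip (the map is built by hand here; in the library the `@svDb()` decorator presumably registers these entries):

```typescript
// Per-property serialize/deserialize options, mirroring the _svDbOptions
// lookup in the diff. Properties without an entry pass through unchanged.
type TSvDbOptions = { serialize?: (v: any) => any; deserialize?: (v: any) => any };

const optionsMap: Record<string, TSvDbOptions> = {
  createdAt: {
    serialize: (d: Date) => d.toISOString(),       // store Dates as ISO strings
    deserialize: (s: string) => new Date(s),       // revive them on read
  },
};

// Equivalent of the createSavableObject() loop: raw -> storable value.
function toSavable(doc: Record<string, any>, props: string[]) {
  const out: Record<string, any> = {};
  for (const prop of props) {
    const raw = doc[prop];
    const opts = optionsMap[prop];
    out[prop] = opts && typeof opts.serialize === 'function' ? opts.serialize(raw) : raw;
  }
  return out;
}

// Equivalent of the updateFromDb() loop: stored value -> revived value.
function fromDb(target: Record<string, any>, nativeDoc: Record<string, any>) {
  for (const key of Object.keys(nativeDoc)) {
    const raw = nativeDoc[key];
    const opts = optionsMap[key];
    target[key] = opts && typeof opts.deserialize === 'function' ? opts.deserialize(raw) : raw;
  }
}
```

The `typeof … === 'function'` guards mean a half-configured entry (only `serialize`, say) degrades to pass-through on the other direction instead of crashing.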
```diff
@@ -18,7 +18,7 @@ export class EasyStore<T> {
   public nameId: string;

   @svDb()
-  public ephermal: {
+  public ephemeral: {
     activated: boolean;
     timeout: number;
   };
```
```diff
@@ -32,8 +32,8 @@ export class EasyStore<T> {
     return SmartdataEasyStore;
   })();

-  constructor(nameIdArg: string, smnartdataDbRefArg: SmartdataDb) {
-    this.smartdataDbRef = smnartdataDbRefArg;
+  constructor(nameIdArg: string, smartdataDbRefArg: SmartdataDb) {
+    this.smartdataDbRef = smartdataDbRefArg;
     this.nameId = nameIdArg;
   }
```
```diff
@@ -110,10 +110,12 @@ export class EasyStore<T> {
     await easyStore.save();
   }

-  public async cleanUpEphermal() {
-    while (
-      (await this.smartdataDbRef.statusConnectedDeferred.promise) &&
-      this.smartdataDbRef.status === 'connected'
-    ) {}
+  public async cleanUpEphemeral() {
+    // Clean up ephemeral data periodically while connected
+    while (this.smartdataDbRef.status === 'connected') {
+      await plugins.smartdelay.delayFor(60000); // Check every minute
+      // TODO: Implement actual cleanup logic for ephemeral data
+      // For now, this prevents the infinite CPU loop
+    }
+  }
 }
```
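The hunk above replaces a busy-wait (`while (…) {}`), which pins a CPU core, with a loop that sleeps between iterations. A dependency-free sketch of that pattern, with the status check and interval injected so the loop is testable without waiting a real minute (the parameter names are illustrative, not the library's):

```typescript
// Periodic-cleanup loop: sleeping between iterations yields to the event
// loop instead of spinning the CPU, which is what the busy-wait did.
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

async function cleanUpEphemeral(
  isConnected: () => boolean,          // e.g. db.status === 'connected'
  runCleanup: () => Promise<void>,     // the actual cleanup work
  intervalMs: number,                  // 60000 in the diff
): Promise<number> {
  let cycles = 0;
  while (isConnected()) {
    await sleep(intervalMs); // yield instead of busy-waiting
    await runCleanup();
    cycles++;
  }
  return cycles;
}
```

Because `isConnected` is re-evaluated each iteration, the loop terminates cleanly once the connection drops, rather than needing an external kill switch.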
```diff
@@ -2,6 +2,7 @@
  * Lucene to MongoDB query adapter for SmartData
  */
 import * as plugins from './plugins.js';
+import { logger } from './logging.js';

 // Types
 type NodeType =
@@ -754,7 +755,7 @@ export class SmartdataLuceneAdapter {
       // Transform the AST to a MongoDB query
       return this.transformWithFields(ast, fieldsToSearch);
     } catch (error) {
-      console.error(`Failed to convert Lucene query "${luceneQuery}":`, error);
+      logger.log('error', `Failed to convert Lucene query "${luceneQuery}":`, error);
       throw new Error(`Failed to convert Lucene query: ${error}`);
     }
   }
```
```diff
@@ -1,37 +1,73 @@
 import { SmartDataDbDoc } from './classes.doc.js';
 import * as plugins from './plugins.js';
+import { EventEmitter } from 'events';

-/**
- * a wrapper for the native mongodb cursor. Exposes better
- */
-export class SmartdataDbWatcher<T = any> {
+/**
+ * Wraps a MongoDB ChangeStream with RxJS and EventEmitter support.
+ */
+export class SmartdataDbWatcher<T = any> extends EventEmitter {
   // STATIC
   public readyDeferred = plugins.smartpromise.defer();

   // INSTANCE
   private changeStream: plugins.mongodb.ChangeStream<T>;
-
-  public changeSubject = new plugins.smartrx.rxjs.Subject<T>();
+  private rawSubject: plugins.smartrx.rxjs.Subject<T>;
+  /** Emits change documents (or arrays of documents if buffered) */
+  public changeSubject: any;
+  /**
+   * @param changeStreamArg native MongoDB ChangeStream
+   * @param smartdataDbDocArg document class for instance creation
+   * @param opts.bufferTimeMs optional milliseconds to buffer events via RxJS
+   */
   constructor(
     changeStreamArg: plugins.mongodb.ChangeStream<T>,
     smartdataDbDocArg: typeof SmartDataDbDoc,
+    opts?: { bufferTimeMs?: number },
   ) {
+    super();
+    this.rawSubject = new plugins.smartrx.rxjs.Subject<T>();
+    // Apply buffering if requested
+    if (opts && opts.bufferTimeMs) {
+      this.changeSubject = this.rawSubject.pipe(plugins.smartrx.rxjs.ops.bufferTime(opts.bufferTimeMs));
+    } else {
+      this.changeSubject = this.rawSubject;
+    }
     this.changeStream = changeStreamArg;
     this.changeStream.on('change', async (item: any) => {
-      if (!item.fullDocument) {
-        this.changeSubject.next(null);
-        return;
+      let docInstance: T = null;
+      if (item.fullDocument) {
+        docInstance = smartdataDbDocArg.createInstanceFromMongoDbNativeDoc(
+          item.fullDocument
+        ) as any as T;
       }
-      this.changeSubject.next(
-        smartdataDbDocArg.createInstanceFromMongoDbNativeDoc(item.fullDocument) as any as T,
-      );
+      // Notify subscribers
+      this.rawSubject.next(docInstance);
+      this.emit('change', docInstance);
     });
+    // Signal readiness after one tick
     plugins.smartdelay.delayFor(0).then(() => {
       this.readyDeferred.resolve();
     });
   }

-  public async close() {
+  /**
+   * Close the change stream, complete the RxJS subject, and remove listeners.
+   */
+  public async close(): Promise<void> {
+    // Close MongoDB ChangeStream
     await this.changeStream.close();
+    // Complete the subject to teardown any buffering operators
+    this.rawSubject.complete();
+    // Remove all EventEmitter listeners
+    this.removeAllListeners();
   }
+  /**
+   * Alias for close(), matching README usage
+   */
+  public async stop(): Promise<void> {
+    return this.close();
+  }
 }
```