Compare commits
30 Commits
| Author | SHA1 | Date |
| --- | --- | --- |
| | c80df05fdf | |
| | 9be43a85ef | |
| | bf66209d3e | |
| | cdd1ae2c9b | |
| | f4290ae7f7 | |
| | e58c0fd215 | |
| | a91fac450a | |
| | 5cb043009c | |
| | 4a1f11b885 | |
| | 43f9033ccc | |
| | e7c0951786 | |
| | efc107907c | |
| | 2b8b0e5bdd | |
| | 3ae2a7fcf5 | |
| | 0806d3749b | |
| | f5d5e20a97 | |
| | db2767010d | |
| | e2dc094afd | |
| | 39d2957b7d | |
| | 490524516e | |
| | ccd4b9e1ec | |
| | 9c6d6d9f2c | |
| | e4d787096e | |
| | 2bf923b4f1 | |
| | 0ca1d452b4 | |
| | 436311ab06 | |
| | 498f586ddb | |
| | 6c50bd23ec | |
| | 419eb163f4 | |
| | 75aeb12e81 | |
BIN .serena/cache/typescript/document_symbols_cache_v23-06-25.pkl vendored Normal file
Binary file not shown.
44 .serena/memories/code_style_conventions.md Normal file
@@ -0,0 +1,44 @@
# Code Style & Conventions

## TypeScript Standards
- **Target**: ES2022
- **Module System**: ESM with NodeNext resolution
- **Decorators**: Experimental decorators enabled
- **Strict Mode**: Implied through TypeScript configuration

## Naming Conventions
- **Interfaces**: Prefix with `I` (e.g., `IUserData`, `IConfig`)
- **Types**: Prefix with `T` (e.g., `TResponseType`, `TQueryResult`)
- **Classes**: PascalCase (e.g., `SmartdataDb`, `SmartDataDbDoc`)
- **Files**: All lowercase (e.g., `classes.doc.ts`, `plugins.ts`)
- **Methods**: camelCase (e.g., `findOne`, `saveToDb`)

## Import Patterns
- All external dependencies imported in `ts/plugins.ts`
- Reference as `plugins.moduleName.method()`
- Use full import paths for internal modules
- Maintain ESM syntax throughout
## Class Structure
- Use decorators for MongoDB document definitions
- Extend base classes (SmartDataDbDoc, SmartDataDbCollection)
- Static methods for factory patterns
- Instance methods for document operations

## Async Patterns
- Preserve Promise-based patterns
- Use async/await for clarity
- Handle errors appropriately
- Return typed Promises

## MongoDB Specifics
- Use `@unify()` decorator for unique fields
- Use `@svDb()` decorator for database fields
- Implement proper serialization/deserialization
- Type-safe query construction with `DeepQuery<T>`

## Testing Patterns
- Import from `@git.zone/tstest/tapbundle`
- End test files with `export default tap.start()`
- Use descriptive test names
- Cover edge cases and error conditions
37 .serena/memories/project_overview.md Normal file
@@ -0,0 +1,37 @@
# Project Overview: @push.rocks/smartdata

## Purpose
An advanced TypeScript-first MongoDB wrapper library providing enterprise-grade features for distributed systems, real-time data synchronization, and easy data management.

## Tech Stack
- **Language**: TypeScript (ES2022 target)
- **Runtime**: Node.js >= 16.x
- **Database**: MongoDB >= 4.4
- **Build System**: tsbuild
- **Test Framework**: tstest with tapbundle
- **Package Manager**: pnpm (v10.7.0)
- **Module System**: ESM (ES Modules)

## Key Features
- Type-safe MongoDB integration with decorators
- Document management with automatic timestamps
- EasyStore for key-value storage
- Distributed coordination with leader election
- Real-time data sync with RxJS watchers
- Deep query type safety
- Enhanced cursor API
- Powerful search capabilities

## Project Structure
- **ts/**: Main TypeScript source code
  - Core classes for DB, Collections, Documents, Cursors
  - Distributed coordinator, EasyStore, Watchers
  - Lucene adapter for search functionality
- **test/**: Test files using tstest framework
- **dist_ts/**: Compiled JavaScript output

## Key Dependencies
- MongoDB driver (v6.18.0)
- @push.rocks ecosystem packages
- @tsclass/tsclass for decorators
- RxJS for reactive programming
35 .serena/memories/suggested_commands.md Normal file
@@ -0,0 +1,35 @@
# Suggested Commands for @push.rocks/smartdata

## Build & Development
- `pnpm build` - Build the TypeScript project with web support
- `pnpm buildDocs` - Generate documentation using tsdoc
- `tsbuild --web --allowimplicitany` - Direct build command

## Testing
- `pnpm test` - Run all tests in test/ directory
- `pnpm testSearch` - Run specific search test
- `tstest test/test.specific.ts --verbose` - Run specific test with verbose output
- `tsbuild check test/**/* --skiplibcheck` - Type check test files

## Package Management
- `pnpm install` - Install dependencies
- `pnpm install --save-dev <package>` - Add dev dependency
- `pnpm add <package>` - Add production dependency

## Version Control
- `git status` - Check current changes
- `git diff` - View uncommitted changes
- `git log --oneline -10` - View recent commits
- `git mv <old> <new>` - Move/rename files preserving history

## System Utilities (Linux)
- `ls -la` - List all files with details
- `grep -r "pattern" .` - Search for pattern in files
- `find . -name "*.ts"` - Find TypeScript files
- `ps aux | grep node` - Find Node.js processes
- `lsof -i :80` - Check process on port 80

## Debug & Development
- `tsx <script.ts>` - Run TypeScript file directly
- Store debug scripts in `.nogit/debug/`
- Curl endpoints for API testing
33 .serena/memories/task_completion_checklist.md Normal file
@@ -0,0 +1,33 @@
# Task Completion Checklist

When completing any coding task in this project, always:

## Before Committing
1. **Build the project**: Run `pnpm build` to ensure TypeScript compiles
2. **Run tests**: Execute `pnpm test` to verify nothing is broken
3. **Type check**: Verify types compile correctly
4. **Check for lint issues**: Look for any code style violations

## Code Quality Checks
- Verify all imports are in `ts/plugins.ts` for external dependencies
- Check that interfaces are prefixed with `I`
- Check that types are prefixed with `T`
- Ensure filenames are lowercase
- Verify async patterns are preserved where needed
- Check that decorators are properly used for MongoDB documents

## Documentation
- Update relevant comments if functionality changed
- Ensure new exports are properly documented
- Update readme.md if new features added (only if explicitly requested)

## Git Hygiene
- Make small, focused commits
- Write clear commit messages
- Use `git mv` for file operations
- Never commit sensitive data or keys

## Final Verification
- Test the specific functionality that was changed
- Ensure no unintended side effects
- Verify the change solves the original problem completely
68 .serena/project.yml Normal file
@@ -0,0 +1,68 @@
# language of the project (csharp, python, rust, java, typescript, go, cpp, or ruby)
# * For C, use cpp
# * For JavaScript, use typescript
# Special requirements:
# * csharp: Requires the presence of a .sln file in the project folder.
language: typescript

# whether to use the project's gitignore file to ignore files
# Added on 2025-04-07
ignore_all_files_in_gitignore: true

# list of additional paths to ignore
# same syntax as gitignore, so you can use * and **
# Was previously called `ignored_dirs`, please update your config if you are using that.
# Added (renamed) on 2025-04-07
ignored_paths: []

# whether the project is in read-only mode
# If set to true, all editing tools will be disabled and attempts to use them will result in an error
# Added on 2025-04-18
read_only: false

# list of tool names to exclude. We recommend not excluding any tools, see the readme for more details.
# Below is the complete list of tools for convenience.
# To make sure you have the latest list of tools, and to view their descriptions,
# execute `uv run scripts/print_tool_overview.py`.
#
# * `activate_project`: Activates a project by name.
# * `check_onboarding_performed`: Checks whether project onboarding was already performed.
# * `create_text_file`: Creates/overwrites a file in the project directory.
# * `delete_lines`: Deletes a range of lines within a file.
# * `delete_memory`: Deletes a memory from Serena's project-specific memory store.
# * `execute_shell_command`: Executes a shell command.
# * `find_referencing_code_snippets`: Finds code snippets in which the symbol at the given location is referenced.
# * `find_referencing_symbols`: Finds symbols that reference the symbol at the given location (optionally filtered by type).
# * `find_symbol`: Performs a global (or local) search for symbols with/containing a given name/substring (optionally filtered by type).
# * `get_current_config`: Prints the current configuration of the agent, including the active and available projects, tools, contexts, and modes.
# * `get_symbols_overview`: Gets an overview of the top-level symbols defined in a given file.
# * `initial_instructions`: Gets the initial instructions for the current project.
#    Should only be used in settings where the system prompt cannot be set,
#    e.g. in clients you have no control over, like Claude Desktop.
# * `insert_after_symbol`: Inserts content after the end of the definition of a given symbol.
# * `insert_at_line`: Inserts content at a given line in a file.
# * `insert_before_symbol`: Inserts content before the beginning of the definition of a given symbol.
# * `list_dir`: Lists files and directories in the given directory (optionally with recursion).
# * `list_memories`: Lists memories in Serena's project-specific memory store.
# * `onboarding`: Performs onboarding (identifying the project structure and essential tasks, e.g. for testing or building).
# * `prepare_for_new_conversation`: Provides instructions for preparing for a new conversation (in order to continue with the necessary context).
# * `read_file`: Reads a file within the project directory.
# * `read_memory`: Reads the memory with the given name from Serena's project-specific memory store.
# * `remove_project`: Removes a project from the Serena configuration.
# * `replace_lines`: Replaces a range of lines within a file with new content.
# * `replace_symbol_body`: Replaces the full definition of a symbol.
# * `restart_language_server`: Restarts the language server, may be necessary when edits not through Serena happen.
# * `search_for_pattern`: Performs a search for a pattern in the project.
# * `summarize_changes`: Provides instructions for summarizing the changes made to the codebase.
# * `switch_modes`: Activates modes by providing a list of their names
# * `think_about_collected_information`: Thinking tool for pondering the completeness of collected information.
# * `think_about_task_adherence`: Thinking tool for determining whether the agent is still on track with the current task.
# * `think_about_whether_you_are_done`: Thinking tool for determining whether the task is truly completed.
# * `write_memory`: Writes a named memory (for future reference) to Serena's project-specific memory store.
excluded_tools: []

# initial prompt for the project. It will always be given to the LLM upon activating the project
# (contrary to the memories, which are loaded on demand).
initial_prompt: ""

project_name: "smartdata"
108 changelog.md
@@ -1,5 +1,113 @@
# Changelog

## 2025-08-18 - 5.16.2 - fix(readme)
Update README: clarify examples, expand search/cursor/docs and add local Claude settings

- Refined README wording and structure: clearer Quick Start, improved examples and developer-focused phrasing
- Expanded documentation for search, cursors, change streams, distributed coordination, transactions and EasyStore with more concrete code examples
- Adjusted code examples to show safer defaults (ID generation, status/tags, connection pooling) and improved best-practices guidance
- Added .claude/settings.local.json to provide local assistant/CI permission configuration

## 2025-08-12 - 5.16.1 - fix(core)
Improve error handling and logging; enhance search query sanitization; update dependency versions and documentation

- Replaced console.log and console.warn with structured logger.log calls throughout the core modules
- Enhanced database initialization with try/catch and proper URI credential encoding
- Improved search query conversion by disallowing dangerous operators (e.g. $where) and securely escaping regex patterns
- Bumped dependency versions (smartlog, @tsclass/tsclass, mongodb, etc.) in package.json
- Added detailed project memories including code style, project overview, and suggested commands for developers
- Updated README with improved instructions, feature highlights, and quick start sections

## 2025-04-25 - 5.16.0 - feat(watcher)
Enhance change stream watchers with buffering and EventEmitter support; update dependency versions

- Bumped smartmongo from ^2.0.11 to ^2.0.12 and smartrx from ^3.0.7 to ^3.0.10
- Upgraded @tsclass/tsclass to ^9.0.0 and mongodb to ^6.16.0
- Refactored the watch API to accept additional options (bufferTimeMs, fullDocument) for improved change stream handling
- Modified SmartdataDbWatcher to extend EventEmitter and support event notifications

## 2025-04-24 - 5.15.1 - fix(cursor)
Improve cursor usage documentation and refactor getCursor API to support native cursor modifiers

- Updated examples in readme.md to demonstrate manual iteration using cursor.next() and proper cursor closing.
- Refactored the getCursor method in classes.doc.ts to accept session and modifier options, consolidating cursor handling.
- Added new tests in test/test.cursor.ts to verify cursor operations, including limits, sorting, and skipping.

## 2025-04-24 - 5.15.0 - feat(svDb)
Enhance svDb decorator to support custom serialization and deserialization options

- Added an optional options parameter to the svDb decorator to accept serialize/deserialize functions
- Updated instance creation logic (updateFromDb) to apply custom deserialization if provided
- Updated createSavableObject to use custom serialization when available

## 2025-04-23 - 5.14.1 - fix(db operations)
Update transaction API to consistently pass optional session parameters across database operations

- Revised transaction support in readme to use startSession without await and showcased session usage in getInstance and save calls
- Updated methods in classes.collection.ts to accept an optional session parameter for findOne, getCursor, findAll, insert, update, delete, and getCount
- Enhanced SmartDataDbDoc save and delete methods to propagate session parameters
- Improved overall consistency of transactional APIs across the library

## 2025-04-23 - 5.14.0 - feat(doc)
Implement support for beforeSave, afterSave, beforeDelete, and afterDelete lifecycle hooks in document save and delete operations to allow custom logic execution during these critical moments.

- Calls the beforeSave hook, if defined, before performing an insert or update.
- Calls the afterSave hook after a document is saved.
- Calls the beforeDelete hook before deletion and the afterDelete hook afterward.
- Ensures the _updatedAt timestamp is refreshed during save operations.
## 2025-04-22 - 5.13.1 - fix(search)
Improve search query parsing for implicit AND queries by preserving quoted substrings and better handling free terms, quoted phrases, and field:value tokens.

- Replace previous implicit AND logic with tokenization that preserves quoted substrings
- Support both free term and field:value tokens with wildcards inside quotes
- Ensure errors are thrown for non-searchable fields in field-specific queries

## 2025-04-22 - 5.13.0 - feat(search)
Improve search query handling and update documentation

- Added 'codex.md' providing a high-level project overview and detailed search API documentation.
- Enhanced search parsing in SmartDataDbDoc to support combined free-term and quoted field phrase queries.
- Introduced a new fallback branch in the search method to handle free term with quoted field input.
- Updated tests in test/test.search.ts to cover new combined query scenarios and ensure robust behavior.

## 2025-04-22 - 5.12.2 - fix(search)
Fix handling of quoted wildcard patterns in field-specific search queries and add tests for location-based wildcard phrase searches

- Strip surrounding quotes from wildcard patterns in field queries to correctly transform them to regex
- Introduce new tests in test/test.search.ts to validate exact quoted and unquoted wildcard searches on a location field

## 2025-04-22 - 5.12.1 - fix(search)
Improve implicit AND logic for mixed free term and field queries in search and enhance wildcard field handling.

- Updated regex for field:value parsing to capture full value with wildcards.
- Added explicit handling for free terms by converting to regex across searchable fields.
- Improved error messaging for attempts to search non-searchable fields.
- Extended tests to cover combined free term and wildcard field searches, including error cases.

## 2025-04-22 - 5.12.0 - feat(doc/search)
Enhance search functionality with filter and validate options for advanced query control

- Added 'filter' option to merge additional MongoDB query constraints in search
- Introduced 'validate' hook to post-process and filter fetched documents
- Refactored underlying execQuery function to support additional search options
- Updated tests to cover new search scenarios and fallback mechanisms

## 2025-04-22 - 5.11.4 - fix(search)
Implement implicit AND logic for mixed simple term and field:value queries in search

- Added a new branch to detect and handle search queries that mix field:value pairs with plain terms without explicit operators
- Builds an implicit $and filter when query parts contain colon(s) but lack explicit boolean operators or quotes
- Ensures proper parsing and improved robustness of search filters

## 2025-04-22 - 5.11.3 - fix(lucene adapter and search tests)
Improve range query parsing in Lucene adapter and expand search test coverage

- Added a new 'testSearch' script in package.json to run search tests.
- Introduced advanced search tests for range queries and combined field filters in test/search.advanced.ts.
- Enhanced robustness tests in test/search.ts for wildcard and empty query scenarios.
- Fixed token validation in the parseRange method of the Lucene adapter to ensure proper error handling.

## 2025-04-21 - 5.11.2 - fix(readme)
Update readme to clarify usage of searchable fields retrieval
77 codex.md Normal file
@@ -0,0 +1,77 @@
# SmartData Project Overview

This document provides a high-level overview of the SmartData library (`@push.rocks/smartdata`), its architecture, core components, and key features, including recent enhancements to the search API.

## 1. Project Purpose
- A TypeScript-first wrapper around MongoDB that supplies:
  - Strongly-typed document & collection classes
  - Decorator-based schema definition (no external schema files)
  - Advanced search capabilities with Lucene-style queries
  - Built-in support for real-time data sync, distributed coordination, and key-value EasyStore

## 2. Core Concepts & Components
- **SmartDataDb**: Manages the MongoDB connection, pooling, and initialization of collections.
- **SmartDataDbDoc**: Base class for all document models; provides CRUD, upsert, and cursor APIs.
- **Decorators**:
  - `@Collection`: Associates a class with a MongoDB collection
  - `@svDb()`: Marks a field as persisted to the DB
  - `@unI()`: Marks a field as a unique index
  - `@index()`: Adds a regular index
  - `@searchable()`: Marks a field for inclusion in text searches or regex queries
- **SmartdataCollection**: Wraps a MongoDB collection; auto-creates indexes based on decorators.
- **Lucene Adapter**: Parses a Lucene query string into an AST and transforms it to a MongoDB filter object.
- **EasyStore**: A simple, schema-less key-value store built on top of MongoDB for sharing ephemeral data.
- **Distributed Coordinator**: Leader election and task-distribution API for building resilient, multi-instance systems.
- **Watcher**: Listens to change streams for real-time updates and integrates with RxJS.

## 3. Search API
SmartData provides a unified `.search(query[, opts])` method on all models with `@searchable()` fields:

- **Supported Syntax**:
  1. Exact field:value (e.g. `field:Value`)
  2. Quoted phrases (e.g. `"exact phrase"` or `'exact phrase'`)
  3. Wildcards: `*` (zero or more chars) and `?` (single char)
  4. Boolean operators: `AND`, `OR`, `NOT`
  5. Grouping: parentheses, e.g. `(A OR B) AND C`
  6. Range queries: `[num TO num]`, `{num TO num}`
  7. Multi-term unquoted: terms AND'd across all searchable fields
  8. Empty query returns all documents

- **Fallback Mechanisms**:
  1. Text-index-based `$text` search (if supported)
  2. Field-scoped and multi-field regex queries
  3. In-memory filtering for complex or unsupported cases

### New Security & Extensibility Hooks
The `.search(query, opts?)` signature now accepts a `SearchOptions<T>` object:

```ts
interface SearchOptions<T> {
  filter?: Record<string, any>;   // Additional MongoDB filter, AND-merged
  validate?: (doc: T) => boolean; // Post-fetch hook to drop results
}
```

- **filter**: Enforces mandatory constraints (e.g. multi-tenant isolation) directly in the Mongo query.
- **validate**: A function that runs after fetching; return `false` to exclude a document.
## 4. Testing Strategy
- Unit tests in `test/test.search.ts` cover basic search functionality and new options:
  - Exact, wildcard, phrase, boolean and grouping cases
  - Implicit AND and mixed free-term + field searches
  - Edge cases (non-searchable fields, quoted wildcards, no matches)
  - `filter` and `validate` tests ensure security hooks work as intended
- Advanced search scenarios are covered in `test/test.search.advanced.ts`.

## 5. Usage Example
```ts
// Basic search
const prods = await Product.search('wireless earbuds');

// Scoped search (only your organization's items)
const myItems = await Product.search('book', { filter: { ownerId } });

// Post-search validation (only cheap items)
const cheapItems = await Product.search('', { validate: p => p.price < 50 });
```

---
Last updated: 2025-04-22
27 package.json
@@ -1,13 +1,14 @@
 {
   "name": "@push.rocks/smartdata",
-  "version": "5.11.2",
+  "version": "5.16.2",
   "private": false,
   "description": "An advanced library for NoSQL data organization and manipulation using TypeScript with support for MongoDB, data validation, collections, and custom data types.",
   "main": "dist_ts/index.js",
   "typings": "dist_ts/index.d.ts",
   "type": "module",
   "scripts": {
-    "test": "tstest test/",
+    "test": "tstest test/ --verbose --logfile --timeout 120",
+    "testSearch": "tsx test/test.search.ts",
     "build": "tsbuild --web --allowimplicitany",
     "buildDocs": "tsdoc"
   },
@@ -22,26 +23,26 @@
   },
   "homepage": "https://code.foss.global/push.rocks/smartdata#readme",
   "dependencies": {
-    "@push.rocks/lik": "^6.0.14",
+    "@push.rocks/lik": "^6.2.2",
     "@push.rocks/smartdelay": "^3.0.1",
-    "@push.rocks/smartlog": "^3.0.2",
+    "@push.rocks/smartlog": "^3.1.8",
-    "@push.rocks/smartmongo": "^2.0.11",
+    "@push.rocks/smartmongo": "^2.0.12",
     "@push.rocks/smartpromise": "^4.0.2",
-    "@push.rocks/smartrx": "^3.0.7",
+    "@push.rocks/smartrx": "^3.0.10",
     "@push.rocks/smartstring": "^4.0.15",
     "@push.rocks/smarttime": "^4.0.6",
     "@push.rocks/smartunique": "^3.0.8",
     "@push.rocks/taskbuffer": "^3.1.7",
-    "@tsclass/tsclass": "^8.2.0",
+    "@tsclass/tsclass": "^9.2.0",
-    "mongodb": "^6.15.0"
+    "mongodb": "^6.18.0"
   },
   "devDependencies": {
-    "@git.zone/tsbuild": "^2.3.2",
+    "@git.zone/tsbuild": "^2.6.7",
     "@git.zone/tsrun": "^1.2.44",
-    "@git.zone/tstest": "^1.0.77",
+    "@git.zone/tstest": "^2.3.5",
-    "@push.rocks/qenv": "^6.0.5",
+    "@push.rocks/qenv": "^6.1.3",
-    "@push.rocks/tapbundle": "^5.6.2",
+    "@push.rocks/tapbundle": "^6.0.3",
-    "@types/node": "^22.14.0"
+    "@types/node": "^22.15.2"
   },
   "files": [
     "ts/**/*",
|
4665
pnpm-lock.yaml
generated
4665
pnpm-lock.yaml
generated
File diff suppressed because it is too large
Load Diff
97
test/test.cursor.ts
Normal file
97
test/test.cursor.ts
Normal file
@@ -0,0 +1,97 @@
|
|||||||
|
import { tap, expect } from '@push.rocks/tapbundle';
|
||||||
|
import * as smartmongo from '@push.rocks/smartmongo';
|
||||||
|
import { smartunique } from '../ts/plugins.js';
|
||||||
|
import * as smartdata from '../ts/index.js';
|
||||||
|
|
||||||
|
// Set up database connection
|
||||||
|
let smartmongoInstance: smartmongo.SmartMongo;
|
let testDb: smartdata.SmartdataDb;

// Define a simple document model for cursor tests
@smartdata.Collection(() => testDb)
class CursorTest extends smartdata.SmartDataDbDoc<CursorTest, CursorTest> {
  @smartdata.unI()
  public id: string = smartunique.shortId();

  @smartdata.svDb()
  public name: string;

  @smartdata.svDb()
  public order: number;

  constructor(name: string, order: number) {
    super();
    this.name = name;
    this.order = order;
  }
}

// Initialize the in-memory MongoDB and SmartdataDb
tap.test('cursor init: start Mongo and SmartdataDb', async () => {
  smartmongoInstance = await smartmongo.SmartMongo.createAndStart();
  testDb = new smartdata.SmartdataDb(
    await smartmongoInstance.getMongoDescriptor(),
  );
  await testDb.init();
});

// Insert sample documents
tap.test('cursor insert: save 5 test documents', async () => {
  for (let i = 1; i <= 5; i++) {
    const doc = new CursorTest(`item${i}`, i);
    await doc.save();
  }
  const count = await CursorTest.getCount({});
  expect(count).toEqual(5);
});

// Test that toArray returns all documents
tap.test('cursor toArray: retrieves all documents', async () => {
  const cursor = await CursorTest.getCursor({});
  const all = await cursor.toArray();
  expect(all.length).toEqual(5);
});

// Test iteration via forEach
tap.test('cursor forEach: iterates through all documents', async () => {
  const names: string[] = [];
  const cursor = await CursorTest.getCursor({});
  await cursor.forEach(async (item) => {
    names.push(item.name);
  });
  expect(names.length).toEqual(5);
  expect(names).toContain('item3');
});

// Test native cursor modifiers: limit
tap.test('cursor modifier limit: only two documents', async () => {
  const cursor = await CursorTest.getCursor({}, { modifier: (c) => c.limit(2) });
  const limited = await cursor.toArray();
  expect(limited.length).toEqual(2);
});

// Test native cursor modifiers: sort and skip
tap.test('cursor modifier sort & skip: returns correct order', async () => {
  const cursor = await CursorTest.getCursor({}, {
    modifier: (c) => c.sort({ order: -1 }).skip(1),
  });
  const results = await cursor.toArray();
  // Skipped the first document (order 5); the rest should be 4, 3, 2, 1
  expect(results.length).toEqual(4);
  expect(results[0].order).toEqual(4);
});

// Cleanup: drop database, close connections, stop Mongo
tap.test('cursor cleanup: drop DB and stop', async () => {
  await testDb.mongoDb.dropDatabase();
  await testDb.close();
  if (smartmongoInstance) {
    await smartmongoInstance.stopAndDumpToDir(
      `.nogit/dbdump/test.cursor.ts`,
    );
  }
  // Ensure the process exits after cleanup
  setTimeout(() => process.exit(), 2000);
});

export default tap.start();
604
test/test.filters.ts
Normal file
@@ -0,0 +1,604 @@
import { tap, expect } from '@git.zone/tstest/tapbundle';
import * as smartmongo from '@push.rocks/smartmongo';
import * as smartunique from '@push.rocks/smartunique';
import * as smartdata from '../ts/index.js';

const { SmartdataDb, Collection, svDb, unI, index } = smartdata;

let smartmongoInstance: smartmongo.SmartMongo;
let testDb: smartdata.SmartdataDb;

// Define test document classes
@Collection(() => testDb)
class TestUser extends smartdata.SmartDataDbDoc<TestUser, TestUser> {
  @unI()
  public id: string = smartunique.shortId();

  @svDb()
  public name: string;

  @svDb()
  public age: number;

  @svDb()
  public email: string;

  @svDb()
  public roles: string[];

  @svDb()
  public tags: string[];

  @svDb()
  public status: 'active' | 'inactive' | 'pending';

  @svDb()
  public metadata: {
    lastLogin?: Date;
    loginCount?: number;
    preferences?: Record<string, any>;
  };

  @svDb()
  public scores: number[];

  constructor(data: Partial<TestUser> = {}) {
    super();
    Object.assign(this, data);
  }
}
@Collection(() => testDb)
class TestOrder extends smartdata.SmartDataDbDoc<TestOrder, TestOrder> {
  @unI()
  public id: string = smartunique.shortId();

  @svDb()
  public userId: string;

  @svDb()
  public items: Array<{
    product: string;
    quantity: number;
    price: number;
  }>;

  @svDb()
  public totalAmount: number;

  @svDb()
  public status: string;

  @svDb()
  public tags: string[];

  constructor(data: Partial<TestOrder> = {}) {
    super();
    Object.assign(this, data);
  }
}
// Setup and teardown
tap.test('should create a test database instance', async () => {
  smartmongoInstance = await smartmongo.SmartMongo.createAndStart();
  testDb = new smartdata.SmartdataDb(await smartmongoInstance.getMongoDescriptor());
  await testDb.init();
  expect(testDb).toBeInstanceOf(SmartdataDb);
});

tap.test('should create test data', async () => {
  // Create test users
  const users = [
    new TestUser({
      name: 'John Doe',
      age: 30,
      email: 'john@example.com',
      roles: ['admin', 'user'],
      tags: ['javascript', 'nodejs', 'mongodb'],
      status: 'active',
      metadata: { loginCount: 5, lastLogin: new Date() },
      scores: [85, 90, 78]
    }),
    new TestUser({
      name: 'Jane Smith',
      age: 25,
      email: 'jane@example.com',
      roles: ['user'],
      tags: ['python', 'mongodb'],
      status: 'active',
      metadata: { loginCount: 3 },
      scores: [92, 88, 95]
    }),
    new TestUser({
      name: 'Bob Johnson',
      age: 35,
      email: 'bob@example.com',
      roles: ['moderator', 'user'],
      tags: ['javascript', 'react', 'nodejs'],
      status: 'inactive',
      metadata: { loginCount: 0 },
      scores: [70, 75, 80]
    }),
    new TestUser({
      name: 'Alice Brown',
      age: 28,
      email: 'alice@example.com',
      roles: ['admin'],
      tags: ['typescript', 'angular', 'mongodb'],
      status: 'active',
      metadata: { loginCount: 10 },
      scores: [95, 98, 100]
    }),
    new TestUser({
      name: 'Charlie Wilson',
      age: 22,
      email: 'charlie@example.com',
      roles: ['user'],
      tags: ['golang', 'kubernetes'],
      status: 'pending',
      metadata: { loginCount: 1 },
      scores: [60, 65]
    })
  ];

  for (const user of users) {
    await user.save();
  }

  // Create test orders
  const orders = [
    new TestOrder({
      userId: users[0].id,
      items: [
        { product: 'laptop', quantity: 1, price: 1200 },
        { product: 'mouse', quantity: 2, price: 25 }
      ],
      totalAmount: 1250,
      status: 'completed',
      tags: ['electronics', 'priority']
    }),
    new TestOrder({
      userId: users[1].id,
      items: [
        { product: 'book', quantity: 3, price: 15 },
        { product: 'pen', quantity: 5, price: 2 }
      ],
      totalAmount: 55,
      status: 'pending',
      tags: ['stationery']
    }),
    new TestOrder({
      userId: users[0].id,
      items: [
        { product: 'laptop', quantity: 2, price: 1200 },
        { product: 'keyboard', quantity: 2, price: 80 }
      ],
      totalAmount: 2560,
      status: 'processing',
      tags: ['electronics', 'bulk']
    })
  ];

  for (const order of orders) {
    await order.save();
  }

  const savedUsers = await TestUser.getInstances({});
  const savedOrders = await TestOrder.getInstances({});
  expect(savedUsers.length).toEqual(5);
  expect(savedOrders.length).toEqual(3);
});
// ============= BASIC FILTER TESTS =============
tap.test('should filter by simple equality', async () => {
  const users = await TestUser.getInstances({ name: 'John Doe' });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

tap.test('should filter by multiple fields (implicit AND)', async () => {
  const users = await TestUser.getInstances({
    status: 'active',
    age: 30
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

tap.test('should filter by nested object fields', async () => {
  const users = await TestUser.getInstances({
    'metadata.loginCount': 5
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

// ============= COMPARISON OPERATOR TESTS =============
tap.test('should filter using $gt operator', async () => {
  const users = await TestUser.getInstances({
    age: { $gt: 30 }
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('Bob Johnson');
});

tap.test('should filter using $gte operator', async () => {
  const users = await TestUser.getInstances({
    age: { $gte: 30 }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Bob Johnson', 'John Doe']);
});

tap.test('should filter using $lt operator', async () => {
  const users = await TestUser.getInstances({
    age: { $lt: 25 }
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('Charlie Wilson');
});

tap.test('should filter using $lte operator', async () => {
  const users = await TestUser.getInstances({
    age: { $lte: 25 }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Charlie Wilson', 'Jane Smith']);
});

tap.test('should filter using $ne operator', async () => {
  const users = await TestUser.getInstances({
    status: { $ne: 'active' }
  });
  expect(users.length).toEqual(2);
  const statuses = users.map(u => u.status).sort();
  expect(statuses).toEqual(['inactive', 'pending']);
});

tap.test('should filter using multiple comparison operators', async () => {
  const users = await TestUser.getInstances({
    age: { $gte: 25, $lt: 30 }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'Jane Smith']);
});
// ============= ARRAY OPERATOR TESTS =============
tap.test('should filter using $in operator', async () => {
  const users = await TestUser.getInstances({
    status: { $in: ['active', 'pending'] }
  });
  expect(users.length).toEqual(4);
  expect(users.every(u => ['active', 'pending'].includes(u.status))).toEqual(true);
});

tap.test('should filter arrays using $in operator', async () => {
  const users = await TestUser.getInstances({
    roles: { $in: ['admin'] }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'John Doe']);
});

tap.test('should filter using $nin operator', async () => {
  const users = await TestUser.getInstances({
    status: { $nin: ['inactive', 'pending'] }
  });
  expect(users.length).toEqual(3);
  expect(users.every(u => u.status === 'active')).toEqual(true);
});

tap.test('should filter arrays using $all operator', async () => {
  const users = await TestUser.getInstances({
    tags: { $all: ['javascript', 'nodejs'] }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Bob Johnson', 'John Doe']);
});

tap.test('should filter arrays using $size operator', async () => {
  const users = await TestUser.getInstances({
    scores: { $size: 2 }
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('Charlie Wilson');
});

tap.test('should filter arrays using $elemMatch operator', async () => {
  const orders = await TestOrder.getInstances({
    items: {
      $elemMatch: {
        product: 'laptop',
        quantity: { $gte: 2 }
      }
    }
  });
  expect(orders.length).toEqual(1);
  expect(orders[0].totalAmount).toEqual(2560);
});

tap.test('should filter using $elemMatch with single condition', async () => {
  const orders = await TestOrder.getInstances({
    items: {
      $elemMatch: {
        price: { $gt: 100 }
      }
    }
  });
  expect(orders.length).toEqual(2);
  expect(orders.every(o => o.items.some(i => i.price > 100))).toEqual(true);
});
// ============= LOGICAL OPERATOR TESTS =============
tap.test('should filter using $or operator', async () => {
  const users = await TestUser.getInstances({
    $or: [
      { age: { $lt: 25 } },
      { status: 'inactive' }
    ]
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Bob Johnson', 'Charlie Wilson']);
});

tap.test('should filter using $and operator', async () => {
  const users = await TestUser.getInstances({
    $and: [
      { status: 'active' },
      { age: { $gte: 28 } }
    ]
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'John Doe']);
});

tap.test('should filter using $nor operator', async () => {
  const users = await TestUser.getInstances({
    $nor: [
      { status: 'inactive' },
      { age: { $lt: 25 } }
    ]
  });
  expect(users.length).toEqual(3);
  expect(users.every(u => u.status !== 'inactive' && u.age >= 25)).toEqual(true);
});

tap.test('should filter using nested logical operators', async () => {
  const users = await TestUser.getInstances({
    $or: [
      {
        $and: [
          { status: 'active' },
          { roles: { $in: ['admin'] } }
        ]
      },
      { age: { $lt: 23 } }
    ]
  });
  expect(users.length).toEqual(3);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'Charlie Wilson', 'John Doe']);
});
// ============= ELEMENT OPERATOR TESTS =============
tap.test('should filter using $exists operator', async () => {
  const users = await TestUser.getInstances({
    'metadata.lastLogin': { $exists: true }
  });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('John Doe');
});

tap.test('should filter using $exists false', async () => {
  const users = await TestUser.getInstances({
    'metadata.preferences': { $exists: false }
  });
  expect(users.length).toEqual(5);
});
// ============= COMPLEX FILTER TESTS =============
tap.test('should handle complex nested filters', async () => {
  const users = await TestUser.getInstances({
    $and: [
      { status: 'active' },
      {
        $or: [
          { age: { $gte: 30 } },
          { roles: { $all: ['admin'] } }
        ]
      },
      { tags: { $in: ['mongodb'] } }
    ]
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Alice Brown', 'John Doe']);
});

tap.test('should combine multiple operator types', async () => {
  const orders = await TestOrder.getInstances({
    $and: [
      { totalAmount: { $gte: 100 } },
      { status: { $in: ['completed', 'processing'] } },
      { tags: { $in: ['electronics'] } }
    ]
  });
  expect(orders.length).toEqual(2);
  expect(orders.every(o => o.totalAmount >= 100)).toEqual(true);
});
// ============= ERROR HANDLING TESTS =============
tap.test('should throw error for $where operator', async () => {
  let error: Error | null = null;
  try {
    await TestUser.getInstances({
      $where: 'this.age > 25'
    });
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/\$where.*not allowed/);
});

tap.test('should throw error for invalid $in value', async () => {
  let error: Error | null = null;
  try {
    await TestUser.getInstances({
      status: { $in: 'active' as any } // should be an array
    });
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/\$in.*requires.*array/);
});

tap.test('should throw error for invalid $size value', async () => {
  let error: Error | null = null;
  try {
    await TestUser.getInstances({
      scores: { $size: '3' as any } // should be a number
    });
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/\$size.*requires.*numeric/);
});

tap.test('should throw error for dots in field names', async () => {
  let error: Error | null = null;
  try {
    await TestUser.getInstances({
      'some.nested.field': { 'invalid.key': 'value' }
    });
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/keys cannot contain dots/);
});
// ============= EDGE CASE TESTS =============
tap.test('should handle empty filter (return all)', async () => {
  const users = await TestUser.getInstances({});
  expect(users.length).toEqual(5);
});

tap.test('should handle null values in filter', async () => {
  // First, create a user with a null email
  const nullUser = new TestUser({
    name: 'Null User',
    age: 40,
    email: null as any,
    roles: ['user'],
    tags: [],
    status: 'active',
    metadata: {},
    scores: []
  });
  await nullUser.save();

  const users = await TestUser.getInstances({ email: null });
  expect(users.length).toEqual(1);
  expect(users[0].name).toEqual('Null User');

  // Clean up
  await nullUser.delete();
});

tap.test('should handle arrays as direct equality match', async () => {
  // This tests that arrays without operators are treated as equality matches
  const users = await TestUser.getInstances({
    roles: ['user'] // exact match for the whole array
  });
  expect(users.length).toEqual(2); // both Jane and Charlie have exactly ['user']
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Charlie Wilson', 'Jane Smith']);
});

tap.test('should handle regex operator', async () => {
  const users = await TestUser.getInstances({
    name: { $regex: '^J', $options: 'i' }
  });
  expect(users.length).toEqual(2);
  const names = users.map(u => u.name).sort();
  expect(names).toEqual(['Jane Smith', 'John Doe']);
});

tap.test('should handle unknown operators by letting MongoDB reject them', async () => {
  // Unknown operators are passed through to MongoDB, which rejects them
  let error: Error | null = null;

  try {
    await TestUser.getInstances({
      age: { $unknownOp: 30 } as any
    });
  } catch (e) {
    error = e as Error;
  }

  expect(error).toBeTruthy();
  expect(error?.message).toMatch(/unknown operator.*\$unknownOp/);
});
// ============= PERFORMANCE TESTS =============
tap.test('should efficiently filter large result sets', async () => {
  // Create many test documents
  const manyUsers = [];
  for (let i = 0; i < 100; i++) {
    manyUsers.push(new TestUser({
      name: `User ${i}`,
      age: 20 + (i % 40),
      email: `user${i}@example.com`,
      roles: i % 3 === 0 ? ['admin'] : ['user'],
      tags: i % 2 === 0 ? ['even', 'test'] : ['odd', 'test'],
      status: i % 4 === 0 ? 'inactive' : 'active',
      metadata: { loginCount: i },
      scores: [i, i + 10, i + 20]
    }));
  }

  // Persist the documents
  for (const user of manyUsers) {
    await user.save();
  }

  // A complex filter that should still be fast
  const startTime = Date.now();
  const filtered = await TestUser.getInstances({
    $and: [
      { age: { $gte: 30, $lt: 40 } },
      { status: 'active' },
      { tags: { $in: ['even'] } },
      { 'metadata.loginCount': { $gte: 20 } }
    ]
  });
  const duration = Date.now() - startTime;

  console.log(`Complex filter on 100+ documents took ${duration}ms`);
  expect(duration).toBeLessThan(1000); // should complete in under 1 second
  expect(filtered.length).toBeGreaterThan(0);

  // Clean up
  for (const user of manyUsers) {
    await user.delete();
  }
});

// ============= CLEANUP =============
tap.test('should clean up test database', async () => {
  await testDb.mongoDb.dropDatabase();
  await testDb.close();
  await smartmongoInstance.stop();
});

export default tap.start();
@@ -172,6 +172,21 @@ tap.test('grouping: (Furniture OR Electronics) AND Chair', async () => {
  expect(names).toEqual(['Gaming Chair', 'Office Chair']);
});

// Additional range and combined query tests
tap.test('range query price:[30 TO 300] returns expected products', async () => {
  const res = await Product.search('price:[30 TO 300]');
  // Expect products priced between 30 and 300 inclusive: Day Light Lamp, Gaming Chair, Office Chair, AirPods
  expect(res.length).toEqual(4);
  const names = res.map((r) => r.name).sort();
  expect(names).toEqual(['AirPods', 'Day Light Lamp', 'Gaming Chair', 'Office Chair']);
});

tap.test('should filter category and price range', async () => {
  const res = await Product.search('category:Lighting AND price:[30 TO 40]');
  expect(res.length).toEqual(1);
  expect(res[0].name).toEqual('Day Light Lamp');
});

// Teardown
tap.test('cleanup advanced search database', async () => {
  await testDb.mongoDb.dropDatabase();
@@ -9,6 +9,8 @@ import { searchable } from '../ts/classes.doc.js';
// Set up database connection
let smartmongoInstance: smartmongo.SmartMongo;
let testDb: smartdata.SmartdataDb;
// Class for location-based wildcard/phrase tests
let LocationDoc: any;

// Define a test class with searchable fields using the standard SmartDataDbDoc
@smartdata.Collection(() => testDb)
@@ -257,6 +259,143 @@ tap.test('should search hyphenated terms "E-reader"', async () => {
|
|||||||
expect(ereaderResults[0].name).toEqual('Kindle Paperwhite');
|
expect(ereaderResults[0].name).toEqual('Kindle Paperwhite');
|
||||||
});
|
});
|
||||||
|
|
||||||
|
// Additional robustness tests
|
||||||
|
tap.test('should return all products for empty search', async () => {
|
||||||
|
const searchResults = await Product.search('');
|
||||||
|
const allProducts = await Product.getInstances({});
|
||||||
|
expect(searchResults.length).toEqual(allProducts.length);
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('should support wildcard plain term across all fields', async () => {
|
||||||
|
const results = await Product.search('*book*');
|
||||||
|
const names = results.map((r) => r.name).sort();
|
||||||
|
expect(names).toEqual(['Harry Potter', 'Kindle Paperwhite', 'MacBook Pro']);
|
||||||
|
});
|
||||||
|
|
||||||
|
tap.test('should support wildcard plain term with question mark pattern', async () => {
|
||||||
|
const results = await Product.search('?one?');
|
||||||
|
const names = results.map((r) => r.name).sort();
|
||||||
|
expect(names).toEqual(['Galaxy S21', 'iPhone 12']);
|
||||||
|
});
|
||||||
|
|
||||||
|
// Filter and Validation tests
|
||||||
|
tap.test('should apply filter option to restrict results', async () => {
|
||||||
|
// search term 'book' across all fields but restrict to Books category
|
||||||
|
const bookFiltered = await Product.search('book', { filter: { category: 'Books' } });
|
||||||
|
expect(bookFiltered.length).toEqual(2);
|
||||||
|
bookFiltered.forEach((p) => expect(p.category).toEqual('Books'));
|
||||||
|
});
|
||||||
|
tap.test('should apply validate hook to post-filter results', async () => {
|
||||||
|
// return only products with price > 500
|
||||||
|
const expensive = await Product.search('', { validate: (p) => p.price > 500 });
|
||||||
|
expect(expensive.length).toBeGreaterThan(0);
|
||||||
|
expensive.forEach((p) => expect(p.price).toBeGreaterThan(500));
|
||||||
|
});
|
||||||
|
|
||||||
|
// Tests for quoted and wildcard field-specific phrases
|
||||||
|
tap.test('setup location test products', async () => {
|
||||||
|
@smartdata.Collection(() => testDb)
|
||||||
|
class LD extends smartdata.SmartDataDbDoc<LD, LD> {
|
||||||
|
@smartdata.unI() public id: string = smartunique.shortId();
|
||||||
|
@smartdata.svDb() @searchable() public location: string;
|
||||||
|
constructor(loc: string) { super(); this.location = loc; }
|
||||||
|
}
|
||||||
|
// Assign to outer variable for subsequent tests
|
||||||
|
LocationDoc = LD;
|
||||||
|
const locations = ['Berlin', 'Frankfurt am Main', 'Frankfurt am Oder', 'London'];
|
||||||
|
for (const loc of locations) {
|
||||||
|
await new LocationDoc(loc).save();
|
||||||
|
}
|
||||||
|
});
|
||||||
|
tap.test('should search exact quoted field phrase', async () => {
|
||||||
|
const results = await (LocationDoc as any).search('location:"Frankfurt am Main"');
|
||||||
|
expect(results.length).toEqual(1);
|
||||||
|
expect(results[0].location).toEqual('Frankfurt am Main');
|
||||||
|
});
|
||||||
|
tap.test('should search wildcard quoted field phrase', async () => {
|
||||||
|
const results = await (LocationDoc as any).search('location:"Frankfurt am *"');
|
||||||
|
const names = results.map((d: any) => d.location).sort();
|
||||||
|
expect(names).toEqual(['Frankfurt am Main', 'Frankfurt am Oder']);
|
||||||
|
});
|
||||||
|
tap.test('should search unquoted wildcard field', async () => {
|
||||||
|
const results = await (LocationDoc as any).search('location:Frankfurt*');
|
||||||
|
const names = results.map((d: any) => d.location).sort();
|
||||||
|
expect(names).toEqual(['Frankfurt am Main', 'Frankfurt am Oder']);
|
||||||
|
});
|
||||||
|
|
||||||
|
// Combined free-term + field phrase/wildcard tests
|
||||||
|
let CombinedDoc: any;
|
||||||
|
tap.test('setup combined docs for free-term and location tests', async () => {
|
||||||
|
@smartdata.Collection(() => testDb)
|
||||||
|
class CD extends smartdata.SmartDataDbDoc<CD, CD> {
|
||||||
|
@smartdata.unI() public id: string = smartunique.shortId();
|
||||||
|
@smartdata.svDb() @searchable() public name: string;
|
||||||
|
@smartdata.svDb() @searchable() public location: string;
|
||||||
|
constructor(name: string, location: string) { super(); this.name = name; this.location = location; }
|
||||||
|
}
|
||||||
|
CombinedDoc = CD;
|
||||||
|
const docs = [
|
||||||
|
new CombinedDoc('TypeScript', 'Berlin'),
|
||||||
|
new CombinedDoc('TypeScript', 'Frankfurt am Main'),
|
||||||
|
new CombinedDoc('TypeScript', 'Frankfurt am Oder'),
|
||||||
|
new CombinedDoc('JavaScript', 'Berlin'),
|
||||||
|
];
|
||||||
|
  for (const d of docs) await d.save();
});

tap.test('should search free term and exact quoted field phrase', async () => {
  const res = await CombinedDoc.search('TypeScript location:"Berlin"');
  expect(res.length).toEqual(1);
  expect(res[0].location).toEqual('Berlin');
});

tap.test('should not match free term with non-matching quoted field phrase', async () => {
  const res = await CombinedDoc.search('TypeScript location:"Frankfurt d"');
  expect(res.length).toEqual(0);
});

tap.test('should search free term with quoted wildcard field phrase', async () => {
  const res = await CombinedDoc.search('TypeScript location:"Frankfurt am *"');
  const locs = res.map((r: any) => r.location).sort();
  expect(locs).toEqual(['Frankfurt am Main', 'Frankfurt am Oder']);
});

// Quoted exact field phrase without wildcard should return no matches if no exact match
tap.test('should not match location:"Frankfurt d"', async () => {
  const results = await (LocationDoc as any).search('location:"Frankfurt d"');
  expect(results.length).toEqual(0);
});

// Combined free-term and field wildcard tests
tap.test('should combine free term and wildcard field search', async () => {
  const results = await Product.search('book category:Book*');
  expect(results.length).toEqual(2);
  results.forEach((p) => expect(p.category).toEqual('Books'));
});

tap.test('should not match when free term matches but wildcard field does not', async () => {
  const results = await Product.search('book category:Kitchen*');
  expect(results.length).toEqual(0);
});

// Non-searchable field should cause an error for combined queries
tap.test('should throw when combining term with non-searchable field', async () => {
  let error: Error;
  try {
    await Product.search('book location:Berlin');
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error.message).toMatch(/not searchable/);
});

tap.test('should throw when combining term with non-searchable wildcard field', async () => {
  let error: Error;
  try {
    await Product.search('book location:Berlin*');
  } catch (e) {
    error = e as Error;
  }
  expect(error).toBeTruthy();
  expect(error.message).toMatch(/not searchable/);
});
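The queries exercised above mix free terms with `field:value` filters, quoted phrases, and trailing `*` wildcards. A minimal sketch of how such a query string can be tokenized — illustrative only; the names and regex are assumptions, not smartdata's actual Lucene adapter:

```typescript
// Illustrative sketch: split a query such as 'TypeScript location:"Berlin"'
// into free terms and field filters. Not the library's real parser.
interface ParsedQuery {
  freeTerms: string[];
  fieldFilters: { field: string; value: string; isPhrase: boolean }[];
}

function parseSearchQuery(query: string): ParsedQuery {
  const result: ParsedQuery = { freeTerms: [], fieldFilters: [] };
  // A token is field:"quoted phrase", field:value, or a bare term.
  const tokenRe = /(\w+):"([^"]*)"|(\w+):(\S+)|(\S+)/g;
  let m: RegExpExecArray | null;
  while ((m = tokenRe.exec(query)) !== null) {
    if (m[1] !== undefined) {
      // Quoted field phrase, e.g. location:"Frankfurt am *"
      result.fieldFilters.push({ field: m[1], value: m[2], isPhrase: true });
    } else if (m[3] !== undefined) {
      // Unquoted field value, e.g. category:Book*
      result.fieldFilters.push({ field: m[3], value: m[4], isPhrase: false });
    } else {
      result.freeTerms.push(m[5]);
    }
  }
  return result;
}
```

A value ending in `*` would then be turned into a prefix match, while a quoted phrase without a wildcard requires an exact match — which is why `location:"Frankfurt d"` above matches nothing.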

// Close database connection
tap.test('close database connection', async () => {
  await testDb.mongoDb.dropDatabase();
  await testDb.close();
@@ -60,6 +60,43 @@ tap.test('should watch a collection', async (toolsArg) => {
   await done.promise;
 });
 
+// ======= New tests for EventEmitter and buffering support =======
+tap.test('should emit change via EventEmitter', async (tools) => {
+  const done = tools.defer();
+  const watcher = await House.watch({});
+  watcher.on('change', async (houseArg) => {
+    // Expect a House instance
+    expect(houseArg).toBeDefined();
+    // Clean up
+    await watcher.stop();
+    done.resolve();
+  });
+  // Trigger an insert to generate a change event
+  const h = new House();
+  await h.save();
+  await done.promise;
+});
+
+tap.test('should buffer change events when bufferTimeMs is set', async (tools) => {
+  const done = tools.defer();
+  // bufferTimeMs collects events into arrays every 50ms
+  const watcher = await House.watch({}, { bufferTimeMs: 50 });
+  let received: House[];
+  watcher.changeSubject.subscribe(async (batch: House[]) => {
+    if (batch && batch.length > 0) {
+      received = batch;
+      await watcher.stop();
+      done.resolve();
+    }
+  });
+  // Rapidly insert multiple docs
+  const docs = [new House(), new House(), new House()];
+  for (const doc of docs) await doc.save();
+  await done.promise;
+  // All inserts should be in one buffered batch
+  expect(received.length).toEqual(docs.length);
+});
+
 // =======================================
 // close the database connection
 // =======================================
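The `bufferTimeMs` behavior tested above — several rapid change events arriving as one batch — can be sketched with a small standalone buffer. This is an assumed simplification, not the watcher's actual RxJS-based implementation:

```typescript
// Minimal sketch of bufferTimeMs-style batching: events accumulate, and the
// first event in a window starts a timer that flushes the whole batch.
class EventBuffer<T> {
  private buf: T[] = [];
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private bufferTimeMs: number,
    private onFlush: (batch: T[]) => void,
  ) {}

  push(event: T) {
    this.buf.push(event);
    if (this.timer === null) {
      // Start the flush window on the first buffered event
      this.timer = setTimeout(() => this.flush(), this.bufferTimeMs);
    }
  }

  flush() {
    if (this.timer !== null) clearTimeout(this.timer);
    this.timer = null;
    const batch = this.buf;
    this.buf = [];
    if (batch.length > 0) this.onFlush(batch);
  }
}
```

With `bufferTimeMs: 50`, three inserts performed back-to-back land in the same window, which is why the test expects a single batch of length three.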
@@ -3,6 +3,6 @@
 */
 export const commitinfo = {
   name: '@push.rocks/smartdata',
-  version: '5.11.2',
+  version: '5.16.2',
   description: 'An advanced library for NoSQL data organization and manipulation using TypeScript with support for MongoDB, data validation, collections, and custom data types.'
 }
@@ -4,6 +4,7 @@ import { SmartdataDbCursor } from './classes.cursor.js';
 import { SmartDataDbDoc, type IIndexOptions } from './classes.doc.js';
 import { SmartdataDbWatcher } from './classes.watcher.js';
 import { CollectionFactory } from './classes.collectionfactory.js';
+import { logger } from './logging.js';
 
 export interface IFindOptions {
   limit?: number;
@@ -161,7 +162,7 @@ export class SmartdataCollection<T> {
     });
     if (!wantedCollection) {
       await this.smartdataDb.mongoDb.createCollection(this.collectionName);
-      console.log(`Successfully initiated Collection ${this.collectionName}`);
+      logger.log('info', `Successfully initiated Collection ${this.collectionName}`);
     }
     this.mongoDbCollection = this.smartdataDb.mongoDb.collection(this.collectionName);
     // Auto-create a compound text index on all searchable fields
@@ -182,10 +183,10 @@ export class SmartdataCollection<T> {
   /**
    * mark unique index
    */
-  public markUniqueIndexes(keyArrayArg: string[] = []) {
+  public async markUniqueIndexes(keyArrayArg: string[] = []) {
     for (const key of keyArrayArg) {
       if (!this.uniqueIndexes.includes(key)) {
-        this.mongoDbCollection.createIndex(key, {
+        await this.mongoDbCollection.createIndex({ [key]: 1 }, {
           unique: true,
         });
         // make sure we only call this once and not for every doc we create
@@ -197,12 +198,12 @@ export class SmartdataCollection<T> {
   /**
    * creates regular indexes for the collection
    */
-  public createRegularIndexes(indexesArg: Array<{field: string, options: IIndexOptions}> = []) {
+  public async createRegularIndexes(indexesArg: Array<{field: string, options: IIndexOptions}> = []) {
     for (const indexDef of indexesArg) {
       // Check if we've already created this index
       const indexKey = indexDef.field;
       if (!this.regularIndexes.some(i => i.field === indexKey)) {
-        this.mongoDbCollection.createIndex(
+        await this.mongoDbCollection.createIndex(
           { [indexDef.field]: 1 }, // Simple single-field index
           indexDef.options
         );
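The `uniqueIndexes` / `regularIndexes` bookkeeping above exists so that `createIndex` is issued once per field rather than on every document save. The guard can be sketched in isolation (assumed names, not the library's code):

```typescript
// Sketch of an idempotent index-creation guard: the async creator runs at
// most once per field, however often ensureIndex is called.
class IndexGuard {
  private created = new Set<string>();
  public calls = 0; // counts actual createIndex invocations

  async ensureIndex(field: string, create: (field: string) => Promise<void>) {
    if (this.created.has(field)) return; // already requested for this field
    this.created.add(field);
    this.calls++;
    await create(field);
  }
}
```

Because the set is updated before the first `await`, concurrent synchronous calls for the same field still trigger only one `createIndex`.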
@@ -222,53 +223,74 @@ export class SmartdataCollection<T> {
   /**
    * finds an object in the DbCollection
    */
-  public async findOne(filterObject: any): Promise<any> {
+  public async findOne(
+    filterObject: any,
+    opts?: { session?: plugins.mongodb.ClientSession }
+  ): Promise<any> {
     await this.init();
-    const cursor = this.mongoDbCollection.find(filterObject);
-    const result = await cursor.next();
-    cursor.close();
-    return result;
+    // Use MongoDB driver's findOne with optional session
+    return this.mongoDbCollection.findOne(filterObject, { session: opts?.session });
   }
 
   public async getCursor(
     filterObjectArg: any,
     dbDocArg: typeof SmartDataDbDoc,
+    opts?: { session?: plugins.mongodb.ClientSession }
   ): Promise<SmartdataDbCursor<any>> {
     await this.init();
-    const cursor = this.mongoDbCollection.find(filterObjectArg);
+    const cursor = this.mongoDbCollection.find(filterObjectArg, { session: opts?.session });
     return new SmartdataDbCursor(cursor, dbDocArg);
   }
 
   /**
    * finds an object in the DbCollection
    */
-  public async findAll(filterObject: any): Promise<any[]> {
+  public async findAll(
+    filterObject: any,
+    opts?: { session?: plugins.mongodb.ClientSession }
+  ): Promise<any[]> {
     await this.init();
-    const cursor = this.mongoDbCollection.find(filterObject);
+    const cursor = this.mongoDbCollection.find(filterObject, { session: opts?.session });
     const result = await cursor.toArray();
     cursor.close();
     return result;
   }
 
   /**
-   * watches the collection while applying a filter
+   * Watches the collection, returning a SmartdataDbWatcher with RxJS and EventEmitter support.
+   * @param filterObject match filter for change stream
+   * @param opts optional MongoDB ChangeStreamOptions & { bufferTimeMs } to buffer events
+   * @param smartdataDbDocArg document class for instance creation
    */
   public async watch(
     filterObject: any,
-    smartdataDbDocArg: typeof SmartDataDbDoc,
+    opts: (plugins.mongodb.ChangeStreamOptions & { bufferTimeMs?: number }) = {},
+    smartdataDbDocArg?: typeof SmartDataDbDoc,
   ): Promise<SmartdataDbWatcher> {
     await this.init();
+    // Extract bufferTimeMs from options
+    const { bufferTimeMs, fullDocument, ...otherOptions } = opts || {};
+    // Determine fullDocument behavior: default to 'updateLookup'
+    const changeStreamOptions: plugins.mongodb.ChangeStreamOptions = {
+      ...otherOptions,
+      fullDocument:
+        fullDocument === undefined
+          ? 'updateLookup'
+          : (fullDocument as any) === true
+            ? 'updateLookup'
+            : fullDocument,
+    } as any;
+    // Build pipeline with match if provided
+    const pipeline = filterObject ? [{ $match: filterObject }] : [];
     const changeStream = this.mongoDbCollection.watch(
-      [
-        {
-          $match: filterObject,
-        },
-      ],
-      {
-        fullDocument: 'updateLookup',
-      },
+      pipeline,
+      changeStreamOptions,
     );
-    const smartdataWatcher = new SmartdataDbWatcher(changeStream, smartdataDbDocArg);
+    const smartdataWatcher = new SmartdataDbWatcher(
+      changeStream,
+      smartdataDbDocArg,
+      { bufferTimeMs },
+    );
     await smartdataWatcher.readyDeferred.promise;
     return smartdataWatcher;
   }
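The nested ternary in `watch()` above normalizes the `fullDocument` option; pulled out as a pure function (hypothetical name, for illustration) the rule is easier to read:

```typescript
// Sketch of the fullDocument normalization done in watch(): undefined and a
// legacy boolean `true` both become 'updateLookup'; an explicit string value
// is passed through unchanged.
function normalizeFullDocument(
  fullDocument: string | boolean | undefined,
): string {
  if (fullDocument === undefined || fullDocument === true) {
    return 'updateLookup';
  }
  return fullDocument as string;
}
```

Defaulting to `'updateLookup'` means update events carry the full post-update document, which the watcher needs to construct document instances from change events.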
@@ -276,7 +298,10 @@ export class SmartdataCollection<T> {
   /**
    * create an object in the database
    */
-  public async insert(dbDocArg: T & SmartDataDbDoc<T, unknown>): Promise<any> {
+  public async insert(
+    dbDocArg: T & SmartDataDbDoc<T, unknown>,
+    opts?: { session?: plugins.mongodb.ClientSession }
+  ): Promise<any> {
     await this.init();
     await this.checkDoc(dbDocArg);
     this.markUniqueIndexes(dbDocArg.uniqueIndexes);
@@ -287,14 +312,17 @@ export class SmartdataCollection<T> {
     }
 
     const saveableObject = await dbDocArg.createSavableObject();
-    const result = await this.mongoDbCollection.insertOne(saveableObject);
+    const result = await this.mongoDbCollection.insertOne(saveableObject, { session: opts?.session });
     return result;
   }
 
   /**
    * inserts object into the DbCollection
    */
-  public async update(dbDocArg: T & SmartDataDbDoc<T, unknown>): Promise<any> {
+  public async update(
+    dbDocArg: T & SmartDataDbDoc<T, unknown>,
+    opts?: { session?: plugins.mongodb.ClientSession }
+  ): Promise<any> {
     await this.init();
     await this.checkDoc(dbDocArg);
     const identifiableObject = await dbDocArg.createIdentifiableObject();
@@ -309,21 +337,27 @@ export class SmartdataCollection<T> {
     const result = await this.mongoDbCollection.updateOne(
       identifiableObject,
       { $set: updateableObject },
-      { upsert: true },
+      { upsert: true, session: opts?.session },
     );
     return result;
   }
 
-  public async delete(dbDocArg: T & SmartDataDbDoc<T, unknown>): Promise<any> {
+  public async delete(
+    dbDocArg: T & SmartDataDbDoc<T, unknown>,
+    opts?: { session?: plugins.mongodb.ClientSession }
+  ): Promise<any> {
     await this.init();
     await this.checkDoc(dbDocArg);
     const identifiableObject = await dbDocArg.createIdentifiableObject();
-    await this.mongoDbCollection.deleteOne(identifiableObject);
+    await this.mongoDbCollection.deleteOne(identifiableObject, { session: opts?.session });
   }
 
-  public async getCount(filterObject: any) {
+  public async getCount(
+    filterObject: any,
+    opts?: { session?: plugins.mongodb.ClientSession }
+  ) {
     await this.init();
-    return this.mongoDbCollection.countDocuments(filterObject);
+    return this.mongoDbCollection.countDocuments(filterObject, { session: opts?.session });
   }
 
   /**
@@ -35,24 +35,43 @@ export class SmartdataDb {
    * connects to the database that was specified during instance creation
    */
   public async init(): Promise<any> {
+    try {
+      // Safely encode credentials to handle special characters
+      const encodedUser = this.smartdataOptions.mongoDbUser
+        ? encodeURIComponent(this.smartdataOptions.mongoDbUser)
+        : '';
+      const encodedPass = this.smartdataOptions.mongoDbPass
+        ? encodeURIComponent(this.smartdataOptions.mongoDbPass)
+        : '';
+
     const finalConnectionUrl = this.smartdataOptions.mongoDbUrl
-      .replace('<USERNAME>', this.smartdataOptions.mongoDbUser)
-      .replace('<username>', this.smartdataOptions.mongoDbUser)
-      .replace('<USER>', this.smartdataOptions.mongoDbUser)
-      .replace('<user>', this.smartdataOptions.mongoDbUser)
-      .replace('<PASSWORD>', this.smartdataOptions.mongoDbPass)
-      .replace('<password>', this.smartdataOptions.mongoDbPass)
+      .replace('<USERNAME>', encodedUser)
+      .replace('<username>', encodedUser)
+      .replace('<USER>', encodedUser)
+      .replace('<user>', encodedUser)
+      .replace('<PASSWORD>', encodedPass)
+      .replace('<password>', encodedPass)
       .replace('<DBNAME>', this.smartdataOptions.mongoDbName)
       .replace('<dbname>', this.smartdataOptions.mongoDbName);
 
-    this.mongoDbClient = await plugins.mongodb.MongoClient.connect(finalConnectionUrl, {
-      maxPoolSize: 100,
-      maxIdleTimeMS: 10,
-    });
+    const clientOptions: plugins.mongodb.MongoClientOptions = {
+      maxPoolSize: (this.smartdataOptions as any).maxPoolSize ?? 100,
+      maxIdleTimeMS: (this.smartdataOptions as any).maxIdleTimeMS ?? 300000, // 5 minutes default
+      serverSelectionTimeoutMS: (this.smartdataOptions as any).serverSelectionTimeoutMS ?? 30000,
+      retryWrites: true,
+    };
+
+    this.mongoDbClient = await plugins.mongodb.MongoClient.connect(finalConnectionUrl, clientOptions);
     this.mongoDb = this.mongoDbClient.db(this.smartdataOptions.mongoDbName);
     this.status = 'connected';
     this.statusConnectedDeferred.resolve();
-    console.log(`Connected to database ${this.smartdataOptions.mongoDbName}`);
+    logger.log('info', `Connected to database ${this.smartdataOptions.mongoDbName}`);
+    } catch (error) {
+      this.status = 'disconnected';
+      this.statusConnectedDeferred.reject(error);
+      logger.log('error', `Failed to connect to database ${this.smartdataOptions.mongoDbName}: ${error.message}`);
+      throw error;
+    }
   }
 
   /**
@@ -63,6 +82,12 @@ export class SmartdataDb {
     this.status = 'disconnected';
     logger.log('info', `disconnected from database ${this.smartdataOptions.mongoDbName}`);
   }
+  /**
+   * Start a MongoDB client session for transactions
+   */
+  public startSession(): plugins.mongodb.ClientSession {
+    return this.mongoDbClient.startSession();
+  }
 
   // handle table to class distribution
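The credential-encoding change above matters because characters such as `@` and `:` in a username or password would otherwise corrupt the connection URL. The placeholder substitution can be isolated as a pure helper (hypothetical name, a sketch of the logic in the hunk):

```typescript
// Sketch of the placeholder substitution performed in init(): credentials are
// percent-encoded with encodeURIComponent before being spliced into the URL,
// so reserved characters like '@' and ':' cannot break the URL structure.
function buildConnectionUrl(
  template: string,
  user: string,
  pass: string,
  dbName: string,
): string {
  const encodedUser = user ? encodeURIComponent(user) : '';
  const encodedPass = pass ? encodeURIComponent(pass) : '';
  return template
    .replace(/<user(name)?>/gi, encodedUser) // covers <USER>, <user>, <USERNAME>, <username>
    .replace(/<password>/gi, encodedPass)
    .replace(/<dbname>/gi, dbName);
}
```

For example, a password of `p@ss:1` becomes `p%40ss%3A1` in the final URL instead of introducing a spurious host separator.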
@@ -3,6 +3,7 @@ import { SmartdataDb } from './classes.db.js';
 import { managed, setDefaultManagerForDoc } from './classes.collection.js';
 import { SmartDataDbDoc, svDb, unI } from './classes.doc.js';
 import { SmartdataDbWatcher } from './classes.watcher.js';
+import { logger } from './logging.js';
 
 @managed()
 export class DistributedClass extends SmartDataDbDoc<DistributedClass, DistributedClass> {
@@ -63,11 +64,11 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
         this.ownInstance.data.elected = false;
       }
       if (this.ownInstance?.data.status === 'stopped') {
-        console.log(`stopping a distributed instance that has not been started yet.`);
+        logger.log('warn', `stopping a distributed instance that has not been started yet.`);
       }
       this.ownInstance.data.status = 'stopped';
       await this.ownInstance.save();
-      console.log(`stopped ${this.ownInstance.id}`);
+      logger.log('info', `stopped ${this.ownInstance.id}`);
     });
   }
 
@@ -83,17 +84,17 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
   public async sendHeartbeat() {
     await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
       if (this.ownInstance.data.status === 'stopped') {
-        console.log(`aborted sending heartbeat because status is stopped`);
+        logger.log('debug', `aborted sending heartbeat because status is stopped`);
         return;
       }
       await this.ownInstance.updateFromDb();
       this.ownInstance.data.lastUpdated = Date.now();
       await this.ownInstance.save();
-      console.log(`sent heartbeat for ${this.ownInstance.id}`);
+      logger.log('debug', `sent heartbeat for ${this.ownInstance.id}`);
       const allInstances = DistributedClass.getInstances({});
     });
     if (this.ownInstance.data.status === 'stopped') {
-      console.log(`aborted sending heartbeat because status is stopped`);
+      logger.log('info', `aborted sending heartbeat because status is stopped`);
       return;
     }
     const eligibleLeader = await this.getEligibleLeader();
@@ -120,7 +121,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
         await this.ownInstance.save();
       });
     } else {
-      console.warn(`distributed instance already initialized`);
+      logger.log('warn', `distributed instance already initialized`);
     }
 
     // lets enable the heartbeat
@@ -149,24 +150,24 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
   public async checkAndMaybeLead() {
     await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
       this.ownInstance.data.status = 'initializing';
-      this.ownInstance.save();
+      await this.ownInstance.save();
     });
     if (await this.getEligibleLeader()) {
       await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
         await this.ownInstance.updateFromDb();
         this.ownInstance.data.status = 'settled';
         await this.ownInstance.save();
-        console.log(`${this.ownInstance.id} settled as follower`);
+        logger.log('info', `${this.ownInstance.id} settled as follower`);
       });
       return;
     } else if (
       (await DistributedClass.getInstances({})).find((instanceArg) => {
-        instanceArg.data.status === 'bidding' &&
+        return instanceArg.data.status === 'bidding' &&
           instanceArg.data.biddingStartTime <= Date.now() - 4000 &&
           instanceArg.data.biddingStartTime >= Date.now() - 30000;
       })
     ) {
-      console.log('too late to the bidding party... waiting for next round.');
+      logger.log('info', 'too late to the bidding party... waiting for next round.');
       return;
     } else {
       await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
@@ -175,9 +176,9 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
         this.ownInstance.data.biddingStartTime = Date.now();
         this.ownInstance.data.biddingShortcode = plugins.smartunique.shortId();
         await this.ownInstance.save();
-        console.log('bidding code stored.');
+        logger.log('info', 'bidding code stored.');
       });
-      console.log(`bidding for leadership...`);
+      logger.log('info', `bidding for leadership...`);
       await plugins.smartdelay.delayFor(plugins.smarttime.getMilliSecondsFromUnits({ seconds: 5 }));
       await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
         let biddingInstances = await DistributedClass.getInstances({});
@@ -187,7 +188,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
             instanceArg.data.lastUpdated >=
               Date.now() - plugins.smarttime.getMilliSecondsFromUnits({ seconds: 10 }),
         );
-        console.log(`found ${biddingInstances.length} bidding instances...`);
+        logger.log('info', `found ${biddingInstances.length} bidding instances...`);
         this.ownInstance.data.elected = true;
         for (const biddingInstance of biddingInstances) {
           if (biddingInstance.data.biddingShortcode < this.ownInstance.data.biddingShortcode) {
@@ -195,7 +196,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
           }
         }
         await plugins.smartdelay.delayFor(5000);
-        console.log(`settling with status elected = ${this.ownInstance.data.elected}`);
+        logger.log('info', `settling with status elected = ${this.ownInstance.data.elected}`);
         this.ownInstance.data.status = 'settled';
         await this.ownInstance.save();
       });
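The election loop above keeps `elected = true` only while no other recent bidder holds a smaller `biddingShortcode`. Stripped of the persistence and timing machinery, the tie-break rule is simply a lexicographic minimum (assumed names, for illustration only):

```typescript
// Simplified sketch of the bidding tie-break used above: every instance
// publishes a random shortcode, and an instance remains elected only if no
// other bidder has a strictly smaller shortcode.
interface Bidder {
  id: string;
  biddingShortcode: string;
}

function isElected(own: Bidder, allBidders: Bidder[]): boolean {
  for (const other of allBidders) {
    if (other.id !== own.id && other.biddingShortcode < own.biddingShortcode) {
      return false; // another bidder wins the lexicographic tie-break
    }
  }
  return true;
}
```

Because every bidder applies the same deterministic comparison to the same set of shortcodes, at most one instance can end the round elected.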
@@ -226,11 +227,11 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
     this.distributedWatcher.changeSubject.subscribe({
       next: async (distributedDoc) => {
         if (!distributedDoc) {
-          console.log(`registered deletion of instance...`);
+          logger.log('info', `registered deletion of instance...`);
           return;
         }
-        console.log(distributedDoc);
-        console.log(`registered change for ${distributedDoc.id}`);
+        logger.log('info', distributedDoc);
+        logger.log('info', `registered change for ${distributedDoc.id}`);
         distributedDoc;
       },
     });
@@ -252,7 +253,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
   ): Promise<plugins.taskbuffer.distributedCoordination.IDistributedTaskRequestResult> {
     await this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
       if (!this.ownInstance) {
-        console.error('instance need to be started first...');
+        logger.log('error', 'instance need to be started first...');
         return;
       }
       await this.ownInstance.updateFromDb();
@@ -268,7 +269,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
       return taskRequestResult;
     });
     if (!result) {
-      console.warn('no result found for task request...');
+      logger.log('warn', 'no result found for task request...');
      return null;
    }
    return result;
@@ -285,7 +286,7 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
       );
     });
     if (!existingInfoBasis) {
-      console.warn('trying to update a non existing task request... aborting!');
+      logger.log('warn', 'trying to update a non existing task request... aborting!');
       return;
     }
     Object.assign(existingInfoBasis, infoBasisArg);
@@ -293,8 +294,10 @@ export class SmartdataDistributedCoordinator extends plugins.taskbuffer.distribu
     plugins.smartdelay.delayFor(60000).then(() => {
       this.asyncExecutionStack.getExclusiveExecutionSlot(async () => {
         const indexToRemove = this.ownInstance.data.taskRequests.indexOf(existingInfoBasis);
-        this.ownInstance.data.taskRequests.splice(indexToRemove, indexToRemove);
+        if (indexToRemove >= 0) {
+          this.ownInstance.data.taskRequests.splice(indexToRemove, 1);
           await this.ownInstance.save();
+        }
       });
     });
   });
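The last hunk fixes a subtle bug: `splice(indexToRemove, indexToRemove)` passes the index as the *delete count*, so it removes `indexToRemove` elements (and removes nothing at all when the index is 0). A standalone illustration of the wrong and the corrected call:

```typescript
// Demonstrates why `splice(index, index)` is wrong: splice's second argument
// is a delete count, not an end index.
function removeBuggy<T>(arr: T[], item: T): T[] {
  const copy = [...arr];
  const i = copy.indexOf(item);
  copy.splice(i, i); // bug: deletes `i` elements (none when i === 0)
  return copy;
}

function removeFixed<T>(arr: T[], item: T): T[] {
  const copy = [...arr];
  const i = copy.indexOf(item);
  if (i >= 0) {
    copy.splice(i, 1); // delete exactly the one matched element
  }
  return copy;
}
```

Guarding on `i >= 0` also avoids acting on a `-1` from `indexOf`, matching the `if (indexToRemove >= 0)` check introduced in the hunk.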
|
@@ -1,10 +1,30 @@
 import * as plugins from './plugins.js';
 
 import { SmartdataDb } from './classes.db.js';
+import { logger } from './logging.js';
 import { SmartdataDbCursor } from './classes.cursor.js';
 import { type IManager, SmartdataCollection } from './classes.collection.js';
 import { SmartdataDbWatcher } from './classes.watcher.js';
 import { SmartdataLuceneAdapter } from './classes.lucene.adapter.js';
+/**
+ * Search options for `.search()`:
+ * - filter: additional MongoDB query to AND-merge
+ * - validate: post-fetch validator, return true to keep a doc
+ */
+export interface SearchOptions<T> {
+  /**
+   * Additional MongoDB filter to AND‐merge into the query
+   */
+  filter?: Record<string, any>;
+  /**
+   * Post‐fetch validator; return true to keep each doc
+   */
+  validate?: (doc: T) => Promise<boolean> | boolean;
+  /**
+   * Optional MongoDB session for transactional operations
+   */
+  session?: plugins.mongodb.ClientSession;
+}
+
 export type TDocCreation = 'db' | 'new' | 'mixed';
 
@@ -12,7 +32,7 @@ export type TDocCreation = 'db' | 'new' | 'mixed';
 
 export function globalSvDb() {
   return (target: SmartDataDbDoc<unknown, unknown>, key: string) => {
-    console.log(`called svDb() on >${target.constructor.name}.${key}<`);
+    logger.log('debug', `called svDb() on >${target.constructor.name}.${key}<`);
     if (!target.globalSaveableProperties) {
       target.globalSaveableProperties = [];
     }
@@ -20,16 +40,34 @@ export function globalSvDb() {
   };
 }
 
+/**
+ * Options for custom serialization/deserialization of a field.
+ */
+export interface SvDbOptions {
+  /** Function to serialize the field value before saving to DB */
+  serialize?: (value: any) => any;
+  /** Function to deserialize the field value after reading from DB */
+  deserialize?: (value: any) => any;
+}
+
 /**
  * saveable - saveable decorator to be used on class properties
  */
-export function svDb() {
+export function svDb(options?: SvDbOptions) {
   return (target: SmartDataDbDoc<unknown, unknown>, key: string) => {
-    console.log(`called svDb() on >${target.constructor.name}.${key}<`);
+    logger.log('debug', `called svDb() on >${target.constructor.name}.${key}<`);
     if (!target.saveableProperties) {
       target.saveableProperties = [];
     }
     target.saveableProperties.push(key);
+    // attach custom serializer/deserializer options to the class constructor
+    const ctor = target.constructor as any;
+    if (!ctor._svDbOptions) {
+      ctor._svDbOptions = {};
+    }
+    if (options) {
+      ctor._svDbOptions[key] = options;
+    }
   };
 }
 
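The `svDb(options)` change above lets a field declare custom `serialize`/`deserialize` hooks that are applied when writing to and reading from MongoDB. A minimal standalone sketch of that round-trip logic follows; the names (`FieldCodec`, `toDbShape`, `fromDbShape`) are illustrative stand-ins, not smartdata's API, and the option map mirrors what the decorator stores on the constructor as `_svDbOptions`:

```typescript
// Sketch of the svDb serialize/deserialize round-trip (illustrative names).
type FieldCodec = {
  serialize?: (value: any) => any;
  deserialize?: (value: any) => any;
};

// per-field codecs, as the decorator would collect them on the constructor
const svDbOptions: Record<string, FieldCodec> = {
  // example: store a Map as a plain object in MongoDB
  settings: {
    serialize: (v: Map<string, string>) => Object.fromEntries(v),
    deserialize: (v: Record<string, string>) => new Map(Object.entries(v)),
  },
};

// applied before writing a document to the DB
function toDbShape(doc: Record<string, any>): Record<string, any> {
  const out: Record<string, any> = {};
  for (const key of Object.keys(doc)) {
    const codec = svDbOptions[key];
    out[key] = codec?.serialize ? codec.serialize(doc[key]) : doc[key];
  }
  return out;
}

// applied when hydrating an instance from a native DB document
function fromDbShape(raw: Record<string, any>): Record<string, any> {
  const out: Record<string, any> = {};
  for (const key of Object.keys(raw)) {
    const codec = svDbOptions[key];
    out[key] = codec?.deserialize ? codec.deserialize(raw[key]) : raw[key];
  }
  return out;
}
```

A `Map` field survives the round-trip: it is stored as a plain object but comes back as a `Map`.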
@@ -57,7 +95,7 @@ function escapeForRegex(input: string): string {
  */
 export function unI() {
   return (target: SmartDataDbDoc<unknown, unknown>, key: string) => {
-    console.log(`called unI on >>${target.constructor.name}.${key}<<`);
+    logger.log('debug', `called unI on >>${target.constructor.name}.${key}<<`);
 
     // mark the index as unique
     if (!target.uniqueIndexes) {
@@ -89,7 +127,7 @@ export interface IIndexOptions {
  */
 export function index(options?: IIndexOptions) {
   return (target: SmartDataDbDoc<unknown, unknown>, key: string) => {
-    console.log(`called index() on >${target.constructor.name}.${key}<`);
+    logger.log('debug', `called index() on >${target.constructor.name}.${key}<`);
 
     // Initialize regular indexes array if it doesn't exist
     if (!target.regularIndexes) {
@@ -113,9 +151,56 @@ export function index(options?: IIndexOptions) {
   };
 }
 
+// Helper type to extract element type from arrays or return T itself
+type ElementOf<T> = T extends ReadonlyArray<infer U> ? U : T;
+
+// Type for $in/$nin values - arrays of the element type
+type InValues<T> = ReadonlyArray<ElementOf<T>>;
+
+// Type that allows MongoDB operators on leaf values while maintaining nested type safety
+export type MongoFilterCondition<T> = T | {
+  $eq?: T;
+  $ne?: T;
+  $gt?: T;
+  $gte?: T;
+  $lt?: T;
+  $lte?: T;
+  $in?: InValues<T>;
+  $nin?: InValues<T>;
+  $exists?: boolean;
+  $type?: string | number;
+  $regex?: string | RegExp;
+  $options?: string;
+  $all?: T extends ReadonlyArray<infer U> ? ReadonlyArray<U> : never;
+  $elemMatch?: T extends ReadonlyArray<infer U> ? MongoFilter<U> : never;
+  $size?: T extends ReadonlyArray<any> ? number : never;
+  $not?: MongoFilterCondition<T>;
+};
+
+export type MongoFilter<T> = {
+  [K in keyof T]?: T[K] extends object
+    ? T[K] extends any[]
+      ? MongoFilterCondition<T[K]> // Arrays can have operators
+      : MongoFilter<T[K]> | MongoFilterCondition<T[K]> // Objects can be nested or have operators
+    : MongoFilterCondition<T[K]>; // Primitives get operators
+} & {
+  // Logical operators
+  $and?: MongoFilter<T>[];
+  $or?: MongoFilter<T>[];
+  $nor?: MongoFilter<T>[];
+  $not?: MongoFilter<T>;
+  // Allow any string key for dot notation (we lose type safety here but maintain flexibility)
+  [key: string]: any;
+};
+
 export const convertFilterForMongoDb = (filterArg: { [key: string]: any }) => {
+  // SECURITY: Block $where to prevent server-side JS execution
+  if (filterArg.$where !== undefined) {
+    throw new Error('$where operator is not allowed for security reasons');
+  }
+
   // Special case: detect MongoDB operators and pass them through directly
-  const topLevelOperators = ['$and', '$or', '$nor', '$not', '$text', '$where', '$regex'];
+  const topLevelOperators = ['$and', '$or', '$nor', '$not', '$text', '$regex'];
   for (const key of Object.keys(filterArg)) {
     if (topLevelOperators.includes(key)) {
       return filterArg; // Return the filter as-is for MongoDB operators
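The `MongoFilter<T>` type introduced above accepts plain equality values, operator objects on leaf fields, and top-level logical combinators. A quick illustration of the filter shapes it is designed to admit, against a hypothetical `ICar` document (the interface and the sample filters are illustrative, not from the library):

```typescript
// Hypothetical document shape to exercise the filter typing.
interface ICar {
  brand: string;
  year: number;
  tags: string[];
}

// All of these are shapes MongoFilter<ICar> is meant to accept:
const byBrand = { brand: 'BMW' };                  // direct equality
const recent = { year: { $gte: 2020 } };           // leaf operator object
const tagged = { tags: { $in: ['ev', 'hybrid'] } };// array operator
const combined = { $and: [byBrand, recent] };      // logical combinator
```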
@@ -127,21 +212,76 @@ export const convertFilterForMongoDb = (filterArg: { [key: string]: any }) => {
 
   const convertFilterArgument = (keyPathArg2: string, filterArg2: any) => {
     if (Array.isArray(filterArg2)) {
-      // Directly assign arrays (they might be using operators like $in or $all)
+      // Arrays are typically used as values for operators like $in or as direct equality matches
-      convertFilterArgument(keyPathArg2, filterArg2[0]);
-    } else if (typeof filterArg2 === 'object' && filterArg2 !== null) {
-      for (const key of Object.keys(filterArg2)) {
-        if (key.startsWith('$')) {
       convertedFilter[keyPathArg2] = filterArg2;
       return;
-        } else if (key.includes('.')) {
+    } else if (typeof filterArg2 === 'object' && filterArg2 !== null) {
+      // Check if this is an object with MongoDB operators
+      const keys = Object.keys(filterArg2);
+      const hasOperators = keys.some(key => key.startsWith('$'));
+
+      if (hasOperators) {
+        // This object contains MongoDB operators
+        // Validate and pass through allowed operators
+        const allowedOperators = [
+          // Comparison operators
+          '$eq', '$ne', '$gt', '$gte', '$lt', '$lte',
+          // Array operators
+          '$in', '$nin', '$all', '$elemMatch', '$size',
+          // Element operators
+          '$exists', '$type',
+          // Evaluation operators (safe ones only)
+          '$regex', '$options', '$text', '$mod',
+          // Logical operators (nested)
+          '$and', '$or', '$nor', '$not'
+        ];
+
+        // Check for dangerous operators
+        if (keys.includes('$where')) {
+          throw new Error('$where operator is not allowed for security reasons');
+        }
+
+        // Validate all operators are in the allowed list
+        const invalidOperators = keys.filter(key =>
+          key.startsWith('$') && !allowedOperators.includes(key)
+        );
+
+        if (invalidOperators.length > 0) {
+          console.warn(`Warning: Unknown MongoDB operators detected: ${invalidOperators.join(', ')}`);
+        }
+
+        // For array operators, ensure the values are appropriate
+        if (filterArg2.$in && !Array.isArray(filterArg2.$in)) {
+          throw new Error('$in operator requires an array value');
+        }
+        if (filterArg2.$nin && !Array.isArray(filterArg2.$nin)) {
+          throw new Error('$nin operator requires an array value');
+        }
+        if (filterArg2.$all && !Array.isArray(filterArg2.$all)) {
+          throw new Error('$all operator requires an array value');
+        }
+        if (filterArg2.$size && typeof filterArg2.$size !== 'number') {
+          throw new Error('$size operator requires a numeric value');
+        }
+
+        // Pass the operator object through
+        convertedFilter[keyPathArg2] = filterArg2;
+        return;
+      }
+
+      // No operators, check for dots in keys
+      for (const key of keys) {
+        if (key.includes('.')) {
           throw new Error('keys cannot contain dots');
         }
       }
-      for (const key of Object.keys(filterArg2)) {
+      // Recursively process nested objects
+      for (const key of keys) {
         convertFilterArgument(`${keyPathArg2}.${key}`, filterArg2[key]);
       }
     } else {
+      // Primitive values
      convertedFilter[keyPathArg2] = filterArg2;
     }
   };
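The converter above has two behaviors worth calling out: nested plain objects are flattened into dot-path keys, while objects containing `$`-prefixed operators pass through untouched at their current path. A self-contained sketch of just that flattening rule (the `flattenFilter` name is illustrative; the real converter also performs the operator validation shown in the diff):

```typescript
// Sketch of the dot-path flattening performed by convertFilterForMongoDb:
// plain nested objects become 'a.b' keys; operator objects pass through.
function flattenFilter(filter: Record<string, any>): Record<string, any> {
  const out: Record<string, any> = {};
  const walk = (path: string, value: any) => {
    if (
      value !== null &&
      typeof value === 'object' &&
      !Array.isArray(value) &&
      !Object.keys(value).some((k) => k.startsWith('$'))
    ) {
      // plain nested object: recurse with a dot-extended path
      for (const k of Object.keys(value)) walk(`${path}.${k}`, value[k]);
    } else {
      // primitive, array, or operator object: assign at the current path
      out[path] = value;
    }
  };
  for (const k of Object.keys(filter)) walk(k, filter[k]);
  return out;
}
```

So `{ engine: { power: 100 } }` becomes `{ 'engine.power': 100 }`, while `{ year: { $gte: 2020 } }` is left as an operator match on `year`.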
@@ -170,7 +310,12 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
     const newInstance = new this();
     (newInstance as any).creationStatus = 'db';
     for (const key of Object.keys(mongoDbNativeDocArg)) {
-      newInstance[key] = mongoDbNativeDocArg[key];
+      const rawValue = mongoDbNativeDocArg[key];
+      const optionsMap = (this as any)._svDbOptions || {};
+      const opts = optionsMap[key];
+      newInstance[key] = opts && typeof opts.deserialize === 'function'
+        ? opts.deserialize(rawValue)
+        : rawValue;
     }
     return newInstance;
   }
@@ -178,14 +323,19 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
   /**
    * gets all instances as array
    * @param this
-   * @param filterArg
+   * @param filterArg - Type-safe MongoDB filter with nested object support and operators
    * @returns
    */
   public static async getInstances<T>(
     this: plugins.tsclass.typeFest.Class<T>,
-    filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
+    filterArg: MongoFilter<T>,
+    opts?: { session?: plugins.mongodb.ClientSession }
   ): Promise<T[]> {
-    const foundDocs = await (this as any).collection.findAll(convertFilterForMongoDb(filterArg));
+    // Pass session through to findAll for transactional queries
+    const foundDocs = await (this as any).collection.findAll(
+      convertFilterForMongoDb(filterArg),
+      { session: opts?.session },
+    );
     const returnArray = [];
     for (const foundDoc of foundDocs) {
       const newInstance: T = (this as any).createInstanceFromMongoDbNativeDoc(foundDoc);
@@ -202,9 +352,14 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
    */
   public static async getInstance<T>(
     this: plugins.tsclass.typeFest.Class<T>,
-    filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
+    filterArg: MongoFilter<T>,
+    opts?: { session?: plugins.mongodb.ClientSession }
   ): Promise<T> {
-    const foundDoc = await (this as any).collection.findOne(convertFilterForMongoDb(filterArg));
+    // Retrieve one document, with optional session for transactions
+    const foundDoc = await (this as any).collection.findOne(
+      convertFilterForMongoDb(filterArg),
+      { session: opts?.session },
+    );
     if (foundDoc) {
       const newInstance: T = (this as any).createInstanceFromMongoDbNativeDoc(foundDoc);
       return newInstance;
@@ -224,33 +379,27 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
   }
 
   /**
-   * get cursor
-   * @returns
+   * Get a cursor for streaming results, with optional session and native cursor modifiers.
+   * @param filterArg Partial filter to apply
+   * @param opts Optional session and modifier for the raw MongoDB cursor
    */
   public static async getCursor<T>(
     this: plugins.tsclass.typeFest.Class<T>,
-    filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
-  ) {
-    const collection: SmartdataCollection<T> = (this as any).collection;
-    const cursor: SmartdataDbCursor<T> = await collection.getCursor(
-      convertFilterForMongoDb(filterArg),
-      this as any as typeof SmartDataDbDoc,
-    );
-    return cursor;
-  }
-
-  public static async getCursorExtended<T>(
-    this: plugins.tsclass.typeFest.Class<T>,
-    filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
-    modifierFunction = (cursorArg: plugins.mongodb.FindCursor<plugins.mongodb.WithId<plugins.mongodb.BSON.Document>>) => cursorArg,
+    filterArg: MongoFilter<T>,
+    opts?: {
+      session?: plugins.mongodb.ClientSession;
+      modifier?: (cursorArg: plugins.mongodb.FindCursor<plugins.mongodb.WithId<plugins.mongodb.BSON.Document>>) => plugins.mongodb.FindCursor<plugins.mongodb.WithId<plugins.mongodb.BSON.Document>>;
+    }
   ): Promise<SmartdataDbCursor<T>> {
     const collection: SmartdataCollection<T> = (this as any).collection;
+    const { session, modifier } = opts || {};
     await collection.init();
-    let cursor: plugins.mongodb.FindCursor<any> = collection.mongoDbCollection.find(
-      convertFilterForMongoDb(filterArg),
-    );
-    cursor = modifierFunction(cursor);
-    return new SmartdataDbCursor<T>(cursor, this as any as typeof SmartDataDbDoc);
+    let rawCursor: plugins.mongodb.FindCursor<any> =
+      collection.mongoDbCollection.find(convertFilterForMongoDb(filterArg), { session });
+    if (modifier) {
+      rawCursor = modifier(rawCursor);
+    }
+    return new SmartdataDbCursor<T>(rawCursor, this as any as typeof SmartDataDbDoc);
   }
 
   /**
@@ -259,13 +408,20 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
    * @param filterArg
    * @param forEachFunction
    */
+  /**
+   * Watch the collection for changes, with optional buffering and change stream options.
+   * @param filterArg MongoDB filter to select which changes to observe
+   * @param opts optional ChangeStreamOptions plus bufferTimeMs
+   */
   public static async watch<T>(
     this: plugins.tsclass.typeFest.Class<T>,
-    filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
-  ) {
+    filterArg: MongoFilter<T>,
+    opts?: plugins.mongodb.ChangeStreamOptions & { bufferTimeMs?: number },
+  ): Promise<SmartdataDbWatcher<T>> {
     const collection: SmartdataCollection<T> = (this as any).collection;
     const watcher: SmartdataDbWatcher<T> = await collection.watch(
       convertFilterForMongoDb(filterArg),
+      opts || {},
       this as any,
     );
     return watcher;
@@ -277,7 +433,7 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
    */
   public static async forEach<T>(
     this: plugins.tsclass.typeFest.Class<T>,
-    filterArg: plugins.tsclass.typeFest.PartialDeep<T>,
+    filterArg: MongoFilter<T>,
     forEachFunction: (itemArg: T) => Promise<any>,
   ) {
     const cursor: SmartdataDbCursor<T> = await (this as any).getCursor(filterArg);
@@ -289,7 +445,7 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
    */
   public static async getCount<T>(
     this: plugins.tsclass.typeFest.Class<T>,
-    filterArg: plugins.tsclass.typeFest.PartialDeep<T> = {} as any,
+    filterArg: MongoFilter<T> = {} as any,
   ) {
     const collection: SmartdataCollection<T> = (this as any).collection;
     return await collection.getCount(filterArg);
@@ -318,15 +474,42 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
     const ctor = this as any;
     return Array.isArray(ctor.searchableFields) ? ctor.searchableFields : [];
   }
+  /**
+   * Execute a query with optional hard filter and post-fetch validation
+   */
+  private static async execQuery<T>(
+    this: plugins.tsclass.typeFest.Class<T>,
+    baseFilter: Record<string, any>,
+    opts?: SearchOptions<T>
+  ): Promise<T[]> {
+    let mongoFilter = baseFilter || {};
+    if (opts?.filter) {
+      mongoFilter = { $and: [mongoFilter, opts.filter] };
+    }
+    // Fetch within optional session for transactions
+    let docs: T[] = await (this as any).getInstances(mongoFilter, { session: opts?.session });
+    if (opts?.validate) {
+      const out: T[] = [];
+      for (const d of docs) {
+        if (await opts.validate(d)) out.push(d);
+      }
+      docs = out;
+    }
+    return docs;
+  }
 
   /**
    * Search documents by text or field:value syntax, with safe regex fallback
+   * Supports additional filtering and post-fetch validation via opts
    * @param query A search term or field:value expression
+   * @param opts Optional filter and validate hooks
    * @returns Array of matching documents
    */
   public static async search<T>(
     this: plugins.tsclass.typeFest.Class<T>,
     query: string,
+    opts?: SearchOptions<T>,
   ): Promise<T[]> {
     const searchableFields = (this as any).getSearchableFields();
     if (searchableFields.length === 0) {
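The new `execQuery` helper does two things: it AND-merges the caller's extra `filter` into the base query, then applies the async `validate` hook to the fetched documents. A self-contained sketch of both steps on in-memory data (names `andMerge` and `filterDocs` are illustrative, not library API):

```typescript
// Sketch of execQuery's two steps, on plain data.
type Doc = { brand: string; year: number };

// Step 1: AND-merge an optional extra filter into the base query.
function andMerge(base: Record<string, any>, extra?: Record<string, any>) {
  return extra ? { $and: [base, extra] } : base;
}

// Step 2: apply an optional (possibly async) post-fetch validator.
async function filterDocs(
  docs: Doc[],
  validate?: (d: Doc) => Promise<boolean> | boolean,
): Promise<Doc[]> {
  if (!validate) return docs;
  const out: Doc[] = [];
  for (const d of docs) {
    if (await validate(d)) out.push(d);
  }
  return out;
}
```

The validator runs after the database fetch, so it can encode checks MongoDB cannot express, at the cost of fetching candidates first.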
@@ -335,7 +518,8 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
     // empty query -> return all
     const q = query.trim();
     if (!q) {
-      return await (this as any).getInstances({});
+      // empty query: fetch all, apply opts
+      return await (this as any).execQuery({}, opts);
     }
     // simple exact field:value (no spaces, no wildcards, no quotes)
     // simple exact field:value (no spaces, wildcards, quotes)
@@ -346,30 +530,35 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
       if (!searchableFields.includes(field)) {
         throw new Error(`Field '${field}' is not searchable for class ${this.name}`);
       }
-      return await (this as any).getInstances({ [field]: value });
+      // simple field:value search
+      return await (this as any).execQuery({ [field]: value }, opts);
     }
     // quoted phrase across all searchable fields: exact match of phrase
     const quoted = q.match(/^"(.+)"$|^'(.+)'$/);
     if (quoted) {
       const phrase = quoted[1] || quoted[2] || '';
-      // build regex that matches the exact phrase (allowing flexible whitespace)
       const parts = phrase.split(/\s+/).map((t) => escapeForRegex(t));
       const pattern = parts.join('\\s+');
       const orConds = searchableFields.map((f) => ({ [f]: { $regex: pattern, $options: 'i' } }));
-      return await (this as any).getInstances({ $or: orConds });
+      return await (this as any).execQuery({ $or: orConds }, opts);
     }
     // wildcard field:value (supports * and ?) -> direct regex on that field
     const wildcardField = q.match(/^(\w+):(.+[*?].*)$/);
     if (wildcardField) {
       const field = wildcardField[1];
-      const pattern = wildcardField[2];
+      // Support quoted wildcard patterns: strip surrounding quotes
+      let pattern = wildcardField[2];
+      if ((pattern.startsWith('"') && pattern.endsWith('"')) ||
+          (pattern.startsWith("'") && pattern.endsWith("'"))) {
+        pattern = pattern.slice(1, -1);
+      }
       if (!searchableFields.includes(field)) {
         throw new Error(`Field '${field}' is not searchable for class ${this.name}`);
       }
       // escape regex special chars except * and ?, then convert wildcards
       const escaped = pattern.replace(/([.+^${}()|[\\]\\])/g, '\\$1');
       const regexPattern = escaped.replace(/\*/g, '.*').replace(/\?/g, '.');
-      return await (this as any).getInstances({ [field]: { $regex: regexPattern, $options: 'i' } });
+      return await (this as any).execQuery({ [field]: { $regex: regexPattern, $options: 'i' } }, opts);
     }
     // wildcard plain term across all fields (supports * and ?)
     if (!q.includes(':') && (q.includes('*') || q.includes('?'))) {
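The wildcard branches above share one conversion: escape every regex metacharacter except `*` and `?`, then map `*` to `.*` and `?` to `.`. A runnable sketch of that conversion; note that, unlike the diff's unanchored `$regex`, this demo anchors with `^`/`$` to show whole-string matching (an assumption for the example):

```typescript
// Sketch of the wildcard-to-regex conversion used by search():
// escape metacharacters except * and ?, then * -> .* and ? -> .
function wildcardToRegex(pattern: string): RegExp {
  const escaped = pattern.replace(/([.+^${}()|[\]\\])/g, '\\$1');
  const converted = escaped.replace(/\*/g, '.*').replace(/\?/g, '.');
  // anchored here for demonstration; the library uses an unanchored $regex
  return new RegExp(`^${converted}$`, 'i');
}
```

So `bm*` matches `BMW` case-insensitively, `b?w` matches `bmw`, and a literal `.` in the pattern stays a literal dot.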
@@ -377,13 +566,60 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
       const escaped = q.replace(/([.+^${}()|[\\]\\])/g, '\\$1');
       const pattern = escaped.replace(/\*/g, '.*').replace(/\?/g, '.');
       const orConds = searchableFields.map((f) => ({ [f]: { $regex: pattern, $options: 'i' } }));
-      return await (this as any).getInstances({ $or: orConds });
+      return await (this as any).execQuery({ $or: orConds }, opts);
+    }
+    // implicit AND for multiple tokens: free terms, quoted phrases, and field:values
+    {
+      // Split query into tokens, preserving quoted substrings
+      const rawTokens = q.match(/(?:[^\s"']+|"[^"]*"|'[^']*')+/g) || [];
+      // Only apply when more than one token and no boolean operators or grouping
+      if (
+        rawTokens.length > 1 &&
+        !/(\bAND\b|\bOR\b|\bNOT\b|\(|\))/i.test(q) &&
+        !/\[|\]/.test(q)
+      ) {
+        const andConds: any[] = [];
+        for (let token of rawTokens) {
+          // field:value token
+          const fv = token.match(/^(\w+):(.+)$/);
+          if (fv) {
+            const field = fv[1];
+            let value = fv[2];
+            if (!searchableFields.includes(field)) {
+              throw new Error(`Field '${field}' is not searchable for class ${this.name}`);
+            }
+            // Strip surrounding quotes if present
+            if ((value.startsWith('"') && value.endsWith('"')) || (value.startsWith("'") && value.endsWith("'"))) {
+              value = value.slice(1, -1);
+            }
+            // Wildcard search?
+            if (value.includes('*') || value.includes('?')) {
+              const escaped = value.replace(/([.+^${}()|[\\]\\])/g, '\\$1');
+              const pattern = escaped.replace(/\*/g, '.*').replace(/\?/g, '.');
+              andConds.push({ [field]: { $regex: pattern, $options: 'i' } });
+            } else {
+              andConds.push({ [field]: value });
+            }
+          } else if ((token.startsWith('"') && token.endsWith('"')) || (token.startsWith("'") && token.endsWith("'"))) {
+            // Quoted free phrase across all fields
+            const phrase = token.slice(1, -1);
+            const parts = phrase.split(/\s+/).map((t) => escapeForRegex(t));
+            const pattern = parts.join('\\s+');
+            andConds.push({ $or: searchableFields.map((f) => ({ [f]: { $regex: pattern, $options: 'i' } })) });
+          } else {
+            // Free term across all fields
+            const esc = escapeForRegex(token);
+            andConds.push({ $or: searchableFields.map((f) => ({ [f]: { $regex: esc, $options: 'i' } })) });
+          }
+        }
+        return await (this as any).execQuery({ $and: andConds }, opts);
+      }
     }
     // detect advanced Lucene syntax: field:value, wildcards, boolean, grouping
     const luceneSyntax = /(\w+:[^\s]+)|\*|\?|\bAND\b|\bOR\b|\bNOT\b|\(|\)/;
     if (luceneSyntax.test(q)) {
       const filter = (this as any).createSearchFilter(q);
-      return await (this as any).getInstances(filter);
+      return await (this as any).execQuery(filter, opts);
     }
     // multi-term unquoted -> AND of regex across fields for each term
     const terms = q.split(/\s+/);
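The implicit-AND branch above hinges on its tokenizer regex, which splits on whitespace while keeping quoted phrases, and `field:"quoted value"` pairs, as single tokens. A runnable sketch of just the tokenizer (the `tokenizeQuery` name is illustrative; the regex is the one from the diff):

```typescript
// Sketch of the implicit-AND tokenizer: whitespace-split, but quoted
// substrings (and field:"..." compounds) survive as one token.
function tokenizeQuery(q: string): string[] {
  return q.match(/(?:[^\s"']+|"[^"]*"|'[^']*')+/g) || [];
}
```

For example, `brand:BMW "fast car" cheap` yields three tokens, and `name:"John Doe"` stays a single `field:value` token because the `+` quantifier glues the unquoted prefix to the quoted run.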
@@ -393,12 +629,12 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
       const ors = searchableFields.map((f) => ({ [f]: { $regex: esc, $options: 'i' } }));
       return { $or: ors };
     });
-    return await (this as any).getInstances({ $and: andConds });
+    return await (this as any).execQuery({ $and: andConds }, opts);
     }
     // single term -> regex across all searchable fields
     const esc = escapeForRegex(q);
     const orConds = searchableFields.map((f) => ({ [f]: { $regex: esc, $options: 'i' } }));
-    return await (this as any).getInstances({ $or: orConds });
+    return await (this as any).execQuery({ $or: orConds }, opts);
   }
 
 
```diff
@@ -459,35 +695,52 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
   constructor() {}

   /**
-   * saves this instance but not any connected items
-   * may lead to data inconsistencies, but is faster
+   * saves this instance (optionally within a transaction)
    */
-  public async save() {
+  public async save(opts?: { session?: plugins.mongodb.ClientSession }) {
+    // allow hook before saving
+    if (typeof (this as any).beforeSave === 'function') {
+      await (this as any).beforeSave();
+    }
     // tslint:disable-next-line: no-this-assignment
     const self: any = this;
     let dbResult: any;
+    // update timestamp
     this._updatedAt = new Date().toISOString();
+    // perform insert or update
     switch (this.creationStatus) {
       case 'db':
-        dbResult = await this.collection.update(self);
+        dbResult = await this.collection.update(self, { session: opts?.session });
         break;
       case 'new':
-        dbResult = await this.collection.insert(self);
+        dbResult = await this.collection.insert(self, { session: opts?.session });
         this.creationStatus = 'db';
         break;
       default:
-        console.error('neither new nor in db?');
+        logger.log('error', 'neither new nor in db?');
     }
+    // allow hook after saving
+    if (typeof (this as any).afterSave === 'function') {
+      await (this as any).afterSave();
+    }
     return dbResult;
   }

   /**
-   * deletes a document from the database
+   * deletes a document from the database (optionally within a transaction)
    */
-  public async delete() {
-    await this.collection.delete(this);
+  public async delete(opts?: { session?: plugins.mongodb.ClientSession }) {
+    // allow hook before deleting
+    if (typeof (this as any).beforeDelete === 'function') {
+      await (this as any).beforeDelete();
+    }
+    // perform deletion
+    const result = await this.collection.delete(this, { session: opts?.session });
+    // allow hook after delete
+    if (typeof (this as any).afterDelete === 'function') {
+      await (this as any).afterDelete();
+    }
+    return result;
   }

   /**
```
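save() and delete() now look up duck-typed beforeSave/afterSave (and beforeDelete/afterDelete) hooks at call time, so subclasses opt in simply by defining the methods. A condensed sketch of that dispatch pattern, synchronous here for brevity (the real methods are async; class names are illustrative):

```typescript
// Base class checks at runtime whether a subclass defines a hook,
// mirroring the `typeof (this as any).beforeSave === 'function'` guards above.
class BaseDoc {
  public log: string[] = [];
  public save(): void {
    const self = this as BaseDoc & { beforeSave?: () => void; afterSave?: () => void };
    if (typeof self.beforeSave === 'function') self.beforeSave();
    this.log.push('persist'); // stands in for the insert/update call
    if (typeof self.afterSave === 'function') self.afterSave();
  }
}

class AuditedDoc extends BaseDoc {
  public beforeSave() { this.log.push('before'); }
  public afterSave() { this.log.push('after'); }
}

const audited = new AuditedDoc();
audited.save();   // log: ['before', 'persist', 'after']
const plain = new BaseDoc();
plain.save();     // log: ['persist'] (hooks are optional)
```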
```diff
@@ -511,11 +764,20 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
   /**
    * updates an object from db
    */
-  public async updateFromDb() {
+  public async updateFromDb(): Promise<boolean> {
     const mongoDbNativeDoc = await this.collection.findOne(await this.createIdentifiableObject());
-    for (const key of Object.keys(mongoDbNativeDoc)) {
-      this[key] = mongoDbNativeDoc[key];
+    if (!mongoDbNativeDoc) {
+      return false; // Document not found in database
     }
+    for (const key of Object.keys(mongoDbNativeDoc)) {
+      const rawValue = mongoDbNativeDoc[key];
+      const optionsMap = (this.constructor as any)._svDbOptions || {};
+      const opts = optionsMap[key];
+      this[key] = opts && typeof opts.deserialize === 'function'
+        ? opts.deserialize(rawValue)
+        : rawValue;
+    }
+    return true;
   }

   /**
```
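updateFromDb() now returns a boolean instead of crashing on `Object.keys(null)` when the backing document has been deleted remotely. The refresh contract in isolation (the standalone helper is illustrative):

```typescript
// Copy fields from a fetched native doc onto the instance; report whether
// the backing document still exists (a null fetch result means it is gone).
function refreshFrom(
  target: Record<string, any>,
  native: Record<string, any> | null,
): boolean {
  if (!native) {
    return false; // document not found in database
  }
  for (const key of Object.keys(native)) {
    target[key] = native[key];
  }
  return true;
}

const doc: Record<string, any> = { id: 'a', name: 'old' };
const found = refreshFrom(doc, { id: 'a', name: 'new' });
const missing = refreshFrom(doc, null);
```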
```diff
@@ -523,9 +785,17 @@ export class SmartDataDbDoc<T extends TImplements, TImplements, TManager extends
   */
  public async createSavableObject(): Promise<TImplements> {
    const saveableObject: unknown = {}; // is not exposed to outside, so any is ok here
-    const saveableProperties = [...this.globalSaveableProperties, ...this.saveableProperties];
+    const globalProps = this.globalSaveableProperties || [];
+    const specificProps = this.saveableProperties || [];
+    const saveableProperties = [...globalProps, ...specificProps];
+    // apply custom serialization if configured
+    const optionsMap = (this.constructor as any)._svDbOptions || {};
    for (const propertyNameString of saveableProperties) {
-      saveableObject[propertyNameString] = this[propertyNameString];
+      const rawValue = (this as any)[propertyNameString];
+      const opts = optionsMap[propertyNameString];
+      (saveableObject as any)[propertyNameString] = opts && typeof opts.serialize === 'function'
+        ? opts.serialize(rawValue)
+        : rawValue;
    }
    return saveableObject as TImplements;
  }
```
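createSavableObject() and updateFromDb() now route each field through optional serialize/deserialize functions held in a per-class `_svDbOptions` map. A round-trip sketch of that mechanism (the standalone helpers are illustrative; the options-map shape is taken from the hunks above, not from documented API):

```typescript
type FieldOptions = { serialize?: (v: any) => any; deserialize?: (v: any) => any };

// Serialize each saveable property, applying a custom serializer when configured.
function toSavable(doc: Record<string, any>, props: string[], optionsMap: Record<string, FieldOptions>) {
  const out: Record<string, any> = {};
  for (const p of props) {
    const opts = optionsMap[p];
    out[p] = opts && typeof opts.serialize === 'function' ? opts.serialize(doc[p]) : doc[p];
  }
  return out;
}

// Hydrate fields from a native doc, applying a custom deserializer when configured.
function fromNative(target: Record<string, any>, native: Record<string, any>, optionsMap: Record<string, FieldOptions>) {
  for (const key of Object.keys(native)) {
    const opts = optionsMap[key];
    target[key] = opts && typeof opts.deserialize === 'function' ? opts.deserialize(native[key]) : native[key];
  }
  return target;
}

const optionsMap: Record<string, FieldOptions> = {
  createdAt: { serialize: (d: Date) => d.toISOString(), deserialize: (s: string) => new Date(s) },
};
const saved = toSavable({ name: 'x', createdAt: new Date('2024-01-01T00:00:00.000Z') }, ['name', 'createdAt'], optionsMap);
const restored = fromNative({}, saved, optionsMap);
```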
```diff
@@ -18,7 +18,7 @@ export class EasyStore<T> {
   public nameId: string;

   @svDb()
-  public ephermal: {
+  public ephemeral: {
     activated: boolean;
     timeout: number;
   };
```
```diff
@@ -32,8 +32,8 @@ export class EasyStore<T> {
     return SmartdataEasyStore;
   })();

-  constructor(nameIdArg: string, smnartdataDbRefArg: SmartdataDb) {
-    this.smartdataDbRef = smnartdataDbRefArg;
+  constructor(nameIdArg: string, smartdataDbRefArg: SmartdataDb) {
+    this.smartdataDbRef = smartdataDbRefArg;
     this.nameId = nameIdArg;
   }
```
```diff
@@ -110,10 +110,12 @@ export class EasyStore<T> {
     await easyStore.save();
   }

-  public async cleanUpEphermal() {
-    while (
-      (await this.smartdataDbRef.statusConnectedDeferred.promise) &&
-      this.smartdataDbRef.status === 'connected'
-    ) {}
+  public async cleanUpEphemeral() {
+    // Clean up ephemeral data periodically while connected
+    while (this.smartdataDbRef.status === 'connected') {
+      await plugins.smartdelay.delayFor(60000); // Check every minute
+      // TODO: Implement actual cleanup logic for ephemeral data
+      // For now, this prevents the infinite CPU loop
+    }
   }
 }
```
```diff
@@ -2,6 +2,7 @@
  * Lucene to MongoDB query adapter for SmartData
  */
 import * as plugins from './plugins.js';
+import { logger } from './logging.js';

 // Types
 type NodeType =
```
```diff
@@ -290,11 +291,11 @@ export class LuceneParser {
     const includeLower = this.tokens[this.pos] === '[';
     const includeUpper = this.tokens[this.pos + 4] === ']';

-    this.pos++; // Skip open bracket
+    // Ensure tokens for lower, TO, upper, and closing bracket exist
     if (this.pos + 4 >= this.tokens.length) {
       throw new Error('Invalid range query syntax');
     }
+    this.pos++; // Skip open bracket

     const lower = this.tokens[this.pos];
     this.pos++;
```
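The parser fix above validates that all five range tokens are present before any of them are consumed, so a truncated query like `[1 TO` fails cleanly instead of reading past the end of the token array. The same guard in standalone form (the token layout is assumed from this hunk, not from the full parser):

```typescript
// Parse a Lucene-style range such as ['[', '1', 'TO', '5', ']'] starting at pos.
function parseRange(tokens: string[], pos: number) {
  const includeLower = tokens[pos] === '[';
  const includeUpper = tokens[pos + 4] === ']';
  // Ensure tokens for lower, TO, upper, and closing bracket exist
  // before consuming anything.
  if (pos + 4 >= tokens.length) {
    throw new Error('Invalid range query syntax');
  }
  pos++; // skip open bracket only after the length check
  const lower = tokens[pos++];
  pos++; // skip TO
  const upper = tokens[pos++];
  return { lower, upper, includeLower, includeUpper };
}

const r = parseRange(['[', '1', 'TO', '5', ']'], 0);
// r -> { lower: '1', upper: '5', includeLower: true, includeUpper: true }
```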
```diff
@@ -754,7 +755,7 @@ export class SmartdataLuceneAdapter {
       // Transform the AST to a MongoDB query
       return this.transformWithFields(ast, fieldsToSearch);
     } catch (error) {
-      console.error(`Failed to convert Lucene query "${luceneQuery}":`, error);
+      logger.log('error', `Failed to convert Lucene query "${luceneQuery}":`, error);
       throw new Error(`Failed to convert Lucene query: ${error}`);
     }
   }
 }
```
```diff
@@ -1,37 +1,73 @@
 import { SmartDataDbDoc } from './classes.doc.js';
 import * as plugins from './plugins.js';
+import { EventEmitter } from 'events';

 /**
- * a wrapper for the native mongodb cursor. Exposes better
+ * Wraps a MongoDB ChangeStream with RxJS and EventEmitter support.
  */
-export class SmartdataDbWatcher<T = any> {
+export class SmartdataDbWatcher<T = any> extends EventEmitter {
   // STATIC
   public readyDeferred = plugins.smartpromise.defer();

   // INSTANCE
   private changeStream: plugins.mongodb.ChangeStream<T>;
-  public changeSubject = new plugins.smartrx.rxjs.Subject<T>();
+  private rawSubject: plugins.smartrx.rxjs.Subject<T>;
+  /** Emits change documents (or arrays of documents if buffered) */
+  public changeSubject: any;
+  /**
+   * @param changeStreamArg native MongoDB ChangeStream
+   * @param smartdataDbDocArg document class for instance creation
+   * @param opts.bufferTimeMs optional milliseconds to buffer events via RxJS
+   */
   constructor(
     changeStreamArg: plugins.mongodb.ChangeStream<T>,
     smartdataDbDocArg: typeof SmartDataDbDoc,
+    opts?: { bufferTimeMs?: number },
   ) {
+    super();
+    this.rawSubject = new plugins.smartrx.rxjs.Subject<T>();
+    // Apply buffering if requested
+    if (opts && opts.bufferTimeMs) {
+      this.changeSubject = this.rawSubject.pipe(plugins.smartrx.rxjs.ops.bufferTime(opts.bufferTimeMs));
+    } else {
+      this.changeSubject = this.rawSubject;
+    }
     this.changeStream = changeStreamArg;
     this.changeStream.on('change', async (item: any) => {
-      if (!item.fullDocument) {
-        this.changeSubject.next(null);
-        return;
+      let docInstance: T = null;
+      if (item.fullDocument) {
+        docInstance = smartdataDbDocArg.createInstanceFromMongoDbNativeDoc(
+          item.fullDocument
+        ) as any as T;
       }
-      this.changeSubject.next(
-        smartdataDbDocArg.createInstanceFromMongoDbNativeDoc(item.fullDocument) as any as T,
-      );
+      // Notify subscribers
+      this.rawSubject.next(docInstance);
+      this.emit('change', docInstance);
     });
+    // Signal readiness after one tick
     plugins.smartdelay.delayFor(0).then(() => {
       this.readyDeferred.resolve();
     });
   }

-  public async close() {
+  /**
+   * Close the change stream, complete the RxJS subject, and remove listeners.
+   */
+  public async close(): Promise<void> {
+    // Close MongoDB ChangeStream
     await this.changeStream.close();
+    // Complete the subject to teardown any buffering operators
+    this.rawSubject.complete();
+    // Remove all EventEmitter listeners
+    this.removeAllListeners();
+  }
+
+  /**
+   * Alias for close(), matching README usage
+   */
+  public async stop(): Promise<void> {
+    return this.close();
   }
 }
```