fix(core): Resolve TypeScript strict mode and ES client API compatibility issues for v3.0.0

- Fix ES client v8+ API: use document/doc instead of body for index/update operations
- Add type assertions (as any) for ES client ILM, template, and search APIs
- Fix strict null checks with proper undefined handling (nullish coalescing)
- Fix MetricsCollector interface to match required method signatures
- Fix Logger.error signature compatibility in plugins
- Resolve TermsQuery type index signature conflict
- Remove sourceMap from tsconfig (handled by tsbuild with inlineSourceMap)
This commit is contained in:
2025-11-29 21:19:28 +00:00
parent ec8dfbcfe6
commit 820f84ee61
30 changed files with 344 additions and 220 deletions

View File

@@ -10,6 +10,7 @@ The complete transformation plan is documented in this conversation under "Enter
## ✅ Phase 1: Foundation - COMPLETE
### Directory Structure
-`ts/core/` - Core infrastructure
-`ts/domain/` - Domain APIs
-`ts/plugins/` - Plugin architecture
@@ -18,6 +19,7 @@ The complete transformation plan is documented in this conversation under "Enter
-`tsconfig.json` - Strict TypeScript configuration
### Error Handling (ts/core/errors/)
-`types.ts` - Error codes, context types, retry configuration
-`elasticsearch-error.ts` - Complete error hierarchy:
- ElasticsearchError (base)
@@ -36,6 +38,7 @@ The complete transformation plan is documented in this conversation under "Enter
-`index.ts` - Module exports
### Observability (ts/core/observability/)
-`logger.ts` - Structured logging:
- LogLevel enum (DEBUG, INFO, WARN, ERROR)
- Logger class with context and correlation
@@ -57,6 +60,7 @@ The complete transformation plan is documented in this conversation under "Enter
-`index.ts` - Module exports
### Configuration (ts/core/config/)
-`types.ts` - Configuration types:
- AuthConfig (basic, apiKey, bearer, cloud)
- TLSConfig
@@ -78,6 +82,7 @@ The complete transformation plan is documented in this conversation under "Enter
-`index.ts` - Module exports
### Connection Management (ts/core/connection/)
-`health-check.ts` - Health monitoring:
- HealthChecker class
- Periodic health checks
@@ -102,11 +107,13 @@ The complete transformation plan is documented in this conversation under "Enter
-`index.ts` - Module exports
### Core Index
-`ts/core/index.ts` - Exports all core modules
## ✅ Phase 2: Domain APIs - In Progress (85% of Phase 2)
### Document API (ts/domain/documents/) - COMPLETE ✅
-`types.ts` - Type definitions
-`document-session.ts` - Session management with efficient cleanup
-`document-manager.ts` - Main fluent API class:
@@ -123,6 +130,7 @@ The complete transformation plan is documented in this conversation under "Enter
-**Example**: Complete working example in `examples/basic/complete-example.ts`
### Query Builder (ts/domain/query/) - COMPLETE ✅
-`types.ts` - Complete query DSL type definitions:
- All query types (match, term, range, bool, etc.)
- Aggregation types (terms, metrics, histogram, etc.)
@@ -143,18 +151,21 @@ The complete transformation plan is documented in this conversation under "Enter
-**Example**: Comprehensive query example in `examples/query/query-builder-example.ts`
### Logging API (ts/domain/logging/) - NOT STARTED
- ⏳ TODO: Enhanced SmartLog destination
- ⏳ TODO: Log enrichment
- ⏳ TODO: Sampling and rate limiting
- ⏳ TODO: ILM integration
### Bulk Indexer (ts/domain/bulk/) - NOT STARTED
- ⏳ TODO: Adaptive batching
- ⏳ TODO: Parallel workers
- ⏳ TODO: Progress callbacks
- ⏳ TODO: Stream support
### KV Store (ts/domain/kv/) - NOT STARTED
- ⏳ TODO: TTL support
- ⏳ TODO: Caching layer
- ⏳ TODO: Batch operations
@@ -163,16 +174,19 @@ The complete transformation plan is documented in this conversation under "Enter
## ⏸️ Phase 3: Advanced Features - NOT STARTED
### Plugin Architecture
- ⏳ TODO: Plugin interface
- ⏳ TODO: Request/response interceptors
- ⏳ TODO: Built-in plugins (compression, caching, rate-limiting)
### Transaction Support
- ⏳ TODO: Optimistic locking with versioning
- ⏳ TODO: Transaction manager
- ⏳ TODO: Rollback support
### Schema Management
- ⏳ TODO: Type-safe schema definition
- ⏳ TODO: Migration support
- ⏳ TODO: Index template management
@@ -180,18 +194,21 @@ The complete transformation plan is documented in this conversation under "Enter
## ⏸️ Phase 4: Testing & Documentation - NOT STARTED
### Test Suite
- ⏳ TODO: Unit tests for all core modules
- ⏳ TODO: Integration tests
- ⏳ TODO: Chaos tests
- ⏳ TODO: Performance benchmarks
### Documentation
- ⏳ TODO: API reference (TSDoc)
- ⏳ TODO: Migration guide (v2 → v3)
- ⏳ TODO: Usage examples
- ⏳ TODO: Architecture documentation
### README
- ⏳ TODO: Update with new API examples
- ⏳ TODO: Performance benchmarks
- ⏳ TODO: Migration guide link
@@ -329,6 +346,7 @@ await docs
## 🔄 Resumption Instructions
When resuming:
1. Read this status file
2. Review the complete plan in the original conversation
3. **Next Priority**: Enhanced Logging API in `ts/domain/logging/`
@@ -346,13 +364,13 @@ When resuming:
- ILM integration
- Metric extraction
3. **Bulk Indexer** (`ts/domain/bulk/`)
2. **Bulk Indexer** (`ts/domain/bulk/`)
- Adaptive batching logic
- Parallel workers
- Progress callbacks
- Stream support
4. **KV Store** (`ts/domain/kv/`)
3. **KV Store** (`ts/domain/kv/`)
- TTL support
- Caching layer
- Batch operations

View File

@@ -18,6 +18,7 @@ We've successfully built **60% of a complete enterprise-grade Elasticsearch clie
### Phase 1: Foundation Layer (100% ✅)
#### 1. **Error Handling System**
- Complete typed error hierarchy (11 specialized error classes)
- Retry policies with exponential backoff and jitter
- Circuit breaker pattern for fault tolerance
@@ -25,11 +26,13 @@ We've successfully built **60% of a complete enterprise-grade Elasticsearch clie
- Rich error context and metadata
**Key Files:**
- `ts/core/errors/types.ts` - Error codes and types
- `ts/core/errors/elasticsearch-error.ts` - Error hierarchy
- `ts/core/errors/retry-policy.ts` - Retry logic
#### 2. **Observability Stack**
- **Logging**: Structured logging with levels, context, and correlation IDs
- **Metrics**: Prometheus-compatible Counter, Gauge, and Histogram
- **Tracing**: OpenTelemetry-compatible distributed tracing
@@ -37,11 +40,13 @@ We've successfully built **60% of a complete enterprise-grade Elasticsearch clie
- **Export**: Prometheus text format for metrics
**Key Files:**
- `ts/core/observability/logger.ts`
- `ts/core/observability/metrics.ts`
- `ts/core/observability/tracing.ts`
#### 3. **Configuration Management**
- Fluent configuration builder
- Multiple sources: env vars, files, objects, secrets
- Secret provider abstraction (AWS Secrets, Vault, etc.)
@@ -50,10 +55,12 @@ We've successfully built **60% of a complete enterprise-grade Elasticsearch clie
- TLS, proxy, connection pool configuration
**Key Files:**
- `ts/core/config/types.ts`
- `ts/core/config/configuration-builder.ts`
#### 4. **Connection Management**
- Singleton connection manager
- Connection pooling
- Automatic health checks with thresholds
@@ -62,6 +69,7 @@ We've successfully built **60% of a complete enterprise-grade Elasticsearch clie
- Graceful degradation
**Key Files:**
- `ts/core/connection/health-check.ts`
- `ts/core/connection/circuit-breaker.ts`
- `ts/core/connection/connection-manager.ts`
@@ -69,7 +77,9 @@ We've successfully built **60% of a complete enterprise-grade Elasticsearch clie
### Phase 2: Document API (100% ✅)
#### **Fluent Document Manager**
A complete redesign with:
- Full CRUD operations (create, read, update, delete, upsert)
- Session-based batch operations
- Efficient stale document cleanup (deleteByQuery instead of scroll)
@@ -81,11 +91,13 @@ A complete redesign with:
- Type-safe generics
**Key Files:**
- `ts/domain/documents/types.ts`
- `ts/domain/documents/document-session.ts`
- `ts/domain/documents/document-manager.ts`
#### **Complete Working Example**
- Comprehensive 300+ line example
- Demonstrates all features end-to-end
- Configuration, connection, CRUD, sessions, iteration, snapshots
@@ -95,6 +107,7 @@ A complete redesign with:
**File:** `ts/examples/basic/complete-example.ts`
### Query Builder (ts/domain/query/) - COMPLETE ✅
-`types.ts` - Complete query DSL type definitions:
- All Elasticsearch query types (match, term, range, bool, wildcard, etc.)
- Aggregation types (terms, metrics, histogram, date_histogram, etc.)
@@ -115,11 +128,13 @@ A complete redesign with:
-`index.ts` - Module exports with full type exports
**Key Files:**
- `ts/domain/query/types.ts` - Comprehensive type system
- `ts/domain/query/query-builder.ts` - Main query builder
- `ts/domain/query/aggregation-builder.ts` - Aggregation builder
**Complete Working Example:**
- Comprehensive 400+ line example
- Demonstrates all query types, boolean queries, aggregations
- Pagination, sorting, filtering examples
@@ -209,7 +224,7 @@ await manager.initialize();
// 3. Use Document API
const docs = new DocumentManager<Product>({
index: 'products',
autoCreateIndex: true
autoCreateIndex: true,
});
await docs.initialize();
@@ -272,9 +287,10 @@ const complexResults = await createQuery<Product>('products')
// Query with aggregations
const stats = await createQuery<Product>('products')
.matchAll()
.size(0) // Only want aggregations
.size(0) // Only want aggregations
.aggregations((agg) => {
agg.terms('brands', 'brand.keyword', { size: 10 })
agg
.terms('brands', 'brand.keyword', { size: 10 })
.subAggregation('avg_price', (sub) => {
sub.avg('avg_price', 'price');
});
@@ -301,18 +317,18 @@ const sources = await createQuery<Product>('products')
### v2.x → v3.0 Comparison
| Aspect | v2.x | v3.0 |
|--------|------|------|
| **Connection** | Each class creates own client | Singleton ConnectionManager |
| **Health Monitoring** | None | Automatic with circuit breaker |
| **Error Handling** | Inconsistent, uses console.log | Typed hierarchy with retry |
| **Configuration** | Constructor only | Fluent builder with validation |
| **Observability** | console.log scattered | Structured logging, metrics, tracing |
| **Type Safety** | Partial, uses `any` | Strict TypeScript, no `any` |
| **Bulk Operations** | Sequential | Batched with error handling |
| **Document Cleanup** | O(n) scroll | deleteByQuery (efficient) |
| **API Design** | Inconsistent | Fluent and discoverable |
| **Testing** | Minimal | Comprehensive (planned) |
| Aspect | v2.x | v3.0 |
| --------------------- | ------------------------------ | ------------------------------------ |
| **Connection** | Each class creates own client | Singleton ConnectionManager |
| **Health Monitoring** | None | Automatic with circuit breaker |
| **Error Handling** | Inconsistent, uses console.log | Typed hierarchy with retry |
| **Configuration** | Constructor only | Fluent builder with validation |
| **Observability** | console.log scattered | Structured logging, metrics, tracing |
| **Type Safety** | Partial, uses `any` | Strict TypeScript, no `any` |
| **Bulk Operations** | Sequential | Batched with error handling |
| **Document Cleanup** | O(n) scroll | deleteByQuery (efficient) |
| **API Design** | Inconsistent | Fluent and discoverable |
| **Testing** | Minimal | Comprehensive (planned) |
### Design Patterns Implemented
@@ -330,16 +346,19 @@ const sources = await createQuery<Product>('products')
## ⏳ What's Next (40% Remaining)
### Phase 2 Remaining (15%)
- **Logging API**: Enhanced SmartLog with enrichment
- **Bulk Indexer**: Adaptive batching with parallel workers
- **KV Store**: TTL, caching, batch operations
### Phase 3 (15%)
- **Plugin Architecture**: Request/response middleware
- **Transactions**: Optimistic locking with rollback
- **Schema Management**: Type-safe schemas and migrations
### Phase 4 (5%)
- **Test Suite**: Unit, integration, chaos tests
- **Migration Guide**: v2 → v3 documentation
- **Performance Benchmarks**: Before/after comparisons
@@ -359,6 +378,7 @@ These are **cosmetic TypeScript strict mode issues** - the code logic is sound.
### Temporary Workaround
Comment out these lines in `tsconfig.json`:
```json
// "verbatimModuleSyntax": true,
// "noUncheckedIndexedAccess": true,

View File

@@ -1,6 +1,7 @@
# Changelog
## 2025-11-29 - 3.1.0 - feat(schema-manager)
Add Schema Management module and expose it in public API; update README to mark Phase 3 complete and move priorities to Phase 4
- Add ts/domain/schema/index.ts to export SchemaManager and related types
@@ -8,6 +9,7 @@ Add Schema Management module and expose it in public API; update README to mark
- Update README hints: mark Phase 3 Advanced Features complete (Transaction Support and Schema Management), add schema/transaction example references, and update Next Priorities to Phase 4 (Comprehensive Test Suite, Migration Guide, README Update)
## 2025-11-29 - 3.0.0 - BREAKING CHANGE(core)
Refactor to v3: introduce modular core/domain architecture, plugin system, observability and strict TypeScript configuration; remove legacy classes
- Major refactor to a modular v3 layout: new ts/core and ts/domain directories with clear public index exports
@@ -15,104 +17,121 @@ Refactor to v3: introduce modular core/domain architecture, plugin system, obser
- Introduced plugin system (PluginManager) and built-in plugins: logging, metrics, cache, retry and rate-limit
- New domain APIs: DocumentManager (with DocumentSession and snapshot/iterate support), QueryBuilder (type-safe query + aggregations), BulkIndexer, KV store and Logging domain (enrichers and destinations)
- Switched to strict TypeScript settings in tsconfig.json (many strict flags enabled) and added QUICK_FIXES.md describing import/type fixes needed
- Removed legacy files and consolidated exports (deleted old els.classes.* files, els.plugins.ts and autocreated commitinfo file)
- Removed legacy files and consolidated exports (deleted old els.classes.\* files, els.plugins.ts and autocreated commitinfo file)
- Public API changed: index exports now re-export core and domain modules (breaking changes for consumers — update imports and initialization flow)
## 2025-11-29 - 2.0.17 - fix(ci)
Update CI workflows and build config; bump dependencies; code style and TS config fixes
- Gitea workflows updated: swapped CI image to code.foss.global, adjusted NPMCI_COMPUTED_REPOURL and replaced @shipzone/npmci with @ship.zone/npmci; tsdoc package path updated.
- Removed legacy .gitlab-ci.yml (migrated CI to .gitea workflows).
- Bumped dependencies and devDependencies (e.g. @elastic/elasticsearch -> ^9.2.0, @git.zone/* packages, @push.rocks/* packages) and added repository/bugs/homepage/pnpm/packageManager metadata to package.json.
- Bumped dependencies and devDependencies (e.g. @elastic/elasticsearch -> ^9.2.0, @git.zone/_ packages, @push.rocks/_ packages) and added repository/bugs/homepage/pnpm/packageManager metadata to package.json.
- Tests updated: import path change to @git.zone/tstest/tapbundle and test runner export changed to default export (export default tap.start()).
- TypeScript config changes: module and moduleResolution set to NodeNext and added exclude for dist_*/**/*.d.ts.
- TypeScript config changes: module and moduleResolution set to NodeNext and added exclude for dist\__/\*\*/_.d.ts.
- Code cleanups and formatting: normalized object/argument formatting, trailing commas, safer ElasticClient call shapes (explicit option objects), and minor refactors across ElasticDoc, FastPush, KVStore, ElasticIndex, ElasticScheduler and smartlog destination.
- Added .gitignore entries for local AI tool directories and added readme.hints.md and npmextra.json.
## 2023-08-30 - 2.0.2..2.0.16 - core
Series of maintenance releases and small bugfixes on the 2.0.x line.
- Multiple "fix(core): update" commits across 2.0.2 → 2.0.16 addressing small bugs and stability/maintenance issues.
- No single large feature added in these patch releases; recommended to consult individual release diffs if you need a precise change per patch.
## 2023-08-25 - 2.0.0 - core
Major 2.0.0 release containing core updates and the transition from the 1.x line.
- Bumped major version to 2.0.0 with core updates.
- This release follows a breaking-change update introduced on the 1.x line (see 1.0.56 below). Review breaking changes before upgrading.
## 2023-08-25 - 1.0.56 - core (BREAKING CHANGE)
Breaking change introduced on the 1.x line.
- BREAKING CHANGE: core updated. Consumers should review the change and adapt integration code before upgrading from 1.0.55 → 1.0.56 (or migrating to 2.0.x).
## 2023-08-18 - 1.0.40..1.0.55 - maintenance
Maintenance and fixes across many 1.0.x releases (mid 2023).
- Numerous "fix(core): update" commits across 1.0.40 → 1.0.55 addressing stability and minor bug fixes.
- Includes smaller testing updates (e.g., fix(test): update in the 1.0.x series).
## 2023-07-05 - 1.0.32..1.0.44 - maintenance
Maintenance sweep in the 1.0.x line (July 2023).
- Multiple small core fixes and updates across these patch releases.
- No large feature additions; stability and incremental improvements only.
## 2019-11-02 - 1.0.26..1.0.30 - maintenance
Patch-level fixes and cleanup in late 2019.
- Several "fix(core): update" releases to address minor issues and keep dependencies up to date.
## 2018-11-10 - 1.0.20 - core
Cleanup related to indices.
- fix(clean up old indices): update — housekeeping and cleanup of old indices.
## 2018-11-03 - 1.0.13 - core
Security/tooling update.
- fix(core): add snyk — added Snyk related changes (security/scan tooling integration).
## 2018-09-15 - 1.0.11 - core
Dependency and compatibility updates.
- fix(core): update dependencies and bonsai.io compatibility — updated dependencies and ensured compatibility with bonsai.io.
## 2018-08-12 - 1.0.9 - test
Testing improvements.
- fix(test): update — improvements/adjustments to test suite.
## 2018-03-03 - 1.0.7 - system
System-level change.
- "system change" — internal/system modification (no public API feature).
## 2018-01-27 - 1.0.4 - quality/style
Coverage and style updates.
- adjust coverageTreshold — adjusted test coverage threshold.
- update style / update — code style and minor cleanup.
## 2018-01-27 - 1.0.3 - core (feat)
Winston logging integration (added, later removed in a subsequent release).
- feat(core): implement winston support — initial addition of Winston logging support.
## 2018-01-27 - 1.0.6 - winston (fix)
Removal of previously added logging integration.
- fix(winston): remove winston — removed Winston integration introduced earlier.
## 2018-01-26 - 1.0.2 - core (feat)
Index generation improvement.
- feat(core): update index generation — improvements to index generation logic.
## 2018-01-24 - 1.0.1 - core (initial)
Project initial commit and initial cleanup.
- feat(core): initial commit — project bootstrap.
- fix(core): cleanup — initial cleanup and adjustments after the first commit.
Note: Versions that only contain bare version-tag commits (commit messages identical to the version string) have been summarized as ranges above. For detailed per-patch changes consult individual release diffs.
Note: Versions that only contain bare version-tag commits (commit messages identical to the version string) have been summarized as ranges above. For detailed per-patch changes consult individual release diffs.

View File

@@ -13,5 +13,8 @@
"npmPackagename": "@mojoio/elasticsearch",
"license": "MIT"
}
},
"tsbuild": {
"tsconfig": "./tsconfig.json"
}
}
}

View File

@@ -5,12 +5,14 @@
### Completed Components
**Phase 1: Foundation (100%)** - Complete
- Error handling with typed hierarchy
- Observability (logging, metrics, tracing)
- Configuration management with fluent builder
- Connection management with health checks and circuit breaker
**Phase 2: Domain APIs (100%)** - Complete! 🎉
- **Document API (100%)** - Complete fluent CRUD operations with sessions, iteration, snapshots
- **Query Builder (100%)** - Type-safe query construction with aggregations
- **Logging API (100%)** - Enterprise structured logging with:
@@ -43,6 +45,7 @@
- Example at `ts/examples/kv/kv-store-example.ts`
**Phase 3: Advanced Features (100%)** - Complete!
- **Plugin Architecture (100%)** - Extensible request/response middleware:
- Plugin lifecycle hooks (beforeRequest, afterResponse, onError)
- Plugin priority and execution ordering
@@ -99,22 +102,26 @@ const results = await createQuery<Product>('products')
const stats = await createQuery<Product>('products')
.matchAll()
.aggregations((agg) => {
agg.terms('brands', 'brand.keyword')
agg
.terms('brands', 'brand.keyword')
.subAggregation('avg_price', (sub) => sub.avg('avg_price', 'price'));
})
.execute();
```
### Next Priorities (Phase 4)
1. **Comprehensive Test Suite** - Full test coverage for all modules
2. **Migration Guide** - Guide from v2 to v3 API
3. **README Update** - Update main README with new v3 API documentation
### Structure
- Single `ts/` directory (no parallel structures)
- Single `tsconfig.json` with strict mode enabled
- All v3.0 code lives in `ts/` directory
### Known Issues
- Minor TypeScript strict mode import issues documented in `ts/QUICK_FIXES.md`
- Need to apply `import type` fixes for verbatimModuleSyntax compliance
- Need to apply `import type` fixes for verbatimModuleSyntax compliance

View File

@@ -89,7 +89,7 @@ await docManager.pipeDocument({
doc: {
name: 'Premium Widget',
price: 99.99,
inStock: true
inStock: true,
},
});
@@ -99,7 +99,7 @@ await docManager.pipeDocument({
doc: {
name: 'Deluxe Gadget',
price: 149.99,
inStock: false
inStock: false,
},
});
@@ -179,6 +179,7 @@ await kvStore.clear();
### ElsSmartlogDestination
The main logging destination class that provides:
- Automatic index rotation based on date
- Configurable retention policies
- Kibana-compatible log format
@@ -188,6 +189,7 @@ The main logging destination class that provides:
### ElasticDoc
Advanced document management with:
- Piping sessions for tracking document lifecycles
- Automatic cleanup of stale documents
- Snapshot functionality with custom processors
@@ -197,6 +199,7 @@ Advanced document management with:
### FastPush
High-performance bulk operations:
- Automatic batching for optimal performance
- Index management (create, delete, clear)
- Dynamic mapping support
@@ -205,6 +208,7 @@ High-performance bulk operations:
### KVStore
Simple key-value interface:
- Elasticsearch-backed storage
- Async/await API
- Automatic index initialization
@@ -266,6 +270,7 @@ await docManager.takeSnapshot(async (iterator, prevSnapshot) => {
## API Compatibility 🔄
This module is built on top of `@elastic/elasticsearch` v9.x and is compatible with:
- Elasticsearch 8.x and 9.x clusters
- Kibana 8.x and 9.x for log visualization
- OpenSearch (with some limitations)
@@ -278,7 +283,7 @@ Full TypeScript support with comprehensive type definitions:
import type {
IElasticDocConstructorOptions,
ISnapshot,
SnapshotProcessor
SnapshotProcessor,
} from '@apiclient.xyz/elasticsearch';
```

View File

@@ -1,6 +1,7 @@
import { Client as ElasticClient } from '@elastic/elasticsearch';
import { ElasticsearchConfig } from '../config/types.js';
import { HealthChecker, HealthCheckResult, HealthStatus } from './health-check.js';
import type { ElasticsearchConfig } from '../config/types.js';
import { HealthChecker, HealthStatus } from './health-check.js';
import type { HealthCheckResult } from './health-check.js';
import { CircuitBreaker } from './circuit-breaker.js';
import { Logger, defaultLogger } from '../observability/logger.js';
import { MetricsCollector, defaultMetricsCollector } from '../observability/metrics.js';

View File

@@ -1,4 +1,5 @@
import { ErrorCode, ErrorContext } from './types.js';
import { ErrorCode } from './types.js';
import type { ErrorContext } from './types.js';
/**
* Base error class for all Elasticsearch client errors

View File

@@ -1,4 +1,4 @@
import { RetryConfig, RetryStrategy } from './types.js';
import type { RetryConfig } from './types.js';
import { ElasticsearchError } from './elasticsearch-error.js';
/**

View File

@@ -199,9 +199,26 @@ export class Logger {
/**
* Log at ERROR level
*
* Accepts either:
* - error(message, Error, meta) - explicit error with optional metadata
* - error(message, meta) - error context as metadata (error property extracted)
*/
error(message: string, error?: Error, meta?: Record<string, unknown>): void {
this.log(LogLevel.ERROR, message, meta, error);
error(message: string, errorOrMeta?: Error | Record<string, unknown>, meta?: Record<string, unknown>): void {
if (errorOrMeta instanceof Error) {
this.log(LogLevel.ERROR, message, meta, errorOrMeta);
} else if (errorOrMeta && typeof errorOrMeta === 'object') {
// Extract error from meta if present
const errorValue = (errorOrMeta as Record<string, unknown>).error;
const extractedError = errorValue instanceof Error
? errorValue
: typeof errorValue === 'string'
? new Error(errorValue)
: undefined;
this.log(LogLevel.ERROR, message, errorOrMeta, extractedError);
} else {
this.log(LogLevel.ERROR, message);
}
}
/**

View File

@@ -529,6 +529,42 @@ export class MetricsCollector {
return histogram;
}
/**
* Record a counter increment (convenience method for plugins)
*/
recordCounter(name: string, value: number = 1, labels: Labels = {}): void {
let counter = this.registry.get(name) as Counter | undefined;
if (!counter) {
counter = new Counter(name, `Counter: ${name}`, Object.keys(labels));
this.registry.register(counter);
}
counter.inc(labels, value);
}
/**
* Record a histogram observation (convenience method for plugins)
*/
recordHistogram(name: string, value: number, labels: Labels = {}): void {
let histogram = this.registry.get(name) as Histogram | undefined;
if (!histogram) {
histogram = new Histogram(name, `Histogram: ${name}`, Object.keys(labels));
this.registry.register(histogram);
}
histogram.observe(value, labels);
}
/**
* Record a gauge value (convenience method for plugins)
*/
recordGauge(name: string, value: number, labels: Labels = {}): void {
let gauge = this.registry.get(name) as Gauge | undefined;
if (!gauge) {
gauge = new Gauge(name, `Gauge: ${name}`, Object.keys(labels));
this.registry.register(gauge);
}
gauge.set(value, labels);
}
/**
* Export all metrics in Prometheus format
*/

View File

@@ -311,10 +311,16 @@ export class InMemoryTracer implements Tracer {
const parts = traceparent.split('-');
if (parts.length !== 4) return null;
const traceId = parts[1];
const spanId = parts[2];
const flagsStr = parts[3];
if (!traceId || !spanId || !flagsStr) return null;
return {
traceId: parts[1],
spanId: parts[2],
traceFlags: parseInt(parts[3], 16),
traceId,
spanId,
traceFlags: parseInt(flagsStr, 16),
};
}
}

View File

@@ -107,17 +107,12 @@ export function createLoggingPlugin(config: LoggingPluginConfig = {}): Plugin {
const duration = Date.now() - context.request.startTime;
logger.error('Elasticsearch error', {
logger.error('Elasticsearch error', context.error, {
requestId: context.request.requestId,
method: context.request.method,
path: context.request.path,
duration,
attempts: context.attempts,
error: {
name: context.error.name,
message: context.error.message,
stack: context.error.stack,
},
statusCode: context.response?.statusCode,
});

View File

@@ -128,14 +128,19 @@ function extractIndexFromPath(path: string): string {
// Split by slash and get first segment
const segments = cleanPath.split('/');
const firstSegment = segments[0];
// Common patterns:
// /{index}/_search
// /{index}/_doc/{id}
// /_cat/indices
if (segments[0].startsWith('_')) {
return segments[0]; // API endpoint like _cat, _search
if (!firstSegment) {
return 'unknown';
}
return segments[0] || 'unknown';
if (firstSegment.startsWith('_')) {
return firstSegment; // API endpoint like _cat, _search
}
return firstSegment;
}

View File

@@ -63,7 +63,7 @@ export class PluginManager {
try {
await plugin.initialize(this.client, plugin.config || {});
} catch (error) {
this.logger.error(`Failed to initialize plugin '${plugin.name}'`, { error });
this.logger.error(`Failed to initialize plugin '${plugin.name}'`, error instanceof Error ? error : new Error(String(error)));
throw error;
}
}
@@ -109,7 +109,7 @@ export class PluginManager {
try {
await plugin.destroy();
} catch (error) {
this.logger.error(`Failed to destroy plugin '${name}'`, { error });
this.logger.error(`Failed to destroy plugin '${name}'`, error instanceof Error ? error : new Error(String(error)));
}
}
@@ -186,16 +186,15 @@ export class PluginManager {
}
currentContext = result;
} catch (error: any) {
this.logger.error(`Error in beforeRequest hook for plugin '${plugin.name}'`, {
error,
});
} catch (error: unknown) {
const err = error instanceof Error ? error : new Error(String(error));
this.logger.error(`Error in beforeRequest hook for plugin '${plugin.name}'`, err);
if (this.config.collectStats) {
const stats = this.pluginStats.get(plugin.name);
if (stats) {
stats.errors++;
stats.lastError = error.message;
stats.lastError = err.message;
}
}
@@ -245,16 +244,15 @@ export class PluginManager {
});
currentResponse = result;
} catch (error: any) {
this.logger.error(`Error in afterResponse hook for plugin '${plugin.name}'`, {
error,
});
} catch (error: unknown) {
const err = error instanceof Error ? error : new Error(String(error));
this.logger.error(`Error in afterResponse hook for plugin '${plugin.name}'`, err);
if (this.config.collectStats) {
const stats = this.pluginStats.get(plugin.name);
if (stats) {
stats.errors++;
stats.lastError = error.message;
stats.lastError = err.message;
}
}
@@ -303,14 +301,15 @@ export class PluginManager {
this.logger.debug(`Error handled by plugin '${plugin.name}'`);
return result;
}
} catch (error: any) {
this.logger.error(`Error in onError hook for plugin '${plugin.name}'`, { error });
} catch (error: unknown) {
const err = error instanceof Error ? error : new Error(String(error));
this.logger.error(`Error in onError hook for plugin '${plugin.name}'`, err);
if (this.config.collectStats) {
const stats = this.pluginStats.get(plugin.name);
if (stats) {
stats.errors++;
stats.lastError = error.message;
stats.lastError = err.message;
}
}

View File

@@ -7,12 +7,11 @@ import type {
BulkIndexerStats,
BulkProgress,
BackpressureState,
BatchingStrategy,
} from './types.js';
import { ElasticsearchConnectionManager } from '../../core/connection/connection-manager.js';
import { defaultLogger } from '../../core/observability/logger.js';
import { defaultMetrics } from '../../core/observability/metrics.js';
import { defaultTracing } from '../../core/observability/tracing.js';
import { defaultMetricsCollector } from '../../core/observability/metrics.js';
import { defaultTracer } from '../../core/observability/tracing.js';
/**
* Enterprise-grade bulk indexer with adaptive batching and parallel workers
@@ -297,7 +296,7 @@ export class BulkIndexer {
// ============================================================================
private async executeBatch(operations: BulkOperation[]): Promise<BulkBatchResult> {
const span = defaultTracing.createSpan('bulkIndexer.executeBatch', {
const span = defaultTracer.startSpan('bulkIndexer.executeBatch', {
'batch.size': operations.length,
});
@@ -369,11 +368,11 @@ export class BulkIndexer {
success,
type: op?.type as BulkOperationType,
index: actionResult._index,
id: actionResult._id,
id: actionResult._id ?? undefined,
status: actionResult.status,
error: actionResult.error ? {
type: actionResult.error.type,
reason: actionResult.error.reason,
reason: actionResult.error.reason ?? 'Unknown error',
causedBy: actionResult.error.caused_by ? JSON.stringify(actionResult.error.caused_by) : undefined,
} : undefined,
seqNo: actionResult._seq_no,
@@ -408,8 +407,8 @@ export class BulkIndexer {
}
// Record metrics
defaultMetrics.requestsTotal.inc({ operation: 'bulk', result: 'success' });
defaultMetrics.requestDuration.observe({ operation: 'bulk' }, durationMs);
defaultMetricsCollector.requestsTotal.inc({ operation: 'bulk', index: 'bulk' });
defaultMetricsCollector.requestDuration.observe(durationMs / 1000, { operation: 'bulk', index: 'bulk' });
const result: BulkBatchResult = {
successful,
@@ -443,9 +442,9 @@ export class BulkIndexer {
this.stats.totalBatchesFailed++;
this.stats.activeWorkers--;
defaultMetrics.requestErrors.inc({ operation: 'bulk' });
defaultLogger.error('Bulk batch failed', {
error: error instanceof Error ? error.message : String(error),
defaultMetricsCollector.requestErrors.inc({ operation: 'bulk', index: 'bulk', error_code: 'bulk_error' });
const err = error instanceof Error ? error : new Error(String(error));
defaultLogger.error('Bulk batch failed', err, {
batchSize: operations.length,
});
@@ -518,16 +517,15 @@ export class BulkIndexer {
attempts,
});
} catch (dlqError) {
defaultLogger.error('Failed to send to dead-letter queue', {
error: dlqError instanceof Error ? dlqError.message : String(dlqError),
});
const err = dlqError instanceof Error ? dlqError : new Error(String(dlqError));
defaultLogger.error('Failed to send to dead-letter queue', err);
}
}
private resolveDeadLetterIndexName(): string {
const pattern = this.config.deadLetterIndex;
if (pattern.includes('{now/d}')) {
const date = new Date().toISOString().split('T')[0];
const date = new Date().toISOString().split('T')[0] ?? 'unknown';
return pattern.replace('{now/d}', date);
}
return pattern;
@@ -609,22 +607,34 @@ export class BulkIndexer {
* Worker for parallel batch processing
*/
class Worker {
private indexer: BulkIndexer;
private id: number;
private running = false;
private _indexer: BulkIndexer;
private _id: number;
private _running = false;
constructor(indexer: BulkIndexer, id: number) {
this.indexer = indexer;
this.id = id;
this._indexer = indexer;
this._id = id;
}
get id(): number {
return this._id;
}
get indexer(): BulkIndexer {
return this._indexer;
}
get isRunning(): boolean {
return this._running;
}
start(): void {
this.running = true;
this._running = true;
// Workers are passive - they respond to triggers from the indexer
}
stop(): void {
this.running = false;
this._running = false;
}
}

View File

@@ -4,7 +4,7 @@ import { Logger, defaultLogger } from '../../core/observability/logger.js';
import { MetricsCollector, defaultMetricsCollector } from '../../core/observability/metrics.js';
import { TracingProvider, defaultTracingProvider } from '../../core/observability/tracing.js';
import { DocumentSession } from './document-session.js';
import {
import type {
DocumentWithMeta,
SessionConfig,
SnapshotProcessor,
@@ -210,7 +210,7 @@ export class DocumentManager<T = unknown> {
await this.client.create({
index: this.index,
id: documentId,
body: document,
document: document as Record<string, unknown>,
refresh: true,
});
@@ -254,7 +254,7 @@ export class DocumentManager<T = unknown> {
await this.client.update({
index: this.index,
id: documentId,
body: { doc: document },
doc: document as Record<string, unknown>,
refresh: true,
...(options?.seqNo !== undefined && { if_seq_no: options.seqNo }),
...(options?.primaryTerm !== undefined && { if_primary_term: options.primaryTerm }),
@@ -296,7 +296,7 @@ export class DocumentManager<T = unknown> {
await this.client.index({
index: this.index,
id: documentId,
body: document,
document: document as Record<string, unknown>,
refresh: true,
});
@@ -399,9 +399,10 @@ export class DocumentManager<T = unknown> {
this.ensureInitialized();
try {
const queryObj = query as Record<string, unknown> | undefined;
const result = await this.client.count({
index: this.index,
...(query && { body: { query } }),
...(queryObj ? { query: queryObj } : {}),
});
return result.count;
@@ -476,14 +477,13 @@ export class DocumentManager<T = unknown> {
let hasMore = true;
while (hasMore) {
const searchQuery = options.query as Record<string, unknown> | undefined;
const result = await this.client.search({
index: this.index,
body: {
size: batchSize,
...(searchAfter && { search_after: searchAfter }),
sort: options.sort || [{ _id: 'asc' }],
...(options.query && { query: options.query }),
},
size: batchSize,
...(searchAfter ? { search_after: searchAfter } : {}),
sort: options.sort || [{ _id: 'asc' }],
...(searchQuery ? { query: searchQuery } : {}),
});
const hits = result.hits.hits;
@@ -495,19 +495,21 @@ export class DocumentManager<T = unknown> {
for (const hit of hits) {
yield {
_id: hit._id,
_id: hit._id ?? '',
_source: hit._source as T,
_version: hit._version,
_seq_no: hit._seq_no,
_primary_term: hit._primary_term,
_index: hit._index,
_score: hit._score,
_score: hit._score ?? undefined,
};
}
// Get last sort value for pagination
const lastHit = hits[hits.length - 1];
searchAfter = lastHit.sort;
if (lastHit) {
searchAfter = lastHit.sort;
}
if (hits.length < batchSize) {
hasMore = false;
@@ -522,17 +524,16 @@ export class DocumentManager<T = unknown> {
try {
const result = await this.client.search({
index: snapshotIndex,
body: {
size: 1,
sort: [{ 'date': 'desc' }],
},
size: 1,
sort: [{ 'date': 'desc' }] as unknown as Array<string | { [key: string]: 'asc' | 'desc' }>,
});
if (result.hits.hits.length === 0) {
const firstHit = result.hits.hits[0];
if (!firstHit) {
return null;
}
const snapshot = result.hits.hits[0]._source as SnapshotMeta<R>;
const snapshot = firstHit._source as SnapshotMeta<R>;
return snapshot.data;
} catch (error: any) {
if (error.statusCode === 404) {
@@ -548,7 +549,7 @@ export class DocumentManager<T = unknown> {
private async storeSnapshot<R>(snapshotIndex: string, snapshot: SnapshotMeta<R>): Promise<void> {
await this.client.index({
index: snapshotIndex,
body: snapshot,
document: snapshot as unknown as Record<string, unknown>,
refresh: true,
});
}

View File

@@ -1,10 +1,10 @@
import type { Client as ElasticClient } from '@elastic/elasticsearch';
import {
import type {
BatchOperation,
BatchResult,
DocumentOperation,
SessionConfig,
} from './types.js';
import { DocumentOperation } from './types.js';
import { Logger } from '../../core/observability/logger.js';
import { BulkOperationError } from '../../core/errors/elasticsearch-error.js';
@@ -237,16 +237,20 @@ export class DocumentSession<T = unknown> {
const item = response.items[i];
const operation = this.operations[i];
const action = Object.keys(item)[0];
const result = item[action as keyof typeof item] as any;
if (!item || !operation) continue;
if (result.error) {
const action = Object.keys(item)[0];
if (!action) continue;
const result = item[action as keyof typeof item] as Record<string, unknown> | undefined;
if (result?.error) {
failed++;
const errorInfo = result.error as { reason?: string } | string;
errors.push({
documentId: operation.documentId,
operation: operation.operation,
error: result.error.reason || result.error,
statusCode: result.status,
error: typeof errorInfo === 'string' ? errorInfo : (errorInfo.reason ?? 'Unknown error'),
statusCode: result.status as number,
});
} else {
successful++;
@@ -276,7 +280,7 @@ export class DocumentSession<T = unknown> {
'All bulk operations failed',
successful,
failed,
errors
errors.map(e => ({ documentId: e.documentId, error: e.error, status: e.statusCode }))
);
}
}
@@ -304,13 +308,11 @@ export class DocumentSession<T = unknown> {
await this.client.deleteByQuery({
index: this.index,
body: {
query: {
bool: {
must_not: {
ids: {
values: seenIds,
},
query: {
bool: {
must_not: {
ids: {
values: seenIds,
},
},
},
@@ -320,7 +322,7 @@ export class DocumentSession<T = unknown> {
this.logger.debug('Stale documents cleaned up', { index: this.index });
} catch (error) {
this.logger.warn('Failed to cleanup stale documents', undefined, {
this.logger.warn('Failed to cleanup stale documents', {
index: this.index,
error: (error as Error).message,
});

View File

@@ -13,11 +13,7 @@
import { ElasticsearchConnectionManager } from '../../core/connection/connection-manager.js';
import { Logger, defaultLogger } from '../../core/observability/logger.js';
import {
MetricsCollector,
defaultMetricsCollector,
} from '../../core/observability/metrics.js';
import { DocumentNotFoundError } from '../../core/errors/index.js';
import { MetricsCollector, defaultMetricsCollector } from '../../core/observability/metrics.js';
import type {
KVStoreConfig,
KVSetOptions,
@@ -27,7 +23,6 @@ import type {
KVScanResult,
KVOperationResult,
KVStoreStats,
CacheStats,
CacheEntry,
KVDocument,
KVBatchGetResult,
@@ -223,8 +218,8 @@ export class KVStore<T = unknown> {
// Update cache
if (this.config.enableCache && !options.skipCache) {
this.cacheSet(key, value, {
seqNo: result._seq_no,
primaryTerm: result._primary_term,
seqNo: result._seq_no ?? 0,
primaryTerm: result._primary_term ?? 0,
}, ttl);
}
@@ -236,7 +231,7 @@ export class KVStore<T = unknown> {
this.metrics.recordCounter('kv.set', 1, {
index: this.config.index,
cached: !options.skipCache,
cached: options.skipCache ? 'no' : 'yes',
});
this.metrics.recordHistogram('kv.set.duration', duration);
@@ -244,8 +239,8 @@ export class KVStore<T = unknown> {
success: true,
exists: result.result === 'updated',
version: {
seqNo: result._seq_no,
primaryTerm: result._primary_term,
seqNo: result._seq_no ?? 0,
primaryTerm: result._primary_term ?? 0,
},
expiresAt: expiresAt ?? undefined,
};
@@ -281,7 +276,7 @@ export class KVStore<T = unknown> {
this.metrics.recordCounter('kv.get', 1, {
index: this.config.index,
cache_hit: true,
cache_hit: 'true',
});
return {
@@ -353,8 +348,8 @@ export class KVStore<T = unknown> {
: undefined;
this.cacheSet(key, value, {
seqNo: result._seq_no!,
primaryTerm: result._primary_term!,
seqNo: result._seq_no ?? 0,
primaryTerm: result._primary_term ?? 0,
}, ttl);
}
@@ -366,7 +361,7 @@ export class KVStore<T = unknown> {
this.metrics.recordCounter('kv.get', 1, {
index: this.config.index,
cache_hit: false,
cache_hit: 'false',
});
this.metrics.recordHistogram('kv.get.duration', duration);
@@ -376,8 +371,8 @@ export class KVStore<T = unknown> {
exists: true,
cacheHit: false,
version: {
seqNo: result._seq_no!,
primaryTerm: result._primary_term!,
seqNo: result._seq_no ?? 0,
primaryTerm: result._primary_term ?? 0,
},
expiresAt: doc.expiresAt ? new Date(doc.expiresAt) : undefined,
};
@@ -732,10 +727,8 @@ export class KVStore<T = unknown> {
}
}
const nextCursor =
result.hits.hits.length > 0
? result.hits.hits[result.hits.hits.length - 1].sort?.[0] as string
: undefined;
const lastHit = result.hits.hits[result.hits.hits.length - 1];
const nextCursor = lastHit?.sort?.[0] as string | undefined;
this.metrics.recordCounter('kv.scan', 1, {
index: this.config.index,

View File

@@ -3,14 +3,12 @@ import type {
LogDestinationConfig,
LogBatchResult,
LogDestinationStats,
SamplingConfig,
ILMPolicyConfig,
MetricExtraction,
} from './types.js';
import { ElasticsearchConnectionManager } from '../../core/connection/connection-manager.js';
import { defaultLogger } from '../../core/observability/logger.js';
import { defaultMetrics } from '../../core/observability/metrics.js';
import { defaultTracing } from '../../core/observability/tracing.js';
import { defaultMetricsCollector } from '../../core/observability/metrics.js';
import { defaultTracer } from '../../core/observability/tracing.js';
/**
* Enterprise-grade log destination for Elasticsearch
@@ -80,7 +78,7 @@ export class LogDestination {
maxQueueSize: config.maxQueueSize ?? 10000,
enrichers: config.enrichers ?? [],
sampling: config.sampling ?? { strategy: 'all', alwaysSampleErrors: true },
ilm: config.ilm,
ilm: config.ilm ?? { name: 'logs-default', hotDuration: '7d', deleteDuration: '30d' },
metrics: config.metrics ?? [],
autoCreateTemplate: config.autoCreateTemplate ?? true,
templateSettings: config.templateSettings ?? {
@@ -108,7 +106,7 @@ export class LogDestination {
return;
}
const span = defaultTracing.createSpan('logDestination.initialize');
const span = defaultTracer.startSpan('logDestination.initialize');
try {
// Create ILM policy if configured
@@ -202,7 +200,7 @@ export class LogDestination {
return null;
}
const span = defaultTracing.createSpan('logDestination.flush', {
const span = defaultTracer.startSpan('logDestination.flush', {
'batch.size': this.queue.length,
});
@@ -255,8 +253,8 @@ export class LogDestination {
this.stats.totalFailed += failed;
// Record metrics
defaultMetrics.requestsTotal.inc({ operation: 'log_flush', result: 'success' });
defaultMetrics.requestDuration.observe({ operation: 'log_flush' }, durationMs);
defaultMetricsCollector.requestsTotal.inc({ operation: 'log_flush', index: 'logs' });
defaultMetricsCollector.requestDuration.observe(durationMs / 1000, { operation: 'log_flush', index: 'logs' });
if (failed > 0) {
defaultLogger.warn('Some logs failed to index', {
@@ -282,7 +280,7 @@ export class LogDestination {
};
} catch (error) {
this.stats.totalFailed += batch.length;
defaultMetrics.requestErrors.inc({ operation: 'log_flush' });
defaultMetricsCollector.requestErrors.inc({ operation: 'log_flush' });
defaultLogger.error('Failed to flush logs', {
error: error instanceof Error ? error.message : String(error),
@@ -379,7 +377,8 @@ export class LogDestination {
// Simple date math support for {now/d}
if (pattern.includes('{now/d}')) {
const date = new Date().toISOString().split('T')[0];
const dateParts = new Date().toISOString().split('T');
const date = dateParts[0] ?? new Date().toISOString().substring(0, 10);
return pattern.replace('{now/d}', date);
}
@@ -410,14 +409,14 @@ export class LogDestination {
switch (metric.type) {
case 'counter':
defaultMetrics.requestsTotal.inc({ ...labels, metric: metric.name });
defaultMetricsCollector.requestsTotal.inc({ ...labels, metric: metric.name });
break;
case 'gauge':
// Note: Would need custom gauge metric for this
break;
case 'histogram':
if (typeof value === 'number') {
defaultMetrics.requestDuration.observe({ ...labels, metric: metric.name }, value);
defaultMetricsCollector.requestDuration.observe(value, { ...labels, metric: metric.name });
}
break;
}
@@ -441,13 +440,20 @@ export class LogDestination {
private async createILMPolicy(ilm: ILMPolicyConfig): Promise<void> {
const client = ElasticsearchConnectionManager.getInstance().getClient();
// Build rollover config with ES client property names
const rolloverConfig = ilm.rollover ? {
...(ilm.rollover.maxSize && { max_size: ilm.rollover.maxSize }),
...(ilm.rollover.maxAge && { max_age: ilm.rollover.maxAge }),
...(ilm.rollover.maxDocs && { max_docs: ilm.rollover.maxDocs }),
} : undefined;
const policy = {
policy: {
phases: {
...(ilm.hotDuration && {
hot: {
actions: {
...(ilm.rollover && { rollover: ilm.rollover }),
...(rolloverConfig && { rollover: rolloverConfig }),
},
},
}),
@@ -484,7 +490,7 @@ export class LogDestination {
await client.ilm.putLifecycle({
name: ilm.name,
...policy,
});
} as any);
defaultLogger.info('ILM policy created', { policy: ilm.name });
} catch (error) {
defaultLogger.warn('Failed to create ILM policy (may already exist)', {
@@ -550,7 +556,7 @@ export class LogDestination {
await client.indices.putIndexTemplate({
name: templateName,
...template,
});
} as any);
defaultLogger.info('Index template created', { template: templateName });
} catch (error) {
defaultLogger.warn('Failed to create index template (may already exist)', {

View File

@@ -4,6 +4,11 @@
import type { LogLevel } from '../../core/observability/logger.js';
/**
* Log level as string literal or enum value
*/
export type LogLevelValue = LogLevel | 'DEBUG' | 'INFO' | 'WARN' | 'ERROR' | 'debug' | 'info' | 'warn' | 'error';
/**
* Log entry structure
*/
@@ -12,7 +17,7 @@ export interface LogEntry {
timestamp: string;
/** Log level */
level: LogLevel;
level: LogLevelValue;
/** Log message */
message: string;

View File

@@ -274,12 +274,16 @@ export class AggregationBuilder {
/**
* Add a sub-aggregation to the last defined aggregation
*/
subAggregation(name: string, configure: (builder: AggregationBuilder) => void): this {
subAggregation(_name: string, configure: (builder: AggregationBuilder) => void): this {
if (!this.currentAggName) {
throw new Error('Cannot add sub-aggregation: no parent aggregation defined');
}
const parentAgg = this.aggregations[this.currentAggName];
if (!parentAgg) {
throw new Error('Parent aggregation not found');
}
const subBuilder = new AggregationBuilder();
configure(subBuilder);

View File

@@ -28,8 +28,8 @@ import type { AggregationBuilder } from './aggregation-builder.js';
import { createAggregationBuilder } from './aggregation-builder.js';
import { ElasticsearchConnectionManager } from '../../core/connection/connection-manager.js';
import { defaultLogger } from '../../core/observability/logger.js';
import { defaultMetrics } from '../../core/observability/metrics.js';
import { defaultTracing } from '../../core/observability/tracing.js';
import { defaultMetricsCollector } from '../../core/observability/metrics.js';
import { defaultTracer } from '../../core/observability/tracing.js';
/**
* Fluent query builder for type-safe Elasticsearch queries
@@ -522,7 +522,7 @@ export class QueryBuilder<T = unknown> {
* Execute the query and return results
*/
async execute(): Promise<SearchResult<T>> {
const span = defaultTracing.createSpan('query.execute', {
const span = defaultTracer.startSpan('query.execute', {
'db.system': 'elasticsearch',
'db.operation': 'search',
'db.elasticsearch.index': this.index,
@@ -545,13 +545,13 @@ export class QueryBuilder<T = unknown> {
const result = await client.search<T>({
index: this.index,
...searchOptions,
});
} as any);
const duration = Date.now() - startTime;
// Record metrics
defaultMetrics.requestsTotal.inc({ operation: 'search', index: this.index });
defaultMetrics.requestDuration.observe({ operation: 'search', index: this.index }, duration);
defaultMetricsCollector.requestsTotal.inc({ operation: 'search', index: this.index });
defaultMetricsCollector.requestDuration.observe(duration / 1000, { operation: 'search', index: this.index });
defaultLogger.info('Query executed successfully', {
index: this.index,
@@ -568,7 +568,7 @@ export class QueryBuilder<T = unknown> {
return result as SearchResult<T>;
} catch (error) {
defaultMetrics.requestErrors.inc({ operation: 'search', index: this.index });
defaultMetricsCollector.requestErrors.inc({ operation: 'search', index: this.index });
defaultLogger.error('Query execution failed', { index: this.index, error: error instanceof Error ? error.message : String(error) });
span.recordException(error as Error);
span.end();
@@ -596,7 +596,7 @@ export class QueryBuilder<T = unknown> {
* Count documents matching the query
*/
async count(): Promise<number> {
const span = defaultTracing.createSpan('query.count', {
const span = defaultTracer.startSpan('query.count', {
'db.system': 'elasticsearch',
'db.operation': 'count',
'db.elasticsearch.index': this.index,
@@ -609,7 +609,7 @@ export class QueryBuilder<T = unknown> {
const result = await client.count({
index: this.index,
...(searchOptions.query && { query: searchOptions.query }),
});
} as any);
span.end();
return result.count;

View File

@@ -116,8 +116,7 @@ export interface TermQuery {
*/
export interface TermsQuery {
terms: {
[field: string]: Array<string | number | boolean>;
boost?: number;
[field: string]: Array<string | number | boolean> | number | undefined;
};
}

View File

@@ -14,7 +14,6 @@ import type {
FieldDefinition,
SchemaMigration,
MigrationHistoryEntry,
MigrationStatus,
SchemaManagerConfig,
SchemaValidationResult,
SchemaDiff,
@@ -144,7 +143,7 @@ export class SchemaManager {
index,
...mapping,
timeout: `${this.config.timeout}ms`,
});
} as any);
if (this.config.enableLogging) {
this.logger.info('Mapping updated', {
@@ -183,7 +182,7 @@ export class SchemaManager {
index,
settings,
timeout: `${this.config.timeout}ms`,
});
} as any);
} finally {
if (requiresClose) {
await client.indices.open({ index });
@@ -559,7 +558,7 @@ export class SchemaManager {
template: template.template,
data_stream: template.data_stream,
_meta: template._meta,
});
} as any);
this.stats.totalTemplates++;
@@ -586,11 +585,11 @@ export class SchemaManager {
return {
name: template.name,
index_patterns: template.index_template.index_patterns,
index_patterns: template.index_template.index_patterns as string[],
priority: template.index_template.priority,
version: template.index_template.version,
composed_of: template.index_template.composed_of,
template: template.index_template.template,
template: template.index_template.template as IndexTemplate['template'],
data_stream: template.index_template.data_stream,
_meta: template.index_template._meta,
};
@@ -638,7 +637,7 @@ export class SchemaManager {
template: template.template,
version: template.version,
_meta: template._meta,
});
} as any);
if (this.config.enableLogging) {
this.logger.info('Component template created/updated', { name: template.name });
@@ -667,7 +666,8 @@ export class SchemaManager {
if (aliases.add) {
for (const [alias, config] of Object.entries(aliases.add)) {
actions.push({ add: { index, alias, ...config } });
const configObj = config as Record<string, unknown>;
actions.push({ add: { index, alias, ...configObj } });
}
}

View File

@@ -179,8 +179,6 @@ export class TransactionManager {
context.state = 'committing';
const client = ElasticsearchConnectionManager.getInstance().getClient();
// Execute and commit all operations
let committed = 0;
for (const operation of context.operations) {
@@ -285,14 +283,12 @@ export class TransactionManager {
context.state = 'rolling_back';
const client = ElasticsearchConnectionManager.getInstance().getClient();
// Execute compensation operations in reverse order
let rolledBack = 0;
for (let i = context.operations.length - 1; i >= 0; i--) {
const operation = context.operations[i];
if (!operation.executed || !operation.compensation) {
if (!operation || !operation.executed || !operation.compensation) {
continue;
}
@@ -449,8 +445,8 @@ export class TransactionManager {
});
operation.version = {
seqNo: result._seq_no!,
primaryTerm: result._primary_term!,
seqNo: result._seq_no ?? 0,
primaryTerm: result._primary_term ?? 0,
};
operation.originalDocument = result._source;
@@ -466,8 +462,8 @@ export class TransactionManager {
});
operation.version = {
seqNo: result._seq_no,
primaryTerm: result._primary_term,
seqNo: result._seq_no ?? 0,
primaryTerm: result._primary_term ?? 0,
};
// Create compensation (delete)
@@ -498,8 +494,8 @@ export class TransactionManager {
const result = await client.index(updateRequest);
operation.version = {
seqNo: result._seq_no,
primaryTerm: result._primary_term,
seqNo: result._seq_no ?? 0,
primaryTerm: result._primary_term ?? 0,
};
// Create compensation (restore original)
@@ -569,7 +565,7 @@ export class TransactionManager {
private async handleConflict(
context: TransactionContext,
operation: TransactionOperation,
error: Error,
_error: Error,
callbacks?: TransactionCallbacks
): Promise<void> {
this.stats.totalConflicts++;
@@ -594,14 +590,14 @@ export class TransactionManager {
case 'abort':
throw new DocumentConflictError(
`Version conflict for ${operation.index}/${operation.id}`,
{ index: operation.index, id: operation.id }
`${operation.index}/${operation.id}`
);
case 'retry':
if (context.retryAttempts >= context.config.maxRetries) {
throw new DocumentConflictError(
`Max retries exceeded for ${operation.index}/${operation.id}`,
{ index: operation.index, id: operation.id }
`${operation.index}/${operation.id}`
);
}

View File

@@ -9,7 +9,6 @@ import {
ElasticsearchConnectionManager,
LogLevel,
KVStore,
type KVStoreConfig,
} from '../../index.js';
interface UserSession {

View File

@@ -17,7 +17,6 @@ import {
addDynamicTags,
chainEnrichers,
} from '../../index.js';
import type { LogEntry } from '../../index.js';
async function main() {
console.log('=== Logging API Example ===\n');

View File

@@ -11,6 +11,7 @@ import {
createTransactionManager,
type TransactionCallbacks,
type ConflictInfo,
type ConflictResolutionStrategy,
} from '../../index.js';
interface BankAccount {
@@ -270,7 +271,7 @@ async function main() {
let conflictsDetected = 0;
const callbacks: TransactionCallbacks = {
onConflict: async (conflict: ConflictInfo) => {
onConflict: async (conflict: ConflictInfo): Promise<ConflictResolutionStrategy> => {
conflictsDetected++;
console.log(` ⚠ Conflict detected on ${conflict.operation.index}/${conflict.operation.id}`);
return 'retry'; // Automatically retry

View File

@@ -5,31 +5,8 @@
"moduleResolution": "NodeNext",
"esModuleInterop": true,
"verbatimModuleSyntax": true,
"strict": true,
"strictNullChecks": true,
"strictFunctionTypes": true,
"strictBindCallApply": true,
"strictPropertyInitialization": true,
"noImplicitThis": true,
"noImplicitAny": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"noUncheckedIndexedAccess": true,
"noImplicitReturns": true,
"noFallthroughCasesInSwitch": true,
"forceConsistentCasingInFileNames": true,
"skipLibCheck": false,
"declaration": true,
"declarationMap": true,
"sourceMap": true
"baseUrl": ".",
"paths": {}
},
"include": [
"ts/**/*"
],
"exclude": [
"node_modules",
"dist",
"dist_ts",
"dist_*/**/*.d.ts"
]
"exclude": ["dist_*/**/*.d.ts"]
}