fix(package): Update dependency scopes/versions, fix imports and scripts, add pnpm workspace, remove GitLab CI
# @push.rocks/smartcache 🚀

**Smart time-based caching for async functions** - Because waiting is overrated!

## 🌟 What is SmartCache?

`@push.rocks/smartcache` is a powerful TypeScript caching library that intelligently caches the results of asynchronous functions. It automatically detects identical function calls based on the call stack, preventing redundant computations and dramatically improving performance. Perfect for expensive API calls, complex calculations, or any async operation you don't want to repeat unnecessarily!
### ✨ Key Features

- 🎯 **Automatic cache identification** - Uses smart hashing to detect identical function calls
- ⏱️ **Time-based expiration** - Set custom cache durations for each cached operation
- 🔄 **Concurrent request handling** - Multiple simultaneous requests for the same data only trigger one actual call
- 🎭 **Zero configuration** - Works out of the box with sensible defaults
- 📦 **Lightweight** - Minimal dependencies, maximum performance
- 🔒 **Type-safe** - Full TypeScript support with proper typing
## 📦 Installation

Get started in seconds:

```bash
# Using pnpm (recommended)
pnpm add @push.rocks/smartcache

# Using npm
npm install @push.rocks/smartcache --save

# Using yarn
yarn add @push.rocks/smartcache
```
## 🎮 Quick Start

```typescript
import { SmartCache } from '@push.rocks/smartcache';

// Create a cache instance
const cache = new SmartCache();

// Your expensive async function
async function fetchUserData(userId: string) {
  console.log(`Fetching user ${userId} from API...`);
  const response = await fetch(`/api/users/${userId}`);
  return response.json();
}

// Wrap it with caching - subsequent calls within 5 seconds return cached result
async function getCachedUserData(userId: string) {
  return cache.cacheReturn(
    () => fetchUserData(userId),
    5000 // Cache for 5 seconds
  );
}

// First call - hits the API
const user1 = await getCachedUserData('123'); // Logs: "Fetching user 123 from API..."

// Second call within 5 seconds - returns cached result instantly!
const user2 = await getCachedUserData('123'); // No log, returns cached data
```
## 🚀 Advanced Usage

### Automatic Call Stack Detection

SmartCache uses intelligent call stack analysis to automatically identify unique function calls. This means you don't need to manually specify cache keys!

```typescript
const cache = new SmartCache();

// These will be automatically cached separately based on their call location
async function getData() {
  // First location in code
  const result1 = await cache.cacheReturn(async () => {
    return await fetch('/api/data1').then(r => r.json());
  }, 10000);

  // Second location in code - different cache entry!
  const result2 = await cache.cacheReturn(async () => {
    return await fetch('/api/data2').then(r => r.json());
  }, 10000);

  return { result1, result2 };
}
```
### Concurrent Request Handling

SmartCache brilliantly handles concurrent requests - if multiple requests come in for the same cached operation before the first one completes, they all wait for and share the same result:

```typescript
const cache = new SmartCache();

async function expensiveOperation() {
  console.log('Starting expensive operation...');
  await new Promise(resolve => setTimeout(resolve, 2000));
  return { data: 'valuable result' };
}

// Fire off 5 concurrent requests
const promises = Array(5).fill(null).map(() =>
  cache.cacheReturn(expensiveOperation, 60000)
);

// Only ONE "Starting expensive operation..." log appears!
// All 5 requests get the same result
const results = await Promise.all(promises);
```
### Dynamic Cache Durations

Adjust cache duration based on your data's characteristics:

```typescript
const cache = new SmartCache();

async function getDataWithVariableCache(dataType: string) {
  // Critical data - short cache
  if (dataType === 'critical') {
    return cache.cacheReturn(fetchCriticalData, 1000); // 1 second
  }

  // Static data - long cache
  if (dataType === 'static') {
    return cache.cacheReturn(fetchStaticData, 3600000); // 1 hour
  }

  // Default - moderate cache
  return cache.cacheReturn(fetchNormalData, 60000); // 1 minute
}
```
### Cache Different Function Results

Each function call location gets its own cache entry:

```typescript
const cache = new SmartCache();

class DataService {
  async getUserPosts(userId: string) {
    // Cached separately from getUserProfile
    return cache.cacheReturn(
      async () => {
        const response = await fetch(`/api/users/${userId}/posts`);
        return response.json();
      },
      30000 // Cache for 30 seconds
    );
  }

  async getUserProfile(userId: string) {
    // Different cache entry than getUserPosts
    return cache.cacheReturn(
      async () => {
        const response = await fetch(`/api/users/${userId}/profile`);
        return response.json();
      },
      60000 // Cache for 60 seconds
    );
  }
}
```
## 🎯 Real-World Examples

### API Rate Limiting Helper

Prevent hitting rate limits by caching API responses:

```typescript
const cache = new SmartCache();

class GitHubService {
  async getRepoInfo(owner: string, repo: string) {
    return cache.cacheReturn(
      async () => {
        const response = await fetch(
          `https://api.github.com/repos/${owner}/${repo}`,
          { headers: { 'Authorization': `token ${process.env.GITHUB_TOKEN}` } }
        );
        return response.json();
      },
      300000 // Cache for 5 minutes - respect API limits
    );
  }
}
```
### Database Query Optimization

Cache expensive database queries:

```typescript
const cache = new SmartCache();

class DatabaseService {
  async getTopProducts(category: string) {
    return cache.cacheReturn(
      async () => {
        // Expensive aggregation query
        const products = await db.products.aggregate([
          { $match: { category } },
          { $group: { _id: '$productId', totalSales: { $sum: '$sales' } } },
          { $sort: { totalSales: -1 } },
          { $limit: 10 }
        ]);
        return products;
      },
      120000 // Cache for 2 minutes - data doesn't change often
    );
  }
}
```
### Computed Values Cache

Cache expensive computations:

```typescript
const cache = new SmartCache();

class AnalyticsService {
  async calculateMetrics(data: number[]) {
    return cache.cacheReturn(
      async () => {
        console.log('Running expensive calculation...');
        // Simulate expensive computation
        const result = {
          mean: data.reduce((a, b) => a + b, 0) / data.length,
          median: this.calculateMedian(data),
          standardDeviation: this.calculateStdDev(data),
          percentiles: this.calculatePercentiles(data)
        };
        return result;
      },
      600000 // Cache for 10 minutes
    );
  }

  private calculateMedian(data: number[]) { /* ... */ }
  private calculateStdDev(data: number[]) { /* ... */ }
  private calculatePercentiles(data: number[]) { /* ... */ }
}
```
## 📊 Performance Benefits

SmartCache can dramatically improve your application's performance:

- 🚀 **Reduce API calls** by up to 90% for frequently accessed data
- ⚡ **Instant responses** for cached data (sub-millisecond)
- 📉 **Lower server costs** by reducing redundant computations
- 🛡️ **Protect against thundering herd** problems
- 🔄 **Automatic cleanup** - expired cache entries are removed automatically
## 🏗️ Architecture & How It Works

SmartCache uses a sophisticated approach to caching:

1. **Call Stack Hashing**: When you call `cacheReturn`, SmartCache captures the call stack and generates a SHA-256 hash
2. **Cache Lookup**: It checks if a valid cached result exists for this hash
3. **Concurrent Protection**: If a request is in-flight, new requests wait for the same result
4. **Automatic Expiration**: Each cache entry has a timer that automatically removes it when expired
5. **Memory Efficient**: Only stores what's actively being used, expired entries are cleaned up
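The steps above can be sketched in plain TypeScript. This is a hedged illustration of the general technique, not SmartCache's actual source - `store`, `callSiteKey`, and `cacheReturnSketch` are our own hypothetical names:

```typescript
import { createHash } from 'node:crypto';

// Cache entries keyed by a SHA-256 hash of the call stack (steps 1-2).
const store = new Map<string, { promise: Promise<unknown>; expiresAt: number }>();

function callSiteKey(): string {
  // Skip the "Error" header line and this helper's own frame; the remaining
  // frames (including the caller's file/line/column) identify the call site.
  const frames = (new Error().stack ?? '').split('\n').slice(2).join('\n');
  return createHash('sha256').update(frames).digest('hex');
}

function cacheReturnSketch<T>(fn: () => Promise<T>, ttlMs: number): Promise<T> {
  const key = callSiteKey();
  const hit = store.get(key);
  // Steps 2 & 3: a still-valid entry is shared even while the promise is
  // in-flight, so concurrent callers trigger only one real execution.
  if (hit && hit.expiresAt > Date.now()) return hit.promise as Promise<T>;
  const promise = fn();
  store.set(key, { promise, expiresAt: Date.now() + ttlMs });
  // Step 4: schedule removal once the TTL elapses (steps 4-5).
  setTimeout(() => store.delete(key), ttlMs);
  return promise;
}
```

Because the key includes the caller's file, line, and column, two textually identical calls on different lines get separate cache entries, while repeated hits on the same line share one.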
## 💡 Best Practices

### ✅ Do's

- **Cache expensive operations** - API calls, database queries, complex calculations
- **Use appropriate durations** - Match cache time to your data's update frequency
- **Cache at the right level** - Cache complete results, not partial data
- **Monitor memory usage** - Be mindful when caching large objects

### ❌ Don'ts

- **Don't cache user-specific sensitive data** without careful consideration: cache keys come from the call site, not from function arguments, so a single call site serving many users shares one entry
- **Don't use excessive cache durations** for frequently changing data
- **Don't cache massive objects** that could cause memory issues
- **Don't rely on cache** for critical data consistency
## 🎓 Pro Tips

1. **Layer your caching**: Use SmartCache alongside CDN and database caching for maximum effect
2. **Cache warming**: Pre-populate cache with frequently accessed data on startup
3. **Metrics tracking**: Monitor cache hit rates to optimize cache durations
4. **Error handling**: Always handle potential errors in cached functions

```typescript
const cache = new SmartCache();

// Good - with error handling
async function getResilientData() {
  return cache.cacheReturn(
    async () => {
      try {
        const response = await fetch('/api/data');
        if (!response.ok) throw new Error(`HTTP ${response.status}`);
        return response.json();
      } catch (error) {
        console.error('Failed to fetch data:', error);
        // Return fallback or rethrow based on your needs
        throw error;
      }
    },
    30000
  );
}
```
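Tip 3 can be implemented without touching SmartCache itself: wrap the function you pass to `cacheReturn` so every actual execution counts as a miss. The `withHitCounting` helper and `CacheLike` type below are a hedged sketch with our own names - they are not part of the SmartCache API:

```typescript
// Anything with a cacheReturn(fn, ttlMs) method, e.g. a SmartCache instance.
type CacheLike = {
  cacheReturn<T>(fn: () => Promise<T>, ttlMs: number): Promise<T>;
};

// Hypothetical helper: counts how often the cache actually re-runs fn,
// so hit rate = 1 - misses / requests.
function withHitCounting<T>(fn: () => Promise<T>) {
  const stats = { requests: 0, misses: 0 };
  const counted = async (): Promise<T> => {
    stats.misses += 1; // runs only when the cache re-executes the function
    return fn();
  };
  return {
    stats,
    // Use this wherever you would otherwise call cache.cacheReturn(fn, ttlMs).
    call(cache: CacheLike, ttlMs: number): Promise<T> {
      stats.requests += 1;
      return cache.cacheReturn(counted, ttlMs);
    },
    hitRate(): number {
      return stats.requests === 0 ? 0 : 1 - stats.misses / stats.requests;
    },
  };
}
```

Logging `tracked.hitRate()` periodically tells you whether a longer or shorter cache duration is warranted for that call site.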
## 🤝 Support & Community

- 🐛 **Found a bug?** [Open an issue](https://code.foss.global/push.rocks/smartcache/issues)
- 💡 **Have a feature request?** [Start a discussion](https://code.foss.global/push.rocks/smartcache/issues)
- 📖 **Need help?** Check our [comprehensive documentation](https://code.foss.global/push.rocks/smartcache)

## License and Legal Information