Compare commits

..

16 Commits

SHA1 | Message | Date
d0a00aedea | 1.7.1 | 2025-04-25 20:56:01 +00:00
    Some checks failed. Default (tags): security (push) successful in 32s; test (push) successful in 4m47s; release (push) failing after 47s; metadata (push) successful in 51s.
b6af835d3f | fix(CodeFeed): Improve commit fetching concurrency and add tagged-only commit filtering along with updated documentation and tests | 2025-04-25 20:56:01 +00:00
c639735f92 | 1.7.0 | 2025-04-25 20:44:32 +00:00
    Some checks failed. Default (tags): security (push) successful in 38s; test (push) successful in 1m11s; release (push) failing after 38s; metadata (push) successful in 45s.
e40e008429 | feat(core): Enhance commit fetching with caching, concurrency improvements, and dependency upgrades | 2025-04-25 20:44:32 +00:00
6032867a13 | 1.6.5 | 2024-12-16 22:47:00 +01:00
    Some checks failed. Default (tags): security (push) successful in 57s; test (push) successful in 2m34s; release (push) failing after 1m34s; metadata (push) successful in 1m57s.
b59bd82685 | fix(CodeFeed): Fixed timestamp initialization and commit fetching timeframe | 2024-12-16 22:46:59 +01:00
a43114ab61 | 1.6.4 | 2024-12-14 22:53:42 +01:00
    Some checks failed. Default (tags): security (push) successful in 55s; test (push) successful in 2m16s; release (push) failing after 1m33s; metadata (push) successful in 1m58s.
1e0ccec03e | fix(core): Refactor fetch logic to use a unified fetchFunction for API calls | 2024-12-14 22:53:42 +01:00
e5e0ceee78 | 1.6.3 | 2024-12-14 02:28:25 +01:00
    Some checks failed. Default (tags): security (push) successful in 52s; test (push) successful in 2m11s; release (push) failing after 1m31s; metadata (push) successful in 1m58s.
d9ab609039 | fix(codefeed): Refactor and fix formatting issues in the CodeFeed module | 2024-12-14 02:28:25 +01:00
aa039e8b5e | 1.6.2 | 2024-12-14 02:04:10 +01:00
    Some checks failed. Default (tags): security (push) successful in 57s; test (push) successful in 2m13s; release (push) failing after 1m31s; metadata (push) successful in 1m53s.
f511ab7a63 | fix(core): Fix sorting order of tagged commits by timestamp | 2024-12-14 02:04:10 +01:00
1df8064247 | 1.6.1 | 2024-12-14 01:32:22 +01:00
    Some checks failed. Default (tags): security (push) successful in 54s; test (push) successful in 2m8s; release (push) failing after 1m28s; metadata (push) successful in 1m50s.
ac1f398422 | fix(docs): Updated project metadata and expanded documentation for installation and usage. | 2024-12-14 01:32:22 +01:00
3a498c00ee | 1.6.0 | 2024-12-14 00:54:38 +01:00
    Some checks failed. Default (tags): security (push) successful in 54s; test (push) successful in 2m9s; release (push) failing after 1m29s; metadata (push) successful in 1m54s.
bb248ed408 | feat(core): Add changelog fetching and parsing functionality | 2024-12-14 00:54:38 +01:00
10 changed files with 2916 additions and 1301 deletions

changelog.md

@ -1,5 +1,57 @@
# Changelog
## 2025-04-25 - 1.7.1 - fix(CodeFeed)
Improve commit fetching concurrency and add tagged-only commit filtering along with updated documentation and tests
- Updated readme examples to clarify default and options usage, including caching and tagged-only filtering
- Increased non-exclusive concurrency from 5 to 20 in fetchAllCommitsFromInstance
- Added tagged-only filtering logic for both cached and non-cached commit results
- Modified tests to enable tagged-only mode and require npm check
## 2025-04-25 - 1.7.0 - feat(core)
Enhance commit fetching with caching, concurrency improvements, and dependency upgrades
- Updated development dependencies (@git.zone/tsbuild, @git.zone/tsbundle, @git.zone/tstest, @push.rocks/tapbundle, @types/node) and dependency versions
- Introduced optional caching options (enableCache, cacheWindowMs, enableNpmCheck) in the CodeFeed constructor to optimize commit retrieval
- Refactored commit fetching to use AsyncExecutionStack for controlled concurrency and improved performance
- Removed deprecated ts/codefeed.plugins.ts in favor of a consolidated plugins.ts module
## 2024-12-16 - 1.6.5 - fix(CodeFeed)
Fixed timestamp initialization and commit fetching timeframe
- Updated the lastRunTimestamp initialization default period from 24 hours to 7 days in CodeFeed constructor.
- Modified commit fetching logic to consider commits from the last 7 days instead of 24 hours in fetchRecentCommitsForRepo.
## 2024-12-14 - 1.6.4 - fix(core)
Refactor fetch logic to use a unified fetchFunction for API calls
- Consolidated API request logic in the CodeFeed class to use fetchFunction for improved maintainability.
## 2024-12-14 - 1.6.3 - fix(codefeed)
Refactor and fix formatting issues in the CodeFeed module
- Refactored various method format and spacing.
- Fixed error handling formatting for readability.
- Improved consistency in JSON handling for API responses.
## 2024-12-14 - 1.6.2 - fix(core)
Fix sorting order of tagged commits by timestamp
- Fixed the sorting order of commits to be by timestamp in descending order after filtering for tagged commits.
## 2024-12-14 - 1.6.1 - fix(docs)
Updated project metadata and expanded documentation for installation and usage.
- Updated description and keywords in package.json and npmextra.json.
- Significant expansion of the README.md with detailed installation, usage, and feature instructions.
## 2024-12-14 - 1.6.0 - feat(core)
Add changelog fetching and parsing functionality
- Implemented loadChangelogFromRepo to directly load the changelog from a Gitea repository.
- Introduced parsing functionality to extract specific version details from the loaded changelog.
- Updated CodeFeed class to utilize the changelog for version verification and commit processing.
## 2024-12-14 - 1.5.3 - fix(core)
Fix filtering logic for returning only tagged commits

npmextra.json

@ -5,10 +5,23 @@
"githost": "code.foss.global", "githost": "code.foss.global",
"gitscope": "foss.global", "gitscope": "foss.global",
"gitrepo": "codefeed", "gitrepo": "codefeed",
"description": "a module for creating feeds for code development", "description": "The @foss.global/codefeed module is designed for generating feeds from Gitea repositories, enhancing development workflows by processing commit data and repository activities.",
"npmPackagename": "@foss.global/codefeed", "npmPackagename": "@foss.global/codefeed",
"license": "MIT", "license": "MIT",
"projectDomain": "foss.global" "projectDomain": "foss.global",
"keywords": [
"codefeed",
"Gitea",
"commits",
"changelog",
"repository",
"development tools",
"npm",
"module",
"code analysis",
"activity feed",
"version control"
]
} }
}, },
"npmci": { "npmci": {

package.json

@ -1,8 +1,8 @@
 {
   "name": "@foss.global/codefeed",
-  "version": "1.5.3",
+  "version": "1.7.1",
   "private": false,
-  "description": "a module for creating feeds for code development",
+  "description": "The @foss.global/codefeed module is designed for generating feeds from Gitea repositories, enhancing development workflows by processing commit data and repository activities.",
   "exports": {
     ".": "./dist_ts/index.js",
     "./interfaces": "./dist_ts/interfaces/index.js"
@ -16,18 +16,19 @@
"buildDocs": "(tsdoc)" "buildDocs": "(tsdoc)"
}, },
"devDependencies": { "devDependencies": {
"@git.zone/tsbuild": "^2.1.25", "@git.zone/tsbuild": "^2.3.2",
"@git.zone/tsbundle": "^2.0.5", "@git.zone/tsbundle": "^2.2.5",
"@git.zone/tsrun": "^1.2.46", "@git.zone/tsrun": "^1.2.46",
"@git.zone/tstest": "^1.0.44", "@git.zone/tstest": "^1.0.96",
"@push.rocks/tapbundle": "^5.0.15", "@push.rocks/tapbundle": "^5.6.3",
"@types/node": "^22.10.2" "@types/node": "^22.15.2"
}, },
"dependencies": { "dependencies": {
"@push.rocks/lik": "^6.2.2",
"@push.rocks/qenv": "^6.1.0", "@push.rocks/qenv": "^6.1.0",
"@push.rocks/smartnpm": "^2.0.4", "@push.rocks/smartnpm": "^2.0.4",
"@push.rocks/smarttime": "^4.1.1", "@push.rocks/smarttime": "^4.1.1",
"@push.rocks/smartxml": "^1.0.8" "@push.rocks/smartxml": "^1.1.1"
}, },
"repository": { "repository": {
"type": "git", "type": "git",
@ -48,5 +49,19 @@
"cli.js", "cli.js",
"npmextra.json", "npmextra.json",
"readme.md" "readme.md"
] ],
"keywords": [
"codefeed",
"Gitea",
"commits",
"changelog",
"repository",
"development tools",
"npm",
"module",
"code analysis",
"activity feed",
"version control"
],
"packageManager": "pnpm@10.7.0+sha512.6b865ad4b62a1d9842b61d674a393903b871d9244954f652b8842c2b553c72176b278f64c463e52d40fff8aba385c235c8c9ecf5cc7de4fd78b8bb6d49633ab6"
} }

pnpm-lock.yaml (generated): 3368 lines changed; file diff suppressed because it is too large.

readme.md (142 lines changed)

@ -1,7 +1,143 @@
# @foss.global/codefeed
A module for creating feeds for code development.
## Install
To install the `@foss.global/codefeed` package, run the following npm command in your project directory:
```bash
npm install @foss.global/codefeed
```
Ensure that you have a compatible version of Node.js installed and that your project is set up to support ECMAScript modules. The `@foss.global/codefeed` module uses ESM syntax.
## Usage
The `@foss.global/codefeed` package is designed to help developers generate feeds for code developments, specifically targeting Gitea repositories. It fetches and processes commit data, changelogs, and repository activities for further analysis or visualization. Here, we'll delve into how you can utilize the different features of the `CodeFeed` class.
### Setting Up CodeFeed
To get started, import the `CodeFeed` class from the module:
```typescript
import { CodeFeed } from '@foss.global/codefeed';
```
Then, create an instance of `CodeFeed`. You'll need the base URL of your Gitea instance and optionally an API token if your repositories require authentication.
```typescript
// default: fetch commits since 7 days ago, no caching or npm checks, include all commits
const codeFeed = new CodeFeed(
'https://your-gitea-instance-url.com',
'your-api-token'
);
// with options: cache commits in-memory for 30 days, disable npm lookups, return only tagged commits
const thirtyDays = 30 * 24 * 60 * 60 * 1000;
const codeFeedStateful = new CodeFeed(
'https://your-gitea-instance-url.com',
'your-api-token',
undefined, // defaults to 7 days ago
{
enableCache: true,
cacheWindowMs: thirtyDays,
enableNpmCheck: false,
taggedOnly: true,
}
);
```
The constructor can also accept a `lastRunTimestamp` which indicates the last time a sync was performed. If not provided, it defaults to one week (7 days) prior to the current time.
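For example, to resume from a known point in time, you can pass an explicit ISO timestamp as the third argument (a minimal sketch; the URL and token are placeholders):
```typescript
// resume from a persisted checkpoint instead of the default 7-day window
const lastSync = new Date('2025-04-01T00:00:00.000Z').toISOString();

const resumedFeed = new CodeFeed(
  'https://your-gitea-instance-url.com',
  'your-api-token',
  lastSync // only commits newer than this timestamp are considered
);
```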
### Fetching Commits
One of the core functionalities of CodeFeed is fetching commits from a Gitea instance. By calling `fetchAllCommitsFromInstance`, you can retrieve commits across multiple repositories:
```typescript
(async () => {
try {
const commits = await codeFeed.fetchAllCommitsFromInstance();
console.log(commits);
} catch (error) {
console.error('An error occurred while fetching commits:', error);
}
})();
```
This method scans all organizations and repositories, fetches all commits since the constructor's `lastRunTimestamp` (default: one week ago), and enriches them with metadata such as:
- Git tags (to detect releases)
- npm publication status (when enabled)
- parsed changelog entries (when available)
When `taggedOnly` is enabled, only commits marked as release tags are returned. When `enableCache` is enabled, previously fetched commits are kept in memory (up to `cacheWindowMs`), and only new commits are fetched on subsequent calls.
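A minimal polling sketch building on this behavior (the interval and option values are illustrative, not defaults):
```typescript
// with enableCache, later calls fetch only commits newer than the cached ones
// and return the merged, deduplicated cache (filtered to tagged commits here)
const releaseFeed = new CodeFeed('https://your-gitea-instance-url.com', 'your-api-token', undefined, {
  enableCache: true,
  cacheWindowMs: 7 * 24 * 60 * 60 * 1000, // keep one week of commits in memory
  taggedOnly: true,
});

setInterval(async () => {
  const taggedCommits = await releaseFeed.fetchAllCommitsFromInstance();
  console.log(`tracking ${taggedCommits.length} tagged commits`);
}, 15 * 60 * 1000); // poll every 15 minutes
```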
Each commit object in the resulting array conforms to the `ICommitResult` interface, containing details such as:
- `baseUrl`
- `org`
- `repo`
- `timestamp`
- `hash`
- `commitMessage`
- `tagged` (boolean)
- `publishedOnNpm` (boolean)
- `prettyAgoTime` (human-readable relative time)
- `changelog` (text from the `changelog.md` associated with a commit)
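As an illustration, these fields can be filtered and formatted client-side; a small sketch assuming only the fields listed above:
```typescript
const commits = await codeFeed.fetchAllCommitsFromInstance();

// list releases that are tagged but not (yet) published on npm
const pendingReleases = commits.filter((c) => c.tagged && !c.publishedOnNpm);
for (const c of pendingReleases) {
  console.log(`${c.org}/${c.repo} ${c.commitMessage.trim()} (${c.prettyAgoTime} ago)`);
}
```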
### Understanding the Data Fetch Process
#### Fetching Organizations
The `fetchAllOrganizations` method collects all organizations within the Gitea instance:
```typescript
const organizations = await codeFeed.fetchAllOrganizations();
console.log('Organizations:', organizations);
```
This method interacts with the Gitea API to pull organization names, aiding further requests that require organization context.
#### Fetching Repositories
Repositories under these organizations can be retrieved using `fetchAllRepositories`:
```typescript
const repositories = await codeFeed.fetchAllRepositories();
console.log('Repositories:', repositories);
```
Here, filtering by organization can help narrow down the scope further when dealing with large instances.
#### Fetching Tags and Commits
To handle repository-specific details, use:
- `fetchTags(owner: string, repo: string)`: Pages through a repository's tags and collects the SHAs of tagged commits.
- `fetchRecentCommitsForRepo(owner: string, repo: string)`: Gathers commits for a given repository since the configured `lastRunTimestamp` (by default, the past seven days).
```typescript
const tags = await codeFeed.fetchTags('orgName', 'repoName');
const recentCommits = await codeFeed.fetchRecentCommitsForRepo('orgName', 'repoName');
console.log('Tags:', tags);
console.log('Recent Commits:', recentCommits);
```
### Changelog Integration
Loading changelog content from a repository is integrated into the flow with `loadChangelogFromRepo`. This can be accessed when processing specific commits:
```typescript
await codeFeed.loadChangelogFromRepo('org', 'repo');
const changelog = codeFeed.getChangelogForVersion('1.0.0');
console.log('Changelog for version 1.0.0:', changelog);
```
### Conclusion
The `@foss.global/codefeed` module provides robust capabilities for extracting and managing feed data related to code developments in Gitea environments. Through systematic setup and leveraging API-driven methods, it becomes a valuable tool for developers aiming to keep track of software progress and changes efficiently. The integration hooks like changelog and npm verification further enrich its utility, offering consolidated insights into each commit's journey from codebase to published package.
Explore integrating these capabilities into your development workflows to enhance tracking, deployment pipelines, or analytics systems within your projects. Always handle API tokens securely and follow best practices when managing access to repository resources. Stay up to date with changes to this module to pick up new features and bug fixes. Happy coding!

Test file

@ -9,12 +9,22 @@ let testCodeFeed: codefeed.CodeFeed;
 tap.test('first test', async () => {
   const token = await testQenv.getEnvVarOnDemand('GITEA_TOKEN');
   // console.log('token', token);
-  testCodeFeed = new codefeed.CodeFeed('https://code.foss.global', token);
+  // seed lastRunTimestamp to 1 year ago and enable in-memory caching for 1 year
+  const oneYearMs = 365 * 24 * 60 * 60 * 1000;
+  const oneYearAgo = new Date(Date.now() - oneYearMs).toISOString();
+  testCodeFeed = new codefeed.CodeFeed(
+    'https://code.foss.global',
+    token,
+    oneYearAgo,
+    { enableCache: true, cacheWindowMs: oneYearMs, enableNpmCheck: true, taggedOnly: true }
+  );
   expect(testCodeFeed).toBeInstanceOf(codefeed.CodeFeed);
 });
 tap.test('fetchAllCommitsFromInstance', async () => {
   const commits = await testCodeFeed.fetchAllCommitsFromInstance();
+  // log the actual results so we can inspect them
+  console.log('Fetched commits:', JSON.stringify(commits, null, 2));
   expect(commits).toBeArray();
   expect(commits.length).toBeGreaterThan(0);
   // expect(commits[0]).toBeTypeofObject();

Commit info module

@ -3,6 +3,6 @@
  */
 export const commitinfo = {
   name: '@foss.global/codefeed',
-  version: '1.5.3',
+  version: '1.7.1',
-  description: 'a module for creating feeds for code development'
+  description: 'The @foss.global/codefeed module is designed for generating feeds from Gitea repositories, enhancing development workflows by processing commit data and repository activities.'
 }

CodeFeed class module

@ -1,294 +1,330 @@
This hunk replaces the previous implementation, which probed organization and repository activity via Atom feeds (fetchOrgRssFeed, hasNewActivity) and walked the instance-wide repository search, with the rewritten module below:

import * as plugins from './plugins.js';

export class CodeFeed {
  private baseUrl: string;
  private token?: string;
  private lastRunTimestamp: string;
  // Raw changelog content for the current repository
  private changelogContent: string = '';
  // npm registry helper for published-on-npm checks
  private npmRegistry: plugins.smartnpm.NpmRegistry;
  // In-memory stateful cache of commits
  private enableCache: boolean = false;
  private cacheWindowMs?: number;
  private cache: plugins.interfaces.ICommitResult[] = [];
  // enable or disable npm publishedOnNpm checks (true by default)
  private enableNpmCheck: boolean = true;
  // return only tagged commits (false by default)
  private enableTaggedOnly: boolean = false;

  constructor(
    baseUrl: string,
    token?: string,
    lastRunTimestamp?: string,
    options?: {
      enableCache?: boolean;
      cacheWindowMs?: number;
      enableNpmCheck?: boolean;
      taggedOnly?: boolean;
    }
  ) {
    this.baseUrl = baseUrl;
    this.token = token;
    this.lastRunTimestamp =
      lastRunTimestamp ?? new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString();
    // configure stateful caching
    this.enableCache = options?.enableCache ?? false;
    this.cacheWindowMs = options?.cacheWindowMs;
    this.enableNpmCheck = options?.enableNpmCheck ?? true;
    this.enableTaggedOnly = options?.taggedOnly ?? false;
    this.cache = [];
    // npm registry instance for version lookups
    this.npmRegistry = new plugins.smartnpm.NpmRegistry();
    console.log('CodeFeed initialized with last run timestamp:', this.lastRunTimestamp);
  }

  /**
   * Fetch all new commits (since lastRunTimestamp) across all orgs and repos.
   */
  public async fetchAllCommitsFromInstance(): Promise<plugins.interfaces.ICommitResult[]> {
    // Controlled concurrency with AsyncExecutionStack
    const stack = new plugins.lik.AsyncExecutionStack();
    stack.setNonExclusiveMaxConcurrency(20);
    // determine since timestamp for this run (stateful caching)
    let effectiveSince = this.lastRunTimestamp;
    if (this.enableCache && this.cache.length > 0) {
      // use newest timestamp in cache to fetch only tail
      effectiveSince = this.cache.reduce(
        (max, c) => (c.timestamp > max ? c.timestamp : max),
        effectiveSince
      );
    }

    // 1) get all organizations
    const orgs = await this.fetchAllOrganizations();

    // 2) fetch repos per org in parallel
    const repoLists = await Promise.all(
      orgs.map((org) =>
        stack.getNonExclusiveExecutionSlot(() => this.fetchRepositoriesForOrg(org))
      )
    );
    // flatten to [{ owner, name }]
    const allRepos = orgs.flatMap((org, i) =>
      repoLists[i].map((r) => ({ owner: org, name: r.name }))
    );

    // 3) probe latest commit per repo and fetch full list only if new commits exist
    const commitJobs = allRepos.map(({ owner, name }) =>
      stack.getNonExclusiveExecutionSlot(async () => {
        try {
          // 3a) Probe the most recent commit (limit=1)
          const probeResp = await this.fetchFunction(
            `/api/v1/repos/${owner}/${name}/commits?limit=1`,
            { headers: this.token ? { Authorization: `token ${this.token}` } : {} }
          );
          if (!probeResp.ok) {
            throw new Error(`Probe failed for ${owner}/${name}: ${probeResp.statusText}`);
          }
          const probeData: plugins.interfaces.ICommit[] = await probeResp.json();
          // If no commits or no new commits since last run, skip
          if (
            probeData.length === 0 ||
            new Date(probeData[0].commit.author.date).getTime() <=
              new Date(effectiveSince).getTime()
          ) {
            return { owner, name, commits: [] };
          }
          // 3b) Fetch commits since last run
          const commits = await this.fetchRecentCommitsForRepo(
            owner,
            name,
            effectiveSince
          );
          return { owner, name, commits };
        } catch (e: any) {
          console.error(`Failed to fetch commits for ${owner}/${name}:`, e.message);
          return { owner, name, commits: [] };
        }
      })
    );
    const commitResults = await Promise.all(commitJobs);

    // 4) build new commit entries with tagging, npm and changelog support
    const newResults: plugins.interfaces.ICommitResult[] = [];
    for (const { owner, name, commits } of commitResults) {
      // skip repos with no new commits
      if (commits.length === 0) {
        this.changelogContent = '';
        continue;
      }
      // load changelog for this repo
      await this.loadChangelogFromRepo(owner, name);
      // fetch tags for this repo
      let taggedShas: Set<string>;
      try {
        taggedShas = await this.fetchTags(owner, name);
      } catch (e: any) {
        console.error(`Failed to fetch tags for ${owner}/${name}:`, e.message);
        taggedShas = new Set<string>();
      }
      // fetch npm package info only if any new commits correspond to a tag
      const hasTaggedCommit = commits.some((c) => taggedShas.has(c.sha));
      let pkgInfo: { allVersions: Array<{ version: string }> } | null = null;
      if (hasTaggedCommit && this.enableNpmCheck) {
        try {
          pkgInfo = await this.npmRegistry.getPackageInfo(`@${owner}/${name}`);
        } catch (e: any) {
          console.error(`Failed to fetch package info for ${owner}/${name}:`, e.message);
          pkgInfo = null;
        }
      }
      // build commit entries
      for (const c of commits) {
        const versionCandidate = c.commit.message.replace(/\n/g, '').trim();
        const isTagged = taggedShas.has(c.sha);
        const publishedOnNpm = isTagged && pkgInfo
          ? pkgInfo.allVersions.some((v) => v.version === versionCandidate)
          : false;
        let changelogEntry: string | undefined;
        if (this.changelogContent) {
          changelogEntry = this.getChangelogForVersion(versionCandidate);
        }
        newResults.push({
          baseUrl: this.baseUrl,
          org: owner,
          repo: name,
          timestamp: c.commit.author.date,
          prettyAgoTime: plugins.smarttime.getMilliSecondsAsHumanReadableAgoTime(
            new Date(c.commit.author.date).getTime()
          ),
          hash: c.sha,
          commitMessage: c.commit.message,
          tagged: isTagged,
          publishedOnNpm,
          changelog: changelogEntry,
        });
      }
    }

    // if caching is enabled, merge into in-memory cache and return full cache
    if (this.enableCache) {
      const existingHashes = new Set(this.cache.map((c) => c.hash));
      const uniqueNew = newResults.filter((c) => !existingHashes.has(c.hash));
      this.cache.push(...uniqueNew);
      // trim commits older than window
      if (this.cacheWindowMs !== undefined) {
        const cutoff = Date.now() - this.cacheWindowMs;
        this.cache = this.cache.filter((c) => new Date(c.timestamp).getTime() >= cutoff);
      }
      // advance lastRunTimestamp to now
      this.lastRunTimestamp = new Date().toISOString();
      // sort descending by timestamp
      this.cache.sort((a, b) => b.timestamp.localeCompare(a.timestamp));
      // apply tagged-only filter if requested
      if (this.enableTaggedOnly) {
        return this.cache.filter((c) => c.tagged === true);
      }
      return this.cache;
    }
    // no caching: apply tagged-only filter if requested
    if (this.enableTaggedOnly) {
      return newResults.filter((c) => c.tagged === true);
    }
    return newResults;
  }

  /**
   * Load the changelog directly from the Gitea repository.
   */
  private async loadChangelogFromRepo(owner: string, repo: string): Promise<void> {
    const url = `/api/v1/repos/${owner}/${repo}/contents/changelog.md`;
    const headers: Record<string, string> = {};
    if (this.token) {
      headers['Authorization'] = `token ${this.token}`;
    }
    const response = await this.fetchFunction(url, { headers });
    if (!response.ok) {
      console.error(
        `Could not fetch CHANGELOG.md from ${owner}/${repo}: ${response.status} ${response.statusText}`
      );
      this.changelogContent = '';
      return;
    }
    const data = await response.json();
    if (!data.content) {
      console.warn(`No content field found in response for ${owner}/${repo}/changelog.md`);
      this.changelogContent = '';
      return;
    }
    // decode base64 content
    this.changelogContent = Buffer.from(data.content, 'base64').toString('utf8');
  }

  /**
   * Parse the changelog to find the entry for a given version.
   * The changelog format is assumed as:
   *
   * # Changelog
   *
   * ## <date> - <version> - <description>
   * <changes...>
   */
  private getChangelogForVersion(version: string): string | undefined {
    if (!this.changelogContent) {
      return undefined;
    }
    const lines = this.changelogContent.split('\n');
    const versionHeaderIndex = lines.findIndex((line) => line.includes(`- ${version} -`));
    if (versionHeaderIndex === -1) {
      return undefined;
    }
    const changelogLines: string[] = [];
    for (let i = versionHeaderIndex + 1; i < lines.length; i++) {
      const line = lines[i];
      // The next version header starts with `## `
      if (line.startsWith('## ')) {
        break;
      }
      changelogLines.push(line);
    }
    return changelogLines.join('\n').trim();
  }

  /**
   * Fetch all tags for a given repo and return the set of tagged commit SHAs
   */
  private async fetchTags(owner: string, repo: string): Promise<Set<string>> {
    const taggedShas = new Set<string>();
    let page = 1;
    while (true) {
      const url = `/api/v1/repos/${owner}/${repo}/tags?limit=50&page=${page}`;
      const resp = await this.fetchFunction(url, {
        headers: this.token ? { Authorization: `token ${this.token}` } : {},
      });
      if (!resp.ok) {
        console.error(`Failed to fetch tags for ${owner}/${repo}: ${resp.status} ${resp.statusText}`);
        return taggedShas;
      }
      const data: plugins.interfaces.ITag[] = await resp.json();
      if (data.length === 0) break;
      for (const t of data) {
        if (t.commit?.sha) taggedShas.add(t.commit.sha);
      }
      if (data.length < 50) break;
      page++;
    }
    return taggedShas;
  }

  private async fetchAllOrganizations(): Promise<string[]> {
    const resp = await this.fetchFunction('/api/v1/orgs', {
      headers: this.token ? { Authorization: `token ${this.token}` } : {},
    });
    if (!resp.ok) {
      throw new Error(`Failed to fetch organizations: ${resp.statusText}`);
    }
    const data: { username: string }[] = await resp.json();
    return data.map((o) => o.username);
  }

  private async fetchRepositoriesForOrg(org: string): Promise<plugins.interfaces.IRepository[]> {
    const resp = await this.fetchFunction(`/api/v1/orgs/${org}/repos?limit=50`, {
      headers: this.token ? { Authorization: `token ${this.token}` } : {},
    });
    if (!resp.ok) {
      throw new Error(`Failed to fetch repositories for ${org}: ${resp.statusText}`);
    }
    const data: plugins.interfaces.IRepository[] = await resp.json();
    return data;
  }

  private async fetchRecentCommitsForRepo(
    owner: string,
    repo: string,
    sinceTimestamp?: string
  ): Promise<plugins.interfaces.ICommit[]> {
    const since = sinceTimestamp ?? this.lastRunTimestamp;
    const resp = await this.fetchFunction(
      `/api/v1/repos/${owner}/${repo}/commits?since=${encodeURIComponent(
        since
      )}&limit=50`,
      { headers: this.token ? { Authorization: `token ${this.token}` } : {} }
    );
    if (!resp.ok) {
      throw new Error(`Failed to fetch commits for ${owner}/${repo}: ${resp.statusText}`);
    }
    const data: plugins.interfaces.ICommit[] = await resp.json();
    return data;
  }

  public async fetchFunction(
    urlArg: string,
    optionsArg: RequestInit = {}
  ): Promise<Response> {
    return fetch(`${this.baseUrl}${urlArg}`, optionsArg);
  }
}

Interfaces module

@ -1,37 +1,37 @@
-export interface RepositoryOwner {
+export interface IRepositoryOwner {
   login: string;
 }
-export interface Repository {
+export interface IRepository {
-  owner: RepositoryOwner;
+  owner: IRepositoryOwner;
   name: string;
 }
-export interface CommitAuthor {
+export interface ICommitAuthor {
   date: string;
 }
-export interface CommitDetail {
+export interface ICommitDetail {
   message: string;
-  author: CommitAuthor;
+  author: ICommitAuthor;
 }
-export interface Commit {
+export interface ICommit {
   sha: string;
-  commit: CommitDetail;
+  commit: ICommitDetail;
 }
-export interface Tag {
+export interface ITag {
   commit?: {
     sha?: string;
   };
 }
-export interface RepoSearchResponse {
+export interface IRepoSearchResponse {
-  data: Repository[];
+  data: IRepository[];
 }
-export interface CommitResult {
+export interface ICommitResult {
   baseUrl: string;
   org: string;
   repo: string;
@ -41,4 +41,5 @@ export interface CommitResult {
   tagged: boolean;
   publishedOnNpm: boolean;
   prettyAgoTime: string;
+  changelog: string | undefined;
 }

Plugins module

@ -10,10 +10,12 @@ import * as qenv from '@push.rocks/qenv';
 import * as smartnpm from '@push.rocks/smartnpm';
 import * as smartxml from '@push.rocks/smartxml';
 import * as smarttime from '@push.rocks/smarttime';
+import * as lik from '@push.rocks/lik';
 export {
   qenv,
   smartnpm,
   smartxml,
   smarttime,
+  lik,
 }