pkgsrc-Users archive


github-release lister



Hi!

I wrote a small Python program that, given a pkgsrc path, lists all
available GitHub releases and tags using the GitHub API.

I didn't get much further because tag and release names are not at all
uniform across projects, so the looking-for-new-versions part still
relies on human eyeballs.
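
(To give an idea of the kind of normalization that would be needed,
here is a rough sketch; the normalize_tag helper and the prefixes it
strips are purely illustrative and not something the script below
uses:

import re

def normalize_tag(name):
    # Strip a leading word like "libzip " or "rel-" and a leading "v",
    # then turn dash-separated tags into dotted versions.  Every
    # project names its tags differently, so this is only a heuristic.
    name = re.sub(r'^[A-Za-z_]+[ -]', '', name)
    if name.startswith('v'):
        name = name[1:]
    return name.replace('-', '.')

print(normalize_tag('libzip 1.9.2'), normalize_tag('rel-1-7-0'))

Both print as plain version strings, but names like 'rel-1-0-beta1'
still need eyeballs.)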

Also, the GitHub API has a default rate limit of 60 unauthenticated
requests per hour per IP address. Perhaps someone knows how to handle
that better.
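
(One way around that is to send authenticated requests, which GitHub
rate-limits much more generously. A minimal sketch, assuming a personal
access token is available in a GITHUB_TOKEN environment variable:

import os
import requests

headers = {'Accept': 'application/vnd.github+json',
           'X-GitHub-Api-Version': '2022-11-28'}
# GITHUB_TOKEN is assumed to hold a personal access token; if it is
# set, send it along so the higher authenticated limit applies.
token = os.environ.get('GITHUB_TOKEN')
if token:
    headers['Authorization'] = 'Bearer ' + token
result = requests.get('https://api.github.com/rate_limit',
                      headers=headers, timeout=5)
print(result.json()['rate']['limit'])

The same Authorization header could be added to the headers dict the
script below sends with its requests.)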

Anyway, I hope someone finds it useful.

Example usage:

# github-releases archivers/libzip
archivers/libzip 1.9.2 ['libzip 1.9.2', 'libzip 1.9.1', 'libzip 1.9.0', '1.8.0', 'libzip 1.7.3', 'libzip 1.7.2', 'libzip 1.7.1', 'libzip 1.7.0', 'libzip 1.6.1', '1.6.0', 'v1.9.2', 'v1.9.1', 'v1.9.0', 'v1.8.0', 'v1.7.3', 'v1.7.2', 'v1.7.1', 'rel-1-7-0', 'rel-1-6-1', 'rel-1-6-0', 'rel-1-5-2', 'rel-1-5-1', 'rel-1-5-0', 'rel-1-4-0', 'rel-1-3-2', 'rel-1-3-1', 'rel-1-3-0', 'rel-1-2-0', 'rel-1-1', 'rel-1-1-3', 'rel-1-1-2', 'rel-1-1-1', 'rel-1-0', 'rel-1-0-beta1', 'rel-1-0-1', 'rel-0-11', 'rel-0-11-2', 'rel-0-11-1', 'rel-0-10', 'rel-0-10-1']

The first two strings are the path and the current version number; the
array contains first the releases and then the tags, in the order that
GitHub returns them.

 Thomas

#!/usr/bin/env python3

import argparse
import re
import subprocess

import requests

# A MASTER_SITES or HOMEPAGE entry like https://github.com/<org>/<project>/...
# yields the GitHub organization in group 1.
project_re = re.compile(r'.*github\.com/([^/]*)/.*')

parser = argparse.ArgumentParser(description='Fetch list of releases for a github project')
parser.add_argument('path', type=str, nargs='+')
args = parser.parse_args()
for system in args.path:
    organization = None
    # Ask pkgsrc for the variables needed to identify the GitHub project.
    result = subprocess.run(['make', 'show-vars',
                             'VARNAMES=MASTER_SITES HOMEPAGE PKGVERSION_NOREV GITHUB_PROJECT'],
                            cwd="/usr/pkgsrc/" + system, capture_output=True, encoding='utf-8')
    if result.returncode != 0:
        print("extracting variables failed for " + system)
        continue
    master_sites, homepage, pkgversion, project = result.stdout.splitlines()
    # Look for a github.com URL from which to extract the organization.
    for site in master_sites.split() + [homepage]:
        if m := project_re.match(site):
            organization = m.group(1)
            break
    if organization is None:
        print("no github organization found for " + system)
        continue
    url_base = 'https://api.github.com/repos/' + organization + '/' + project + '/'
    # Headers recommended by the GitHub REST API documentation.
    headers = {'Accept': 'application/vnd.github+json',
               'X-GitHub-Api-Version': '2022-11-28'}
    versions = []
    # Releases and tags are separate endpoints; collect the names from both.
    for gh_type in ['releases', 'tags']:
        url = url_base + gh_type
        result = requests.get(url, headers=headers, timeout=5)
        if result.status_code != 200:
            print("error fetching " + gh_type + " for " + system)
            continue
        parsed = result.json()
        for entry in parsed:
            versions.append(entry['name'])
    print(system, pkgversion, versions)

