Compare commits

...

25 Commits
v1.3.1 ... main

Author SHA1 Message Date
gabriele.ponzo 2f31897cd9 Update README.md
Corrected the English wording
2024-03-01 08:45:05 +00:00
emiliano.vavassori eff3bc67ee Fixed the README with pointers to the Wiki. 2024-01-01 18:32:36 +01:00
emiliano.vavassori 03fc28a1bf Implemented better option checking and defaults for builds. 2024-01-01 18:29:28 +01:00
emiliano.vavassori 8c4bc65184 Fixing duplication of functions in the previous commit. 2024-01-01 16:57:38 +01:00
emiliano.vavassori 5d6c7f2df2 Fixing multiple query build not working. 2024-01-01 16:36:50 +01:00
emiliano.vavassori 608eeb6071 Fixed a syntax error finding a file in a non-existent subdirectory. 2024-01-01 13:33:19 +01:00
emiliano.vavassori 2a617b1824 Fixing checksum function. 2024-01-01 13:18:55 +01:00
emiliano.vavassori 36da47b0b5 Fixing cleanup. 2024-01-01 10:02:15 +01:00
emiliano.vavassori 700d1eb376 Adding missing dependency. 2023-12-31 22:08:28 +01:00
emiliano.vavassori 3b01fc3b05 Fixed syntax error. 2023-12-09 23:36:54 +01:00
emiliano.vavassori 2277523f3c Removing unmet dependency. 2023-12-09 23:35:57 +01:00
emiliano.vavassori 3dbfef9fbe Cleaning up downloads directory before Daily and Prerelease, but leave downloaded versions in place already. 2023-12-09 23:26:03 +01:00
emiliano.vavassori 1a24f54d89 Revised and tested build cli subcommand. 2023-12-05 00:12:10 +01:00
emiliano.vavassori 024535afa9 Revised and tested getversion flow. Revised but not tested the build flows. 2023-12-04 00:51:58 +01:00
emiliano.vavassori d9775f4f94 First steps to refactoring the code and giving it a clear structure. 2023-12-03 04:16:31 +01:00
emiliano.vavassori 28528aa063 Adding Dockerfile to create a container to build. 2023-12-02 03:01:14 +01:00
emiliano.vavassori 142a09df14 Code for passing also a date for daily builds. Possibly blocked by cli command. 2023-12-02 03:00:46 +01:00
emiliano.vavassori 71a81b6a8e Fixing the Readme a bit. 2023-12-01 23:29:41 +01:00
emiliano.vavassori bb1e73fd6c Fixing last bits. 2023-12-01 23:15:57 +01:00
emiliano.vavassori a74b8d4858 Fixing version in setup. 2023-12-01 22:55:26 +01:00
emiliano.vavassori 8bd23dd08b Adding pypi uploading to pipeline. 2023-12-01 22:44:42 +01:00
emiliano.vavassori 7fe48c297d If repo is '.', resolve absolute path. 2023-12-01 22:44:24 +01:00
emiliano.vavassori 82e366c5da Redownloading again all files in case of prereleases or daily builds. 2023-12-01 21:55:13 +01:00
emiliano.vavassori 49e0ab5593 Reimplemented md5sum with python functions. 2023-12-01 21:42:52 +01:00
emiliano.vavassori 62248d862c Fixing daily URL breakage (issue #1).
As builds can come from different tinderboxes, an additional request and HTML parsing with XPath make the final URL less dependent on hardcoded strings.
2023-12-01 21:20:15 +01:00
8 changed files with 784 additions and 652 deletions


@@ -9,8 +9,6 @@ steps:
commands:
- pip install wheel
- python setup.py bdist_wheel
when:
event: tag
- name: release
image: plugins/gitea-release
@@ -21,5 +19,15 @@ steps:
files: dist/*.whl
checksum: md5
draft: true
when:
event: tag
- name: publish
image: plugins/pypi
settings:
username: __token__
password:
from_secret: pypi
trigger:
event:
- tag
- push

.gitignore

@@ -1,4 +1,5 @@
venv
test
build
dist
loaih.egg-info

Dockerfile

@@ -0,0 +1,5 @@
FROM python:3.9-slim
RUN pip install loaih
ENTRYPOINT [ "/usr/local/bin/loaih" ]
CMD [ "--help" ]


@@ -1,34 +1,9 @@
# LibreOffice AppImage Helper - `loaih` #
LibreOffice AppImage Helper is an enhanced Python porting from [previous work
from Antonio
Faccioli](https://github.com/antoniofaccioli/libreoffice-appimage). It helps
building a LibreOffice AppImage from officially released .deb files with some
options.
## Installing the package ##
[![Build Status](https://drone.libreitalia.org/api/badges/libreitalia/loaih/status.svg)](https://drone.libreitalia.org/libreitalia/loaih)
You can much more easily install the package via the produced wheel in the
[Releases](/libreitalia/loaih/releases/) page. Once downloaded, you can
install the utility with `pip`
$ pip install ./loaih-*.whl
You can also clone the repository and build the app yourself, which is as easy
as:
$ pip install wheel
$ git clone https://git.libreitalia.org/libreitalia/loaih.git
$ cd loaih
$ python setup.py bdist_wheel
$ pip install dist/loaih*.whl
## Using the package ##
The package will install a single command, `loaih`, which should help you
build your own version of the AppImage. Here's some usage scenarios.
by Antonio Faccioli](https://github.com/antoniofaccioli/libreoffice-appimage).
It helps build a LibreOffice AppImage from officially released .deb files
with some options.
## Getting options and help ##
@@ -37,35 +12,7 @@ You can ask the app some information on how you can use it:
$ loaih --help
$ loaih getversion --help
$ loaih build --help
$ loaih batch --help
### Finding metadata on a specific version ###
You can use the command `getversion` and a specific query:
$ loaih getversion fresh
$ loaih getversion still
### Building a one-time AppImage ###
You can build yourself an AppImage with specific options:
$ loaih build -C -a x86_64 -l it -r . -o fresh
This will build an AppImage for the latest "Fresh" version, built for 64-bit
operating systems, with Italian as the only language, with Offline Help,
signed and updatable in the current folder.
For other build options, please see `loaih build --help` which should be
pretty complete.
### Batch building of a set of AppImages ###
This is less documented, but if the *query* parameter to the `build` command
is a YAML file (see references of it in the repository), this will loop
through the various options and create a complete set of builds based on some
characteristics.
$ loaih build fresh.yml
This is the main way the community builds found at the [LibreOffice AppImage
Repository](https://appimages.libreitalia.org) are produced.
For any other information needed, please visit the [wiki of the
project](https://git.libreitalia.org/libreitalia/loaih/wiki).


@@ -1,213 +1,259 @@
#!/usr/bin/env python
# encoding: utf-8
"""machinery for compiling new versions of appimages."""
import urllib.request
from lxml import etree
from packaging.version import parse as parse_version
import datetime
import json
import re
import requests
import subprocess
import shlex
from lxml import html
class Definitions(object):
DOWNLOADPAGE = "https://www.libreoffice.org/download/download/"
ARCHIVE = "https://downloadarchive.documentfoundation.org/libreoffice/old/"
RELEASE = "https://download.documentfoundation.org/libreoffice/stable/"
DAILY = "https://dev-builds.libreoffice.org/daily/master/Linux-rpm_deb-x86_64@tb87-TDF/"
PRERELEASE = "https://dev-builds.libreoffice.org/pre-releases/deb/x86_64/"
# Constants
DOWNLOADPAGE = "https://www.libreoffice.org/download/download/"
ARCHIVE = "https://downloadarchive.documentfoundation.org/libreoffice/old/"
RELEASE = "https://download.documentfoundation.org/libreoffice/stable/"
DAILY = "https://dev-builds.libreoffice.org/daily/master/"
PRERELEASE = "https://dev-builds.libreoffice.org/pre-releases/deb/x86_64/"
SELECTORS = {
'still': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[last()]/text()'
},
'fresh': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[1]/text()'
},
'prerelease': {
'URL': DOWNLOADPAGE,
'xpath': '//p[@class="lead_libre"][last()]/following-sibling::ul[last()]/li/a/text()'
},
'daily': {
'URL': DAILY,
'xpath': '//td/a'
}
SELECTORS = {
'still': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[last()]/text()'
},
'fresh': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[1]/text()'
},
'prerelease': {
'URL': DOWNLOADPAGE,
'xpath': '//p[@class="lead_libre"][last()]/following-sibling::ul[last()]/li/a/text()'
},
'daily': {
'URL': DAILY,
'xpath': '//td/a'
}
}
class Base(object):
# Class for static methods which might be useful even outside the build
# scripts.
@staticmethod
def dailyurl(date = datetime.datetime.today()):
"""Returns the URL for the latest valid daily build."""
# As per other parts of the build, we need to maintain an URL also for
# x86 versions that it isn't really provided.
# As such, the return value must be a dictionary
# Generic functions
def match_xpath(url: str, xpath: str):
"""Uses a couple of extensions to get results over webpage."""
resource = requests.get(url, timeout=10)
parsed = html.fromstring(resource.content)
return parsed.xpath(xpath)
# Get the anchor for today's builds
a = etree.HTML(urllib.request.urlopen(Definitions.DAILY).read()).xpath("//td/a[contains(text(), '" + date.strftime('%Y-%m-%d') + "')]/text()")
if len(a) == 0:
# No results found, no version found, let's return a
return { 'x86': '-', 'x86_64': '-' }
# On the contrary, more than a version is found. let's order the
# list and get the latest item
return { 'x86': '-', 'x86_64': Definitions.SELECTORS['daily']['URL'] + sorted(a)[-1] }
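The tail of `dailyurl()` picks the newest build by sorting the matched anchors and taking the last one; this works because the directory names embed an ISO-style date, so lexicographic order is chronological order. A minimal sketch of that selection, with made-up anchor names (not real listings):

```python
# The dated directory names sort chronologically as plain strings,
# so sorted(...)[-1] is the most recent build. Hypothetical examples:
anchors = [
    "2023-11-30_23.18.54/",
    "2023-12-01_07.02.11/",
    "2023-12-01_21.40.02/",
]
latest = sorted(anchors)[-1]
print(latest)  # the most recent build directory
```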
# Classes
class Version():
"""Represent the skeleton of each queried version."""
@staticmethod
def dailyver(date = datetime.datetime.today()):
"""Returns versions present on the latest daily build."""
url = Base.dailyurl(date)['x86_64']
# If no daily releases has been provided yet, return empty
if url == '-':
return []
def __init__(self):
self.query = ''
self.branch = ''
self.version = ''
self.urls = {
'x86': '-',
'x86_64': '-'
}
# Rerun the page parsing, this time to find out the versions built
b = etree.HTML(urllib.request.urlopen(url).read()).xpath("//td/a[contains(text(), '_deb.tar.gz')]/text()")
# This should have returned the main package for a version, but can
# have returned multiple ones, so let's treat it as a list
return [ x.split('_')[1] for x in b ]
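The comprehension above derives version numbers from tarball names by taking the second underscore-separated field. A sketch with a hypothetical filename, following the naming pattern the code assumes:

```python
# Daily listings contain tarball names of the form
# <App>_<version>_<os>_<arch>_deb.tar.gz (filename below is hypothetical);
# splitting on '_' and taking field 1 extracts the version string.
names = ["LibreOfficeDev_24.2.0.0.alpha1_Linux_x86-64_deb.tar.gz"]
versions = [x.split('_')[1] for x in names]
print(versions)  # ['24.2.0.0.alpha1']
```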
def appname(self):
"""Determines the app name based on the query branch determined."""
datematch = re.match(r'[0-9]{8}', self.query)
retval = 'LibreOffice'
if self.query in {'prerelease', 'daily', 'current', 'yesterday'} or datematch:
retval = 'LibreOfficeDev'
@staticmethod
def namedver(query):
"""Gets the version for a specific named version."""
if query == 'daily' or query == 'yesterday':
# Daily needs double parsing for the same result to apply.
# We first select today's build anchor:
date = datetime.datetime.today()
if query == 'yesterday':
# Use yesterdays' date for testing purposes.
date += datetime.timedelta(days=-1)
return Base.dailyver(date)
# In case the query isn't for daily
return etree.HTML(urllib.request.urlopen(Definitions.SELECTORS[query]['URL']).read()).xpath(Definitions.SELECTORS[query]['xpath'])
@staticmethod
def fullversion(version):
"""Get latest full version from Archive based on partial version."""
versionlist = etree.HTML(urllib.request.urlopen(Definitions.ARCHIVE).read()).xpath(f"//td/a[starts-with(text(), '{version}')]/text()")
if versionlist:
cleanlist = sorted([ x.strip('/') for x in versionlist ])
# Sorting, then returning the last version
return cleanlist[-1]
return None
@staticmethod
def urlfromqueryandver(query, version):
"""Returns the fetching URL based on the queried version and the numeric version of it."""
# This has the purpose to simplify and explain how the releases are
# layed out.
# If the query tells about daily or 'yesterday' (for testing purposes),
# we might ignore versions and return the value coming from dailyurl:
if query == 'daily':
return Base.dailyurl()
if query == 'yesterday':
date = datetime.datetime.today() + datetime.timedelta(days=-1)
return Base.dailyurl(date)
# All other versions will be taken from Archive, as such we need a full
# version.
# If the version has only 2 points in it (or splits into three parts by '.'), that's not a full version and we will call the getlatestver() function
fullversion = str(version)
if len(fullversion.split('.')) <= 3:
fullversion = str(Base.fullversion(version))
# So the final URL is the Archive one, plus the full versions, plus a
# final '/deb/' - and an arch subfolder
baseurl = Definitions.ARCHIVE + fullversion + '/deb/'
retval = {}
# x86 binaries are not anymore offered after 6.3.0.
if parse_version(fullversion) < parse_version('6.3.0'):
retval['x86'] = baseurl + 'x86/'
else:
retval['x86'] = '-'
retval['x86_64'] = baseurl + 'x86_64/'
return retval
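The x86 cutoff above compares parsed versions via `packaging.version`, not raw strings. The ordering it relies on can be sketched with integer tuples for plain dotted versions (a simplification that ignores alpha/beta suffixes, which `parse_version` also handles):

```python
# Why parsed comparison matters: as strings, '6.10.0' < '6.3.0',
# which would wrongly offer x86 builds for newer releases.
def vtuple(version: str):
    """Turn 'x.y.z' into a tuple of ints for correct ordering."""
    return tuple(int(part) for part in version.split('.'))

assert vtuple('6.2.8') < vtuple('6.3.0')    # x86 still offered
assert vtuple('6.10.0') > vtuple('6.3.0')   # string comparison gets this wrong
```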
@staticmethod
def collectedbuilds(query):
"""Creates a list of Builds based on each queried version found."""
retval = []
if '.' in query:
# Called with a numeric query. Pass it to RemoteBuild
retval.append(RemoteBuild(query))
def cleanup_downloads(self, path, verbose=False) -> None:
"""Cleanups the downloads folder to assure new versions are built."""
search_name = self.appname() + '_' + self.version
cmd = f"find {path} -iname {search_name}\\*.tar.gz -delete"
if verbose:
subprocess.run(shlex.split(cmd))
else:
# Named query
a = Base.namedver(query)
subprocess.run(shlex.split(cmd), stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
if not a:
# a is empty
return retval
if isinstance(a, list) and len(a) > 1:
retval.extend([ RemoteBuild(query, version) for version in a ])
else:
retval.append(RemoteBuild(query))
return sorted(retval, key=lambda x: x.version)
class RemoteBuild(object):
def __init__(self, query, version = None):
"""Should simplify the single builded version."""
self.query = query
self.version = ''
self.basedirurl = { 'x86': '-', 'x86_64': '-' }
if version and isinstance(version, str):
self.version = version
if not '.' in self.query:
# Named version.
# Let's check if a specific version was requested.
if self.version == '':
# In case it was not requested, we will carry on the generic
# namedver() query.
# If the results are more than one, we'll take the latest (since we are requested to provide a single build).
a = Base.namedver(self.query)
if isinstance(a, list):
# if the number of versions is zero, return and exit
if not a:
return None
if len(a) == 1:
# version is a single one.
self.version = a[0]
else:
# In this case, we will select the latest release.
self.version = sorted(a)[-1]
# If the version has already a version, as requested by user,
# continue using that version
else:
# In case of numbered queries, put it as initial version
self.version = self.query
if len(str(self.version).split('.')) < 4:
# If not 4 dotted, let's search for the 4 dotted version
self.version = Base.fullversion(self.version)
self.basedirurl = Base.urlfromqueryandver(self.query, self.version)
def todict(self):
def to_dict(self):
"""Returns a dictionary of versions."""
return {
'query': self.query,
'version': self.version,
'basedirurl': self.basedirurl
'basedirurl': self.urls
}
def to_json(self):
"""Returns a json representation of the version."""
return json.dumps(self.to_dict())
def __str__(self):
return f"""query: {self.query}
version: {self.version}
x86: {self.basedirurl['x86']}
x86_64: {self.basedirurl['x86_64']}"""
x86: {self.urls['x86']}
x86_64: {self.urls['x86_64']}"""
class QueryError(Exception):
"""Standard exception for errors regarding queries."""
class Solver():
"""Generic solver to call others."""
def __init__(self, text: str, default_to_current = False):
self.text = text
self.branch = text
self.version = None
self.default_to_current = default_to_current
self.baseurl = ARCHIVE
def solve(self):
"""Splits the query text possibilities, calling all the rest of the solvers."""
solver = self
if self.text in { 'current', 'yesterday', 'daily' }:
solver = DailySolver(self.text, self.default_to_current)
elif self.text in { 'still', 'fresh', 'prerelease' }:
solver = NamedSolver(self.text)
elif '.' in self.text:
solver = NumberedSolver(self.text)
else:
try:
int(self.text)
solver = DailySolver(self.text, self.default_to_current)
except ValueError:
raise QueryError("The queried version does not exist.")
self.version = solver.solve()
self.baseurl = solver.baseurl
return self.version
def to_version(self):
retval = Version()
retval.query = self.text
retval.branch = self.branch
retval.version = self.version
if retval.branch != 'daily':
retval.urls['x86_64'] = self.baseurl + 'x86_64/'
try:
x86ver = match_xpath(self.baseurl + 'x86/', '//td/a/text()')
except Exception:
return retval
if len(x86ver) > 1:
retval.urls['x86'] = self.baseurl + 'x86/'
else:
retval.urls['x86_64'] = self.baseurl
return retval
@staticmethod
def parse(text: str, default_to_current = False):
"""Calling the same as solver class."""
retval = Solver(text, default_to_current)
retval.solve()
return retval.to_version()
class DailySolver(Solver):
"""Specific solver to daily queries."""
def __init__(self, text: str, default_to_current = False):
super().__init__(text, default_to_current)
self.branch = 'daily'
self.baseurl = DAILY
def solve(self):
"""Get daily urls based on query."""
x = "//td/a[starts-with(text(),'Linux-rpm_deb-x86') and contains(text(),'TDF/')]/text()"
tinderbox_segment = match_xpath(self.baseurl, x)[-1]
self.baseurl = self.baseurl + tinderbox_segment
# Reiterate now to search for the dated version
xpath_query = "//td/a/text()"
daily_set = match_xpath(self.baseurl, xpath_query)
matching = ''
today = datetime.datetime.today()
try:
int(self.text)
matching = datetime.datetime.strptime(self.text, "%Y%m%d").strftime('%Y-%m-%d')
except ValueError:
# All textual version
if self.text in { 'current', 'daily' }:
matching = 'current'
elif self.text == 'yesterday':
matching = (today + datetime.timedelta(days=-1)).strftime("%Y-%m-%d")
results = sorted([ x for x in daily_set if matching in x ])
if len(results) == 0:
# No daily versions found.
if self.default_to_current:
solver = DailySolver('current')
self.version = solver.version
self.baseurl = solver.baseurl
else:
self.baseurl = self.baseurl + results[-1]
# baseurl for x86 is not available for sure on daily builds.
xpath_string = "//td/a[contains(text(), '_deb.tar.gz')]/text()"
links = match_xpath(self.baseurl, xpath_string)
if len(links) > 0:
link = str(links[-1])
self.version = link.rsplit('/', maxsplit=1)[-1].split('_')[1]
return self.version
class NamedSolver(Solver):
"""Solves the query knowing that the input is a named query."""
def __init__(self, text: str):
super().__init__(text)
self.branch = text
self.baseurl = SELECTORS[self.text]['URL']
self.generalver = ''
def solve(self):
"""Get versions from query."""
xpath_query = SELECTORS[self.text]['xpath']
results = sorted(match_xpath(self.baseurl, xpath_query))
if len(results) > 0:
self.generalver = str(results[-1])
result: str = self.generalver
xpath_string = f"//td/a[starts-with(text(),'{result}')]/text()"
archived_versions = sorted(match_xpath(ARCHIVE, xpath_string))
if len(archived_versions) == 0:
return self.version
# Return just the last versions
fullversion: str = str(archived_versions[-1])
self.baseurl = ARCHIVE + fullversion + 'deb/'
self.version = fullversion.rstrip('/')
return self.version
class NumberedSolver(Solver):
"""Specific solver for numbered versions."""
def __init__(self, text: str):
super().__init__(text)
self.branch = '.'.join(text.split('.')[0:2])
def solve(self):
xpath_string = f"//td/a[starts-with(text(),'{self.text}')]/text()"
versions = sorted(match_xpath(self.baseurl, xpath_string))
if len(versions) == 0:
# It is possible that in the ARCHIVE there's no such version (might be a prerelease)
return self.version
version = str(versions[-1])
self.baseurl = self.baseurl + version + 'deb/'
self.version = version.rstrip('/')
return self.version
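`NumberedSolver` derives its branch from the first two dotted components of the numbered query. A standalone sketch of that slice-and-join step:

```python
# Deriving the branch ('7.6') from a full numbered query ('7.6.4.2'),
# as NumberedSolver does with the first two dotted components.
def branch_of(text: str) -> str:
    return '.'.join(text.split('.')[0:2])

print(branch_of('7.6.4.2'))  # 7.6
```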


@@ -9,21 +9,26 @@ import shutil
import re
import shlex
import tempfile
import urllib.request
from lxml import etree
import hashlib
import requests
import loaih
class Collection(list):
"""Aggregates metadata on a collection of builds."""
def __init__(self, query, arch = ['x86', 'x86_64']):
"""Build a list of version to check/build for this round."""
super().__init__()
self.extend([
Build(query, arch, version) for version in loaih.Base.collectedbuilds(query)
])
class Build(loaih.RemoteBuild):
version = loaih.Solver.parse(query)
# If a version is not buildable, discard it now!
arch = [ x for x in arch if version.urls[x] != '-' ]
self.extend([ Build(version, ar) for ar in arch ])
class Build():
"""Builds a single version."""
LANGSTD = [ 'ar', 'de', 'en-GB', 'es', 'fr', 'it', 'ja', 'ko', 'pt',
@@ -31,16 +36,17 @@ class Build(loaih.RemoteBuild):
LANGBASIC = [ 'en-GB' ]
ARCHSTD = [ 'x86', 'x86_64' ]
def __init__(self, query, arch, version = None):
super().__init__(query, version)
def __init__(self, version: loaih.Version, arch):
self.version = version
self.tidy_folder = True
self.verbose = True
self.arch = arch
self.short_version = str.join('.', self.version.split('.')[0:2])
self.branch_version = None
if not '.' in self.query:
self.branch_version = self.query
self.url = self.basedirurl
self.short_version = str.join('.', self.version.version.split('.')[0:2])
self.branch_version = self.version.branch
self.url = self.version.urls[arch]
# Other default values
# Other default values - for structured builds
# Most likely will be overridden by cli
self.language = 'basic'
self.offline_help = False
self.portable = False
@@ -54,11 +60,11 @@ class Build(loaih.RemoteBuild):
self.appnamedir = ''
# Specific build version
self.appname = 'LibreOffice'
self.appname = self.version.appname()
self.appversion = ''
self.appimagedir = ''
self.appimagefilename = {}
self.zsyncfilename = {}
self.appimagefilename = ''
self.zsyncfilename = ''
# Other variables by build
self.languagepart = '.' + self.language
@@ -67,7 +73,8 @@ class Build(loaih.RemoteBuild):
# Creating a tempfile
self.builddir = tempfile.mkdtemp()
self.tarballs = {}
self.built = { 'x86': False, 'x86_64': False }
self.found = False
self.built = False
# Preparing the default for the relative path on the storage for
# different versions.
@@ -77,23 +84,23 @@ class Build(loaih.RemoteBuild):
self.full_path = ''
self.baseurl = ''
def calculate(self):
"""Calculate exclusions and other variables."""
print("--- Calculate Phase ---")
if self.verbose:
print("--- Calculate Phase ---")
# let's check here if we are on a remote repo or local.
if self.storage_path.startswith("http"):
# Final repository is remote
self.repo_type = 'remote'
print("Repo is remote.")
if self.verbose:
print("Repo is remote.")
else:
self.repo_type = 'local'
print("Repo is local.")
# AppName
if self.query in { 'prerelease', 'daily' }:
self.appname = 'LibreOfficeDev'
if self.verbose:
print("Repo is local.")
# Calculating languagepart
self.languagepart = "."
@@ -107,32 +114,271 @@ class Build(loaih.RemoteBuild):
self.helppart = '.help'
# Building the required names
for arch in Build.ARCHSTD:
self.appimagefilename[arch] = self.__gen_appimagefilename__(self.version, arch)
self.zsyncfilename[arch] = self.appimagefilename[arch] + '.zsync'
self.appimagefilename = self.__gen_appimagefilename__()
self.zsyncfilename = self.appimagefilename + '.zsync'
# Mandate to the private function to calculate the full_path available
# for the storage and the checks.
self.__calculate_full_path__()
def __gen_appimagefilename__(self, version, arch):
def check(self):
"""Checking if the requested AppImage has been already built."""
if self.branch_version == 'daily':
# Daily versions have to be downloaded and built each time; no
# matter if another one is already present.
return
if self.verbose:
print("--- Check Phase ---")
if len(self.appimagefilename) == 0:
self.calculate()
if self.verbose:
print(f"Searching for {self.appimagefilename}")
# First, check if by metadata the repo is remote or not.
if self.repo_type == 'remote':
# Remote storage. I have to query a remote site to know if it
# was already built.
name = self.appimagefilename
url = self.storage_path.rstrip('/') + self.full_path + '/'
matching = []
try:
if len(loaih.match_xpath(url, f"//a[contains(@href,'{name}')]/@href")) > 0:
# Already built.
self.found = True
except Exception:
# The URL specified do not exist. So it is to build.
self.found = False
else:
# Repo is local
command = f"find {self.full_path} -name {self.appimagefilename}"
res = subprocess.run(shlex.split(command),
capture_output=True,
env={ "LC_ALL": "C" },
text=True, encoding='utf-8', check=True)
if res.stdout and len(res.stdout.strip("\n")) > 0:
# All good, the command was executed fine.
self.found = True
if self.found:
if self.verbose:
print(f"Found requested AppImage: {self.appimagefilename}.")
def download(self):
"""Downloads the contents of the URL as it was a folder."""
if self.verbose:
print("--- Download Phase ---")
if self.found:
return
if self.verbose:
print(f"Started downloads for {self.version.version}. Please wait.")
# Checking if a valid path has been provided
if self.url == '-':
if self.verbose:
print(f"Cannot build for arch {self.arch}. Continuing with other arches.")
# Faking already built it so to skip other checks.
self.found = True
# Identifying downloads
self.tarballs = [ x for x in loaih.match_xpath(self.url, "//td/a/text()") if x.endswith('tar.gz') and 'deb' in x ]
# Create and change directory to the download location
os.makedirs(self.download_path, exist_ok = True)
os.chdir(self.download_path)
for archive in self.tarballs:
# If the archive is already there, do not do anything.
if os.path.exists(archive):
continue
# Download the archive
try:
self.__download_archive__(archive)
except Exception as error:
print(f"Failed to download {archive}: {error}.")
if self.verbose:
print(f"Finished downloads for {self.version.version}.")
def build(self):
"""Building all the versions."""
if self.verbose:
print("--- Building Phase ---")
if self.found:
return
# Preparation tasks
self.appnamedir = os.path.join(self.builddir, self.appname)
os.makedirs(self.appnamedir, exist_ok=True)
# And then cd to the appname folder.
os.chdir(self.appnamedir)
# Download appimagetool from github
appimagetoolurl = r"https://github.com/AppImage/AppImageKit/releases/"
appimagetoolurl += f"download/continuous/appimagetool-{self.arch}.AppImage"
self.__download__(appimagetoolurl, 'appimagetool')
os.chmod('appimagetool', 0o755)
# Build the requested version.
self.__unpackbuild__()
def checksums(self):
"""Create checksums of the built versions."""
# Skip checksum if initally the build was already found in the storage directory
if self.verbose:
print("--- Checksum Phase ---")
if self.found:
return
os.chdir(self.appnamedir)
if self.built:
for item in [ self.appimagefilename, self.zsyncfilename ]:
itempath = os.path.join(self.appnamedir, item)
if os.path.exists(itempath):
self.__create_checksum__(item)
def publish(self):
"""Moves built versions to definitive storage."""
if self.verbose:
print("--- Publish Phase ---")
if self.found:
# All files are already present in the full_path
return
os.chdir(self.appnamedir)
# Two cases here: local and remote storage_path.
if self.repo_type == 'remote':
# Remote first.
# Build destination directory
remotepath = self.remote_path.rstrip('/') + self.full_path
try:
if self.verbose:
subprocess.run(
r"rsync -rlIvz --munge-links *.AppImage* " +
f"{self.remote_host}:{remotepath}",
cwd=self.appnamedir, shell=True, check=True
)
else:
subprocess.run(
r"rsync -rlIvz --munge-links *.AppImage* " +
f"{self.remote_host}:{remotepath}",
cwd=self.appnamedir, shell=True, check=True,
stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL
)
finally:
pass
else:
# Local
# Forcing creation of subfolders, in case there is a new build
os.makedirs(self.full_path, exist_ok = True)
for file in glob.glob("*.AppImage*"):
subprocess.run(shlex.split(
f"cp -f {file} {self.full_path}"
), check=True)
def generalize_and_link(self, chdir = 'default'):
"""Creates the needed generalized files if needed."""
if self.verbose:
print("--- Generalize and Link Phase ---")
# If called with a pointed version, no generalize and link necessary.
if not self.branch_version:
return
# If a prerelease or a daily version, either.
if self.version.query in { 'daily', 'prerelease' }:
return
if chdir == 'default':
chdir = self.full_path
appimagefilename = r''
zsyncfilename = r''
# Creating versions for short version and query text
versions = [ self.short_version, self.branch_version ]
os.chdir(chdir)
# if the appimage for the reported arch is not found, skip to next
# arch
if not os.path.exists(self.appimagefilename):
return
# Doing it both for short_name and for branchname
for version in versions:
appimagefilename = f"{self.appname}-{version}"
appimagefilename += f"{self.languagepart}{self.helppart}"
appimagefilename += f'-{self.arch}.AppImage'
zsyncfilename = appimagefilename + '.zsync'
# Create the symlink
if self.verbose:
print(f"Creating {appimagefilename} and checksums.")
if os.path.exists(appimagefilename):
os.unlink(appimagefilename)
os.symlink(self.appimagefilename, appimagefilename)
# Create the checksum for the AppImage
self.__create_checksum__(appimagefilename)
# Do not continue if no zsync file is provided.
if not self.updatable:
continue
if self.verbose:
print(f"Creating zsync file for version {version}.")
if os.path.exists(zsyncfilename):
os.unlink(zsyncfilename)
shutil.copyfile(self.zsyncfilename, zsyncfilename)
# Editing the zsyncfile
subprocess.run(shlex.split(
r"sed --in-place 's/^Filename:.*$/Filename: " +
f"{appimagefilename}/' {zsyncfilename}"
), check=True)
self.__create_checksum__(zsyncfilename)
### Private methods ###
def __gen_appimagefilename__(self):
"""Generalize the construction of the name of the app."""
self.appversion = version + self.languagepart + self.helppart
return self.appname + f'-{self.appversion}-{arch}.AppImage'
self.appversion = self.version.version + self.languagepart + self.helppart
return self.appname + f'-{self.appversion}-{self.arch}.AppImage'
def __calculate_full_path__(self):
"""Calculate relative path of the build, based on internal other variables."""
if len(self.relative_path) == 0:
if self.query == 'daily':
self.relative_path.append('daily')
elif self.query == 'prerelease':
self.relative_path.append('prerelease')
if self.tidy_folder:
if self.branch_version == 'daily':
self.relative_path.append('daily')
elif self.branch_version == 'prerelease':
self.relative_path.append('prerelease')
# Not the same check, an additional one
if self.portable:
self.relative_path.append('portable')
# Not the same check, an additional one
if self.portable:
self.relative_path.append('portable')
# Fullpath might be intended two ways:
if self.repo_type == 'remote':
@@ -151,170 +397,91 @@ class Build(loaih.RemoteBuild):
fullpath_arr.extend(self.relative_path)
self.full_path = re.sub(r"/+", '/', str.join('/', fullpath_arr))
if not os.path.exists(self.full_path):
os.makedirs(self.full_path, exist_ok = True)
    def check(self):
        """Checking if the requested AppImage has been already built."""
        print("--- Check Phase ---")
        if len(self.appimagefilename) != 2:
            self.calculate()

        for arch in self.arch:
            print(f"Searching for {self.appimagefilename[arch]}")
            # First, check if by metadata the repo is remote or not.
            if self.repo_type == 'remote':
                # Remote storage. I have to query a remote site to know if it
                # was already built.
                name = self.appimagefilename[arch]
                url = self.storage_path.rstrip('/') + self.full_path + '/'
                matching = []
                try:
                    with urllib.request.urlopen(url) as response:
                        matching = etree.HTML(response.read()).xpath(
                            f"//a[contains(@href, '{name}')]/@href"
                        )
                    if len(matching) > 0:
                        # Already built.
                        self.built[arch] = True
                except urllib.error.HTTPError:
                    # The specified URL does not exist, so the version has to
                    # be built.
                    pass
            else:
                # Repo is local
                command = f"find {self.full_path} -name {self.appimagefilename[arch]}"
                res = subprocess.run(shlex.split(command),
                    capture_output=True,
                    env={ "LC_ALL": "C" },
                    text=True, encoding='utf-8', check=True)

                if "No such file or directory" in res.stderr:
                    # Folder does not exist, so the version was not built.
                    # Built stays false, and we go to the next arch.
                    continue

                if res.stdout and len(res.stdout.strip("\n")) > 0:
                    # All good, the command was executed fine.
                    self.built[arch] = True

            if self.built[arch]:
                print(f"Found requested AppImage: {self.appimagefilename[arch]}.")

    def __create_checksum__(self, file):
        """Internal function to create checksum file."""
        retval = hashlib.md5()
        with open(file, 'rb') as rawfile:
            while True:
                buf = rawfile.read(2**20)
                if not buf:
                    break
                retval.update(buf)

        with open(f"{file}.md5", 'w', encoding='utf-8') as checkfile:
            checkfile.write(f"{retval.hexdigest()} {os.path.basename(file)}")

    def __download_archive__(self, archive):
        return self.__download__(self.url, archive)

    def __download__(self, url: str, filename: str):
        basename = filename
        if '/' in filename:
            basename = filename.split('/')[-1]

        full_url = url
        if url.endswith('/'):
            # URL has to be completed with basename of filename
            full_url = url + basename

        with requests.get(full_url, stream=True, timeout=10) as resource:
            resource.raise_for_status()
            with open(filename, 'wb') as file:
                for chunk in resource.iter_content(chunk_size=8192):
                    file.write(chunk)
        return filename
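The chunked-read hashing used by `__create_checksum__` can be exercised on its own. A minimal stand-alone sketch (the function names here are illustrative, not part of the module):

```python
import hashlib
import os

def md5_file(path: str, chunk_size: int = 2**20) -> str:
    """Hash a file in 1 MiB chunks so large AppImages never sit fully in memory."""
    digest = hashlib.md5()
    with open(path, 'rb') as fh:
        while True:
            buf = fh.read(chunk_size)
            if not buf:
                break
            digest.update(buf)
    return digest.hexdigest()

def write_md5_file(path: str) -> str:
    """Write an md5sum-style companion file: '<hexdigest> <basename>'."""
    line = f"{md5_file(path)} {os.path.basename(path)}"
    with open(path + '.md5', 'w', encoding='utf-8') as out:
        out.write(line)
    return line
```

Note that GNU `md5sum -c` may expect a two-space separator between digest and file name; the single space above mirrors the code in the diff.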
def download(self):
        """Downloads the contents of the URL as if it were a folder."""
print("--- Download Phase ---")
print(f"Started downloads for {self.version}. Please wait.")
for arch in self.arch:
# Checking if a valid path has been provided
if self.url[arch] == '-':
print(f"Cannot build for arch {arch}. Continuing with other arches.")
# Faking already built it so to skip other checks.
self.built[arch] = True
continue
if self.built[arch]:
print(f"A build for {arch} was already found. Skipping specific packages.")
continue
# Identifying downloads
contents = []
with urllib.request.urlopen(self.url[arch]) as url:
contents = etree.HTML(url.read()).xpath("//td/a")
self.tarballs[arch] = [ x.text
for x in contents
if x.text.endswith('tar.gz') and 'deb' in x.text
]
tarballs = self.tarballs[arch]
# Create and change directory to the download location
os.makedirs(self.download_path, exist_ok = True)
os.chdir(self.download_path)
for archive in tarballs:
# If the archive is already there, do not do anything.
if os.path.exists(archive):
continue
# Download the archive
try:
urllib.request.urlretrieve(self.url[arch] + archive, archive)
except Exception as error:
print(f"Failed to download {archive}: {error}.")
print(f"Finished downloads for {self.version}.")
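The download list above is scraped from an HTTP autoindex page via lxml's `//td/a` XPath and then filtered for `deb` tarballs. The same filtering can be sketched with only the standard library; the page snippet and file names below are made up for illustration:

```python
from html.parser import HTMLParser

class AnchorText(HTMLParser):
    """Collect the text of <a> elements inside <td> cells
    (stdlib stand-in for the lxml XPath '//td/a' used above)."""
    def __init__(self):
        super().__init__()
        self.in_td = False
        self.in_a = False
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == 'td':
            self.in_td = True
        elif tag == 'a' and self.in_td:
            self.in_a = True
    def handle_endtag(self, tag):
        if tag == 'td':
            self.in_td = False
        elif tag == 'a':
            self.in_a = False
    def handle_data(self, data):
        if self.in_a and data.strip():
            self.links.append(data.strip())

# Hypothetical autoindex fragment, shaped like the pages the downloader reads.
html = """<table>
<tr><td><a href="#">LibreOffice_24.2.0_Linux_x86-64_deb.tar.gz</a></td></tr>
<tr><td><a href="#">LibreOffice_24.2.0_Linux_x86-64_deb_langpack_it.tar.gz</a></td></tr>
<tr><td><a href="#">LibreOffice_24.2.0_Linux_x86-64_rpm.tar.gz</a></td></tr>
</table>"""

parser = AnchorText()
parser.feed(html)
# Same predicate as the code above: gzip tarballs of the deb packaging.
tarballs = [x for x in parser.links if x.endswith('tar.gz') and 'deb' in x]
```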
def build(self):
"""Building all the versions."""
print("--- Building Phase ---")
for arch in self.arch:
if self.built[arch]:
# Already built for arch or path not available. User has already been warned.
continue
# Preparation tasks
self.appnamedir = os.path.join(self.builddir, self.appname)
os.makedirs(self.appnamedir, exist_ok=True)
# And then cd to the appname folder.
os.chdir(self.appnamedir)
# Download appimagetool from github
appimagetoolurl = f"https://github.com/AppImage/AppImageKit/releases/download/continuous/appimagetool-{arch}.AppImage"
urllib.request.urlretrieve(appimagetoolurl, 'appimagetool')
os.chmod('appimagetool', 0o755)
# Build the requested version.
self.__unpackbuild__(arch)
    def __unpackbuild__(self):
        # We start by filtering out tarballs from the list
        buildtarballs = [ self.tarballs[0] ]

        # Let's process standard languages and append results to the
        # buildtarball
        if self.language == 'basic':
            if self.offline_help:
                buildtarballs.extend([ x for x in self.tarballs if 'pack_en-GB' in x ])
            else:
                buildtarballs.extend([ x for x in self.tarballs if 'langpack_en-GB' in x ])
        elif self.language == 'standard':
            for lang in Build.LANGSTD:
                if self.offline_help:
                    buildtarballs.extend([ x for x in self.tarballs if 'pack_' + lang in x ])
                else:
                    buildtarballs.extend([ x for x in self.tarballs if 'langpack_' + lang in x ])
        elif self.language == 'full':
            if self.offline_help:
                # We also need all the help. Let's replace buildtarballs with
                # the whole bunch
                buildtarballs = self.tarballs
            else:
                buildtarballs.extend([ x for x in self.tarballs if 'langpack' in x ])
        else:
            # Looping over each language in self.language
            for lang in self.language.split(","):
                if self.offline_help:
                    buildtarballs.extend([ x for x in self.tarballs
                        if 'pack' + lang in x ])
                else:
                    buildtarballs.extend([ x for x in self.tarballs
                        if 'langpack' + lang in x ])
os.chdir(self.appnamedir)
# Unpacking the tarballs
if self.verbose:
print("---- Unpacking ----")
for archive in buildtarballs:
subprocess.run(shlex.split(
f"tar xzf {self.download_path}/{archive}"), check=True)
# create appimagedir
if self.verbose:
print("---- Preparing the build ----")
self.appimagedir = os.path.join(self.builddir, self.appname, self.appname + '.AppDir')
os.makedirs(self.appimagedir, exist_ok = True)
@@ -369,9 +536,9 @@ class Build(loaih.RemoteBuild):
        # Download AppRun from github
        apprunurl = r"https://github.com/AppImage/AppImageKit/releases/"
        apprunurl += f"download/continuous/AppRun-{self.arch}"
        dest = os.path.join(self.appimagedir, 'AppRun')
        self.__download__(apprunurl, dest)
        os.chmod(dest, 0o755)
# Dealing with extra options
@@ -381,16 +548,27 @@ class Build(loaih.RemoteBuild):
        # adding zsync build if updatable
        if self.updatable:
            buildopts.append(f"-u 'zsync|{self.zsyncfilename}'")

        buildopts_str = str.join(' ', buildopts)

        # Build the number-specific build
        if self.verbose:
            print("---- Start building ----")
            subprocess.run(shlex.split(
                f"{self.appnamedir}/appimagetool {buildopts_str} -v " +
                f"./{self.appname}.AppDir/"
            ), env={ "VERSION": self.appversion }, check=True)
            print("---- End building ----")
        else:
            subprocess.run(shlex.split(
                f"{self.appnamedir}/appimagetool {buildopts_str} -v " +
                f"./{self.appname}.AppDir/"
            ), env={ "VERSION": self.appversion }, stdout=subprocess.DEVNULL,
                stderr=subprocess.DEVNULL, check=True)

        if self.verbose:
            print(f"Built AppImage version {self.appversion}")
# Cleanup phase, before new run.
for deb in glob.glob(self.appnamedir + '/*.deb'):
@@ -399,135 +577,7 @@ class Build(loaih.RemoteBuild):
r"find . -mindepth 1 -maxdepth 1 -type d -exec rm -rf {} \+"
), check=True)
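The language-selection cascade in `__unpackbuild__` is easier to follow in isolation. Below is a compact stand-in (the tarball list and helper function are hypothetical, and the `'standard'` branch is omitted since it just repeats the `'basic'` logic over `Build.LANGSTD`). Note this sketch matches on `'pack_' + lang` with an underscore, where the custom-language branch in the diff concatenates `'pack' + lang` directly:

```python
# Hypothetical upstream tarball listing for one architecture.
tarballs = [
    "LibreOffice_24.2.0_Linux_x86-64_deb.tar.gz",
    "LibreOffice_24.2.0_Linux_x86-64_deb_langpack_it.tar.gz",
    "LibreOffice_24.2.0_Linux_x86-64_deb_langpack_en-GB.tar.gz",
    "LibreOffice_24.2.0_Linux_x86-64_deb_helppack_en-GB.tar.gz",
]

def select_tarballs(tarballs, language='basic', offline_help=False):
    """Pick the main archive plus the requested language/help packs.

    'pack_' matches both langpack_ and helppack_ entries, which is how
    the offline-help variants pull in the help archives too.
    """
    selected = [tarballs[0]]
    if language == 'basic':
        token = 'pack_en-GB' if offline_help else 'langpack_en-GB'
        selected.extend(x for x in tarballs[1:] if token in x)
    elif language == 'full':
        if offline_help:
            return list(tarballs)
        selected.extend(x for x in tarballs[1:] if 'langpack' in x)
    else:
        # Custom comma-separated list of languages.
        for lang in language.split(','):
            token = ('pack_' if offline_help else 'langpack_') + lang
            selected.extend(x for x in tarballs[1:] if token in x)
    return selected
```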
def checksums(self):
"""Create checksums of the built versions."""
        # Skip checksums if the build was initially already found in the storage directory
print("--- Checksum Phase ---")
if all(self.built[arch] for arch in self.arch):
return
os.chdir(self.appnamedir)
for arch in self.arch:
            if not self.built[arch]:
                # The opposite case: a newly built package has not yet been
                # marked as built.
for item in [ self.appimagefilename[arch], self.zsyncfilename[arch] ]:
itempath = os.path.join(self.appnamedir, item)
if os.path.exists(itempath):
# For any built arch, find out if a file exist.
self.__create_checksum__(item)
def __create_checksum__(self, file):
"""Internal function to create checksum file."""
checksum = subprocess.run(f"md5sum {file}", shell=True,
capture_output=True, text=True, encoding='utf-8', check=True,
cwd=self.appnamedir)
if checksum.stdout:
with open(f"{file}.md5", 'w', encoding='utf-8') as checkfile:
checkfile.write(checksum.stdout)
def publish(self):
"""Moves built versions to definitive storage."""
print("--- Publish Phase ---")
if all(self.built[arch] for arch in self.arch):
# All files are already present in the full_path
return
os.chdir(self.appnamedir)
# Two cases here: local and remote storage_path.
if self.repo_type == 'remote':
# Remote first.
# Build destination directory
remotepath = self.remote_path.rstrip('/') + self.full_path
try:
subprocess.run(
r"rsync -rlIvz --munge-links *.AppImage* " +
f"{self.remote_host}:{remotepath}",
cwd=self.appnamedir, shell=True, check=True
)
finally:
pass
else:
# Local
# Forcing creation of subfolders, in case there is a new build
os.makedirs(self.full_path, exist_ok = True)
for file in glob.glob("*.AppImage*"):
subprocess.run(shlex.split(
f"cp -f {file} {self.full_path}"
), check=True)
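The local branch of `publish()` shells out to `cp -f` for every `*.AppImage*` artifact. The same move can be sketched in pure Python (directory names are placeholders):

```python
import glob
import os
import shutil

def publish_local(srcdir: str, destdir: str) -> list:
    """Copy every *.AppImage* artifact (image, .md5, .zsync) into storage,
    creating the destination tree if it does not exist yet."""
    os.makedirs(destdir, exist_ok=True)
    copied = []
    for path in glob.glob(os.path.join(srcdir, '*.AppImage*')):
        shutil.copy2(path, destdir)
        copied.append(os.path.basename(path))
    return sorted(copied)
```

`shutil.copy2` also preserves timestamps and permission bits, which matters here since AppImages must stay executable.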
def generalize_and_link(self, chdir = 'default'):
        """Creates the generalized files, if needed."""
print("--- Generalize and Link Phase ---")
        # If called with a pinned version, no generalize-and-link is necessary.
        if not self.branch_version:
            return
        # Neither is it for a prerelease or a daily version.
        if self.query in { 'daily', 'prerelease' }:
            return
if chdir == 'default':
chdir = self.full_path
appimagefilename = {}
zsyncfilename = {}
# Creating versions for short version and query text
versions = [ self.short_version, self.branch_version ]
for arch in Build.ARCHSTD:
# If already built, do not do anything.
if self.built[arch]:
continue
os.chdir(chdir)
# if the appimage for the reported arch is not found, skip to next
# arch
if not os.path.exists(self.appimagefilename[arch]):
continue
# Doing it both for short_name and for branchname
for version in versions:
appimagefilename[arch] = self.appname + '-' + version
appimagefilename[arch] += self.languagepart + self.helppart
appimagefilename[arch] += f'-{arch}.AppImage'
zsyncfilename[arch] = appimagefilename[arch] + '.zsync'
# Create the symlink
print(f"Creating {appimagefilename[arch]} and checksums.")
if os.path.exists(appimagefilename[arch]):
os.unlink(appimagefilename[arch])
os.symlink(self.appimagefilename[arch], appimagefilename[arch])
# Create the checksum for the AppImage
self.__create_checksum__(appimagefilename[arch])
# Do not continue if no zsync file is provided.
if not self.updatable:
continue
print(f"Creating zsync file for version {version}.")
if os.path.exists(zsyncfilename[arch]):
os.unlink(zsyncfilename[arch])
shutil.copyfile(self.zsyncfilename[arch], zsyncfilename[arch])
# Editing the zsyncfile
subprocess.run(shlex.split(
r"sed --in-place 's/^Filename:.*$/Filename: " +
f"{appimagefilename[arch]}/' {zsyncfilename[arch]}"
), check=True)
self.__create_checksum__(zsyncfilename[arch])
self.built = True
def __del__(self):
"""Destructor"""
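The `sed --in-place` call above rewrites the `Filename:` header of the copied `.zsync` control file so zsync2 clients resolve the generalized name. A pure-Python sketch of the same edit (treating the file as bytes, since real zsync control files carry a binary checksum payload after the text headers; the function name is illustrative):

```python
import re

def point_zsync_at(zsync_path: str, new_filename: str) -> None:
    """Rewrite the first 'Filename:' header line of a zsync control file."""
    with open(zsync_path, 'rb') as fh:
        data = fh.read()
    # (?m) makes ^/$ match per line; count=1 touches only the header.
    data = re.sub(rb'(?m)^Filename:.*$',
                  b'Filename: ' + new_filename.encode('utf-8'),
                  data, count=1)
    with open(zsync_path, 'wb') as fh:
        fh.write(data)
```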


@@ -2,24 +2,29 @@
# encoding: utf-8
"""Helps with command line commands."""
import os
import shutil
import sys
import json
import click
import yaml
import loaih
import loaih.build
@click.group()
def cli():
"""Helps with command line commands."""
@cli.command()
@click.option('-j', '--json', 'jsonout', default=False, is_flag=True,
    help="Output format in json.")
@click.option('--default-to-current', '-d', is_flag=True, default=False,
    help="If no versions are found, default to current one (for daily builds). Default: do not default to current.")
@click.argument('query')
def getversion(query, jsonout, default_to_current):
    """Get download information for named or numbered versions."""
    batchlist = []
    queries = []
    if ',' in query:
        queries.extend(query.split(','))
@@ -27,122 +32,186 @@ def getversion(query, jsonout):
    else:
        queries.append(query)

    for singlequery in queries:
        elem = loaih.Solver.parse(singlequery, default_to_current)
        if elem.version not in { None, "" }:
            batchlist.append(elem)

    if len(batchlist) > 0:
        if jsonout:
            click.echo(json.dumps([x.to_dict() for x in batchlist]))
        else:
            for value in batchlist:
                click.echo(value)
@cli.command()
@click.option('-a', '--arch', 'arch', default='x86_64',
    type=click.Choice(['x86', 'x86_64', 'all'], case_sensitive=False),
    help="Build the AppImage for a specific architecture. Default: x86_64")
@click.option('--check', '-c', is_flag=True, default=False,
    help="Checks in the repository path if the queried version is existent. Default: do not check")
@click.option('--checksums', '-e', is_flag=True, default=False,
    help="Create checksums for each created file (AppImage). Default: do not create checksums.")
@click.option('--keep-downloads', '-k', 'keep', is_flag=True, default=False,
    help="Keep the downloads folder after building the AppImage. Default: do not keep.")
@click.option('--languages', '-l', 'language', default='basic', type=str,
    help="Languages to be included. Options: basic, standard, full, a language string (e.g. 'it') or a comma-separated list of languages (e.g. 'en-US,en-GB,it'). Default: basic")
@click.option('--offline-help', '-o', 'offline', is_flag=True, default=False,
    help="Include the offline help pages for the chosen languages. Default: no offline help")
@click.option('--portable', '-p', 'portable', is_flag=True, default=False,
    help="Create a portable version of the AppImage. Default: no portable")
@click.option('--sign', '-s', is_flag=True, default=False,
    help="Sign the build with your default GPG key. Default: do not sign")
@click.option('--updatable', '-u', is_flag=True, default=False,
    help="Create an updatable AppImage (compatible with zsync2). Default: not updatable")
@click.option('--download-path', '-d', default='./downloads', type=str,
    help="Path to the download folder. Default: ./downloads")
@click.option('--repo-path', '-r', default='.', type=str,
    help="Path to the final storage of the AppImage. Default: current directory")
@click.argument('query')
def build(arch, language, offline, portable, updatable, download_path, repo_path, check, checksums, sign, keep, query):
"""Builds an Appimage with the provided options."""
# Multiple query support
queries = []
if ',' in query:
queries.extend(query.split(','))
else:
queries.append(query)
# Parsing options
    arches = []
    if arch.lower() == 'all':
        # We need to build it twice.
        arches = ['x86', 'x86_64']
    else:
        arches = [arch.lower()]
if query.endswith('.yml') or query.endswith('.yaml'):
# This is a buildfile. So we have to load the file and pass the build options ourselves.
config = {}
with open(query, 'r', encoding= 'utf-8') as file:
config = yaml.safe_load(file)
# Other more global variables
repopath = os.path.abspath(repo_path)
if not os.path.exists(repopath):
os.makedirs(repopath, exist_ok=True)
downloadpath = os.path.abspath(download_path)
if not os.path.exists(downloadpath):
os.makedirs(downloadpath, exist_ok=True)
        # With the config file, we ignore all the command line options and set
        # generic defaults.
for cbuild in config['builds']:
# Loop a run for each build.
collection = loaih.build.Collection(cbuild['query'], arches)
for obj in collection:
# Configuration phase
obj.language = cbuild['language']
obj.offline_help = cbuild['offline_help']
obj.portable = cbuild['portable']
obj.updatable = True
obj.storage_path = "/srv/http/appimage.sys42.eu"
if 'repo' in config['data'] and config['data']['repo']:
obj.storage_path = config['data']['repo']
obj.download_path = "/var/tmp/downloads"
if 'download' in config['data'] and config['data']['download']:
obj.download_path = config['data']['download']
if 'http' in obj.storage_path:
obj.remoterepo = True
obj.remote_host = "ciccio.libreitalia.org"
if 'remote_host' in config['data'] and config['data']['remote_host']:
obj.remote_host = config['data']['remote_host']
obj.remote_path = "/var/lib/nethserver/vhost/appimages"
if 'remote_path' in config['data'] and config['data']['remote_path']:
obj.remote_path = config['data']['remote_path']
if 'sign' in config['data'] and config['data']['sign']:
obj.sign = True
# Build phase
obj.calculate()
if not 'force' in config['data'] or not config['data']['force']:
obj.check()
obj.download()
obj.build()
obj.checksums()
if obj.remoterepo and obj.appnamedir:
obj.generalize_and_link(obj.appnamedir)
obj.publish()
if not obj.remoterepo:
obj.generalize_and_link()
del obj
    else:
        for myquery in queries:
            for appbuild in loaih.build.Collection(myquery, arches):
                # Configuration phase
                appbuild.tidy_folder = False
                appbuild.language = language
                appbuild.offline_help = offline
                appbuild.portable = portable
                appbuild.updatable = updatable
                appbuild.storage_path = repopath
                appbuild.download_path = downloadpath
                appbuild.sign = sign

                # Running phase
                appbuild.calculate()
                if check:
                    appbuild.check()

                appbuild.download()
                appbuild.build()
                if checksums:
                    appbuild.checksums()
                appbuild.publish()

                del appbuild

    if not keep:
        shutil.rmtree(downloadpath)
@cli.command()
@click.option("--verbose", '-v', is_flag=True, default=False, help="Show building phases.", show_default=True)
@click.argument("yamlfile")
def batch(yamlfile, verbose):
"""Builds a collection of AppImages based on YAML file."""
    # Defaults for batch building differ quite a bit from manual builds.
    # To reflect this behaviour, the commands are split between batch (bulk
    # creation) and build (manual building).
# Check if yamlfile exists.
if not os.path.exists(os.path.abspath(yamlfile)):
        click.echo(f"YAML file {yamlfile} does not exist or is unreadable.")
sys.exit(1)
# This is a buildfile. So we have to load the file and pass the build
# options ourselves.
config = {}
with open(os.path.abspath(yamlfile), 'r', encoding='utf-8') as file:
config = yaml.safe_load(file)
# Globals for yamlfile
gvars = {}
gvars['download_path'] = "/var/tmp/downloads"
if 'download' in config['data'] and config['data']['download']:
gvars['download_path'] = config['data']['download']
gvars['force'] = False
if 'force' in config['data'] and config['data']['force']:
gvars['force'] = config['data']['force']
gvars['storage_path'] = "/srv/http/appimage"
if 'repo' in config['data'] and config['data']['repo']:
gvars['storage_path'] = config['data']['repo']
gvars['remoterepo'] = False
gvars['remote_host'] = ''
gvars['remote_path'] = "/srv/http/appimage"
if 'http' in gvars['storage_path']:
gvars['remoterepo'] = True
gvars['remote_host'] = "ciccio.libreitalia.org"
if 'remote_host' in config['data'] and config['data']['remote_host']:
gvars['remote_host'] = config['data']['remote_host']
if 'remote_path' in config['data'] and config['data']['remote_path']:
gvars['remote_path'] = config['data']['remote_path']
gvars['sign'] = False
if 'sign' in config['data'] and config['data']['sign']:
gvars['sign'] = True
    # With the config file, we ignore all the command line options and set
    # generic defaults.
for cbuild in config['builds']:
# Loop a run for each build.
collection = loaih.build.Collection(cbuild['query'])
for obj in collection:
# Configuration phase
obj.verbose = verbose
obj.language = 'basic'
if 'language' in cbuild and cbuild['language']:
obj.language = cbuild['language']
obj.offline_help = False
if 'offline_help' in cbuild and cbuild['offline_help']:
obj.offline_help = cbuild['offline_help']
obj.portable = False
if 'portable' in cbuild and cbuild['portable']:
obj.portable = cbuild['portable']
obj.updatable = True
obj.storage_path = gvars['storage_path']
obj.download_path = gvars['download_path']
obj.remoterepo = gvars['remoterepo']
obj.remote_host = gvars['remote_host']
obj.remote_path = gvars['remote_path']
obj.sign = gvars['sign']
# Build phase
obj.calculate()
if not gvars['force']:
obj.check()
obj.download()
obj.build()
obj.checksums()
if obj.remoterepo and obj.appnamedir:
obj.generalize_and_link(obj.appnamedir)
obj.publish()
            if not obj.remoterepo:
                obj.generalize_and_link()
del obj
    # In case prerelease or daily branches are used, clean up the download
    # folder after the complete run has finished (to make sure the next run
    # will redownload all the needed files and is indeed fresh). We sweep
    # all the builds inside a collection to work out which files to delete.
for cbuild in config['builds']:
# Loop a run for each build.
for build in loaih.build.Collection(cbuild['query']):
if build.version.branch in {'prerelease', 'daily'}:
build.version.cleanup_downloads(gvars['download_path'], verbose)
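The `gvars` block in `batch()` follows a repeated pattern: set a hard default, then let `config['data']` override it. A condensed sketch of the same derivation, where the dict literal stands in for `yaml.safe_load()` output and the URL/host values are placeholders. One deliberate deviation: this sketch uses `startswith('http')` to detect a remote repo, where the code above tests substring containment (which would also match a local path like `/srv/http/appimage`):

```python
# Stand-in for yaml.safe_load() of a batch buildfile; values are placeholders.
config = {
    'data': {
        'repo': 'https://appimage.example.org/storage',  # placeholder URL
        'download': '/var/tmp/downloads',
        'remote_host': 'appimages.example.org',          # placeholder host
        'sign': True,
    },
    'builds': [
        {'query': 'still', 'language': 'standard', 'offline_help': True},
    ],
}

def derive_globals(config):
    """Apply the batch() defaults, then let config['data'] override them."""
    data = config.get('data', {})
    gvars = {
        'download_path': data.get('download') or '/var/tmp/downloads',
        'force': bool(data.get('force')),
        'storage_path': data.get('repo') or '/srv/http/appimage',
        'remoterepo': False,
        'remote_host': '',
        'remote_path': data.get('remote_path') or '/srv/http/appimage',
        'sign': bool(data.get('sign')),
    }
    if gvars['storage_path'].startswith('http'):
        # Remote repository: publishing goes through rsync to remote_host.
        gvars['remoterepo'] = True
        gvars['remote_host'] = data.get('remote_host') or 'ciccio.libreitalia.org'
    return gvars
```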


@@ -3,12 +3,18 @@
# vim:sts=4:sw=4
"""Helps with building and automating LibreOffice AppImage builds."""
from pathlib import Path
from setuptools import setup, find_packages
this_directory = Path(__file__).parent
long_description = (this_directory / "README.md").read_text()
setup(
name="loaih",
    version="1.3.3",
description="LOAIH - LibreOffice AppImage Helpers, help build a LibreOffice AppImage",
long_description=long_description,
long_description_content_type='text/markdown',
author="Emiliano Vavassori",
author_email="syntaxerrormmm@libreoffice.org",
packages=find_packages(exclude=['contrib', 'docs', 'tests']),
@@ -17,7 +23,7 @@ setup(
'loaih = loaih.script:cli',
],
},
    install_requires=['click', 'lxml', 'packaging', 'pyyaml', 'requests'],
license='MIT',
    url='https://git.libreitalia.org/LibreItalia/loaih/',
)