Compare commits

...

27 Commits
v1.3.1 ... main

Author SHA1 Message Date
emiliano.vavassori ad1ef60947 Some adjustments to Docker image. 2024-08-22 20:59:07 +02:00
emiliano.vavassori 8a9541c4f6 Changed packaging tooling to hatch.
Fixing issues with python build.
2024-08-22 20:32:35 +02:00
gabriele.ponzo 2f31897cd9 Update README.md
Corrected the English wording
2024-03-01 08:45:05 +00:00
emiliano.vavassori eff3bc67ee Fixed the README with pointers to the Wiki. 2024-01-01 18:32:36 +01:00
emiliano.vavassori 03fc28a1bf Implemented better option checking and defaults for builds. 2024-01-01 18:29:28 +01:00
emiliano.vavassori 8c4bc65184 Fixing duplication of functions in the previous commit. 2024-01-01 16:57:38 +01:00
emiliano.vavassori 5d6c7f2df2 Fixing multiple query build not working. 2024-01-01 16:36:50 +01:00
emiliano.vavassori 608eeb6071 Fixed a syntax error finding a file in a non-existent subdirectory. 2024-01-01 13:33:19 +01:00
emiliano.vavassori 2a617b1824 Fixing checksum function. 2024-01-01 13:18:55 +01:00
emiliano.vavassori 36da47b0b5 Fixing cleanup. 2024-01-01 10:02:15 +01:00
emiliano.vavassori 700d1eb376 Adding missing dependency. 2023-12-31 22:08:28 +01:00
emiliano.vavassori 3b01fc3b05 Fixed syntax error. 2023-12-09 23:36:54 +01:00
emiliano.vavassori 2277523f3c Removing unmet dependency. 2023-12-09 23:35:57 +01:00
emiliano.vavassori 3dbfef9fbe Cleaning up the downloads directory before Daily and Prerelease builds, but leaving already downloaded versions in place. 2023-12-09 23:26:03 +01:00
emiliano.vavassori 1a24f54d89 Revised and tested build cli subcommand. 2023-12-05 00:12:10 +01:00
emiliano.vavassori 024535afa9 Revised and tested getversion flow. Revised but not tested the build flows. 2023-12-04 00:51:58 +01:00
emiliano.vavassori d9775f4f94 First steps to refactoring the code and giving it a clear structure. 2023-12-03 04:16:31 +01:00
emiliano.vavassori 28528aa063 Adding Dockerfile to create a container to build. 2023-12-02 03:01:14 +01:00
emiliano.vavassori 142a09df14 Code for passing also a date for daily builds. Possibly blocked by cli command. 2023-12-02 03:00:46 +01:00
emiliano.vavassori 71a81b6a8e Fixing the Readme a bit. 2023-12-01 23:29:41 +01:00
emiliano.vavassori bb1e73fd6c Fixing last bits. 2023-12-01 23:15:57 +01:00
emiliano.vavassori a74b8d4858 Fixing version in setup. 2023-12-01 22:55:26 +01:00
emiliano.vavassori 8bd23dd08b Adding pypi uploading to pipeline. 2023-12-01 22:44:42 +01:00
emiliano.vavassori 7fe48c297d If repo is '.', resolve absolute path. 2023-12-01 22:44:24 +01:00
emiliano.vavassori 82e366c5da Redownloading again all files in case of prereleases or daily builds. 2023-12-01 21:55:13 +01:00
emiliano.vavassori 49e0ab5593 Reimplemented md5sum with python functions. 2023-12-01 21:42:52 +01:00
emiliano.vavassori 62248d862c Fixing daily URL breakage (issue #1).
As builds can come from different tinderboxes, an additional request and HTML parsing with XPath make the final URL less dependent on hardcoded strings.
2023-12-01 21:20:15 +01:00
13 changed files with 1148 additions and 982 deletions

.drone.yml
View File

@ -9,8 +9,6 @@ steps:
commands:
- pip install wheel
- python setup.py bdist_wheel
when:
event: tag
- name: release
image: plugins/gitea-release
@ -21,5 +19,15 @@ steps:
files: dist/*.whl
checksum: md5
draft: true
when:
event: tag
- name: publish
image: plugins/pypi
settings:
username: __token__
password:
from_secret: pypi
trigger:
event:
- tag
- push

1
.gitignore vendored
View File

@ -1,4 +1,5 @@
venv
test
build
dist
loaih.egg-info

8
Dockerfile Normal file
View File

@ -0,0 +1,8 @@
# vim:sts=4:sw=4
FROM python:3.9-slim-bullseye
RUN mkdir /build && \
pip install loaih
WORKDIR /build
ENTRYPOINT [ "/usr/local/bin/loaih" ]
CMD [ "--help" ]

README.md
View File

@ -1,34 +1,9 @@
# LibreOffice AppImage Helper - `loaih` #
LibreOffice AppImage Helper is an enhanced Python port of [previous work
from Antonio
Faccioli](https://github.com/antoniofaccioli/libreoffice-appimage). It helps
build a LibreOffice AppImage from officially released .deb files with some
options.
## Installing the package ##
[![Build Status](https://drone.libreitalia.org/api/badges/libreitalia/loaih/status.svg)](https://drone.libreitalia.org/libreitalia/loaih)
You can much more easily install the package via the produced wheel in the
[Releases](/libreitalia/loaih/releases/) page. Once downloaded, you can
install the utility with `pip`
$ pip install ./loaih-*.whl
You can also clone the repository and build the app yourself, which is as easy
as:
$ pip install wheel
$ git clone https://git.libreitalia.org/libreitalia/loaih.git
$ cd loaih
$ python setup.py bdist_wheel
$ pip install dist/loaih*.whl
## Using the package ##
The package will install a single command, `loaih`, which should help you
build your own version of the AppImage. Here are some usage scenarios.
by Antonio Faccioli](https://github.com/antoniofaccioli/libreoffice-appimage).
It helps build a LibreOffice AppImage from officially released .deb files
with some options.
## Getting options and help ##
@ -37,35 +12,7 @@ You can ask the app some information on how you can use it:
$ loaih --help
$ loaih getversion --help
$ loaih build --help
$ loaih batch --help
### Finding metadata on a specific version ###
You can use the command `getversion` and a specific query:
$ loaih getversion fresh
$ loaih getversion still
### Building a one-time AppImage ###
You can build yourself an AppImage with specific options:
$ loaih build -C -a x86_64 -l it -r . -o fresh
This will build, in the current folder, an AppImage of the latest "Fresh"
version for 64-bit operating systems, with Italian as the only language,
with offline help, signed and updatable.
For other build options, please see `loaih build --help` which should be
pretty complete.
### Batch building of a set of AppImages ###
This is less documented, but if the *query* parameter to the `build` command
is a YAML file (see the examples referenced in the repository), the command
will loop through the options defined there and create a complete set of
builds.
$ loaih build fresh.yml
This is the main way the community builds found at the [LibreOffice AppImage
Repository](https://appimages.libreitalia.org) are produced.
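
A minimal sketch of what such a build file might look like, assuming the `data` and `builds` keys read by the `build` subcommand (all values here are illustrative):

    data:
      repo: /mnt/appimage
      download: /var/tmp/downloads
      sign: true
      force: false

    builds:
      - query: fresh
        language: standard
        offline_help: true
        portable: false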
For any other information needed, please visit the [wiki of the
project](https://git.libreitalia.org/libreitalia/loaih/wiki).

loaih/__init__.py
View File

@ -1,213 +0,0 @@
#!/usr/bin/env python
# encoding: utf-8
import urllib.request
from lxml import etree
from packaging.version import parse as parse_version
import datetime
class Definitions(object):
DOWNLOADPAGE = "https://www.libreoffice.org/download/download/"
ARCHIVE = "https://downloadarchive.documentfoundation.org/libreoffice/old/"
RELEASE = "https://download.documentfoundation.org/libreoffice/stable/"
DAILY = "https://dev-builds.libreoffice.org/daily/master/Linux-rpm_deb-x86_64@tb87-TDF/"
PRERELEASE = "https://dev-builds.libreoffice.org/pre-releases/deb/x86_64/"
SELECTORS = {
'still': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[last()]/text()'
},
'fresh': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[1]/text()'
},
'prerelease': {
'URL': DOWNLOADPAGE,
'xpath': '//p[@class="lead_libre"][last()]/following-sibling::ul[last()]/li/a/text()'
},
'daily': {
'URL': DAILY,
'xpath': '//td/a'
}
}
class Base(object):
# Class for static methods which might be useful even outside the build
# scripts.
@staticmethod
def dailyurl(date = datetime.datetime.today()):
"""Returns the URL for the latest valid daily build."""
# As per other parts of the build, we need to maintain a URL also for
# x86 versions, even though one isn't really provided.
# As such, the return value must be a dictionary.
# Get the anchor for today's builds
a = etree.HTML(urllib.request.urlopen(Definitions.DAILY).read()).xpath("//td/a[contains(text(), '" + date.strftime('%Y-%m-%d') + "')]/text()")
if len(a) == 0:
# No results found for that date; return placeholder values.
return { 'x86': '-', 'x86_64': '-' }
# Otherwise, one or more versions were found; let's sort the
# list and get the latest item
return { 'x86': '-', 'x86_64': Definitions.SELECTORS['daily']['URL'] + sorted(a)[-1] }
@staticmethod
def dailyver(date = datetime.datetime.today()):
"""Returns versions present on the latest daily build."""
url = Base.dailyurl(date)['x86_64']
# If no daily releases have been provided yet, return empty
if url == '-':
return []
# Rerun the page parsing, this time to find out the versions built
b = etree.HTML(urllib.request.urlopen(url).read()).xpath("//td/a[contains(text(), '_deb.tar.gz')]/text()")
# This should have returned the main package for a version, but can
# have returned multiple ones, so let's treat it as a list
return [ x.split('_')[1] for x in b ]
@staticmethod
def namedver(query):
"""Gets the version for a specific named version."""
if query == 'daily' or query == 'yesterday':
# Daily needs double parsing for the same result to apply.
# We first select today's build anchor:
date = datetime.datetime.today()
if query == 'yesterday':
# Use yesterday's date for testing purposes.
date += datetime.timedelta(days=-1)
return Base.dailyver(date)
# In case the query isn't for daily
return etree.HTML(urllib.request.urlopen(Definitions.SELECTORS[query]['URL']).read()).xpath(Definitions.SELECTORS[query]['xpath'])
@staticmethod
def fullversion(version):
"""Get latest full version from Archive based on partial version."""
versionlist = etree.HTML(urllib.request.urlopen(Definitions.ARCHIVE).read()).xpath(f"//td/a[starts-with(text(), '{version}')]/text()")
if versionlist:
cleanlist = sorted([ x.strip('/') for x in versionlist ])
# Sorting, then returning the last version
return cleanlist[-1]
return None
@staticmethod
def urlfromqueryandver(query, version):
"""Returns the fetching URL based on the queried version and the numeric version of it."""
# This has the purpose of simplifying and explaining how the releases are
# laid out.
# If the query tells about daily or 'yesterday' (for testing purposes),
# we might ignore versions and return the value coming from dailyurl:
if query == 'daily':
return Base.dailyurl()
if query == 'yesterday':
date = datetime.datetime.today() + datetime.timedelta(days=-1)
return Base.dailyurl(date)
# All other versions will be taken from Archive, as such we need a full
# version.
# If the version has at most two dots in it (i.e. it splits into at most three parts by '.'), that's not a full version and we will call the fullversion() function
fullversion = str(version)
if len(fullversion.split('.')) <= 3:
fullversion = str(Base.fullversion(version))
# So the final URL is the Archive one, plus the full versions, plus a
# final '/deb/' - and an arch subfolder
baseurl = Definitions.ARCHIVE + fullversion + '/deb/'
retval = {}
# x86 binaries are no longer offered after 6.3.0.
if parse_version(fullversion) < parse_version('6.3.0'):
retval['x86'] = baseurl + 'x86/'
else:
retval['x86'] = '-'
retval['x86_64'] = baseurl + 'x86_64/'
return retval
@staticmethod
def collectedbuilds(query):
"""Creates a list of Builds based on each queried version found."""
retval = []
if '.' in query:
# Called with a numeric query. Pass it to RemoteBuild
retval.append(RemoteBuild(query))
else:
# Named query
a = Base.namedver(query)
if not a:
# a is empty
return retval
if isinstance(a, list) and len(a) > 1:
retval.extend([ RemoteBuild(query, version) for version in a ])
else:
retval.append(RemoteBuild(query))
return sorted(retval, key=lambda x: x.version)
class RemoteBuild(object):
def __init__(self, query, version = None):
"""Should simplify the single builded version."""
self.query = query
self.version = ''
self.basedirurl = { 'x86': '-', 'x86_64': '-' }
if version and isinstance(version, str):
self.version = version
if not '.' in self.query:
# Named version.
# Let's check if a specific version was requested.
if self.version == '':
# In case it was not requested, we will carry out the generic
# namedver() query.
# If there is more than one result, we'll take the latest (since we are requested to provide a single build).
a = Base.namedver(self.query)
if isinstance(a, list):
# if the number of versions is zero, return and exit
if not a:
return None
if len(a) == 1:
# version is a single one.
self.version = a[0]
else:
# In this case, we will select the latest release.
self.version = sorted(a)[-1]
# If the user already requested a specific version,
# continue using that version
else:
# In case of numbered queries, put it as initial version
self.version = self.query
if len(str(self.version).split('.')) < 4:
# If not 4 dotted, let's search for the 4 dotted version
self.version = Base.fullversion(self.version)
self.basedirurl = Base.urlfromqueryandver(self.query, self.version)
def todict(self):
return {
'query': self.query,
'version': self.version,
'basedirurl': self.basedirurl
}
def __str__(self):
return f"""query: {self.query}
version: {self.version}
x86: {self.basedirurl['x86']}
x86_64: {self.basedirurl['x86_64']}"""

loaih/build.py
View File

@ -1,535 +0,0 @@
#!/usr/bin/env python3
# encoding: utf-8
"""Classes and functions to build an AppImage."""
import os
import glob
import subprocess
import shutil
import re
import shlex
import tempfile
import urllib.request
from lxml import etree
import loaih
class Collection(list):
"""Aggregates metadata on a collection of builds."""
def __init__(self, query, arch = ['x86', 'x86_64']):
"""Build a list of version to check/build for this round."""
super().__init__()
self.extend([
Build(query, arch, version) for version in loaih.Base.collectedbuilds(query)
])
class Build(loaih.RemoteBuild):
"""Builds a single version."""
LANGSTD = [ 'ar', 'de', 'en-GB', 'es', 'fr', 'it', 'ja', 'ko', 'pt',
'pt-BR', 'ru', 'zh-CN', 'zh-TW' ]
LANGBASIC = [ 'en-GB' ]
ARCHSTD = [ 'x86', 'x86_64' ]
def __init__(self, query, arch, version = None):
super().__init__(query, version)
self.arch = arch
self.short_version = str.join('.', self.version.split('.')[0:2])
self.branch_version = None
if not '.' in self.query:
self.branch_version = self.query
self.url = self.basedirurl
# Other default values
self.language = 'basic'
self.offline_help = False
self.portable = False
self.updatable = True
self.sign = True
self.repo_type = 'local'
self.remote_host = ''
self.remote_path = ''
self.storage_path = '/mnt/appimage'
self.download_path = '/var/tmp/downloads'
self.appnamedir = ''
# Specific build version
self.appname = 'LibreOffice'
self.appversion = ''
self.appimagedir = ''
self.appimagefilename = {}
self.zsyncfilename = {}
# Other variables by build
self.languagepart = '.' + self.language
self.helppart = ''
# Creating a tempfile
self.builddir = tempfile.mkdtemp()
self.tarballs = {}
self.built = { 'x86': False, 'x86_64': False }
# Preparing the default for the relative path on the storage for
# different versions.
# The path will evaluated as part of the check() function, as it is
# understood the storage_path can be changed before that phase.
self.relative_path = []
self.full_path = ''
self.baseurl = ''
def calculate(self):
"""Calculate exclusions and other variables."""
print("--- Calculate Phase ---")
# let's check here if we are on a remote repo or local.
if self.storage_path.startswith("http"):
# Final repository is remote
self.repo_type = 'remote'
print("Repo is remote.")
else:
self.repo_type = 'local'
print("Repo is local.")
# AppName
if self.query in { 'prerelease', 'daily' }:
self.appname = 'LibreOfficeDev'
# Calculating languagepart
self.languagepart = "."
if ',' in self.language:
self.languagepart += self.language.replace(',', '-')
else:
self.languagepart += self.language
# Calculating help part
if self.offline_help:
self.helppart = '.help'
# Building the required names
for arch in Build.ARCHSTD:
self.appimagefilename[arch] = self.__gen_appimagefilename__(self.version, arch)
self.zsyncfilename[arch] = self.appimagefilename[arch] + '.zsync'
# Mandate to the private function to calculate the full_path available
# for the storage and the checks.
self.__calculate_full_path__()
def __gen_appimagefilename__(self, version, arch):
"""Generalize the construction of the name of the app."""
self.appversion = version + self.languagepart + self.helppart
return self.appname + f'-{self.appversion}-{arch}.AppImage'
def __calculate_full_path__(self):
"""Calculate relative path of the build, based on internal other variables."""
if len(self.relative_path) == 0:
if self.query == 'daily':
self.relative_path.append('daily')
elif self.query == 'prerelease':
self.relative_path.append('prerelease')
# Not the same check, an additional one
if self.portable:
self.relative_path.append('portable')
# Fullpath might be intended two ways:
if self.repo_type == 'remote':
# Repository is remote
# we build full_path as it is absolute to the root of the
# storage_path.
self.full_path = '/'
if len(self.relative_path) >= 1:
self.full_path += str.join('/', self.relative_path)
else:
# Repository is local
fullpath_arr = self.storage_path.split('/')
# Joining relative path only if it is not null
if len(self.relative_path) > 0:
fullpath_arr.extend(self.relative_path)
self.full_path = re.sub(r"/+", '/', str.join('/', fullpath_arr))
def check(self):
"""Checking if the requested AppImage has been already built."""
print("--- Check Phase ---")
if len(self.appimagefilename) != 2:
self.calculate()
for arch in self.arch:
print(f"Searching for {self.appimagefilename[arch]}")
# First, check if by metadata the repo is remote or not.
if self.repo_type == 'remote':
# Remote storage. I have to query a remote site to know if it
# was already built.
name = self.appimagefilename[arch]
url = self.storage_path.rstrip('/') + self.full_path + '/'
matching = []
try:
with urllib.request.urlopen(url) as response:
matching = etree.HTML(response.read()).xpath(
f"//a[contains(@href, '{name}')]/@href"
)
if len(matching) > 0:
# Already built.
self.built[arch] = True
except urllib.error.HTTPError:
# The specified URL does not exist, so the version needs to be built.
pass
else:
# Repo is local
command = f"find {self.full_path} -name {self.appimagefilename[arch]}"
res = subprocess.run(shlex.split(command),
capture_output=True,
env={ "LC_ALL": "C" },
text=True, encoding='utf-8', check=True)
if "No such file or directory" in res.stderr:
# Folder is not existent: so the version was not built
# Build stays false, and we go to the next arch
continue
if res.stdout and len(res.stdout.strip("\n")) > 0:
# All good, the command was executed fine.
self.built[arch] = True
if self.built[arch]:
print(f"Found requested AppImage: {self.appimagefilename[arch]}.")
def download(self):
"""Downloads the contents of the URL as it was a folder."""
print("--- Download Phase ---")
print(f"Started downloads for {self.version}. Please wait.")
for arch in self.arch:
# Checking if a valid path has been provided
if self.url[arch] == '-':
print(f"Cannot build for arch {arch}. Continuing with other arches.")
# Mark it as already built, so the remaining checks are skipped.
self.built[arch] = True
continue
if self.built[arch]:
print(f"A build for {arch} was already found. Skipping specific packages.")
continue
# Identifying downloads
contents = []
with urllib.request.urlopen(self.url[arch]) as url:
contents = etree.HTML(url.read()).xpath("//td/a")
self.tarballs[arch] = [ x.text
for x in contents
if x.text.endswith('tar.gz') and 'deb' in x.text
]
tarballs = self.tarballs[arch]
# Create and change directory to the download location
os.makedirs(self.download_path, exist_ok = True)
os.chdir(self.download_path)
for archive in tarballs:
# If the archive is already there, do not do anything.
if os.path.exists(archive):
continue
# Download the archive
try:
urllib.request.urlretrieve(self.url[arch] + archive, archive)
except Exception as error:
print(f"Failed to download {archive}: {error}.")
print(f"Finished downloads for {self.version}.")
def build(self):
"""Building all the versions."""
print("--- Building Phase ---")
for arch in self.arch:
if self.built[arch]:
# Already built for arch or path not available. User has already been warned.
continue
# Preparation tasks
self.appnamedir = os.path.join(self.builddir, self.appname)
os.makedirs(self.appnamedir, exist_ok=True)
# And then cd to the appname folder.
os.chdir(self.appnamedir)
# Download appimagetool from github
appimagetoolurl = f"https://github.com/AppImage/AppImageKit/releases/download/continuous/appimagetool-{arch}.AppImage"
urllib.request.urlretrieve(appimagetoolurl, 'appimagetool')
os.chmod('appimagetool', 0o755)
# Build the requested version.
self.__unpackbuild__(arch)
def __unpackbuild__(self, arch):
# We start by filtering out tarballs from the list
buildtarballs = [ self.tarballs[arch][0] ]
# Let's process standard languages and append results to the
# buildtarball
if self.language == 'basic':
if self.offline_help:
buildtarballs.extend([ x for x in self.tarballs[arch] if 'pack_en-GB' in x ])
else:
buildtarballs.extend([ x for x in self.tarballs[arch] if 'langpack_en-GB' in x])
elif self.language == 'standard':
for lang in Build.LANGSTD:
if self.offline_help:
buildtarballs.extend([ x for x in self.tarballs[arch] if ('pack_' + lang) in x ])
else:
buildtarballs.extend([ x for x in self.tarballs[arch] if ('langpack_' + lang) in x ])
elif self.language == 'full':
if self.offline_help:
# We need also all help. Let's replace buildtarball with the
# whole bunch
buildtarballs = self.tarballs[arch]
else:
buildtarballs.extend([ x for x in self.tarballs[arch] if 'langpack' in x ])
else:
# Looping for each language in self.language
for lang in self.language.split(","):
if self.offline_help:
buildtarballs.extend([ x for x in self.tarballs[arch]
if 'pack' + lang in x ])
else:
buildtarballs.extend([ x for x in self.tarballs[arch]
if 'langpack' + lang in x ])
os.chdir(self.appnamedir)
# Unpacking the tarballs
for archive in buildtarballs:
subprocess.run(shlex.split(
f"tar xzf {self.download_path}/{archive}"), check=True)
# create appimagedir
self.appimagedir = os.path.join(self.builddir, self.appname, self.appname + '.AppDir')
os.makedirs(self.appimagedir, exist_ok = True)
# At this point, let's decompress the deb packages
subprocess.run(shlex.split(
r"find .. -iname '*.deb' -exec dpkg -x {} . \;"
), cwd=self.appimagedir, check=True)
if self.portable:
subprocess.run(shlex.split(
r"find . -type f -iname 'bootstraprc' " +
r"-exec sed -i 's|^UserInstallation=.*|" +
r"UserInstallation=\$SYSUSERCONFIG/libreoffice/%s|g' {} \+" % self.short_version
), cwd=self.appimagedir, check=True)
# Changing desktop file
subprocess.run(shlex.split(
r"find . -iname startcenter.desktop -exec cp {} . \;"
), cwd=self.appimagedir, check=True)
subprocess.run(shlex.split(
f"sed --in-place \'s:^Name=.*$:Name={self.appname}:\' " +
r"startcenter.desktop"
), cwd=self.appimagedir, check=False)
subprocess.run(shlex.split(
r"find . -name '*startcenter.png' -path '*hicolor*48x48*' " +
r"-exec cp {} . \;"
), cwd=self.appimagedir, check=True)
# Find the name of the binary called in the desktop file.
binaryname = ''
with open(
os.path.join(self.appimagedir, 'startcenter.desktop'),
'r', encoding="utf-8"
) as desktopfile:
for line in desktopfile.readlines():
if re.match(r'^Exec', line):
binaryname = line.split('=')[-1].split(' ')[0]
# Exit at the first match
break
#binary_exec = subprocess.run(shlex.split(r"awk 'BEGIN { FS = \"=\" } /^Exec/ { print $2; exit }' startcenter.desktop | awk '{ print $1 }'"), cwd=self.appimagedir, text=True, encoding='utf-8')
#binaryname = binary_exec.stdout.strip("\n")
bindir=os.path.join(self.appimagedir, 'usr', 'bin')
os.makedirs(bindir, exist_ok = True)
subprocess.run(shlex.split(
r"find ../../opt -iname soffice -path '*program*' " +
r"-exec ln -sf {} ./%s \;" % binaryname
), cwd=bindir, check=True)
# Download AppRun from github
apprunurl = r"https://github.com/AppImage/AppImageKit/releases/"
apprunurl += f"download/continuous/AppRun-{arch}"
dest = os.path.join(self.appimagedir, 'AppRun')
urllib.request.urlretrieve(apprunurl, dest)
os.chmod(dest, 0o755)
# Dealing with extra options
buildopts = []
if self.sign:
buildopts.append('--sign')
# adding zsync build if updatable
if self.updatable:
buildopts.append(f"-u 'zsync|{self.zsyncfilename[arch]}'")
buildopts_str = str.join(' ', buildopts)
# Build the number-specific build
subprocess.run(shlex.split(
f"{self.appnamedir}/appimagetool {buildopts_str} -v " +
f"./{self.appname}.AppDir/"
), env={ "VERSION": self.appversion }, check=True)
print(f"Built AppImage version {self.appversion}")
# Cleanup phase, before new run.
for deb in glob.glob(self.appnamedir + '/*.deb'):
os.remove(deb)
subprocess.run(shlex.split(
r"find . -mindepth 1 -maxdepth 1 -type d -exec rm -rf {} \+"
), check=True)
def checksums(self):
"""Create checksums of the built versions."""
# Skip checksum if the build was initially already found in the storage directory
print("--- Checksum Phase ---")
if all(self.built[arch] for arch in self.arch):
return
os.chdir(self.appnamedir)
for arch in self.arch:
if not self.built[arch]:
# The opposite case: a newly built package has not yet been
# marked as built.
for item in [ self.appimagefilename[arch], self.zsyncfilename[arch] ]:
itempath = os.path.join(self.appnamedir, item)
if os.path.exists(itempath):
# For any built arch, check whether the file exists.
self.__create_checksum__(item)
def __create_checksum__(self, file):
"""Internal function to create checksum file."""
checksum = subprocess.run(f"md5sum {file}", shell=True,
capture_output=True, text=True, encoding='utf-8', check=True,
cwd=self.appnamedir)
if checksum.stdout:
with open(f"{file}.md5", 'w', encoding='utf-8') as checkfile:
checkfile.write(checksum.stdout)
def publish(self):
"""Moves built versions to definitive storage."""
print("--- Publish Phase ---")
if all(self.built[arch] for arch in self.arch):
# All files are already present in the full_path
return
os.chdir(self.appnamedir)
# Two cases here: local and remote storage_path.
if self.repo_type == 'remote':
# Remote first.
# Build destination directory
remotepath = self.remote_path.rstrip('/') + self.full_path
try:
subprocess.run(
r"rsync -rlIvz --munge-links *.AppImage* " +
f"{self.remote_host}:{remotepath}",
cwd=self.appnamedir, shell=True, check=True
)
finally:
pass
else:
# Local
# Forcing creation of subfolders, in case there is a new build
os.makedirs(self.full_path, exist_ok = True)
for file in glob.glob("*.AppImage*"):
subprocess.run(shlex.split(
f"cp -f {file} {self.full_path}"
), check=True)
def generalize_and_link(self, chdir = 'default'):
"""Creates the needed generalized files if needed."""
print("--- Generalize and Link Phase ---")
# If called with a specific (numbered) version, no generalize and link is necessary.
if not self.branch_version:
return
# The same applies to prerelease and daily versions.
if self.query in { 'daily', 'prerelease' }:
return
if chdir == 'default':
chdir = self.full_path
appimagefilename = {}
zsyncfilename = {}
# Creating versions for short version and query text
versions = [ self.short_version, self.branch_version ]
for arch in Build.ARCHSTD:
# If already built, do not do anything.
if self.built[arch]:
continue
os.chdir(chdir)
# if the appimage for the reported arch is not found, skip to next
# arch
if not os.path.exists(self.appimagefilename[arch]):
continue
# Doing it both for short_name and for branchname
for version in versions:
appimagefilename[arch] = self.appname + '-' + version
appimagefilename[arch] += self.languagepart + self.helppart
appimagefilename[arch] += f'-{arch}.AppImage'
zsyncfilename[arch] = appimagefilename[arch] + '.zsync'
# Create the symlink
print(f"Creating {appimagefilename[arch]} and checksums.")
if os.path.exists(appimagefilename[arch]):
os.unlink(appimagefilename[arch])
os.symlink(self.appimagefilename[arch], appimagefilename[arch])
# Create the checksum for the AppImage
self.__create_checksum__(appimagefilename[arch])
# Do not continue if no zsync file is provided.
if not self.updatable:
continue
print(f"Creating zsync file for version {version}.")
if os.path.exists(zsyncfilename[arch]):
os.unlink(zsyncfilename[arch])
shutil.copyfile(self.zsyncfilename[arch], zsyncfilename[arch])
# Editing the zsyncfile
subprocess.run(shlex.split(
r"sed --in-place 's/^Filename:.*$/Filename: " +
f"{appimagefilename[arch]}/' {zsyncfilename[arch]}"
), check=True)
self.__create_checksum__(zsyncfilename[arch])
def __del__(self):
"""Destructor"""
# Cleaning up build directory
shutil.rmtree(self.builddir)

loaih/script.py
View File

@ -1,148 +0,0 @@
#!/usr/bin/env python
# encoding: utf-8
"""Helps with command line commands."""
import json
import click
import yaml
import loaih
import loaih.build
@click.group()
def cli():
"""Helps with command line commands."""
@cli.command()
@click.option('-j', '--json', 'jsonout', default=False, is_flag=True,
help="Output format in json.")
@click.argument('query')
def getversion(query, jsonout):
"""Get the numeral version from a named version."""
batch = []
queries = []
if ',' in query:
queries.extend(query.split(','))
else:
queries.append(query)
for singlequery in queries:
batch.extend(loaih.Base.collectedbuilds(singlequery))
if len(batch) > 0:
if jsonout:
click.echo(json.dumps([x.todict() for x in batch]))
else:
for value in batch:
click.echo(value)
@cli.command()
@click.option('-a', '--arch', 'arch', default='all',
type=click.Choice(['x86', 'x86_64', 'all'], case_sensitive=False),
help="Build the AppImage for a specific architecture. If there is no specific options, the process will build for both architectures (if available). Default: all")
@click.option('-c/-C', '--check/--no-check', 'check', default=True,
help="Check in the final storage if the queried version is existent. Default: check")
@click.option('-d', '--download-path', 'download_path',
default = '/var/tmp/downloads', type=str,
help="Path to the download folder. Default: /var/tmp/downloads")
@click.option('-l', '--language', 'language', default = 'basic', type=str,
help="Languages to be included. Options: basic, standard, full, a language string (e.g. 'it') or a list of languages comma separated (e.g.: 'en-US,en-GB,it'). Default: basic")
@click.option('-o/-O', '--offline-help/--no-offline-help', 'offline', default = False,
help="Include or not the offline help for the chosen languages. Default: no offline help")
@click.option('-p/-P', '--portable/--no-portable', 'portable', default = False,
help="Create a portable version of the AppImage or not. Default: no portable")
@click.option('-r', '--repo-path', 'repo_path', default = '/mnt/appimage',
type=str, help="Path to the final storage of the AppImage. Default: /mnt/appimage")
@click.option('-s/-S', '--sign/--no-sign', 'sign', default=True,
help="Wether to sign the build. Default: sign")
@click.option('-u/-U', '--updatable/--no-updatable', 'updatable', default = True,
help="Create an updatable version of the AppImage or not. Default: updatable")
@click.argument('query')
def build(arch, language, offline, portable, updatable, download_path, repo_path, check, sign, query):
"""Builds an Appimage with the provided options."""
# Parsing options
arches = []
if arch.lower() == 'all':
# We need to build it twice.
arches = [ 'x86', 'x86_64' ]
else:
arches = [ arch.lower() ]
if query.endswith('.yml') or query.endswith('.yaml'):
# This is a buildfile. So we have to load the file and pass the build options ourselves.
config = {}
with open(query, 'r', encoding= 'utf-8') as file:
config = yaml.safe_load(file)
# With the config file, we ignore all the command line options and set
# generic defaults.
for cbuild in config['builds']:
# Loop a run for each build.
collection = loaih.build.Collection(cbuild['query'], arches)
for obj in collection:
# Configuration phase
obj.language = cbuild['language']
obj.offline_help = cbuild['offline_help']
obj.portable = cbuild['portable']
obj.updatable = True
obj.storage_path = "/srv/http/appimage.sys42.eu"
if 'repo' in config['data'] and config['data']['repo']:
obj.storage_path = config['data']['repo']
obj.download_path = "/var/tmp/downloads"
if 'download' in config['data'] and config['data']['download']:
obj.download_path = config['data']['download']
if 'http' in obj.storage_path:
obj.remoterepo = True
obj.remote_host = "ciccio.libreitalia.org"
if 'remote_host' in config['data'] and config['data']['remote_host']:
obj.remote_host = config['data']['remote_host']
obj.remote_path = "/var/lib/nethserver/vhost/appimages"
if 'remote_path' in config['data'] and config['data']['remote_path']:
obj.remote_path = config['data']['remote_path']
if 'sign' in config['data'] and config['data']['sign']:
obj.sign = True
# Build phase
obj.calculate()
if not 'force' in config['data'] or not config['data']['force']:
obj.check()
obj.download()
obj.build()
obj.checksums()
if obj.remoterepo and obj.appnamedir:
obj.generalize_and_link(obj.appnamedir)
obj.publish()
if not obj.remoterepo:
obj.generalize_and_link()
del obj
else:
collection = loaih.build.Collection(query, arches)
for obj in collection:
# Configuration phase
obj.language = language
obj.offline_help = offline
obj.portable = portable
obj.updatable = updatable
obj.storage_path = repo_path
obj.download_path = download_path
if sign:
obj.sign = True
# Running phase
obj.calculate()
if check:
obj.check()
obj.download()
obj.build()
obj.checksums()
obj.publish()
obj.generalize_and_link()
del obj

54
pyproject.toml Normal file
View File

@ -0,0 +1,54 @@
# vim:sts=4:sw=4
[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"
[project]
name = "loaih"
dynamic = ["version"]
authors = [
{ name = "Emiliano Vavassori", email = "syntaxerrormmm@gmail.com" },
]
description = "LOAIH - LibreOffice AppImage Helpers, help build a LibreOffice AppImage"
readme = "README.md"
license = "MIT"
requires-python = ">= 3.6"
dependencies = [
"click",
"lxml",
"pyyaml",
"requests",
]
classifiers = [
"Development Status :: 5 - Production/Stable",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.6",
"Environment :: Console",
"Intended Audience :: Developers",
"Intended Audience :: End Users/Desktop",
"Intended Audience :: System Administrators",
"License :: OSI Approved :: MIT License",
"Natural Language :: English",
"Operating System :: POSIX :: Linux",
"Topic :: Office/Business",
"Topic :: Software Development :: Build Tools",
"Topic :: Software Development :: Quality Assurance",
"Topic :: Software Development :: Testing",
"Topic :: Software Development :: User Interfaces"
]
[project.scripts]
loaih = "loaih.script:cli"
[project.urls]
Homepage = "https://git.libreitalia.org/LibreItalia/loaih/"
[tool.hatch.version]
path = "src/loaih/version.py"
[tool.hatch.build.targets.sdist]
include = [
"src/loaih",
]
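
With the switch to hatch, a wheel can presumably be built and installed along these lines (assuming `hatch` and `pip` are available):

$ pip install hatch
$ hatch build
$ pip install dist/loaih-*.whl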

setup.py
View File

@ -1,23 +0,0 @@
#!/usr/bin/env python
# encoding: utf-8
# vim:sts=4:sw=4
"""Helps building and automatizing building LibreOffice AppImages."""
from setuptools import setup, find_packages
setup(
name="loaih",
version="1.3.1",
description="LOAIH - LibreOffice AppImage Helpers, help build a LibreOffice AppImage",
author="Emiliano Vavassori",
author_email="syntaxerrormmm@libreoffice.org",
packages=find_packages(exclude=['contrib', 'docs', 'tests']),
entry_points={
'console_scripts': [
'loaih = loaih.script:cli',
],
},
install_requires=[ 'click', 'lxml', 'packaging', 'pyyaml' ],
license='MIT',
url='https://git.libreitalia.org/LibreItalia/loappimage-helpers/',
)

259
src/loaih/__init__.py Normal file
View File

@ -0,0 +1,259 @@
#!/usr/bin/env python
# encoding: utf-8
"""machinery for compiling new versions of appimages."""
import datetime
import json
import re
import requests
import subprocess
import shlex
from lxml import html
# Constants
DOWNLOADPAGE = "https://www.libreoffice.org/download/download/"
ARCHIVE = "https://downloadarchive.documentfoundation.org/libreoffice/old/"
RELEASE = "https://download.documentfoundation.org/libreoffice/stable/"
DAILY = "https://dev-builds.libreoffice.org/daily/master/"
PRERELEASE = "https://dev-builds.libreoffice.org/pre-releases/deb/x86_64/"
SELECTORS = {
'still': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[last()]/text()'
},
'fresh': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[1]/text()'
},
'prerelease': {
'URL': DOWNLOADPAGE,
'xpath': '//p[@class="lead_libre"][last()]/following-sibling::ul[last()]/li/a/text()'
},
'daily': {
'URL': DAILY,
'xpath': '//td/a'
}
}
# Generic functions
def match_xpath(url: str, xpath: str):
"""Uses a couple of extensions to get results over webpage."""
resource = requests.get(url, timeout=10)
parsed = html.fromstring(resource.content)
return parsed.xpath(xpath)
# Classes
class Version():
"""Represent the skeleton of each queried version."""
def __init__(self):
self.query = ''
self.branch = ''
self.version = ''
self.urls = {
'x86': '-',
'x86_64': '-'
}
def appname(self):
"""Determines the app name based on the query branch determined."""
datematch = re.match(r'[0-9]{8}', self.query)
retval = 'LibreOffice'
if self.query in {'prerelease', 'daily', 'current', 'yesterday'} or datematch:
retval = 'LibreOfficeDev'
return retval
def cleanup_downloads(self, path, verbose=False) -> None:
"""Cleanups the downloads folder to assure new versions are built."""
search_name = self.appname() + '_' + self.version
cmd = f"find {path} -iname {search_name}\\*.tar.gz -delete"
if verbose:
subprocess.run(shlex.split(cmd))
else:
subprocess.run(shlex.split(cmd), stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
def to_dict(self):
"""Returns a dictionary of versions."""
return {
'query': self.query,
'version': self.version,
'basedirurl': self.urls
}
def to_json(self):
"""Returns a json representation of the version."""
return json.dumps(self.to_dict())
def __str__(self):
return f"""query: {self.query}
version: {self.version}
x86: {self.urls['x86']}
x86_64: {self.urls['x86_64']}"""
class QueryError(Exception):
"""Standard exception for errors regarding queries."""
class Solver():
"""Generic solver to call others."""
def __init__(self, text: str, default_to_current = False):
self.text = text
self.branch = text
self.version = None
self.default_to_current = default_to_current
self.baseurl = ARCHIVE
def solve(self):
"""Splits the query text possibilities, calling all the rest of the solvers."""
solver = self
if self.text in { 'current', 'yesterday', 'daily' }:
solver = DailySolver(self.text, self.default_to_current)
elif self.text in { 'still', 'fresh', 'prerelease' }:
solver = NamedSolver(self.text)
elif '.' in self.text:
solver = NumberedSolver(self.text)
else:
try:
int(self.text)
solver = DailySolver(self.text, self.default_to_current)
except ValueError:
raise QueryError("The queried version does not exist.")
self.version = solver.solve()
self.baseurl = solver.baseurl
return self.version
def to_version(self):
retval = Version()
retval.query = self.text
retval.branch = self.branch
retval.version = self.version
if retval.branch != 'daily':
retval.urls['x86_64'] = self.baseurl + 'x86_64/'
try:
x86ver = match_xpath(self.baseurl + 'x86/', '//td/a/text()')
except Exception:
return retval
if len(x86ver) > 1:
retval.urls['x86'] = self.baseurl + 'x86/'
else:
retval.urls['x86_64'] = self.baseurl
return retval
@staticmethod
def parse(text: str, default_to_current = False):
"""Calling the same as solver class."""
retval = Solver(text, default_to_current)
retval.solve()
return retval.to_version()
class DailySolver(Solver):
"""Specific solver to daily queries."""
def __init__(self, text: str, default_to_current = False):
super().__init__(text, default_to_current)
self.branch = 'daily'
self.baseurl = DAILY
def solve(self):
"""Get daily urls based on query."""
x = "//td/a[starts-with(text(),'Linux-rpm_deb-x86') and contains(text(),'TDF/')]/text()"
tinderbox_segment = match_xpath(self.baseurl, x)[-1]
self.baseurl = self.baseurl + tinderbox_segment
# Reiterate now to search for the dated version
xpath_query = "//td/a/text()"
daily_set = match_xpath(self.baseurl, xpath_query)
matching = ''
today = datetime.datetime.today()
try:
int(self.text)
matching = datetime.datetime.strptime(self.text, "%Y%m%d").strftime('%Y-%m-%d')
except ValueError:
# All textual version
if self.text in { 'current', 'daily' }:
matching = 'current'
elif self.text == 'yesterday':
matching = (today + datetime.timedelta(days=-1)).strftime("%Y-%m-%d")
results = sorted([ x for x in daily_set if matching in x ])
if len(results) == 0:
# No daily versions found.
if self.default_to_current:
solver = DailySolver('current')
self.version = solver.solve()
self.baseurl = solver.baseurl
else:
self.baseurl = self.baseurl + results[-1]
# A baseurl for x86 is definitely not available for daily builds.
xpath_string = "//td/a[contains(text(), '_deb.tar.gz')]/text()"
links = match_xpath(self.baseurl, xpath_string)
if len(links) > 0:
link = str(links[-1])
self.version = link.rsplit('/', maxsplit=1)[-1].split('_')[1]
return self.version
class NamedSolver(Solver):
"""Solves the query knowing that the input is a named query."""
def __init__(self, text: str):
super().__init__(text)
self.branch = text
self.baseurl = SELECTORS[self.text]['URL']
self.generalver = ''
def solve(self):
"""Get versions from query."""
xpath_query = SELECTORS[self.text]['xpath']
results = sorted(match_xpath(self.baseurl, xpath_query))
if len(results) > 0:
self.generalver = str(results[-1])
result: str = self.generalver
xpath_string = f"//td/a[starts-with(text(),'{result}')]/text()"
archived_versions = sorted(match_xpath(ARCHIVE, xpath_string))
if len(archived_versions) == 0:
return self.version
# Return just the latest version
fullversion: str = str(archived_versions[-1])
self.baseurl = ARCHIVE + fullversion + 'deb/'
self.version = fullversion.rstrip('/')
return self.version
class NumberedSolver(Solver):
"""Specific solver for numbered versions."""
def __init__(self, text: str):
super().__init__(text)
self.branch = '.'.join(text.split('.')[0:2])
def solve(self):
xpath_string = f"//td/a[starts-with(text(),'{self.text}')]/text()"
versions = sorted(match_xpath(self.baseurl, xpath_string))
if len(versions) == 0:
# It is possible that in the ARCHIVE there's no such version (might be a prerelease)
return self.version
version = str(versions[-1])
self.baseurl = self.baseurl + version + 'deb/'
self.version = version.rstrip('/')
return self.version
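
As a rough usage sketch of the refactored module (assuming the package is installed and the download sites are reachable), the new solver API could be exercised like this:

import loaih

# Resolve a named branch ("fresh", "still", "prerelease", "daily", ...) to a
# concrete version plus its per-architecture base URLs.
version = loaih.Solver.parse("fresh")
print(version)            # query, version and x86/x86_64 base URLs
print(version.to_json())  # the same information as JSON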