Compare commits

...

70 Commits
v1.2.0 ... main

Author SHA1 Message Date
gabriele.ponzo 2f31897cd9 Update README.md
Corrected the English wording
2024-03-01 08:45:05 +00:00
emiliano.vavassori eff3bc67ee Fixed the README with pointers to the Wiki. 2024-01-01 18:32:36 +01:00
emiliano.vavassori 03fc28a1bf Implemented better option checking and defaults for builds. 2024-01-01 18:29:28 +01:00
emiliano.vavassori 8c4bc65184 Fixing duplication of functions in the previous commit. 2024-01-01 16:57:38 +01:00
emiliano.vavassori 5d6c7f2df2 Fixing multiple query build not working. 2024-01-01 16:36:50 +01:00
emiliano.vavassori 608eeb6071 Fixed a syntax error finding a file in a non-existent subdirectory. 2024-01-01 13:33:19 +01:00
emiliano.vavassori 2a617b1824 Fixing checksum function. 2024-01-01 13:18:55 +01:00
emiliano.vavassori 36da47b0b5 Fixing cleanup. 2024-01-01 10:02:15 +01:00
emiliano.vavassori 700d1eb376 Adding missing dependency. 2023-12-31 22:08:28 +01:00
emiliano.vavassori 3b01fc3b05 Fixed syntax error. 2023-12-09 23:36:54 +01:00
emiliano.vavassori 2277523f3c Removing unmet dependency. 2023-12-09 23:35:57 +01:00
emiliano.vavassori 3dbfef9fbe Cleaning up the downloads directory before Daily and Prerelease builds, but leaving already-downloaded versions in place. 2023-12-09 23:26:03 +01:00
emiliano.vavassori 1a24f54d89 Revised and tested build cli subcommand. 2023-12-05 00:12:10 +01:00
emiliano.vavassori 024535afa9 Revised and tested getversion flow. Revised but not tested the build flows. 2023-12-04 00:51:58 +01:00
emiliano.vavassori d9775f4f94 First steps to refactoring the code and giving it a clear structure. 2023-12-03 04:16:31 +01:00
emiliano.vavassori 28528aa063 Adding Dockerfile to create a container to build. 2023-12-02 03:01:14 +01:00
emiliano.vavassori 142a09df14 Code for passing also a date for daily builds. Possibly blocked by cli command. 2023-12-02 03:00:46 +01:00
emiliano.vavassori 71a81b6a8e Fixing the Readme a bit. 2023-12-01 23:29:41 +01:00
emiliano.vavassori bb1e73fd6c Fixing last bits. 2023-12-01 23:15:57 +01:00
emiliano.vavassori a74b8d4858 Fixing version in setup. 2023-12-01 22:55:26 +01:00
emiliano.vavassori 8bd23dd08b Adding pypi uploading to pipeline. 2023-12-01 22:44:42 +01:00
emiliano.vavassori 7fe48c297d If repo is '.', resolve absolute path. 2023-12-01 22:44:24 +01:00
emiliano.vavassori 82e366c5da Redownloading again all files in case of prereleases or daily builds. 2023-12-01 21:55:13 +01:00
emiliano.vavassori 49e0ab5593 Reimplemented md5sum with python functions. 2023-12-01 21:42:52 +01:00
emiliano.vavassori 62248d862c Fixing daily URL breakage (issue #1).
Since builds can come from different tinderboxes, adding an extra request and HTML parsing with xpath makes the final URL less dependent on hardcoded strings.
2023-12-01 21:20:15 +01:00
Emiliano Vavassori 0dc3c97758 Updating release number after dependencies.
continuous-integration/drone: Build is passing
2023-12-01 00:07:33 +01:00
Emiliano Vavassori 8bc7290a6f Restoring drone default settings. 2023-12-01 00:06:07 +01:00
Emiliano Vavassori 2fec2a2de6 forcing release.
continuous-integration/drone: Build is failing
2023-12-01 00:03:58 +01:00
Emiliano Vavassori bb002e88dc Fixing drone pipeline
continuous-integration/drone: Build is passing
2023-11-30 23:36:45 +01:00
Emiliano Vavassori ad6d85e423 Updated dependencies.
continuous-integration/drone: Build is passing
2023-11-30 23:03:37 +01:00
Emiliano Vavassori 0215324bba Added venv activation/deactivation. 2023-07-03 00:00:49 +02:00
Emiliano Vavassori 47b8e0cf2a Added README, changed build options to suit the LibreItalia Drone instance.
continuous-integration/drone: Build is passing
2023-01-08 17:09:02 +01:00
Emiliano Vavassori b087e85ec5 Simplified some messages in the output. 2023-01-07 23:22:35 +01:00
Emiliano Vavassori 3a9f13594c Probably the definitive fix. 2023-01-07 23:08:33 +01:00
Emiliano Vavassori 1f83db6105 Restructured the verification code. 2023-01-07 22:57:15 +01:00
Emiliano Vavassori 8b5c87f801 Fixes to full path handling. 2023-01-07 22:52:11 +01:00
Emiliano Vavassori 0bea7a81bc Fixed a couple of syntax errors on multiple paths. 2023-01-07 22:17:18 +01:00
Emiliano Vavassori 78a43350ed Changed loaih version. 2023-01-07 21:48:24 +01:00
Emiliano Vavassori 8c3e649a25 Fixed the YAML file for the builds. 2023-01-07 21:46:40 +01:00
Emiliano Vavassori db01651251 Handled the build verification error when the folder is not found. 2023-01-07 02:23:34 +01:00
Emiliano Vavassori 64effab3d7 Fixed all occurrences of the renamed variable. 2023-01-07 02:06:05 +01:00
Emiliano Vavassori 0a3f475fa6 Script is linted. Fixed code in the build check. 2023-01-07 02:03:31 +01:00
Emiliano Vavassori 03620cf013 Fixed path appending. 2023-01-07 01:37:17 +01:00
Emiliano Vavassori c015aeea99 Fixed checksum code. 2023-01-07 01:25:32 +01:00
Emiliano Vavassori 0a18586201 Added text for the phases. Revised the procedure. 2023-01-07 01:03:21 +01:00
Emiliano Vavassori 5df2c5dbdb More work on the build. Added strings for debugging. 2023-01-07 00:59:38 +01:00
Emiliano Vavassori 41dcbe1718 Added debug code, fixed the checksum command invocation. 2023-01-07 00:52:21 +01:00
Emiliano Vavassori 2405601d2d Fixed the build code. 2023-01-07 00:46:45 +01:00
Emiliano Vavassori dff74f0a35 Restored the original instructions, with a call to a local command. 2023-01-07 00:30:56 +01:00
Emiliano Vavassori 6bfa6a3707 More fixes to the checksum process. 2023-01-07 00:24:35 +01:00
Emiliano Vavassori 059518ccbf A few more fixes to the checksum-generation code. 2023-01-07 00:20:56 +01:00
Emiliano Vavassori 7013318188 Replaced the external program call with Python code. 2023-01-07 00:15:42 +01:00
Emiliano Vavassori f6bcf610ba Fixed md5. 2023-01-07 00:07:03 +01:00
Emiliano Vavassori 3aab2626ed Fixed the script. 2023-01-07 00:02:20 +01:00
Emiliano Vavassori df079c91b5 Fixed publishing. 2023-01-06 23:55:30 +01:00
Emiliano Vavassori a0c4fbcad0 Fixed rsync. 2023-01-06 23:39:29 +01:00
Emiliano Vavassori a316b85afe Fixed a wrong command typed on the console - 1. 2023-01-06 23:29:27 +01:00
Emiliano Vavassori 60b246c548 Fixed a wrong command typed on the console. 2023-01-06 23:23:30 +01:00
Emiliano Vavassori a0c6217d95 Fixed up the build. 2023-01-05 19:58:45 +01:00
Emiliano Vavassori f95c4d4b1d Fixed up the upload with rsync. 2023-01-05 18:28:11 +01:00
Emiliano Vavassori 9d259c4aa5 Changed the build logic when no build is present. 2023-01-05 01:59:55 +01:00
Emiliano Vavassori 07a895c86c Fix for the checksum returning an invalid value. 2023-01-05 01:56:33 +01:00
Emiliano Vavassori 05533bf5e2 Fixed xpath syntax. 2023-01-05 01:36:53 +01:00
Emiliano Vavassori 38a78860b0 Fixed the procedure after the debug code. 2023-01-05 01:33:26 +01:00
Emiliano Vavassori da31e1655b Added debug code. 2023-01-05 01:30:06 +01:00
Emiliano Vavassori ae9668554a Changes for correct execution of the online check. 2023-01-05 01:24:37 +01:00
Emiliano Vavassori 9cf3119489 Fixed syntax in the build. 2023-01-05 01:14:28 +01:00
Emiliano Vavassori cbaf6b2e3c Also fixed the build file for the test. 2023-01-05 01:08:35 +01:00
Emiliano Vavassori df5012eedd First implementation of the remote build check and upload via rsync + ssh. 2023-01-05 01:07:12 +01:00
Emiliano Vavassori 2c19eefa05 Removed superfluous pipeline steps. 2023-01-04 23:44:10 +01:00
14 changed files with 1063 additions and 662 deletions


@@ -1,5 +1,6 @@
---
kind: pipeline
type: docker
name: default
steps:
@@ -8,33 +9,25 @@ steps:
commands:
- pip install wheel
- python setup.py bdist_wheel
- mkdir out
- cp dist/*.whl out/
when:
event: tag
- name: release
image: plugins/gitea-release
settings:
api_key:
from_secret: gitea-deploy
base_url: https://git.sys42.eu/
files: out/*.whl
checksum:
- md5
from_secret: loaih-deploy
base_url: https://git.libreitalia.org
files: dist/*.whl
checksum: md5
draft: true
when:
event: tag
- name: handycopy
image: drillster/drone-rsync
- name: publish
image: plugins/pypi
settings:
hosts: deimos.sys42.eu
user: syntaxerrormmm
port: 45454
key:
from_secret: fisso-ssh-key
source: out/*.whl
target: ~/
when:
event: tag
username: __token__
password:
from_secret: pypi
trigger:
event:
- tag
- push

1
.gitignore vendored

@@ -1,4 +1,5 @@
venv
test
build
dist
loaih.egg-info

5
Dockerfile Normal file

@@ -0,0 +1,5 @@
FROM python:3.9-slim
RUN pip install loaih
ENTRYPOINT [ "/usr/local/bin/loaih" ]
CMD [ "--help" ]

18
README.md Normal file

@@ -0,0 +1,18 @@
# LibreOffice AppImage Helper - `loaih` #
LibreOffice AppImage Helper is an enhanced Python port of [previous work
by Antonio Faccioli](https://github.com/antoniofaccioli/libreoffice-appimage).
It helps build a LibreOffice AppImage from officially released .deb files,
with a number of options.
## Getting options and help ##
You can ask the app for information on how to use it:
$ loaih --help
$ loaih getversion --help
$ loaih build --help
$ loaih batch --help
For any further information, please visit the [wiki of the
project](https://git.libreitalia.org/libreitalia/loaih/wiki).


@@ -11,10 +11,16 @@ if [[ ${retval} -ne 0 ]]; then
# for the sake of consistency, let's make the check_updates.sh script
# executable
chmod +x check_updates.sh
if [[ -d venv ]]; then
source venv/bin/activate
fi
pip3 uninstall -y loaih
# build the actual toolkit
python3 setup.py bdist_wheel
pip3 install dist/*.whl; rv=$?
if [[ -d venv ]]; then
deactivate
fi
if [[ ${rv} -eq 0 ]]; then
# cleanup


@@ -1,67 +1,69 @@
---
data:
repo: /mnt/appimage
repo: https://appimages.libreitalia.org
remote_host: ciccio
remote_path: /var/lib/nethserver/vhost/appimages
download: /var/tmp/downloads
force: no
sign: yes
force: false
sign: true
builds:
- query: daily
language: basic
offline_help: no
portable: no
offline_help: false
portable: false
- query: daily
language: basic
offline_help: yes
portable: no
offline_help: true
portable: false
- query: daily
language: basic
offline_help: no
portable: yes
offline_help: false
portable: true
- query: daily
language: basic
offline_help: yes
portable: yes
offline_help: true
portable: true
- query: daily
language: standard
offline_help: no
portable: no
offline_help: false
portable: false
- query: daily
language: standard
offline_help: yes
portable: no
offline_help: true
portable: false
- query: daily
language: standard
offline_help: no
portable: yes
offline_help: false
portable: true
- query: daily
language: standard
offline_help: yes
portable: yes
offline_help: true
portable: true
- query: daily
language: full
offline_help: no
portable: no
offline_help: false
portable: false
- query: daily
language: full
offline_help: yes
portable: no
offline_help: true
portable: false
- query: daily
language: full
offline_help: no
portable: yes
offline_help: false
portable: true
- query: daily
language: full
offline_help: yes
portable: yes
offline_help: true
portable: true


@@ -1,67 +1,69 @@
---
data:
repo: /mnt/appimage
repo: https://appimages.libreitalia.org
remote_host: ciccio
remote_path: /var/lib/nethserver/vhost/appimages
download: /var/tmp/downloads
force: no
sign: yes
force: false
sign: true
builds:
- query: fresh
language: basic
offline_help: no
portable: no
offline_help: false
portable: false
- query: fresh
language: basic
offline_help: yes
portable: no
offline_help: true
portable: false
- query: fresh
language: basic
offline_help: no
portable: yes
offline_help: false
portable: true
- query: fresh
language: basic
offline_help: yes
portable: yes
offline_help: true
portable: true
- query: fresh
language: standard
offline_help: no
portable: no
offline_help: false
portable: false
- query: fresh
language: standard
offline_help: yes
portable: no
offline_help: true
portable: false
- query: fresh
language: standard
offline_help: no
portable: yes
offline_help: false
portable: true
- query: fresh
language: standard
offline_help: yes
portable: yes
offline_help: true
portable: true
- query: fresh
language: full
offline_help: no
portable: no
offline_help: false
portable: false
- query: fresh
language: full
offline_help: yes
portable: no
offline_help: true
portable: false
- query: fresh
language: full
offline_help: no
portable: yes
offline_help: false
portable: true
- query: fresh
language: full
offline_help: yes
portable: yes
offline_help: true
portable: true
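These YAML files drive bulk builds. As a minimal sketch (not part of this change set; it assumes the file above is saved as, say, fresh.yaml and that PyYAML is installed, as the batch subcommand later in this diff suggests), the structure maps onto build requests like this:

import yaml

# Illustrative only: the real file name and location are not fixed by this diff.
with open('fresh.yaml', 'r', encoding='utf-8') as file:
    config = yaml.safe_load(file)

# 'data' holds repository/download defaults; each entry under 'builds'
# describes one AppImage flavour (query, language, offline help, portable).
print(config['data']['repo'])
for build in config['builds']:
    print(build['query'], build['language'], build['offline_help'], build['portable'])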


@@ -1,213 +1,259 @@
#!/usr/bin/env python
# encoding: utf-8
"""machinery for compiling new versions of appimages."""
import urllib.request
from lxml import etree
from packaging.version import parse as parse_version
import datetime
import json
import re
import requests
import subprocess
import shlex
from lxml import html
class Definitions(object):
DOWNLOADPAGE = "https://www.libreoffice.org/download/download/"
ARCHIVE = "https://downloadarchive.documentfoundation.org/libreoffice/old/"
RELEASE = "https://download.documentfoundation.org/libreoffice/stable/"
DAILY = "https://dev-builds.libreoffice.org/daily/master/Linux-rpm_deb-x86_64@tb87-TDF/"
PRERELEASE = "https://dev-builds.libreoffice.org/pre-releases/deb/x86_64/"
# Constants
DOWNLOADPAGE = "https://www.libreoffice.org/download/download/"
ARCHIVE = "https://downloadarchive.documentfoundation.org/libreoffice/old/"
RELEASE = "https://download.documentfoundation.org/libreoffice/stable/"
DAILY = "https://dev-builds.libreoffice.org/daily/master/"
PRERELEASE = "https://dev-builds.libreoffice.org/pre-releases/deb/x86_64/"
SELECTORS = {
'still': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[last()]/text()'
},
'fresh': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[1]/text()'
},
'prerelease': {
'URL': DOWNLOADPAGE,
'xpath': '//p[@class="lead_libre"][last()]/following-sibling::ul[last()]/li/a/text()'
},
'daily': {
'URL': DAILY,
'xpath': '//td/a'
}
SELECTORS = {
'still': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[last()]/text()'
},
'fresh': {
'URL': DOWNLOADPAGE,
'xpath': '(//span[@class="dl_version_number"])[1]/text()'
},
'prerelease': {
'URL': DOWNLOADPAGE,
'xpath': '//p[@class="lead_libre"][last()]/following-sibling::ul[last()]/li/a/text()'
},
'daily': {
'URL': DAILY,
'xpath': '//td/a'
}
}
class Base(object):
# Class for static methods which might be useful even outside the build
# scripts.
@staticmethod
def dailyurl(date = datetime.datetime.today()):
"""Returns the URL for the latest valid daily build."""
# As per other parts of the build, we need to maintain an URL also for
# x86 versions that it isn't really provided.
# As such, the return value must be a dictionary
# Generic functions
def match_xpath(url: str, xpath: str):
"""Uses a couple of extensions to get results over webpage."""
resource = requests.get(url, timeout=10)
parsed = html.fromstring(resource.content)
return parsed.xpath(xpath)
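As an aside (not part of this commit), a minimal sketch of how the new helper above combines with the module-level SELECTORS table, assuming the module is importable as the loaih package as the rest of the diff suggests:

import loaih

# Fetch the download page and apply the xpath for the 'fresh' branch;
# match_xpath() returns the list of matched text nodes.
versions = loaih.match_xpath(
    loaih.SELECTORS['fresh']['URL'],
    loaih.SELECTORS['fresh']['xpath'])
if versions:
    print(sorted(versions)[-1])  # newest version string advertised on the page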
# Get the anchor for today's builds
a = etree.HTML(urllib.request.urlopen(Definitions.DAILY).read()).xpath("//td/a[contains(text(), '" + date.strftime('%Y-%m-%d') + "')]/text()")
if len(a) == 0:
# No results found, no version found, let's return a
return { 'x86': '-', 'x86_64': '-' }
# On the contrary, more than a version is found. let's order the
# list and get the latest item
return { 'x86': '-', 'x86_64': Definitions.SELECTORS['daily']['URL'] + sorted(a)[-1] }
# Classes
class Version():
"""Represent the skeleton of each queried version."""
@staticmethod
def dailyver(date = datetime.datetime.today()):
"""Returns versions present on the latest daily build."""
url = Base.dailyurl(date)['x86_64']
# If no daily releases has been provided yet, return empty
if url == '-':
return []
def __init__(self):
self.query = ''
self.branch = ''
self.version = ''
self.urls = {
'x86': '-',
'x86_64': '-'
}
# Rerun the page parsing, this time to find out the versions built
b = etree.HTML(urllib.request.urlopen(url).read()).xpath("//td/a[contains(text(), '_deb.tar.gz')]/text()")
# This should have returned the main package for a version, but can
# have returned multiple ones, so let's treat it as a list
return [ x.split('_')[1] for x in b ]
def appname(self):
"""Determines the app name based on the query branch determined."""
datematch = re.match(r'[0-9]{8}', self.query)
retval = 'LibreOffice'
if self.query in {'prerelease', 'daily', 'current', 'yesterday'} or datematch:
retval = 'LibreOfficeDev'
@staticmethod
def namedver(query):
"""Gets the version for a specific named version."""
if query == 'daily' or query == 'yesterday':
# Daily needs double parsing for the same result to apply.
# We first select today's build anchor:
date = datetime.datetime.today()
if query == 'yesterday':
# Use yesterdays' date for testing purposes.
date += datetime.timedelta(days=-1)
return Base.dailyver(date)
# In case the query isn't for daily
return etree.HTML(urllib.request.urlopen(Definitions.SELECTORS[query]['URL']).read()).xpath(Definitions.SELECTORS[query]['xpath'])
@staticmethod
def fullversion(version):
"""Get latest full version from Archive based on partial version."""
versionlist = etree.HTML(urllib.request.urlopen(Definitions.ARCHIVE).read()).xpath(f"//td/a[starts-with(text(), '{version}')]/text()")
if versionlist:
cleanlist = sorted([ x.strip('/') for x in versionlist ])
# Sorting, then returning the last version
return cleanlist[-1]
return None
@staticmethod
def urlfromqueryandver(query, version):
"""Returns the fetching URL based on the queried version and the numeric version of it."""
# This has the purpose to simplify and explain how the releases are
# layed out.
# If the query tells about daily or 'yesterday' (for testing purposes),
# we might ignore versions and return the value coming from dailyurl:
if query == 'daily':
return Base.dailyurl()
if query == 'yesterday':
date = datetime.datetime.today() + datetime.timedelta(days=-1)
return Base.dailyurl(date)
# All other versions will be taken from Archive, as such we need a full
# version.
# If the version has only 2 points in it (or splits into three parts by '.'), that's not a full version and we will call the getlatestver() function
fullversion = str(version)
if len(fullversion.split('.')) <= 3:
fullversion = str(Base.fullversion(version))
# So the final URL is the Archive one, plus the full versions, plus a
# final '/deb/' - and an arch subfolder
baseurl = Definitions.ARCHIVE + fullversion + '/deb/'
retval = {}
# x86 binaries are not anymore offered after 6.3.0.
if parse_version(fullversion) < parse_version('6.3.0'):
retval['x86'] = baseurl + 'x86/'
else:
retval['x86'] = '-'
retval['x86_64'] = baseurl + 'x86_64/'
return retval
@staticmethod
def collectedbuilds(query):
"""Creates a list of Builds based on each queried version found."""
retval = []
if '.' in query:
# Called with a numeric query. Pass it to RemoteBuild
retval.append(RemoteBuild(query))
def cleanup_downloads(self, path, verbose=False) -> None:
"""Cleanups the downloads folder to assure new versions are built."""
search_name = self.appname() + '_' + self.version
cmd = f"find {path} -iname {search_name}\\*.tar.gz -delete"
if verbose:
subprocess.run(shlex.split(cmd), stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
else:
# Named query
a = Base.namedver(query)
subprocess.run(shlex.split(cmd))
if not a:
# a is empty
return retval
if isinstance(a, list) and len(a) > 1:
retval.extend([ RemoteBuild(query, version) for version in a ])
else:
retval.append(RemoteBuild(query))
return sorted(retval, key=lambda x: x.version)
class RemoteBuild(object):
def __init__(self, query, version = None):
"""Should simplify the single builded version."""
self.query = query
self.version = ''
self.basedirurl = { 'x86': '-', 'x86_64': '-' }
if version and isinstance(version, str):
self.version = version
if not '.' in self.query:
# Named version.
# Let's check if a specific version was requested.
if self.version == '':
# In case it was not requested, we will carry on the generic
# namedver() query.
# If the results are more than one, we'll take the latest (since we are requested to provide a single build).
a = Base.namedver(self.query)
if isinstance(a, list):
# if the number of versions is zero, return and exit
if not a:
return None
if len(a) == 1:
# version is a single one.
self.version = a[0]
else:
# In this case, we will select the latest release.
self.version = sorted(a)[-1]
# If the version has already a version, as requested by user,
# continue using that version
else:
# In case of numbered queries, put it as initial version
self.version = self.query
if len(str(self.version).split('.')) < 4:
# If not 4 dotted, let's search for the 4 dotted version
self.version = Base.fullversion(self.version)
self.basedirurl = Base.urlfromqueryandver(self.query, self.version)
def todict(self):
def to_dict(self):
"""Returns a dictionary of versions."""
return {
'query': self.query,
'version': self.version,
'basedirurl': self.basedirurl
'basedirurl': self.urls
}
def to_json(self):
"""Returns a json representation of the version."""
return json.dumps(self.to_dict())
def __str__(self):
return f"""query: {self.query}
version: {self.version}
x86: {self.basedirurl['x86']}
x86_64: {self.basedirurl['x86_64']}"""
x86: {self.urls['x86']}
x86_64: {self.urls['x86_64']}"""
class QueryError(Exception):
"""Standard exception for errors regarding queries."""
class Solver():
"""Generic solver to call others."""
def __init__(self, text: str, default_to_current = False):
self.text = text
self.branch = text
self.version = None
self.default_to_current = default_to_current
self.baseurl = ARCHIVE
def solve(self):
"""Splits the query text possibilities, calling all the rest of the solvers."""
solver = self
if self.text in { 'current', 'yesterday', 'daily' }:
solver = DailySolver(self.text, self.default_to_current)
elif self.text in { 'still', 'fresh', 'prerelease' }:
solver = NamedSolver(self.text)
elif '.' in self.text:
solver = NumberedSolver(self.text)
else:
try:
int(self.text)
solver = DailySolver(self.text, self.default_to_current)
except ValueError:
raise QueryError("The queried version does not exist.")
self.version = solver.solve()
self.baseurl = solver.baseurl
return self.version
def to_version(self):
retval = Version()
retval.query = self.text
retval.branch = self.branch
retval.version = self.version
if retval.branch != 'daily':
retval.urls['x86_64'] = self.baseurl + 'x86_64/'
try:
x86ver = match_xpath(self.baseurl + 'x86/', '//td/a/text()')
except Exception:
return retval
if len(x86ver) > 1:
retval.urls['x86'] = self.baseurl + 'x86/'
else:
retval.urls['x86_64'] = self.baseurl
return retval
@staticmethod
def parse(text: str, default_to_current = False):
"""Calling the same as solver class."""
retval = Solver(text, default_to_current)
retval.solve()
return retval.to_version()
class DailySolver(Solver):
"""Specific solver to daily queries."""
def __init__(self, text: str, default_to_current = False):
super().__init__(text, default_to_current)
self.branch = 'daily'
self.baseurl = DAILY
def solve(self):
"""Get daily urls based on query."""
x = "//td/a[starts-with(text(),'Linux-rpm_deb-x86') and contains(text(),'TDF/')]/text()"
tinderbox_segment = match_xpath(self.baseurl, x)[-1]
self.baseurl = self.baseurl + tinderbox_segment
# Reiterate now to search for the dated version
xpath_query = "//td/a/text()"
daily_set = match_xpath(self.baseurl, xpath_query)
matching = ''
today = datetime.datetime.today()
try:
int(self.text)
matching = datetime.datetime.strptime(self.text, "%Y%m%d").strftime('%Y-%m-%d')
except ValueError:
# All textual version
if self.text in { 'current', 'daily' }:
matching = 'current'
elif self.text == 'yesterday':
matching = (today + datetime.timedelta(days=-1)).strftime("%Y-%m-%d")
results = sorted([ x for x in daily_set if matching in x ])
if len(results) == 0:
# No daily versions found.
if self.default_to_current:
solver = DailySolver('current')
self.version = solver.version
self.baseurl = solver.baseurl
else:
self.baseurl = self.baseurl + results[-1]
# baseurl for x86 is not available for sure on daily builds.
xpath_string = "//td/a[contains(text(), '_deb.tar.gz')]/text()"
links = match_xpath(self.baseurl, xpath_string)
if len(links) > 0:
link = str(links[-1])
self.version = link.rsplit('/', maxsplit=1)[-1].split('_')[1]
return self.version
class NamedSolver(Solver):
"""Solves the query knowing that the input is a named query."""
def __init__(self, text: str):
super().__init__(text)
self.branch = text
self.baseurl = SELECTORS[self.text]['URL']
self.generalver = ''
def solve(self):
"""Get versions from query."""
xpath_query = SELECTORS[self.text]['xpath']
results = sorted(match_xpath(self.baseurl, xpath_query))
if len(results) > 0:
self.generalver = str(results[-1])
result: str = self.generalver
xpath_string = f"//td/a[starts-with(text(),'{result}')]/text()"
archived_versions = sorted(match_xpath(ARCHIVE, xpath_string))
if len(archived_versions) == 0:
return self.version
# Return just the last versions
fullversion: str = str(archived_versions[-1])
self.baseurl = ARCHIVE + fullversion + 'deb/'
self.version = fullversion.rstrip('/')
return self.version
class NumberedSolver(Solver):
"""Specific solver for numbered versions."""
def __init__(self, text: str):
super().__init__(text)
self.branch = '.'.join(text.split('.')[0:2])
def solve(self):
xpath_string = f"//td/a[starts-with(text(),'{self.text}')]/text()"
versions = sorted(match_xpath(self.baseurl, xpath_string))
if len(versions) == 0:
# It is possible that in the ARCHIVE there's no such version (might be a prerelease)
return self.version
version = str(versions[-1])
self.baseurl = self.baseurl + version + 'deb/'
self.version = version.rstrip('/')
return self.version
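To illustrate the new solver API as a whole, a hedged sketch of querying a version (assuming the package is installed and the LibreOffice download pages are reachable):

import loaih

# Solver.parse() accepts named queries ('still', 'fresh', 'prerelease',
# 'daily'), a date such as '20231201', or a (partial) version number.
version = loaih.Solver.parse('fresh')

print(version.version)         # the full four-part version for that branch
print(version.urls['x86_64'])  # base URL holding the .deb tarballs
print(version.to_json())       # same data as JSON, as used by `loaih getversion -j`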


@@ -1,50 +1,81 @@
#!/usr/bin/env python3
# encoding: utf-8
"""Classes and functions to build an AppImage."""
import urllib.request
import os
import glob
import subprocess
import shutil
import re
import shlex
import tempfile
import hashlib
import requests
import loaih
from lxml import etree
import tempfile, os, sys, glob, subprocess, shutil, re, shlex
class Collection(list):
"""Aggregates metadata on a collection of builds."""
def __init__(self, query, arch = ['x86', 'x86_64']):
"""Build a list of version to check/build for this round."""
super().__init__()
self.extend([ Build(query, arch, version) for version in loaih.Base.collectedbuilds(query) ])
class Build(loaih.RemoteBuild):
LANGSTD = [ 'ar', 'de', 'en-GB', 'es', 'fr', 'it', 'ja', 'ko', 'pt', 'pt-BR', 'ru', 'zh-CN', 'zh-TW' ]
version = loaih.Solver.parse(query)
# If a version is not buildable, discard it now!
arch = [ x for x in arch if version.urls[x] != '-' ]
self.extend([ Build(version, ar) for ar in arch ])
class Build():
"""Builds a single version."""
LANGSTD = [ 'ar', 'de', 'en-GB', 'es', 'fr', 'it', 'ja', 'ko', 'pt',
'pt-BR', 'ru', 'zh-CN', 'zh-TW' ]
LANGBASIC = [ 'en-GB' ]
ARCHSTD = [ u'x86', u'x86_64' ]
ARCHSTD = [ 'x86', 'x86_64' ]
def __init__(self, query, arch, version = None):
super().__init__(query, version)
def __init__(self, version: loaih.Version, arch):
self.version = version
self.tidy_folder = True
self.verbose = True
self.arch = arch
self.short_version = str.join('.', self.version.split('.')[0:2])
self.branch_version = None
if not '.' in self.query:
self.branch_version = self.query
self.url = self.basedirurl
self.short_version = str.join('.', self.version.version.split('.')[0:2])
self.branch_version = self.version.branch
self.url = self.version.urls[arch]
# Other default values
# Other default values - for structured builds
# Most likely will be overridden by cli
self.language = 'basic'
self.offline_help = False
self.portable = False
self.updatable = True
self.sign = True
self.repo_type = 'local'
self.remote_host = ''
self.remote_path = ''
self.storage_path = '/mnt/appimage'
self.download_path = '/var/tmp/downloads'
self.appnamedir = ''
# Specific build version
self.appname = self.version.appname()
self.appversion = ''
self.appimagefilename = {}
self.zsyncfilename = {}
self.appimagedir = ''
self.appimagefilename = ''
self.zsyncfilename = ''
# Other variables by build
self.languagepart = '.' + self.language
self.helppart = ''
# Creating a tempfile
self.builddir = tempfile.mkdtemp()
self.tarballs = {}
self.built = { u'x86': False, u'x86_64': False }
self.found = False
self.built = False
# Preparing the default for the relative path on the storage for
# different versions.
# The path will evaluated as part of the check() function, as it is
@@ -53,10 +84,23 @@ class Build(loaih.RemoteBuild):
self.full_path = ''
self.baseurl = ''
def calculate(self):
"""Calculate exclusions and other variables."""
# AppName
self.appname = 'LibreOffice' if not self.query == 'daily' and not self.query == 'prerelease' else 'LibreOfficeDev'
if self.verbose:
print("--- Calculate Phase ---")
# let's check here if we are on a remote repo or local.
if self.storage_path.startswith("http"):
# Final repository is remote
self.repo_type = 'remote'
if self.verbose:
print("Repo is remote.")
else:
self.repo_type = 'local'
if self.verbose:
print("Repo is local.")
# Calculating languagepart
self.languagepart = "."
@@ -66,199 +110,435 @@ class Build(loaih.RemoteBuild):
self.languagepart += self.language
# Calculating help part
self.helppart = '.help' if self.offline_help else ''
if self.offline_help:
self.helppart = '.help'
# Building the required names
for arch in Build.ARCHSTD:
self.appimagefilename[arch] = self.__gen_appimagefilename__(self.version, arch)
self.zsyncfilename[arch] = self.appimagefilename[arch] + '.zsync'
self.appimagefilename = self.__gen_appimagefilename__()
self.zsyncfilename = self.appimagefilename + '.zsync'
# Mandate to the private function to calculate the full_path available
# for the storage and the checks.
self.__calculate_full_path__()
def __gen_appimagefilename__(self, version, arch):
def check(self):
"""Checking if the requested AppImage has been already built."""
if self.branch_version == 'daily':
# Daily versions have to be downloaded and built each time; no
# matter if another one is already present.
return
if self.verbose:
print("--- Check Phase ---")
if len(self.appimagefilename) == 0:
self.calculate()
if self.verbose:
print(f"Searching for {self.appimagefilename}")
# First, check if by metadata the repo is remote or not.
if self.repo_type == 'remote':
# Remote storage. I have to query a remote site to know if it
# was already built.
name = self.appimagefilename
url = self.storage_path.rstrip('/') + self.full_path + '/'
matching = []
try:
if len(loaih.match_xpath(url, f"//a[contains(@href,'{name}')]/@href")) > 0:
# Already built.
self.found = True
except Exception:
# The URL specified do not exist. So it is to build.
self.found = False
else:
# Repo is local
command = f"find {self.full_path} -name {self.appimagefilename}"
res = subprocess.run(shlex.split(command),
capture_output=True,
env={ "LC_ALL": "C" },
text=True, encoding='utf-8', check=True)
if res.stdout and len(res.stdout.strip("\n")) > 0:
# All good, the command was executed fine.
self.found = True
if self.found:
if self.verbose:
print(f"Found requested AppImage: {self.appimagefilename}.")
def download(self):
"""Downloads the contents of the URL as it was a folder."""
if self.verbose:
print("--- Download Phase ---")
if self.found:
return
if self.verbose:
print(f"Started downloads for {self.version.version}. Please wait.")
# Checking if a valid path has been provided
if self.url == '-':
if self.verbose:
print(f"Cannot build for arch {self.arch}. Continuing with other arches.")
# Faking already built it so to skip other checks.
self.found = True
# Identifying downloads
self.tarballs = [ x for x in loaih.match_xpath(self.url, "//td/a/text()") if x.endswith('tar.gz') and 'deb' in x ]
# Create and change directory to the download location
os.makedirs(self.download_path, exist_ok = True)
os.chdir(self.download_path)
for archive in self.tarballs:
# If the archive is already there, do not do anything.
if os.path.exists(archive):
continue
# Download the archive
try:
self.__download_archive__(archive)
except Exception as error:
print(f"Failed to download {archive}: {error}.")
if self.verbose:
print(f"Finished downloads for {self.version.version}.")
def build(self):
"""Building all the versions."""
if self.verbose:
print("--- Building Phase ---")
if self.found:
return
# Preparation tasks
self.appnamedir = os.path.join(self.builddir, self.appname)
os.makedirs(self.appnamedir, exist_ok=True)
# And then cd to the appname folder.
os.chdir(self.appnamedir)
# Download appimagetool from github
appimagetoolurl = r"https://github.com/AppImage/AppImageKit/releases/"
appimagetoolurl += f"download/continuous/appimagetool-{self.arch}.AppImage"
self.__download__(appimagetoolurl, 'appimagetool')
os.chmod('appimagetool', 0o755)
# Build the requested version.
self.__unpackbuild__()
def checksums(self):
"""Create checksums of the built versions."""
# Skip checksum if initially the build was already found in the storage directory
if self.verbose:
print("--- Checksum Phase ---")
if self.found:
return
os.chdir(self.appnamedir)
if self.built:
for item in [ self.appimagefilename, self.zsyncfilename ]:
itempath = os.path.join(self.appnamedir, item)
if os.path.exists(itempath):
self.__create_checksum__(item)
def publish(self):
"""Moves built versions to definitive storage."""
if self.verbose:
print("--- Publish Phase ---")
if self.found:
# All files are already present in the full_path
return
os.chdir(self.appnamedir)
# Two cases here: local and remote storage_path.
if self.repo_type == 'remote':
# Remote first.
# Build destination directory
remotepath = self.remote_path.rstrip('/') + self.full_path
try:
if self.verbose:
subprocess.run(
r"rsync -rlIvz --munge-links *.AppImage* " +
f"{self.remote_host}:{remotepath}",
cwd=self.appnamedir, shell=True, check=True
)
else:
subprocess.run(
r"rsync -rlIvz --munge-links *.AppImage* " +
f"{self.remote_host}:{remotepath}",
cwd=self.appnamedir, shell=True, check=True,
stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL
)
finally:
pass
else:
# Local
# Forcing creation of subfolders, in case there is a new build
os.makedirs(self.full_path, exist_ok = True)
for file in glob.glob("*.AppImage*"):
subprocess.run(shlex.split(
f"cp -f {file} {self.full_path}"
), check=True)
def generalize_and_link(self, chdir = 'default'):
"""Creates the needed generalized files if needed."""
if self.verbose:
print("--- Generalize and Link Phase ---")
# If called with a pointed version, no generalize and link necessary.
if not self.branch_version:
return
# If a prerelease or a daily version, either.
if self.version.query in { 'daily', 'prerelease' }:
return
if chdir == 'default':
chdir = self.full_path
appimagefilename = r''
zsyncfilename = r''
# Creating versions for short version and query text
versions = [ self.short_version, self.branch_version ]
os.chdir(chdir)
# if the appimage for the reported arch is not found, skip to next
# arch
if not os.path.exists(self.appimagefilename):
return
# Doing it both for short_name and for branchname
for version in versions:
appimagefilename = f"{self.appname}-{version}"
appimagefilename += f"{self.languagepart}{self.helppart}"
appimagefilename += f'-{self.arch}.AppImage'
zsyncfilename = appimagefilename + '.zsync'
# Create the symlink
if self.verbose:
print(f"Creating {appimagefilename} and checksums.")
if os.path.exists(appimagefilename):
os.unlink(appimagefilename)
os.symlink(self.appimagefilename, appimagefilename)
# Create the checksum for the AppImage
self.__create_checksum__(appimagefilename)
# Do not continue if no zsync file is provided.
if not self.updatable:
continue
if self.verbose:
print(f"Creating zsync file for version {version}.")
if os.path.exists(zsyncfilename):
os.unlink(zsyncfilename)
shutil.copyfile(self.zsyncfilename, zsyncfilename)
# Editing the zsyncfile
subprocess.run(shlex.split(
r"sed --in-place 's/^Filename:.*$/Filename: " +
f"{appimagefilename}/' {zsyncfilename}"
), check=True)
self.__create_checksum__(zsyncfilename)
### Private methods ###
def __gen_appimagefilename__(self):
"""Generalize the construction of the name of the app."""
self.appversion = version + self.languagepart + self.helppart
return self.appname + f'-{self.appversion}-{arch}.AppImage'
self.appversion = self.version.version + self.languagepart + self.helppart
return self.appname + f'-{self.appversion}-{self.arch}.AppImage'
def __calculate_full_path__(self):
"""Calculate relative path of the build, based on internal other variables."""
if len(self.relative_path) == 0:
if self.query == 'daily':
self.relative_path.append('daily')
elif self.query == 'prerelease':
self.relative_path.append('prerelease')
if self.tidy_folder:
if self.branch_version == 'daily':
self.relative_path.append('daily')
elif self.branch_version == 'prerelease':
self.relative_path.append('prerelease')
# Not the same check, an additional one
if self.portable:
self.relative_path.append('portable')
# Not the same check, an additional one
if self.portable:
self.relative_path.append('portable')
fullpath_arr = self.storage_path.split('/')
# Joining relative path only if it is not null
if len(self.relative_path) > 0:
fullpath_arr.extend(self.relative_path)
self.full_path = re.sub(r"/+", '/', str.join('/', fullpath_arr))
# Fullpath might be intended two ways:
if self.repo_type == 'remote':
# Repository is remote
# we build full_path as it is absolute to the root of the
# storage_path.
self.full_path = '/'
if len(self.relative_path) >= 1:
self.full_path += str.join('/', self.relative_path)
else:
# Repository is local
# If it is remote or if it is local
fullpath_arr = self.storage_path.split('/')
# Joining relative path only if it is not null
if len(self.relative_path) > 0:
fullpath_arr.extend(self.relative_path)
self.full_path = re.sub(r"/+", '/', str.join('/', fullpath_arr))
if not os.path.exists(self.full_path):
os.makedirs(self.full_path, exist_ok = True)
def check(self):
"""Checking if the requested AppImage has been already built."""
if not len(self.appimagefilename) == 2:
self.calculate()
def __create_checksum__(self, file):
"""Internal function to create checksum file."""
for arch in self.arch:
print(f"Searching for {self.appimagefilename[arch]}")
res = subprocess.run(shlex.split(f"find {self.full_path} -name {self.appimagefilename[arch]}"), capture_output=True, env={ "LC_ALL": "C" }, text=True, encoding='utf-8')
retval = hashlib.md5()
with open(file, 'rb') as rawfile:
while True:
buf = rawfile.read(2**20)
if not buf:
break
if "No such file or directory" in res.stderr:
# Folder is not existent: so the version was not built
# Build stays false, and we go to the next arch
continue
retval.update(buf)
if res.stdout and len(res.stdout.strip("\n")) > 0:
# All good, the command was executed fine.
print(f"Build for {self.version} found.")
self.built[arch] = True
with open(f"{file}.md5", 'w', encoding='utf-8') as checkfile:
checkfile.write(f"{retval.hexdigest()} {os.path.basename(file)}")
if self.built[arch]:
print(f"The requested AppImage already exists on storage for {arch}. I'll skip downloading, building and moving the results.")
def __download_archive__(self, archive):
return self.__download__(self.url, archive)
def __download__(self, url: str, filename: str):
basename = filename
if '/' in filename:
basename = filename.split('/')[-1]
def download(self):
"""Downloads the contents of the URL as it was a folder."""
print(f"Started downloads for {self.version}. Please wait.")
for arch in self.arch:
# Checking if a valid path has been provided
if self.url[arch] == '-':
print(f"No build has been provided for the requested AppImage for {arch}. Continue with other options.")
# Faking already built it so to skip other checks.
self.built[arch] = True
continue
full_url = url
if url.endswith('/'):
# URL has to be completed with basename of filename
full_url = url + basename
if self.built[arch]:
print(f"A build for {arch} was already found. Skipping specific packages.")
continue
with requests.get(full_url, stream=True, timeout=10) as resource:
resource.raise_for_status()
with open(filename, 'wb') as file:
for chunk in resource.iter_content(chunk_size=8192):
file.write(chunk)
return filename
# Identifying downloads
contents = etree.HTML(urllib.request.urlopen(self.url[arch]).read()).xpath("//td/a")
self.tarballs[arch] = [ x.text for x in contents if x.text.endswith('tar.gz') and 'deb' in x.text ]
tarballs = self.tarballs[arch]
maintarball = tarballs[0]
# Create and change directory to the download location
os.makedirs(self.download_path, exist_ok = True)
os.chdir(self.download_path)
for archive in tarballs:
# If the archive is already there, do not do anything.
if os.path.exists(archive):
continue
# Download the archive
try:
urllib.request.urlretrieve(self.url[arch] + archive, archive)
except:
print(f"Failed to download {archive}.")
print(f"Finished downloads for {self.version}.")
def build(self):
"""Building all the versions."""
for arch in self.arch:
if self.built[arch]:
# Already built for arch or path not available. User has already been warned.
continue
# Preparation tasks
self.appnamedir = os.path.join(self.builddir, self.appname)
os.makedirs(self.appnamedir, exist_ok=True)
# And then cd to the appname folder.
os.chdir(self.appnamedir)
# Download appimagetool from github
appimagetoolurl = f"https://github.com/AppImage/AppImageKit/releases/download/continuous/appimagetool-{arch}.AppImage"
urllib.request.urlretrieve(appimagetoolurl, 'appimagetool')
os.chmod('appimagetool', 0o755)
# Build the requested version.
self.__unpackbuild__(arch)
def __unpackbuild__(self, arch):
def __unpackbuild__(self):
# We start by filtering out tarballs from the list
buildtarballs = [ self.tarballs[arch][0] ]
buildtarballs = [ self.tarballs[0] ]
# Let's process standard languages and append results to the
# buildtarball
if self.language == 'basic':
if self.offline_help:
buildtarballs.extend([ x for x in self.tarballs[arch] if 'pack_en-GB' in x ])
buildtarballs.extend([ x for x in self.tarballs if 'pack_en-GB' in x ])
else:
buildtarballs.extend([ x for x in self.tarballs[arch] if 'langpack_en-GB' in x])
buildtarballs.extend([ x for x in self.tarballs if 'langpack_en-GB' in x])
elif self.language == 'standard':
for lang in Build.LANGSTD:
if self.offline_help:
buildtarballs.extend([ x for x in self.tarballs[arch] if ('pack_' + lang) in x ])
buildtarballs.extend([ x for x in self.tarballs if 'pack_' + lang in x ])
else:
buildtarballs.extend([ x for x in self.tarballs[arch] if ('langpack_' + lang) in x ])
buildtarballs.extend([ x for x in self.tarballs if 'langpack_' + lang in x ])
elif self.language == 'full':
if self.offline_help:
# We need also all help. Let's replace buildtarball with the
# whole bunch
buildtarballs = self.tarballs[arch]
buildtarballs = self.tarballs
else:
buildtarballs.extend([ x for x in self.tarballs[arch] if 'langpack' in x ])
buildtarballs.extend([ x for x in self.tarballs if 'langpack' in x ])
else:
# Looping for each language in self.language
for lang in self.language.split(","):
if self.offline_help:
buildtarballs.extend([ x for x in self.tarballs[arch] if ('pack' + lang) in x ])
buildtarballs.extend([ x for x in self.tarballs
if 'pack' + lang in x ])
else:
buildtarballs.extend([ x for x in self.tarballs[arch] if ('langpack' + lang) in x ])
buildtarballs.extend([ x for x in self.tarballs
if 'langpack' + lang in x ])
os.chdir(self.appnamedir)
# Unpacking the tarballs
if self.verbose:
print("---- Unpacking ----")
for archive in buildtarballs:
subprocess.run(shlex.split(f"tar xzf {self.download_path}/{archive}"))
subprocess.run(shlex.split(
f"tar xzf {self.download_path}/{archive}"), check=True)
# create appimagedir
if self.verbose:
print("---- Preparing the build ----")
self.appimagedir = os.path.join(self.builddir, self.appname, self.appname + '.AppDir')
os.makedirs(self.appimagedir, exist_ok = True)
# At this point, let's decompress the deb packages
subprocess.run(shlex.split("find .. -iname '*.deb' -exec dpkg -x {} . \;"), cwd=self.appimagedir)
subprocess.run(shlex.split(
r"find .. -iname '*.deb' -exec dpkg -x {} . \;"
), cwd=self.appimagedir, check=True)
if self.portable:
subprocess.run(shlex.split("find . -type f -iname 'bootstraprc' -exec sed -i 's|^UserInstallation=.*|UserInstallation=\$SYSUSERCONFIG/libreoffice/%s|g' {} \+" % self.short_version), cwd=self.appimagedir)
subprocess.run(shlex.split(
r"find . -type f -iname 'bootstraprc' " +
r"-exec sed -i 's|^UserInstallation=.*|" +
r"UserInstallation=\$SYSUSERCONFIG/libreoffice/%s|g' {} \+" % self.short_version
), cwd=self.appimagedir, check=True)
# Changing desktop file
subprocess.run(shlex.split("find . -iname startcenter.desktop -exec cp {} . \;"), cwd=self.appimagedir)
subprocess.run(shlex.split("sed --in-place 's:^Name=.*$:Name=%s:' startcenter.desktop > startcenter.desktop" % self.appname), cwd=self.appimagedir)
subprocess.run(shlex.split(
r"find . -iname startcenter.desktop -exec cp {} . \;"
), cwd=self.appimagedir, check=True)
subprocess.run(shlex.split("find . -name '*startcenter.png' -path '*hicolor*48x48*' -exec cp {} . \;"), cwd=self.appimagedir)
subprocess.run(shlex.split(
f"sed --in-place \'s:^Name=.*$:Name={self.appname}:\' " +
r"startcenter.desktop"
), cwd=self.appimagedir, check=False)
subprocess.run(shlex.split(
r"find . -name '*startcenter.png' -path '*hicolor*48x48*' " +
r"-exec cp {} . \;"
), cwd=self.appimagedir, check=True)
# Find the name of the binary called in the desktop file.
binaryname = ''
with open(os.path.join(self.appimagedir, 'startcenter.desktop'), 'r') as d:
a = d.readlines()
for line in a:
with open(
os.path.join(self.appimagedir, 'startcenter.desktop'),
'r', encoding="utf-8"
) as desktopfile:
for line in desktopfile.readlines():
if re.match(r'^Exec', line):
binaryname = line.split('=')[-1].split(' ')[0]
# Exit at the first match
break
#binary_exec = subprocess.run(shlex.split(r"awk 'BEGIN { FS = \"=\" } /^Exec/ { print $2; exit }' startcenter.desktop | awk '{ print $1 }'"), cwd=self.appimagedir, text=True, encoding='utf-8')
#binaryname = binary_exec.stdout.strip("\n")
bindir=os.path.join(self.appimagedir, 'usr', 'bin')
os.makedirs(bindir, exist_ok = True)
subprocess.run(shlex.split("find ../../opt -iname soffice -path '*program*' -exec ln -sf {} ./%s \;" % binaryname), cwd=bindir)
subprocess.run(shlex.split(
r"find ../../opt -iname soffice -path '*program*' " +
r"-exec ln -sf {} ./%s \;" % binaryname
), cwd=bindir, check=True)
# Download AppRun from github
apprunurl = f"https://github.com/AppImage/AppImageKit/releases/download/continuous/AppRun-{arch}"
apprunurl = r"https://github.com/AppImage/AppImageKit/releases/"
apprunurl += f"download/continuous/AppRun-{self.arch}"
dest = os.path.join(self.appimagedir, 'AppRun')
urllib.request.urlretrieve(apprunurl, dest)
self.__download__(apprunurl, dest)
os.chmod(dest, 0o755)
# Dealing with extra options
@@ -268,103 +548,36 @@ class Build(loaih.RemoteBuild):
# adding zsync build if updatable
if self.updatable:
buildopts.append(f"-u 'zsync|{self.zsyncfilename[arch]}'")
buildopts.append(f"-u 'zsync|{self.zsyncfilename}'")
buildopts_str = str.join(' ', buildopts)
# Build the number-specific build
subprocess.run(shlex.split(f"{self.appnamedir}/appimagetool {buildopts_str} -v ./{self.appname}.AppDir/"), env={ "VERSION": self.appversion })
print(f"Built AppImage version {self.appversion}")
if self.verbose:
print("---- Start building ----")
subprocess.run(shlex.split(
f"{self.appnamedir}/appimagetool {buildopts_str} -v " +
f"./{self.appname}.AppDir/"
), env={ "VERSION": self.appversion }, check=True)
print("---- End building ----")
else:
subprocess.run(shlex.split(
f"{self.appnamedir}/appimagetool {buildopts_str} -v " +
f"./{self.appname}.AppDir/"
), env={ "VERSION": self.appversion }, stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL, check=True)
if self.verbose:
print(f"Built AppImage version {self.appversion}")
# Cleanup phase, before new run.
for deb in glob.glob(self.appnamedir + '/*.deb'):
os.remove(deb)
subprocess.run(shlex.split("find . -mindepth 1 -maxdepth 1 -type d -exec rm -rf {} \+"))
def checksums(self):
"""Create checksums of the built versions."""
# Skip checksum if initally the build was already found in the storage directory
if all(self.built.values()):
return
os.chdir(self.appnamedir)
for arch in self.arch:
for item in [ self.appimagefilename[arch], self.zsyncfilename[arch] ]:
# For any built arch, find out if a file exist.
self.__create_checksum__(item)
def __create_checksum__(self, file):
"""Internal function to create checksum file."""
checksum = subprocess.run(shlex.split(f"md5sum {file}"), capture_output=True, text=True, encoding='utf-8')
if checksum.stdout:
with open(f"{file}.md5", 'w') as c:
c.write(checksum.stdout)
def publish(self):
"""Moves built versions to definitive storage."""
if all(self.built.values()):
# All files are already present in the full_path
return
os.chdir(self.appnamedir)
# Forcing creation of subfolders, in case there is a new build
os.makedirs(self.full_path, exist_ok = True)
for file in glob.glob("*.AppImage*"):
subprocess.run(shlex.split(f"cp -f {file} {self.full_path}"))
def generalize_and_link(self):
"""Creates the needed generalized files if needed."""
# If called with a pointed version, no generalize and link necessary.
if not self.branch_version:
return
# If a prerelease or a daily version, either.
if self.query == 'daily' or self.query == 'prerelease':
return
appimagefilename = {}
zsyncfilename = {}
# Creating versions for short version and query text
versions = [ self.short_version, self.branch_version ]
for arch in Build.ARCHSTD:
# If already built, do not do anything.
if self.built[arch]:
continue
os.chdir(self.full_path)
# if the appimage for the reported arch is not found, skip to next
# arch
if not os.path.exists(self.appimagefilename[arch]):
continue
# Doing it both for short_name and for branchname
for version in versions:
appimagefilename[arch] = self.appname + '-' + version + self.languagepart + self.helppart + f'-{arch}.AppImage'
zsyncfilename[arch] = appimagefilename[arch] + '.zsync'
# Create the symlink
print(f"Creating {appimagefilename[arch]} and checksums.")
if os.path.exists(appimagefilename[arch]):
os.unlink(appimagefilename[arch])
os.symlink(self.appimagefilename[arch], appimagefilename[arch])
# Create the checksum for the AppImage
self.__create_checksum__(appimagefilename[arch])
# Do not continue if no zsync file is provided.
if not self.updatable:
continue
print(f"Creating zsync file for version {version}.")
if os.path.exists(zsyncfilename[arch]):
os.unlink(zsyncfilename[arch])
shutil.copyfile(self.zsyncfilename[arch], zsyncfilename[arch])
# Editing the zsyncfile
subprocess.run(shlex.split(f"sed --in-place 's/^Filename:.*$/Filename: {appimagefilename[arch]}/' {zsyncfilename[arch]}"))
self.__create_checksum__(zsyncfilename[arch])
subprocess.run(shlex.split(
r"find . -mindepth 1 -maxdepth 1 -type d -exec rm -rf {} \+"
), check=True)
self.built = True
def __del__(self):
"""Destructor"""


@@ -1,115 +1,217 @@
#!/usr/bin/env python
# encoding: utf-8
"""Helps with command line commands."""
import os
import shutil
import sys
import json
import click
import yaml
import loaih, loaih.build
import re, sys, json
import loaih
import loaih.build
@click.group()
def cli():
pass
"""Helps with command line commands."""
@cli.command()
@click.option('-j', '--json', 'jsonout', default=False, is_flag=True, help="Output format in json.")
@click.option('--default-to-current', '-d', is_flag=True, default=False, help="If no versions are found, default to current one (for daily builds). Default: do not default to current.")
@click.argument('query')
def getversion(query, jsonout):
b = []
def getversion(query, jsonout, default_to_current):
"""Get download information for named or numbered versions."""
batchlist = []
queries = []
if ',' in query:
queries.extend(query.split(','))
else:
queries.append(query)
for q in queries:
b.extend(loaih.Base.collectedbuilds(q))
for singlequery in queries:
elem = loaih.Solver.parse(singlequery, default_to_current)
if elem.version not in { None, "" }:
batchlist.append(elem)
if len(b) > 0:
if len(batchlist) > 0:
if jsonout:
click.echo(json.dumps([x.todict() for x in b]))
click.echo(json.dumps([x.to_dict() for x in batchlist ]))
else:
for v in b:
click.echo(v)
for value in batchlist:
click.echo(value)
@cli.command()
@click.option('-a', '--arch', 'arch', type=click.Choice(['x86', 'x86_64', 'all'], case_sensitive=False), default='all', help="Build the AppImage for a specific architecture. If there is no specific options, the process will build for both architectures (if available). Default: all")
@click.option('-c/-C', '--check/--no-check', 'check', default=True, help="Check in the final storage if the queried version is existent. Default: check")
@click.option('-d', '--download-path', 'download_path', default = '/var/tmp/downloads', type=str, help="Path to the download folder. Default: /var/tmp/downloads")
@click.option('-l', '--language', 'language', default = 'basic', type=str, help="Languages to be included. Options: basic, standard, full, a language string (e.g. 'it') or a list of languages comma separated (e.g.: 'en-US,en-GB,it'). Default: basic")
@click.option('-o/-O', '--offline-help/--no-offline-help', 'offline', default = False, help="Include or not the offline help for the chosen languages. Default: no offline help")
@click.option('-p/-P', '--portable/--no-portable', 'portable', default = False, help="Create a portable version of the AppImage or not. Default: no portable")
@click.option('-r', '--repo-path', 'repo_path', default = '/mnt/appimage', type=str, help="Path to the final storage of the AppImage. Default: /mnt/appimage")
@click.option('-s/-S', '--sign/--no-sign', 'sign', default=True, help="Wether to sign the build. Default: sign")
@click.option('-u/-U', '--updatable/--no-updatable', 'updatable', default = True, help="Create an updatable version of the AppImage or not. Default: updatable")
@click.option('-a', '--arch', 'arch', default='x86_64',
type=click.Choice(['x86', 'x86_64', 'all'], case_sensitive=False), help="Build the AppImage for a specific architecture. Default: x86_64")
@click.option('--check', '-c', is_flag=True, default=False, help="Checks in the repository path if the queried version is existent. Default: do not check")
@click.option('--checksums', '-e', is_flag=True, default=False, help="Create checksums for each created file (AppImage). Default: do not create checksums.")
@click.option('--keep-downloads', '-k', 'keep', is_flag=True, default=False, help="Keep the downloads folder after building the AppImage. Default: do not keep.")
@click.option('--languages', '-l', 'language', default='basic', type=str, help="Languages to be included. Options: basic, standard, full, a language string (e.g. 'it') or a list of languages comma separated (e.g.: 'en-US,en-GB,it'). Default: basic")
@click.option('--offline-help', '-o', 'offline', is_flag=True, default=False, help="Include the offline help pages for the chosen languages. Default: no offline help")
@click.option('--portable', '-p', 'portable', is_flag=True, default=False, help="Create a portable version of the AppImage or not. Default: no portable")
@click.option('--sign', '-s', is_flag=True, default=False, help="Sign the build with your default GPG key. Default: do not sign")
@click.option('--updatable', '-u', is_flag=True, default=False, help="Create an updatable AppImage (compatible with zsync2). Default: not updatable")
@click.option('--download-path', '-d', default='./downloads', type=str, help="Path to the download folder. Default: ./downloads")
@click.option('--repo-path', '-r', default='.', type=str, help="Path to the final storage of the AppImage. Default: current directory")
@click.argument('query')
def build(arch, language, offline, portable, updatable, download_path, repo_path, check, sign, query):
def build(arch, language, offline, portable, updatable, download_path, repo_path, check, checksums, sign, keep, query):
"""Builds an Appimage with the provided options."""
# Multiple query support
queries = []
if ',' in query:
queries.extend(query.split(','))
else:
queries.append(query)
# Parsing options
arches = []
if arch.lower() == 'all':
# We need to build it twice.
arches = [ u'x86', u'x86_64' ]
arches = ['x86', 'x86_64']
else:
arches = [ arch.lower() ]
arches = [arch.lower()]
    # Other more global variables
    repopath = os.path.abspath(repo_path)
    if not os.path.exists(repopath):
        os.makedirs(repopath, exist_ok=True)
    downloadpath = os.path.abspath(download_path)
    if not os.path.exists(downloadpath):
        os.makedirs(downloadpath, exist_ok=True)

    for myquery in queries:
        for appbuild in loaih.build.Collection(myquery, arches):
            # Configuration phase
            appbuild.tidy_folder = False
            appbuild.language = language
            appbuild.offline_help = offline
            appbuild.portable = portable
            appbuild.updatable = updatable
            appbuild.storage_path = repopath
            appbuild.download_path = downloadpath
            appbuild.sign = sign
            # Running phase
            appbuild.calculate()
            if check:
                appbuild.check()
            appbuild.download()
            appbuild.build()
            if checksums:
                appbuild.checksums()
            appbuild.publish()
            del appbuild

    if not keep:
        shutil.rmtree(downloadpath)
@cli.command()
@click.option("--verbose", '-v', is_flag=True, default=False, help="Show building phases.", show_default=True)
@click.argument("yamlfile")
def batch(yamlfile, verbose):
"""Builds a collection of AppImages based on YAML file."""
# Defaults for a batch building is definitely more different than a
# manual one. To reflect this behaviour, I decided to split the commands
# between batch (bulk creation) and build (manual building).
# Check if yamlfile exists.
if not os.path.exists(os.path.abspath(yamlfile)):
click.echo(f"YAML file {yamlfile} does not exists or is unreadable.")
sys.exit(1)
    # This is a buildfile. So we have to load the file and pass the build
    # options ourselves.
    config = {}
    with open(os.path.abspath(yamlfile), 'r', encoding='utf-8') as file:
        config = yaml.safe_load(file)

    # Globals for yamlfile
    gvars = {}
    gvars['download_path'] = "/var/tmp/downloads"
    if 'download' in config['data'] and config['data']['download']:
        gvars['download_path'] = config['data']['download']
    gvars['force'] = False
    if 'force' in config['data'] and config['data']['force']:
        gvars['force'] = config['data']['force']
    gvars['storage_path'] = "/srv/http/appimage"
    if 'repo' in config['data'] and config['data']['repo']:
        gvars['storage_path'] = config['data']['repo']
    gvars['remoterepo'] = False
    gvars['remote_host'] = ''
    gvars['remote_path'] = "/srv/http/appimage"
    if 'http' in gvars['storage_path']:
        gvars['remoterepo'] = True
        gvars['remote_host'] = "ciccio.libreitalia.org"
    if 'remote_host' in config['data'] and config['data']['remote_host']:
        gvars['remote_host'] = config['data']['remote_host']
    if 'remote_path' in config['data'] and config['data']['remote_path']:
        gvars['remote_path'] = config['data']['remote_path']
    gvars['sign'] = False
    if 'sign' in config['data'] and config['data']['sign']:
        gvars['sign'] = True
    # With the config file, we ignore all the command line options and set
    # generic defaults.
    for cbuild in config['builds']:
        # Loop a run for each build.
        collection = loaih.build.Collection(cbuild['query'])
        for obj in collection:
            # Configuration phase
            obj.verbose = verbose
            obj.language = 'basic'
            if 'language' in cbuild and cbuild['language']:
                obj.language = cbuild['language']
            obj.offline_help = False
            if 'offline_help' in cbuild and cbuild['offline_help']:
                obj.offline_help = cbuild['offline_help']
            obj.portable = False
            if 'portable' in cbuild and cbuild['portable']:
                obj.portable = cbuild['portable']
            obj.updatable = True
            obj.storage_path = gvars['storage_path']
            obj.download_path = gvars['download_path']
            obj.remoterepo = gvars['remoterepo']
            obj.remote_host = gvars['remote_host']
            obj.remote_path = gvars['remote_path']
            obj.sign = gvars['sign']
            # Build phase
            obj.calculate()
            if not gvars['force']:
                obj.check()
            obj.download()
            obj.build()
            obj.checksums()
            if obj.remoterepo and obj.appnamedir:
                obj.generalize_and_link(obj.appnamedir)
            obj.publish()
            if not obj.remoterepo:
                obj.generalize_and_link()
            del obj
    # In case prerelease or daily branches are used, clean up the download
    # folder after finishing the complete run (to make sure the next run
    # will redownload all the needed files and is indeed fresh).
    # We sweep all the builds inside the collections to work out which files
    # to delete.
    for cbuild in config['builds']:
        # Loop a run for each build.
        for build in loaih.build.Collection(cbuild['query']):
            if build.version.branch in {'prerelease', 'daily'}:
                build.version.cleanup_downloads(gvars['download_path'], verbose)
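
For orientation, here is a minimal sketch of driving the same flow directly from Python, assuming loaih.build.Collection behaves exactly as the build subcommand above uses it (the attribute names and the calculate/check/download/build/checksums/publish sequence are taken from that code; the query string and paths below are illustrative placeholders, not project defaults):

# Illustrative sketch only: build a single 'still' AppImage for x86_64 using
# the loaih.build.Collection API the same way the 'build' subcommand does.
import loaih.build

for appbuild in loaih.build.Collection('still', ['x86_64']):
    appbuild.language = 'basic'
    appbuild.offline_help = False
    appbuild.portable = False
    appbuild.updatable = True
    appbuild.storage_path = '/tmp/appimage-repo'        # placeholder path
    appbuild.download_path = '/tmp/appimage-downloads'  # placeholder path
    appbuild.calculate()
    appbuild.check()      # as with --check: verify whether the version already exists in storage_path
    appbuild.download()
    appbuild.build()
    appbuild.checksums()
    appbuild.publish()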

View File

---
data:
  repo: https://appimages.libreitalia.org
  remote_host: ciccio
  remote_path: /var/lib/nethserver/vhost/appimages
  download: /var/tmp/downloads
  force: false
  sign: true
builds:
  - query: prerelease
    language: basic
    offline_help: false
    portable: false
  - query: prerelease
    language: basic
    offline_help: true
    portable: false
  - query: prerelease
    language: basic
    offline_help: false
    portable: true
  - query: prerelease
    language: basic
    offline_help: true
    portable: true
  - query: prerelease
    language: standard
    offline_help: false
    portable: false
  - query: prerelease
    language: standard
    offline_help: true
    portable: false
  - query: prerelease
    language: standard
    offline_help: false
    portable: true
  - query: prerelease
    language: standard
    offline_help: true
    portable: true
  - query: prerelease
    language: full
    offline_help: false
    portable: false
  - query: prerelease
    language: full
    offline_help: true
    portable: false
  - query: prerelease
    language: full
    offline_help: false
    portable: true
  - query: prerelease
    language: full
    offline_help: true
    portable: true
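
If this buildfile were saved as, say, prerelease.yaml (the filename here is only an assumption), the batch subcommand above would consume it. A minimal sketch using click's test runner, equivalent to running `loaih batch --verbose prerelease.yaml` from the shell once the package is installed:

# Illustrative only: run the 'batch' subcommand on the buildfile shown above.
# 'prerelease.yaml' is an assumed filename for that configuration.
from click.testing import CliRunner
from loaih.script import cli

runner = CliRunner()
result = runner.invoke(cli, ['batch', '--verbose', 'prerelease.yaml'])
print(result.exit_code)
print(result.output)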

View File

#!/usr/bin/env python
# encoding: utf-8
# vim:sts=4:sw=4
"""Helps building and automating LibreOffice AppImage builds."""
from pathlib import Path
from setuptools import setup, find_packages

this_directory = Path(__file__).parent
long_description = (this_directory / "README.md").read_text()

setup(
    name="loaih",
    version="1.3.3",
    description="LOAIH - LibreOffice AppImage Helpers, help build a LibreOffice AppImage",
    long_description=long_description,
    long_description_content_type='text/markdown',
    author="Emiliano Vavassori",
    author_email="syntaxerrormmm@libreoffice.org",
    packages=find_packages(exclude=['contrib', 'docs', 'tests']),
    entry_points={
        'console_scripts': [
            'loaih = loaih.script:cli',
        ],
    },
    install_requires=['click', 'lxml', 'packaging', 'pyyaml', 'requests'],
    license='MIT',
    url='https://git.libreitalia.org/LibreItalia/loaih/',
)
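
Since the entry point above wires the `loaih` console script to loaih.script:cli, the build subcommand defined earlier can also be exercised without a shell. A minimal sketch using click's test runner (the query and option values are examples only, not recommended defaults):

# Illustrative only: roughly equivalent to running
#   loaih build -a x86_64 -l it --offline-help --checksums still
# from the shell once the package is installed.
from click.testing import CliRunner
from loaih.script import cli

runner = CliRunner()
result = runner.invoke(cli, ['build', '--arch', 'x86_64', '--languages', 'it',
                             '--offline-help', '--checksums', 'still'])
print(result.exit_code)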

View File

---
data:
  repo: https://appimages.libreitalia.org
  remote_host: ciccio
  remote_path: /var/lib/nethserver/vhost/appimages
  download: /var/tmp/downloads
  force: false
  sign: true
builds:
  - query: still
    language: basic
    offline_help: false
    portable: false
  - query: still
    language: basic
    offline_help: true
    portable: false
  - query: still
    language: basic
    offline_help: false
    portable: true
  - query: still
    language: basic
    offline_help: true
    portable: true
  - query: still
    language: standard
    offline_help: false
    portable: false
  - query: still
    language: standard
    offline_help: true
    portable: false
  - query: still
    language: standard
    offline_help: false
    portable: true
  - query: still
    language: standard
    offline_help: true
    portable: true
  - query: still
    language: full
    offline_help: false
    portable: false
  - query: still
    language: full
    offline_help: true
    portable: false
  - query: still
    language: full
    offline_help: false
    portable: true
  - query: still
    language: full
    offline_help: true
    portable: true

View File

---
data:
  repo: https://appimages.libreitalia.org
  remote_host: ciccio
  remote_path: /var/lib/nethserver/vhost/appimages
  download: /var/tmp/downloads
  force: true
  sign: true
builds:
  - query: 7.2.3
    language: basic
    offline_help: false
    portable: false