tag: add description
Allow setting a custom description on a tag with the
"dlrepo-cli set-description BRANCH TAG DESCRIPTION" command.
The description is stored in an internal ".description" file inside
the tag directory.
The description is displayed on top of the tag page on the web
interface. It can also be read on the CLI using the
"dlrepo-cli get-description BRANCH TAG" command.
Signed-off-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Thomas Faivre <thomas.faivre@6wind.com>
Acked-by: Robin Jarry <robin@jarry.cc>
fs: allow disabling periodic cleanup
It is impractical to rely on the periodic cleanup because the time of
the cleanup depends on when the server started. For example, to get a
daily cleanup at 8 AM, the server itself must have been started at
8 AM.
Allow disabling periodic cleanup altogether by setting
DLREPO_TAG_CLEANUP_PERIOD to 0.
As a reminder, the tag cleanup can also be triggered from an external
script by sending the USR1 signal to the server. This method allows
more control over the cleanup schedule.
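For reference, here is a simplified sketch of both triggers, assuming
the period is read from DLREPO_TAG_CLEANUP_PERIOD (treated as seconds
here for simplicity); the function names are illustrative, not
dlrepo's:

    import asyncio
    import os
    import signal

    async def periodic_cleanup(run_cleanup) -> None:
        period = int(os.getenv("DLREPO_TAG_CLEANUP_PERIOD", "0"))
        if period == 0:
            return  # periodic cleanup disabled
        while True:
            await asyncio.sleep(period)
            await run_cleanup()

    def install_usr1_trigger(loop, run_cleanup) -> None:
        # External scripts can still run "kill -USR1 <server pid>"
        # to trigger a cleanup on demand.
        loop.add_signal_handler(
            signal.SIGUSR1, lambda: loop.create_task(run_cleanup())
        )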
Signed-off-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Thomas Faivre <thomas.faivre@6wind.com>
Acked-by: Robin Jarry <robin@jarry.cc>
requirements-dev: upgrade multidict to 6.0.5
Fix a compilation issue when using gcc-14 (i.e. debian/testing, which
is used in the CI):
> × Building wheel for multidict (pyproject.toml) did not run successfully.
> │ exit code: 1
> [...]
> note: This error originates from a subprocess, and is likely not a problem with pip.
> ERROR: Failed building wheel for multidict
> [...]
> Failed to build multidict
> ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (multidict)
> make: Leaving directory '/home/build/dlrepo'
> make: *** [Makefile:18: .venv/.stamp] Error 1
The release notes seem harmless, so no regression is expected.
Note: version 6.1.0 is out, but its release notes are more concerning.
Only focus on fixing the CI for now.
Link: https://github.com/aio-libs/multidict/releases/tag/v6.0.5
Link: https://github.com/aio-libs/multidict/issues/926
Signed-off-by: Thomas Faivre <thomas.faivre@6wind.com>
Acked-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Robin Jarry <robin@jarry.cc>
fmt: add content disposition header to redirection
When issuing a GET or HEAD request on a format folder that contains a
single artifact file, the response is an HTTP redirection to that
file. In that case, add a "Content-Disposition" HTTP header to the
response, so that it can be used with e.g. "curl -JOL" to save the
downloaded file using the actual file name instead of that of the
format folder.
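A rough sketch of the idea with aiohttp; the handler and the artifact
name below are made up for illustration, not taken from the actual
code:

    from aiohttp import web

    async def get_format(request: web.Request) -> web.StreamResponse:
        # Assume the format folder holds a single artifact file.
        name = "dlrepo-1.0.tar.gz"  # hypothetical artifact name
        return web.HTTPFound(
            location=name,
            headers={"Content-Disposition": f'attachment; filename="{name}"'},
        )

With that header, "curl -JOL" saves the redirected download under the
artifact's real name.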
Signed-off-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Thomas Faivre <thomas.faivre@6wind.com>
Acked-by: Robin Jarry <robin@jarry.cc>
fmt: allow modifying internal format in locked job
Internal formats are not released and are not included in the
calculation of the job digest.
They can therefore be safely added to or deleted from a locked job
without needing to unlock it first, so the uploader only needs the
"add" access, not the "update" access.
This is useful to add extra info after a job has been released, like
test results, or transient data such as CVE scans refreshed daily with
an up-to-date vulnerability database, without granting too many
permissions to the uploader.
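The intended access rule can be summarized by this illustrative check
(the parameter names are hypothetical, not the actual attributes):

    def required_access(job_locked: bool, fmt_internal: bool) -> str:
        # Internal formats never affect the job digest, so touching
        # them in a locked job does not require unlocking it first.
        if job_locked and not fmt_internal:
            return "update"
        return "add"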
Signed-off-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Thomas Faivre <thomas.faivre@6wind.com>
Acked-by: Robin Jarry <robin@jarry.cc>
requirements: update dependencies
Some of the dev dependencies are too old for python 3.12. Update to the
latest versions.
Fix/silence new pylint warnings.
Signed-off-by: Robin Jarry <robin@jarry.cc>
format: add delete method
It can be useful to delete a specific job format without deleting the
entire job: for example, if there is a mistake in some additional
documentation or a test result, it can be fixed without deleting and
re-uploading the whole job.
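A hypothetical usage sketch over HTTP; the URL layout is an
assumption, not necessarily the server's actual routes:

    import asyncio
    import aiohttp

    async def delete_format(base, branch, tag, job, fmt):
        url = f"{base}/branches/{branch}/{tag}/{job}/{fmt}/"
        async with aiohttp.ClientSession() as session:
            async with session.delete(url) as resp:
                resp.raise_for_status()

    asyncio.run(delete_format(
        "https://dlrepo.example.com", "main", "v1.0", "myjob", "doc"
    ))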
Signed-off-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Thomas Faivre <thomas.faivre@6wind.com>
Acked-by: Robin Jarry <robin@jarry.cc>
fs: ensure upload never replaces an existing blob
When a file is uploaded, it is first uploaded to .uploads/<uuid>, then
moved to .blobs/<digest>.
The .blobs/<digest> file is then hardlinked to the actual file path
into the job.
However, if the same file is uploaded more than once without checking
that its digest already exists on the server, any existing
.blobs/<digest> file is replaced with the newly uploaded copy of the
same content. As a consequence, the existing hardlinks become
orphaned.
What's more, if the most recent job is deleted afterwards, the
associated .blobs/<digest> file will have no other hardlink to it, so
it will be deleted at the next run of the cleanup task.
This is catastrophic for containers because it breaks container
pulls: "docker pull", for example, downloads its layers from
.blobs/<digest>. If the blob has been cleaned up, the pull command
fails with an "unknown blob" error, even though the missing blob can
still be seen under the job's container format.
The problem can be easily reproduced by running multiple parallel
"dlrepo-cli upload" of the same big file into different jobs.
To avoid this, check whether the blob already exists before moving
the uploaded file into place. If it does, keep the existing blob and
remove the newly uploaded one.
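In simplified terms, the fix amounts to something like the following
sketch (not the actual implementation):

    import os

    def commit_upload(upload_path: str, blob_path: str) -> None:
        if os.path.exists(blob_path):
            # The blob is already known: keep it (and every hardlink
            # pointing at it) and discard the redundant upload.
            os.unlink(upload_path)
        else:
            os.rename(upload_path, blob_path)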
Fixes: bd1c23893882 ("server: add filesystem api")
Signed-off-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Thomas Faivre <thomas.faivre@6wind.com>
Acked-by: Robin Jarry <robin@jarry.cc>
templates: add branch links to product page
To ease navigation between the product page and the original job(s) in
/branches/, add a "Branch Link(s)" section to the bottom of the
product page (similar to the "Product Link" section at the bottom of
the job page).
The branch links are not displayed if the user does not have the
rights to access the job (typically, customers should have access to
/products/ but not to /branches/).
Signed-off-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Thomas Faivre <thomas.faivre@6wind.com>
job: detect product link conflicts
Several jobs may be linked to the same product. For example,
we can have a job handling the generation of the binary packages (a
"bin" format), and another job handling the generation of the
documentation (a "doc" format) for the same product.
However, if some formats of the two jobs overlap (e.g. each job has a
"bin" format), the product format will point to the latest uploaded
job. This usually happens when there is a mistake in the declaration
of the product variant of one of the jobs. Since no error is reported,
the conflict can easily go unnoticed.
To avoid this, when updating the symlinks, let symlink_to() raise an
exception if the link already exists. The dlrepo-cli "set-info"
command will then return an error, making the conflict easier to
detect. Rename the update_symlink() method to reflect the change.
Note that this works because we should never have to update an
existing link: in set_metadata(), the current links are removed
before creating the new links.
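For reference, Path.symlink_to() raises FileExistsError when the link
path already exists, which is what surfaces the conflict; a small
standalone demonstration with made-up paths:

    from pathlib import Path

    product_dir = Path("products/acme/1.0/x86_64")  # hypothetical layout
    product_dir.mkdir(parents=True, exist_ok=True)
    link = product_dir / "bin"

    try:
        link.symlink_to("../../../../branches/main/v1.0/job1/bin")
    except FileExistsError:
        # Another job already provides this format for the product:
        # fail loudly instead of silently replacing the link.
        print(f"conflict: {link} already exists")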
Signed-off-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Thomas Faivre <thomas.faivre@6wind.com>
job: rm redundant code in _link_to_product()
The method iterates over the job formats to symlink each format into
the product folder. It then attempts to symlink the "container" format
explicitly. The latter is redundant since containers are already taken
care of in the loop.
Fixes: bd1c23893882 ("server: add filesystem api")
Signed-off-by: Julien Floret <julien.floret@6wind.com>
Acked-by: Thomas Faivre <thomas.faivre@6wind.com>