Allow running ramalama without a GPU #1909
Merged
Red Hat Konflux kflux-prd-rh03 / Konflux kflux-prd-rh03 / llama-stack-on-pull-request
failed Sep 8, 2025 in 10s
Konflux kflux-prd-rh03/llama-stack-on-pull-request has failed.
Details
- Namespace: ramalama-tenant
- PipelineRun: llama-stack-on-pull-request-f9mm6
Task Statuses:
Pipeline ramalama-tenant/llama-stack-on-pull-request-f9mm6 can't be Run; it contains Tasks that don't exist:
Couldn't retrieve Task "resolver type bundles\nname = deprecated-image-check\n":
error requesting remote resource: error getting "bundleresolver" "ramalama-tenant/bundles-9cd463079c3aee4a1e01318fe7f54379":
cannot retrieve the oci image: GET https://quay.io/v2/konflux-ci/tekton-catalog/task-deprecated-image-check/manifests/sha256:c49732039f105de809840be396f83ead8c46f6a6948e1335b76d37e9eb469574: MANIFEST_UNKNOWN: manifest unknown; map[]