jenkinsci/docker

Latest alpine image uses unsupported JRE

Bitals opened this issue · 16 comments

Jenkins and plugins versions report


What Operating System are you using (both controller, and any agents involved in the problem)?

Official jenkins:alpine image (sha256:c550540e16695929938a84725bec6d0360ab64201bb3e02e9cb6bfab74e0900b)
Debian host

Reproduction steps

  1. Run the latest jenkins:alpine image (sha256:c550540e16695929938a84725bec6d0360ab64201bb3e02e9cb6bfab74e0900b as of now)
  2. It crashes with the following error:
Running with Java 8 from /usr/lib/jvm/java-1.8-openjdk/jre, which is older than the minimum required version (Java 11).
Supported Java versions are: [11, 17, 21]
See https://jenkins.io/redirect/java-support/ for more information.
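For reference, the failing image can be reproduced exactly by digest (the digest is the one reported above; `jenkins/jenkins` is assumed to be the Docker Hub repository the alpine tag is published under):

```
# Run the exact image from this report by digest; it should fail on startup
# with the Java 8 error shown above.
docker run --rm \
  jenkins/jenkins@sha256:c550540e16695929938a84725bec6d0360ab64201bb3e02e9cb6bfab74e0900b
```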

Expected Results

The official image uses a supported version of the JRE.

Actual Results

The official image uses an unsupported version of the JRE (Java 8).

Anything else?

The Dockerfile explicitly sets Java version 8 on lines 5, 7 and 8.
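One way to confirm that pinning without reading the Dockerfile is to inspect the environment baked into the image config (a sketch, assuming the digest above lives under `jenkins/jenkins`); JAVA_HOME should show the Java 8 path from the startup error:

```
# Print the env vars baked into the image config; JAVA_HOME and PATH should
# point at /usr/lib/jvm/java-1.8-openjdk/jre, matching the error output.
docker pull jenkins/jenkins@sha256:c550540e16695929938a84725bec6d0360ab64201bb3e02e9cb6bfab74e0900b
docker inspect --format '{{ json .Config.Env }}' \
  jenkins/jenkins@sha256:c550540e16695929938a84725bec6d0360ab64201bb3e02e9cb6bfab74e0900b
```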

The alpine-jdk11 and alpine-jdk17 images work after manually resetting/changing the PATH and JAVA_HOME env vars (see the sketch below), but the main alpine tag does not have any JRE other than 8 installed, so it is impossible to fix locally without modifying the running container or the image.
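A minimal sketch of that manual workaround on the jdk17 variant; the /opt/java/openjdk location is an assumption about where that image keeps its newer JRE, so verify it inside the container before relying on it:

```
# Override the Java 8 env vars at run time (paths are assumptions; check
# first with: docker run --rm --entrypoint sh jenkins/jenkins:alpine-jdk17 -c 'ls /opt').
docker run --rm \
  -e JAVA_HOME=/opt/java/openjdk \
  -e PATH=/opt/java/openjdk/bin:/usr/local/bin:/usr/bin:/bin \
  jenkins/jenkins:alpine-jdk17
```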

timja commented

Reproduced.

timja commented

Config looks right. If I build locally and do:

export LATEST_WEEKLY=true

it works just fine.
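For completeness, one way to check which Java such a local build actually ships (`jenkins-local:alpine` is a hypothetical tag for the local build; the entrypoint override is needed because the image normally starts Jenkins directly):

```
# Ask the image's java binary for its version instead of starting Jenkins.
docker run --rm --entrypoint java jenkins-local:alpine -version
```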

I can't see what's causing this.

Trusted.ci build logs also look fine

cc @dduportal

What is suspicious is that it says:

> Last pushed 21 minutes ago by [jenkinsinfraadmin](https://hub.docker.com/u/jenkinsinfraadmin)

but it was actually built 3 days ago
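One way to see through the "last pushed" timestamp is that the image's creation time lives in the image config and survives a re-push:

```
# "Created" comes from the image config, so a re-pushed old image keeps its
# original build date regardless of Docker Hub's "last pushed" display.
docker pull jenkins/jenkins:alpine
docker inspect --format '{{ .Created }}' jenkins/jenkins:alpine
```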

timja commented

I wonder if we just change the password and rotate the tokens to break whatever is pushing this =/

> I wonder if we just change the password and rotate the tokens to break whatever is pushing this =/

yep, I think we should do this

@timja I'm grepping the trusted.ci controller to check for the shasum c550540e16695929938a84725bec6d0360ab64201bb3e02e9cb6bfab74e0900b in the build logs, just to be sure there isn't a forgotten job (that's how I found the issue last time).
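Something along these lines, assuming a standard JENKINS_HOME layout on the controller (the path is an assumption):

```
# List the build logs that mention the rogue digest.
grep -l "c550540e16695929938a84725bec6d0360ab64201bb3e02e9cb6bfab74e0900b" \
  /var/jenkins_home/jobs/*/builds/*/log
```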

> I wonder if we just change the password and rotate the tokens to break whatever is pushing this =/

Rotation of the token in progress

Culprit found: no security issue, but an old tag of this repository was re-triggered and rebuilt: https://github.com/jenkinsci/docker/tree/jenkins-docker-packaging-2.164.2

I honestly have no idea why this tag was triggered (and not the others): trusted.ci is set up to only build tags that are less than 3 days old.

All credentials rotated, @timja.

I propose that we wait for next week's weekly and LTS releases to get the images fixed: does that look good to you?

@Bitals sorry for the inconvenience and thanks for reporting. I strongly suggest that you stick to a pinned tag, as the "latest" images cannot guarantee a fixed behavior.
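In concrete terms, pinning looks like this (a sketch; `jenkins/jenkins:2.401.3-alpine` assumes the usual `<version>-alpine` tag naming, and the digest form is the strictest option, shown with a placeholder to substitute):

```
# Pin to an explicit version tag instead of the floating alpine/latest tags:
docker run --name jenkins jenkins/jenkins:2.401.3-alpine
# Stricter still, pin by digest (substitute a digest you have verified):
#   docker run --name jenkins jenkins/jenkins@sha256:<verified-digest>
```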

> @Bitals sorry for the inconvenience and thanks for reporting. I strongly suggest that you stick to a pinned tag, as the "latest" images cannot guarantee a fixed behavior.

I know the risk and am fine with it. It's a homelab; I have regular backups and usually some time to troubleshoot and report stuff like this.

> @Bitals sorry for the inconvenience and thanks for reporting. I strongly suggest that you stick to a pinned tag, as the "latest" images cannot guarantee a fixed behavior.

From the Docker Hub, it looks like pinned versions have been overwritten too (e.g. lts-alpine and 2.405-alpine). We're using the lts-alpine tag, which broke this morning, and I was going to pin us to 2.401.3 (the last working version), but that has the same digest as the LTS tag. The oldest version I've spotted (though I haven't looked thoroughly) that hasn't been overwritten seems to be 2.387.2.

FWIW the -jdk11 images don't seem to be affected; I'm not sure how feasible it would be to switch to that image set instead.
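A quick way to confirm that two tags point at the same manifest, without pulling anything (assuming docker buildx is available; tag names from the report above):

```
# If the printed digests match, the pinned tag was overwritten with the
# same image that lts-alpine currently points at.
docker buildx imagetools inspect jenkins/jenkins:lts-alpine
docker buildx imagetools inspect jenkins/jenkins:2.401.3-alpine
```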

> > @Bitals sorry for the inconvenience and thanks for reporting. I strongly suggest that you stick to a pinned tag, as the "latest" images cannot guarantee a fixed behavior.
>
> From the Docker Hub, it looks like pinned versions have been overwritten too (e.g. lts-alpine and 2.405-alpine). We're using the lts-alpine tag, which broke this morning, and I was going to pin us to 2.401.3 (the last working version), but that has the same digest as the LTS tag. The oldest version I've spotted (though I haven't looked thoroughly) that hasn't been overwritten seems to be 2.387.2.
>
> FWIW the -jdk11 images don't seem to be affected; I'm not sure how feasible it would be to switch to that image set instead.

Good catch. It means we have to republish the images as soon as possible, and then inform users of the change so they don't inadvertently hit the problem.

I've started the publication of 2.419 (latest weekly) and 2.401.3 (latest LTS), as we use both of them on the public infrastructure.

The images for 2.401.1, 2.401.2, 2.401.3 and 2.419 have all been rebuilt.

Please note that 2.401.3 was rebuilt a third time to ensure the lts tags are properly defined.

Thank you!
Can I ask when (if at all) other pinned tags will be republished (as can be seen in #1691)?

It was done for the 3 LTS versions (2.401.1, 2.401.2 and 2.401.3) and the latest weekly version (2.419), as explained in https://www.jenkins.io/blog/2023/08/22/linux-containers-rebuilt/.

Since then, the weekly release 2.420 has been published and a new LTS line (2.414.1) has been started, neither of which has this problem.

I'm going to proceed and close this issue as fixed. Feel free to reopen with a reproduction example if you still see a problem.

  • If you were using a pinned weekly version <= 2.414 then you must upgrade to the new LTS 2.414.1
  • If you were using a pinned weekly version > 2.414 then you must upgrade to either 2.419 or the latest 2.420
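In concrete terms (the -alpine suffix is assumed, to match the variant discussed in this issue):

```
# Pick the line that matches the guidance above for your pinned version.
docker pull jenkins/jenkins:2.414.1-alpine   # new LTS line
docker pull jenkins/jenkins:2.420-alpine     # latest weekly at the time of closing
```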