docker binaries in slave not mounted correctly
neuromantik33 opened this issue · 1 comment
Hello,
I'm experiencing an issue similar to #77, but the proposed fix does not work for me. My situation is a little different, however: I'm running the latest Jenkins on a bare-metal server on premises, and I configured the Kubernetes plugin to use a GKE cluster. I followed all the instructions detailed here: configuring jenkins, but when running from Jenkins, the build can't seem to find the mounts.
Here is an example of my pipeline:
node('k8s') {
    sh 'uname -a'
    sh 'id'
    sh 'whoami'
    sh 'printenv'
    sh 'mount'
    sh 'ls -l /usr/bin/docker || true'
    sh 'ls -l /var/run/docker.sock || true'
    sh 'which docker'
}
and its output:
Started by user Nicolas ESTRADA
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] node
Still waiting to schedule task
slave-2l7bp is offline
Agent slave-2l7bp is provisioned from template Kubernetes Pod Template
Agent specification [Kubernetes Pod Template] (k8s jdk ubuntu):
* [jenkins-slave] gcr.io/cloud-solutions-images/jenkins-k8s-slave:v4(resourceRequestCpu: , resourceRequestMemory: , resourceLimitCpu: , resourceLimitMemory: )
Running on slave-2l7bp in /home/jenkins/workspace/nick-pipeline-scratchpad
[Pipeline] {
[Pipeline] sh
[nick-pipeline-scratchpad] Running shell script
+ uname -a
Linux slave-2l7bp 4.4.111+ #1 SMP Thu Apr 5 21:21:21 PDT 2018 x86_64 Linux
[Pipeline] sh
[nick-pipeline-scratchpad] Running shell script
+ id
uid=10000(jenkins) gid=10000(jenkins) groups=10000(jenkins)
[Pipeline] sh
[nick-pipeline-scratchpad] Running shell script
+ whoami
jenkins
[Pipeline] sh
[nick-pipeline-scratchpad] Running shell script
+ printenv
JENKINS_HOME=/var/lib/jenkins
KUBERNETES_PORT=tcp://10.23.240.1:443
JENKINS_SECRET=**redacted**
KUBERNETES_SERVICE_PORT=443
JAVA_ALPINE_VERSION=8.151.12-r0
RUN_CHANGES_DISPLAY_URL=http://jenkins-linux.oscaroad.com/job/nick-pipeline-scratchpad/90/display/redirect?page=changes
HOSTNAME=slave-2l7bp
LD_LIBRARY_PATH=/usr/lib/jvm/java-1.8-openjdk/jre/lib/amd64/server:/usr/lib/jvm/java-1.8-openjdk/jre/lib/amd64:/usr/lib/jvm/java-1.8-openjdk/jre/../lib/amd64
NODE_LABELS=jdk k8s slave-2l7bp ubuntu
HUDSON_URL=http://jenkins-linux.oscaroad.com/
SHLVL=3
HOME=/home/jenkins
BUILD_URL=http://jenkins-linux.oscaroad.com/job/nick-pipeline-scratchpad/90/
HUDSON_COOKIE=41f4bf31-ce7b-4305-ae99-4ebeb7cd8e15
JENKINS_SERVER_COOKIE=durable-981b84a42f63e728576bf615e6fe571e
WORKSPACE=/home/jenkins/workspace/nick-pipeline-scratchpad
JAVA_VERSION=8u151
NODE_NAME=slave-2l7bp
EXECUTOR_NUMBER=0
KUBERNETES_PORT_443_TCP_ADDR=10.23.240.1
BUILD_DISPLAY_NAME=#90
HUDSON_HOME=/var/lib/jenkins
AGENT_WORKDIR=/home/jenkins/agent
JOB_BASE_NAME=nick-pipeline-scratchpad
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/lib/jvm/java-1.8-openjdk/jre/bin:/usr/lib/jvm/java-1.8-openjdk/bin
BUILD_ID=90
KUBERNETES_PORT_443_TCP_PORT=443
BUILD_TAG=jenkins-nick-pipeline-scratchpad-90
KUBERNETES_PORT_443_TCP_PROTO=tcp
LANG=C.UTF-8
JENKINS_URL=http://jenkins-linux.oscaroad.com/
JOB_URL=http://jenkins-linux.oscaroad.com/job/nick-pipeline-scratchpad/
BUILD_NUMBER=90
XFILESEARCHPATH=/usr/dt/app-defaults/%L/Dt
JENKINS_NODE_COOKIE=4d7ae7b2-d7c4-4e74-9107-6ccc3c8bb6ab
RUN_DISPLAY_URL=http://jenkins-linux.oscaroad.com/job/nick-pipeline-scratchpad/90/display/redirect
HUDSON_SERVER_COOKIE=2c33264286faff7f
JOB_DISPLAY_URL=http://jenkins-linux.oscaroad.com/job/nick-pipeline-scratchpad/display/redirect
JENKINS_NAME=slave-2l7bp
KUBERNETES_PORT_443_TCP=tcp://10.23.240.1:443
NLSPATH=/usr/dt/lib/nls/msg/%L/%N.cat
KUBERNETES_SERVICE_PORT_HTTPS=443
JOB_NAME=nick-pipeline-scratchpad
JAVA_HOME=/usr/lib/jvm/java-1.8-openjdk
KUBERNETES_SERVICE_HOST=10.23.240.1
PWD=/home/jenkins/workspace/nick-pipeline-scratchpad
[Pipeline] sh
[nick-pipeline-scratchpad] Running shell script
+ mount
overlay on / type overlay (rw,relatime,lowerdir=/var/lib/docker/overlay2/l/D6F7KCFWZJEKPGKPJUTFU45J6X:/var/lib/docker/overlay2/l/32C4NSDKICBLNVFF6WO4LUWFSX:/var/lib/docker/overlay2/l/OSTMQYOIQEY7IRLL2AKHRVXEQM:/var/lib/docker/overlay2/l/QULP2TSSWLL54RYVCKGF36VXWJ:/var/lib/docker/overlay2/l/VK4KKHT53DRM4UXZC7BZNUGLSX:/var/lib/docker/overlay2/l/IU6NT2U7XXNLFCIJINGTK3ES7X:/var/lib/docker/overlay2/l/L33OZHNBEX5ZQ54P7V5K3KPH7D:/var/lib/docker/overlay2/l/GFH52HFVCOKQS6NGZJ2PRBAFJA:/var/lib/docker/overlay2/l/6RZWSCV3HKUWVQKFKU3JDZ7QRB,upperdir=/var/lib/docker/overlay2/4b4911895da7aee1f4d027d8cb4f95e1d18083db93baa4d838dfb74cdab27407/diff,workdir=/var/lib/docker/overlay2/4b4911895da7aee1f4d027d8cb4f95e1d18083db93baa4d838dfb74cdab27407/work)
proc on /proc type proc (rw,nosuid,nodev,noexec,relatime)
tmpfs on /dev type tmpfs (rw,nosuid,mode=755)
devpts on /dev/pts type devpts (rw,nosuid,noexec,relatime,gid=5,mode=620,ptmxmode=666)
sysfs on /sys type sysfs (ro,nosuid,nodev,noexec,relatime)
tmpfs on /sys/fs/cgroup type tmpfs (ro,nosuid,nodev,noexec,relatime,mode=755)
cgroup on /sys/fs/cgroup/systemd type cgroup (ro,nosuid,nodev,noexec,relatime,xattr,release_agent=/usr/lib/systemd/systemd-cgroups-agent,name=systemd)
cgroup on /sys/fs/cgroup/net_cls,net_prio type cgroup (ro,nosuid,nodev,noexec,relatime,net_cls,net_prio)
cgroup on /sys/fs/cgroup/perf_event type cgroup (ro,nosuid,nodev,noexec,relatime,perf_event)
cgroup on /sys/fs/cgroup/devices type cgroup (ro,nosuid,nodev,noexec,relatime,devices)
cgroup on /sys/fs/cgroup/blkio type cgroup (ro,nosuid,nodev,noexec,relatime,blkio)
cgroup on /sys/fs/cgroup/cpu,cpuacct type cgroup (ro,nosuid,nodev,noexec,relatime,cpu,cpuacct)
cgroup on /sys/fs/cgroup/pids type cgroup (ro,nosuid,nodev,noexec,relatime,pids)
cgroup on /sys/fs/cgroup/memory type cgroup (ro,nosuid,nodev,noexec,relatime,memory)
cgroup on /sys/fs/cgroup/freezer type cgroup (ro,nosuid,nodev,noexec,relatime,freezer)
cgroup on /sys/fs/cgroup/hugetlb type cgroup (ro,nosuid,nodev,noexec,relatime,hugetlb)
cgroup on /sys/fs/cgroup/cpuset type cgroup (ro,nosuid,nodev,noexec,relatime,cpuset)
mqueue on /dev/mqueue type mqueue (rw,nosuid,nodev,noexec,relatime)
/dev/sda1 on /home/jenkins type ext4 (rw,relatime,commit=30,data=ordered)
/dev/sda1 on /dev/termination-log type ext4 (rw,relatime,commit=30,data=ordered)
shm on /dev/shm type tmpfs (rw,nosuid,nodev,noexec,relatime,size=65536k)
/dev/sda1 on /etc/resolv.conf type ext4 (rw,nosuid,nodev,relatime,commit=30,data=ordered)
/dev/sda1 on /etc/hostname type ext4 (rw,nosuid,nodev,relatime,commit=30,data=ordered)
/dev/sda1 on /etc/hosts type ext4 (rw,relatime,commit=30,data=ordered)
/dev/sda1 on /home/jenkins/agent type ext4 (rw,nosuid,nodev,relatime,commit=30,data=ordered)
/dev/sda1 on /home/jenkins/.jenkins type ext4 (rw,nosuid,nodev,relatime,commit=30,data=ordered)
tmpfs on /run/secrets/kubernetes.io/serviceaccount type tmpfs (ro,relatime)
proc on /proc/bus type proc (ro,nosuid,nodev,noexec,relatime)
proc on /proc/fs type proc (ro,nosuid,nodev,noexec,relatime)
proc on /proc/irq type proc (ro,nosuid,nodev,noexec,relatime)
proc on /proc/sys type proc (ro,nosuid,nodev,noexec,relatime)
proc on /proc/sysrq-trigger type proc (ro,nosuid,nodev,noexec,relatime)
tmpfs on /proc/kcore type tmpfs (rw,nosuid,mode=755)
tmpfs on /proc/timer_list type tmpfs (rw,nosuid,mode=755)
tmpfs on /sys/firmware type tmpfs (ro,relatime)
[Pipeline] sh
[nick-pipeline-scratchpad] Running shell script
+ ls -l /usr/bin/docker
ls: /usr/bin/docker: No such file or directory
+ true
[Pipeline] sh
[nick-pipeline-scratchpad] Running shell script
+ ls -l /var/run/docker.sock
ls: /var/run/docker.sock: No such file or directory
+ true
[Pipeline] sh
[nick-pipeline-scratchpad] Running shell script
+ which docker
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE
The odd thing is that my pod does show the correct mount points:
kubectl get pod -l jenkins=slave --export -o yaml
apiVersion: v1
items:
- apiVersion: v1
kind: Pod
metadata:
creationTimestamp: 2018-05-31T20:49:16Z
labels:
jenkins: slave
jenkins/jdk: "true"
jenkins/k8s: "true"
jenkins/ubuntu: "true"
name: slave-2l7bp
namespace: jenkins
resourceVersion: "29485095"
selfLink: /api/v1/namespaces/jenkins/pods/slave-2l7bp
uid: 150f1c2c-6514-11e8-b9a2-42010a8400b6
spec:
containers:
- args:
- cat
command:
- sh
- -c
env:
- name: JENKINS_SECRET
value: **redacted**
- name: JENKINS_NAME
value: slave-2l7bp
- name: JENKINS_URL
value: https://jenkins-linux.oscaroad.com/
- name: HOME
value: /home/jenkins
image: gcr.io/cloud-solutions-images/jenkins-k8s-slave:v4
imagePullPolicy: IfNotPresent
name: jenkins-slave
resources: {}
securityContext:
privileged: false
terminationMessagePath: /dev/termination-log
terminationMessagePolicy: File
tty: true
volumeMounts:
- mountPath: /var/run/docker.sock
name: volume-1
- mountPath: /usr/bin/docker
name: volume-0
- mountPath: /home/jenkins
name: workspace-volume
- mountPath: /var/run/secrets/kubernetes.io/serviceaccount
name: default-token-bnlpk
readOnly: true
workingDir: /home/jenkins
- env:
- name: JENKINS_SECRET
value: **redacted**
- name: JENKINS_NAME
value: slave-2l7bp
- name: JENKINS_URL
value: https://jenkins-linux.oscaroad.com/
- name: HOME
value: /home/jenkins
image: jenkins/jnlp-slave:alpine
imagePullPolicy: IfNotPresent
name: jnlp
resources: {}
terminationMessagePath: /dev/termination-log
terminationMessagePolicy: File
volumeMounts:
- mountPath: /home/jenkins
name: workspace-volume
- mountPath: /var/run/secrets/kubernetes.io/serviceaccount
name: default-token-bnlpk
readOnly: true
dnsPolicy: ClusterFirst
nodeName: **redacted**
restartPolicy: Never
schedulerName: default-scheduler
securityContext: {}
serviceAccount: default
serviceAccountName: default
terminationGracePeriodSeconds: 30
tolerations:
- effect: NoExecute
key: node.kubernetes.io/not-ready
operator: Exists
tolerationSeconds: 300
- effect: NoExecute
key: node.kubernetes.io/unreachable
operator: Exists
tolerationSeconds: 300
volumes:
- hostPath:
path: /usr/bin/docker
type: ""
name: volume-0
- hostPath:
path: /var/run/docker.sock
type: ""
name: volume-1
- emptyDir: {}
name: workspace-volume
- name: default-token-bnlpk
secret:
defaultMode: 420
secretName: default-token-bnlpk
status:
conditions:
- lastProbeTime: null
lastTransitionTime: 2018-05-31T20:49:16Z
status: "True"
type: Initialized
- lastProbeTime: null
lastTransitionTime: 2018-05-31T20:49:18Z
status: "True"
type: Ready
- lastProbeTime: null
lastTransitionTime: 2018-05-31T20:49:16Z
status: "True"
type: PodScheduled
containerStatuses:
- containerID: docker://01d6318d2ccd86794bc636470b8dd34664c9451671ff113e3969f7357bcdb32c
image: gcr.io/cloud-solutions-images/jenkins-k8s-slave:v4
imageID: docker-pullable://gcr.io/cloud-solutions-images/jenkins-k8s-slave@sha256:254363c0b37f14cbc5280eebea6f80e494d5d8ddbbdd23330549a22b7c291ad4
lastState: {}
name: jenkins-slave
ready: true
restartCount: 0
state:
running:
startedAt: 2018-05-31T20:49:17Z
- containerID: docker://76c886cb7e04d41aa38f7744361e3e28ab380d60fbb5916d9ea93b015d38e504
image: jenkins/jnlp-slave:alpine
imageID: docker-pullable://jenkins/jnlp-slave@sha256:7038c115ef71aee0820ccbcc9e2e723767efd47eee664fdb2bdf82a99acfdd71
lastState: {}
name: jnlp
ready: true
restartCount: 0
state:
running:
startedAt: 2018-05-31T20:49:17Z
hostIP: 172.17.0.17
phase: Running
podIP: 10.20.8.34
qosClass: BestEffort
startTime: 2018-05-31T20:49:16Z
kind: List
metadata:
resourceVersion: ""
selfLink: ""
So the pod seems completely fine, and I can confirm this with a simple test:
$ k exec -ti slave-2l7bp -c jenkins-slave -- bash
root@slave-2l7bp:~# which docker
/usr/bin/docker
root@slave-2l7bp:~# docker --version
Docker version 17.03.2-ce, build f5ec1e2
root@slave-2l7bp:~# ls -l /usr/bin/docker
-rwxr-xr-x 1 root root 13824040 Apr 6 04:24 /usr/bin/docker
root@slave-2l7bp:~# ls -l /var/run/docker.sock
srw-rw---- 1 root 412 0 Apr 27 20:41 /var/run/docker.sock
As you can see, I'm using the latest v4 version of the image, and despite running my master outside the cluster, I've done everything detailed in the GKE documentation linked above.
Any help would be greatly appreciated.
Thanks in advance.
Can you try running with the pod explicitly defined in your Jenkinsfile and letting the Kubernetes plugin inject the jnlp container itself?
For example:
https://github.com/GoogleCloudPlatform/continuous-deployment-on-kubernetes/blob/master/sample-app/Jenkinsfile#L7
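A minimal sketch of that approach, using the declarative `kubernetes` agent syntax of the plugin. This is illustrative only: the label, container name, and image tag are assumptions, and the key points are that no `jnlp` container is declared (the plugin injects its own) and that the `sh` step is wrapped in a `container` block so it runs where the docker mounts actually exist:

```groovy
// Hypothetical Jenkinsfile sketch: pod defined inline, jnlp container
// injected by the Kubernetes plugin. Names and image tags are examples.
pipeline {
  agent {
    kubernetes {
      label 'k8s-docker'
      yaml """
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: docker
    image: docker:17.03
    command: ['cat']
    tty: true
    volumeMounts:
    - name: docker-sock
      mountPath: /var/run/docker.sock
  volumes:
  - name: docker-sock
    hostPath:
      path: /var/run/docker.sock
"""
    }
  }
  stages {
    stage('check docker') {
      steps {
        // Run inside the 'docker' container, not the injected jnlp one.
        container('docker') {
          sh 'docker version'
        }
      }
    }
  }
}
```

Note that without a `container(...)` block, `sh` steps execute in the jnlp container, which would explain why `which docker` fails even though the other container in the pod has the mounts.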