- top-right menu -> Settings
- left menu (Developer Settings) -> Personal access tokens
- click 'Generate new token'
- select the 'public_repo' scope
- go to "https://slack.com/get-started" and follow the instructions to creeate a new Slack workspace if you don't have one already
- once a workspace ahs been created, navigate to the "browse apps" page within your workspace
- for Hubot:
- search for "Hubot" and fill out the appropriate fields to generate a Hubot app within slack
- Note the "HUBOT_SLACK_TOKEN" that begins with xoxb. This will be need to be included with initiating the Hubot bot from the command line to connect your local Hubot bot to the Slack workspace
- search for "Hubot" and fill out the appropriate fields to generate a Hubot app within slack
- for Jenkins:
- follow the instructions given at "https://www.youtube.com/watch?v=TWwvxn2-J7E" to install the Jenkins Slack app
- Note: "Team Domain" will be replaced with Jenkins Base URL
- Vagrant 1.9.3
- VirtualBox 5.1.18
git clone https://github.com/SLS-ALL/devops-microcosm.git
cd devops-microcosm
vagrant box add metadata.json
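To confirm the box was registered, you can list the boxes Vagrant knows about:
vagrant box list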
- jenkins (+ owaspZAP + selenium)
- gitlab
- mediawiki (+ bugzilla + hubot)
- staging
To get started, first bring up all VMs (i.e. 'newJenkins', 'gitlab', 'staging', 'mediaWiki' )
vagrant up newJenkins gitlab mediaWiki staging
When each VM is ready, proceed with the configuration steps below for each.
Note: You can also bring up each VM one at a time by running 'vagrant up <vm_name>', for example:
vagrant up gitlab
Note: If you wish to use the Microservice (Docker-Compose) version of Microcosm, See the instructions under "Environment Creation via IaC Using Docker-Compose", below.
on 'gitlab' VM: http://localhost:8083
- Visit http://localhost:8083
- set root password on GitLab
- username (default): root
- password: 1amd3v0p5 (or your own choice)
- register new account
- add spring-petclinic project
- on GitLab dashboard, click 'New project'
- click 'Import project from GitHub'
- enter personal access token (created above)
- click 'Import' next to 'spring-petclinic' to import
- navigate to the project and select the HTTP clone URL for the next step
- clone project into dev box
- on your host command line, in ./projects - NOTE: you must change 'localhost' to 'localhost:8083' in the HTTP clone URL:
git clone <HTTP clone URL>
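For example, assuming the project was imported under the default 'root' account (see the example URL in the Jenkins section below), the command would look like:
git clone http://localhost:8083/root/spring-petclinic.git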
- (optional) add 'github' as an additional remote for upstream changes
This local clone is connected to your GitLab VM and simulates collaborating on changes with your development team; however, the real 'spring-petclinic' repository at GitHub.com may undergo real changes. Adding this remote enables you to sync upstream changes:
cd spring-petclinic
git remote add github https://github.com/SLS-ALL/spring-petclinic.git
After this, sync upstream changes with:
git pull github master   # pull github changes into this checkout
git pull                 # ensure you are synced with your gitlab VM repo
git push                 # push any commits that were pulled from the github repo to your gitlab VM repo
That's it! You now have a local GitLab server running and holding your project code. You also have a clone of your project checked-out and ready for development.
on 'jenkins' VM: http://localhost:8088
- Visit http://localhost:8088
- Validate the Jenkins install, initial plugins, and user account
- copy the administrator password from /var/log/jenkins/jenkins.log (or read it directly from the VM, as shown below) and paste it into the form when prompted:
vagrant ssh newJenkins
sudo tail -n 30 /var/lib/jenkins/secrets/initialAdminPassword
- click to install 'suggested plugins'
- register new account
- click 'Save and Finish'!
- Add Maven Tool
- click "Manage Jenkins"
- click "Global Tool Configuration"
- click "Add Maven"
- the form may not expand the first time; one or more page refreshes may be required before it does
- enter "petclinic" as name
- click Apply and then click Save
- Install Additional Plugins
- click "Manage Plugins"
- select 'Available' tab
- search: "owasp"
- select: Official OWASP ZAP Jenkins Plugin
- search: Git Plugin
- select Git Plugin
- search: "maven"
- select: Maven Integration Plugin
- search: "ansible"
- select: Ansible plugin
- search: "custom tools"
- select: "Custom Tools Plugin"
- search "Summary Display"
- select "Summary Display Plugin"
- search: "Selenium HTML Report"
- select "Selenium HTML Report"
- search "HTML Publisher"
- select "HTML Publisher Plugin"
- search "Slack Notification Plugin"
- select "Slack notification Plugin"
- click "install without restart" at bottom of page
- check box next to "Restart Jenkins when installation is complete and no jobs are running."
- at top-left menu, click "back to Dashboard"
- NOTE: Jenkins will restart in the background and the UI may appear to be hung - you may need to refresh the page
- Create Custom Tool for OWASP ZAP Plugin
- click "Manage Jenkins"
- click "Global Tool Configuration"
- click "Custom Tool Installations"
- click "Add Custom tool"
- enter "ZAP_2.6.0" in the "name" field
- click the "Install automatically" checkbox
- enter "https://github.com/zaproxy/zaproxy/releases/download/2.6.0/ZAP_2.6.0_Linux.tar.gz" in the "Download URL for binary archive" field
- enter "ZAP_2.6.0" in the "Subdirectory of extracted archive" field
- click Apply and then click Save
- Add Global Slack Notifier Configurations
- follow the instructions given at "https://www.youtube.com/watch?v=TWwvxn2-J7E" to enter the appropriate configuration information in Jenkins to link the Jenkins service to your Slack workspace
- Add spring-petclinic project
- click "New Item", enter "petclinic" as name, choose "Freestyle", and click OK
- under Source Code Management, select 'git'
- beside Credentials, click Add -> Jenkins
- select "Username with password"
- enter your GitLab credentials (see 'gitlab' VM instructions above) and click Add
- enter repository URL: http://<gitlab_vm_ip>/<user>/spring-petclinic.git
- NOTE: this is the HTTP URL from the GitLab project page where 'localhost' is replaced by the 'gitlab' VM's private network IP (ex: http://10.1.1.3/root/spring-petclinic.git)
- select appropriate credentials
- Add build step -> Invoke top-level Maven targets
- Leave default values
- Add build step -> Invoke Ansible Playbook
- Playbook path: deploy.yml
- Inventory: File or host list: /etc/ansible/hosts
- beside Credentials, click Add -> Jenkins
- select "SSH Username with private key"
- Username: vagrant
- Private Key: "From a file on Jenkins master": /etc/ansible/vagrant_id_rsa
- Credentials: select 'vagrant'
- Click Apply and then click Save
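Before building, you can optionally sanity-check that the inventory file and SSH key referenced in the Ansible build step exist on the Jenkins VM:
vagrant ssh newJenkins
cat /etc/ansible/hosts
ls -l /etc/ansible/vagrant_id_rsa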
- Build and Deploy!
- In the Jenkins UI project view, click "Build Now" on left hand side of screen, or on the main dashboard click the icon to schedule a build
- NOTE: One initial build must be completed in order to create the appropriate Jenkins workspace. This workspace will be the home of the ZAP session files generated through the ZAP GUI, as well as the ZAP vulnerability reports.
- Add SonarQube Scanner build step
- Complete the "SonarQube Integration with Jenkins Instructions" steps found at the end of the README.md
- Click "Add build step" and select "Execute SonarQube Scanner"
- Under "Analysis properties" enter:
sonar.projectKey=petclinic
sonar.projectName=petclinic
sonar.sources=/var/jenkins_home/workspace/petclinic/src/
sonar.java.binaries=/var/jenkins_home/workspace/petclinic/src/
- Click Apply and Save
- After a successful build, the static code analysis will be available at "http://localhost:9000/dashboard/index/petclinic"
- Add OWASP ZAP build step
- Navigate to the desktop instance of the "Jenkins" VM, which contains OWASP ZAP, and launch a terminal
- Type "sudo /opt/zapproxy/ZAP_2.6.0/zap.sh" to launch the OWASP ZAP GUI as root
- The user will be prompted to persist the current session of ZAP
- Click "Yes" to persist the session and specify the Jenkins workspace that was created upon the initial successful build of petclinic as the place to save the ZAP session files
- ex: petclinicSession.session
- Open a new terminal tab (necessary for ZAP HTML Reports)
- Create the "/var/lib/jenkins/jobs/htmlreports" directory and change its ownership to the jenkins user -> chown jenkins:jenkins htmlreports
- Create the "/var/lib/jenkins/workspace/petclinic/reports/html" directory and change its ownership to the jenkins user -> chown jenkins:jenkins html
- click "add build step" and select "Execute ZAP"
- Under "Admin Configurations" enter:
- localhost in the "Override Host" field
- 8090 in the "Override Port" field
- Under "Java" "InheritFromJob" should automatically be chosen in the JDK field
- Under "Installation Method" choose "Custom Tools Installation"
- Choose "ZAP_2.6.0" (the name of the custom tool that was created in step 5)
- Under "ZAP Home Directory" enter:
- "~/.ZAP" for Linux
- "~/Library/Application Support/ZAP" for Mac OS
- "C:\Users<username>\OWASP ZAP" for Windows 7/8
- "C:\Documents and Settings<username>\OWASP ZAP" for Windows XP
- Under "Session Management" select "Persist Session"
- Enter the name of the ZAP session file created after choosing to persist the session upon launching ZAP (petclinicSession)
- Under "Session Properties" enter:
- "myContext" in the "Context Name" field
- "http://10.1.1.7:8080/petclinic/*" in the "Include in Context" field
- Under "Attack Mode" enter "http://10.1.1.7:8080/petclinic/" in the "Starting Point" field
- click the "Spider Scan" and "Recurse" checkboxes
- Under "Finalize Run" click the "Generate Reports", "Clean Workdpsave Reports", and "Generate Report" radio butotns
- Enter JENKINS_ZAP_VULNERABILITY_REPORT in the "Filename" field
- Select "html" under the "Format" field
- Click "Add post-built action" and select "Publish HTML reports"
- Under "Publish HTML reports" enter:
- "/var/lib/jenkins/workspace/petclinic/" in the "HTML directory to archive" field
- "JENKINS_ZAP_VULNERABILITY_REPORT.html" in the "Index page[s]" field
- "Last ZAP Vulnerability Report" in the "Report title" field
- Click Apply and then click Save
- Develop!
- Build and Deploy!
- In the Jenkins UI project view, click "Build Now" on left hand side of screen, or on the main dashboard click the icon to schedule a build
- To view the most recent ZAP Vulnerability Report, click "Last ZAP Vulnerability Report"
- Visit http://localhost:8087/petclinic/
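As a quick smoke test from the host command line, you can also check that the deployed application responds:
curl -I http://localhost:8087/petclinic/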
on 'mediaWiki' VM: http://localhost:8086
- Browse to "http://localhost:8086/wiki" to access the MediaWiki web interface.
- Login with the administrator credentials specified in the MediaWiki cookbook to begin customization.
- Type "http://localhost:8086/bugzilla-5.0.3/" in your browser to access the Bugzilla web interface.
- Login to the administrator account with the credentials used in the "checksetup_config.erb" recipe template to configure your issue tracking service.
- Upon a successful "vagrant up", ssh into the VM using "vagrant ssh mediaWiki".
- Navigate to "/home/vagrant/myhubot" as the vagrant user.
- Type "yo hubot --adapter slack" to create a Hubot bot that can integrate with your Slack workspace
- While in the "/home/vagrant/myhubot" directory, execute the "npm_packages_install.sh" script to install the necessary npm packages to allow your Hubot to integrate with Jenkins and Slack
- This must be done BEFORE launching the Hubot bot
- Export the HUBOT_JENKINS_AUTH, HUBOT_JENKINS_URL, and HUBOT_SLACK_TOKEN variables
- HUBOT_JENKINS_AUTH should be in "username:password" format (use Jenkins account credentials)
- Launch the previously created Hubot by passing all of the appropriate command line flags
- ex: HUBOT_SLACK_TOKEN=$HUBOT_SLACK_TOKEN HUBOT_JENKINS_URL=$HUBOT_JENKINS_URL HUBOT_JENKINS_AUTH=$HUBOT_JENKINS_AUTH ./bin/hubot --adapter slack
- You will now be able to chat with your Hubot via your Slack workspace, as well as kick off Jenkins builds of the petclinic application
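For reference, the export step above might look like the following in practice (the values are placeholders: use your own xoxb token, a Jenkins URL reachable from this VM, and your Jenkins credentials):
export HUBOT_SLACK_TOKEN=xoxb-your-token-here
export HUBOT_JENKINS_URL=http://<jenkins_host>:8080
export HUBOT_JENKINS_AUTH=jenkins_username:jenkins_password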
Provided in this repository is a simple, one-stop script to set up MediaWiki: IaC-setup-script.sh. It automates the IaC pipeline to start up a development project, which in this case is the MediaWiki application.
The setup script operates in two stages. Above, we used Vagrant to generate four VMs for our environment (newJenkins, gitlab, mediaWiki, and staging). Here, in the first stage, we instead spin up two VMs, staging and docker-compose, the latter of which automatically runs six containers (e.g. mediaWiki, jenkins, gitlab, ...). The second stage then installs MediaWiki with pre-determined settings on the mediaWiki container.
As such, IaC-setup-script.sh copies copy-to-mediawiki.sh and mediawiki-setup.sh to the docker-compose VM. Then copy-to-mediawiki.sh is executed remotely, which copies mediawiki-setup.sh from the docker-compose VM to the somewiki container (which runs the mediaWiki image) and executes the script, installing MediaWiki. Once MediaWiki is installed on the somewiki container, users may access the application via http://localhost:8096/index.php/Main_Page.
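To run the whole pipeline, execute the script from the repository root (a minimal invocation, assuming the script takes no arguments, as its one-stop design implies):
bash IaC-setup-script.sh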
Although this script handles the grunt work for you, it is still important to understand the underlying layers of abstraction and the details of the scripts. Note also that IaC-setup-script.sh automatically installs the vagrant-docker-compose Vagrant plugin via the command:
vagrant plugin install vagrant-docker-compose
If necessary, one may destroy any vagrant VMs and start from scratch via the command:
vagrant destroy -f
Finally, the "docker-compose.yml" file contains configuration specifications for each service/container in the Microcosm pipline. Most noticeably, the somewiki container lies within the specifications, utilizing the mediaWiki image.
It is important to understand the port forwarding that is going on behind the scenes with the Dockerized version of Microcosm. As opposed to the "strictly VM" version of Microcosm, there are two levels of port forwarding that occur.
At the service level, the initial layer of port forwarding occurs between each container and the Centos 7 VM that is running Docker-Compose. This can be seen for each container definition in the "docker-compose.yml" file, as shown below for the Jenkins container:
jenkins:
  image: h1kkan/jenkins-docker:lts
  container_name: jenkins
  ports:
    - "8080:8080"
  volumes:
    - jenkins_home:/var/jenkins_home
  extra_hosts:
    - "staging:10.1.1.7"
Note the value assigned to the "ports" entry: "8080:8080". The port to the right of the colon is the port the service listens on inside the container; the port to the left is the forwarded port that the CentOS 7 VM listens on.
The second layer of port forwarding now occurs between the Centos 7 VM and the host machine that is running Vagrant. This takes place in the VM definition within the Vagrantfile:
docker.vm.network :forwarded_port, guest:8080, host:8098
The Centos 7 VM re-forwards its forwarded port to the specified port on the host machine. Jenkins is therefore available at "localhost:8098" via a browser on the host machine.
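Putting the two layers together (using the ports from the snippets above), a request from the host traverses:
localhost:8098 (host) -> port 8080 on the CentOS 7 VM (Vagrant forwarded_port) -> port 8080 inside the jenkins container (docker-compose ports)
Once the containers are up, a quick check from the host:
curl -I http://localhost:8098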
The Jenkins Docker image used is fortunately packaged with Ansible pre-installed, but a few steps must be taken to create the "/etc/ansible/hosts" file for remote deployment, as well as the "/etc/ansible/vagrant_id_rsa" key to allow Jenkins to SSH into the staging server during the execution of the Ansible playbook.
- SSH into the VM using "vagrant ssh docker-compose"
- Enter the Jenkins container as root using a bash shell:
docker exec -it --user root jenkins bash
- Update apt-get with:
apt-get update
- Install VIM to author the "/etc/ansible/hosts" file:
apt-get install vim
- Create the "/etc/ansible" directory:
mkdir /etc/ansible
- Author the "/etc/ansible/hosts" file:
vi /etc/ansible/hosts
Inside the file:
[DevOps]
10.1.1.7
- While in the "/etc/ansible/" directory, download the Vagrant RSA key needed for deployment:
curl -o vagrant_id_rsa https://raw.githubusercontent.com/mitchellh/vagrant/master/keys/vagrant
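If the Ansible playbook later fails to SSH into the staging server because of key permissions, tightening the key's mode from inside the Jenkins container usually resolves it:
chmod 600 /etc/ansible/vagrant_id_rsa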
- Install the SonarQube Scanner for Jenkins via the Jenkins Plugin Manager
- Go to Manage Jenkins -> Configure System
- Enter "SonarQube" in the "Name" field
- ssh into the docker-compose VM and use the "docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' NAME_OF_CONTAINER" command to find the IP address of the SonarQube container. Enter the returned IP address in the "Server URL" field.
- ex: http://SonarQube_Container_IP_Address:9000
- Select "5.3 or higher" for the "Server version" field
- Enter the authentication token that was generated upon logging into the SonarQube web interface in the "Server authentication token" field
- Click Apply and Save
- Go to Manage Jenkins -> Global Tool Configuration
- Enter "SonarQube" in the "Name" field
- Check "Install automatically"
- Choose the most recent version of SonarQube Scanner
- Click Apply and Save
The environment arguments for the Hubot container defined in "docker-compose.yml" will change:
- HUBOT_JENKINS_URL=http://IP_ADDRESS_OF_JENKINS_CONTAINER:8080
- Print the IP address of the container while in the "docker-compose" VM with:
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' NAME_OF_CONTAINER
- HUBOT_JENKINS_AUTH=JENKINS_USERNAME:PASSWORD
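For example, to construct the HUBOT_JENKINS_URL value (the container name "jenkins" comes from the docker-compose.yml snippet above):
JENKINS_IP=$(docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' jenkins)
echo "HUBOT_JENKINS_URL=http://${JENKINS_IP}:8080"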