
A simple, all-in-one Teleport cluster (auth, node, proxy) using a single EC2 instance. Based on Teleport's starter cluster example.


Blue Team Village Teleport cluster

What is this repo?

This repo contains a Terraform module and an Ansible role that can be imported into existing projects to set up a Teleport cluster and agents.

Prerequisites

Install/Setup Terraform

Note: this repo only supports Terraform v1.3.7 and greater.

Install/Setup Ansible

Note: this repo only supports Ansible v2.14.1 and greater.

Install/Setup AWS CLI

Install/Setup Teleport tsh

Github SSO

This section walks through creating a Github OAuth app for Teleport SSO. These steps must be executed by a Github org admin.

  1. Log into Github as an org admin
  2. Browse to your Github org's settings page
  3. Developer settings > OAuth apps
  4. Select "Register an application"
    1. Enter an application name
    2. Enter your project's Homepage URL
    3. Enter a description
    4. Enter https://<teleport FQDN>/v1/webapi/github/callback for Authorization callback URL
      1. (screenshot: teleport_github_oauth)
    5. Select "Register application"
      1. (screenshot: github_OAuth_secret)
  5. Select "Generate a new client secret"
  6. Copy the "OAuth client ID" and "OAuth client secret"
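The client ID and secret you just copied ultimately feed a Teleport GitHub auth connector. A minimal sketch of the connector resource that teleport-cluster/templates/github.yaml.j2 renders (all values here are illustrative, not the template's literal output):

```yaml
# Sketch of a Teleport v12 GitHub auth connector; values are illustrative
kind: github
version: v3
metadata:
  name: github
spec:
  display: Github
  client_id: <OAuth client ID>
  client_secret: <OAuth client secret>
  redirect_url: https://<teleport FQDN>/v1/webapi/github/callback
  teams_to_roles:
    - organization: <github org name>
      team: <github admin team>
      roles: ["access", "editor"]
```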

Terraform

Perform the following instructions in an existing Terraform project. Upon completion, this Terraform module will create all the necessary AWS resources for a single-instance Teleport cluster:

  • Teleport all-in-one (auth, node, proxy) single-cluster EC2 instance
  • DynamoDB tables (cluster state, cluster events, ssl lock)
  • S3 bucket (session recording storage)
  • Route53 A record
  • Security Groups and IAM roles

Terraform layout

| File | Description |
| ---- | ----------- |
| dynamodb.tf | DynamoDB table provisioning. Tables used for Teleport state and events. |
| ec2.tf | EC2 instance provisioning. |
| iam.tf | IAM role provisioning. Permits the EC2 instance to talk to AWS resources (S3, DynamoDB, etc.). |
| outputs.tf | Exports module variables to be used by other Terraform resources. |
| route53.tf | Route53 record creation. Requires a hosted zone to configure SSL. |
| s3.tf | S3 bucket provisioning. Bucket used for session recording storage. |
| secrets.tf | Creates an empty secret stub for the Github OAuth client secret. |
| variables.tf | Inbound variables for the Teleport module. |
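Values exported by outputs.tf can be consumed elsewhere in the parent project. A hypothetical example (the output name `public_ip` is an assumption; check outputs.tf for the real names):

```hcl
# Hypothetical usage of a value exported by the module's outputs.tf;
# "public_ip" is an assumed output name, not confirmed by this repo
resource "aws_route53_record" "teleport_alt" {
  zone_id = var.route53_zone_id
  name    = "teleport-alt.${var.route53_domain}"
  type    = "A"
  ttl     = 300
  records = [module.teleport.public_ip]
}
```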

Instructions

  1. vim main.tf and add:
module "teleport" {
  source = "github.com/blueteamvillage/btv-teleport-single-cluster"

  #### General ####
  PROJECT_PREFIX  = "<project name>"
  primary_region  = "<region to deploy the Teleport cluster to>"
  public_key_name = "<name of an SSH key to provision the EC2 instance with>"

  #### Route53 ####
  route53_zone_id = "<Route53 zone ID for the Teleport FQDN>"
  route53_domain  = "<domain>"

  #### VPC ####
  vpc_id             = "<VPC ID to deploy Teleport to>"
  teleport_subnet_id = "<subnet ID to deploy Teleport to>"

  #### Infra ####
  aws_account  = data.aws_caller_identity.current.account_id
  teleport_ami = var.ubuntu-ami
}
  2. terraform init
    1. Import this new module
  3. terraform apply
module.teleport.aws_dynamodb_table.teleport_locks: Creating...
module.teleport.aws_dynamodb_table.teleport: Creating...
module.teleport.aws_dynamodb_table.teleport_events: Creating...
module.teleport.aws_s3_bucket.teleport: Creating...
module.teleport.aws_s3_bucket.teleport: Creation complete after 2s [id=defcon-2023-obsidian-teleport-kxl6y]
module.teleport.aws_s3_bucket_acl.teleport: Creating...

Set the value of the Github OAuth secret

  1. Log into AWS console
  2. Go to the Secrets Manager service
  3. Terraform created an empty secret with the following name schema: "${var.PROJECT_PREFIX}-teleport-github-OAuth-secret"
  4. Secret value > Retrieve secret value
    1. This will produce an error because no value has been set; this is expected
  5. Select "Set secret value"
  6. Set secret value to the Github OAuth client secret
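The same value can also be set non-interactively with the AWS CLI. A sketch, using the project prefix from the terraform apply output above as a hypothetical example:

```shell
# Hypothetical project prefix; substitute your own PROJECT_PREFIX
PROJECT_PREFIX="defcon-2023-obsidian"

# Name schema matches the empty secret stub created by secrets.tf
SECRET_NAME="${PROJECT_PREFIX}-teleport-github-OAuth-secret"
echo "$SECRET_NAME"
# → defcon-2023-obsidian-teleport-github-OAuth-secret

# With AWS credentials configured, set the value:
#   aws secretsmanager put-secret-value \
#     --secret-id "$SECRET_NAME" \
#     --secret-string '<Github OAuth client secret>'
```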

Ansible

Perform the following instructions in an existing Ansible project. Upon completion, this Ansible role will configure a single-node Teleport "cluster" with Github SSO.

Ansible layout

| File | Description |
| ---- | ----------- |
| teleport-cluster/tasks/init_linux.yml | Ansible tasks to update Linux and set basic OS settings. |
| teleport-cluster/tasks/setup_teleport.yml | Ansible tasks to install Teleport and configure it based on the templates in teleport-cluster/templates. |
| teleport-cluster/templates/cap.yaml.j2 | Configures Github SSO as the default login mechanism. |
| teleport-cluster/templates/github.yaml.j2 | Configures Github SSO for your org. |
| teleport-cluster/templates/sec_infra_role.yaml.j2 | Defines the resources admins have access to. |
| teleport-cluster/templates/workshop_contributors.yaml.j2 | Defines the resources workshop contributors have access to. |
| teleport-cluster/var/main.yml | Variables for how to install and configure Teleport. |
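The playbook run later in these instructions (deploy_teleport_cluster.yml) ties the inventory group to this role. A minimal sketch, assuming the role directory is named teleport-cluster:

```yaml
# Minimal sketch of deploy_teleport_cluster.yml; assumes the role
# directory is named teleport-cluster and targets the
# teleport_cluster inventory group defined in hosts.ini
- name: Deploy Teleport cluster
  hosts: teleport_cluster
  become: true
  roles:
    - teleport-cluster
```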

Create a tunnel using SSM

By default, this deployment does not expose SSH to the public internet, so we use SSM to create a tunnel that Ansible can use to provision the EC2 instance.

  1. aws ssm start-session --target <Teleport EC2 instance ID> --document-name AWS-StartPortForwardingSession --parameters '{"portNumber":["22"], "localPortNumber":["2022"]}'

(screenshot: ansible_aws_ssm_session)

(screenshot: aws_ssm_tunnel_ssh)

Init Ansible playbook

  1. ansible-galaxy collection install amazon.aws
  2. pip3 install -U boto3==1.26.69
  3. vim hosts.ini and set:
[teleport_cluster]
127.0.0.1 ansible_port=2022 ansible_user=ubuntu
  4. vim group_vars/teleport_cluster.yml and set:
    1. General
      1. teleport_fqdn - Set the fully qualified domain for Teleport
      2. primary_region - Set to the region where you want to host Teleport
    2. AWS
      1. teleport_bucket_name - Set this to the name of the S3 bucket created by Terraform for Teleport
      2. teleport_dynamodb_table - Set this to the name of the DynamoDB state table created by Terraform for Teleport
        1. Name schema: "${var.PROJECT_PREFIX}-teleport-${random_string.suffix.result}"
        2. (screenshot: AWS DynamoDB Table view)
      3. teleport_dynamodb_events_table - Set this to the name of the DynamoDB events table created by Terraform for Teleport
        1. Name schema: "${var.PROJECT_PREFIX}-teleport-${random_string.suffix.result}-events"
        2. (screenshot: AWS DynamoDB Table view)
    3. Github
      1. github_org_name - Name of the Github org for SSO
      2. github_client_id - The Github OAuth client ID - NOT a secret
      3. github_client_secret - Define the name of the AWS secret that contains the Github OAuth client secret.
      4. github_redirect_url - Leave this as the default value unless you are hosting Teleport at a different URL path.
      5. github_admin_team - Define the Github team that contains a list of users that will be admins for the Teleport cluster
      6. github_workshop_contributors - Define the Github team that contains a list of users that will be accessing computing resources behind the Teleport cluster
    4. (screenshot: ansible_teleport_variables)
    5. Save and exit
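Put together, a hypothetical group_vars/teleport_cluster.yml might look like the following (every value is illustrative):

```yaml
# Hypothetical group_vars/teleport_cluster.yml; all values illustrative
teleport_fqdn: teleport.example.com
primary_region: us-east-2

teleport_bucket_name: example-teleport-kxl6y
teleport_dynamodb_table: example-teleport-kxl6y
teleport_dynamodb_events_table: example-teleport-kxl6y-events

github_org_name: example-org
github_client_id: <Github OAuth client ID>
github_client_secret: example-teleport-github-OAuth-secret  # AWS secret name, not the value
github_redirect_url: https://teleport.example.com/v1/webapi/github/callback
github_admin_team: teleport-admins
github_workshop_contributors: workshop-contributors
```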

Run Ansible playbook

  1. ansible-playbook -i hosts.ini deploy_teleport_cluster.yml
    1. (screenshot: ansible_deploy_teleport)

Teleport

  1. Connect to Teleport using SSM: aws ssm start-session --target <Teleport EC2 instance ID>
  2. /bin/bash
    1. (screenshot: aws_ssm_ssh_session)
  3. tctl users add admin --roles=editor,access --logins=root,ubuntu
    1. Copy the URL produced
    2. (screenshot: teleport_create_admin)
  4. Enter the URL produced in the previous command into your browser
  5. Select "Get started"
    1. (screenshot: teleport_getting_started)
  6. Enter a password for the admin user
    1. (screenshot: teleport_admin_password)
  7. Select "Next"
  8. Setup OTP
    1. (screenshot: teleport_admin_otp)
  9. Select "Go to dashboard"
    1. (screenshot: teleport_admin_homescreen)

User login via Github SSO

Log into Teleport via browser

  1. Browse to the Teleport FQDN
    1. Ex: https://teleport.blueteamvillage.com/web/login
  2. Select "Github"
  3. Select "Authorize"
    1. (screenshot: teleport_github_authorize)

Log into Teleport using tsh CLI

  1. tsh login --proxy=<teleport FQDN>
    1. (screenshot: teleport_tsh_login)

SSH into Teleport EC2 instance

➜ tsh ls
Node Name       Address        Labels
--------------- -------------- ------------------------
ip-172-16-10-93 127.0.0.1:3022 hostname=ip-172-16-10-93

➜ tsh ssh ubuntu@ip-172-16-10-93
ubuntu@ip-172-16-10-93:~$ whoami
ubuntu
ubuntu@ip-172-16-10-93:~$

Supported versions

  • Terraform v1.3.7
  • Ansible v2.14.1
  • Ubuntu Server 22.04
  • Teleport v12
  • awscli v2.2.31
