
With Terraform, following the docs here and there, I am trying to create a launch template for an EC2 instance with an elastic inference accelerator and an elastic GPU specification.

This is my code for the aws_launch_template resource:

resource "aws_launch_template" "elastic_ec2" {
  name_prefix                          = "DeepLearning"
  description                          = "Deep Learning"
  disable_api_termination              = true
  ebs_optimized                        = true
  image_id                             = "ami-0d9d11b8557309342"
  instance_initiated_shutdown_behavior = "terminate"
  instance_type                        = "t3.medium"
  key_name                             = "${local.pem_key_name}"

  block_device_mappings {
    device_name = "/dev/sda1"
  }

  capacity_reservation_specification {
    capacity_reservation_preference = "open"
  }

  credit_specification {
    cpu_credits = "standard"
  }

  elastic_gpu_specifications {
    type = "eg1.medium"
  }

  elastic_inference_accelerator {
    type = "eia1.medium"
  }

  iam_instance_profile {
    name = "my-right-profile"
  }

  instance_market_options {
    market_type = "spot"
  }

  monitoring {
    enabled = true
  }

  network_interfaces {
    associate_public_ip_address = true
  }

  placement {
    availability_zone = "${var.main_location}"
  }

  tag_specifications {
    resource_type = "instance"
    tags = {
      Environment = "${local.environment}"
    }
  }
}

resource "aws_instance" "web" {
  ami                  = "${aws_launch_template.elastic_ec2.image_id}"
  instance_type        = "${aws_launch_template.elastic_ec2.instance_type}"
  key_name             = "${local.pem_key_name}"
  iam_instance_profile = "${aws_iam_instance_profile.ec2_profile.name}"

  tags = {
    Environment = "${local.environment}"
    App         = "${local.app_name}"
  }
}

And these are the policies used for the deployment of the instance, following the doc specifications here:

resource "aws_iam_role" "ec2_exec_role" {
  name        = "ec2_exec"
  path        = "/"
  description = "Allows EC2 instances to call AWS services on your behalf."

  assume_role_policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "sts:AssumeRole",
      "Principal": {
        "Service": [
          "ec2.amazonaws.com",
          "ecs-tasks.amazonaws.com"
        ]
      },
      "Effect": "Allow",
      "Sid": ""
    }
  ]
}
POLICY
}

resource "aws_iam_instance_profile" "ec2_profile" {
  name = "${local.environment}_ec2_profile"
  role = "${aws_iam_role.ec2_exec_role.name}"
}

resource "aws_iam_policy" "ec2_policy" {
  name = "${local.environment}_ec2_policy"
  description = "EC2 main policy for ${local.environment} environment"
  policy = <<EOF
{
  "Version": "2012-10-17",
  "Statement": [
      {
          "Effect": "Allow",
          "Action": [
              "elastic-inference:Connect",
              "iam:List*",
              "iam:Get*",
              "ec2:Describe*",
              "ec2:Get*",
              "ecs:RegisterTaskDefinition",
              "ecs:RunTask"
          ],
          "Resource": [
              "arn:aws:ec2:${var.main_location}::*",
              "arn:aws:elastic-inference:${var.main_location}:${data.aws_caller_identity.current.account_id}:*",
              "arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/ecs-ei-task-role",
              "arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/ecsTaskExecutionRole"
          ]
      }
  ]
}
EOF
}

resource "aws_iam_policy_attachment" "ec2_attachment" {
  name       = "${local.environment}_ec2_attachment"
  roles      = ["${aws_iam_role.ec2_exec_role.name}"]
  policy_arn = "${aws_iam_policy.ec2_policy.arn}"
}

I get no errors when deploying this Terraform code with terraform plan and terraform apply. The launch template and the EC2 instance are created, but I can see that my instance has no Elastic Inference Accelerator ID attached to it. What is the right Terraform configuration to attach all the elastic inference specifications correctly?


1 Answer


Thanks for asking this question; it led me to figure out how to launch an EC2 instance with an elastic inference accelerator. There is currently an open issue in the AWS Terraform provider repo about adding an option to attach an EI accelerator directly to an EC2 instance.

The issue with this config is that the launch template is never associated with your instance: the aws_instance resource only copies a few attributes (AMI, instance type) from it, so the elastic_inference_accelerator block is ignored. I was able to launch my EC2 instance with an EI accelerator by referencing the launch template from an Auto Scaling group instead.
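As a minimal sketch of that approach, an Auto Scaling group can point at the launch template you already have. This assumes a subnet ID variable (here called var.main_subnet_id, a hypothetical name) is defined elsewhere in your config:

```hcl
# Sketch: launch the instance through an ASG so the launch
# template (including elastic_inference_accelerator) is honored.
resource "aws_autoscaling_group" "elastic_ec2" {
  name                = "deep-learning-asg"
  min_size            = 1
  max_size            = 1
  desired_capacity    = 1
  vpc_zone_identifier = [var.main_subnet_id]

  launch_template {
    id      = aws_launch_template.elastic_ec2.id
    version = "$Latest"
  }
}
```

With this in place you would drop the separate aws_instance resource, since the ASG creates the instance from the template itself.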

But keep in mind that there are several other steps required to attach an EI accelerator to an EC2 instance: specifying security groups, creating a VPC endpoint for the Elastic Inference runtime, and so on. See the Docs.
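For example, the Elastic Inference runtime needs an interface VPC endpoint in the instance's VPC. A sketch, assuming a VPC resource named aws_vpc.main and a security group aws_security_group.ei_sg (both hypothetical names) exist in your config:

```hcl
# Sketch: interface endpoint so the instance can reach the
# Elastic Inference runtime service in its region.
resource "aws_vpc_endpoint" "elastic_inference" {
  vpc_id              = aws_vpc.main.id
  service_name        = "com.amazonaws.${var.main_location}.elastic-inference.runtime"
  vpc_endpoint_type   = "Interface"
  security_group_ids  = [aws_security_group.ei_sg.id]
  subnet_ids          = [var.main_subnet_id]
  private_dns_enabled = true
}
```

The security group on the endpoint must allow inbound HTTPS (port 443) from the instance's security group.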

Hope this helps.