
I'm trying to attach and mount a volume to an AWS machine. I start the machine with Vagrant, and in the Vagrantfile I define a block device called "/dev/sda2".
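The relevant part of the Vagrantfile looks roughly like the following (a simplified sketch; I'm using the vagrant-aws provider's block_device_mapping option, and the 50 GB size is only illustrative):

Vagrant.configure("2") do |config|
  config.vm.box = "dummy"

  config.vm.provider :aws do |aws, override|
    # AMI, credentials, etc. omitted

    # Attach an extra EBS volume under the requested device name
    aws.block_device_mapping = [{
      'DeviceName'     => '/dev/sda2',
      'Ebs.VolumeSize' => 50
    }]
  end
end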

After the machine starts, I can see in the AWS console that the device has been created (console screenshot omitted here), so AWS clearly knows the device exists. However, when I try to mount the device on the machine, I get an error:

sudo mount -t ext4 /dev/sda2 /mnt/example_mount
mount: special device /dev/sda2 does not exist

In fact, the path /dev/sda2 does not exist at all:

cd /dev/sda2
-bash: cd: /dev/sda2: No such file or directory

Here is the lsblk output:

NAME    MAJ:MIN RM SIZE RO TYPE MOUNTPOINT
xvda    202:0    0  10G  0 disk
├─xvda1 202:1    0   1M  0 part
└─xvda2 202:2    0  10G  0 part /
xvdb    202:16   0  50G  0 disk
xvdi    202:128  0  50G  0 disk

Why do I see the device in the AWS console but not on the machine?

While it applies to EBS volumes, read this doc to understand how to create a new file system and why the disk names are different. – stdunbar
Doesn't it seem like /dev/xvdb is an excellent candidate here? – Michael - sqlbot
Yes, I think that's the one. I still find the renaming strange, but that seems right. – nickackerman42

1 Answer


What Linux are you using? Make sure your system doesn't see the drive under a different name, for example /dev/xvda2. If you can, please provide the output of the lsblk command.
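If the drive shows up as, say, /dev/xvdb (on Xen-based EC2 instances, devices attached as /dev/sd* are exposed under /dev/xvd* names), a minimal sketch of how you could format and mount it, assuming it has no filesystem on it yet and using the mount point from your question:

sudo file -s /dev/xvdb                    # prints "data" if the volume has no filesystem yet
sudo mkfs -t ext4 /dev/xvdb               # only on an empty volume: this erases any existing data
sudo mkdir -p /mnt/example_mount
sudo mount -t ext4 /dev/xvdb /mnt/example_mount
df -h /mnt/example_mount                  # verify the mount

If file -s already reports an ext4 filesystem, skip the mkfs step and just mount the device.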