I have an AWS account. I'm using S3 to store backups from different servers. Is there any information in the AWS console about how much disk space is in use in my S3 cloud?
18 Answers
The AWS CLI now supports the `--query` parameter, which takes JMESPath expressions.
This means you can sum the size values given by `list-objects` using `sum(Contents[].Size)` and count objects with `length(Contents[])`.
This can be run using the official AWS CLI as below; support was introduced in February 2014:
aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
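A variation on the same idea (a sketch; BUCKETNAME is still a placeholder): with text output you can pipe the byte total through awk to print it in MiB:
aws s3api list-objects --bucket BUCKETNAME --output text \
    --query "sum(Contents[].Size)" | awk '{printf "%.2f MiB\n", $1/1024/1024}'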
On a Linux box that has `python` (with the `pip` installer), `grep`, and `awk`, install the AWS CLI (command-line tools for EC2, S3, and many other services):
sudo pip install awscli
then create a `.awssecret` file in your home folder with the content below (adjust the key, secret, and region as needed):
[default]
aws_access_key_id=<YOUR_KEY_HERE>
aws_secret_access_key=<YOUR_SECRET_KEY_HERE>
region=<AWS_REGION>
Make this file readable and writable by your user only:
sudo chmod 600 .awssecret
and point the AWS CLI at it via an environment variable:
export AWS_CONFIG_FILE=/home/<your_name>/.awssecret
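As a quick sanity check that the CLI picks up the credentials, you can list your buckets (optional):
aws s3 ls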
Then run in the terminal (this is a single command, split with `\` for readability):
aws s3 ls s3://<bucket_name>/foo/bar | \
grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | \
awk 'BEGIN {total=0}{total+=$3}END{print total/1024/1024" MB"}'
- the `aws` part lists the bucket (or, optionally, a 'sub-folder')
- the `grep` part removes (using `-v`) the lines that match the regular expression (using `-E`). `^$` matches blank lines, `--` matches the separator lines in the output of `aws s3 ls`
- the last `awk` simply adds the 3rd column of the resulting output (the size in bytes) to `total`, then displays it at the end
NOTE: this command works on the current bucket or 'folder' only, not recursively.
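If your awscli version is new enough (a hedged alternative; the `--recursive` and `--summarize` flags were added in later releases), the CLI can do the filtering and summing for you:
aws s3 ls s3://<bucket_name>/foo/bar --recursive --summarize --human-readable
# the output ends with "Total Objects: ..." and "Total Size: ..." lines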
See https://serverfault.com/questions/84815/how-can-i-get-the-size-of-an-amazon-s3-bucket (answered by Vic...)
<?php
// Uses the standalone Amazon S3 PHP class (S3.php)
if (!class_exists('S3')) require_once 'S3.php';

// Instantiate the class
$s3 = new S3('accessKeyId', 'secretAccessKey');
S3::$useSSL = false;

// List your buckets:
echo "S3::listBuckets(): ";
echo '<pre>' . print_r($s3->listBuckets(), 1) . '</pre>';
$totalSize = 0;
$objects = $s3->getBucket('name-of-your-bucket');
foreach ($objects as $name => $val) {
// If you want to get the size of a particular directory, you can do
// only that.
// if (strpos($name, 'directory/sub-directory') !== false)
$totalSize += $val['size'];
}
echo ($totalSize / 1024 / 1024 / 1024) . ' GB';
?>
Getting the size of large buckets via the API (either the AWS CLI or s4cmd) is quite slow. Here's how to parse the S3 Usage Report with a bash one-liner (the storage usage value is reported in byte-hours, so dividing by 24 converts a one-day report into bytes):
cat report.csv | awk -F, '{printf "%.2f GB %s %s\n", $7/(1024**3)/24, $4, $2}' | sort -n
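Since the report mixes many usage types, one refinement (a sketch, assuming the same column layout and a TimedStorage row per bucket) is to filter to the storage rows first:
grep 'TimedStorage' report.csv | awk -F, '{printf "%.2f GB %s\n", $7/(1024**3)/24, $4}' | sort -n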
The AWS console won't show you this, but you can use Bucket Explorer or CloudBerry Explorer to get the total size of a bucket. Both have free versions available.
Note: these products still have to get the size of each individual object, so it can take a long time for buckets with many objects.
Based on @cudds's answer:
function s3size()
{
    # Print the recursive total size of each given bucket or bucket/prefix
    for path in "$@"; do
        size=$(aws s3 ls "s3://$path" --recursive | grep -v -E "(Bucket: |Prefix: |LastWriteTime|^$|--)" | awk 'BEGIN {total=0}{total+=$3}END{printf "%.2fGb\n", (total/1024/1024/1024)}')
        echo "[s3://$path]=[$size]"
    done
}
...
$ s3size bucket-a bucket-b/dir
[s3://bucket-a]=[24.04Gb]
[s3://bucket-b/dir]=[26.69Gb]
Also, Cyberduck conveniently allows calculating the size of a bucket or a folder.
Mini John's answer totally worked for me! Awesome... I had to add `--region eu-west-1` from Europe, though.
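For reference, the full command would then look something like this (a sketch; BUCKETNAME is a placeholder):
aws s3api list-objects --bucket BUCKETNAME --region eu-west-1 --output json \
    --query "[sum(Contents[].Size), length(Contents[])]"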
This is an old question, but since I was looking for the answer myself, I ran across it. Some of the answers reminded me that I use S3 Browser to manage data. You can click on a bucket, hit Properties, and it shows you the total. Pretty simple. I highly recommend the browser: https://s3browser.com/default.aspx?v=6-1-1&fam=x64
You asked: is there information in the AWS console about how much disk space is in use in my S3 cloud?
So I go to the Billing Dashboard and check the S3 usage in the current bill.
They give you the information, month-to-date (MTD), in GB to 6 decimal places, IOW, down to the KB level.
It's broken down by region, but adding them up (assuming you use more than one region) is easy enough.
BTW: you may need specific IAM permissions to get to the billing information.
Well, you can also do it through an S3 client if you prefer a human-friendly UI.
I use CrossFTP, which is free and cross-platform. There you can right-click on the folder -> select "Properties..." -> click the "Calculate" button next to Size, and voilà.
s3admin is an open-source app (UI) that lets you browse buckets, calculate total size, and show the largest/smallest files. It's tailored for getting a quick overview of your buckets and their usage.
I use Cloud Turtle to get the size of individual buckets. If the bucket size exceeds 100 GB, it takes some time to display the size. Cloud Turtle is freeware.