
I am trying to copy data from a bucket in Google Cloud Storage into a local Hadoop cluster (which I have installed on my Mac). I have followed the instructions given in this link: Migrating 50TB data from local Hadoop cluster to Google Cloud Storage. But I am getting the following error when I execute the command hdfs dfs -ls gs://tempuserstorage:

    17/04/28 15:42:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
    17/04/28 15:42:14 INFO gcs.GoogleHadoopFileSystemBase: GHFS version: 1.6.0-hadoop2
    -ls: Google Cloud Storage bucket name must not contain '/' character.
    Usage: hadoop fs [generic options] -ls [-d] [-h] [-R] [<path> ...]

I have also tried hadoop fs -ls gs://tempuserstorage, but I get the same error.

Am I missing something here?


1 Answer


Looks like I made a mistake while providing the value for fs.gs.system.bucket. Apart from the bucket name (say mybucket), I had also added the directory inside the bucket (mybucket/mydir), which seems to have caused the problem.
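
For reference, here is a minimal sketch of the relevant core-site.xml entries, assuming a placeholder bucket named mybucket with a directory mydir inside it: fs.gs.system.bucket takes the bucket name only, while a path inside the bucket can be set via fs.gs.working.dir instead.

    <!-- core-site.xml: minimal sketch; mybucket and /mydir are placeholders -->
    <property>
      <name>fs.gs.impl</name>
      <value>com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem</value>
    </property>
    <property>
      <!-- bucket name only; a value like mybucket/mydir triggers the
           "bucket name must not contain '/'" error shown above -->
      <name>fs.gs.system.bucket</name>
      <value>mybucket</value>
    </property>
    <property>
      <!-- a directory inside the bucket belongs here, not in fs.gs.system.bucket -->
      <name>fs.gs.working.dir</name>
      <value>/mydir</value>
    </property>

With the property corrected, hdfs dfs -ls gs://mybucket should list the bucket contents.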