
I want to upload files to Azure Blob Storage Gen2, but the problem is that I am not able to connect using the tenant ID, client ID and client secret. I am referring to the Java code given in the documentation -> https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-java#upload-a-file-to-a-directory.

import com.azure.identity.ClientSecretCredential;
import com.azure.identity.ClientSecretCredentialBuilder;
import com.azure.storage.file.datalake.DataLakeServiceClient;
import com.azure.storage.file.datalake.DataLakeServiceClientBuilder;

public static DataLakeServiceClient GetDataLakeServiceClient
    (String accountName, String clientId, String ClientSecret, String tenantID) {

    String endpoint = "https://" + accountName + ".dfs.core.windows.net";

    // Build an Azure AD client secret credential for the service principal
    ClientSecretCredential clientSecretCredential = new ClientSecretCredentialBuilder()
        .clientId(clientId)
        .clientSecret(ClientSecret)
        .tenantId(tenantID)
        .build();

    // Build the Data Lake service client against the account's DFS endpoint
    DataLakeServiceClientBuilder builder = new DataLakeServiceClientBuilder();
    return builder.credential(clientSecretCredential).endpoint(endpoint).buildClient();
}

But I am getting an error for the endpoint at the last line of the above code.

From Postman:

URI: http://localhost:8081/upload/
Request param: <file to be uploaded>

Response:

"error": "Internal Server Error",
"message": "java.lang.NoClassDefFoundError: com/azure/core/implementation/util/ImplUtils"
It seems to be some issue due to a SAS token; the error is coming from DataLakeServiceClientBuilder::endpoint(), but I am not sure why.

1 Answer


If you want to access Azure Data Lake Gen2 via Azure AD auth, we need to assign a special Azure RBAC role (Storage Blob Data Owner, Storage Blob Data Contributor or Storage Blob Data Reader) to the service principal or user. For more details, please refer to here.

For example

  1. Create a service principal and assign the Storage Blob Data Contributor role to it at the storage-account level
az login
az ad sp create-for-rbac -n "MyApp" --role 'Storage Blob Data Contributor' \
    --scopes /subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>
  2. Code (download file)
import com.azure.identity.ClientSecretCredential;
import com.azure.identity.ClientSecretCredentialBuilder;
import com.azure.storage.file.datalake.DataLakeFileClient;
import com.azure.storage.file.datalake.DataLakeFileSystemClient;
import com.azure.storage.file.datalake.DataLakeServiceClient;
import com.azure.storage.file.datalake.DataLakeServiceClientBuilder;

import java.io.ByteArrayOutputStream;

String clientId = "<sp appId>";
String clientSecret = "<sp password>";
String tenantID = "<tenant id>";

// Build a client secret credential for the service principal
ClientSecretCredential clientSecretCredential = new ClientSecretCredentialBuilder()
        .clientId(clientId)
        .clientSecret(clientSecret)
        .tenantId(tenantID)
        .build();

// Build the Data Lake service client against the account's DFS endpoint
String accountName = "<storage account name>";
DataLakeServiceClient serviceClient = new DataLakeServiceClientBuilder()
        .credential(clientSecretCredential)
        .endpoint("https://" + accountName + ".dfs.core.windows.net")
        .buildClient();

// Download the file "test.txt" from the file system (container) named "test"
DataLakeFileSystemClient fileSystemClient = serviceClient.getFileSystemClient("test");
DataLakeFileClient fileClient = fileSystemClient.getFileClient("test.txt");
ByteArrayOutputStream outputStream = new ByteArrayOutputStream();
fileClient.read(outputStream);
byte[] data = outputStream.toByteArray();
System.out.println("The file content : " + new String(data));
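
Since the original question was about uploading rather than downloading, here is a minimal sketch of the upload side. It reuses the fileSystemClient from the snippet above; the directory name, file name and local path are placeholders I made up, not values from the original post:

import com.azure.storage.file.datalake.DataLakeDirectoryClient;

// Minimal upload sketch: "my-directory", "uploaded-file.txt" and the local path are placeholders
DataLakeDirectoryClient directoryClient = fileSystemClient.getDirectoryClient("my-directory");
DataLakeFileClient uploadClient = directoryClient.createFile("uploaded-file.txt", true); // overwrite if it exists
uploadClient.uploadFromFile("<path to local file>", true);                               // overwrite if it exists

uploadFromFile handles the upload in a single call; the documentation linked in the question shows the equivalent lower-level append/flush pattern.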
