I have a Databricks instance which does some work. Jobs are triggered from Azure Data Factory. There are several environments, and each one has its own Key Vault to store secrets.
As long as I kept the access token "hardcoded" in the Databricks linked service configuration, everything worked fine. But I need to comply with security standards, so keeping it in a JSON file that lives somewhere isn't an option - it was acceptable only for the time being.
Key Vault to the rescue: the Databricks access token is created via the API and stored in Key Vault. I then wanted to reference the Key Vault linked service from the Databricks linked service to populate the access token, and here comes the surprise - it doesn't work.
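For context, the token-creation step is just a call to the Databricks Token API, whose `token_value` is then pushed to Key Vault. A minimal sketch of building that call with the stdlib only (the workspace URL, bearer credential, and comment are placeholders, and the request is constructed but not sent here):

```python
import json
import urllib.request

def build_token_create_request(domain: str, bearer_token: str,
                               lifetime_seconds: int = 3600) -> urllib.request.Request:
    """Build the Databricks Token API call that mints a new access token.

    `domain` is the workspace URL (e.g. https://adb-xxxx.azuredatabricks.net),
    `bearer_token` is a credential already valid for that workspace.
    """
    body = json.dumps({
        "lifetime_seconds": lifetime_seconds,
        "comment": "Token for ADF linked service",  # free-text label, placeholder
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{domain}/api/2.0/token/create",
        data=body,
        headers={
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# urllib.request.urlopen(req) would perform the call; the response's
# "token_value" field is what gets written to the Key Vault secret
# (e.g. with azure-keyvault-secrets' SecretClient.set_secret).
req = build_token_create_request("https://example.azuredatabricks.net", "dapiXXXX")
```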
I can't debug the pipeline, I can't trigger it, I can't even test the connection; it always fails with 403 Invalid access token:
The JSON for this linked service:
{
    "name": "ls_databricks",
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "annotations": [],
        "type": "AzureDatabricks",
        "typeProperties": {
            "domain": "https://**************.azuredatabricks.net",
            "accessToken": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "ls_keyVault",
                    "type": "LinkedServiceReference"
                },
                "secretName": "DatabricksAccessToken"
            },
            "existingClusterId": "*********"
        }
    }
}
Meanwhile, using Postman I can access the Databricks API with the same access token without any issues:
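The Postman check boils down to a GET against a cheap authenticated endpoint with the same bearer token; a stdlib-only sketch of the equivalent call (workspace URL and token are placeholders, and the request is built but not sent):

```python
import urllib.request

def build_databricks_probe(domain: str, access_token: str) -> urllib.request.Request:
    """Same check as the Postman call: hit an authenticated endpoint.

    A 200 from urllib.request.urlopen(probe) means the token itself works;
    a 403 here would point at the token rather than at the ADF linked service.
    """
    return urllib.request.Request(
        url=f"{domain}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {access_token}"},
        method="GET",
    )

probe = build_databricks_probe("https://example.azuredatabricks.net", "dapiXXXX")
```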

The Key Vault linked service itself works fine and its connection test passes:

I have also configured a different linked service that connects to ADLS using the same Key Vault, and it works as expected:
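For comparison, the working ADLS linked service references the same Key Vault in the same shape, roughly like this (the storage account URL and secret name below are placeholders, not my actual values):

```json
{
    "name": "ls_adls",
    "type": "Microsoft.DataFactory/factories/linkedservices",
    "properties": {
        "annotations": [],
        "type": "AzureBlobFS",
        "typeProperties": {
            "url": "https://mystorageaccount.dfs.core.windows.net",
            "accountKey": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "ls_keyVault",
                    "type": "LinkedServiceReference"
                },
                "secretName": "AdlsAccountKey"
            }
        }
    }
}
```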

Does anybody have any idea what's wrong here? Is it just broken, or am I doing something wrong?
p.s. Apologies for flooding you with all of these screenshots :)
I'm using the SCIM API (https://docs.databricks.com/dev-tools/api/latest/scim/scim-sp.html) to entitle my service principal to the proper group on the Databricks instance.
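That SCIM step amounts to a PATCH on the target group that adds the service principal as a member; a sketch of building that request (workspace URL, token, group ID, and principal ID are placeholders, and the request is constructed but not sent):

```python
import json
import urllib.request

def build_scim_add_member_request(domain: str, access_token: str,
                                  group_id: str,
                                  principal_id: str) -> urllib.request.Request:
    """PATCH /api/2.0/preview/scim/v2/Groups/{id} to add one member."""
    body = json.dumps({
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{
            "op": "add",
            "value": {"members": [{"value": principal_id}]},
        }],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{domain}/api/2.0/preview/scim/v2/Groups/{group_id}",
        data=body,
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/scim+json",
        },
        method="PATCH",
    )

scim_req = build_scim_add_member_request(
    "https://example.azuredatabricks.net", "dapiXXXX", "123", "456")
```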
