
I have three Data Factories in Azure. I've made several changes to pipelines in them (different changes in each), and now I am no longer able to publish from the Data Factory UI, even though publishing previously worked just fine. I believe the issue started after making changes in the UI and running a DevOps pipeline. That pipeline, however, does not deploy anything to the data factories; it only creates an artifact of the ADF content.

In two out of the three data factories I've made the following changes:

  1. Pipelines: changed the target of the copy activity from blob storage to ADLS.
  2. Added a linked service for an on-premises SQL Server (a sketch of what that looks like is below).
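
For reference, here is a minimal sketch of what that new on-premises SQL Server linked service looks like. The names LS_OnPremSql and SelfHostedIR and the connection string values are placeholders rather than the actual definitions; the real one pulls its password from the same LS_keyvault used elsewhere:

{
    "name": "LS_OnPremSql",
    "properties": {
        "type": "SqlServer",
        "typeProperties": {
            "connectionString": "Integrated Security=False;Data Source=<server>;Initial Catalog=<database>;User ID=<user>",
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_keyvault",
                    "type": "LinkedServiceReference"
                },
                "secretName": ""
            }
        },
        "connectVia": {
            "referenceName": "SelfHostedIR",
            "type": "IntegrationRuntimeReference"
        }
    }
}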

In the third data factory I made no changes at all, but the error shows up there as well.

It displays the following error (I've removed sensitive details) in all ADFs:

Publishing error

{
    "_body": "404 - File or directory not found. ... Server Error ... 404 - File or directory not found. ... The resource you are looking for might have been removed, had its name changed, or is temporarily unavailable. ...",
    "status": 404,
    "ok": false,
    "statusText": "OK",
    "headers": {},
    "url": "https://management.azure.com/subscriptions/<subscription id>/resourcegroups/<resource group>/providers/Microsoft.DataFactory/factories/<adf name>/mds/databricks%20notebooks.md?api-version=2018-06-01"
}


Clicking on 'Details' gives the following information on the error:

Error code: OK
Inner error code: undefined
Message: undefined

The data factories are almost exact replicas, apart from some additional pipelines and linked services. One of them has a Databricks workspace in the same resource group and is connected to it, and its pipelines have always run successfully. The other data factories have the same Databricks linked service but no Databricks workspace; it's only there as a template.

The JSON of the Databricks linked service looks like this (secret names removed):

{
    "properties": {
        "type": "AzureDatabricks",
        "annotations": [],
        "typeProperties": {
            "domain": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_keyvault",
                    "type": "LinkedServiceReference"
                },
                "secretName": ""
            },
            "authentication": "MSI",
            "workspaceResourceId": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_keyvault",
                    "type": "LinkedServiceReference"
                },
                "secretName": ""
            },
            "existingClusterId": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "LS_keyvault",
                    "type": "LinkedServiceReference"
                },
                "secretName": ""
            }
        }
    }
} 

Solutions I've tried

  • Registered Databricks (Microsoft.Databricks) as a resource provider on the subscriptions, but the same error still shows.
  • In the data factory that is actually connected to Databricks, updated the Databricks notebook path to reference the true location of the notebooks (that path is set on the notebook activity, as in the sketch below).
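
For context, the notebook path is the one defined on the Databricks notebook activity in the pipeline. This is only an illustrative sketch; RunNotebook, LS_databricks and the /Shared/<notebook name> path are placeholders, not my actual names:

{
    "name": "RunNotebook",
    "type": "DatabricksNotebook",
    "typeProperties": {
        "notebookPath": "/Shared/<notebook name>"
    },
    "linkedServiceName": {
        "referenceName": "LS_databricks",
        "type": "LinkedServiceReference"
    }
}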

The error suggests to me that the issue is related to Databricks, but I can't pinpoint the problem. Has anyone solved this issue before?

Thanks!

I've seen similar issues when working directly against the main branch. The Publish branch can get stale/out of sync with main, specifically when items get moved or renamed. You might see if this helps: stackoverflow.com/questions/59323766/… – Joel Cochran

@JoelCochran Thank you so much. This worked! – LearningAzure

@JoelCochran Would you like to post that as an answer so OP can mark this question as answered? – CHEEKATLAPRADEEP-MSFT

1 Answer


I've seen similar issues when working directly against the main branch. The Publish branch can get stale/out of sync with main, specifically when items get moved or renamed. Here is another post on a related issue; the solution there may help with your situation.
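
As an aside, if you want to check or pin which branch ADF publishes to, that can be done with a publish_config.json file in the root of the collaboration branch. A minimal sketch, with the branch name shown here only as an example:

{
    "publishBranch": "factory/adf_publish"
}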