1 vote

I have a use case of running a Spark job every day. I am using Databricks to run the job, and since it is a daily job I would like to create a cluster, run the notebook, and then destroy the cluster. I am using Azure Data Factory to orchestrate this, but I don't see any option to customise the "Inactivity period" when creating the Databricks linked service in Data Factory and configuring the new cluster.

How can I destroy the cluster once my job is completed?


1 Answer

2 votes

Just choose "New job cluster" when you configure the Databricks linked service in Data Factory. Job clusters are only active for the lifetime of the job: Databricks creates the cluster when the activity run starts and terminates it automatically when the run finishes, so there is no "Inactivity period" to configure.
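For reference, here is a minimal sketch of what the linked service JSON could look like when it targets a new job cluster. The workspace URL, access token, node type, worker count, and runtime version below are placeholders chosen for illustration, not values from the original post:

```json
{
    "name": "AzureDatabricksLinkedService",
    "properties": {
        "type": "AzureDatabricks",
        "typeProperties": {
            "domain": "https://adb-1234567890123456.7.azuredatabricks.net",
            "accessToken": {
                "type": "SecureString",
                "value": "<personal-access-token>"
            },
            "newClusterVersion": "10.4.x-scala2.12",
            "newClusterNodeType": "Standard_DS3_v2",
            "newClusterNumOfWorker": "2"
        }
    }
}
```

Because the cluster is described with the newCluster* properties (rather than pointing at an existingClusterId), each Notebook activity run gets a fresh job cluster that Databricks tears down as soon as the run completes.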