I am new to Azure Databricks. I have written a sample Spark program in Scala to load data into Azure SQL using the code below, but I am getting an error. Can someone please help me with this?
Error Message ----
com.microsoft.sqlserver.jdbc.SQLServerException: The TCP/IP connection to the host b63da5ce2d2d.tr27.northeurope1-a.worker.database.windows.net, port 65535 has failed. Error: "connect timed out. Verify the connection properties. Make sure that an instance of SQL Server is running on the host and accepting TCP/IP connections at the port. Make sure that TCP connections to the port are not blocked by a firewall."
Scala code -
import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._

// Acquire a DataFrame collection (val collection)
val config = Config(Map(
  "url"          -> "mysqlserver.database.windows.net",
  "databaseName" -> "MyDatabase",
  "dbTable"      -> "dbo.Clients",
  "user"         -> "username",
  "password"     -> "xxxxxxxx"
))

import org.apache.spark.sql.SaveMode
collection.write.mode(SaveMode.Append).sqlDB(config)
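For comparison, my understanding is that the equivalent write using Spark's built-in JDBC data source would look roughly like the sketch below (the server name, database, table, credentials and the sample DataFrame are placeholders, not my real values):

import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().getOrCreate()

// Placeholder DataFrame standing in for "collection" from the snippet above.
val collection = spark.range(1).toDF("Id")

// Full JDBC URL with the default Azure SQL port 1433 and the database name.
val jdbcUrl = "jdbc:sqlserver://mysqlserver.database.windows.net:1433;database=MyDatabase"

val connectionProperties = new java.util.Properties()
connectionProperties.setProperty("user", "username")
connectionProperties.setProperty("password", "xxxxxxxx")
connectionProperties.setProperty("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

// Append the DataFrame rows to the target table over JDBC.
collection.write
  .mode(SaveMode.Append)
  .jdbc(jdbcUrl, "dbo.Clients", connectionProperties)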