I have a scenario where I need to trigger a stored procedure in SQL Server from Databricks. With the Spark SQL connector I can run SELECT queries, but I cannot trigger the stored procedure.
I am trying to connect via Java JDBC instead, but whenever I execute the code it fails with "No suitable driver found".
I have uploaded the driver (mssql-jdbc-8.2.2.jre11.jar) to the Databricks cluster.
Tried code:
import java.sql.{Connection, DriverManager, ResultSet}
DriverManager.registerDriver(new com.microsoft.sqlserver.jdbc.SQLServerDriver());
Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver")
val conn = DriverManager.getConnection("jdbc:xxxx.database.windows.net;databaseName=yyyy-db;user=admin;password=pwd;useUnicode=true;characterEncoding=UTF-8")
Error: java.sql.SQLException: No suitable driver found
Any suggestions on this? Is there a way to execute a stored procedure from Databricks using Scala / Java?
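For context on what I expected to work: the "No suitable driver found" error usually means the URL does not start with the `jdbc:sqlserver://` prefix that the Microsoft driver requires. Below is a minimal sketch of what I believe the call should look like, using JDBC's standard `{call ...}` escape syntax for stored procedures; the host, database, credentials, and the procedure name `dbo.my_proc` with its single parameter are all placeholders, not my real values:

```scala
import java.sql.{CallableStatement, Connection, DriverManager}

// The SQL Server driver only claims URLs that begin with "jdbc:sqlserver://";
// without that prefix, DriverManager throws "No suitable driver found".
val url = "jdbc:sqlserver://xxxx.database.windows.net:1433;" +
  "databaseName=yyyy-db;user=admin;password=pwd;" +
  "useUnicode=true;characterEncoding=UTF-8"

// Invoke a stored procedure through the standard JDBC escape syntax.
// dbo.my_proc and its single input parameter are hypothetical.
def callMyProc(jdbcUrl: String): Unit = {
  val conn: Connection = DriverManager.getConnection(jdbcUrl)
  try {
    val cs: CallableStatement = conn.prepareCall("{call dbo.my_proc(?)}")
    try {
      cs.setInt(1, 42) // example input parameter
      cs.execute()
    } finally cs.close()
  } finally conn.close()
}
```

With a JDBC 4+ driver jar attached to the cluster, the explicit `Class.forName` / `DriverManager.registerDriver` calls should be unnecessary, since the driver registers itself via the service-loader mechanism.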