I am using the code below to query a SQL Server table, hr.employee, in my Azure SQL database from Azure Databricks. I am new to Spark SQL and am trying to learn the nuances one step at a time.
Azure Databricks:
%scala
val jdbcHostname = dbutils.widgets.get("hostName")
val jdbcPort = 1433
val jdbcDatabase = dbutils.widgets.get("database")
val jdbcUrl = s"jdbc:sqlserver://${jdbcHostname}:${jdbcPort};database=${jdbcDatabase}"
import java.util.Properties
val connectionProperties = new Properties()
connectionProperties.put("user", s"${jdbcUsername}")
connectionProperties.put("password", s"${jdbcPassword}")
%scala
val driverClass = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
connectionProperties.setProperty("Driver", driverClass)
%scala
val employee = spark.read.jdbc(jdbcUrl, "hr.Employee", connectionProperties)
%scala
spark.sql("select * from employee")
%sql
select * from employee
employee.select("col1","col2").show()
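For context, here is a consolidated single-cell sketch of the same read, assuming jdbcUsername and jdbcPassword come from a Databricks secret scope (the scope, key, and column names below are placeholders); the error that follows was produced by the original cells above, not by this sketch.
%scala
import java.util.Properties

// Connection details come from notebook widgets, as in the cells above.
val jdbcHostname = dbutils.widgets.get("hostName")
val jdbcPort = 1433
val jdbcDatabase = dbutils.widgets.get("database")
val jdbcUrl = s"jdbc:sqlserver://${jdbcHostname}:${jdbcPort};database=${jdbcDatabase}"

// Placeholder secret scope and key names; substitute however the credentials are actually stored.
val jdbcUsername = dbutils.secrets.get("my-scope", "sql-username")
val jdbcPassword = dbutils.secrets.get("my-scope", "sql-password")

val connectionProperties = new Properties()
connectionProperties.put("user", jdbcUsername)
connectionProperties.put("password", jdbcPassword)
connectionProperties.setProperty("Driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

// Read hr.Employee into a DataFrame and query it through the DataFrame API.
val employee = spark.read.jdbc(jdbcUrl, "hr.Employee", connectionProperties)
employee.select("col1", "col2").show()   // col1/col2 are placeholder column names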
I get the error below. I am not sure what I am doing wrong. I have tried a couple of variations as well, with no luck so far.
Error:
';' expected but integer literal found.
command-922779590419509:26: error: not found: value %
%sql
command-922779590419509:27: error: not found: value select
select * from employee
command-922779590419509:27: error: not found: value from
select * from employee
command-922779590419509:16: error: not found: value %
%scala
Comments:
spark.sql(query)? – Lamanus
%scala spark.sql("select * from employee") – Lamanus
%scala and get the same error. – ITHelpGuy
val dataDF = { val query = "select * from hr.Employee"; sqlContext.sql(query) }; This doesn't work either. – ITHelpGuy
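For completeness, a minimal sketch of the variation the comments are circling around, assuming the DataFrame first has to be registered as a temporary view before spark.sql or a %sql cell can resolve a table named employee (the view name is arbitrary, and each magic command heads its own cell):
%scala
// Reuses jdbcUrl and connectionProperties from the earlier cells.
val employee = spark.read.jdbc(jdbcUrl, "hr.Employee", connectionProperties)

// Register the DataFrame under a name that Spark SQL can resolve.
employee.createOrReplaceTempView("employee")

// Query it through spark.sql in the same Scala cell ...
spark.sql("select * from employee").show()

%sql
-- ... or from a separate %sql cell.
select * from employee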