I want to create a Spark DataFrame using PySpark, and for that I ran this code in PyCharm:
from pyspark.sql import SparkSession
spark_session = SparkSession.builder \
    .enableHiveSupport() \
    .master("local") \
    .getOrCreate()
However, it prints these warnings:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
18/01/08 10:17:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18/01/08 10:18:14 WARN ObjectStore: Failed to get database global_temp, returning NoSuchObjectException
How should I solve this problem?