Unable to instantiate SparkSession with Hive support because Hive classes are not found.


I ran into this error while working through the following tutorial:

https://spark.apache.org/docs/latest/sql-data-sources-hive-tables.html
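The error is triggered by `enableHiveSupport()` on the session builder. A minimal sketch of the kind of code involved (class and app names here are my own placeholders, not taken from the tutorial):

```scala
import org.apache.spark.sql.SparkSession

object HiveExample {
  def main(args: Array[String]): Unit = {
    // enableHiveSupport() requires the spark-hive classes on the classpath;
    // if they are missing, this call throws IllegalArgumentException
    val spark = SparkSession.builder()
      .appName("HiveExample")
      .master("local[*]")
      .enableHiveSupport()
      .getOrCreate()

    spark.sql("SHOW DATABASES").show()
    spark.stop()
  }
}
```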

I used the following build.sbt:

name := "SparkProject"

version := "0.1"

scalaVersion := "2.12.9"

// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.2"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.2"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.2" % "provided"

Running the program then threw:

Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
	at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:869)

I changed the Hive dependency from

libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.2" % "provided"

to

libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.4.2"

and then I was able to execute the program without any errors. The reason: the "provided" scope tells sbt that the dependency will be supplied by the runtime environment (for example, by spark-submit on a cluster with Spark installed), so sbt excludes it from the classpath when running locally from sbt or an IDE, which is why the Hive classes could not be found.
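If you still want to keep the dependency as "provided" (for example, to keep it out of an assembly jar), one common workaround, suggested in the sbt-assembly documentation, is to include provided dependencies on the run classpath. A hedged sketch of that build.sbt addition (verify against your sbt version):

```scala
// build.sbt: keep "provided" deps (like spark-hive) on the classpath
// when running the app from sbt with `sbt run`
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated
```

With this in place, `sbt run` sees the Hive classes, while packaging tools that honor the "provided" scope still exclude them.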
