Cannot find destination table in metastore #3

Open
GrzesiuKo opened this issue May 4, 2020 · 0 comments
I am encountering this issue, even though the table exists in the database:
Exception in thread "main" org.apache.spark.sql.streaming.StreamingQueryException: Cannot find destination table in metastore, please create table mytable at first
and
Caused by: org.apache.spark.sql.catalyst.analysis.NoSuchTableException: Table or view 'mytable' not found in database 'default';

I am sure that the given config properties are correct, because they work when using the HiveWarehouseConnector.

  def writeToHive(config: Map[String, String], data: Dataset[Row]): StreamingQuery = {
    // Map the generic job config onto the options expected by the hive-streaming sink
    val hiveConfig: Map[String, String] = Map(
      "metastore" -> config("spark.sql.hive.metastoreUri"),
      "db" -> config("spark.hive.database.name"),
      "table" -> config("spark.hive.database.table.name"),
      "checkpointLocation" -> (config("spark.hive.checkpointLocation") + config("stream.name"))
    )

    // Start the structured streaming query that writes into the Hive table
    data
      .writeStream
      .format("hive-streaming")
      .options(hiveConfig)
      .queryName(config("stream.name") + "_query")
      .start()
  }
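
A sanity check worth running here (a sketch only: the spark value stands for the same SparkSession that starts the query, built with enableHiveSupport(), and the config keys are the ones used above) is to look the destination table up through that session's catalog before starting the stream:

  // Sketch: confirm the destination table is visible to the metastore this session uses.
  // If this prints false, the session is talking to a metastore that does not contain
  // the table, which would match the NoSuchTableException above.
  val db = config("spark.hive.database.name")
  val table = config("spark.hive.database.table.name")
  val visible = spark.catalog.tableExists(db, table)
  println(s"$db.$table visible to this session: $visible")

If that check fails even though the table shows up in Hive itself, the session is most likely missing Hive support or pointing at a different metastore URI than expected.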

I am trying to use this repo because HWC cannot handle writing to Hive when a Hive column contains a struct; it fills the struct fields with null.
