Configuration of ScalarDB Analytics with Spark
Warning
This version of ScalarDB Analytics with Spark was in private preview. Please use version 3.14 or later instead.
There are two ways to configure ScalarDB Analytics with Spark:
- By configuring the properties in `spark.conf`
- By using the helper method that ScalarDB Analytics with Spark provides
Both approaches are conceptually equivalent, so you can choose either one based on your preference.
Configure ScalarDB Analytics with Spark by using spark.conf
Since ScalarDB Analytics with Spark is provided as a Spark custom catalog plugin, you can enable it via `spark.conf`:
spark.sql.catalog.scalardb_catalog = com.scalar.db.analytics.spark.datasource.ScalarDbCatalog
spark.sql.catalog.scalardb_catalog.config = /<PATH_TO_YOUR_SCALARDB_PROPERTIES>/config.properties
spark.sql.catalog.scalardb_catalog.namespaces = <YOUR_NAMESPACE_NAME_1>,<YOUR_NAMESPACE_NAME_2>
spark.sql.catalog.scalardb_catalog.license.key = {"your":"license", "key":"in", "json":"format"}
spark.sql.catalog.scalardb_catalog.license.cert_path = /<PATH_TO_YOUR_LICENSE>/cert.pem
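Once the catalog is registered, its tables can be queried like those of any other Spark catalog. The snippet below is a minimal sketch, assuming a table named `<YOUR_TABLE_NAME>` exists in one of the configured namespaces; substitute your own namespace and table names:

```sql
-- Tables registered by the plugin are addressed as <catalog>.<namespace>.<table>,
-- where the catalog name matches the one configured above (scalardb_catalog)
SELECT * FROM scalardb_catalog.<YOUR_NAMESPACE_NAME_1>.<YOUR_TABLE_NAME> LIMIT 10;
```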