ScalarDB Analytics with Spark
This version of ScalarDB Analytics with Spark was in private preview. Please use version 3.14 or later instead.
ScalarDB, as a universal transaction manager, mainly targets transactional workloads and therefore supports only a limited subset of relational queries.
ScalarDB Analytics with Spark extends the functionality of ScalarDB to process analytical queries on ScalarDB-managed data by using Apache Spark and Spark SQL.
Because ScalarDB Analytics with Spark is provided as a Spark catalog plugin, you can read externally managed data sources together with their schemas. In particular, you can read data from ScalarDB tables as Spark SQL tables that have the same schema.
You need to have a license key (trial license or commercial license) to use ScalarDB Analytics with Spark. If you don't have a license key, please contact us.
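As a rough sketch of how the catalog plugin fits into a Spark application, the following Scala snippet registers a catalog for ScalarDB Analytics with Spark in a Spark session and runs a Spark SQL query against a ScalarDB table through it. The catalog name `scalardb`, the implementation class, the license-key property name, and the `sample_ns.orders` table are illustrative placeholders, not the actual configuration values; see the configuration documentation linked below for the real settings.

```scala
import org.apache.spark.sql.SparkSession

object ScalarDbAnalyticsExample {
  def main(args: Array[String]): Unit = {
    // Register ScalarDB Analytics with Spark as a Spark catalog plugin.
    // The catalog name, implementation class, and property names below are
    // placeholders; take the actual values from the configuration docs.
    val spark = SparkSession.builder()
      .appName("scalardb-analytics-example")
      .config("spark.sql.catalog.scalardb", "<ScalarDB Analytics catalog implementation class>")
      .config("spark.sql.catalog.scalardb.license.key", "<your license key>") // hypothetical property name
      .getOrCreate()

    // ScalarDB tables then appear as Spark SQL tables under the catalog,
    // so standard analytical queries can be run against them.
    spark.sql(
      """SELECT customer_id, SUM(amount) AS total
        |FROM scalardb.sample_ns.orders
        |GROUP BY customer_id
        |ORDER BY total DESC""".stripMargin
    ).show()

    spark.stop()
  }
}
```

The same query can also be run interactively from `spark-shell` or `spark-sql` once the catalog properties are set in the Spark configuration.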
Further reading
- To run ad-hoc analytical queries or development applications by using ScalarDB Analytics with Spark, see Getting Started with ScalarDB Analytics with Spark.
- For tutorials on how to use ScalarDB Analytics with Spark by using a sample dataset and application, see Run Analytical Queries on Sample Data by Using ScalarDB Analytics with Spark.
- For details on how to configure ScalarDB Analytics with Spark, see Configuration of ScalarDB Analytics with Spark.
- For supported Spark and Scala versions, see Version Compatibility of ScalarDB Analytics with Spark.