

Use dashDB Local with Spark and Notebooks

Why should you use dashDB Local with Apache Spark?

One standout feature of dashDB Local is its Spark integration. dashDB Local embeds open-source Apache Spark out of the box, providing a secure, multi-tenant Spark execution environment in which Spark applications can be deployed and invoked directly inside the data warehouse.
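Because Spark runs alongside the warehouse, a Spark application can read dashDB tables without moving data through an external ETL step. As a minimal sketch of what such an application might look like, the PySpark snippet below loads a table over JDBC; the hostname, credentials, and table name are placeholders, not values from this document, and the integrated deployment described above uses dashDB Local's own tooling rather than a hand-built connection like this.

```python
# Sketch: loading a dashDB Local table into a Spark DataFrame over JDBC.
# All connection details (host, database, user, table) are illustrative
# placeholders; substitute the values for your own dashDB Local instance.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dashdb-example").getOrCreate()

df = (spark.read.format("jdbc")
      .option("url", "jdbc:db2://dashdb-host:50000/BLUDB")  # assumed host/port
      .option("user", "bluadmin")                           # placeholder user
      .option("password", "********")                       # placeholder password
      .option("dbtable", "MYSCHEMA.SALES")                  # placeholder table
      .load())

df.show(5)  # preview the first rows of the table
```

When the application is deployed into dashDB Local itself (as covered in the sections below), the integrated Spark environment supplies the connection to the local database, so the explicit JDBC options shown here are typically unnecessary.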

Set up dashDB Local with Apache Spark

Use Jupyter Notebooks with dashDB Local

Create a deployable Spark application with dashDB Local

Deploy and run Spark applications in dashDB Local

Use Spark-based machine learning routines in dashDB Local

Run Spark applications in dashDB Local with spark_submit

Use one-click deployment of Spark applications from Notebooks

Walk through the Tornado Spark notebook for dashDB Local

Stream data using the built-in Apache Spark infrastructure in dashDB Local


Connect apps to dashDB

Use dashDB with Watson Analytics

Perform predictive analytics and SQL pushdown

Use dashDB with Spark

Use dashDB with Pyspark and Pandas

Use dashDB with R

Use dashDB with IBM Embeddable Reporting Service

Use dashDB with Tableau

Leverage dashDB in Cognos Business Intelligence

Load geospatial data into dashDB to analyze in Esri ArcGIS

Integrate dashDB with Excel

Extract and export dashDB data to a CSV file

Analyze with SPSS Statistics and dashDB

Use dashDB Local with Spark and Notebooks

Use dashDB Local with Watson Analytics

Use Visual Explain in dashDB Local