
Default limits and quotas for Spark engine

The following sections provide details about the default limit and quota settings for the Spark engine.

These default values are set to avoid excessive billing. To override the default limits and quotas for the Spark engine based on your requirements, contact IBM Support.

Application limits

The following table lists the default limits and quotas for the Spark engine.

Default limits and quotas for Spark instances

Category                                                    Default
Maximum number of Spark engines per watsonx.data instance   3
Maximum number of nodes per Spark engine                    20
Shuffle space per core                                      Approx. 30 GB (not customizable)

Supported Spark driver and executor vCPU and memory combinations

Apache Spark supports only the following pre-defined Spark driver and executor vCPU and memory combinations.

These two vCPU to memory proportions are supported: 1 vCPU to 4 GB of memory and 1 vCPU to 8 GB of memory.

The following table shows the supported vCPU to memory size combinations.

Supported vCPU to memory size combinations

Lower value     Upper value
1 vCPU x 1 GB   10 vCPU x 48 GB
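The proportion rule above can be sketched as a small check. This is a hypothetical helper, not part of any watsonx.data API; it only encodes the two supported proportions (1 vCPU to 4 GB and 1 vCPU to 8 GB) stated in this section.

```python
def uses_supported_proportion(vcpu: int, memory_gb: int) -> bool:
    """Check whether a driver or executor size uses one of the two
    supported vCPU-to-memory proportions: 1:4 or 1:8 (hypothetical
    helper; watsonx.data itself validates the payload)."""
    return memory_gb in (vcpu * 4, vcpu * 8)

print(uses_supported_proportion(1, 4))   # 1 vCPU : 4 GB  -> True
print(uses_supported_proportion(2, 16))  # 1 vCPU : 8 GB  -> True
print(uses_supported_proportion(2, 6))   # unsupported    -> False
```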

Supported Spark version for watsonx.data Spark engine

IBM® watsonx.data supports the following Spark runtime versions for running Spark workloads.

Supported Spark versions

Name                 Status
Apache Spark 3.4.4   Supported
Apache Spark 3.5.4   Supported
Apache Spark 4.0     Supported

Supported Spark version for Gluten accelerated Spark engine

IBM® watsonx.data supports the following Spark runtime versions for running Spark workloads on the Gluten accelerated Spark engine.

Supported Spark versions

Name                 Status
Apache Spark 3.4.4   Supported
Apache Spark 3.5.4   Supported

Default hardware configuration

To manually specify the number of CPU cores (driver and executor) and the memory that is required for the workload, modify the following configurations and pass them in the payload:

"num-executors" : "1",
"spark.executor.cores": "1",
"spark.executor.memory": "4G",
"spark.driver.cores": "1",
"spark.driver.memory": "4G",
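For example, these settings might be assembled into a submission payload as follows. This is a sketch: only the Spark configuration keys come from the fragment above, and the `application_details` wrapper and application path are assumptions shown for illustration.

```python
import json

# Spark configuration keys from this page; values use the
# supported 1 vCPU : 4 GB proportion.
spark_conf = {
    "num-executors": "1",
    "spark.executor.cores": "1",
    "spark.executor.memory": "4G",
    "spark.driver.cores": "1",
    "spark.driver.memory": "4G",
}

# Hypothetical payload shape; the wrapper field and the
# application path are example assumptions, not from this page.
payload = {
    "application_details": {
        "application": "/path/to/app.py",
        "conf": spark_conf,
    }
}

print(json.dumps(payload, indent=2))
```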

For details on enabling autoscaling, see Enabling application autoscaling.