Overview of Spark configurations
I find myself looking for an overview too often, so let's create a rough overview of commonly used configuration options for Spark.

As a start, create a SparkSession with the default config:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder \
        .master(SPARK_MASTER) \
        .appName("app name") \
        .getOrCreate()

The SparkContext represents the connection to the cluster; it communicates with the lower-level APIs and RDDs.

Some resource settings for the driver and the application as a whole: spark.driver.memory sizes the driver's heap, while spark.cores.max caps the total number of cores the application may claim on the cluster.

    ...
        .config("spark.driver.memory", "8g") \
        .config("spark.cores.max", "4") \
    ...
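To check which values actually took effect, the configuration can be read back from the running session. Below is a minimal, runnable sketch under assumed values: the local[4] master URL, the app name, and the resource sizes are illustrative, not part of the original post. It builds the session with the settings above and then lists every (key, value) pair via the SparkContext's getConf().getAll().

    from pyspark.sql import SparkSession

    # Assumed master URL and illustrative resource values; replace with your
    # own cluster URL and sizes.
    SPARK_MASTER = "local[4]"

    spark = SparkSession.builder \
        .master(SPARK_MASTER) \
        .appName("config overview") \
        .config("spark.driver.memory", "8g") \
        .config("spark.cores.max", "4") \
        .getOrCreate()

    # The SparkContext exposes the effective configuration of the running session.
    for key, value in spark.sparkContext.getConf().getAll():
        print(key, "=", value)

    spark.stop()

One thing to keep in mind: a setting like spark.driver.memory only takes effect if it is in place before the driver JVM starts, so pass it at session creation (or via spark-submit) rather than trying to change it on an already running session.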