pyspark.SparkContext.defaultMinPartitions
property SparkContext.defaultMinPartitions
Default minimum number of partitions for Hadoop RDDs when not given by the user.
New in version 1.1.0.
Examples
>>> sc.defaultMinPartitions > 0
True
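As a further illustration, the sketch below shows how this default relates to sc.defaultParallelism and where it is picked up by Hadoop-based readers such as textFile. It assumes sc is an active SparkContext; "data.txt" is a hypothetical path, and the min(defaultParallelism, 2) relationship reflects the current implementation and may change between releases.
>>> # Implementation detail in current Spark releases: capped at 2
>>> sc.defaultMinPartitions == min(sc.defaultParallelism, 2)
True
>>> # When minPartitions is omitted, textFile falls back to sc.defaultMinPartitions
>>> rdd = sc.textFile("data.txt")  # hypothetical path; evaluation is lazy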