pyspark.SparkContext.setInterruptOnCancel
- SparkContext.setInterruptOnCancel(interruptOnCancel)
Set the behavior of job cancellation for jobs started in this thread.
New in version 3.5.0.
- Parameters
- interruptOnCancel : bool
If true, job cancellation results in Thread.interrupt() being called on the job's executor threads. This helps ensure that tasks are actually stopped in a timely manner, but it is off by default due to HDFS-1208, where HDFS may respond to Thread.interrupt() by marking nodes as dead.
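
Examples

A minimal usage sketch. The SparkContext setup, the job group name "demo-group", and the cancellation call from another thread are illustrative assumptions, not prescribed by this API:

>>> from pyspark import SparkContext
>>> sc = SparkContext("local", "InterruptDemo")  # hypothetical app name
>>> # Tag jobs started in this thread with a job group so they can be cancelled.
>>> sc.setJobGroup("demo-group", "cancellable work")
>>> # Opt in to interrupting executor threads when these jobs are cancelled.
>>> sc.setInterruptOnCancel(True)
>>> # A later call such as sc.cancelJobGroup("demo-group") from another thread
>>> # will now call Thread.interrupt() on the running tasks' executor threads.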