public class TaskContextImpl extends TaskContext implements Logging
| Constructor and Description |
|---|
| TaskContextImpl(int stageId, int partitionId, long taskAttemptId, int attemptNumber, boolean runningLocally, org.apache.spark.executor.TaskMetrics taskMetrics) |
| Modifier and Type | Method and Description |
|---|---|
| void | addOnCompleteCallback(scala.Function0<scala.runtime.BoxedUnit> f). Adds a callback function to be executed on task completion. |
| TaskContextImpl | addTaskCompletionListener(scala.Function1<TaskContext,scala.runtime.BoxedUnit> f). Adds a listener in the form of a Scala closure to be executed on task completion. |
| TaskContextImpl | addTaskCompletionListener(TaskCompletionListener listener). Adds a (Java friendly) listener to be executed on task completion. |
| long | attemptId() |
| int | attemptNumber(). How many times this task has been attempted. |
| boolean | isCompleted(). Returns true if the task has completed. |
| boolean | isInterrupted(). Returns true if the task has been killed. |
| boolean | isRunningLocally(). Returns true if the task is running locally in the driver program. |
| void | markInterrupted(). Marks the task for interruption, i.e. cancellation. |
| void | markTaskCompleted(). Marks the task as completed and triggers the listeners. |
| int | partitionId(). The ID of the RDD partition that is computed by this task. |
| boolean | runningLocally() |
| int | stageId(). The ID of the stage that this task belongs to. |
| long | taskAttemptId(). An ID that is unique to this task attempt (within the same SparkContext, no two task attempts will share the same attempt ID). |
| org.apache.spark.executor.TaskMetrics | taskMetrics(). ::DeveloperApi:: |
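TaskContextImpl is the concrete context object handed to each running task; application code normally reaches it through the TaskContext API rather than by this class name. Below is a minimal sketch of that pattern, assuming a Spark release where the static TaskContext.get() accessor is available; the local SparkContext setup and the object name are illustrative only.

```scala
import org.apache.spark.{SparkConf, SparkContext, TaskContext}

object TaskContextUsageSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("task-context-sketch").setMaster("local[2]")
    val sc = new SparkContext(conf)

    sc.parallelize(1 to 100, 4).mapPartitions { iter =>
      // TaskContext.get() returns the context built for this task attempt
      // (a TaskContextImpl under the hood).
      val ctx = TaskContext.get()

      // Scala-closure overload of addTaskCompletionListener, bound to a typed
      // value so the call unambiguously picks the Function1 variant.
      val onCompletion: TaskContext => Unit = c =>
        println(s"stage ${c.stageId}, partition ${c.partitionId}, attempt ${c.attemptNumber} finished")
      ctx.addTaskCompletionListener(onCompletion)

      iter.map(_ * 2)
    }.count()

    sc.stop()
  }
}
```

Binding the closure to an explicitly typed function value keeps the call unambiguous between the two addTaskCompletionListener overloads listed above.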
Methods inherited from class java.lang.Object:
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

Methods inherited from interface org.apache.spark.Logging:
initializeIfNecessary, initializeLogging, isTraceEnabled, log_, log, logDebug, logDebug, logError, logError, logInfo, logInfo, logName, logTrace, logTrace, logWarning, logWarning

Constructor Detail

public TaskContextImpl(int stageId,
                       int partitionId,
                       long taskAttemptId,
                       int attemptNumber,
                       boolean runningLocally,
                       org.apache.spark.executor.TaskMetrics taskMetrics)
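Constructing a TaskContextImpl directly is mostly a test concern: although the constructor surfaces as public here, the class is internal to Spark (private[spark] in the Scala sources), so a hand-built instance would normally live inside the org.apache.spark package, as Spark's own test suites do. The sketch below assumes that layout; the ability to instantiate TaskMetrics directly is also an assumption for this release, and the argument values are illustrative only.

```scala
package org.apache.spark

import org.apache.spark.executor.TaskMetrics

object TaskContextImplConstructionSketch {
  def main(args: Array[String]): Unit = {
    // Mirrors the constructor signature documented above; values are illustrative.
    val ctx = new TaskContextImpl(
      /* stageId        */ 1,
      /* partitionId    */ 3,
      /* taskAttemptId  */ 42L,
      /* attemptNumber  */ 0,
      /* runningLocally */ false,
      /* taskMetrics    */ new TaskMetrics())  // direct instantiation assumed possible in this release

    // A fresh context starts out neither completed nor interrupted.
    assert(ctx.stageId == 1)
    assert(ctx.partitionId == 3)
    assert(ctx.attemptNumber == 0)
    assert(!ctx.isCompleted)
    assert(!ctx.isInterrupted)
    assert(!ctx.isRunningLocally)
  }
}
```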
Method Detail

public int stageId()
Specified by: stageId in class TaskContext

public int partitionId()
Specified by: partitionId in class TaskContext

public long taskAttemptId()
Specified by: taskAttemptId in class TaskContext

public int attemptNumber()
Specified by: attemptNumber in class TaskContext

public boolean runningLocally()
Specified by: runningLocally in class TaskContext

public org.apache.spark.executor.TaskMetrics taskMetrics()
Specified by: taskMetrics in class TaskContext

public long attemptId()
Specified by: attemptId in class TaskContext

public TaskContextImpl addTaskCompletionListener(TaskCompletionListener listener)
Specified by: addTaskCompletionListener in class TaskContext

public TaskContextImpl addTaskCompletionListener(scala.Function1<TaskContext,scala.runtime.BoxedUnit> f)
Specified by: addTaskCompletionListener in class TaskContext

public void addOnCompleteCallback(scala.Function0<scala.runtime.BoxedUnit> f)
Specified by: addOnCompleteCallback in class TaskContext
Parameters: f - Callback function.

public void markTaskCompleted()

public void markInterrupted()

public boolean isCompleted()
Specified by: isCompleted in class TaskContext

public boolean isRunningLocally()
Specified by: isRunningLocally in class TaskContext

public boolean isInterrupted()
Specified by: isInterrupted in class TaskContext
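Taken together, the mark* and is* methods describe the lifecycle this class tracks for the executor: markInterrupted() raises the kill flag that long-running task code can poll through isInterrupted(), and markTaskCompleted() flips isCompleted() and fires the registered completion listeners. The sketch below drives that lifecycle by hand; it again assumes code placed in the org.apache.spark package, that TaskMetrics can be instantiated directly, and that TaskCompletionListener lives in org.apache.spark.util.

```scala
package org.apache.spark

import org.apache.spark.executor.TaskMetrics
import org.apache.spark.util.TaskCompletionListener

object TaskLifecycleSketch {
  def main(args: Array[String]): Unit = {
    // Illustrative constructor arguments.
    val ctx = new TaskContextImpl(0, 0, 1L, 0, false, new TaskMetrics())

    // Java-friendly listener overload.
    ctx.addTaskCompletionListener(new TaskCompletionListener {
      override def onTaskCompletion(context: TaskContext): Unit =
        println("listener: task completed")
    })

    // Scala-closure overload, bound to a typed value to pick the Function1 variant.
    val closureListener: TaskContext => Unit =
      c => println(s"closure listener: isCompleted = ${c.isCompleted}")
    ctx.addTaskCompletionListener(closureListener)

    // Normally the executor's task runner calls these; here they are driven by hand.
    ctx.markInterrupted()
    println(s"isInterrupted = ${ctx.isInterrupted}")   // true: the task is flagged as killed

    ctx.markTaskCompleted()                            // runs the two listeners above
    println(s"isCompleted = ${ctx.isCompleted}")       // true
  }
}
```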