org.apache.hadoop.mapred.pipes
Class Submitter

java.lang.Object
  extended by org.apache.hadoop.mapred.pipes.Submitter

public class Submitter
extends Object

The main entry point and job submitter. It can be used either from the command line or through its API to launch Pipes jobs.
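For example, a Pipes job might be configured and submitted through this API roughly as follows. This is a minimal sketch: the executable location and input/output paths are placeholders, and a real job will usually need further configuration (input/output formats, reducer count, and so on).

```java
import java.io.IOException;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.RunningJob;
import org.apache.hadoop.mapred.pipes.Submitter;

public class PipesExample {
  public static void main(String[] args) throws IOException {
    JobConf conf = new JobConf();

    // Point the job at the C++ executable stored in HDFS (placeholder path).
    Submitter.setExecutable(conf, "hdfs:///apps/wordcount-pipes");

    // Use Java record reader/writer; map and reduce run in the C++ binary.
    Submitter.setIsJavaRecordReader(conf, true);
    Submitter.setIsJavaRecordWriter(conf, true);

    // Placeholder input and output paths.
    FileInputFormat.setInputPaths(conf, new Path("/user/me/input"));
    FileOutputFormat.setOutputPath(conf, new Path("/user/me/output"));

    // Submit; the configuration is modified in place for Pipes.
    RunningJob job = Submitter.submitJob(conf);
    System.out.println("Submitted job " + job.getID());
  }
}
```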


Constructor Summary
Submitter()
           
 
Method Summary
static String getExecutable(JobConf conf)
          Get the URI of the application's executable.
static boolean getIsJavaMapper(JobConf conf)
          Check whether the job is using a Java Mapper.
static boolean getIsJavaRecordReader(JobConf conf)
          Check whether the job is using a Java RecordReader.
static boolean getIsJavaRecordWriter(JobConf conf)
          Check whether the job will use a Java RecordWriter.
static boolean getIsJavaReducer(JobConf conf)
          Check whether the job is using a Java Reducer.
static boolean getKeepCommandFile(JobConf conf)
          Does the user want to keep the command file for debugging? If this is true, pipes will write a copy of the command data to a file in the task directory named "downlink.data", which may be used to run the C++ program under the debugger.
static void main(String[] args)
          Submit a pipes job based on the command line arguments.
static void setExecutable(JobConf conf, String executable)
          Set the URI for the application's executable.
static void setIsJavaMapper(JobConf conf, boolean value)
          Set whether the Mapper is written in Java.
static void setIsJavaRecordReader(JobConf conf, boolean value)
          Set whether the job is using a Java RecordReader.
static void setIsJavaRecordWriter(JobConf conf, boolean value)
          Set whether the job will use a Java RecordWriter.
static void setIsJavaReducer(JobConf conf, boolean value)
          Set whether the Reducer is written in Java.
static void setKeepCommandFile(JobConf conf, boolean keep)
          Set whether to keep the command file for debugging.
static RunningJob submitJob(JobConf conf)
          Submit a job to the map/reduce cluster.
 
Methods inherited from class java.lang.Object
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
 

Constructor Detail

Submitter

public Submitter()

Method Detail

getExecutable

public static String getExecutable(JobConf conf)
Get the URI of the application's executable.

Parameters:
conf - the configuration to check
Returns:
the URI where the application's executable is located

setExecutable

public static void setExecutable(JobConf conf,
                                 String executable)
Set the URI for the application's executable. Normally this is an hdfs: location.

Parameters:
conf - the configuration to modify
executable - The URI of the application's executable.

setIsJavaRecordReader

public static void setIsJavaRecordReader(JobConf conf,
                                         boolean value)
Set whether the job is using a Java RecordReader.

Parameters:
conf - the configuration to modify
value - the new value

getIsJavaRecordReader

public static boolean getIsJavaRecordReader(JobConf conf)
Check whether the job is using a Java RecordReader.

Parameters:
conf - the configuration to check
Returns:
true if the job uses a Java RecordReader

setIsJavaMapper

public static void setIsJavaMapper(JobConf conf,
                                   boolean value)
Set whether the Mapper is written in Java.

Parameters:
conf - the configuration to modify
value - the new value

getIsJavaMapper

public static boolean getIsJavaMapper(JobConf conf)
Check whether the job is using a Java Mapper.

Parameters:
conf - the configuration to check
Returns:
true if the job uses a Java Mapper

setIsJavaReducer

public static void setIsJavaReducer(JobConf conf,
                                    boolean value)
Set whether the Reducer is written in Java.

Parameters:
conf - the configuration to modify
value - the new value

getIsJavaReducer

public static boolean getIsJavaReducer(JobConf conf)
Check whether the job is using a Java Reducer.

Parameters:
conf - the configuration to check
Returns:
true if the job uses a Java Reducer

setIsJavaRecordWriter

public static void setIsJavaRecordWriter(JobConf conf,
                                         boolean value)
Set whether the job will use a Java RecordWriter.

Parameters:
conf - the configuration to modify
value - the new value to set

getIsJavaRecordWriter

public static boolean getIsJavaRecordWriter(JobConf conf)
Check whether the job will use a Java RecordWriter.

Parameters:
conf - the configuration to check
Returns:
true if the job's output will be written by a Java RecordWriter

getKeepCommandFile

public static boolean getKeepCommandFile(JobConf conf)
Does the user want to keep the command file for debugging? If this is true, pipes will write a copy of the command data to a file in the task directory named "downlink.data", which may be used to run the C++ program under the debugger. You probably also want to set JobConf.setKeepFailedTaskFiles(true) to keep the entire directory from being deleted. To run using the data file, set the environment variable "hadoop.pipes.command.file" to point to the file.

Parameters:
conf - the configuration to check
Returns:
will the framework save the command file?
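Putting the two debugging settings described above together (a sketch; JobConf.setKeepFailedTaskFiles is part of the same old mapred API, and the job would still need an executable and paths configured before submission):

```java
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.pipes.Submitter;

public class PipesDebugSetup {
  public static void main(String[] args) {
    JobConf conf = new JobConf();

    // Write a copy of the command data to "downlink.data" in the task
    // directory so the C++ program can be run under a debugger.
    Submitter.setKeepCommandFile(conf, true);

    // Keep the task directory (and hence the command file) from being
    // deleted when the task finishes or fails.
    conf.setKeepFailedTaskFiles(true);
  }
}
```

To replay a task outside the framework, point the environment variable "hadoop.pipes.command.file" at the saved file, as described above.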

setKeepCommandFile

public static void setKeepCommandFile(JobConf conf,
                                      boolean keep)
Set whether to keep the command file for debugging.

Parameters:
conf - the configuration to modify
keep - the new value

submitJob

public static RunningJob submitJob(JobConf conf)
                            throws IOException
Submit a job to the map/reduce cluster. All modifications needed to run the job under Pipes are applied to the configuration.

Parameters:
conf - the job to submit to the cluster (MODIFIED)
Throws:
IOException

main

public static void main(String[] args)
                 throws Exception
Submit a pipes job based on the command line arguments.

Parameters:
args - the command line arguments
Throws:
Exception
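From the command line, this entry point is normally reached through the hadoop pipes driver. A typical invocation looks roughly like the following (the paths are placeholders, and additional options such as input/output formats or the number of reduces may also be given):

```shell
hadoop pipes \
  -input /user/me/input \
  -output /user/me/output \
  -program hdfs:///apps/wordcount-pipes
```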


Copyright © 2008 The Apache Software Foundation