Packages that use PigNullableWritable:

Package | Description |
---|---|
org.apache.pig.backend.hadoop | |
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer | |
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.partitioners | |
org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators | |
org.apache.pig.backend.hadoop.executionengine.tez.runtime | |
org.apache.pig.data | This package contains implementations of Pig-specific data types as well as support functions for reading, writing, and using all Pig data types (see the example after this table). |
org.apache.pig.impl.io | |
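
As a small illustration of the org.apache.pig.data package described above, the sketch below builds a tuple and a bag through Pig's public factories. It is a hedged example: the field values and class name are invented, not taken from the Pig sources.

```java
import org.apache.pig.data.BagFactory;
import org.apache.pig.data.DataBag;
import org.apache.pig.data.Tuple;
import org.apache.pig.data.TupleFactory;

public class PigDataTypesSketch {
    public static void main(String[] args) throws Exception {
        // Build a two-field tuple (name, age) via the singleton factory.
        Tuple t = TupleFactory.getInstance().newTuple(2);
        t.set(0, "alice");   // chararray is represented as java.lang.String
        t.set(1, 42);        // int is represented as java.lang.Integer

        // Collect tuples into a bag, Pig's container for grouped data.
        DataBag bag = BagFactory.getInstance().newDefaultBag();
        bag.add(t);

        System.out.println(bag);  // e.g. {(alice,42)}
    }
}
```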

Methods in org.apache.pig.backend.hadoop that return PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
static PigNullableWritable | HDataType.getNewWritableComparable(byte keyType) |
static PigNullableWritable | HDataType.getWritableComparable(java.lang.String className) |
static PigNullableWritable | HDataType.getWritableComparableTypes(byte type) |
static PigNullableWritable | HDataType.getWritableComparableTypes(java.lang.Object o, byte keyType) |

Methods in org.apache.pig.backend.hadoop with parameters of type PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
static byte | HDataType.findTypeFromNullableWritable(PigNullableWritable o) |
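
The two tables above are easiest to read together: the getWritableComparable* helpers wrap a Pig value into its Hadoop-side writable, and findTypeFromNullableWritable maps the wrapper back to a Pig type code. Below is a minimal sketch of that round trip; it is an illustration rather than code from the Pig sources, and the throws Exception clause is just a blanket for whatever checked exceptions these helpers declare.

```java
import org.apache.pig.backend.hadoop.HDataType;
import org.apache.pig.data.DataType;
import org.apache.pig.impl.io.PigNullableWritable;

public class HDataTypeRoundTripSketch {
    public static void main(String[] args) throws Exception {
        // Wrap a plain Java Integer as the corresponding PigNullableWritable.
        PigNullableWritable key =
                HDataType.getWritableComparableTypes(42, DataType.INTEGER);

        // Map the wrapper back to its Pig type code.
        byte typeCode = HDataType.findTypeFromNullableWritable(key);
        System.out.println(typeCode == DataType.INTEGER);  // expected: true

        // A bare wrapper for a given type code can also be requested directly.
        PigNullableWritable blank =
                HDataType.getWritableComparableTypes(DataType.INTEGER);
        System.out.println(blank.getClass().getSimpleName());
    }
}
```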

Methods in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer that return PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
PigNullableWritable | PigMapReduce.Reduce.IllustratorContextImpl.getCurrentKey() |

Methods in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer with parameters of type PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
protected void | PigGenericMapReduce.Reduce.reduce(PigNullableWritable key, java.lang.Iterable<NullableTuple> tupIter, org.apache.hadoop.mapreduce.Reducer.Context context) The reduce function, which packages the key and List<Tuple> into a (key, Bag<Tuple>) pair after converting the Hadoop-typed key into its Pig type (see the sketch after this table). |
protected void | PigGenericMapReduce.ReduceWithComparator.reduce(PigNullableWritable key, java.lang.Iterable<NullableTuple> tupIter, org.apache.hadoop.mapreduce.Reducer.Context context) The reduce function, which packages the key and List<Tuple> into a (key, Bag<Tuple>) pair after converting the Hadoop-typed key into its Pig type. |
protected void | PigCombiner.Combine.reduce(PigNullableWritable key, java.lang.Iterable<NullableTuple> tupIter, org.apache.hadoop.mapreduce.Reducer.Context context) The reduce function, which packages the key and List<Tuple> into a (key, Bag<Tuple>) pair after converting the Hadoop-typed key into its Pig type. |
protected void | DistinctCombiner.Combine.reduce(PigNullableWritable key, java.lang.Iterable<NullableTuple> tupIter, org.apache.hadoop.mapreduce.Reducer.Context context) The reduce function, which discards the values, since only the distinct keys are needed. |
void | PigMapReduce.Reduce.IllustratorContextImpl.write(PigNullableWritable k, org.apache.hadoop.io.Writable t) |
void | PigMapBase.IllustratorContext.write(PigNullableWritable key, org.apache.hadoop.io.Writable value) |
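
The reduce descriptions above all boil down to the same conversion: unwrap the Hadoop-typed key and gather the Iterable<NullableTuple> into a Bag<Tuple>. The standalone helper below mimics that step outside of any reducer. It is a sketch: the method name is invented, and it assumes getValueAsPigType() returns the wrapped Pig value (the Tuple, in the case of NullableTuple).

```java
import org.apache.pig.data.BagFactory;
import org.apache.pig.data.DataBag;
import org.apache.pig.data.Tuple;
import org.apache.pig.impl.io.NullableTuple;
import org.apache.pig.impl.io.PigNullableWritable;
import org.apache.pig.impl.util.Pair;

public class ReducePackagingSketch {

    // Hypothetical helper mirroring what the reduce() descriptions above say:
    // unwrap the Hadoop-side key and gather the NullableTuples into a bag.
    static Pair<Object, DataBag> packageKeyAndBag(PigNullableWritable key,
                                                  Iterable<NullableTuple> tupIter) {
        // Convert the Hadoop-typed key back into its Pig value (null stays null).
        Object pigKey = key.isNull() ? null : key.getValueAsPigType();

        // Collect the wrapped values into a Bag<Tuple>.
        DataBag bag = BagFactory.getInstance().newDefaultBag();
        for (NullableTuple nt : tupIter) {
            bag.add((Tuple) nt.getValueAsPigType());
        }
        return new Pair<Object, DataBag>(pigKey, bag);
    }
}
```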

Method parameters in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer with type arguments of type PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
org.apache.hadoop.mapreduce.Mapper.Context | PigMapBase.getIllustratorContext(org.apache.hadoop.conf.Configuration conf, DataBag input, java.util.List<Pair<PigNullableWritable,org.apache.hadoop.io.Writable>> output, org.apache.hadoop.mapreduce.InputSplit split) Get the mapper's illustrator context. |
abstract org.apache.hadoop.mapreduce.Mapper.Context | PigGenericMapBase.getIllustratorContext(org.apache.hadoop.conf.Configuration conf, DataBag input, java.util.List<Pair<PigNullableWritable,org.apache.hadoop.io.Writable>> output, org.apache.hadoop.mapreduce.InputSplit split) |
org.apache.hadoop.mapreduce.Reducer.Context | PigMapReduce.Reduce.getIllustratorContext(org.apache.hadoop.mapred.jobcontrol.Job job, java.util.List<Pair<PigNullableWritable,org.apache.hadoop.io.Writable>> input, POPackage pkg) Get the reducer's illustrator context. |
abstract org.apache.hadoop.mapreduce.Reducer.Context | PigGenericMapReduce.Reduce.getIllustratorContext(org.apache.hadoop.mapred.jobcontrol.Job job, java.util.List<Pair<PigNullableWritable,org.apache.hadoop.io.Writable>> input, POPackage pkg) Get the reducer's illustrator context. |

Constructor parameters in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer with type arguments of type PigNullableWritable:

Constructor and Description |
---|
IllustratorContext(org.apache.hadoop.conf.Configuration conf, DataBag input, java.util.List<Pair<PigNullableWritable,org.apache.hadoop.io.Writable>> output, org.apache.hadoop.mapreduce.InputSplit split) |
IllustratorContextImpl(org.apache.hadoop.mapred.jobcontrol.Job job, java.util.List<Pair<PigNullableWritable,org.apache.hadoop.io.Writable>> input, POPackage pkg) |

Fields in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.partitioners declared as PigNullableWritable:

Modifier and Type | Field and Description |
---|---|
protected PigNullableWritable[] | WeightedRangePartitioner.quantiles |

Fields in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.partitioners with type parameters of type PigNullableWritable:

Modifier and Type | Field and Description |
---|---|
protected org.apache.hadoop.io.RawComparator<PigNullableWritable> | WeightedRangePartitioner.comparator |
protected java.util.Map<PigNullableWritable,DiscreteProbabilitySampleGenerator> | WeightedRangePartitioner.weightedParts |

Methods in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.partitioners that return PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
protected PigNullableWritable | WeightedRangePartitioner.getPigNullableWritable(Tuple t) |

Methods in org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.partitioners with parameters of type PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
int | WeightedRangePartitioner.getPartition(PigNullableWritable key, org.apache.hadoop.io.Writable value, int numPartitions) |
int | SkewedPartitioner.getPartition(PigNullableWritable wrappedKey, org.apache.hadoop.io.Writable value, int numPartitions) |
int | SecondaryKeyPartitioner.getPartition(PigNullableWritable key, org.apache.hadoop.io.Writable value, int numPartitions) |
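
For context on the getPartition signatures above, here is a minimal Hadoop partitioner over PigNullableWritable keys. It is not one of Pig's own partitioners: hashing the key and sending null keys to partition 0 are assumptions made purely for this sketch.

```java
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Partitioner;
import org.apache.pig.impl.io.PigNullableWritable;

// A hypothetical hash partitioner, shown only to illustrate the
// getPartition(key, value, numPartitions) contract shared by
// WeightedRangePartitioner, SkewedPartitioner and SecondaryKeyPartitioner.
public class SimplePigKeyPartitioner
        extends Partitioner<PigNullableWritable, Writable> {

    @Override
    public int getPartition(PigNullableWritable key, Writable value,
                            int numPartitions) {
        // Route null keys to a fixed partition (an assumption of this sketch,
        // not necessarily what Pig's own partitioners do).
        if (key.isNull()) {
            return 0;
        }
        // Spread the remaining keys by hash code.
        return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
    }
}
```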

Fields in org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators declared as PigNullableWritable:

Modifier and Type | Field and Description |
---|---|
protected PigNullableWritable | POPackage.keyWritable |

Methods in org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators with parameters of type PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
void | POPackage.attachInput(PigNullableWritable k, java.util.Iterator<NullableTuple> inp) Attaches the required inputs (see the sketch after this table). |
java.lang.Object | Packager.getKey(PigNullableWritable key) |
Tuple | Packager.getValueTuple(PigNullableWritable keyWritable, NullableTuple ntup, int index) |
Tuple | MultiQueryPackager.getValueTuple(PigNullableWritable keyWritable, NullableTuple ntup, int origIndex) |
Tuple | LitePackager.getValueTuple(PigNullableWritable keyWritable, NullableTuple ntup, int index) Uses the superclass method, but requires an additional parameter, the key, passed in by ReadOnceBag. |
Tuple | CombinerPackager.getValueTuple(PigNullableWritable keyWritable, NullableTuple ntup, int index) |
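
A hedged sketch of the attachInput call pattern listed above. The configured POPackage would normally come out of the compiled physical plan, so it is taken as a parameter here, and the downstream pull of the packaged (key, bag) tuples is left as a comment rather than shown.

```java
import java.util.Iterator;
import org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.POPackage;
import org.apache.pig.impl.io.NullableTuple;
import org.apache.pig.impl.io.PigNullableWritable;

public class PackageAttachSketch {

    // Illustrative only: the POPackage is assumed to be supplied by the
    // surrounding plan; this just shows the attachInput(...) call pattern.
    static void feedGroup(POPackage pkg,
                          PigNullableWritable key,
                          Iterator<NullableTuple> values) {
        // Attach the current key and its iterator of wrapped tuples.
        pkg.attachInput(key, values);

        // From here the operator is pulled like any other physical operator
        // and emits (key, bag) tuples built from the attached input.
    }
}
```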

Methods in org.apache.pig.backend.hadoop.executionengine.tez.runtime with parameters of type PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
int | WeightedRangePartitionerTez.getPartition(PigNullableWritable key, org.apache.hadoop.io.Writable value, int numPartitions) |

Fields in org.apache.pig.data declared as PigNullableWritable:

Modifier and Type | Field and Description |
---|---|
protected PigNullableWritable | ReadOnceBag.keyWritable |

Constructors in org.apache.pig.data with parameters of type PigNullableWritable:

Constructor and Description |
---|
ReadOnceBag(Packager pkgr, java.util.Iterator<NullableTuple> tupIter, PigNullableWritable keyWritable) This constructor creates a bag from an existing iterator of tuples by taking ownership of the iterator and NOT copying its elements (see the sketch after this table). |
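
To make the ownership note on the ReadOnceBag constructor concrete, the sketch below wraps an existing iterator of NullableTuples. The Packager and key are assumed to be supplied by the surrounding operator, so they are passed in rather than constructed here.

```java
import java.util.Iterator;
import org.apache.pig.backend.hadoop.executionengine.physicalLayer.relationalOperators.Packager;
import org.apache.pig.data.ReadOnceBag;
import org.apache.pig.impl.io.NullableTuple;
import org.apache.pig.impl.io.PigNullableWritable;

public class ReadOnceBagSketch {

    // Build a bag that streams straight off the reducer's value iterator.
    static ReadOnceBag wrap(Packager pkgr,
                            Iterator<NullableTuple> tupIter,
                            PigNullableWritable keyWritable) {
        // The bag takes ownership of tupIter and does NOT copy its elements,
        // so it can be iterated only once and tupIter must not be reused.
        return new ReadOnceBag(pkgr, tupIter, keyWritable);
    }
}
```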

Subclasses of PigNullableWritable in org.apache.pig.impl.io:

Modifier and Type | Class and Description |
---|---|
class | NullableBag |
class | NullableBigDecimalWritable |
class | NullableBigIntegerWritable |
class | NullableBooleanWritable |
class | NullableBytesWritable |
class | NullableDateTimeWritable |
class | NullableDoubleWritable |
class | NullableFloatWritable |
class | NullableIntWritable |
class | NullableLongWritable |
class | NullablePartitionWritable An adaptor class around PigNullableWritable that adds a partition index (see the sketch at the end of this page). |
class | NullableText |
class | NullableTuple |
class | NullableUnknownWritable This class can be used when the data type is unknown and a PigNullableWritable object is needed. |

Methods in org.apache.pig.impl.io that return PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
PigNullableWritable | PigNullableWritable.clone() |
PigNullableWritable | NullablePartitionWritable.getKey() |

Methods in org.apache.pig.impl.io with parameters of type PigNullableWritable:

Modifier and Type | Method and Description |
---|---|
void | NullablePartitionWritable.setKey(PigNullableWritable k) |

Constructors in org.apache.pig.impl.io with parameters of type PigNullableWritable:

Constructor and Description |
---|
NullablePartitionWritable(PigNullableWritable k) |
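
Finally, a small sketch of the NullablePartitionWritable adaptor described earlier, using only the constructor, setKey and getKey listed on this page. The chararray key, the HDataType call and the use of getValueAsPigType() to read the wrapped value back are illustrative assumptions, not an excerpt from the Pig sources.

```java
import org.apache.pig.backend.hadoop.HDataType;
import org.apache.pig.data.DataType;
import org.apache.pig.impl.io.NullablePartitionWritable;
import org.apache.pig.impl.io.PigNullableWritable;

public class NullablePartitionWritableSketch {
    public static void main(String[] args) throws Exception {
        // Wrap a chararray value as a PigNullableWritable.
        PigNullableWritable key =
                HDataType.getWritableComparableTypes("some-key", DataType.CHARARRAY);

        // Adapt it so a partition index can travel alongside the key
        // (the index itself is managed by Pig's partitioning machinery).
        NullablePartitionWritable partitioned = new NullablePartitionWritable(key);

        // The wrapped key can be replaced or read back at any time.
        partitioned.setKey(key);
        System.out.println(partitioned.getKey().getValueAsPigType());  // prints the wrapped value
    }
}
```
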
Copyright © 2007-2017 The Apache Software Foundation