ArrayWritable as key in Hadoop MapReduce

I am attempting to build a dynamic MapReduce application that takes its dimensions from an external properties file. The main problem is that the key will be composite and may contain any number of fields, e.g. a composite of 3 keys, 4 keys, etc.

My Mapper:

public void map(AvroKey<flumeLogs> key, NullWritable value, Context context) throws IOException, InterruptedException{
    Configuration conf = context.getConfiguration();
    int dimensionCount = Integer.parseInt(conf.get("dimensionCount"));
    String[] dimensions = conf.get("dimensions").split(","); //this gets the dimensions from the run method in main

    Text[] values = new Text[dimensionCount]; //This is supposed to be my composite key

    for (int i = 0; i < dimensionCount; i++){
        switch (dimensions[i]){
            case "region":    values[i] = new Text("-"); break;
            case "event":     values[i] = new Text("-"); break;
            case "eventCode": values[i] = new Text("-"); break;
            case "mobile":    values[i] = new Text("-"); break;
        }
    }
    context.write(new StringArrayWritable(values), new IntWritable(1));
}


The values will have good logic later.

My StringArrayWritable:

public class StringArrayWritable extends ArrayWritable {
    public StringArrayWritable() {
        super(Text.class);
    }

    public StringArrayWritable(Text[] values){
        super(Text.class, values);
    }

    @Override
    public String toString(){
        StringBuilder sb = new StringBuilder();

        for(String s : super.toStrings()){
            sb.append(s).append("\t");
        }
        return sb.toString();
    }
}

The error I am getting:

    Error: Initialization of all the collectors failed. Error in last collector was :class StringArrayWritable
    at org.apache.hadoop.mapred.MapTask.createSortingCollector(
    at org.apache.hadoop.mapred.MapTask.access$100(
    at org.apache.hadoop.mapred.MapTask$NewOutputCollector.<init>(
    at org.apache.hadoop.mapred.MapTask.runNewMapper(
    at org.apache.hadoop.mapred.YarnChild$
    at Method)
    at org.apache.hadoop.mapred.YarnChild.main(
Caused by: java.lang.ClassCastException: class StringArrayWritable
    at java.lang.Class.asSubclass(
    at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.init(
    at org.apache.hadoop.mapred.MapTask.createSortingCollector(
    ... 9 more

Any help would be greatly appreciated.

Thanks a lot.


You're trying to use a plain Writable object as the key. In MapReduce the key must implement the WritableComparable interface, but ArrayWritable only implements the Writable interface. That is why the sort collector throws a ClassCastException when it tries to obtain a key comparator for your class.

The difference between the two is that WritableComparable requires you to implement a compareTo method, so that MapReduce is able to sort and group the keys correctly.
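As a minimal sketch of what that compareTo needs to do for a composite string key: compare the fields pairwise, left to right, and fall back to the number of fields only when all shared fields match. The logic is shown here as a plain static method (the class and method names are mine, chosen so the snippet stands alone); in your actual key class it would live inside `compareTo(StringArrayWritable o)`, with the serialization inherited from ArrayWritable.

```java
// Hypothetical helper illustrating the ordering a composite key's
// compareTo must define: lexicographic, field by field.
public class CompositeKeyCompare {
    static int compareKeys(String[] a, String[] b) {
        int n = Math.min(a.length, b.length);
        for (int i = 0; i < n; i++) {
            int cmp = a[i].compareTo(b[i]);
            if (cmp != 0) {
                return cmp; // first differing field decides the order
            }
        }
        // all shared fields equal: the shorter key sorts first
        return Integer.compare(a.length, b.length);
    }
}
```

With an ordering like this in place (and hashCode/equals kept consistent with it for partitioning), the shuffle can sort and group your composite keys no matter how many dimensions the properties file specifies.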
