1. org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to create file /user/bjdata/user/wuyb/semv/SemAAJob_3/calsigma/_temporary/_attempt_201306261152_42270_r_000000_0/part-r-00000 for DFSClient_attempt_201306261152_42270_r_000000_0 on client 10.0.2.79 because current leaseholder is trying to recreate file.
Fix:
MultipleOutputs.addNamedOutput(job, CalSigmaReducer.RECORD_OUT, MultipleTextOutputFormat.class, Text.class, NullWritable.class);
The named output was originally registered with TextOutputFormat; switching it to MultipleTextOutputFormat resolved the lease conflict.
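A minimal sketch of the driver-side registration, assuming the classic mapred API (where MultipleTextOutputFormat lives); the job name and named-output string here are placeholders, not from the original job:

```java
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.lib.MultipleOutputs;
import org.apache.hadoop.mapred.lib.MultipleTextOutputFormat;

// Hypothetical driver fragment. Registering the named output with
// MultipleTextOutputFormat (instead of TextOutputFormat) keeps each named
// output writing to its own file, avoiding two writers contending for the
// same part-r-* path and triggering AlreadyBeingCreatedException.
public class CalSigmaDriver {
    public static void configureOutputs(JobConf conf) {
        MultipleOutputs.addNamedOutput(conf, "record",          // placeholder name
                MultipleTextOutputFormat.class, Text.class, NullWritable.class);
    }
}
```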
2. When using a Thrift-generated class as an output type: java.lang.IllegalStateException: Runtime parameterized Protobuf/Thrift class is unkonwn. This object was probably created with default constructor. Please use setConverter(Class).
MultipleOutputs.addNamedOutput(job, CalSigmaReducer.RECORD_OUT2, MultipleSequenceFileOutputFormat.class, ThriftWritable.class, NullWritable.class);
The intent was to use MultipleSequenceFileOutputFormat as the output format with ThriftWritable as the key type. When Hadoop instantiates the key class, it calls ThriftWritable's default constructor, but ThriftWritable requires the wrapped Thrift type to be set, which the default constructor cannot do. The result is a ThriftWritable object with no wrapped type, hence the exception above.
Fix: wrote a utility class that converts the Thrift-generated object to a String, so the Thrift-generated classes themselves stay untouched.
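The post doesn't show the utility class; one possible sketch of the idea, assuming libthrift is on the classpath, serializes the Thrift object to JSON text so it can be emitted through an ordinary Text output instead of ThriftWritable:

```java
import org.apache.thrift.TBase;
import org.apache.thrift.TException;
import org.apache.thrift.TSerializer;
import org.apache.thrift.protocol.TSimpleJSONProtocol;

// Hypothetical helper: turns any Thrift-generated object into a String.
// The Thrift-generated classes are not modified; conversion happens outside them.
public final class ThriftToString {
    private ThriftToString() {}

    public static String toJson(TBase<?, ?> record) throws TException {
        // TSimpleJSONProtocol produces human-readable, write-only JSON;
        // use TJSONProtocol instead if the text must be parsed back into objects.
        return new TSerializer(new TSimpleJSONProtocol.Factory()).toString(record);
    }
}
```

With the key rendered as a String, the named output can stay on a plain Text-based format and Hadoop never needs to instantiate ThriftWritable via its default constructor.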
3. Forgot to close the MultipleOutputs instance after use, and some output files ended up empty.
Fix: call mos.close() in the reducer's cleanup() method.
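A reducer skeleton illustrating the fix, written against the newer mapreduce API for brevity (the class name matches the post; the types and named-output string are assumptions):

```java
import java.io.IOException;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.output.MultipleOutputs;

// Hypothetical reducer skeleton. Without mos.close(), MultipleOutputs'
// buffered record writers are never flushed, so some part files stay empty.
public class CalSigmaReducer extends Reducer<Text, Text, Text, NullWritable> {
    private MultipleOutputs<Text, NullWritable> mos;

    @Override
    protected void setup(Context context) {
        mos = new MultipleOutputs<>(context);
    }

    @Override
    protected void reduce(Text key, Iterable<Text> values, Context context)
            throws IOException, InterruptedException {
        mos.write("record", key, NullWritable.get());   // "record" is a placeholder name
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        mos.close();   // flushes and closes every named-output writer
    }
}
```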