Some problems I ran into recently while writing Flink on YARN programs

1. Compile failure caused by a UDF

import java.text.SimpleDateFormat
import java.util.Calendar

import org.apache.flink.table.functions.ScalarFunction

class GetDay extends ScalarFunction {
  // Never define this as a class field: it works fine when debugging
  // locally, but the job fails to compile when it runs on YARN.
  // val simpleDateFormat = new SimpleDateFormat("yyyyMMdd")
  def eval(): String = {
    val simpleDateFormat = new SimpleDateFormat("yyyyMMdd")
    val calendar = Calendar.getInstance
    simpleDateFormat.format(calendar.getTime)
  }
}
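The corrected eval() body can be exercised standalone, without Flink on the classpath (a minimal sketch; GetDayDemo is a hypothetical name, not part of the original job):

```scala
import java.text.SimpleDateFormat
import java.util.Calendar

// Standalone sketch of the eval() body above; constructing the
// SimpleDateFormat inside the method is the variant that works on YARN.
object GetDayDemo {
  def eval(): String = {
    val simpleDateFormat = new SimpleDateFormat("yyyyMMdd")
    simpleDateFormat.format(Calendar.getInstance.getTime)
  }

  def main(args: Array[String]): Unit = {
    // yyyyMMdd always yields an 8-digit string such as 20190612
    println(eval())
  }
}
```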

2. Problem caused by class-loading order

Caused by: java.lang.ClassCastException: cannot assign instance of org.apache.commons.collections.map.LinkedMap to 
field org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.pendingOffsetsToCommit of type 
org.apache.commons.collections.map.LinkedMap in instance of org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010

Fixed by setting the class-loading resolve order in conf/flink-conf.yaml:

classloader.resolve-order: parent-first

3. Error when casting to type 'string'
Caused by: org.apache.calcite.sql.validate.SqlValidatorException: Unknown datatype name 'string'
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
  at org.apache.calcite.runtime.Resources$ExInstWithCause.ex(Resources.java:463)
  at org.apache.calcite.runtime.Resources$ExInst.ex(Resources.java:572)
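The Calcite parser bundled with older Flink releases does not recognize STRING as a datatype name in CAST; casting to VARCHAR works instead (a hedged sketch — the table and column names below are hypothetical):

```sql
-- Rejected by the parser: Unknown datatype name 'string'
SELECT CAST(user_id AS STRING) FROM clicks;

-- Accepted:
SELECT CAST(user_id AS VARCHAR) FROM clicks;
```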

4. Kafka dependency problem
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.DeserializationSchemaFactory' in
the classpath.

Reason: No factory implements 'org.apache.flink.table.factories.DeserializationSchemaFactory'.

Fixed by placing the Kafka connector jars under Flink's lib directory: both the 0.9 and 0.10 connectors (the 0.10 connector depends on the 0.9 one), the flink-connector-kafka-base jar, and flink-json.
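The lib directory then ends up containing jars along these lines (a sketch; the version numbers and Scala suffix are illustrative, not taken from the original setup):

```
# Contents of $FLINK_HOME/lib (versions are hypothetical)
flink-connector-kafka-0.9_2.11-<version>.jar    # required: 0.10 depends on 0.9
flink-connector-kafka-0.10_2.11-<version>.jar
flink-connector-kafka-base_2.11-<version>.jar
flink-json-<version>.jar
```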


5. Version conflict when writing to Elasticsearch
Caused by: [es_index_poc/A8kEJoRzQHeGpqlN_0JdOA][[es_index_poc][0]] VersionConflictEngineException[[es_type][32698248814793]: version conflict, current version [20] is different than the one provided [19]]

The primary key must be unique.
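One way to guarantee this is to build the Elasticsearch document id from every field that makes a row unique, so two distinct rows never write to the same _id and race on the document version (a sketch; object and field names are hypothetical):

```scala
// Sketch: compose a unique document id from the full logical key.
object EsDocId {
  def docId(userId: String, eventTimeMs: Long): String =
    s"$userId-$eventTimeMs"
}
```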
