Troubleshooting common Hudi write errors on HDFS

1

Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=S, access=WRITE, inode="/user/hudi":root:supergroup:drwxr-xr-x

hadoop fs -chmod -R 777 /user/hudi
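The job runs as local user S, which has no write permission on /user/hudi (owned by root); the chmod above simply opens the path up. If 777 is too broad, a common alternative, assuming the cluster uses simple authentication rather than Kerberos, is to run the HDFS client as the directory owner. A minimal sketch:

// Must run before the SparkSession (and its HDFS client) is created.
// "root" matches the owner shown in the exception; adjust for your cluster.
System.setProperty("HADOOP_USER_NAME", "root")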

2

Hudi requires org.apache.spark.serializer.KryoSerializer as the Spark serializer; otherwise the write fails with:

hoodie only support org.apache.spark.serializer.KryoSerializer as spark.serializer

import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder()
  .appName("JsonDataGenerator")
  .master("local")
  .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .getOrCreate()

3

Exception in thread "main" org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/hudi/.hoodie/hoodie.properties could only be written to 0 of the 1 minReplication nodes. There are 3 datanode(s) running and 3 node(s) are excluded in this operation.

1 hdfs dfsadmin -report  to check that the datanodes are alive and have free space

2 hdfs getconf -confKey dfs.replication  to check the current replication factor (see the sketch after this list)

3 hdfs cacheadmin -addPool mycache  to add a cache pool

4 hdfs cacheadmin -modifyPool -pool mycache -expiryMs 86400000  to set the cache pool expiry to one day
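A possible stopgap (my assumption, not from the original post): if step 1 shows fewer healthy, reachable datanodes than the replication factor from step 2, the write can be retried with a lower client-side replication factor while the cluster is repaired. All three datanodes being excluded even though they are running usually points to full disks or the client being unable to reach the datanode ports.

// Hypothetical workaround: lower the replication factor for this client only,
// so the NameNode can place every replica on the datanodes that are usable.
// `spark` is the SparkSession built in the Kryo snippet above.
spark.sparkContext.hadoopConfiguration.set("dfs.replication", "1")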

4

Exception in thread "main" java.lang.IllegalArgumentException: hoodie. properties file seems invalid.  Please check for left over `. updated` files if any, manually copy it to hoodie. properties and retry

The hoodie.properties file under the table's .hoodie directory is corrupted. As the message suggests, find a good copy (for example a leftover `.updated` file) and copy it over hoodie.properties, then retry.
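A minimal sketch of that copy step, assuming the table base path is /user/hudi (as in the earlier errors) and that a leftover file named hoodie.properties.updated exists; the NameNode URI and the exact file name are assumptions, so list the .hoodie directory first.

import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, FileUtil, Path}

// Paths and the NameNode address are assumptions; adjust to your cluster.
val conf = new Configuration()
val fs = FileSystem.get(new URI("hdfs://namenode:8020"), conf)
val updated = new Path("/user/hudi/.hoodie/hoodie.properties.updated")
val target  = new Path("/user/hudi/.hoodie/hoodie.properties")

if (fs.exists(updated)) {
  // deleteSource = false keeps the .updated file; overwrite = true replaces the broken copy
  FileUtil.copy(fs, updated, fs, target, false, true, conf)
}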
