Spark: checking whether an S3 path exists

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.hadoop.fs.{FileSystem, Path}
import java.net.URI

val sc = new SparkContext(new SparkConf().setAppName("AppName"))
// Pass the AWS credentials to the Hadoop s3n connector
sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "ACCESS_KEY")
sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "SECRET_ACCESS_KEY")
val textFile = sc.textFile("s3n://bucket/source_path")
textFile.saveAsTextFile("s3n://bucket/target_path")
// Check whether a given S3 path exists via the Hadoop FileSystem API
FileSystem.get(new URI("s3n://bucket"), sc.hadoopConfiguration).exists(new Path("s3n://bucket/path_to_check"))
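
If you want the existence check to guard the read, you can wrap it in a small helper and only touch the source path when it is actually there. This is just an illustrative sketch; the helper name s3PathExists and the conditional copy are assumptions, not part of the original snippet.

import org.apache.spark.SparkContext
import org.apache.hadoop.fs.{FileSystem, Path}
import java.net.URI

// Hypothetical helper: true if the given S3 path exists
def s3PathExists(sc: SparkContext, path: String): Boolean = {
  val fs = FileSystem.get(new URI(path), sc.hadoopConfiguration)
  fs.exists(new Path(path))
}

// Only read and write when the source path is present
if (s3PathExists(sc, "s3n://bucket/source_path")) {
  sc.textFile("s3n://bucket/source_path").saveAsTextFile("s3n://bucket/target_path")
}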
