As usual, this was need-driven learning: a reader left the following question on part five of my Flume notes:

"While reading a file, I want to put the file's name and the date it was generated into the event headers, so that when the events reach HDFS, different events land in different directories. For example, a file named a.log.2014-07-25 should be stored under /a/2014-07-25 on HDFS, and a.log.2014-07-26 under /a/2014-07-26; every file gets its own directory. How can this be implemented?"

With that question in mind, I went back through the official documentation and found that the spooling directory source comes close: it watches a given directory for newly written files and, when one appears, hands its contents to the sink and then renames the file with the completed suffix (.COMPLETED by default) to mark it as processed. It also provides parameters for adding the file's base name and full path to the event headers.

The existing functionality doesn't meet our requirement, but it at least points in the right direction: it can already put the file name into a header!

I was hoping the source code wouldn't be too complex, so that a small change in that one spot (splitting the file name apart before putting the pieces into the headers) would give us exactly the feature we want.

So I opened the source. Sure enough, it isn't complicated, and the code structure is very clear. Following that idea, a small modification was all it took. The changes involve the three classes behind the spooling directory source, each renamed with an Ext suffix: ReliableSpoolingFileEventExtReader, SpoolDirectorySourceConfigurationExtConstants, and SpoolDirectoryExtSource. The code follows.

First, implementing a custom source against the Flume NG interfaces requires two Flume NG dependencies, flume-ng-core and flume-ng-configuration. The Maven configuration:
<dependency>
    <groupId>org.apache.flume</groupId>
    <artifactId>flume-ng-core</artifactId>
    <version>1.6.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flume</groupId>
    <artifactId>flume-ng-configuration</artifactId>
    <version>1.6.0</version>
</dependency>

The first of the three classes, ReliableSpoolingFileEventExtReader:

- package com.besttone.flume;
-
- import java.io.File;
- import java.io.FileFilter;
- import java.io.FileNotFoundException;
- import java.io.IOException;
- import java.nio.charset.Charset;
- import java.util.Arrays;
- import java.util.Collections;
- import java.util.Comparator;
- import java.util.List;
- import java.util.regex.Matcher;
- import java.util.regex.Pattern;
-
- import org.apache.flume.Context;
- import org.apache.flume.Event;
- import org.apache.flume.FlumeException;
- import org.apache.flume.annotations.InterfaceAudience;
- import org.apache.flume.annotations.InterfaceStability;
- import org.apache.flume.client.avro.ReliableEventReader;
- import org.apache.flume.serialization.DecodeErrorPolicy;
- import org.apache.flume.serialization.DurablePositionTracker;
- import org.apache.flume.serialization.EventDeserializer;
- import org.apache.flume.serialization.EventDeserializerFactory;
- import org.apache.flume.serialization.PositionTracker;
- import org.apache.flume.serialization.ResettableFileInputStream;
- import org.apache.flume.serialization.ResettableInputStream;
- import org.apache.flume.tools.PlatformDetect;
- import org.joda.time.DateTime;
- import org.joda.time.format.DateTimeFormat;
- import org.joda.time.format.DateTimeFormatter;
- import org.slf4j.Logger;
- import org.slf4j.LoggerFactory;
-
- import com.google.common.base.Charsets;
- import com.google.common.base.Optional;
- import com.google.common.base.Preconditions;
- import com.google.common.io.Files;
-
-
- @InterfaceAudience.Private
- @InterfaceStability.Evolving
- public class ReliableSpoolingFileEventExtReader implements ReliableEventReader {
-
- private static final Logger logger = LoggerFactory
- .getLogger(ReliableSpoolingFileEventExtReader.class);
-
- static final String metaFileName = ".flumespool-main.meta";
-
- private final File spoolDirectory;
- private final String completedSuffix;
- private final String deserializerType;
- private final Context deserializerContext;
- private final Pattern ignorePattern;
- private final File metaFile;
- private final boolean annotateFileName;
- private final boolean annotateBaseName;
- private final String fileNameHeader;
- private final String baseNameHeader;
-
-
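- // Fields added for the two extensions: regex-based extraction of a value
- // from the file name, and splitting the file name into multiple headers.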
- private final boolean annotateFileNameExtractor;
- private final String fileNameExtractorHeader;
- private final Pattern fileNameExtractorPattern;
- private final boolean convertToTimestamp;
- private final String dateTimeFormat;
-
- private final boolean splitFileName;
- private final String splitBy;
- private final String splitBaseNameHeader;
-
-
- private final String deletePolicy;
- private final Charset inputCharset;
- private final DecodeErrorPolicy decodeErrorPolicy;
-
- private Optional<FileInfo> currentFile = Optional.absent();
-
- private Optional<FileInfo> lastFileRead = Optional.absent();
- private boolean committed = true;
-
-
-
-
- private ReliableSpoolingFileEventExtReader(File spoolDirectory,
- String completedSuffix, String ignorePattern,
- String trackerDirPath, boolean annotateFileName,
- String fileNameHeader, boolean annotateBaseName,
- String baseNameHeader, String deserializerType,
- Context deserializerContext, String deletePolicy,
- String inputCharset, DecodeErrorPolicy decodeErrorPolicy,
- boolean annotateFileNameExtractor, String fileNameExtractorHeader,
- String fileNameExtractorPattern, boolean convertToTimestamp,
- String dateTimeFormat, boolean splitFileName, String splitBy,
- String splitBaseNameHeader) throws IOException {
-
-
- Preconditions.checkNotNull(spoolDirectory);
- Preconditions.checkNotNull(completedSuffix);
- Preconditions.checkNotNull(ignorePattern);
- Preconditions.checkNotNull(trackerDirPath);
- Preconditions.checkNotNull(deserializerType);
- Preconditions.checkNotNull(deserializerContext);
- Preconditions.checkNotNull(deletePolicy);
- Preconditions.checkNotNull(inputCharset);
-
-
- if (!deletePolicy.equalsIgnoreCase(DeletePolicy.NEVER.name())
- && !deletePolicy
- .equalsIgnoreCase(DeletePolicy.IMMEDIATE.name())) {
- throw new IllegalArgumentException("Delete policies other than "
- + "NEVER and IMMEDIATE are not yet supported");
- }
-
- if (logger.isDebugEnabled()) {
- logger.debug("Initializing {} with directory={}, metaDir={}, "
- + "deserializer={}", new Object[] {
- ReliableSpoolingFileEventExtReader.class.getSimpleName(),
- spoolDirectory, trackerDirPath, deserializerType });
- }
-
-
- Preconditions
- .checkState(
- spoolDirectory.exists(),
- "Directory does not exist: "
- + spoolDirectory.getAbsolutePath());
- Preconditions.checkState(spoolDirectory.isDirectory(),
- "Path is not a directory: " + spoolDirectory.getAbsolutePath());
-
-
- try {
- File canary = File.createTempFile("flume-spooldir-perm-check-",
- ".canary", spoolDirectory);
- Files.write("testing flume file permissions\n", canary,
- Charsets.UTF_8);
- List<String> lines = Files.readLines(canary, Charsets.UTF_8);
- Preconditions.checkState(!lines.isEmpty(), "Empty canary file %s",
- canary);
- if (!canary.delete()) {
- throw new IOException("Unable to delete canary file " + canary);
- }
- logger.debug("Successfully created and deleted canary file: {}",
- canary);
- } catch (IOException e) {
- throw new FlumeException("Unable to read and modify files"
- + " in the spooling directory: " + spoolDirectory, e);
- }
-
- this.spoolDirectory = spoolDirectory;
- this.completedSuffix = completedSuffix;
- this.deserializerType = deserializerType;
- this.deserializerContext = deserializerContext;
- this.annotateFileName = annotateFileName;
- this.fileNameHeader = fileNameHeader;
- this.annotateBaseName = annotateBaseName;
- this.baseNameHeader = baseNameHeader;
- this.ignorePattern = Pattern.compile(ignorePattern);
- this.deletePolicy = deletePolicy;
- this.inputCharset = Charset.forName(inputCharset);
- this.decodeErrorPolicy = Preconditions.checkNotNull(decodeErrorPolicy);
-
-
- this.annotateFileNameExtractor = annotateFileNameExtractor;
- this.fileNameExtractorHeader = fileNameExtractorHeader;
- this.fileNameExtractorPattern = Pattern
- .compile(fileNameExtractorPattern);
- this.convertToTimestamp = convertToTimestamp;
- this.dateTimeFormat = dateTimeFormat;
-
- this.splitFileName = splitFileName;
- this.splitBy = splitBy;
- this.splitBaseNameHeader = splitBaseNameHeader;
-
-
- File trackerDirectory = new File(trackerDirPath);
-
-
- if (!trackerDirectory.isAbsolute()) {
- trackerDirectory = new File(spoolDirectory, trackerDirPath);
- }
-
-
- if (!trackerDirectory.exists()) {
- if (!trackerDirectory.mkdir()) {
- throw new IOException(
- "Unable to mkdir nonexistent meta directory "
- + trackerDirectory);
- }
- }
-
-
- if (!trackerDirectory.isDirectory()) {
- throw new IOException("Specified meta directory is not a directory"
- + trackerDirectory);
- }
-
- this.metaFile = new File(trackerDirectory, metaFileName);
- }
-
-
- public String getLastFileRead() {
- if (!lastFileRead.isPresent()) {
- return null;
- }
- return lastFileRead.get().getFile().getAbsolutePath();
- }
-
-
- public Event readEvent() throws IOException {
- List<Event> events = readEvents(1);
- if (!events.isEmpty()) {
- return events.get(0);
- } else {
- return null;
- }
- }
-
- public List<Event> readEvents(int numEvents) throws IOException {
- if (!committed) {
- if (!currentFile.isPresent()) {
- throw new IllegalStateException("File should not roll when "
- + "commit is outstanding.");
- }
- logger.info("Last read was never committed - resetting mark position.");
- currentFile.get().getDeserializer().reset();
- } else {
-
- if (!currentFile.isPresent()) {
- currentFile = getNextFile();
- }
-
- if (!currentFile.isPresent()) {
- return Collections.emptyList();
- }
- }
-
- EventDeserializer des = currentFile.get().getDeserializer();
- List<Event> events = des.readEvents(numEvents);
-
- if (events.isEmpty()) {
- retireCurrentFile();
- currentFile = getNextFile();
- if (!currentFile.isPresent()) {
- return Collections.emptyList();
- }
- events = currentFile.get().getDeserializer().readEvents(numEvents);
- }
-
- if (annotateFileName) {
- String filename = currentFile.get().getFile().getAbsolutePath();
- for (Event event : events) {
- event.getHeaders().put(fileNameHeader, filename);
- }
- }
-
- if (annotateBaseName) {
- String basename = currentFile.get().getFile().getName();
- for (Event event : events) {
- event.getHeaders().put(baseNameHeader, basename);
- }
- }
-
-
-
-
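- // Extension 1: match the configured regex against the file name and, if it
- // matches, write the matched value (optionally converted to a timestamp
- // via Joda-Time) into the configured header of every event.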
- if (annotateFileNameExtractor) {
-
- Matcher matcher = fileNameExtractorPattern.matcher(currentFile
- .get().getFile().getName());
-
- if (matcher.find()) {
- String value = matcher.group();
- if (convertToTimestamp) {
- DateTimeFormatter formatter = DateTimeFormat
- .forPattern(dateTimeFormat);
- DateTime dateTime = formatter.parseDateTime(value);
-
- value = Long.toString(dateTime.getMillis());
- }
-
- for (Event event : events) {
- event.getHeaders().put(fileNameExtractorHeader, value);
- }
- }
-
- }
-
-
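- // Extension 2: split the file base name on the configured separator and
- // write each part to its own numbered header (prefix + index).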
- if (splitFileName) {
- String[] splits = currentFile.get().getFile().getName()
- .split(splitBy);
-
- for (Event event : events) {
- for (int i = 0; i < splits.length; i++) {
- event.getHeaders().put(splitBaseNameHeader + i, splits[i]);
- }
-
- }
-
- }
-
-
-
- committed = false;
- lastFileRead = currentFile;
- return events;
- }
-
- @Override
- public void close() throws IOException {
- if (currentFile.isPresent()) {
- currentFile.get().getDeserializer().close();
- currentFile = Optional.absent();
- }
- }
-
-
- @Override
- public void commit() throws IOException {
- if (!committed && currentFile.isPresent()) {
- currentFile.get().getDeserializer().mark();
- committed = true;
- }
- }
-
- private void retireCurrentFile() throws IOException {
- Preconditions.checkState(currentFile.isPresent());
-
- File fileToRoll = new File(currentFile.get().getFile()
- .getAbsolutePath());
-
- currentFile.get().getDeserializer().close();
-
-
- if (fileToRoll.lastModified() != currentFile.get().getLastModified()) {
- String message = "File has been modified since being read: "
- + fileToRoll;
- throw new IllegalStateException(message);
- }
- if (fileToRoll.length() != currentFile.get().getLength()) {
- String message = "File has changed size since being read: "
- + fileToRoll;
- throw new IllegalStateException(message);
- }
-
- if (deletePolicy.equalsIgnoreCase(DeletePolicy.NEVER.name())) {
- rollCurrentFile(fileToRoll);
- } else if (deletePolicy.equalsIgnoreCase(DeletePolicy.IMMEDIATE.name())) {
- deleteCurrentFile(fileToRoll);
- } else {
-
- throw new IllegalArgumentException("Unsupported delete policy: "
- + deletePolicy);
- }
- }
-
- private void rollCurrentFile(File fileToRoll) throws IOException {
-
- File dest = new File(fileToRoll.getPath() + completedSuffix);
- logger.info("Preparing to move file {} to {}", fileToRoll, dest);
-
-
- if (dest.exists() && PlatformDetect.isWindows()) {
-
- if (Files.equal(currentFile.get().getFile(), dest)) {
- logger.warn("Completed file " + dest
- + " already exists, but files match, so continuing.");
- boolean deleted = fileToRoll.delete();
- if (!deleted) {
- logger.error("Unable to delete file "
- + fileToRoll.getAbsolutePath()
- + ". It will likely be ingested another time.");
- }
- } else {
- String message = "File name has been re-used with different"
- + " files. Spooling assumptions violated for " + dest;
- throw new IllegalStateException(message);
- }
-
-
- } else if (dest.exists()) {
- String message = "File name has been re-used with different"
- + " files. Spooling assumptions violated for " + dest;
- throw new IllegalStateException(message);
-
-
- } else {
- boolean renamed = fileToRoll.renameTo(dest);
- if (renamed) {
- logger.debug("Successfully rolled file {} to {}", fileToRoll,
- dest);
-
-
- deleteMetaFile();
- } else {
-
- String message = "Unable to move "
- + fileToRoll
- + " to "
- + dest
- + ". This will likely cause duplicate events. Please verify that "
- + "flume has sufficient permissions to perform these operations.";
- throw new FlumeException(message);
- }
- }
- }
-
- private void deleteCurrentFile(File fileToDelete) throws IOException {
- logger.info("Preparing to delete file {}", fileToDelete);
- if (!fileToDelete.exists()) {
- logger.warn("Unable to delete nonexistent file: {}", fileToDelete);
- return;
- }
- if (!fileToDelete.delete()) {
- throw new IOException("Unable to delete spool file: "
- + fileToDelete);
- }
-
- deleteMetaFile();
- }
-
- private Optional<FileInfo> getNextFile() {
-
- FileFilter filter = new FileFilter() {
- public boolean accept(File candidate) {
- String fileName = candidate.getName();
- if ((candidate.isDirectory())
- || (fileName.endsWith(completedSuffix))
- || (fileName.startsWith("."))
- || ignorePattern.matcher(fileName).matches()) {
- return false;
- }
- return true;
- }
- };
- List<File> candidateFiles = Arrays.asList(spoolDirectory
- .listFiles(filter));
- if (candidateFiles.isEmpty()) {
- return Optional.absent();
- } else {
- Collections.sort(candidateFiles, new Comparator<File>() {
- public int compare(File a, File b) {
- int timeComparison = Long.compare(a.lastModified(),
- b.lastModified());
- if (timeComparison != 0) {
- return timeComparison;
- } else {
- return a.getName().compareTo(b.getName());
- }
- }
- });
- File nextFile = candidateFiles.get(0);
- try {
-
- String nextPath = nextFile.getPath();
- PositionTracker tracker = DurablePositionTracker.getInstance(
- metaFile, nextPath);
- if (!tracker.getTarget().equals(nextPath)) {
- tracker.close();
- deleteMetaFile();
- tracker = DurablePositionTracker.getInstance(metaFile,
- nextPath);
- }
-
-
- Preconditions
- .checkState(
- tracker.getTarget().equals(nextPath),
- "Tracker target %s does not equal expected filename %s",
- tracker.getTarget(), nextPath);
-
- ResettableInputStream in = new ResettableFileInputStream(
- nextFile, tracker,
- ResettableFileInputStream.DEFAULT_BUF_SIZE,
- inputCharset, decodeErrorPolicy);
- EventDeserializer deserializer = EventDeserializerFactory
- .getInstance(deserializerType, deserializerContext, in);
-
- return Optional.of(new FileInfo(nextFile, deserializer));
- } catch (FileNotFoundException e) {
-
- logger.warn("Could not find file: " + nextFile, e);
- return Optional.absent();
- } catch (IOException e) {
- logger.error("Exception opening file: " + nextFile, e);
- return Optional.absent();
- }
- }
- }
-
- private void deleteMetaFile() throws IOException {
- if (metaFile.exists() && !metaFile.delete()) {
- throw new IOException("Unable to delete old meta file " + metaFile);
- }
- }
-
-
- private static class FileInfo {
- private final File file;
- private final long length;
- private final long lastModified;
- private final EventDeserializer deserializer;
-
- public FileInfo(File file, EventDeserializer deserializer) {
- this.file = file;
- this.length = file.length();
- this.lastModified = file.lastModified();
- this.deserializer = deserializer;
- }
-
- public long getLength() {
- return length;
- }
-
- public long getLastModified() {
- return lastModified;
- }
-
- public EventDeserializer getDeserializer() {
- return deserializer;
- }
-
- public File getFile() {
- return file;
- }
- }
-
- @InterfaceAudience.Private
- @InterfaceStability.Unstable
- static enum DeletePolicy {
- NEVER, IMMEDIATE, DELAY
- }
-
-
-
-
- public static class Builder {
- private File spoolDirectory;
- private String completedSuffix = SpoolDirectorySourceConfigurationExtConstants.SPOOLED_FILE_SUFFIX;
- private String ignorePattern = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_IGNORE_PAT;
- private String trackerDirPath = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_TRACKER_DIR;
- private Boolean annotateFileName = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_FILE_HEADER;
- private String fileNameHeader = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_FILENAME_HEADER_KEY;
- private Boolean annotateBaseName = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_BASENAME_HEADER;
- private String baseNameHeader = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_BASENAME_HEADER_KEY;
- private String deserializerType = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_DESERIALIZER;
- private Context deserializerContext = new Context();
- private String deletePolicy = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_DELETE_POLICY;
- private String inputCharset = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_INPUT_CHARSET;
- private DecodeErrorPolicy decodeErrorPolicy = DecodeErrorPolicy
- .valueOf(SpoolDirectorySourceConfigurationExtConstants.DEFAULT_DECODE_ERROR_POLICY
- .toUpperCase());
-
-
-
- private Boolean annotateFileNameExtractor = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_FILENAME_EXTRACTOR;
- private String fileNameExtractorHeader = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_FILENAME_EXTRACTOR_HEADER_KEY;
- private String fileNameExtractorPattern = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_FILENAME_EXTRACTOR_PATTERN;
- private Boolean convertToTimestamp = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_FILENAME_EXTRACTOR_CONVERT_TO_TIMESTAMP;
-
- private String dateTimeFormat = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_FILENAME_EXTRACTOR_DATETIME_FORMAT;
-
- private Boolean splitFileName = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_SPLIT_FILENAME;
- private String splitBy = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_SPLITY_BY;
- private String splitBaseNameHeader = SpoolDirectorySourceConfigurationExtConstants.DEFAULT_SPLIT_BASENAME_HEADER;
-
- public Builder annotateFileNameExtractor(
- Boolean annotateFileNameExtractor) {
- this.annotateFileNameExtractor = annotateFileNameExtractor;
- return this;
- }
-
- public Builder fileNameExtractorHeader(String fileNameExtractorHeader) {
- this.fileNameExtractorHeader = fileNameExtractorHeader;
- return this;
- }
-
- public Builder fileNameExtractorPattern(String fileNameExtractorPattern) {
- this.fileNameExtractorPattern = fileNameExtractorPattern;
- return this;
- }
-
- public Builder convertToTimestamp(Boolean convertToTimestamp) {
- this.convertToTimestamp = convertToTimestamp;
- return this;
- }
-
- public Builder dateTimeFormat(String dateTimeFormat) {
- this.dateTimeFormat = dateTimeFormat;
- return this;
- }
-
- public Builder splitFileName(Boolean splitFileName) {
- this.splitFileName = splitFileName;
- return this;
- }
-
- public Builder splitBy(String splitBy) {
- this.splitBy = splitBy;
- return this;
- }
-
- public Builder splitBaseNameHeader(String splitBaseNameHeader) {
- this.splitBaseNameHeader = splitBaseNameHeader;
- return this;
- }
-
-
-
- public Builder spoolDirectory(File directory) {
- this.spoolDirectory = directory;
- return this;
- }
-
- public Builder completedSuffix(String completedSuffix) {
- this.completedSuffix = completedSuffix;
- return this;
- }
-
- public Builder ignorePattern(String ignorePattern) {
- this.ignorePattern = ignorePattern;
- return this;
- }
-
- public Builder trackerDirPath(String trackerDirPath) {
- this.trackerDirPath = trackerDirPath;
- return this;
- }
-
- public Builder annotateFileName(Boolean annotateFileName) {
- this.annotateFileName = annotateFileName;
- return this;
- }
-
- public Builder fileNameHeader(String fileNameHeader) {
- this.fileNameHeader = fileNameHeader;
- return this;
- }
-
- public Builder annotateBaseName(Boolean annotateBaseName) {
- this.annotateBaseName = annotateBaseName;
- return this;
- }
-
- public Builder baseNameHeader(String baseNameHeader) {
- this.baseNameHeader = baseNameHeader;
- return this;
- }
-
- public Builder deserializerType(String deserializerType) {
- this.deserializerType = deserializerType;
- return this;
- }
-
- public Builder deserializerContext(Context deserializerContext) {
- this.deserializerContext = deserializerContext;
- return this;
- }
-
- public Builder deletePolicy(String deletePolicy) {
- this.deletePolicy = deletePolicy;
- return this;
- }
-
- public Builder inputCharset(String inputCharset) {
- this.inputCharset = inputCharset;
- return this;
- }
-
- public Builder decodeErrorPolicy(DecodeErrorPolicy decodeErrorPolicy) {
- this.decodeErrorPolicy = decodeErrorPolicy;
- return this;
- }
-
- public ReliableSpoolingFileEventExtReader build() throws IOException {
- return new ReliableSpoolingFileEventExtReader(spoolDirectory,
- completedSuffix, ignorePattern, trackerDirPath,
- annotateFileName, fileNameHeader, annotateBaseName,
- baseNameHeader, deserializerType, deserializerContext,
- deletePolicy, inputCharset, decodeErrorPolicy,
- annotateFileNameExtractor, fileNameExtractorHeader,
- fileNameExtractorPattern, convertToTimestamp,
- dateTimeFormat, splitFileName, splitBy, splitBaseNameHeader);
- }
- }
-
- }

The second class, SpoolDirectoryExtSource:

- package com.besttone.flume;
-
- import static com.besttone.flume.SpoolDirectorySourceConfigurationExtConstants.*;
-
-
- import java.io.File;
- import java.io.IOException;
- import java.util.List;
- import java.util.concurrent.Executors;
- import java.util.concurrent.ScheduledExecutorService;
- import java.util.concurrent.TimeUnit;
-
- import org.apache.flume.ChannelException;
- import org.apache.flume.Context;
- import org.apache.flume.Event;
- import org.apache.flume.EventDrivenSource;
- import org.apache.flume.FlumeException;
- import org.apache.flume.conf.Configurable;
- import org.apache.flume.instrumentation.SourceCounter;
- import org.apache.flume.serialization.DecodeErrorPolicy;
- import org.apache.flume.serialization.LineDeserializer;
- import org.apache.flume.source.AbstractSource;
- import org.slf4j.Logger;
- import org.slf4j.LoggerFactory;
-
- import com.google.common.annotations.VisibleForTesting;
- import com.google.common.base.Preconditions;
- import com.google.common.base.Throwables;
-
- public class SpoolDirectoryExtSource extends AbstractSource implements
- Configurable, EventDrivenSource {
-
- private static final Logger logger = LoggerFactory
- .getLogger(SpoolDirectoryExtSource.class);
-
-
- private static final int POLL_DELAY_MS = 500;
-
-
- private String completedSuffix;
- private String spoolDirectory;
- private boolean fileHeader;
- private String fileHeaderKey;
- private boolean basenameHeader;
- private String basenameHeaderKey;
- private int batchSize;
- private String ignorePattern;
- private String trackerDirPath;
- private String deserializerType;
- private Context deserializerContext;
- private String deletePolicy;
- private String inputCharset;
- private DecodeErrorPolicy decodeErrorPolicy;
- private volatile boolean hasFatalError = false;
-
- private SourceCounter sourceCounter;
- ReliableSpoolingFileEventExtReader reader;
- private ScheduledExecutorService executor;
- private boolean backoff = true;
- private boolean hitChannelException = false;
- private int maxBackoff;
-
-
-
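- // Configuration values for the extractor and file-name-split extensions.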
- private Boolean annotateFileNameExtractor;
- private String fileNameExtractorHeader;
- private String fileNameExtractorPattern;
- private Boolean convertToTimestamp;
-
- private String dateTimeFormat;
-
- private boolean splitFileName;
- private String splitBy;
- private String splitBaseNameHeader;
-
-
-
- @Override
- public synchronized void start() {
- logger.info("SpoolDirectorySource source starting with directory: {}",
- spoolDirectory);
-
- executor = Executors.newSingleThreadScheduledExecutor();
-
- File directory = new File(spoolDirectory);
- try {
- reader = new ReliableSpoolingFileEventExtReader.Builder()
- .spoolDirectory(directory).completedSuffix(completedSuffix)
- .ignorePattern(ignorePattern)
- .trackerDirPath(trackerDirPath)
- .annotateFileName(fileHeader).fileNameHeader(fileHeaderKey)
- .annotateBaseName(basenameHeader)
- .baseNameHeader(basenameHeaderKey)
- .deserializerType(deserializerType)
- .deserializerContext(deserializerContext)
- .deletePolicy(deletePolicy).inputCharset(inputCharset)
- .decodeErrorPolicy(decodeErrorPolicy)
- .annotateFileNameExtractor(annotateFileNameExtractor)
- .fileNameExtractorHeader(fileNameExtractorHeader)
- .fileNameExtractorPattern(fileNameExtractorPattern)
- .convertToTimestamp(convertToTimestamp)
- .dateTimeFormat(dateTimeFormat)
- .splitFileName(splitFileName).splitBy(splitBy)
- .splitBaseNameHeader(splitBaseNameHeader).build();
- } catch (IOException ioe) {
- throw new FlumeException(
- "Error instantiating spooling event parser", ioe);
- }
-
- Runnable runner = new SpoolDirectoryRunnable(reader, sourceCounter);
- executor.scheduleWithFixedDelay(runner, 0, POLL_DELAY_MS,
- TimeUnit.MILLISECONDS);
-
- super.start();
- logger.debug("SpoolDirectorySource source started");
- sourceCounter.start();
- }
-
- @Override
- public synchronized void stop() {
- executor.shutdown();
- try {
- executor.awaitTermination(10L, TimeUnit.SECONDS);
- } catch (InterruptedException ex) {
- logger.info("Interrupted while awaiting termination", ex);
- }
- executor.shutdownNow();
-
- super.stop();
- sourceCounter.stop();
- logger.info("SpoolDir source {} stopped. Metrics: {}", getName(),
- sourceCounter);
- }
-
- @Override
- public String toString() {
- return "Spool Directory source " + getName() + ": { spoolDir: "
- + spoolDirectory + " }";
- }
-
- @Override
- public synchronized void configure(Context context) {
- spoolDirectory = context.getString(SPOOL_DIRECTORY);
- Preconditions.checkState(spoolDirectory != null,
- "Configuration must specify a spooling directory");
-
- completedSuffix = context.getString(SPOOLED_FILE_SUFFIX,
- DEFAULT_SPOOLED_FILE_SUFFIX);
- deletePolicy = context.getString(DELETE_POLICY, DEFAULT_DELETE_POLICY);
- fileHeader = context.getBoolean(FILENAME_HEADER, DEFAULT_FILE_HEADER);
- fileHeaderKey = context.getString(FILENAME_HEADER_KEY,
- DEFAULT_FILENAME_HEADER_KEY);
- basenameHeader = context.getBoolean(BASENAME_HEADER,
- DEFAULT_BASENAME_HEADER);
- basenameHeaderKey = context.getString(BASENAME_HEADER_KEY,
- DEFAULT_BASENAME_HEADER_KEY);
- batchSize = context.getInteger(BATCH_SIZE, DEFAULT_BATCH_SIZE);
- inputCharset = context.getString(INPUT_CHARSET, DEFAULT_INPUT_CHARSET);
- decodeErrorPolicy = DecodeErrorPolicy
- .valueOf(context.getString(DECODE_ERROR_POLICY,
- DEFAULT_DECODE_ERROR_POLICY).toUpperCase());
-
- ignorePattern = context.getString(IGNORE_PAT, DEFAULT_IGNORE_PAT);
- trackerDirPath = context.getString(TRACKER_DIR, DEFAULT_TRACKER_DIR);
-
- deserializerType = context
- .getString(DESERIALIZER, DEFAULT_DESERIALIZER);
- deserializerContext = new Context(context.getSubProperties(DESERIALIZER
- + "."));
-
-
-
- Integer bufferMaxLineLength = context
- .getInteger(BUFFER_MAX_LINE_LENGTH);
- if (bufferMaxLineLength != null && deserializerType != null
- && deserializerType.equalsIgnoreCase(DEFAULT_DESERIALIZER)) {
- deserializerContext.put(LineDeserializer.MAXLINE_KEY,
- bufferMaxLineLength.toString());
- }
-
- maxBackoff = context.getInteger(MAX_BACKOFF, DEFAULT_MAX_BACKOFF);
- if (sourceCounter == null) {
- sourceCounter = new SourceCounter(getName());
- }
-
-
-
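- // Read the extension settings (extractor and file-name split) from the
- // source configuration.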
- annotateFileNameExtractor = context.getBoolean(FILENAME_EXTRACTOR,
- DEFAULT_FILENAME_EXTRACTOR);
- fileNameExtractorHeader = context.getString(FILENAME_EXTRACTOR_HEADER_KEY,
- DEFAULT_FILENAME_EXTRACTOR_HEADER_KEY);
- fileNameExtractorPattern = context.getString(FILENAME_EXTRACTOR_PATTERN,
- DEFAULT_FILENAME_EXTRACTOR_PATTERN);
- convertToTimestamp = context.getBoolean(FILENAME_EXTRACTOR_CONVERT_TO_TIMESTAMP,
- DEFAULT_FILENAME_EXTRACTOR_CONVERT_TO_TIMESTAMP);
- dateTimeFormat = context.getString(FILENAME_EXTRACTOR_DATETIME_FORMAT,
- DEFAULT_FILENAME_EXTRACTOR_DATETIME_FORMAT);
-
- splitFileName = context.getBoolean(SPLIT_FILENAME, DEFAULT_SPLIT_FILENAME);
- splitBy = context.getString(SPLITY_BY, DEFAULT_SPLITY_BY);
- splitBaseNameHeader = context.getString(SPLIT_BASENAME_HEADER,
- DEFAULT_SPLIT_BASENAME_HEADER);
-
-
-
-
- }
-
- @VisibleForTesting
- protected boolean hasFatalError() {
- return hasFatalError;
- }
-
- @VisibleForTesting
- protected void setBackOff(boolean backoff) {
- this.backoff = backoff;
- }
-
- @VisibleForTesting
- protected boolean hitChannelException() {
- return hitChannelException;
- }
-
- @VisibleForTesting
- protected SourceCounter getSourceCounter() {
- return sourceCounter;
- }
-
- private class SpoolDirectoryRunnable implements Runnable {
- private ReliableSpoolingFileEventExtReader reader;
- private SourceCounter sourceCounter;
-
- public SpoolDirectoryRunnable(
- ReliableSpoolingFileEventExtReader reader,
- SourceCounter sourceCounter) {
- this.reader = reader;
- this.sourceCounter = sourceCounter;
- }
-
- @Override
- public void run() {
- int backoffInterval = 250;
- try {
- while (!Thread.interrupted()) {
- List<Event> events = reader.readEvents(batchSize);
- if (events.isEmpty()) {
- break;
- }
- sourceCounter.addToEventReceivedCount(events.size());
- sourceCounter.incrementAppendBatchReceivedCount();
-
- try {
- getChannelProcessor().processEventBatch(events);
- reader.commit();
- } catch (ChannelException ex) {
- logger.warn("The channel is full, and cannot write data now. The "
- + "source will try again after "
- + String.valueOf(backoffInterval)
- + " milliseconds");
- hitChannelException = true;
- if (backoff) {
- TimeUnit.MILLISECONDS.sleep(backoffInterval);
- backoffInterval = backoffInterval << 1;
- backoffInterval = backoffInterval >= maxBackoff ? maxBackoff
- : backoffInterval;
- }
- continue;
- }
- backoffInterval = 250;
- sourceCounter.addToEventAcceptedCount(events.size());
- sourceCounter.incrementAppendBatchAcceptedCount();
- }
- logger.info("Spooling Directory Source runner has shutdown.");
- } catch (Throwable t) {
- logger.error(
- "FATAL: "
- + SpoolDirectoryExtSource.this.toString()
- + ": "
- + "Uncaught exception in SpoolDirectorySource thread. "
- + "Restart or reconfigure Flume to continue processing.",
- t);
- hasFatalError = true;
- Throwables.propagate(t);
- }
- }
- }
- }
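For completeness: the third class, SpoolDirectorySourceConfigurationExtConstants, wasn't reproduced above. Here is a minimal sketch of what it needs to contain, inferred from how the constants are used in the two classes and from the defaults documented below; the extractor key strings and defaults are my assumptions, not the author's original code:

    package com.besttone.flume;

    import org.apache.flume.source.SpoolDirectorySourceConfigurationConstants;

    // Extends the stock constants class so the inherited keys (SPOOL_DIRECTORY,
    // SPOOLED_FILE_SUFFIX, DESERIALIZER, ...) remain available under the Ext name.
    public class SpoolDirectorySourceConfigurationExtConstants extends
            SpoolDirectorySourceConfigurationConstants {

        // Regex-based extraction from the file name (assumed key strings).
        public static final String FILENAME_EXTRACTOR = "fileExtractor";
        public static final boolean DEFAULT_FILENAME_EXTRACTOR = false;
        public static final String FILENAME_EXTRACTOR_HEADER_KEY = "fileExtractorHeaderKey";
        public static final String DEFAULT_FILENAME_EXTRACTOR_HEADER_KEY = "fileExtractorHeader";
        public static final String FILENAME_EXTRACTOR_PATTERN = "fileExtractorPattern";
        public static final String DEFAULT_FILENAME_EXTRACTOR_PATTERN = "(\\d){4}-(\\d){2}-(\\d){2}";
        public static final String FILENAME_EXTRACTOR_CONVERT_TO_TIMESTAMP = "convertToTimestamp";
        public static final boolean DEFAULT_FILENAME_EXTRACTOR_CONVERT_TO_TIMESTAMP = false;
        public static final String FILENAME_EXTRACTOR_DATETIME_FORMAT = "dateTimeFormat";
        public static final String DEFAULT_FILENAME_EXTRACTOR_DATETIME_FORMAT = "yyyy-MM-dd";

        // Splitting the file name into numbered headers (defaults match the
        // parameter documentation below).
        public static final String SPLIT_FILENAME = "splitFileName";
        public static final boolean DEFAULT_SPLIT_FILENAME = false;
        public static final String SPLITY_BY = "splitBy";
        public static final String DEFAULT_SPLITY_BY = "\\.";
        public static final String SPLIT_BASENAME_HEADER = "splitBaseNameHeader";
        public static final String DEFAULT_SPLIT_BASENAME_HEADER = "fileNameSplit";
    }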
The source adds the following configuration parameters:

tier1.sources.source1.type=com.besttone.flume.SpoolDirectoryExtSource   # fully qualified class name of the custom source
tier1.sources.source1.spoolDir=/opt/logs                                # directory to watch
tier1.sources.source1.splitFileName=true                                # split the file name and add the parts to the headers; default false
tier1.sources.source1.splitBy=\\.                                       # separator to split on; the default is "."
tier1.sources.source1.splitBaseNameHeader=fileNameSplit                 # prefix of the header keys the parts are written to

For example, a.log.2014-07-31 split on "." yields the headers fileNameSplit0=a, fileNameSplit1=log, fileNameSplit2=2014-07-31.

(There is also a feature for extracting part of the file name with a regular expression; we don't need it here, so I won't describe it.)
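To see exactly what ends up in the headers, here's a tiny standalone demo (a hypothetical class, not part of the plugin) that mirrors the split logic:

    public class SplitDemo {
        public static void main(String[] args) {
            String baseName = "a.log.2014-07-31";
            // Same call the source makes: split("\\.") splits on a literal dot.
            String[] parts = baseName.split("\\.");
            for (int i = 0; i < parts.length; i++) {
                System.out.println("fileNameSplit" + i + "=" + parts[i]);
            }
            // Prints:
            //   fileNameSplit0=a
            //   fileNameSplit1=log
            //   fileNameSplit2=2014-07-31
        }
    }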
With this extended source, the earlier requirement becomes easy to implement. All it takes is:

tier1.sinks.sink1.hdfs.path=hdfs://master68:8020/flume/events/%{fileNameSplit0}/%{fileNameSplit2}

The contents of a.log.2014-07-31 are then saved under hdfs://master68:8020/flume/events/a/2014-07-31.
Next, let's cover how to deploy this custom spooling directory source (on a cluster managed by Cloudera Manager).

First, package the three classes above into a JAR: SpoolDirectoryExtSource.jar.

CM's Flume plugin directory is /var/lib/flume-ng/plugins.d.

On each agent that needs this source, create a SpoolDirectoryExtSource directory under /var/lib/flume-ng/plugins.d with the subdirectories lib, libext, and native: lib holds the plugin JAR, libext holds the plugin's dependency JARs, and native holds any native libraries the plugin uses.

We have no extra dependencies here, so placing SpoolDirectoryExtSource.jar in lib is enough. The final layout:
plugins.d/
plugins.d/SpoolDirectoryExtSource/
plugins.d/SpoolDirectoryExtSource/lib/SpoolDirectoryExtSource.jar
plugins.d/SpoolDirectoryExtSource/libext/
plugins.d/SpoolDirectoryExtSource/native/
Restart the Flume agent and Flume loads the plugin automatically; the type property in flume.conf can then be set to the fully qualified class name.

The final flume.conf:
- tier1.sources=source1
- tier1.channels=channel1
- tier1.sinks=sink1
- tier1.sources.source1.type=com.besttone.flume.SpoolDirectoryExtSource
- tier1.sources.source1.spoolDir=/opt/logs
- tier1.sources.source1.splitFileName=true
- tier1.sources.source1.splitBy=\\.
- tier1.sources.source1.splitBaseNameHeader=fileNameSplit
- tier1.sources.source1.channels=channel1
- tier1.sinks.sink1.type=hdfs
- tier1.sinks.sink1.channel=channel1
- tier1.sinks.sink1.hdfs.path=hdfs://master68:8020/flume/events/%{fileNameSplit0}/%{fileNameSplit2}
- tier1.sinks.sink1.hdfs.round=true
- tier1.sinks.sink1.hdfs.roundValue=10
- tier1.sinks.sink1.hdfs.roundUnit=minute
- tier1.sinks.sink1.hdfs.fileType=DataStream
- tier1.sinks.sink1.hdfs.writeFormat=Text
- tier1.sinks.sink1.hdfs.rollInterval=0
- tier1.sinks.sink1.hdfs.rollSize=10240
- tier1.sinks.sink1.hdfs.rollCount=0
- tier1.sinks.sink1.hdfs.idleTimeout=60
- tier1.channels.channel1.type=memory
- tier1.channels.channel1.capacity=10000
- tier1.channels.channel1.transactionCapacity=1000
- tier1.channels.channel1.keep-alive=30
(Screenshot from the original post: inspecting the resulting events with a logger sink.)
Questions this section addresses:
1. How do you implement a custom sink on the Flume side that saves logs according to your own rules?
2. How do you read a value such as rootPath from the Flume configuration file?

I recently needed Flume to collect logs from remote machines, so I learned its most basic usage; this is just a record of that.

The overall approach to remote log collection: the remote side implements a custom log4j appender that sends messages to Flume, and the Flume side implements a custom sink that saves the logs according to our rules.

The custom sink:
- public class LocalFileLogSink extends AbstractSink implements Configurable {
- private static final Logger logger = LoggerFactory
- .getLogger(LocalFileLogSink.class);
- private static final String PROP_KEY_ROOTPATH = "rootPath";
- private String rootPath;
-
- @Override
- public void configure(Context context) {
- this.rootPath = context.getString(PROP_KEY_ROOTPATH);
- }
-
- @Override
- public Status process() throws EventDeliveryException {
- logger.debug("Do process");
- // process() must return a Status; the full body is shown further down.
- return Status.READY;
- }
- }
Implementing the Configurable interface lets the sink read its parameters from the context in the configure method at initialization time. Here we want to read rootPath, the root directory under which logs are saved, from Flume's configuration file. In flume-conf.properties:
- agent.sinks = loggerSink
- agent.sinks.loggerSink.rootPath = ./logs
loggerSink is the name of the custom sink; when reading a value, the key is only the part after the sink name, i.e. rootPath here.

The actual business logic is implemented by overriding AbstractSink's process method: obtain the channel from the base class's getChannel method, take an Event from it, and handle it:
- Channel ch = getChannel();
- Transaction txn = ch.getTransaction();
- txn.begin();
- try {
- logger.debug("Get event.");
- Event event = ch.take();
- // ... handle the event here ...
- txn.commit();
- return Status.READY;
- } catch (Throwable t) {
- txn.rollback();
- throw new EventDeliveryException(t);
- } finally {
- logger.info("trx close.");
- txn.close();
- }
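The snippet above doesn't show the write itself. As a minimal sketch of a complete process(), assuming purely for illustration that our "rule" is to append each event body to one file per day under rootPath, it could look like this:

    // Assumed imports: java.io.File, java.io.FileOutputStream,
    // java.text.SimpleDateFormat, java.util.Date.
    @Override
    public Status process() throws EventDeliveryException {
        Channel ch = getChannel();
        Transaction txn = ch.getTransaction();
        txn.begin();
        try {
            Event event = ch.take();
            if (event != null) {
                File dir = new File(rootPath);
                if (!dir.exists() && !dir.mkdirs()) {
                    throw new IllegalStateException("Cannot create " + dir);
                }
                String day = new SimpleDateFormat("yyyy-MM-dd").format(new Date());
                // Append the event body plus a newline to today's file.
                try (FileOutputStream out = new FileOutputStream(
                        new File(dir, day + ".log"), true)) {
                    out.write(event.getBody());
                    out.write('\n');
                }
            }
            txn.commit();
            // BACKOFF lets the sink runner sleep briefly when the channel was empty.
            return event != null ? Status.READY : Status.BACKOFF;
        } catch (Throwable t) {
            txn.rollback();
            throw new EventDeliveryException("Failed to deliver event", t);
        } finally {
            txn.close();
        }
    }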
Below is the pom file for my custom Kafka sink plugin; compile it into a JAR and drop it into Flume's lib directory to use it.
- xml version="1.0" encoding="UTF-8"?>
-
-
- <project xmlns="http://maven.apache.org/POM/4.0.0"
- xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
- xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
- <modelVersion>4.0.0modelVersion>
-
- <groupId>flume-sinksgroupId>
- <artifactId>cmcc-kafka-sinkartifactId>
- <name>Flume Kafka Sinkname>
- <version>1.0.0version>
- <build>
- <plugins>
- <plugin>
- <groupId>org.apache.maven.pluginsgroupId>
- <artifactId>maven-jar-pluginartifactId>
- plugin>
- plugins>
- build>