Architecture of Apache Flume
Apache Flume is a robust, flexible, simple, and extensible tool for collecting data from various data producers and moving it into Hadoop.
We will use a simple example to explain the basics of Apache Flume: transferring streaming and log data from various web servers to HBase. To make the most of Apache Flume, you should have a good understanding of the basics of HDFS and Hadoop commands.
What is Flume?
It is a tool/mechanism for collecting, aggregating, and transporting large amounts of streaming data, such as events and log files, from different sources to a centralized data store. Flume is a distributed, reliable, and configurable tool designed to copy streaming data from various web servers to HDFS. Apache Flume is easy to pick up when learning Hadoop.
Apache Flume Configuration:
After installing Flume, we have to configure it using its configuration file, which is a Java properties file containing key-value pairs. We have to pass values to the keys in the file.
In the Flume configuration file, we need to:
> Name the components of the agent.
> Describe or configure the source.
> Describe or configure the sink.
> Describe or configure the channel.
> Bind the source and the sink to the channel.
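As a sketch, a minimal agent configuration following these steps might look like the example below. The agent name `a1`, the netcat port, and the HDFS path are illustrative assumptions, not values from any particular deployment:

```properties
# Name the components of agent a1 (illustrative names)
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source: listen for events via netcat on port 44444
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Describe/configure the sink: write events to HDFS (path is an assumption)
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://localhost:9000/user/flume/logs

# Describe/configure the channel: buffer events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# Bind the source and the sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

An agent using this file can then be started with the flume-ng command, for example: `flume-ng agent --conf conf --conf-file flume.conf --name a1`.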
Features of Flume:
Some of the features of Flume are given below:
1. Flume can be scaled horizontally.
2. Using Flume, we can get data from many servers into Hadoop immediately.
3. Flume collects log data from many web servers into a centralized store.
4. Flume supports a large set of source and destination types.
5. Flume supports multi-hop flows, contextual routing, fan-in and fan-out flows, etc.
6. Flume is used to import large volumes of event data produced by social networking sites like Twitter and Facebook, along with log files.
Let us assume an e-commerce web application needs to analyze customer behavior from a particular region, so it needs to move the available log data into Hadoop for analysis.
Flume can move the log data generated by the application servers into HDFS at high speed.
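As a sketch, such a log-to-HDFS flow could be configured with an exec source that tails the application log. The agent name, log path, and HDFS URL below are illustrative assumptions:

```properties
# Agent "app" tails an application server log and writes it to HDFS
app.sources = tailsrc
app.channels = memch
app.sinks = hdfssink

# Source: follow the application server's log file (path is an assumption)
app.sources.tailsrc.type = exec
app.sources.tailsrc.command = tail -F /var/log/webapp/access.log
app.sources.tailsrc.channels = memch

# Channel: buffer events in memory between source and sink
app.channels.memch.type = memory

# Sink: write the events into HDFS for later analysis
app.sinks.hdfssink.type = hdfs
app.sinks.hdfssink.hdfs.path = hdfs://namenode:9000/logs/webapp
app.sinks.hdfssink.channel = memch
```

Once the agent is running, new lines appended to the log file flow through the channel and land in the configured HDFS directory.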
Here are the advantages of using Flume:
1. Flume provides the feature of contextual routing.
2. Using Apache Flume, we can store the data in any of the centralized stores.
3. Flume is fault tolerant, reliable, manageable, scalable, and customizable.
4. The transactions in Flume are channel-based, where two transactions (one for the sender and one for the receiver) are maintained for each message.
