Package | Description |
---|---|
org.apache.flume | |
org.apache.flume.agent.embedded | This package provides Flume users the ability to embed simple agents in applications. |
org.apache.flume.api | |
org.apache.flume.channel | |
org.apache.flume.channel.file | |
org.apache.flume.channel.file.encryption | |
org.apache.flume.clients.log4jappender | |
org.apache.flume.conf | |
org.apache.flume.instrumentation | |
org.apache.flume.sink | |
org.apache.flume.sink.hbase | |
org.apache.flume.sink.hdfs | |
org.apache.flume.source | |
org.apache.flume.source.http | |
org.apache.flume.source.jms | |
org.apache.flume.source.kafka | |
Modifier and Type | Method and Description |
---|---|
Source | SourceFactory.create(String sourceName, String type) |
Sink | SinkFactory.create(String name, String type) |
Channel | ChannelFactory.create(String name, String type) |
Class<? extends Source> | SourceFactory.getClass(String type) |
Class<? extends Sink> | SinkFactory.getClass(String type) |
Class<? extends Channel> | ChannelFactory.getClass(String type) |
Modifier and Type | Method and Description |
---|---|
void | EmbeddedAgent.configure(Map<String,String> properties) Configures the embedded agent. |
void | EmbeddedAgent.start() Starts the agent. |
void | EmbeddedAgent.stop() Stops the agent. |
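As a sketch of the lifecycle above, an embedded agent is configured from a plain property map, started, fed events, and stopped. The property keys below follow the embedded-agent naming convention, and the hostname, port, and agent name are placeholders, not values from this document:

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.flume.agent.embedded.EmbeddedAgent;
import org.apache.flume.event.EventBuilder;

public class EmbeddedAgentExample {
  public static void main(String[] args) throws Exception {
    // Placeholder configuration: a memory channel feeding one Avro sink.
    Map<String, String> properties = new HashMap<>();
    properties.put("channel.type", "memory");
    properties.put("channel.capacity", "200");
    properties.put("sinks", "sink1");
    properties.put("sink1.type", "avro");
    properties.put("sink1.hostname", "collector.example.com");
    properties.put("sink1.port", "5564");

    EmbeddedAgent agent = new EmbeddedAgent("myagent");
    agent.configure(properties);  // configure(Map) must precede start()
    agent.start();
    try {
      agent.put(EventBuilder.withBody("hello".getBytes("UTF-8")));
    } finally {
      agent.stop();  // stops sources, sinks, and channels
    }
  }
}
```

This requires the flume-ng-embedded-agent artifact and a reachable Avro source; it is illustrative rather than a runnable standalone program.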
Modifier and Type | Method and Description |
---|---|
void | ThriftRpcClient.close() |
void | RpcClient.close() Immediately closes the client so that it may no longer be used. |
void | NettyAvroRpcClient.close() |
void | LoadBalancingRpcClient.close() |
void | FailoverRpcClient.close() Close the connection. |
abstract void | AbstractRpcClient.close() |
protected void | SecureThriftRpcClient.configure(Properties properties) |
protected void | ThriftRpcClient.configure(Properties properties) |
void | NettyAvroRpcClient.configure(Properties properties) Configure the actual client using the properties. |
protected void | LoadBalancingRpcClient.configure(Properties properties) |
void | FailoverRpcClient.configure(Properties properties) |
protected abstract void | AbstractRpcClient.configure(Properties properties) Configure the client using the given properties object. |
static RpcClient | RpcClientFactory.getDefaultInstance(String hostname, Integer port) |
static RpcClient | RpcClientFactory.getDefaultInstance(String hostname, Integer port, Integer batchSize) Returns an instance of RpcClient connected to the specified hostname and port with the specified batchSize. |
static RpcClient | RpcClientFactory.getInstance(Properties properties) Returns an instance of RpcClient, optionally with failover. |
static RpcClient | RpcClientFactory.getInstance(String hostname, Integer port) Deprecated. |
static RpcClient | RpcClientFactory.getInstance(String hostname, Integer port, Integer batchSize) Deprecated. |
void | SecureThriftRpcClient.UgiSaslClientTransport.open() Opens the SASL transport using the current UserGroupInformation. |
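A minimal sketch of the factory methods above: obtain a client from RpcClientFactory, append an event, and close it. The hostname and port are placeholders, and per the table a closed client may not be reused:

```java
import java.nio.charset.StandardCharsets;

import org.apache.flume.Event;
import org.apache.flume.api.RpcClient;
import org.apache.flume.api.RpcClientFactory;
import org.apache.flume.event.EventBuilder;

public class RpcClientExample {
  public static void main(String[] args) throws Exception {
    // getDefaultInstance connects to an Avro source at the given
    // host and port; "flume-host.example.com" is a placeholder.
    RpcClient client =
        RpcClientFactory.getDefaultInstance("flume-host.example.com", 41414);
    try {
      Event event = EventBuilder.withBody("sample event", StandardCharsets.UTF_8);
      client.append(event);  // may throw EventDeliveryException
    } finally {
      client.close();        // immediately closes; the client is no longer usable
    }
  }
}
```

RpcClientFactory.getInstance(Properties) is the route to the load-balancing and failover variants listed above; the deprecated getInstance(String, Integer) overloads are superseded by getDefaultInstance.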
Modifier and Type | Method and Description |
---|---|
Channel | DefaultChannelFactory.create(String name, String type) |
Class<? extends Channel> | DefaultChannelFactory.getClass(String type) |
Modifier and Type | Class and Description |
---|---|
class | BadCheckpointException Exception thrown when the checkpoint directory contains invalid data, probably due to the channel stopping while the checkpoint was written. |
Modifier and Type | Class and Description |
---|---|
class | DecryptionFailureException Exception thrown when the channel is unable to decrypt an event read from the channel. |
Modifier and Type | Method and Description |
---|---|
void | Log4jAppender.activateOptions() Activate the options set using setPort() and setHostname(). |
void | LoadBalancingLog4jAppender.activateOptions() Activate the options set using setHosts(), setSelector() and setMaxBackoff(). |
void | Log4jAppender.append(org.apache.log4j.spi.LoggingEvent event) Append the LoggingEvent to send to the first Flume hop. |
void | Log4jAppender.close() Closes the underlying client. |
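The appender is usually wired through log4j configuration rather than code; log4j calls activateOptions() after applying the setters. A sketch of a log4j.properties fragment, with placeholder host, port, and logger names:

```properties
# Hostname and Port map to setHostname()/setPort(), which
# activateOptions() applies; values here are placeholders.
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = flume-host.example.com
log4j.appender.flume.Port = 41414

# Route an application logger to the Flume appender.
log4j.logger.com.example.app = INFO, flume
```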
Modifier and Type | Class and Description |
---|---|
class | ConfigurationException |
Constructor and Description |
---|
GangliaServer() |
Modifier and Type | Method and Description |
---|---|
Sink | DefaultSinkFactory.create(String name, String type) |
Class<? extends Sink> | DefaultSinkFactory.getClass(String type) |
Modifier and Type | Method and Description |
---|---|
List<org.apache.hadoop.hbase.client.Row> | SimpleHbaseEventSerializer.getActions() |
List<org.apache.hadoop.hbase.client.Row> | RegexHbaseEventSerializer.getActions() |
Modifier and Type | Class and Description |
---|---|
class | BucketClosedException |
Modifier and Type | Method and Description |
---|---|
Source | DefaultSourceFactory.create(String name, String type) |
protected void | StressSource.doConfigure(Context context) Reads parameters from the context: maxTotalEvents (long), the total number of Events to be sent; maxSuccessfulEvents (long), the number of successful Events; size (int), the number of bytes in each Event; batchSize (int), the number of Events sent in one batch. |
protected void | SequenceGeneratorSource.doConfigure(Context context) Reads parameters from the context: batchSize (int), the size of event batches. |
protected abstract void | BasicSourceSemantics.doConfigure(Context context) |
protected void | StressSource.doStart() |
protected void | SequenceGeneratorSource.doStart() |
protected abstract void | BasicSourceSemantics.doStart() |
protected void | StressSource.doStop() |
protected void | SequenceGeneratorSource.doStop() |
protected abstract void | BasicSourceSemantics.doStop() |
Class<? extends Source> | DefaultSourceFactory.getClass(String type) |
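The StressSource parameters read by doConfigure() are normally supplied through agent configuration. A sketch of such a fragment, where the agent, channel, and sink names are placeholders:

```properties
# Exercises the parameters listed for StressSource.doConfigure();
# names (a1, stress1, c1) are placeholders.
a1.sources = stress1
a1.channels = c1

a1.sources.stress1.type = org.apache.flume.source.StressSource
a1.sources.stress1.size = 512              # bytes per Event (int)
a1.sources.stress1.maxTotalEvents = 100000 # total Events to send (long)
a1.sources.stress1.batchSize = 100         # Events per batch (int)
a1.sources.stress1.channels = c1
```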
Modifier and Type | Class and Description |
---|---|
class | HTTPBadRequestException Exception thrown by an HTTP Handler if the request was not parsed correctly into an event because the request was not in the expected format. |
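A hypothetical handler illustrates when HTTPBadRequestException is thrown: the class name PlainTextHandler and its one-event-per-request behavior are this sketch's inventions, not part of Flume.

```java
import java.io.BufferedReader;
import java.util.Collections;
import java.util.List;

import javax.servlet.http.HttpServletRequest;

import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.event.EventBuilder;
import org.apache.flume.source.http.HTTPBadRequestException;
import org.apache.flume.source.http.HTTPSourceHandler;

// Hypothetical handler: turns one plain-text request body into one event
// and rejects malformed (empty) requests with HTTPBadRequestException.
public class PlainTextHandler implements HTTPSourceHandler {
  @Override
  public List<Event> getEvents(HttpServletRequest request) throws Exception {
    StringBuilder body = new StringBuilder();
    try (BufferedReader reader = request.getReader()) {
      String line;
      while ((line = reader.readLine()) != null) {
        body.append(line);
      }
    }
    if (body.length() == 0) {
      // Request is not in the expected format: signal a bad request.
      throw new HTTPBadRequestException("Request body must not be empty");
    }
    return Collections.singletonList(
        EventBuilder.withBody(body.toString().getBytes("UTF-8")));
  }

  @Override
  public void configure(Context context) {
    // No handler-specific configuration in this sketch.
  }
}
```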
Modifier and Type | Method and Description |
---|---|
protected void | JMSSource.doConfigure(Context context) |
Modifier and Type | Method and Description |
---|---|
protected void | KafkaSource.doConfigure(Context context) Configures the source and generates properties for the Kafka Consumer. |
protected void | KafkaSource.doStart() |
protected void | KafkaSource.doStop() |
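The context keys that KafkaSource.doConfigure() turns into Kafka Consumer properties are set in agent configuration. A sketch of such a fragment, assuming the Flume 1.6-era key names; the broker, topic, and agent names are placeholders:

```properties
# KafkaSource configuration sketch; doConfigure() derives the Kafka
# Consumer properties from these keys. All values are placeholders.
a1.sources = kafka1
a1.channels = c1

a1.sources.kafka1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.kafka1.zookeeperConnect = zk1.example.com:2181
a1.sources.kafka1.topic = mytopic
a1.sources.kafka1.groupId = flume
a1.sources.kafka1.batchSize = 1000
a1.sources.kafka1.channels = c1
```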
Copyright © 2009-2016 Apache Software Foundation. All Rights Reserved.