I am running the Kafka producer example from the official site.
The code:
public class TestProducer {
    public static void main(String[] args) {
        // ...
Not sure, but one possibility is that the topic has not been created in Kafka.
Check the Kafka web UI and make sure the topic you are producing to, i.e. "page_visits", exists there.
If not, it is easy to create the topic through the GUI.
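If there is no web UI handy, the topic can also be created from the command line; a sketch, assuming a Kafka 0.8.x install with ZooKeeper at localhost:2181:
bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic page_visits
bin/kafka-topics.sh --list --zookeeper localhost:2181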
I got these errors when running a Kafka producer:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Found a solution:
On my Mac, after downloading scala-2.10 and kafka_2.10-0.8.1, everything was fine in the kafka_2.10-0.8.1 directory when I started ZooKeeper, started the Kafka server, and created a test topic. Then I needed to start a producer for the test topic, but there was an error:
yhuangMac:kafka_2.10-0.8.1 yhuang$ ./bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
The reason is that the Kafka release zip only includes the slf4j-api jar in its libs directory; the slf4j-nop.jar binding is missing. So go to http://www.slf4j.org, download slf4j-1.7.7.zip, unzip it, and copy slf4j-api-1.7.7.jar and slf4j-nop-1.7.7.jar into Kafka's libs directory.
Restart the Kafka producer, and no error is reported.
Source: SOLUTION
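A rough shell version of those steps, assuming Kafka was unpacked into ~/kafka_2.10-0.8.1 and that the zip is published at the same dist path as the tar.gz used in the next answer:
cd ~/kafka_2.10-0.8.1/libs
curl -O http://www.slf4j.org/dist/slf4j-1.7.7.zip
unzip slf4j-1.7.7.zip
cp slf4j-1.7.7/slf4j-api-1.7.7.jar slf4j-1.7.7/slf4j-nop-1.7.7.jar .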
I got this error from Apache Kafka:
bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details
My Setup:
OS: Ubuntu 14.04
sbt: sbt launcher version 0.13.5
scala: Scala code runner version 2.9.2
I was able to fix it with these commands:
cd /home/el/apachekafka/kafka_2.10-0.8.1.1/libs
wget http://www.slf4j.org/dist/slf4j-1.7.7.tar.gz
tar -xvf slf4j-1.7.7.tar.gz
cd /home/el/apachekafka/kafka_2.10-0.8.1.1/libs/slf4j-1.7.7
cp slf4j-api-1.7.7.jar ..
cp slf4j-nop-1.7.7.jar ..
Then re-run the command and the producer doesn't throw any error.
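To double-check that both jars actually landed in libs before re-running (path copied from the commands above):
ls /home/el/apachekafka/kafka_2.10-0.8.1.1/libs | grep slf4j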
I came across this error on Hortonworks HDP 2.2, where the broker's default port is 6667. If your Kafka server is running on the HDP sandbox, the fix is to set metadata.broker.list to 10.0.2.15:6667, as in this code:
import java.util.Properties;

import kafka.javaapi.producer.Producer;
import kafka.producer.KeyedMessage;
import kafka.producer.ProducerConfig;

Properties props = new Properties();
props.put("metadata.broker.list", "10.0.2.15:6667"); // HDP sandbox broker on port 6667
props.put("serializer.class", "kafka.serializer.StringEncoder");
//props.put("producer.type", "async");
props.put("request.required.acks", "1");

ProducerConfig config = new ProducerConfig(props);
Producer<String, String> producer = new Producer<String, String>(config);

String jsonPayload = "{\"example\":\"payload\"}"; // placeholder message body
try {
    producer.send(new KeyedMessage<String, String>("zerg.hydra", jsonPayload));
} catch (Exception e) {
    e.printStackTrace();
} finally {
    producer.close();
}
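If in doubt about the port, one way to confirm that the broker on the sandbox is actually listening on 6667 (assuming netstat is available) is:
netstat -tln | grep 6667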
This can happen if the client cannot reach the Kafka broker by both its hostname and its IP address.
Adding an entry to the client's /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows) resolved the issue for me.
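For example, an entry along these lines, where both the IP and the hostname are placeholders and should match what the broker advertises:
10.0.2.15    sandbox.hortonworks.com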
You need to add an SLF4J logging implementation. If you are using Maven as the build tool, try adding the following to your pom.xml and see if it works:
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.5</version>
</dependency>
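Note that slf4j-log4j12 only binds SLF4J to log4j; to actually see output, log4j itself also needs a configuration on the classpath. A minimal log4j.properties sketch (values are only an example):
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{ISO8601} [%t] %-5p %c - %m%n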