
Plugin does not create consumer group in kafka? #254

Open
mckingho opened this issue Mar 15, 2018 · 0 comments

I can get messages from Kafka, but I cannot list the logstash group id from Kafka's ZooKeeper, so I am not able to check my message lag with the consumer offset command.

Is this the plugin's intended behavior, or have I set a wrong configuration? Please help if you spot any possible issue. Much thanks. (A sketch of the check I want to run is included below, after the config.)

  • Version: v5.1.11 (Logstash 5.6.1)
  • Operating System: centos:7 (run as a Docker container)
  • Config File:
input {
    kafka {
        topics => ["test"] 
        bootstrap_servers => "10.x.x.x:29092,10.x.x.x:29093,10.x.x.x:29094"
        group_id => "logstash"
    }
}
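
For reference, here is a sketch of the check I want to run, using the kafka-consumer-groups.sh tool shipped with the brokers (assuming the plugin's new consumer commits offsets to Kafka rather than ZooKeeper, the group would only show up via the brokers and not in ZooKeeper; depending on the Kafka version an extra --new-consumer flag may be needed):

# List the groups known to the brokers (new-consumer offsets live in Kafka, not ZooKeeper)
bin/kafka-consumer-groups.sh --bootstrap-server 10.x.x.x:29092 --list

# Describe the "logstash" group to see per-partition offsets and lag
bin/kafka-consumer-groups.sh --bootstrap-server 10.x.x.x:29092 --describe --group logstash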

My Kafka configuration, where 35.x.x.x is the external IP and 10.x.x.x is the internal IP on the same regional network as Logstash:

inter.broker.listener.name: INTERNAL
advertised.listeners: EXTERNAL://35.x.x.x:9092,INTERNAL://10.x.x.x:19092,CONSUMER://10.x.x.x:29092
listeners: EXTERNAL://0.0.0.0:9092,INTERNAL://0.0.0.0:19092,CONSUMER://0.0.0.0:29092
listener.security.protocol.map: PLAINTEXT:PLAINTEXT,EXTERNAL:PLAINTEXT,INTERNAL:PLAINTEXT,CONSUMER:PLAINTEXT

I have 2 more Kafka brokers with a similar setting; only the port numbers are incremented by 1 and 2 (9093, 9094).
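
To make the "similar setting" concrete, this is a sketch of the second broker's corresponding listener lines (ports incremented by 1 as described above, matching the 29093 entry in bootstrap_servers; the third broker increments by 2):

inter.broker.listener.name: INTERNAL
advertised.listeners: EXTERNAL://35.x.x.x:9093,INTERNAL://10.x.x.x:19093,CONSUMER://10.x.x.x:29093
listeners: EXTERNAL://0.0.0.0:9093,INTERNAL://0.0.0.0:19093,CONSUMER://0.0.0.0:29093
listener.security.protocol.map: PLAINTEXT:PLAINTEXT,EXTERNAL:PLAINTEXT,INTERNAL:PLAINTEXT,CONSUMER:PLAINTEXT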
