Troubleshooting the Kafka incompatibility issue after the version upgrade
Reference files:
/usr/share/logstash/vendor/bundle/jruby//gems/logstash-input-kafka-/
/usr/share/logstash/vendor/bundle/jruby//gems/logstash-input-kafka-/lib/logstash/inputs/
/usr/share/logstash/vendor/bundle/jruby//gems/logstash-output-kafka-/lib/logstash/outputs/
Background:
I previously upgraded the ELKB stack from the old version to the latest stable release (the main upgrade steps are at /3002256/1870205). Afterwards the Kafka cluster started throwing errors. The troubleshooting process is recorded below, for reference only.
Environment before the upgrade:
logstash-input-kafka-
logstash-output-kafka-
kafka_-
Environment after the upgrade:
logstash-input-kafka-
logstash-output-kafka-
Error messages:
[2016-11-16T14:35:44,739][ERROR][ ] Unknown setting 'zk_connect' for kafka
[2016-11-16T14:35:44,741][ERROR][ ] Unknown setting 'topic_id' for kafka
[2016-11-16T14:35:44,741][ERROR][ ] Unknown setting 'reset_beginning' for kafka
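For context, all three rejected settings belong to the old ZooKeeper-based consumer configuration. A minimal sketch of the kind of pre-upgrade kafka input block that triggers exactly these errors (host and topic names are hypothetical):

input {
  kafka {
    # old-style settings from the ZooKeeper-based consumer;
    # the upgraded plugin no longer recognizes any of them
    zk_connect      => "zk1:2181,zk2:2181,zk3:2181"
    topic_id        => "logstash"
    reset_beginning => false
  }
}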
Troubleshooting steps:
1. Find where in the code the error is raised, based on the message
grep "Unknown setting" /usr/share/logstash/ -R
/usr/share/logstash/logstash-core/lib/logstash/config/: ("Unknown setting '#{name}' for #{@plugin_name}")
2. Review the related source code; it turns out the plugins' config definitions are what need checking
def validate_check_invalid_parameter_names(params)
  invalid_params = params.keys
  # Filter out parameters that match regexp keys.
  # These are defined in plugins like this:
  #   config /foo.*/ => ...
  @config.each_key do |config_key|
    if config_key.is_a?(Regexp)
      invalid_params.reject! { |k| k =~ config_key }
    elsif config_key.is_a?(String)
      invalid_params.reject! { |k| k == config_key }
    end
  end

  if invalid_params.size > 0
    invalid_params.each do |name|
      self.logger.error("Unknown setting '#{name}' for #{@plugin_name}")
    end
    return false
  end # if invalid_params.size > 0
  return true
end # def validate_check_invalid_parameter_names
3. Go into the plugin's top-level directory to look at the details
cd /usr/share/logstash/vendor/bundle/jruby//gems/logstash-input-kafka-
The following files turn out to be the ones worth focusing on:
grep config ./* -R |awk '{print $1}' |uniq
./:
./:See
./lib/logstash/inputs/:#
./lib/logstash/inputs/:
./:-
Binary
1) Start with the changelog. It shows that, from a certain logstash-input-kafka release onward, the plugin is no longer backward compatible and jruby-kafka has been dropped. Note the pitfall here, covered in 2): one entry says support for the new Kafka release was added, while another says that support only starts later and is not backward compatible; quite a breaking change. So the problem is found: my broker is kafka_-, and the mismatched Kafka version is the cause.
An excerpt:
##
- Update to Kafka version for bug fixes
##
- Support for Kafka which is not backward compatible with broker.
##
- Republish all the gems under jruby.
- Update the plugin to the version of the plugin api, this change is required for Logstash compatibility. See https:///elastic/logstash/issues/5141
- Support for Kafka for LS
##
- Refactor to use new Java based consumer, bypassing jruby-kafka
- Breaking: Change configuration to match Kafka's configuration. This version is not backward compatible
2) When I first read the docs, the configuration syntax all looked correct, so I assumed the problem was a missing jruby-kafka library dependency, which the old plugin version did still use (comparing the two, the 5.x release also ships noticeably fewer plugins). In addition, the Kafka version stated in that document apparently was not updated in time and does not match the file discussed later; if anyone involved sees this, please update it. It is a small thing, but it can easily mislead ordinary users like me. Of course, it may also just be that I did not read the documentation thoroughly enough.
The end of that document reads:
Dependencies
====================
* Apache Kafka version
* jruby-kafka library
3) Next I read the docs and looked specifically at the Kafka compatibility matrix. It appears the new logstash-input-kafka and logstash-output-kafka can only be used with the new broker. If you still want to keep the old broker, you have to downgrade logstash-input-kafka and logstash-output-kafka, and the table itself calls that an intermediate release, so it is better to just follow the mainstream and upgrade (commands for checking or pinning the installed plugin versions are shown after the table below).
## Kafka Compatibility
Here's a table that describes the compatibility matrix for Kafka Broker support. Please remember that it is good advice to upgrade brokers before consumers/producers
since brokers target backwards compatibility. The broker will work with both the consumer and consumer APIs but not the other way around.
| Kafka Broker Version | Logstash Version | Input Plugin | Output Plugin | Why? |
|:---------------:|:------------------:|:--------------:|:---------------:|:------|
| | - | < | < | Legacy, is still popular |
| | - | | | Intermediate release before that works with old Ruby Event API `[]` |
| | , | | | Intermediate release before with new get/set API |
| | , | | | Track latest Kafka release. Not compatible with broker |
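A quick way to confirm which plugin versions are actually installed, and to pin older plugin versions if you decide to keep the old broker instead of upgrading it, is the logstash-plugin tool. A sketch, with X.Y.Z as a placeholder for the version chosen from the table above:

/usr/share/logstash/bin/logstash-plugin list --verbose | grep kafka
# to downgrade instead of upgrading the broker, pin the plugin versions:
/usr/share/logstash/bin/logstash-plugin install --version X.Y.Z logstash-input-kafka
/usr/share/logstash/bin/logstash-plugin install --version X.Y.Z logstash-output-kafka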
4) That leaves only one option: upgrade the Kafka broker. Finally, I looked at the jar-dependencies directory and found kafka-clients-:
ls /usr/share/logstash/vendor/bundle/jruby//gems/logstash-input-kafka-/vendor/jar-dependencies/runtime-jars/
kafka-clients- log4j- lz4- slf4j-api- slf4j-log4j12- snappy-java-
5) There was one file I had not looked at yet. Out of curiosity I opened it and realized all the earlier digging had been wasted effort: this is the most valuable primary reference of all, a real shortcut, hidden deep enough that I almost missed it. Phew!
/usr/share/logstash/vendor/bundle/jruby//gems/logstash-input-kafka-/lib/logstash/inputs/
An excerpt:
# This input will read events from a Kafka topic. It uses the newly designed
# version of consumer API provided by Kafka to read messages from the broker.
#
# Here's a compatibility matrix that shows the Kafka client versions that are compatible with each combination
# of Logstash and the Kafka input plugin:
#
# [options="header"]
# |==========================================================
# |Kafka Client Version |Logstash Version |Plugin Version |Security Features |Why?
# | | - |< | |Legacy, is still popular
# | | - | |Basic Auth, SSL |Works with the old Ruby Event API (`event['product']['price'] = 10`)
# | | - | |Basic Auth, SSL |Works with the new getter/setter APIs (`('[product][price]', 10)`)
# | | - | |Basic Auth, SSL |Not compatible with the broker
# |==========================================================
#
# NOTE: We recommended that you use matching Kafka client and broker versions. During upgrades, you should
# upgrade brokers before clients because brokers target backwards compatibility. For example, the broker
# is compatible with both the consumer and consumer APIs, but not the other way around.
6) Upgrade the broker from kafka_- to kafka_- (the plugin bundles kafka-clients-, so I did not go with the very latest kafka_-).
Rough steps (a consolidated sketch follows after these commands):
Stop the old Kafka
/usr/local/kafka/bin/kafka-server- /usr/local/kafka/config/
Back up the old config files
Remove the old Kafka
rm -rf /usr/local/kafka/
rm -rf /data/kafkalogs/*
Install and configure the new Kafka
wget /apache/kafka//kafka_-
tar zxvf kafka_- -C /usr/local/
ln -s /usr/local/kafka_- /usr/local/kafka
A diff of the old and new config files shows only minor changes, so the old ones can be reused directly.
Start the new Kafka
/usr/local/kafka/bin/kafka-server- /usr/local/kafka/config/ &
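Putting the steps above together, here is a minimal consolidated sketch. It assumes the standard script and config file names shipped with the Kafka distribution (kafka-server-stop.sh, kafka-server-start.sh, config/server.properties) and uses placeholders for the version strings and download mirror, which are not shown above:

# stop the old broker
/usr/local/kafka/bin/kafka-server-stop.sh

# back up the whole old config directory before removing anything
cp -a /usr/local/kafka/config /root/kafka-config-backup

# remove the old installation and its data directory
rm -rf /usr/local/kafka/
rm -rf /data/kafkalogs/*

# download, unpack and symlink the new release; KAFKA_PKG, KAFKAVER and MIRROR are
# placeholders, chosen to match the kafka-clients jar bundled with the plugin
KAFKA_PKG=kafka_SCALAVER-KAFKAVER
wget http://MIRROR/apache/kafka/KAFKAVER/${KAFKA_PKG}.tgz
tar zxvf ${KAFKA_PKG}.tgz -C /usr/local/
ln -s /usr/local/${KAFKA_PKG} /usr/local/kafka

# restore the old config (the diff showed only minor changes) and start the broker
cp /root/kafka-config-backup/server.properties /usr/local/kafka/config/server.properties
/usr/local/kafka/bin/kafka-server-start.sh /usr/local/kafka/config/server.properties &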
7) Note that several key settings have to change (see the examples after the list):
config :bootstrap_servers, :validate => :string, :default => "localhost:9092"
config :group_id, :validate => :string, :default => "logstash"
config :topics, :validate => :array, :default => ["logstash"]
config :consumer_threads, :validate => :number, :default => 1
Besides the key settings above, the topic/partition metadata needs to be created again on the new Kafka; otherwise KafkaMonitor does not show the Active Topic Consumer graph, even though consumption is actually working.
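A minimal sketch of the corresponding post-upgrade pipeline configuration, using the new settings listed above (host and topic names are hypothetical; auto_offset_reset is the closest replacement I know of for the old reset_beginning behaviour):

input {
  kafka {
    bootstrap_servers => "kafka1:9092,kafka2:9092"   # replaces zk_connect: point at brokers, not ZooKeeper
    topics            => ["logstash"]                # replaces topic_id and is now an array
    group_id          => "logstash"
    consumer_threads  => 1
    auto_offset_reset => "earliest"                  # start from the beginning when no committed offset exists
  }
}

To recreate the topic metadata on the new broker (partition and replication counts are placeholders):

/usr/local/kafka/bin/kafka-topics.sh --create --zookeeper zk1:2181 \
  --replication-factor 1 --partitions 3 --topic logstash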