/**
 * System environment: CentOS 7.2 running in vm12
 * Installed version: elasticsearch-2.4.0.tar.gz
 */
Elasticsearch ships with a number of built-in analyzers (standard, english, chinese, and so on), but none of them segment Chinese text well, so we use the ik analyzer instead.
Installing the ik analyzer
Download: https://github.com/medcl/elasticsearch-analysis-ik/releases
Version compatibility table: https://github.com/medcl/elasticsearch-analysis-ik
Stop Elasticsearch, unzip the downloaded archive, create a folder named ik under the plugins directory of the ES installation, and copy all of the extracted files into ik.
unzip elasticsearch-analysis-ik-1.10.0.zip
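The unpack-and-copy step above can be sketched as follows. This is a minimal simulation in temporary directories; EXTRACTED and ES_HOME are stand-ins for the real unpack directory and Elasticsearch install directory, and the touched file names are examples.

```shell
# Sketch of the "copy into plugins/ik" step, simulated in temp dirs.
EXTRACTED=$(mktemp -d)   # stands in for the directory the zip unpacked into
ES_HOME=$(mktemp -d)     # stands in for the Elasticsearch install dir
touch "$EXTRACTED/elasticsearch-analysis-ik-1.10.0.jar" \
      "$EXTRACTED/plugin-descriptor.properties"

mkdir -p "$ES_HOME/plugins/ik"            # new folder named ik under plugins/
cp "$EXTRACTED"/* "$ES_HOME/plugins/ik/"  # copy everything the zip contained
ls "$ES_HOME/plugins/ik"
```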
Make sure the version recorded in plugin-descriptor.properties matches the installed Elasticsearch version; otherwise startup throws an exception.
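One quick way to do that check is to grep the elasticsearch.version line out of the descriptor. A mock plugin-descriptor.properties is created here so the sketch is self-contained; on a real install, point DESC at plugins/ik/plugin-descriptor.properties instead.

```shell
# Sketch: compare the plugin's elasticsearch.version with the installed ES version.
DESC=$(mktemp)   # mock descriptor for demonstration; use plugins/ik/plugin-descriptor.properties
printf 'version=1.10.0\nelasticsearch.version=2.4.0\n' > "$DESC"

ES_VERSION=2.4.0   # the version installed from elasticsearch-2.4.0.tar.gz
PLUGIN_ES_VERSION=$(grep '^elasticsearch.version=' "$DESC" | cut -d= -f2)

if [ "$PLUGIN_ES_VERSION" = "$ES_VERSION" ]; then
  echo "version OK: $PLUGIN_ES_VERSION"
else
  echo "version mismatch: plugin built for $PLUGIN_ES_VERSION, ES is $ES_VERSION" >&2
fi
rm -f "$DESC"
```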
Add the ik settings to elasticsearch.yml:
index.analysis.analyzer.ik.type: "ik"
Or, equivalently:
index:
  analysis:
    analyzer:
      ik:
        alias: [ik_analyzer]
        type: org.elasticsearch.index.analysis.IkAnalyzerProvider
      ik_max_word:
        type: ik
        use_smart: false
      ik_smart:
        type: ik
        use_smart: true
Restart Elasticsearch.
Note: do not leave the zip archive at the same level as the ik directory (i.e. inside plugins/), or startup fails with:
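After the restart, one way to confirm the plugin was actually picked up is the _cat/plugins endpoint. This assumes ES 2.4 is listening on localhost:9200; the command prints a fallback message when it is not.

```shell
# Assumes ES on localhost:9200; a loaded ik plugin shows up as a row
# such as: <node-name> analysis-ik 1.10.0
OUT=$(curl -s 'http://localhost:9200/_cat/plugins?v' || echo '(Elasticsearch not reachable)')
echo "$OUT"
```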
Exception in thread "main" java.lang.IllegalStateException: Could not load plugin descriptor for existing plugin [elasticsearch-analysis-ik-1.10..zip]. Was the plugin built before 2.0?
Likely root cause: java.nio.file.FileSystemException: /usr/work/elasticsearch/elasticsearch-2.4./plugins/elasticsearch-analysis-ik-1.10..zip/plugin-descriptor.properties: 不是目录 (Not a directory)
at sun.nio.fs.UnixException.translateToIOException(UnixException.java:)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:)
at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:)
at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:)
at java.nio.file.Files.newByteChannel(Files.java:)
at java.nio.file.Files.newByteChannel(Files.java:)
at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:)
at java.nio.file.Files.newInputStream(Files.java:)
at org.elasticsearch.plugins.PluginInfo.readFromProperties(PluginInfo.java:)
at org.elasticsearch.plugins.PluginsService.getPluginBundles(PluginsService.java:)
at org.elasticsearch.plugins.PluginsService.<init>(PluginsService.java:)
at org.elasticsearch.node.Node.<init>(Node.java:)
at org.elasticsearch.node.Node.<init>(Node.java:)
at org.elasticsearch.node.NodeBuilder.build(NodeBuilder.java:)
at org.elasticsearch.bootstrap.Bootstrap.setup(Bootstrap.java:)
at org.elasticsearch.bootstrap.Bootstrap.init(Bootstrap.java:)
at org.elasticsearch.bootstrap.Elasticsearch.main(Elasticsearch.java:)
Refer to the log for complete error details.
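The fix is simply to move (or delete) the stray zip so that plugins/ contains only plugin directories. Simulated here in a temporary directory standing in for the real plugins/ path:

```shell
# Simulated fix: plugins/ must contain only plugin directories, so move the zip out.
PLUGINS=$(mktemp -d)   # stands in for .../elasticsearch-2.4.0/plugins
mkdir -p "$PLUGINS/ik"
touch "$PLUGINS/elasticsearch-analysis-ik-1.10.0.zip"   # the stray archive

mv "$PLUGINS"/*.zip "$PLUGINS/.."   # anywhere outside plugins/ is fine
LEFT=$(ls "$PLUGINS")
echo "plugins/ now contains: $LEFT"
```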
Testing:
First, create an index configured with the ik analyzer:
curl -XPUT 'localhost:9200/local' -d '{
  "settings" : {
    "analysis" : {
      "analyzer" : {
        "ik" : {
          "tokenizer" : "ik"
        }
      }
    }
  },
  "mappings" : {
    "article" : {
      "dynamic" : true,
      "properties" : {
        "title" : {
          "type" : "string",
          "analyzer" : "ik"
        }
      }
    }
  }
}'
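To double-check that the local index really picked up the ik analyzer, you can read its settings back. This assumes the index was created as above on localhost:9200; a fallback message is printed when ES is not reachable.

```shell
# Assumes the "local" index created above; the response should contain
# index.analysis.analyzer.ik.tokenizer = ik
SETTINGS=$(curl -s 'http://localhost:9200/local/_settings?pretty=true' || echo '(Elasticsearch not reachable)')
echo "$SETTINGS"
```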
Then run an analysis request (note that on ES 2.x the _analyze API treats the request body as plain text, which is why "text" itself appears as a token in the output below):
curl 'http://localhost:9200/index/_analyze?analyzer=ik&pretty=true' -d'
{
  "text":"*国歌"
}
'
{
  "tokens" : [ {
    "token" : "text",
    "start_offset" : ,
    "end_offset" : ,
    "type" : "ENGLISH",
    "position" :
  }, {
    "token" : "*",
    "start_offset" : ,
    "end_offset" : ,
    "type" : "CN_WORD",
    "position" :
  }, {
    "token" : "国歌",
    "start_offset" : ,
    "end_offset" : ,
    "type" : "CN_WORD",
    "position" :
  } ]
}
To get the finest-grained segmentation, configure elasticsearch.yml as follows:
index:
  analysis:
    analyzer:
      ik:
        alias: [ik_analyzer]
        type: org.elasticsearch.index.analysis.IkAnalyzerProvider
      ik_smart:
        type: ik
        use_smart: true
      ik_max_word:
        type: ik
        use_smart: false
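With both analyzers configured, you can compare the two granularities side by side. This assumes ES 2.4 with ik on localhost:9200 and the ik_smart / ik_max_word analyzers configured as above; the sample phrase is arbitrary, and a fallback message is printed when ES is not reachable.

```shell
# Compare coarse (ik_smart) vs fine-grained (ik_max_word) segmentation.
# On ES 2.x the _analyze body is plain text.
for ANALYZER in ik_smart ik_max_word; do
  echo "== $ANALYZER =="
  curl -s "http://localhost:9200/index/_analyze?analyzer=$ANALYZER&pretty=true" \
       -d '中文分词测试' || echo '(Elasticsearch not reachable)'
done
```

ik_max_word should emit every overlapping word it can find, while ik_smart emits the single most likely segmentation.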