Setting up IK Chinese word segmentation in Elasticsearch

Environment used for this walkthrough:

  OS: Windows XP

  Elasticsearch version: 1.0.3

  IK version: 1.2.3

The files above are attached to this post for download.

1. Place the downloaded elasticsearch-analysis-ik-1.2.3.jar under ES_HOME\plugins\analysis-ik\;

2. Extract ik.rar into ES_HOME\config\.

3. Edit ES_HOME\config\elasticsearch.yml and append the following at the end:

  index:
    analysis:
      analyzer:
        ik:
          alias: [ik_analyzer]
          type: org.elasticsearch.index.analysis.IkAnalyzerProvider
        ik_max_word:
          type: ik
          use_smart: false
        ik_smart:
          type: ik
          use_smart: true
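The fragment above registers three analyzers node-wide: ik (aliased ik_analyzer), plus ik_max_word and ik_smart, which differ only in the use_smart flag. As a sketch (assuming Elasticsearch 1.x's create-index settings API and a hypothetical index name "articles"), the same analyzer definitions could instead be supplied per index; the snippet below only builds and prints the JSON settings body for such a request:

```python
import json

# Hypothetical per-index settings body carrying the same analyzer
# definitions as the elasticsearch.yml fragment above.
settings = {
    "settings": {
        "index": {
            "analysis": {
                "analyzer": {
                    "ik": {
                        "alias": ["ik_analyzer"],
                        "type": "org.elasticsearch.index.analysis.IkAnalyzerProvider",
                    },
                    # Fine-grained segmentation: emit as many terms as possible.
                    "ik_max_word": {"type": "ik", "use_smart": False},
                    # Coarse-grained segmentation: emit the smart, minimal split.
                    "ik_smart": {"type": "ik", "use_smart": True},
                }
            }
        }
    }
}

# This JSON would be the body of: PUT http://localhost:9200/articles
print(json.dumps(settings, indent=2))
```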

4. Restart Elasticsearch.

5. Test: localhost:9200/<index-name>/_analyze?analyzer=ik&text=我的第一个中文分词
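Note that when the test URL is sent with a tool like curl rather than a browser, the Chinese characters in the query string must be percent-encoded. A minimal sketch (the host, index name, and helper function are illustrative, not part of any Elasticsearch client API):

```python
from urllib.parse import quote

def build_analyze_url(host, index, analyzer, text):
    """Build an _analyze URL, percent-encoding the text parameter
    so non-ASCII input survives the query string."""
    return "http://{}/{}/_analyze?analyzer={}&text={}".format(
        host, index, analyzer, quote(text))

url = build_analyze_url("localhost:9200", "myindex", "ik", "我的第一个中文分词")
print(url)
```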

{
  "tokens": [
    {"token": "我", "start_offset": 0, "end_offset": 1, "type": "CN_CHAR", "position": 1},
    {"token": "第一个", "start_offset": 2, "end_offset": 5, "type": "CN_WORD", "position": 2},
    {"token": "eslasticsearch", "start_offset": 5, "end_offset": 19, "type": "ENGLISH", "position": 3},
    {"token": "ik", "start_offset": 20, "end_offset": 22, "type": "ENGLISH", "position": 4},
    {"token": "中文", "start_offset": 22, "end_offset": 24, "type": "CN_WORD", "position": 5},
    {"token": "分词", "start_offset": 24, "end_offset": 26, "type": "CN_WORD", "position": 6}
  ]
}
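To consume such a response programmatically, the tokens array can be parsed with any JSON library. A minimal Python sketch over the sample response above (the response string is copied verbatim from that output):

```python
import json

# Sample _analyze response, copied from the test output above.
response = ('{"tokens":['
            '{"token":"我","start_offset":0,"end_offset":1,"type":"CN_CHAR","position":1},'
            '{"token":"第一个","start_offset":2,"end_offset":5,"type":"CN_WORD","position":2},'
            '{"token":"eslasticsearch","start_offset":5,"end_offset":19,"type":"ENGLISH","position":3},'
            '{"token":"ik","start_offset":20,"end_offset":22,"type":"ENGLISH","position":4},'
            '{"token":"中文","start_offset":22,"end_offset":24,"type":"CN_WORD","position":5},'
            '{"token":"分词","start_offset":24,"end_offset":26,"type":"CN_WORD","position":6}]}')

# Extract just the token strings from the response.
tokens = [t["token"] for t in json.loads(response)["tokens"]]
print(tokens)
```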

Done.
