Build an Elasticsearch 7.4.2 Docker image with the ik analyzer

Create a Dockerfile:

FROM elasticsearch:7.4.2
RUN cd /usr/share/elasticsearch && sh -c '/bin/echo -e "y" | elasticsearch-plugin install https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v7.4.2/elasticsearch-analysis-ik-7.4.2.zip'
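
Piping "y" into elasticsearch-plugin answers the security-permission prompt during the build. As an alternative (a minimal sketch, not the image built above), the plugin CLI's --batch flag auto-confirms that prompt and keeps the Dockerfile slightly simpler:

FROM elasticsearch:7.4.2
# --batch skips the interactive "additional permissions" confirmation
RUN elasticsearch-plugin install --batch https://github.com/medcl/elasticsearch-analysis-ik/releases/download/v7.4.2/elasticsearch-analysis-ik-7.4.2.zip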

In the same directory as the Dockerfile, run the image build command:

docker build -t es_ik:7.4.2 .
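
Once the build completes, a quick sanity check that the tagged image is available locally (the exact IMAGE ID and size will differ on your machine):

docker images es_ik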

Start the container:

docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" es_ik:7.4.2
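
Before testing the analyzer itself, you can confirm that the ik plugin actually loaded into the node. The _cat/plugins endpoint lists the plugins installed on each node; you should see an analysis-ik entry at version 7.4.2 (the node name will vary):

curl "http://127.0.0.1:9200/_cat/plugins?v"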

Verify that the analyzer works:

goby@/data/es-node2$ curl -XPOST "http://127.0.0.1:9200/_analyze?pretty" -H 'Content-Type: application/json' -d '{"text":"Kobe是一个伟大的篮球运动员","tokenizer":"ik_smart"}'
{
  "tokens" : [
    {
      "token" : "kobe",
      "start_offset" : 0,
      "end_offset" : 4,
      "type" : "ENGLISH",
      "position" : 0
    },
    {
      "token" : "是",
      "start_offset" : 4,
      "end_offset" : 5,
      "type" : "CN_CHAR",
      "position" : 1
    },
    {
      "token" : "一个",
      "start_offset" : 5,
      "end_offset" : 7,
      "type" : "CN_WORD",
      "position" : 2
    },
    {
      "token" : "伟大",
      "start_offset" : 7,
      "end_offset" : 9,
      "type" : "CN_WORD",
      "position" : 3
    },
    {
      "token" : "的",
      "start_offset" : 9,
      "end_offset" : 10,
      "type" : "CN_CHAR",
      "position" : 4
    },
    {
      "token" : "篮球",
      "start_offset" : 10,
      "end_offset" : 12,
      "type" : "CN_WORD",
      "position" : 5
    },
    {
      "token" : "运动员",
      "start_offset" : 12,
      "end_offset" : 15,
      "type" : "CN_WORD",
      "position" : 6
    }
  ]
}
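
In day-to-day use the analyzer is normally wired into an index mapping rather than called ad hoc through _analyze. Below is a sketch (the index name my_index and the field content are made-up examples) using the commonly seen combination of ik_max_word for indexing and ik_smart for search:

curl -XPUT "http://127.0.0.1:9200/my_index" -H 'Content-Type: application/json' -d '
{
  "mappings": {
    "properties": {
      "content": {
        "type": "text",
        "analyzer": "ik_max_word",
        "search_analyzer": "ik_smart"
      }
    }
  }
}'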
