I have been using Elasticsearch for a long time, so here is a summary of how to modify mappings. It comes down to two cases: adding a new field to an existing mapping, and changing the type of an existing field.
First create a test index. The command below shows only the skeleton of the -d body; substitute the full request body that follows for the empty settings/mappings placeholders:
curl -XPUT 'http://127.0.0.1:9200/test?pretty' -H 'Content-Type: application/json' -d '{"settings":{},"mappings":{}}'
{
  "settings": {
    "index": {
      "number_of_shards": 2,
      "number_of_replicas": 3,
      "analysis": {
        "analyzer": {
          "char_analyzer": {
            "filter": ["lowercase"],
            "type": "custom",
            "tokenizer": "char_split"
          }
        },
        "tokenizer": {
          "char_split": {
            "token_chars": ["letter", "digit", "punctuation", "symbol"],
            "min_gram": "1",
            "type": "nGram",
            "max_gram": "1"
          }
        }
      }
    }
  },
  "mappings": {
    "doc": {
      "properties": {
        "id": {
          "type": "long"
        },
        "pd_name": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          },
          "analyzer": "char_analyzer"
        },
        "pd_uname": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "product_id": {
          "type": "long"
        }
      }
    }
  }
}
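A quick way to confirm that char_analyzer really emits one lowercase token per character is the _analyze API; the sample text "Ab12" below is just an illustration:
curl -XPOST 'http://127.0.0.1:9200/test/_analyze?pretty' -H 'Content-Type: application/json' -d '{"analyzer":"char_analyzer","text":"Ab12"}'
It should return the tokens a, b, 1 and 2. The mapping that was created can then be checked: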
curl -XGET 'http://127.0.0.1:9200/test/_mappings?pretty'
{
  "test" : {
    "mappings" : {
      "doc" : {
        "properties" : {
          "id" : {
            "type" : "long"
          },
          "pd_name" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            },
            "analyzer" : "char_analyzer"
          },
          "pd_uname" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "product_id" : {
            "type" : "long"
          }
        }
      }
    }
  }
}
Now add a new_stocks field with the put-mapping API, as follows:
curl -XPUT 'http://127.0.0.1:9200/test/doc/_mapping?pretty' -H 'Content-Type: application/json' -d '{"properties":{"new_stocks":{"type":"nested","properties":{"value":{"type":"long"},"conversion":{"type":"long"}}}}}'
{
"acknowledged" : true
}
Query the mapping again:
curl -XGET 'http://127.0.0.1:9200/test/_mappings?pretty'
{
  "test" : {
    "mappings" : {
      "doc" : {
        "properties" : {
          "id" : {
            "type" : "long"
          },
          "new_stocks" : {
            "type" : "nested",
            "properties" : {
              "conversion" : {
                "type" : "long"
              },
              "value" : {
                "type" : "long"
              }
            }
          },
          "pd_name" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            },
            "analyzer" : "char_analyzer"
          },
          "pd_uname" : {
            "type" : "text",
            "fields" : {
              "keyword" : {
                "type" : "keyword",
                "ignore_above" : 256
              }
            }
          },
          "product_id" : {
            "type" : "long"
          }
        }
      }
    }
  }
}
You can see that the new_stocks field has been added.
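As a quick sanity check on the nested type, you could index a sample document and hit it with a nested query; the document id 1 and the values below are made up for illustration:
curl -XPUT 'http://127.0.0.1:9200/test/doc/1?pretty' -H 'Content-Type: application/json' -d '{"id":1,"pd_name":"demo","new_stocks":[{"value":10,"conversion":2}]}'
curl -XPOST 'http://127.0.0.1:9200/test/doc/_search?pretty' -H 'Content-Type: application/json' -d '{"query":{"nested":{"path":"new_stocks","query":{"range":{"new_stocks.value":{"gte":5}}}}}}'
The query has to reach the inner fields through the new_stocks path, which is the main difference from querying an ordinary object field.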
What if you want to change the type of the product_id field from long to text? The in-place approach used above no longer works; you have to rebuild the index with the _reindex API.
Create a new index new_test, this time with product_id mapped as text (again, substitute the full body shown below into -d):
curl -XPUT 'http://127.0.0.1:9200/new_test?pretty' -H 'Content-Type: application/json' -d '{"settings":{},"mappings":{}}'
{
  "settings": {
    "index": {
      "number_of_shards": 2,
      "number_of_replicas": 3,
      "analysis": {
        "analyzer": {
          "char_analyzer": {
            "filter": ["lowercase"],
            "type": "custom",
            "tokenizer": "char_split"
          }
        },
        "tokenizer": {
          "char_split": {
            "token_chars": ["letter", "digit", "punctuation", "symbol"],
            "min_gram": "1",
            "type": "nGram",
            "max_gram": "1"
          }
        }
      }
    }
  },
  "mappings": {
    "doc": {
      "properties": {
        "id": {
          "type": "long"
        },
        "pd_name": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          },
          "analyzer": "char_analyzer"
        },
        "pd_uname": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "product_id": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      }
    }
  }
}
curl -XPOST 'http://127.0.0.1:9200/_reindex?pretty' -H 'Content-Type: application/json' -d '{"source":{"index":"test"},"dest":{"index":"new_test"}}'
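Before dropping the old index it is worth checking that the two indices hold the same number of documents, for example with _count:
curl -XGET 'http://127.0.0.1:9200/test/_count?pretty'
curl -XGET 'http://127.0.0.1:9200/new_test/_count?pretty'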
# Delete the old index
curl -XDELETE 'http://127.0.0.1:9200/test?pretty'
# Add an alias so the new index can still be reached under the old name
curl -XPOST 'http://127.0.0.1:9200/_aliases?pretty' -H 'Content-Type: application/json' -d '{"actions":[{"add":{"index":"new_test","alias":"test"}}]}'
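To confirm the alias took effect you can list it, or simply query the old name again; for example:
curl -XGET 'http://127.0.0.1:9200/_cat/aliases?v'
curl -XGET 'http://127.0.0.1:9200/test/_mappings?pretty'
The second call now resolves through the alias to new_test, where product_id is text. The _reindex API also supports a few other useful options: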
# Copy only selected fields from the source with the _source parameter
curl -XPOST 'http://127.0.0.1:9200/_reindex?pretty' -H 'Content-Type: application/json' -d '{"source":{"index":"test","_source":["id","pd_name"]},"dest":{"index":"new_test"}}'
# Tune the scroll batch size with source.size; fields can also be dropped on the destination side with a script (see the sketch after this command)
curl -XPOST 'http://127.0.0.1:9200/_reindex?pretty' -H 'Content-Type: application/json' -d '{"source":{"index":"test","size":5000},"dest":{"index":"new_test"}}'
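A sketch of the script-controlled variant: a Painless script in the reindex body can remove a field before it is written to the destination. The ctx._source.remove call is standard Painless; removing product_id here is only an example:
curl -XPOST 'http://127.0.0.1:9200/_reindex?pretty' -H 'Content-Type: application/json' -d '{"source":{"index":"test"},"dest":{"index":"new_test"},"script":{"lang":"painless","source":"ctx._source.remove(\"product_id\")"}}'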