The download page is on the official website.
The documentation is on the official website as well; the path is doc/search.
What surprised me is that the new Elasticsearch already talks about JDK 23:
The locale database used by Elasticsearch, used to map from various date formats to the underlying date storage format, depends on the version of the JDK that Elasticsearch is running on. On JDK version 23 and above, Elasticsearch will use the CLDR database. On JDK version 22 and below, Elasticsearch will use the COMPAT database. This may mean that the strings used for textual date formats, and the output of custom week-date formats, may change when moving from a previous JDK version to JDK 23 or above. For more information, see custom date formats.
For the exact JVM and operating-system support, see the official support matrix.
From that page, the minimum supported JDK is 17.
To use your own version of Java, set the ES_JAVA_HOME environment variable to the path to your own JVM installation
mkdir -p /WORK/SOFTWARE/elasticsearch
tar -xzvf elasticsearch-8.17.0-linux-x86_64.tar.gz -C /WORK/SOFTWARE/elasticsearch
cd /WORK/SOFTWARE/elasticsearch
mv elasticsearch-8.17.0 8.17
export ELASTIC_PASSWORD="study@2025"
export ES_JAVA_HOME=$JAVA_HOME
./8.17/bin/elasticsearch
You may also run into permission problems: Elasticsearch refuses to start as root.
To make the environment variables permanent, add an elastic-search.sh under /etc/profile.d:
export ELASTIC_PASSWORD="study@2025"
export ES_JAVA_HOME=$JAVA_HOME
Permission setup
useradd elastic-search
passwd elastic-search
#set it to study@2025
#run from /WORK/SOFTWARE/elasticsearch; -R so the config/data/logs subdirectories are writable too
chown -R elastic-search 8.17
Set up the user's home directory and shell
mkdir -p /home/elastic-search
chown -R elastic-search /home/elastic-search
#check that the user's login shell is bash
grep elastic-search /etc/passwd
#if it is not:
usermod -s /bin/bash elastic-search
#reload the environment variables
source /etc/profile
Edit elasticsearch.yml under the config directory:
# allow access from other hosts
network.host: 0.0.0.0
Create a systemd service
cd /etc/systemd/system
vim elasticsearch.service
Configuration:
[Unit]
Description=Elasticsearch
Documentation=https://www.elastic.co
Wants=network-online.target
After=network-online.target

[Service]
User=elastic-search
Group=elastic-search
ExecStart=/WORK/SOFTWARE/elasticsearch/8.17/bin/elasticsearch
Restart=always
LimitMEMLOCK=infinity
# Elasticsearch needs a high open-file limit once it listens on a non-loopback address
LimitNOFILE=65535

# without an [Install] section, systemctl enable has nothing to attach to
[Install]
WantedBy=multi-user.target
sudo systemctl daemon-reload
sudo systemctl enable elasticsearch
sudo systemctl start elasticsearch
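Once the service is running, a quick sanity check from the same machine (the certificate path below assumes the certs auto-generated under config/certs; curl will prompt for the elastic user's password):
curl --cacert /WORK/SOFTWARE/elasticsearch/8.17/config/certs/http_ca.crt -u elastic https://localhost:9200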
Install the IK analyzer
First download it from its official release page.
Put the download under /WORK/DOWNLOADS and cd into that directory:
mkdir -p /WORK/SOFTWARE/elasticsearch/8.17/plugins/ik
unzip elasticsearch-analysis-ik-8.17.0.zip -d /WORK/SOFTWARE/elasticsearch/8.17/plugins/ik
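Since Elasticsearch runs as the elastic-search user, give that user ownership of the plugin directory as well (the same is done for hanlp below):
chown -R elastic-search:elastic-search /WORK/SOFTWARE/elasticsearch/8.17/plugins/ik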
Install the HanLP analyzer
First download it from its official site.
Similar to IK: after downloading, put it into the following directory:
mkdir -p /WORK/SOFTWARE/elasticsearch/8.17/plugins/hanlp
# copy the downloaded files into this directory
chown -R elastic-search:elastic-search hanlp
Note: if you get an error like the following:
fatal exception while booting Elasticsearch
java.lang.IllegalArgumentException: plugin policy [/WORK/SOFTWARE/elasticsearch/8.17/plugins/hanlp/plugin-security.policy] contains illegal permission ("java.io.FilePermission" "plugins/analysis-hanlp/hanlp.cache#plus" "read,write,delete") in global grant
then add the following in the plugin-security.policy file:
permission java.io.FilePermission "<<ALL FILES>>", "read,write,delete";
That said, since I installed version 8 here, I could not use this plugin in the end.
Install Kibana
mkdir -p /WORK/SOFTWARE/kibana
tar -xzvf kibana-8.17.0-linux-x86_64.tar.gz -C /WORK/SOFTWARE/kibana
cd /WORK/SOFTWARE/kibana
mv kibana-8.17.0 8.17
Kibana configuration (config/kibana.yml):
elasticsearch.username: "kibana"
elasticsearch.password: "study@2025"
server.port: 5601
server.host: 192.168.0.64
elasticsearch.hosts: ['https://192.168.0.64:9200']
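Because Elasticsearch serves HTTPS with an auto-generated certificate, Kibana also has to trust that CA or it cannot verify the connection. One way to do this (the path is an assumption; copy http_ca.crt over from the Elasticsearch config/certs directory first) is to add:
# CA used to verify the Elasticsearch HTTPS endpoint
elasticsearch.ssl.certificateAuthorities: ["/WORK/SOFTWARE/kibana/8.17/config/http_ca.crt"]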
Service configuration
cd /etc/systemd/system
vim kibana.service
[Unit]
Description=Kibana
Documentation=https://www.elastic.co
Wants=network-online.target
After=network-online.target

[Service]
User=elastic-search
Group=elastic-search
ExecStart=/WORK/SOFTWARE/kibana/8.17/bin/kibana
Restart=always
LimitMEMLOCK=infinity

[Install]
WantedBy=multi-user.target
sudo systemctl daemon-reload
sudo systemctl enable kibana
sudo systemctl start kibana
Resetting the elastic password
# reset the password of the elastic user (a new one is generated and printed)
bin/elasticsearch-reset-password -u elastic
# set a password interactively for another user
bin/elasticsearch-reset-password --username user1 -i
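Note that the kibana.yml above logs Kibana in to Elasticsearch with a built-in user, so that user's password must be set to the same value. One way, using the kibana_system built-in user (if you use it, also set elasticsearch.username: "kibana_system" in kibana.yml):
# interactively set the password of the built-in user Kibana uses
bin/elasticsearch-reset-password -u kibana_system -i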
Since this post only aims at getting started, it will not go deep into the details of Elasticsearch; the goal is simply that you can start using it.
For the details, finding a video on Bilibili and then reading the official documentation works far better than reading blog posts.
An index in Elasticsearch is roughly the equivalent of a table in MySQL (from here on we will call tables indices).
An empty index can be created with the following command; in this example, a product index (item):
PUT /item
Then create a concrete index together with its fields (the mapping); here, item_cellphone:
PUT /item_cellphone
PUT item_cellphone/_mapping
{
"properties":{
"name": {
"type": "text",
"analyzer": "ik_max_word",
"search_analyzer": "ik_smart"
},
"specification": {
"type": "text",
"analyzer": "ik_smart",
"search_analyzer": "ik_smart"
},
"brand": {
"type": "text",
"analyzer": "ik_smart",
"search_analyzer": "ik_smart"
},
"price": {
"type": "integer"
},
"tag":{
"type": "keyword"
}
}
}
PUT item_cellphone/_alias/item
Then create an item_clothes index in the same way; a sketch is given below.
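A minimal sketch of what that could look like (the field set here is an assumption that simply mirrors item_cellphone; adjust it to your real data):
PUT /item_clothes
PUT item_clothes/_mapping
{
  "properties": {
    "name": {
      "type": "text",
      "analyzer": "ik_max_word",
      "search_analyzer": "ik_smart"
    },
    "brand": {
      "type": "text",
      "analyzer": "ik_smart",
      "search_analyzer": "ik_smart"
    },
    "price": {
      "type": "integer"
    },
    "tag": {
      "type": "keyword"
    }
  }
}
PUT item_clothes/_alias/item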
Note that item is an alias covering both indices, so at query time we simply query item. (If you actually ran PUT /item earlier as a standalone index, delete it first with DELETE /item; an alias cannot share its name with an existing index.)
Worth noting: in Elasticsearch any field may be empty, hold a single value, or hold an array; the only constraint is that all elements of an array must have the same type.
For fields like tag that you do not want analyzed (tokenized) but do want to match exactly, use the keyword type.
Use GET to inspect an index:
GET /item
A doc in Elasticsearch corresponds to a row of a MySQL table.
Data can be inserted with either of the following request shapes:
POST /[index-name]/_doc
{
  ... // JSON document; the id is generated automatically
}
PUT /[index-name]/_doc/{id}
{
  ... // JSON document; the id is given explicitly
}
The actual sample data is in the code repository (linked later).
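Purely for illustration (the values below are made up), inserting one phone could look like this; note that tag holds an array, which, as mentioned above, is fine as long as all elements have the same type:
POST /item_cellphone/_doc
{
  "name": "某品牌 5G 手机",
  "specification": "运行内存 8G,存储 256G",
  "brand": "某品牌",
  "price": 3999,
  "tag": ["黑色", "5G"]
}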
### 4.3.1 Query all documents
GET /item/_search
{
  "query": {
    "match_all": {}
  },
  "_source": ["name", "specification", "price"],
  "track_scores": false,
  "sort": [
    { "price": "desc" }
  ],
  "from": 5,
  "size": 10
}
A quick explanation: _source limits which fields are returned. "track_scores": false disables score calculation (if you are not doing full-text relevance matching, skipping scoring saves work). sort orders the results, and from / size say which hit to start at and how many hits to return.
### 4.3.2 Exact queries
GET /item/_search
{
"query": {
"bool": {
"filter": [
{"term": {"name": "裙"}},
{"term": {"tag": "白色"}}
]
}
}
}
This example finds white skirts: documents whose tag is 白色 and whose name matches 裙.
term is an exact match. For a text field the value is looked up in the inverted index, so it has to equal one of the analyzed tokens; for a keyword field the whole value must match exactly.
Another common query is the range query:
GET /item/_search
{
"query": {
"range": {
"price": {
"gte": 5000,
"lte": 8000
}
}
}
}
### 4.3.3 Full-text search
Full-text search analyzes (tokenizes) the query string before matching:
GET /item/_search
{
"query": {
"multi_match": {
"query": "手机 运行内存 8G",
"fields": ["name","specification"]
}
}
}
You can check how an analyzer tokenizes a string with the _analyze API:
GET _analyze
{
"analyzer": "ik_smart",
"text": "手机 运行内存 8G"
}
From a first try, an untuned Elasticsearch using only the IK analyzer may not reach ideal results.
### 4.3.4 Compound conditions
GET /item/_search
{
  "query": {
    "bool": {
      "should": [
        {
          "match_phrase": {
            "specification": "运行内存 16GB"
          }
        },
        {
          "match": {
            "name": "手机 5G"
          }
        },
        {
          "term": {
            "tag": {
              "value": "黑色"
            }
          }
        }
      ],
      "minimum_should_match": 2
    }
  }
}
Adding the explain parameter to a search shows how the score was computed.
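For example, asking for the scoring details of the full-text query from above:
GET /item/_search
{
  "explain": true,
  "query": {
    "match": {
      "name": "手机 5G"
    }
  }
}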
Usually we want to drive Elasticsearch from code through a finer-grained API; for that, consult the official client documentation. The Maven dependencies used here:
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.17.0</version>
</dependency>
<dependency>
    <groupId>indi.zhifa.core</groupId>
    <artifactId>common</artifactId>
    <version>1.1.20</version>
</dependency>
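The Elasticsearch Java API client itself is also required (the version should match the server; 8.17.0 here is an assumption):
<dependency>
    <groupId>co.elastic.clients</groupId>
    <artifactId>elasticsearch-java</artifactId>
    <version>8.17.0</version>
</dependency>
With the dependencies in place, a small provider class builds the client: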
@RequiredArgsConstructor
@Data
public class ElasticSearchClientProvider {
private final String host;
private final Integer port;
private final String apiKey;
private final ResourceLoader resourceLoader;
private RestClientBuilder builder;
@PostConstruct
public void init() throws IOException {
SSLContext sslContext = TransportUtils.sslContextFromHttpCaCrt(resourceLoader.getResource(ResourceLoader.CLASSPATH_URL_PREFIX+"http_ca.crt").getFile());
builder = RestClient.builder(
new HttpHost(host, port, "https") // replace with your Elasticsearch address
).setDefaultHeaders(new Header[]{
new BasicHeader("Authorization", "ApiKey " + apiKey)
})
.setFailureListener(new RestClient.FailureListener(){
@Override
public void onFailure(Node node) {
super.onFailure(node);
}
}).setHttpClientConfigCallback(hc->
hc.setSSLContext(sslContext)
);
}
public ElasticsearchClient get(){
RestClient restClient = builder.build();
ElasticsearchTransport transport = new RestClientTransport(
restClient, new JacksonJsonpMapper());
ElasticsearchClient esClient = new ElasticsearchClient(transport);
return esClient;
}
}
@ConfigurationProperties(prefix = "el-search")
@Configuration
public class EsClientConfig {
@Setter
@Getter
private String host;
@Setter
@Getter
private Integer port;
@Setter
@Getter
private String apiKey;
@Bean
public ElasticSearchClientProvider elasticSearchClientProvider(ResourceLoader pResourceLoader){
ElasticSearchClientProvider elasticSearchClientProvider = new ElasticSearchClientProvider(host, port,apiKey,pResourceLoader);
return elasticSearchClientProvider;
}
}
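A minimal application.yml sketch that matches the el-search prefix above (the values are placeholders; Spring's relaxed binding maps api-key onto the apiKey field):
el-search:
  host: 192.168.0.64
  port: 9200
  api-key: <your-api-key>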
Note that Elasticsearch 8 enables TLS by default, so it must be accessed over HTTPS.
The HTTP CA certificate (http_ca.crt) is generated under the config/certs directory of the Elasticsearch installation; copy it into the project's resources directory.
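The client above authenticates with an API key. One way to create one is the security API, for example from Kibana Dev Tools (the key name is arbitrary); the encoded field of the response is the value to put into the api-key property, i.e. the Authorization: ApiKey header:
POST /_security/api_key
{
  "name": "study-api-key"
}
A quick test controller: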
@Slf4j
@Validated
@RequiredArgsConstructor
@RequestMapping(value = "/api/test/el-test")
@ZhiFaRestController
@Tag(name = "el-测试")
public class EsTestController {
private final ElasticSearchClientProvider mElasticSearchClientProvider;
@GetMapping("/simpleTest")
public List<ItemEntity> elTest() throws IOException {
ElasticsearchClient elasticsearchClient = mElasticSearchClientProvider.get();
Reader queryJson = new StringReader(
"""
{
"query": {
"bool": {
"filter": [
{"term": {"name": "裙"}},
{"term": {"tag": "白色"}}
]
}
}
}
"""
);
SearchRequest searchRequest = SearchRequest.of(i->i.index("item").withJson(queryJson));
SearchResponse<ItemEntity> response = elasticsearchClient.search(searchRequest, ItemEntity.class);
List<ItemEntity> rtn = response.hits().hits().stream().map(hit->hit.source()).toList();
elasticsearchClient.close();
return rtn;
}
}