List all indices (analogous to listing all the databases)
GET /_cat/indices?v
The health status comes in three colors
Green: the cluster is fully usable (all primary and replica shards are allocated)
Yellow: usable on a single node, but not all replicas are allocated across the cluster
Red: some primary shards are missing, so the affected data cannot be used at all
Command to check the cluster's heartbeat / health:
GET /_cat/health
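The same status is also available as structured JSON through the cluster health API, whose response includes a status field of green / yellow / red:
GET /_cluster/health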
Create an index
PUT /goods
{
"settings": {
// number of replicas
"number_of_replicas": 1,
// number of shards
"number_of_shards": 5
}
}
Delete an index
DELETE /goods
Update a document (partial update)
POST /goods/_doc/1/_update
{
"doc": {"price":100 }
}
Create a document (the ID is generated automatically)
POST /goods/_doc
{
"title": "你好世界"
}
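To check what was written, a single document can be fetched by its ID (a sketch, assuming a document with ID 1 exists):
GET /goods/_doc/1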
Query (search)
GET /goods/_search
// search for documents whose title matches xiaomi9
GET /goods/_search
{
"query": {
"match": {
"title": "xiaomi9"
}
}
}
// sorting
GET /goods/_search
{
"query": {
"match_all": {}
},
"sort": [
{
"_id": {
"order": "desc"
}
}
]
}
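Pagination can be combined with any query by adding from and size to the search body (a minimal sketch: page size 10, second page):
GET /goods/_search
{
  "query": {
    "match_all": {}
  },
  "from": 10,
  "size": 10
}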
The ik analyzer has two modes
ik_smart: coarsest-grained segmentation
ik_max_word: finest-grained segmentation
They are really just two word-segmentation algorithms; the difference is easiest to see by simply testing both:
GET /_analyze
{
"analyzer": "ik_smart",
"text":"程序员"
}
GET /_analyze
{
"analyzer": "ik_max_word",
"text":"程序员"
}
As you can see, ik_max_word splits the text into more tokens than ik_smart, which is why they are described as finest-grained and coarsest-grained.
Operating ES from PHP
Official docs: www.elastic.co/guide/en/el…
Create an index
// load the Composer autoloader for the elasticsearch/elasticsearch client
require __DIR__ . '/vendor/autoload.php';

$hosts = [
'127.0.0.1:9200'
];
$client = \Elasticsearch\ClientBuilder::create()->setHosts($hosts)->build();
// create the index (ES 6.x style: the mapping is nested under the type name _doc)
$params = [
'index' => 'goods',
'body' => [
'settings' => [
'number_of_shards' => 5,
'number_of_replicas' => 1
],
'mappings' => [
'_doc' => [ // mapping type name (ES 6.x; rejected by 7.x, see the fix below)
'_source' => [
'enabled' => true
],
'properties' => [
'title' => [
'type' => 'keyword' // exact-match field (not analyzed)
],
'desn' => [
'type' => 'text', // full-text field
'analyzer' => 'ik_max_word', // requires the ik analysis plugin to be installed
'search_analyzer' => 'ik_max_word'
]
]
]
]
]
];
$response = $client->indices()->create($params);
On Elasticsearch 7.15.2 the call above fails with the following error
{"error_code":1000,"data":null,"msg":"{"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"The mapping definition cannot be nested under a type [_doc] unless include_type_name is set to true."}],"type":"illegal_argument_exception","reason":"The mapping definition cannot be nested under a type [_doc] unless include_type_name is set to true."},"status":400}"}
Solution
$hosts = [
'127.0.0.1:9200'
];
$client = \Elasticsearch\ClientBuilder::create()->setHosts($hosts)->build();
// create the index (7.x style: no type name nested under mappings)
$params = [
'index' => 'article',
'body' => [
'settings' => [
'number_of_shards' => 5,
'number_of_replicas' => 1
],
'mappings' => [
'_source' => [
'enabled' => true
],
'properties' => [
'title' => [
'type' => 'text', // roughly equivalent to a LIKE search in MySQL
'analyzer' => 'ik_max_word',
'search_analyzer' => 'ik_max_word'
]
]
]
]
];
$response = $client->indices()->create($params);
Write (index) a document — creates it, or overwrites an existing ID
$hosts = [
'127.0.0.1:9200',
];
$client = \Elasticsearch\ClientBuilder::create()->setHosts($hosts)->build();
// index the document; $model is assumed to be an ORM model carrying id/title/desn
$params = [
'index' => 'goods',
'type' => '_doc',
'id' => $model->id,
'body' => [
'title' => $model->title,
'desn' => $model->desn,
],
];
$response = $client->index($params);
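For a partial update (the PHP counterpart of the _update console call shown earlier), the client's update() method takes only the changed fields under a doc key. A minimal sketch, assuming the document with this ID already exists:

$params = [
    'index' => 'goods',
    'type' => '_doc',
    'id' => $model->id,
    'body' => [
        'doc' => ['price' => 100] // only the fields being changed
    ]
];
$response = $client->update($params);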
Search
$hosts = [
'127.0.0.1:9200',
];
$client = \Elasticsearch\ClientBuilder::create()->setHosts($hosts)->build();
$params = [
'index' => 'goods',
'type' => '_doc',
'body' => [
'query' => [
'match' => [
'title'=>[
'query' => '手机'
]
]
]
]
];
$results = $client->search($params);
dump($results);
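The matching documents come back under hits.hits in the response array; a small sketch of reading each hit's original source:

foreach ($results['hits']['hits'] as $hit) {
    // _source holds the original document body that was indexed
    echo $hit['_id'], ' => ', $hit['_source']['title'], PHP_EOL;
}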
ES is used for full-text search; the service listens on port 9200
Index: analogous to a database
Type: identified as _doc
Document: a row of data; this is where the data actually lives
Shards: 5 by default; the number cannot be changed after the index is created
Replicas: 1 by default; the number can be changed later
Note: since 6.0 an index can have only one type; specify it when creating the index, otherwise a warning is reported
Field mappings
keyword: behaves like = (exact match)
text: behaves like LIKE (full-text match)
Chinese word segmentation only works if the Chinese analysis plugin (ik) has been installed in ES; see the mapping sketch after this list
analyzer = ik_max_word
search_analyzer = ik_max_word
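A minimal console sketch tying keyword, text, and the ik analyzers together (hypothetical index name goods_demo; 7.x-style mapping without a type level; assumes the ik plugin is installed):
PUT /goods_demo
{
  "mappings": {
    "properties": {
      "title": { "type": "keyword" },
      "desn": { "type": "text", "analyzer": "ik_max_word", "search_analyzer": "ik_max_word" }
    }
  }
}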
# Install the PHP client
composer require elasticsearch/elasticsearch
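After installing, a quick connectivity check (a minimal sketch, assuming ES runs locally on 9200):

<?php
require __DIR__ . '/vendor/autoload.php';

$client = \Elasticsearch\ClientBuilder::create()
    ->setHosts(['127.0.0.1:9200'])
    ->build();

// ping() returns true if the cluster responds
var_dump($client->ping());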