This tutorial introduces the Aggregation operations of spring-data-mongodb, which correspond to GROUP BY-style operations in a relational database. It covers usage examples, practical tips, a summary of the basics, and points to watch out for.
The basic operations for Aggregation grouping and statistics in the spring-data-mongodb framework include:
- $group - grouping
- $match - filtering
- $project - picks fields for the output and can rename them
- $limit - returns the first n documents of the result set
- $skip - discards the first n documents of the result set
- $count, $avg, $sum and other accumulator functions
- $sort - sorting
For example, suppose there is a collection of restaurants in cities around the country named store_detail_info.
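For illustration, each document in store_detail_info is assumed to look roughly like the following (the values are made up; only the CITY, AVGSCORE and AVGPRICE fields matter for this example):
{
    "_id": ObjectId("..."),
    "CITY": "北京",
    "AVGSCORE": 4.2,
    "AVGPRICE": 85
}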
Now suppose the requirement is to report, for Beijing, Shanghai and Guangzhou, the number of restaurants, the average rating, and the average price.
MySQL:
The query would look like this:
select
CITY as CITY_NAME,
count(*) as COUNT_STORE,
avg(AVGSCORE) as AVG_SCORE,
avg(AVGPRICE) as AVG_PRICE
from
store_detail_info
where
CITY in ("上海", "北京", "广州")
group by
CITY
order by
COUNT_STORE desc
MongoDB:
The raw aggregate statement is as follows:
db.store_detail_info.aggregate([{
    "$match": {
        "CITY": {
            "$in": ["北京", "上海", "广州"]
        }
    }
}, {
    "$group": {
        "_id": "$CITY",
        "CITY_NAME": {
            "$first": "$CITY"
        },
        "COUNT_STORE": {
            "$sum": 1
        },
        "AVG_SCORE": {
            "$avg": "$AVGSCORE"
        },
        "AVG_PRICE": {
            "$avg": "$AVGPRICE"
        }
    }
}, {
    "$project": {
        "_id": 0,
        "CITY_NAME": 1,
        "COUNT_STORE": 1,
        "AVG_SCORE": 1,
        "AVG_PRICE": 1
    }
}, {
    "$sort": {
        "COUNT_STORE": -1
    }
}]);
- $match: keep only the three cities 北京 (Beijing), 上海 (Shanghai) and 广州 (Guangzhou).
- $group: group by CITY; take the city name into CITY_NAME with $first, count the restaurants into COUNT_STORE with $sum, and compute the average rating (AVG_SCORE from AVGSCORE) and the average price (AVG_PRICE from AVGPRICE) with $avg.
- $project: define which fields to output; 1 means include the field, 0 means exclude it.
- $sort: sort by COUNT_STORE (number of restaurants) in descending order.
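The $skip and $limit stages listed at the beginning are not needed for this report, but they plug into the same pipeline. For example, to keep only the city with the most restaurants, a $limit stage could be appended after $sort (a sketch; the earlier stages are unchanged):
db.store_detail_info.aggregate([
    /* same $match, $group and $project stages as above */
    { "$sort": { "COUNT_STORE": -1 } },
    { "$limit": 1 }   /* { "$skip": 1 } would instead drop the top city and keep the rest */
]);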
Next, let's look in detail at how to implement this in spring-data-mongodb.
First, we need to add the dependencies required to use MongoDB with Spring Boot.
Let's add spring-boot-starter-data-mongodb to our pom.xml:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>
Now let's set up the connection by adding the following configuration to the application.yml file:
# MongoDB database
spring:
  data:
    mongodb:
      uri: "mongodb://user:password@localhost:27017/test"
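If you prefer not to pack everything into a single connection string, Spring Boot also accepts the individual connection properties. A sketch with the same placeholder credentials (use these instead of, not together with, the uri setting; authentication-database: admin is only an assumption about where the user is defined):
spring:
  data:
    mongodb:
      host: localhost
      port: 27017
      database: test
      username: user
      password: password
      authentication-database: admin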
Let's run the application to verify that we can connect to the database. We should see a message similar to this in the log:
[info,71] - Opened connection [connectionId{localValue:1, serverValue:237}] to localhost:27017
This means the application can connect to MongoDB successfully. With the connection in place, the first example builds the pipeline using the fluent Aggregation API:
@SpringBootTest
class MyMongoApplicationTests {
    @Autowired
    MongoTemplate mongoTemplate;

    // Assumed helper (not shown in the original post): pretty-printing settings for Document#toJson
    JsonWriterSettings indentJsonWriterSettings = JsonWriterSettings.builder().indent(true).build();

    @BeforeEach
    void contextLoads() {
        System.out.println(mongoTemplate.getDb().getName());
    }

    @Test
    void aggregation() {
        MatchOperation match = Aggregation.match(Criteria.where("CITY").in("上海", "北京", "广州"));
        GroupOperation group = Aggregation.group("CITY")
                .first("CITY").as("CITY_NAME")
                .count().as("COUNT_STORE")
                .avg("AVGSCORE").as("AVG_SCORE")
                .avg("AVGPRICE").as("AVG_PRICE");
        ProjectionOperation project = Aggregation.project()
                .andExclude("_id")
                .andInclude("CITY_NAME", "COUNT_STORE", "AVG_SCORE", "AVG_PRICE");
        SortOperation sort = Aggregation.sort(Direction.DESC, "COUNT_STORE");

        //@wjw_note: the stages passed to newAggregation are executed in order, each one working on
        //           the output of the previous stage, so an incorrect order may cause errors
        Aggregation aggregation = Aggregation.newAggregation(match, group, project, sort);
        System.out.println(aggregation.toString()); // <1>

        AggregationResults<Document> aa = mongoTemplate.aggregate(aggregation, "store_detail_info", Document.class);
        List<Document> documents = aa.getMappedResults();
        documents.forEach(document -> {
            System.out.println(document.toJson(indentJsonWriterSettings));
        });
    }
}
<1> Prints the native MongoDB aggregate command, which helps to understand what the fluent API builds.
Running it produces:
{ "aggregate" : "__collection__", "pipeline" : [{ "$match" : { "CITY" : { "$in" : ["上海", "北京", "广州"]}}}, { "$group" : { "_id" : "$CITY", "CITY_NAME" : { "$first" : "$CITY"}, "COUNT_STORE" : { "$sum" : 1}, "AVG_SCORE" : { "$avg" : "$AVGSCORE"}, "AVG_PRICE" : { "$avg" : "$AVGPRICE"}}}, { "$project" : { "_id" : 0, "CITY_NAME" : 1, "COUNT_STORE" : 1, "AVG_SCORE" : 1, "AVG_PRICE" : 1}}, { "$sort" : { "COUNT_STORE" : -1}}]}
2022-11-07 19:10:08.692 [main] INFO org.mongodb.driver.connection - [info,71] - Opened connection [connectionId{localValue:3, serverValue:242}] to localhost:27017
{
"CITY_NAME": "北京",
"COUNT_STORE": 2214,
"AVG_SCORE": 4.0862691960252935,
"AVG_PRICE": 85.12420957542909
}
{
"CITY_NAME": "上海",
"COUNT_STORE": 1950,
"AVG_SCORE": 3.781025641025641,
"AVG_PRICE": 90.47435897435898
}
{
"CITY_NAME": "广州",
"COUNT_STORE": 1483,
"AVG_SCORE": 3.6790289952798383,
"AVG_PRICE": 60.956169925826025
}
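Instead of consuming the results as raw Document objects, the same call can map each row onto a small result class. A minimal sketch, assuming a hypothetical CityStats class (not part of the original post) whose fields are bound to the projected keys via @Field:
import org.springframework.data.mongodb.core.mapping.Field;

// Hypothetical result holder for the projected fields
public class CityStats {
    @Field("CITY_NAME")
    private String cityName;
    @Field("COUNT_STORE")
    private long countStore;
    @Field("AVG_SCORE")
    private double avgScore;
    @Field("AVG_PRICE")
    private double avgPrice;
    // getters, setters and toString omitted for brevity
}

// Inside the test above, the same Aggregation instance can then be mapped directly:
AggregationResults<CityStats> results =
        mongoTemplate.aggregate(aggregation, "store_detail_info", CityStats.class);
results.getMappedResults().forEach(System.out::println);
The same pipeline can also be assembled by hand against the native driver API, with each stage expressed as a raw Document: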
@SpringBootTest
class MyMongoApplicationTests {
    @Autowired
    MongoTemplate mongoTemplate;

    // Assumed helper (not shown in the original post): pretty-printing settings for Document#toJson
    JsonWriterSettings indentJsonWriterSettings = JsonWriterSettings.builder().indent(true).build();

    @BeforeEach
    void contextLoads() {
        System.out.println(mongoTemplate.getDb().getName());
    }

    @Test
    void aggregation2() {
        List<Document> pipeline = new ArrayList<Document>();

        Document group = new Document();
        group.put("$group",
                new Document("_id", "$CITY")
                        .append("CITY_NAME", new Document("$first", "$CITY"))
                        .append("COUNT_STORE", new Document("$sum", 1))
                        .append("AVG_SCORE", new Document("$avg", "$AVGSCORE"))
                        .append("AVG_PRICE", new Document("$avg", "$AVGPRICE")));

        Document project = new Document();
        project.append("$project",
                new Document("CITY_NAME", 1)
                        .append("COUNT_STORE", 1)
                        .append("AVG_SCORE", 1)
                        .append("AVG_PRICE", 1)
                        .append("_id", 0));

        // Because $match is placed after $project here, it filters on the renamed CITY_NAME field
        Document match = new Document();
        match.append("$match", new Document("CITY_NAME", new Document().append("$in", Arrays.asList("上海", "北京", "广州"))));

        Document sort = new Document();
        sort.append("$sort", new Document("COUNT_STORE", -1));

        pipeline.add(group);
        pipeline.add(project);
        pipeline.add(match);
        pipeline.add(sort);
        System.out.println(org.springframework.data.mongodb.core.query.SerializationUtils.serializeToJsonSafely(pipeline)); // <1>

        AggregateIterable<Document> output = mongoTemplate.getCollection("store_detail_info").aggregate(pipeline);
        output.forEach(document -> {
            System.out.println(document.toJson(indentJsonWriterSettings));
        });
    }
}
<1> Prints the native MongoDB aggregate pipeline, which helps to understand what was assembled.
Running it produces:
[{ "$group" : { "_id" : "$CITY", "CITY_NAME" : { "$first" : "$CITY"}, "COUNT_STORE" : { "$sum" : 1}, "AVG_SCORE" : { "$sum" : "$AVGSCORE"}, "AVG_PRICE" : { "$sum" : "$AVGPRICE"}}}, { "$project" : { "CITY_NAME" : 1, "COUNT_STORE" : 1, "AVG_SCORE" : 1, "AVG_PRICE" : 1, "_id" : 0}}, { "$match" : { "CITY_NAME" : { "$in" : ["上海", "北京", "广州"]}}}, { "$sort" : { "COUNT_STORE" : -1}}]
2022-11-07 19:12:40.534 [main] INFO org.mongodb.driver.connection - [info,71] - Opened connection [connectionId{localValue:3, serverValue:245}] to localhost:27017
{
"CITY_NAME": "北京",
"COUNT_STORE": 2214,
"AVG_SCORE": 9047,
"AVG_PRICE": 188465
}
{
"CITY_NAME": "上海",
"COUNT_STORE": 1950,
"AVG_SCORE": 7373,
"AVG_PRICE": 176425
}
{
"CITY_NAME": "广州",
"COUNT_STORE": 1483,
"AVG_SCORE": 5456,
"AVG_PRICE": 90398
}
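For completeness, the $skip and $limit operators mentioned at the beginning also have direct counterparts in the fluent API. A small sketch (my own extension of the first example, not part of the original pipeline) that would return only the two cities with the most restaurants:
// Same match, group, project and sort stages as in the first example, plus a limit stage
Aggregation topTwo = Aggregation.newAggregation(
        match, group, project, sort,
        Aggregation.limit(2)    // Aggregation.skip(n) works the same way for paging
);
AggregationResults<Document> top = mongoTemplate.aggregate(topTwo, "store_detail_info", Document.class);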
With that, we have implemented the MongoDB aggregation both with the spring-data-mongodb framework API and with the native driver API.
<<<<<<<<<<<< [End] >>>>>>>>>>>>