Prerequisites: ReactJS
tyarn install # install dependencies
tyarn start # start the dev server
As you can see, the layout is defined by the layout constant.
The menu on the left is a custom component:
{isTop && !isMobile ? null : (
)}
//import the component
import SiderMenu from '@/components/SiderMenu';
Open the /components/SiderMenu file
return (
//logo and title shown in the sidebar header
好客租房 · 后台
);
Modify the copyright information in Footer.js
import React, { Fragment } from 'react';
import { Layout, Icon } from 'antd';
import GlobalFooter from '@/components/GlobalFooter';
const { Footer } = Layout;
const FooterView = () => (
);
export default FooterView;
Routes double as menus.
Modify the i18n mapping file: locales => zh-CN => settings.js
Only namespaces referenced in the routes are registered, and each namespace name must be unique.
Field | Type | Notes |
---|---|---|
id | Long | Estate id |
name | String | Estate name |
province | String | Province |
city | String | City |
area | String | District |
address | String | Street address |
year | String | Year built |
type | String | Building type |
propertyCost | String | Property-management fee |
propertyCompany | String | Property-management company |
developers | String | Developer |
created | datetime | Creation time |
updated | datetime | Update time |
Field | Type | Notes |
---|---|---|
id | Long | Listing id |
title | String | Listing title, e.g. 南北通透,两室朝南,主卧带阳台 |
estateId | Long | Estate id |
buildingNum | String | Building number |
buildingUnit | String | Unit number |
buildingFloorNum | String | Door number |
rent | int | Rent |
rentMethod | int | Rental type: 1 = whole rent, 2 = shared rent |
paymentMethod | int | Payment plan: 1 = monthly plus one-month deposit, 2 = quarterly plus deposit, 3 = semi-annually plus deposit, 4 = annually plus deposit, 5 = other |
houseType | String | Layout, e.g. 2室1厅1卫 (2 bedrooms, 1 living room, 1 bathroom) |
coveredArea | String | Built-up area |
useArea | String | Usable area |
floor | String | Floor, e.g. 8/26 |
orientation | int | Orientation: east, south, west, north |
decoration | String | Decoration: 1 = fully furnished, 2 = simply furnished, 3 = unfurnished |
facilities | String | Facilities, e.g. 1,2,3 |
pic | String | Pictures, at most 5 |
desc | String | Description, e.g. 出小区门,门口有时代联华超市,餐饮有川菜馆,淮南牛肉汤,黄焖鸡沙县小吃等;可到达亲水湾城市生活广场,里面有儿童乐园,台球室和康桥健身等休闲娱乐;生活广场往北沿御水路往北步行一公里就是御桥路,旁边就是御桥地铁站,地铁站商场… |
contact | String | Contact person |
mobile | String | Phone number |
time | int | Viewing time: 1 = morning, 2 = noon, 3 = afternoon, 4 = evening, 5 = all day |
propertyCost | String | Property-management fee |
created | datetime | Creation time |
updated | datetime | Update time |
Official documentation link
A high-performance form component with built-in form-data management, covering data entry, validation, and the corresponding styles and API.

API
A control wrapped in a Form.Item that has a `name` prop is automatically injected with `value` (or another prop named by `valuePropName`) and `onChange` (or another event named by `trigger`); data binding is then taken over by the Form, which has the following consequences:

- You no longer need to collect data through `onChange` (use the Form's `onValuesChange` instead), although you can still listen to the `onChange` event.
- You can no longer set the field's value through `value` or `defaultValue`; set defaults with the Form's `initialValues`. Note that `initialValues` cannot be updated with `setState`; use `setFieldsValue` instead.
- Instead of `setState`, use `form.setFieldsValue` to change a form value dynamically.

Validation rules can be added through the `rules` parameter:
{
initialValue:'1',
rules:[{
required: true,
message:"此项为必填项"
}]
}
The form is submitted through the submit button and intercepted in the onSubmit handler:
handleSubmit = e => {
const { dispatch, form } = this.props;
e.preventDefault();
console.log(this.state.fileList);
form.validateFieldsAndScroll((err, values) => {
if (!err) {
//normalize the facilities
//1,2,3,4
//water, electricity, gas, heating
if(values.facilities){
values.facilities = values.facilities.join(",");
}
// 3/20
// floor 3 of a 20-storey building
if(values.floor_1 && values.floor_2){
values.floor = values.floor_1 + "/" + values.floor_2;
}
//e.g. 3室1厅2卫1厨1阳台
values.houseType = values.houseType_1 + "室" + values.houseType_2 + "厅"
+ values.houseType_3 + "卫" + values.houseType_4 + "厨"
+ values.houseType_5 + "阳台";
delete values.floor_1;
delete values.floor_2;
delete values.houseType_1;
delete values.houseType_2;
delete values.houseType_3;
delete values.houseType_4;
delete values.houseType_5;
dispatch({
type: 'form/submitRegularForm',
payload: values,
});
}
});
};
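The transformations performed in handleSubmit can be factored into a pure helper, which makes them easy to test in isolation. This is a sketch under assumptions: `normalizeValues` is a hypothetical name, and the field names mirror the form fields used above (note the balcony count comes from `houseType_5`).

```javascript
// normalizeValues: a standalone sketch of the value transformations
// applied before dispatching the form payload.
function normalizeValues(values) {
  const out = { ...values };
  // facilities: [1, 2, 3] -> "1,2,3"
  if (out.facilities) {
    out.facilities = out.facilities.join(',');
  }
  // floor_1 / floor_2 -> "3/20" (floor 3 of a 20-storey building)
  if (out.floor_1 && out.floor_2) {
    out.floor = out.floor_1 + '/' + out.floor_2;
  }
  // houseType_1..5 -> e.g. "3室1厅2卫1厨1阳台"
  out.houseType = out.houseType_1 + '室' + out.houseType_2 + '厅'
    + out.houseType_3 + '卫' + out.houseType_4 + '厨'
    + out.houseType_5 + '阳台';
  // drop the temporary fields so only the combined values remain
  ['floor_1', 'floor_2', 'houseType_1', 'houseType_2',
   'houseType_3', 'houseType_4', 'houseType_5'].forEach(k => delete out[k]);
  return out;
}
```

Keeping the transformation pure means the component's handleSubmit only needs to call it and dispatch the result.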
Documentation
Effect
Implementation
onSelect={(value, option) => {
let v = estateMap.get(value);
this.setState({
estateAddress: v.substring(v.indexOf('|') + 1),
estateId: v.substring(0, v.indexOf('|'))
});
}}
onSearch={this.handleSearch}
filterOption={(inputValue, option) => option.props.children.toUpperCase().indexOf(inputValue.toUpperCase()) !== -1}
/>
const estateMap = new Map([
['中远两湾城','1001|上海市,上海市,普陀区,远景路97弄'],
['上海康城','1002|上海市,上海市,闵行区,莘松路958弄'],
['保利西子湾','1003|上海市,上海市,松江区,广富林路1188弄'],
['万科城市花园','1004|上海市,上海市,闵行区,七莘路3333弄2区-15区'],
['上海阳城','1005|上海市,上海市,闵行区,罗锦路888弄']
]);
// The data source is set dynamically through onSearch; static data is used here
handleSearch = (value) => {
let arr = [];
if (value.length > 0) {
estateMap.forEach((v, k) => {
if (k.startsWith(value)) {
arr.push(k);
}
});
}
this.setState({
estateDataSource: arr
});
};
// onSelect: once an estate is chosen, fill its address and id into the form
onSelect={(value, option)=>{
let v = estateMap.get(value);
this.setState({
estateAddress: v.substring(v.indexOf('|')+1),
estateId : v.substring(0,v.indexOf('|'))
});
}}
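The lookup logic above can be sketched as two small functions over the same `estateMap` (a trimmed two-entry copy is used here): a prefix search as in handleSearch, and a split of the packed `"id|address"` value as in onSelect. The function names are hypothetical; the data format matches the map shown above.

```javascript
// Keys are estate names; values pack "id|address" into one string,
// split on the first '|'.
const estateMap = new Map([
  ['中远两湾城', '1001|上海市,上海市,普陀区,远景路97弄'],
  ['上海康城', '1002|上海市,上海市,闵行区,莘松路958弄'],
]);

// Prefix search over the estate names, as in handleSearch.
function searchEstates(value) {
  const arr = [];
  if (value.length > 0) {
    estateMap.forEach((v, k) => {
      if (k.startsWith(value)) arr.push(k);
    });
  }
  return arr;
}

// Parse the "id|address" value for a chosen estate, as in onSelect.
function parseEstate(name) {
  const v = estateMap.get(name);
  const i = v.indexOf('|');
  return { estateId: v.substring(0, i), estateAddress: v.substring(i + 1) };
}
```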
Image upload is handled by the custom PicturesWall component, which is implemented on top of antd's Upload component.
How does a child component pass a value up to its parent? The parent passes a callback function down as a prop; the child reads it from this.props and calls it with the data, which delivers the data to the parent.

The value of `this` is only determined when a function is executed, not when it is defined: `this` is part of the execution context, and the execution context is created right before the code runs, not at definition time.
var obj = {
getThis: function() {
console.log(this);
}
};
obj.getThis(); // logs obj — called through obj
var getThisCopy = obj.getThis;
getThisCopy(); // logs the global object (window) — called standalone
bind() creates a function whose `this` is permanently fixed, no matter how it is later called:
fun.bind(thisArgument, argument1, argument2, …)
var obj = {
num: 100,
numFun: function() {
console.log(this.num);
}
};
var numFunCopy = obj.numFun;
numFunCopy(); // undefined — `this` is the global object, which has no num
In the Window context there is no `num`; `num` is defined on `obj`.
bind() is therefore introduced to point `this` back at the original object:
var obj = {
num: 100,
numFun: function(){
console.log(this.num);
}
}
var numFunCopy = obj.numFun;
obj.numFun(); // 100 — called through obj, so `this` is obj
numFunCopy.bind(obj)(); // 100 — bind fixes `this` to obj
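The key property of bind() is that the fixed `this` cannot be overridden later, even by an explicit call(). A minimal sketch using the same obj/numFun pair as above:

```javascript
var obj = {
  num: 100,
  numFun: function () { return this.num; }
};

// Fix `this` to obj once, up front.
var bound = obj.numFun.bind(obj);

// bound() always reads num off obj, no matter how it is invoked:
// even an explicit `this` passed via call() cannot override bind().
```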
The front end uses React + semantic-ui for the mobile web UI; the web app can later be packaged as a native app for release.
npm install # install dependencies
npm start # start the service
Address: http://localhost:9000/
The server side is a Node.js demo: it exists only as an API project for front-end development, not as a production environment.
Create the database
Run myhome.sql to create the database
Edit the configuration file (database settings)
/** Database configuration */
db: {
/** Path to the model files */
models_path: '/models',
/** Database host IP */
host: '8.140.130.91',
/** Database port */
port: 3306,
/** Database type */
type: 'mysql',
/** Database username */
username: 'root',
/** Database password */
password: 'root',
/** Database name */
database: 'myhome',
/** Whether to log SQL */
logging: console.log, // set to false to disable logging
/** Connection-pool settings */
pool: {
max: 5,
min: 0,
charset: 'utf8',
idle: 30000
}
}
Run the following commands to initialize and start the service
npm install # install dependencies
npm run dev # run the dev script
# the scripts are defined as follows
"scripts": {
"test": "cross-env NODE_ENV=config-test node app.js",
"dev": "cross-env NODE_ENV=config-dev node app.js", # sets the environment variable
"pro": "cross-env NODE_ENV=config-pro node app.js"
}
Log in to the system to test it
Problems
Client does not support authentication protocol requested by server; consider upgrading MySQL client
use mysql;
flush privileges;
-- MySQL 8 defaults to the caching_sha2_password plugin, while older clients only support mysql_native_password
select user,host,plugin from user;
alter user 'root'@'%' identified with mysql_native_password by 'root';
select user,host,plugin from user;
ER_WRONG_FIELD_WITH_GROUP
use myhome;
SET sql_mode=(SELECT REPLACE(@@sql_mode, 'ONLY_FULL_GROUP_BY', ''));
select @@sql_mode;
Promise.all() collects the results of all the asynchronous requests and saves them into this.state, which render() then uses.
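A minimal sketch of that pattern, with hypothetical fetchMenus/fetchInfos stand-ins for the real axios calls:

```javascript
// Hypothetical stand-ins for the real axios requests.
const fetchMenus = () => Promise.resolve(['二手房', '新房']);
const fetchInfos = () => Promise.resolve(['news-1']);

// Run both requests concurrently; Promise.all resolves with the results
// in the same order as the input array, ready to pass to setState.
function loadHomeData() {
  return Promise.all([fetchMenus(), fetchInfos()])
    .then(([menus, infos]) => ({ menus, infos }));
}
```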
app.js
//set the global axios baseURL
axios.defaults.baseURL = config.apiBaseUrl;
//register interceptors
axios.interceptors.request.use(function (config) {
//attach the stored token before every request except the login call
if(!config.url.endsWith('/login')){
config.headers.Authorization = localStorage.getItem('mytoken');
}
return config;
}, function (error) {
//request-error handling
return Promise.reject(error);
});
axios.interceptors.response.use(function (response) {
// response interceptor: unwrap and return response.data
return response.data;
}, function (error) {
return Promise.reject(error);
});
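The request interceptor's rule can be isolated as a pure function for clarity. This is a sketch of the logic only, not the axios API; `attachToken` is a hypothetical name.

```javascript
// attachToken: mirrors the request interceptor above — add the
// Authorization header for every request except the login call.
function attachToken(config, token) {
  if (!config.url.endsWith('/login')) {
    config.headers = { ...config.headers, Authorization: token };
  }
  return config;
}
```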
Goal: serve all data from interfaces we implement ourselves instead of Node.js, which is convenient for back-end development.
mock-data.properties
mock.indexMenu={"data":{"list":[\
{"id":1,"menu_name":"二手房","menu_logo":"home","menu_path":"/home","menu_status":1,"menu_style":null},\
{"id":2,"menu_name":"新房","menu_logo":null,"menu_path":null,"menu_status":null,"menu_style":null},\
{"id":3,"menu_name":"租房","menu_logo":null,"menu_path":null,"menu_status":null,"menu_style":null},\
{"id":4,"menu_name":"海外","menu_logo":null,"menu_path":null,"menu_status":null,"menu_style":null},\
{"id":5,"menu_name":"地图找房","menu_logo":null,"menu_path":null,"menu_status":null,"menu_style":null},\
{"id":6,"menu_name":"查公交","menu_logo":null,"menu_path":null,"menu_status":null,"menu_style":null},\
{"id":7,"menu_name":"计算器","menu_logo":null,"menu_path":null,"menu_status":null,"menu_style":null},\
{"id":8,"menu_name":"问答","menu_logo":null,"menu_path":null,"menu_status":null,"menu_style":null}]},"meta":\
{"status":200,"msg":"测试数据"}}
mock.indexInfo={"data":{"list":[\
{"id":1,"info_title":"房企半年销售业绩继","info_thumb":null,"info_time":null,"info_content":null,"user_id":null,"info_status":null,"info_type":1},\
{"id":2,"info_title":"上半年土地市场两重天:一线降温三四线量价齐升","info_thumb":null,"info_time":null,"info_content":null,"user_id":null,"info_status":null,"info_type":1}]},\
"meta":{"status":200,"msg":"测试数据"}}
mock.indexFaq={"data":{"list":[\
{"question_name":"在北京买房,需要支付的税费有哪些?","question_tag":"学区,海淀","answer_content":"各种费用","atime":33,"question_id":1,"qnum":2},\
{"question_name":"一般首付之后,贷款多久可以下来?","question_tag":"学区,昌平","answer_content":"大概1个月","atime":22,"question_id":2,"qnum":2}]},\
"meta":{"status":200,"msg":"测试数据"}}
mock.indexHouse={"data":{"list":[\
{"id":1,"home_name":"安贞西里123","home_price":"4511","home_desc":"72.32㎡/南 北/低楼层","home_infos":null,"home_type":1,"home_tags":"海淀,昌平","home_address":null,"user_id":null,"home_status":null,"home_time":12,"group_id":1},\
{"id":8,"home_name":"安贞西里 三室一厅","home_price":"4500","home_desc":"72.32㎡/南北/低楼层","home_infos":null,"home_type":1,"home_tags":"海淀","home_address":null,"user_id":null,"home_status":null,"home_time":23,"group_id":2},\
{"id":3,"home_name":"安贞西里 三室一厅","home_price":"4220","home_desc":"72.32㎡/南北/低楼层","home_infos":null,"home_type":2,"home_tags":"海淀","home_address":null,"user_id":null,"home_status":null,"home_time":1,"group_id":1},\
{"id":4,"home_name":"安贞西里 三室一厅","home_price":"4500","home_desc":"72.32㎡/南 北/低楼层","home_infos":"4500","home_type":2,"home_tags":"海淀","home_address":"","user_id":null,"home_status":null,"home_time":12,"group_id":2},\
{"id":5,"home_name":"安贞西里 三室一厅","home_price":"4522","home_desc":"72.32㎡/南 北/低楼层","home_infos":null,"home_type":3,"home_tags":"海淀","home_address":null,"user_id":null,"home_status":null,"home_time":23,"group_id":1},\
{"id":6,"home_name":"安贞西里 三室一厅","home_price":"4500","home_desc":"72.32㎡/南北/低楼层","home_infos":null,"home_type":3,"home_tags":"海淀","home_address":null,"user_id":null,"home_status":null,"home_time":1221,"group_id":2},\
{"id":9,"home_name":"安贞西里 三室一厅","home_price":"4500","home_desc":"72.32㎡/南北/低楼层","home_infos":null,"home_type":4,"home_tags":"海淀","home_address":null,"user_id":null,"home_status":null,"home_time":23,"group_id":1}\
]},
"meta":{"status":200,"msg":"测试数据"}}
mock.infosList1={"data":{"list":{"total":8,"data":[{"id":13,"info_title":"wwwwwwwwwwwww","info_thumb":null,"info_time":null,"info_content":null,"user_id":null,"info_status":null,"info_type":1},{"id":12,"info_title":"房企半年销售业绩继","info_thumb":null,"info_time":null,"info_content":null,"user_id":null,"info_status":null,"info_type":1}]}},"meta":{"status":200,"msg":"获取数据成功"}}
mock.infosList2={"data":{"list":{"total":4,"data":[{"id":9,"info_title":"房企半年销售业绩继续冲高三巨头销售额过亿","info_thumb":null,"info_time":null,"info_content":null,"user_id":null,"info_status":null,"info_type":2},{"id":7,"info_title":"房企半年销售业绩继续冲高三巨头销售额过亿","info_thumb":null,"info_time":null,"info_content":null,"user_id":null,"info_status":null,"info_type":2}]}},"meta":{"status":200,"msg":"获取数据成功"}}
mock.infosList3={"data":{"list":{"total":10,"data":[{"username":"tom","question_name":"在北京买房,需要支付的税费有哪些?","question_tag":"学区,海淀","answer_content":"各种费用","atime":33,"question_id":1,"qnum":2},{"username":"tom","question_name":"一般首付之后,贷款多久可以下来?","question_tag":"学区,昌平","answer_content":"大概1个月","atime":22,"question_id":2,"qnum":2}]}},"meta":{"status":200,"msg":"获取数据成功"}}
mock.my={"data":{"id":1,"username":"tom","password":"123","mobile":"123","type":null,"status":null,"avatar":"public/icon.png"},"meta":{"status":200,"msg":"获取数据成功"}}
Read the properties file and map each entry to a String field
package com.haoke.api.config;
import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
@PropertySource("classpath:mock-data.properties")
@ConfigurationProperties(prefix = "mock")
@Configuration
@Data
public class MockConfig {
private String indexMenu;
private String indexInfo;
private String indexFaq;
private String indexHouse;
private String infosList1;
private String infosList2;
private String infosList3;
private String my;
}
package com.haoke.api.controller;
import com.haoke.api.config.MockConfig;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;
@RequestMapping("mock")
@RestController
@CrossOrigin
public class MockController {
@Autowired
private MockConfig mockConfig;
/**
 * Menu
 *
 * @return
 */
@GetMapping("index/menu")
public String indexMenu() {
return this.mockConfig.getIndexMenu();
}
/**
 * Home-page news
 * @return
 */
@GetMapping("index/info")
public String indexInfo() {
return this.mockConfig.getIndexInfo();
}
/**
 * Home-page Q&A
 * @return
 */
@GetMapping("index/faq")
public String indexFaq() {
return this.mockConfig.getIndexFaq();
}
/**
 * Home-page listings
 * @return
 */
@GetMapping("index/house")
public String indexHouse() {
return this.mockConfig.getIndexHouse();
}
/**
 * Query news by type
 *
 * @param type
 * @return
 */
@GetMapping("infos/list")
public String infosList(@RequestParam("type")Integer type) {
switch (type){
case 1:
return this.mockConfig.getInfosList1();
case 2:
return this.mockConfig.getInfosList2();
case 3:
return this.mockConfig.getInfosList3();
}
return this.mockConfig.getInfosList1();
}
/**
 * My profile
 * @return
 */
@GetMapping("my/info")
public String myInfo() {
return this.mockConfig.getMy();
}
}
Axios is a promise-based HTTP library that works in both the browser and Node.js.
title: Graduation Project Back End
top: 63
categories:
Prerequisites:
Mybatis
Spring
SpringMVC
SpringBoot
Microservices
Dubbo
The back-office management system uses a front-end/back-end separated development model: the front end is adapted from a LayUI template, and the back end is built on a SpringBoot + Dubbo + SSM architecture.
The back-end services follow a microservice approach, with Dubbo as the service-governance framework.
In a monolithic application, all functional modules of a service are deployed on the same machine. As the user base grows, response times degrade, and the only remedy is to deploy another full copy of the application on an extra machine. Suppose the application has only a listing module and a user module, and analysis shows the listing module consumes the most resources. The most efficient way to improve response times would be to allocate more resources to the listing module alone, which a monolithic architecture cannot do.
In a microservice architecture, the application is split into services by functional module. At deployment time you can run one user service and one listing service on one machine and two listing services on another, maximizing resource utilization.
The Dubbo framework manages services with a provider-consumer model: within a given business flow, a service is either a provider or a consumer, which lowers the coupling between layers.
Consider a scenario: when a front-end user opens the app, a listing feed is pushed based on location, browsing history, and preferences; a back-office administrator logging into the management system also needs a listing feed. The two roles get different listings, so different handlers must process the lists returned by the model layer. But fetching the listing data from the model layer is common to both, so that query is extracted as a single service that different controllers can call.
This maps naturally onto Dubbo's provider-consumer model: the Service layer produces data that the Controller layer consumes in its own business flows, so I treat the Service layer as the service provider and the Controller layer as the service consumer.
haoke-manage
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
         http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.haoke.manage</groupId>
    <artifactId>haoke-manage</artifactId>
    <packaging>pom</packaging>
    <version>1.0-SNAPSHOT</version>
    <modules>
        <module>haoke-manage-dubbo-server</module>
        <module>haoke-manage-api-server</module>
    </modules>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.4.3</version>
    </parent>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <version>2.4.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
        </dependency>
        <dependency>
            <groupId>com.alibaba.boot</groupId>
            <artifactId>dubbo-spring-boot-starter</artifactId>
            <version>0.2.0</version>
        </dependency>
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>dubbo</artifactId>
            <version>2.6.4</version>
        </dependency>
        <dependency>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
            <version>3.4.13</version>
        </dependency>
        <dependency>
            <groupId>com.github.sgroschupf</groupId>
            <artifactId>zkclient</artifactId>
            <version>0.1</version>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
haoke-manage-dubbo-server
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <packaging>pom</packaging>
    <modules>
        <module>haoke-manage-dubbo-server-house-resources</module>
        <module>haoke-manage-dubbo-server-generator</module>
    </modules>
    <artifactId>haoke-manage-dubbo-server</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>
    </dependencies>
</project>
haoke-manage-api-server
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-api-server</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
        <dependency>
            <groupId>com.haoke.manage</groupId>
            <artifactId>haoke-manage-dubbo-server-house-resources-interface</artifactId>
            <version>1.0-SNAPSHOT</version>
        </dependency>
    </dependencies>
</project>
use haoke;
DROP TABLE IF EXISTS `TB_ESTATE`;
CREATE TABLE `TB_ESTATE` (
`id` bigint NOT NULL AUTO_INCREMENT,
`name` varchar(100) DEFAULT NULL COMMENT '楼盘名称',
`province` varchar(10) DEFAULT NULL COMMENT '所在省',
`city` varchar(10) DEFAULT NULL COMMENT '所在市',
`area` varchar(10) DEFAULT NULL COMMENT '所在区',
`address` varchar(100) DEFAULT NULL COMMENT '具体地址',
`year` varchar(10) DEFAULT NULL COMMENT '建筑年代',
`type` varchar(10) DEFAULT NULL COMMENT '建筑类型',
`property_cost` varchar(10) DEFAULT NULL COMMENT '物业费',
`property_company` varchar(20) DEFAULT NULL COMMENT '物业公司',
`developers` varchar(20) DEFAULT NULL COMMENT '开发商',
`created` datetime DEFAULT NULL COMMENT '创建时间',
`updated` datetime DEFAULT NULL COMMENT '更新时间',
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1006 DEFAULT CHARSET=utf8 COMMENT='楼盘表';
INSERT INTO `TB_ESTATE` VALUES
(1001,'中远两湾城','上海市','上海市','普陀区','远景路97弄','2001','塔楼/板楼','1.5','上海中远物业管理发展有限公司','上海万业企业股份有限公司','2021-03-16 23:00:20','2021-03-16 23:00:20'),
(1002,'上海康城','上海市','上海市','闵行区','莘松路958弄','2001','塔楼/板楼','1.5','盛孚物业','闵行房地产','2021-03-16 23:00:20','2021-03-16 23:00:20'),
(1003,'保利西子湾','上海市','上海市','松江区','广富林路1188弄','2008','塔楼/板楼','1.75','上海保利物业管理','上海城乾房地产开发有限公司','2021-03-16 23:00:20','2021-03-16 23:00:20'),
(1004,'万科城市花园','上海市','上海市','松江区','广富林路1188弄','2002','塔楼/板楼','1.5','上海保利物业管理','上海城乾房地产开发有限公司','2021-03-16 23:00:20','2021-03-16 23:00:20'),
(1005,'上海阳城','上海市','上海市','闵行区','罗锦路888弄','2002','塔楼/板楼','1.5','上海莲阳物业管理有限公司','上海莲城房地产开发有限公司','2021-03-16 23:00:20','2021-03-16 23:00:20');
CREATE TABLE `TB_HOUSE_RESOURCES` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`title` varchar(100) DEFAULT NULL COMMENT '房源标题',
`estate_id` bigint(20) DEFAULT NULL COMMENT '楼盘id',
`building_num` varchar(5) DEFAULT NULL COMMENT '楼号(栋)',
`building_unit` varchar(5) DEFAULT NULL COMMENT '单元号',
`building_floor_num` varchar(5) DEFAULT NULL COMMENT '门牌号',
`rent` int(10) DEFAULT NULL COMMENT '租金',
`rent_method` tinyint(1) DEFAULT NULL COMMENT '租赁方式,1-整租,2-合租',
`payment_method` tinyint(1) DEFAULT NULL COMMENT '支付方式,1-付一押一,2-付三押一,3-付六押一,4-年付押一,5-其它',
`house_type` varchar(255) DEFAULT NULL COMMENT '户型,如:2室1厅1卫',
`covered_area` varchar(10) DEFAULT NULL COMMENT '建筑面积',
`use_area` varchar(10) DEFAULT NULL COMMENT '使用面积',
`floor` varchar(10) DEFAULT NULL COMMENT '楼层,如:8/26',
`orientation` varchar(2) DEFAULT NULL COMMENT '朝向:东、南、西、北',
`decoration` tinyint(1) DEFAULT NULL COMMENT '装修,1-精装,2-简装,3-毛坯',
`facilities` varchar(50) DEFAULT NULL COMMENT '配套设施, 如:1,2,3',
`pic` varchar(200) DEFAULT NULL COMMENT '图片,最多5张',
`house_desc` varchar(200) DEFAULT NULL COMMENT '描述',
`contact` varchar(10) DEFAULT NULL COMMENT '联系人',
`mobile` varchar(11) DEFAULT NULL COMMENT '手机号',
`time` tinyint(1) DEFAULT NULL COMMENT '看房时间,1-上午,2-中午,3-下午,4-晚上,5-全天',
`property_cost` varchar(10) DEFAULT NULL COMMENT '物业费',
`created` datetime DEFAULT NULL,
`updated` datetime DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8 COMMENT='房源表';
package com.haoke.dubbo.server.pojo;
import java.io.Serializable;
import java.util.Date;
import lombok.Data;
@Data
public abstract class BasePojo implements Serializable {
private Date created;
private Date updated;
}
MyBatis-Plus's AutoGenerator plugin generates the corresponding POJO classes from the database table structure.
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-dubbo-server-generator</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.freemarker</groupId>
            <artifactId>freemarker</artifactId>
        </dependency>
        <dependency>
            <groupId>com.baomidou</groupId>
            <artifactId>mybatis-plus-core</artifactId>
            <version>3.4.2</version>
        </dependency>
        <dependency>
            <groupId>com.baomidou</groupId>
            <artifactId>mybatis-plus-generator</artifactId>
            <version>3.4.1</version>
        </dependency>
    </dependencies>
</project>
// Imports added for completeness; package names assume the
// mybatis-plus-generator 3.4.x and commons-lang3 artifacts declared above.
import com.baomidou.mybatisplus.core.exceptions.MybatisPlusException;
import com.baomidou.mybatisplus.core.toolkit.StringPool;
import com.baomidou.mybatisplus.generator.AutoGenerator;
import com.baomidou.mybatisplus.generator.InjectionConfig;
import com.baomidou.mybatisplus.generator.config.*;
import com.baomidou.mybatisplus.generator.config.po.TableInfo;
import com.baomidou.mybatisplus.generator.config.rules.NamingStrategy;
import com.baomidou.mybatisplus.generator.engine.FreemarkerTemplateEngine;
import org.apache.commons.lang3.StringUtils;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;
public class CodeGenerator {
/**
 * Read input from the console
 */
public static String scanner(String tip) {
Scanner scanner = new Scanner(System.in);
StringBuilder help = new StringBuilder();
help.append("请输入" + tip + ":");
System.out.println(help.toString());
if (scanner.hasNext()) {
String ipt = scanner.next();
if (StringUtils.isNotEmpty(ipt)) {
return ipt;
}
}
throw new MybatisPlusException("请输入正确的" + tip + "!");
}
public static void main(String[] args) {
// code generator
AutoGenerator mpg = new AutoGenerator();
// global configuration
GlobalConfig gc = new GlobalConfig();
String projectPath = System.getProperty("user.dir");
gc.setOutputDir(projectPath + "/src/main/java");
gc.setAuthor("amostian");
gc.setOpen(false);
mpg.setGlobalConfig(gc);
// data-source configuration
DataSourceConfig dsc = new DataSourceConfig();
dsc.setUrl("jdbc:mysql://82.157.25.57:4002/haoke?characterEncoding=utf8&useSSL=false&serverTimezone=UTC");
// dsc.setSchemaName("public");
dsc.setDriverName("com.mysql.cj.jdbc.Driver");
dsc.setUsername("mycat");
dsc.setPassword("mycat");
mpg.setDataSource(dsc);
// target package configuration
PackageConfig pc = new PackageConfig();
pc.setModuleName(scanner("模块名"));
pc.setParent("com.haoke.dubbo.server");
mpg.setPackageInfo(pc);
// custom configuration
InjectionConfig cfg = new InjectionConfig(){
@Override
public void initMap() {
}
};
List<FileOutConfig> focList = new ArrayList<>();
focList.add(new FileOutConfig("/templates/mapper.xml.ftl") {
@Override
public String outputFile(TableInfo tableInfo) {
// customize the output file name
return projectPath + "/src/main/resources/mapper/" +
pc.getModuleName()
+ "/" + tableInfo.getEntityName() + "Mapper" +
StringPool.DOT_XML;
}
});
cfg.setFileOutConfigList(focList);
mpg.setCfg(cfg);
mpg.setTemplate(new TemplateConfig().setXml(null));
// strategy configuration
StrategyConfig strategy = new StrategyConfig();
strategy.setNaming(NamingStrategy.underline_to_camel);
strategy.setColumnNaming(NamingStrategy.underline_to_camel);
strategy.setSuperEntityClass("com.haoke.dubbo.server.pojo.BasePojo");
strategy.setEntityLombokModel(true);
strategy.setRestControllerStyle(true);
strategy.setInclude(scanner("表名"));
strategy.setSuperEntityColumns("id");
strategy.setControllerMappingHyphenStyle(true);
strategy.setTablePrefix(pc.getModuleName() + "_");
mpg.setStrategy(strategy);
mpg.setTemplateEngine(new FreemarkerTemplateEngine());
mpg.execute();
}
}
Only the generated entity (POJO) is needed; place it under com.haoke.dubbo.server.pojo:
package com.haoke.dubbo.server.pojo;
import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableId;
import com.baomidou.mybatisplus.annotation.TableName;
import lombok.Data;
import lombok.EqualsAndHashCode;
@Data
@EqualsAndHashCode(callSuper = true)
@TableName("TB_HOUSE_RESOURCES")
public class HouseResources extends BasePojo {
private static final long serialVersionUID = -2471649692631014216L;
/**
 * Listing title
 */
private String title;
/**
 * Estate id
 */
@TableId(value = "ID", type = IdType.AUTO)
private Long estateId;
/**
 * Building number
 */
private String buildingNum;
/**
 * Unit number
 */
private String buildingUnit;
/**
 * Door number
 */
private String buildingFloorNum;
/**
 * Rent
 */
private Integer rent;
/**
 * Rental type: 1 = whole rent, 2 = shared rent
 */
private Integer rentMethod;
/**
 * Payment plan: 1 = monthly plus deposit, 2 = quarterly plus deposit, 3 = semi-annually plus deposit, 4 = annually plus deposit, 5 = other
 */
private Integer paymentMethod;
/**
 * Layout, e.g. 2室1厅1卫
 */
private String houseType;
/**
 * Built-up area
 */
private String coveredArea;
/**
 * Usable area
 */
private String useArea;
/**
 * Floor, e.g. 8/26
 */
private String floor;
/**
 * Orientation: east, south, west, north
 */
private String orientation;
/**
 * Decoration: 1 = fully furnished, 2 = simply furnished, 3 = unfurnished
 */
private Integer decoration;
/**
 * Facilities, e.g. 1,2,3
 */
private String facilities;
/**
 * Pictures, at most 5
 */
private String pic;
/**
 * Description
 */
private String houseDesc;
/**
 * Contact person
 */
private String contact;
/**
 * Phone number
 */
private String mobile;
/**
 * Viewing time: 1 = morning, 2 = noon, 3 = afternoon, 4 = evening, 5 = all day
 */
private Integer time;
/**
 * Property-management fee
 */
private String propertyCost;
}
The house-resource service is split into an interface module and an implementation module to ease componentized maintenance.
haoke-manage-dubbo-server-house-resources
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage-dubbo-server</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-dubbo-server-house-resources</artifactId>
    <packaging>pom</packaging>
    <modules>
        <module>haoke-manage-dubbo-server-house-resources-interface</module>
        <module>haoke-manage-dubbo-server-house-resources-service</module>
    </modules>
    <dependencies>
        <dependency>
            <groupId>com.baomidou</groupId>
            <artifactId>mybatis-plus-boot-starter</artifactId>
            <version>3.4.2</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>8.0.16</version>
        </dependency>
    </dependencies>
</project>
haoke-manage-dubbo-server-house-resources-interface
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage-dubbo-server-house-resources</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-dubbo-server-house-resources-interface</artifactId>
</project>
haoke-manage-dubbo-server-house-resources-service
Implementation of the house-resource service (the Spring service layer)
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage-dubbo-server-house-resources</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-dubbo-server-house-resources-service</artifactId>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-jdbc</artifactId>
        </dependency>
        <dependency>
            <groupId>com.haoke.manage</groupId>
            <artifactId>haoke-manage-dubbo-server-house-resources-interface</artifactId>
            <version>1.0-SNAPSHOT</version>
        </dependency>
    </dependencies>
</project>
application.properties
# Spring boot application
spring.application.name = haoke-manage-dubbo-server-house-resources
# database connection
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://82.157.25.25:4002/haoke?characterEncoding=utf8&useSSL=false&serverTimezone=UTC
spring.datasource.username=mycat
spring.datasource.password=mycat
# Dubbo configuration
## package to scan for Dubbo services
dubbo.scan.basePackages = com.haoke.server.api
## application name
dubbo.application.name = dubbo-provider-house-resources
dubbo.service.version = 1.0.0
## protocol and port
dubbo.protocol.name = dubbo
dubbo.protocol.port = 20880
## ZooKeeper registry address
dubbo.registry.address = zookeeper://8.140.130.91:2181
dubbo.registry.client = zkclient
haoke-manage-dubbo-server-house-resources-interface
package com.haoke.server.api;
import com.haoke.server.pojo.HouseResources;
public interface ApiHouseResourcesService {
/**
 * @param houseResources
 *
 * @return -1: invalid input parameters; 0: database insert failed; 1: success
 */
int saveHouseResources(HouseResources houseResources);
}
Create a Spring Boot application implementing the save-house-resource service
- connect to the database — DAO layer
- implement the CRUD interface — Service layer
MyBatis-Plus configuration class
package com.haoke.server.config;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.context.annotation.Configuration;
//define the mapper package-scan path
@MapperScan("com.haoke.server.mapper")
@Configuration
public class MybatisPlusConfig {}
The HouseResourcesMapper interface
package com.haoke.server.mapper;
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.haoke.dubbo.server.pojo.HouseResources;
//BaseMapper is the generic CRUD mapper provided by MyBatis-Plus
public interface HouseResourcesMapper extends BaseMapper<HouseResources> {}
This is the Spring-level service, i.e. the concrete implementation behind the Dubbo service; it is not exposed externally, and it is where transaction control and other validation logic live.
Define the interface
package com.haoke.server.service;
import com.haoke.server.pojo.HouseResources;
public interface HouseResourcesService {
/**
 *
 * @param houseResources
 * @return -1: invalid input parameters; 0: database insert failed; 1: success
 */
int saveHouseResources(HouseResources houseResources);
}
Write the implementation classes
Generic CRUD implementation
package com.haoke.server.service.impl;
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.baomidou.mybatisplus.extension.plugins.pagination.Page;
import com.haoke.dubbo.server.pojo.BasePojo;
import org.springframework.beans.factory.annotation.Autowired;
import java.util.Date;
import java.util.List;
public class BaseServiceImpl<T extends BasePojo>{
@Autowired
private BaseMapper<T> mapper;
/**
 * Query a record by id
 * @param id
 * @return
 */
public T queryById(Long id) {
return this.mapper.selectById(id);
}
/**
 * Query all records
 *
 * @return
 */
public List<T> queryAll() {
return this.mapper.selectList(null);
}
/**
 * Query a single record by condition
 *
 * @param record
 * @return
 */
public T queryOne(T record) {
return this.mapper.selectOne(new QueryWrapper<>(record));
}
/**
 * Query a list of records by condition
 * @param record
 * @return
 */
public List<T> queryListByWhere(T record) {
return this.mapper.selectList(new QueryWrapper<>(record));
}
/**
 * Paged query by condition
 * @param record
 * @param page
 * @param rows
 * @return
 */
public IPage<T> queryPageListByWhere(T record, Integer page, Integer rows) {
// fetch the requested page
return this.mapper.selectPage(new Page<T>(page, rows), new QueryWrapper<>(record));
}
/**
 * Save a record
 *
 * @param record
 * @return
 */
public Integer save(T record) {
record.setCreated(new Date());
record.setUpdated(record.getCreated());
return this.mapper.insert(record);
}
/**
 * Update a record
 * @param record
 * @return
 */
public Integer update(T record) {
record.setUpdated(new Date());
return this.mapper.updateById(record);
}
/**
 * Delete a record by id
 * @param id
 * @return
 */
public Integer deleteById(Long id) {
return this.mapper.deleteById(id);
}
/**
 * Batch-delete records by ids
 * @param ids
 * @return
 */
public Integer deleteByIds(List<Long> ids) {
return this.mapper.deleteBatchIds(ids);
}
/**
 * Delete records by condition
 * @param record
 * @return
 */
public Integer deleteByWhere(T record){
return this.mapper.delete(new QueryWrapper<>(record));
}
}
The house-resource implementation class: HouseResourcesServiceImpl
package com.haoke.server.service.impl;
import com.alibaba.dubbo.common.utils.StringUtils;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.service.HouseResourcesService;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Transactional//enable transaction management
@Service//this is Spring's @Service, the local service implementation
public class HouseResourcesServiceImpl
extends BaseServiceImpl<HouseResources>
implements HouseResourcesService {
@Override
public int saveHouseResources(HouseResources houseResources) {
// validation logic; return -1 if the input is invalid
if (StringUtils.isBlank(houseResources.getTitle())) {
return -1;
}
//other validation and logic omitted ...
return super.save(houseResources);
}
}
Expose the save-house-resource operation as a Dubbo service by exporting the interface implementation:
package com.haoke.server.api;
import com.alibaba.dubbo.config.annotation.Service;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.service.HouseResourcesService;
import org.springframework.beans.factory.annotation.Autowired;
//Dubbo's @Service: export this implementation as a remote service
@Service(version = "${dubbo.service.version}")
public class ApiHouseResourcesImpl implements ApiHouseResourcesService{
@Autowired
private HouseResourcesService resourcesService;
@Override
public int saveHouseResources(HouseResources houseResources) {
return this.resourcesService.saveHouseResources(houseResources);
}
}
Start the Spring Boot application to export the Dubbo service to the registry:
package com.haoke.server;
import org.springframework.boot.WebApplicationType;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
@SpringBootApplication
public class DubboProvider {
public static void main(String[] args) {
new SpringApplicationBuilder(DubboProvider.class)
.web(WebApplicationType.NONE)//不是web应用
.run(args);
}
}
cd /opt/incubator-dubbo-ops/
mvn --projects dubbo-admin-server spring-boot:run
The service is registered as dubbo-provider-house-resources on port 20880.
haoke-manage-api-server
Because this module is the Dubbo consumer, it must depend on the interface and POJO artifacts published by the provider.
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>com.haoke.manage</groupId>
        <artifactId>haoke-manage-dubbo-server-house-resources-interface</artifactId>
        <version>1.0-SNAPSHOT</version>
    </dependency>
</dependencies>
# Spring boot application
spring.application.name = haoke-manage-api-server
server.port = 9091
#logging.level.root=DEBUG
# application name
dubbo.application.name = dubbo-consumer-haoke-manage
# zk registry; the service consumer subscribes to services from the registry
dubbo.registry.address = zookeeper://8.140.130.91:2181
dubbo.registry.client = zkclient
dubbo.service.version = 1.0.0
HouseResourceService is used to invoke the Dubbo service
package com.haoke.api.service;
import com.alibaba.dubbo.config.annotation.Reference;
import com.haoke.server.api.ApiHouseResourcesService;
import com.haoke.server.pojo.HouseResources;
import org.springframework.stereotype.Service;
@Service
public class HouseResourceService {
@Reference(version = "${dubbo.service.version}")
private ApiHouseResourcesService apiHouseResourcesService;
public boolean save(HouseResources houseResources){
int result = this.apiHouseResourcesService.saveHouseResources(houseResources);
return result==1;
}
}
package com.haoke.api.controller;
import com.haoke.api.service.HouseResourceService;
import com.haoke.server.pojo.HouseResources;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.*;
@RequestMapping("house/resources")
@Controller
public class HouseResourcesController {
@Autowired
private HouseResourceService houseResourceService;
/**
* 新增房源
*
* @param houseResources json数据
* @return
*/
@PostMapping
@ResponseBody
public ResponseEntity<Void> save(@RequestBody HouseResources houseResources){
try {
boolean bool = this.houseResourceService.save(houseResources);
if(bool){
return ResponseEntity.status(HttpStatus.CREATED).build();
}
} catch (Exception e) {
e.printStackTrace();
}
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
}
/**
* test
* @return
*/
@GetMapping
@ResponseBody
public ResponseEntity<String> get(){
System.out.println("get House Resources");
return ResponseEntity.ok("ok");
}
}
package com.haoke.api;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;
@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class})
public class DubboApiApplication {
public static void main(String[] args) {
SpringApplication.run(DubboApiApplication.class, args);
}
}
Create a models folder
import { routerRedux } from 'dva/router';
import { message } from 'antd';
import { addHouseResource } from '@/services/haoke/haoke';
export default {
namespace: 'house',
state: {
},
effects: {
*submitHouseForm({ payload }, { call }) {
console.log("page model")
yield call(addHouseResource, payload);
message.success('提交成功');
}
},
reducers: {
},
};
import request from '@/utils/request';
export async function addHouseResource(params) {
return request('/haoke/house/resources', {
method: 'POST',
body: params
});
}
handleSubmit = e => {
const { dispatch, form } = this.props;
e.preventDefault();
form.validateFieldsAndScroll((err, values) => {
if (!err) {
if(values.facilities){
values.facilities = values.facilities.join(",");
}
if(values.floor_1 && values.floor_2){
values.floor = `${values.floor_1 }/${ values.floor_2}`;
}
values.houseType = `${values.houseType_1 }室${ values.houseType_2 }厅${
values.houseType_3 }卫${ values.houseType_4 }厨${
values.houseType_5 }阳台`;
delete values.floor_1;
delete values.floor_2;
delete values.houseType_1;
delete values.houseType_2;
delete values.houseType_3;
delete values.houseType_4;
delete values.houseType_5;
dispatch({
type: 'house/submitHouseForm',
payload: values,
});
}
});
};
https://umijs.org/zh-CN/config#proxy
proxy: {
  '/haoke/': {
    target: 'http://127.0.0.1:9091', // target address
    changeOrigin: true,
    pathRewrite: { '^/haoke/': '' }, // rewrite the path
  },
},
Proxy effect:
Request: http://127.0.0.1:8000/haoke/house/resources
Actual: http://127.0.0.1:9091/house/resources
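The pathRewrite entry is an ordinary regular-expression replacement on the request path. A stdlib sketch of the equivalent rewrite (illustrative only, not umi code):

```java
public class ProxyRewriteDemo {
    // Mirrors pathRewrite { '^/haoke/': '' }: strip the /haoke prefix before
    // forwarding to the target, keeping a single leading slash.
    static String rewrite(String path) {
        return path.replaceFirst("^/haoke/", "/");
    }

    public static void main(String[] args) {
        System.out.println(rewrite("/haoke/house/resources")); // /house/resources
    }
}
```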
haoke-manage-server-house-resources-dubbo-interface
The Dubbo provider-side interface
package com.haoke.server.api;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.vo.PageInfo;
public interface ApiHouseResourcesService {
/**
* @param houseResources
*
* @return -1:输入的参数不符合要求,0:数据插入数据库失败,1:成功
*/
int saveHouseResources(HouseResources houseResources);
/**
* 分页查询房源列表
*
* @param page 当前页
* @param pageSize 页面大小
* @param queryCondition 查询条件
* @return
*/
PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition);
}
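The page/pageSize pair of queryHouseResourcesList maps to a row offset in the underlying LIMIT query. The arithmetic, shown here as a standalone sketch rather than project code:

```java
public class PageMath {
    // First row to fetch for a 1-based page number
    static int offset(int page, int pageSize) {
        return (page - 1) * pageSize;
    }

    // Total pages needed to show `total` rows (ceiling division)
    static int pages(int total, int pageSize) {
        return (total + pageSize - 1) / pageSize;
    }

    public static void main(String[] args) {
        System.out.println(offset(3, 10)); // 20
        System.out.println(pages(25, 10)); // 3
    }
}
```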
haoke-manage-dubbo-server-house-resources-service
The provider wraps the returned data in a value object
package com.haoke.server.vo;
import lombok.AllArgsConstructor;
import lombok.Data;
import java.util.Collections;
import java.util.List;
@Data
@AllArgsConstructor
public class PageInfo<T> implements java.io.Serializable{
/**
* 总条数
*/
private Integer total;
/**
* 当前页
*/
private Integer pageNum;
/**
* 一页显示的大小
*/
private Integer pageSize;
/**
* 数据列表
*/
private List<T> records = Collections.emptyList();
}
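PageInfo implements java.io.Serializable because Dubbo serializes return values to send them over the wire. A minimal round-trip with a simplified stand-in class (not the project's PageInfo):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.List;

public class SerializableDemo {
    // Simplified stand-in for the project's PageInfo
    static class Page implements Serializable {
        int total;
        List<String> records;
        Page(int total, List<String> records) { this.total = total; this.records = records; }
    }

    // Serialize to bytes and read it back, as an RPC framework would
    static Page roundTrip(Page in) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(bos);
            oos.writeObject(in);
            oos.flush();
            ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()));
            return (Page) ois.readObject();
        } catch (IOException | ClassNotFoundException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        Page copy = roundTrip(new Page(2, List.of("a", "b")));
        System.out.println(copy.total + " " + copy.records); // 2 [a, b]
    }
}
```

Without the Serializable marker, the writeObject call above would throw NotSerializableException, which is the kind of failure a non-serializable DTO causes inside Dubbo.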
The Dubbo service implementation simply delegates to the Spring service layer
package com.haoke.server.api;
import com.alibaba.dubbo.config.annotation.Service;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.service.HouseResourcesService;
import com.haoke.server.vo.PageInfo;
import org.springframework.beans.factory.annotation.Autowired;
//实现Dubbo,对外暴露服务
@Service(version = "${dubbo.service.version}")
public class ApiHaokeResourcesImpl implements ApiHouseResourcesService{
@Autowired
private HouseResourcesService houseResourcesService;
@Override
public int saveHouseResources(HouseResources houseResources) {
return this.houseResourcesService.saveHouseResources(houseResources);
}
@Override
public PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition) {
return this.houseResourcesService.queryHouseResourcesList(page, pageSize, queryCondition);
}
}
Fetching the data from the database with MyBatis-Plus
@Override
public PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition) {
QueryWrapper<HouseResources> queryWrapper = new QueryWrapper<HouseResources>(queryCondition);
queryWrapper.orderByDesc("updated");//按更新时间降序排列
IPage iPage = super.queryPageList(queryWrapper, page, pageSize);
return new PageInfo<HouseResources>(Long.valueOf(iPage.getTotal()).intValue() , page, pageSize, iPage.getRecords());
}
The Spring service layer implements the list-query business logic
Spring service-layer interface definition
package com.haoke.server.service;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.vo.PageInfo;
public interface HouseResourcesService {
/**
*
* @param houseResources
* @return -1:输入的参数不符合要求,0:数据插入数据库失败,1:成功
*/
int saveHouseResources(HouseResources houseResources);
public PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition);
}
Spring service-layer implementation
package com.haoke.server.service.impl;
import com.alibaba.dubbo.common.utils.StringUtils;
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.haoke.server.pojo.HouseResources;
import com.haoke.server.service.HouseResourcesService;
import com.haoke.server.vo.PageInfo;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
@Transactional//开启事务
@Service//这是Spring的服务
public class HouseResourcesServiceImpl
extends BaseServiceImpl
implements HouseResourcesService {
@Override
public int saveHouseResources(HouseResources houseResources) {
// 编写校验逻辑,如果校验不通过,返回-1
if (StringUtils.isBlank(houseResources.getTitle())) {
return -1;
}
//其他校验以及逻辑省略 ……
return super.save(houseResources);
}
@Override
public PageInfo<HouseResources> queryHouseResourcesList(int page, int pageSize, HouseResources queryCondition) {
QueryWrapper<HouseResources> queryWrapper = new QueryWrapper<>(queryCondition);
queryWrapper.orderByDesc("updated");
IPage iPage = super.queryPageList(queryWrapper, page, pageSize);
return new PageInfo<HouseResources>(Long.valueOf(iPage.getTotal()).intValue() , page, pageSize, iPage.getRecords());
}
}
Implementing the RESTful-style interface
@Data
@AllArgsConstructor
public class TableResult<T> {
private List<T> list;
private Pagination pagination;
}
@Data
@AllArgsConstructor
public class Pagination {
private Integer current;
private Integer pageSize;
private Integer total;
}
public TableResult queryList(HouseResources houseResources, Integer currentPage, Integer pageSize) {
PageInfo<HouseResources> pageInfo
= this.apiHouseResourcesService.queryHouseResourcesList(currentPage, pageSize, houseResources);
return new TableResult(
pageInfo.getRecords(),
new Pagination(currentPage, pageSize, pageInfo.getTotal()));
}
/**
* 查询房源列表
* @param houseResources
* @param currentPage
* @param pageSize
* @return
*/
@GetMapping("/list")//full request path: /house/resources/list
@ResponseBody
public ResponseEntity<TableResult> list(HouseResources houseResources,
@RequestParam(name = "currentPage", defaultValue = "1") Integer currentPage,
@RequestParam(name = "pageSize",defaultValue = "10") Integer pageSize) {
return ResponseEntity.ok(this.houseResourceService.queryList(houseResources, currentPage, pageSize));
}
columns = [
{
title: '房源编号',
dataIndex: 'id',
},
{
title: '房源信息',
dataIndex: 'title',
},
{
title: '图',
dataIndex: 'pic',
render : (text, record, index) => <ShowPics pics={text} />
},
{
title: '楼栋',
render : (text, record, index) => `${record.buildingNum}栋${record.buildingUnit}单元${record.buildingFloorNum}号`
},
{
title: '户型',
dataIndex: 'houseType'
},
{
title: '面积',
dataIndex: 'useArea',
render : (text, record, index) => `${text}平方`
},
{
title: '楼层',
dataIndex: 'floor'
},
{
title: '操作',
render: (text, record) => (
<Fragment>
<a onClick={() => this.handleUpdateModalVisible(true, record)}>查看</a>
<Divider type="vertical" />
<a href="">删除</a>
</Fragment>
),
},
];
import React from 'react';
import { Modal, Button, Carousel } from 'antd';
class ShowPics extends React.Component{
info = () => {
Modal.info({
title: '',
iconType:'false',
width: '800px',
okText: "ok",
content: (
  <Carousel autoplay>
    {
      this.props.pics.split(',').map((value, index) =>
        <img key={index} src={value} alt="" style={{ width: '100%' }} />
      )
    }
  </Carousel>
),
onOk() {},
});
};
constructor(props){
super(props);
this.state={
btnDisabled: !this.props.pics
}
}
render() {
  return (
    <Button type="primary" disabled={this.state.btnDisabled} onClick={this.info}>查看</Button>
  )
}
}
export default ShowPics;
import { queryResource } from '@/services/haoke/houseResource';
export default {
namespace: 'houseResource',
state: {
data: {
list: [],
pagination: {},
},
},
effects: {
*fetch({ payload }, { call, put }) {
console.log("houseResource fetch")
const response = yield call(queryResource, payload);
yield put({
type: 'save',
payload: response,
});
}
},
reducers: {
save(state, action) {// state is the model's current data; action comes from put(), whose payload wraps the response data
return {
...state,
data: action.payload,
};
},
},
};
import request from '@/utils/request';
import { stringify } from 'qs';
export async function queryResource(params) {
return request(`/haoke/house/resources/list?${stringify(params)}`);
}
Official site
A specification for how the front end queries data from the back end
GET http://127.0.0.1/user/1 #query
POST http://127.0.0.1/user #create
PUT http://127.0.0.1/user #update
DELETE http://127.0.0.1/user #delete
Scenario 1:
Only part of an object's fields are needed, but the RESTful API returns all of them
#Request
GET http://127.0.0.1/user/1001
#Response:
{
id : 1001,
name : "张三",
age : 20,
address : "北京市",
……
}
Scenario 2:
A single requirement needs multiple requests to complete
#Query the user's info
GET http://127.0.0.1/user/1001
#Response:
{
id : 1001,
name : "张三",
age : 20,
address : "北京市",
……
}
#Query the user's ID-card info
GET http://127.0.0.1/card/8888
#Response:
{
id : 8888,
name : "张三",
cardNumber : "999999999999999",
address : "北京市",
……
}
When the request contains only the name field, the response contains only name; add the appearsIn field to the request and the response will include appearsIn as well
Demo: https://graphql.cn/learn/schema/#type-system
A single request fetched not only the hero data but also the friends data, saving network round trips
When the API is upgraded, clients can postpone upgrading and catch up later, which greatly reduces the coupling between client and server
GraphQL defines a specification that describes the query syntax http://graphql.cn/learn/queries/
Specification ≠ implementation
In a GraphQL query, the request structure contains the structure of the expected result; these are the fields. The response structure mirrors the request almost exactly — a GraphQL trait that lets the requester know exactly what it will get back.
Argument syntax: (name: value)
When one query fetches the same object several times with different argument values, aliases are required; otherwise the result could not be expressed as valid JSON
When the queried objects share the same set of fields, a fragment can factor the field list out into a single definition
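For instance, in the Star Wars schema used by the official GraphQL docs (hero/Character are that schema's names, not this project's), aliases distinguish two queries of the same field and a fragment factors out the shared field list:

```graphql
{
  empireHero: hero(episode: EMPIRE) { ...heroFields }
  jediHero: hero(episode: JEDI) { ...heroFields }
}

fragment heroFields on Character {
  name
  appearsIn
}
```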
A Schema defines the data structure
https://graphql.cn/learn/schema/
Every GraphQL service has a `query` type and may have a `mutation` type. These types are no different from regular object types, but they are special because they define the entry point of every GraphQL query.
schema { # define the query entry
    query: UserQuery
}
type UserQuery{ # define the query type
    user(id:ID):User # the object and its argument type
}
type User{ # define the object
    id:ID! # ! marks the field non-null
    name:String
    age:Int
}
GraphQL supports custom types; the graphql-java implementation, for example, adds Long, Byte, and others.
enum Episode{ # define an enum
    NEWHOPE
    EMPIRE
    JEDI
}
type Human{
    id: ID!
    name: String!
    appearsIn: [Episode]! # an array of Episode values, using the enum type
    homePlanet: String
}
An interface is an abstract type containing certain fields; an object type must include those fields to count as implementing the interface
interface Character{ # define an interface
    id: ID!
    name: String!
    friends: [Character]
    appearsIn: [Episode]!
}
# implement the interface
type Human implements Character{
    id: ID!
    name: String!
    friends: [Character]
    appearsIn: [Episode]!
    starships: [Starship]
    totalCredits: Int
}
type Droid implements Character {
id: ID!
name: String!
friends: [Character]
appearsIn: [Episode]!
primaryFunction: String
}
The GraphQL project only defines the specification and ships no implementation, so third parties provide the implementations
官网:https://www.graphql-java.com/
https://www.graphql-java.com/documentation/v16/getting-started/
graphql-java was not published to the Maven central repository, so a third-party repository must be added before the dependency can be downloaded
Maven: if a mirror is configured in mirrors, the third-party repository settings will not take effect
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>graphql</artifactId>
    <version>1.0-SNAPSHOT</version>
    <repositories>
        <repository>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
            <id>bintray-andimarek-graphql-java</id>
            <name>bintray</name>
            <url>https://dl.bintray.com/andimarek/graphql-java</url>
        </repository>
    </repositories>
    <dependencies>
        <dependency>
            <groupId>org.projectlombok</groupId>
            <artifactId>lombok</artifactId>
        </dependency>
        <dependency>
            <groupId>com.graphql-java</groupId>
            <artifactId>graphql-java</artifactId>
            <version>11.0</version>
        </dependency>
    </dependencies>
</project>
schema {
query: UserQuery
}
type UserQuery{
user(id:ID): User
}
type User{
id: ID!
name: String
age: Int
}
import graphql.ExecutionResult;
import graphql.GraphQL;
import graphql.schema.*;
import static graphql.Scalars.*;
import static graphql.schema.GraphQLFieldDefinition.newFieldDefinition;
import static graphql.schema.GraphQLObjectType.newObject;
public class GraphQLDemo {
public static void main(String[] args) {
/**
* 定义User对象类型
* type User { #定义对象
* id:Long! # !表示该属性是非空项
* name:String
* age:Int
* }
* @return
*/
GraphQLObjectType userType = newObject()
.name("User")
.field(newFieldDefinition().name("id").type(GraphQLLong))
.field(newFieldDefinition().name("name").type(GraphQLString))
.field(newFieldDefinition().name("age").type(GraphQLInt))
.build();
/**
* 定义查询的类型
* type UserQuery { #定义查询的类型
* user : User #指定对象
* }
* @return
*/
GraphQLObjectType userQuery = newObject()
.name("userQuery")
.field(newFieldDefinition()
.name("user")
.type(userType)
.dataFetcher(new StaticDataFetcher(new User(1L,"张三",20)))
)
.build();
/**
* 定义Schema
* schema { #定义查询
* query: UserQuery
* }
* @return
*/
GraphQLSchema graphQLSchema = GraphQLSchema.newSchema()
.query(userQuery)
.build();
//构建GraphQL查询器
GraphQL graphQL = GraphQL.newGraphQL(graphQLSchema).build();
//查询结果
String query = "{user{id,name}}";
ExecutionResult executionResult = graphQL.execute(query);
// 打印错误
System.out.println("错误:" + executionResult.getErrors());
// 打印数据
System.out.println("结果:" +(Object) executionResult.toSpecification());
}
}
import graphql.ExecutionResult;
import graphql.GraphQL;
import graphql.schema.*;
import static graphql.Scalars.*;
import static graphql.schema.GraphQLFieldDefinition.newFieldDefinition;
import static graphql.schema.GraphQLObjectType.newObject;
public class GraphQLDemo {
public static void main(String[] args) {
/**
* 定义User对象类型
* type User { #定义对象
* id:Long! # !表示该属性是非空项
* name:String
* age:Int
* }
* @return
*/
GraphQLObjectType userType = newObject()
.name("User")
.field(newFieldDefinition().name("id").type(GraphQLLong))
.field(newFieldDefinition().name("name").type(GraphQLString))
.field(newFieldDefinition().name("age").type(GraphQLInt))
.build();
/**
* 定义查询的类型
* type UserQuery { #定义查询的类型
* user : User #指定对象
* }
* @return
*/
GraphQLObjectType userQuery = newObject()
.name("userQuery")
.field(newFieldDefinition()
.name("user")
.argument(GraphQLArgument.newArgument()
        .name("id")
        .type(GraphQLLong)
)
.type(userType)
.dataFetcher(
Environment->{
Long id = Environment.getArgument("id");
//查询数据库
//TODO
return new User(id,"张三",id.intValue()+10);
})
)
.build();
/**
* 定义Schema
* schema { #定义查询
* query: UserQuery
* }
* @return
*/
GraphQLSchema graphQLSchema = GraphQLSchema.newSchema()
.query(userQuery)
.build();
//构建GraphQL查询器
GraphQL graphQL = GraphQL.newGraphQL(graphQLSchema).build();
//查询结果
String query = "{user(id:100){id,name,age}}";
ExecutionResult executionResult = graphQL.execute(query);
// 打印错误
System.out.println("错误:" + executionResult.getErrors());
// 打印数据
System.out.println("结果:" +(Object) executionResult.toSpecification());
}
}
With SDL (Schema Definition Language), the GraphQL definition file is parsed and turned into Java objects
schema {
query: UserQuery
}
type UserQuery{
user(id:ID): User
}
type User{
id: ID!
name: String
age: Int
card: Card
}
type Card {
cardNumber:String!
userId: ID
}
public class GraphQLSDLDemo {
public static void main(String[] args) throws IOException {
/* 1. 读取资源,进行解析 */
//资源名
String fileName = "user.graphql";
// IOUtils comes from commons-io (groupId commons-io, artifactId commons-io)
String fileContent = IOUtils.toString(GraphQLSDLDemo.class.getClassLoader().getResource(fileName),"UTF-8");
TypeDefinitionRegistry tyRegistry = new SchemaParser().parse(fileContent);
/* 2. 数据查询 */
RuntimeWiring wiring = RuntimeWiring.newRuntimeWiring()
.type("UserQuery",builder ->
builder.dataFetcher("user", Environment->{
Long id = Long.parseLong(Environment.getArgument("id"));
Card card = new Card("number_"+id,id);
return new User(id,"张三_"+id,id.intValue()+10,card);
})
)
.build();
/* 3. 生成schema */
GraphQLSchema graphQLSchema = new SchemaGenerator().makeExecutableSchema(tyRegistry,wiring);
/* 4. 根据schema对象生成GraphQL对象 */
GraphQL graphQL = GraphQL.newGraphQL(graphQLSchema).build();
String query = "{user(id:100){id,name,age,card{cardNumber}}}";
ExecutionResult executionResult = graphQL.execute(query);
System.out.println(executionResult.toSpecification());
}
}
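The demos construct User and Card objects whose classes are not shown. A plausible plain-Java sketch with matching constructors (the real project presumably uses Lombok's @Data/@AllArgsConstructor to generate this boilerplate):

```java
public class PojoDemo {
    static class Card {
        String cardNumber;
        Long userId;
        Card(String cardNumber, Long userId) { this.cardNumber = cardNumber; this.userId = userId; }
    }

    static class User {
        Long id; String name; Integer age; Card card;
        // Matches new User(id, name, age) from the first demos
        User(Long id, String name, Integer age) { this(id, name, age, null); }
        // Matches new User(id, name, age, card) from the SDL demo
        User(Long id, String name, Integer age, Card card) {
            this.id = id; this.name = name; this.age = age; this.card = card;
        }
    }

    public static void main(String[] args) {
        User u = new User(100L, "张三_100", 110, new Card("number_100", 100L));
        System.out.println(u.name + " " + u.card.cardNumber); // 张三_100 number_100
    }
}
```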
public HouseResources queryHouseResourcesById(Long id);
@Override
public HouseResources queryHouseResourcesById(Long id) {
return (HouseResources) super.queryById(id);
}
/**
 * Query a house resource by id
 *
 * @param id the house-resource id
 * @return
 */
HouseResources queryHouseResourcesById(Long id);
@Override
public HouseResources queryHouseResourcesById(Long id) {
return houseResourcesService.queryHouseResourcesById(id);
}
HouseResourceService
@Reference(version = "${dubbo.service.version}")
private ApiHouseResourcesService apiHouseResourcesService;
/**
 * Query house-resource data by id
 *
 * @param id
 * @return
 */
public HouseResources queryHouseResourcesById(Long id){
    // query through the Dubbo service
    return this.apiHouseResourcesService.queryHouseResourcesById(id);
}
<repositories>
    <repository>
        <snapshots>
            <enabled>false</enabled>
        </snapshots>
        <id>bintray-andimarek-graphql-java</id>
        <name>bintray</name>
        <url>https://dl.bintray.com/andimarek/graphql-java</url>
    </repository>
</repositories>
<dependency>
    <groupId>com.graphql-java</groupId>
    <artifactId>graphql-java</artifactId>
    <version>16.0</version>
</dependency>
haoke.graphql
schema {
query: HaokeQuery
}
type HaokeQuery{
# 通过Id查询房源信息
HouseResources(id:ID): HouseResources
}
type HouseResources{
id:ID!
title:String
estateId:ID
buildingNum:String
buildingUnit:String
buildingFloorNum:String
rent:Int
rentMethod:Int
paymentMethod:Int
houseType:String
coveredArea:String
useArea:String
floor:String
orientation:String
decoration:Int
facilities:String
pic:String
houseDesc:String
contact:String
mobile:String
time:Int
propertyCost:String
}
Registering the GraphQL object as a bean
@Component//将GraphQL对象注入IoC容器,并完成GraphQL的初始化
public class GraphQLProvider {
private GraphQL graphQL;
@Autowired
private HouseResourceService houseResourceService;
@PostConstruct//在IoC容器初始化时运行
public void init() throws FileNotFoundException {
//导入graphql脚本
File file = ResourceUtils.getFile("classpath:haoke.graphql");
//初始化graphql
this.graphQL = GraphQL.newGraphQL(//schema { query: HaokeQuery}
new SchemaGenerator().makeExecutableSchema(
new SchemaParser().parse(file),//TypeDefinitionRegistry
RuntimeWiring.newRuntimeWiring()//RuntimeWiring
.type("HaokeQuery",builder ->
builder.dataFetcher("HouseResources", Environment->{
Long id = Long.parseLong(Environment.getArgument("id"));
return this.houseResourceService.queryHouseResourcesById(id);
})
)
.build()
)
).build();
}
@Bean
GraphQL graphQL(){
return this.graphQL;
}
}
@RequestMapping("graphql")
@Controller
public class GraphQLController {
@Autowired
private GraphQL graphQL;
@GetMapping
@ResponseBody
public Map<String,Object> graphql(@RequestParam("query")String query){
return this.graphQL.execute(query).toSpecification();
}
}
Every time a new query is added, this method has to be modified
Improvement:
package com.haoke.api.graphql;
import graphql.schema.DataFetchingEnvironment;
public interface MyDataFetcher {
/**
* 查询名称
*
* @return
*/
String fieldName();
/**
* 具体实现数据查询的逻辑
*
* @param environment
* @return
*/
Object dataFetcher(DataFetchingEnvironment environment);
}
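Stripped of the GraphQL types, the idea behind MyDataFetcher is a fieldName-to-fetcher registry: adding a query means adding a new bean, not editing the wiring code. A self-contained sketch of that dispatch (hypothetical names, plain Java in place of Spring and graphql-java):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class FetcherRegistryDemo {
    // Analogue of MyDataFetcher, minus the GraphQL types
    interface Fetcher {
        String fieldName();
        Object fetch(Map<String, Object> args);
    }

    // Analogue of HouseResourcesDataFetcher
    static final Fetcher HOUSE = new Fetcher() {
        public String fieldName() { return "HouseResources"; }
        public Object fetch(Map<String, Object> args) { return "house#" + args.get("id"); }
    };

    // Build the lookup table once, then route every query by field name
    static Map<String, Fetcher> registry(List<Fetcher> fetchers) {
        return fetchers.stream().collect(Collectors.toMap(Fetcher::fieldName, f -> f));
    }

    public static void main(String[] args) {
        Map<String, Fetcher> reg = registry(List.of(HOUSE));
        Map<String, Object> query = Map.of("id", 1L);
        System.out.println(reg.get("HouseResources").fetch(query)); // house#1
    }
}
```

In the real provider, Spring injects all MyDataFetcher beans as a list and the loop in GraphQLProvider plays the role of registry().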
@Component
public class HouseResourcesDataFetcher implements MyDataFetcher {
@Autowired
HouseResourceService houseResourceService;
@Override
public String fieldName() {
return "HouseResources";
}
@Override
public Object dataFetcher(DataFetchingEnvironment environment) {
Long id = Long.parseLong(environment.getArgument("id"));
return this.houseResourceService.queryHouseResourcesById(id);
}
}
this.graphQL = GraphQL.newGraphQL(
new SchemaGenerator().makeExecutableSchema(
new SchemaParser().parse(file),//TypeDefinitionRegistry
RuntimeWiring.newRuntimeWiring()//RuntimeWiring
.type("HaokeQuery",builder ->{
for (MyDataFetcher myDataFetcher : myDataFetchers) {
builder.dataFetcher(
myDataFetcher.fieldName(),
Environment->myDataFetcher.dataFetcher(Environment)
);
}
return builder;
}
)
.build()
)
Request URL:
Response:
So the endpoint only needs to return the image links
use haoke;
CREATE TABLE `tb_ad` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`type` int(10) DEFAULT NULL COMMENT '广告类型',
`title` varchar(100) DEFAULT NULL COMMENT '描述',
`url` varchar(200) DEFAULT NULL COMMENT '图片URL地址',
`created` datetime DEFAULT NULL,
`updated` datetime DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='广告表';
INSERT INTO `tb_ad` (`id`, `type`, `title`, `url`, `created`, `updated`) VALUES (
'1','1', 'UniCity万科天空之城',
'https://haoke-1257323542.cos.ap-beijing.myqcloud.com/ad-swipes/1.jpg',
'2021-3-24 16:36:11','2021-3-24 16:36:16');
INSERT INTO `tb_ad` (`id`, `type`, `title`, `url`, `created`, `updated`) VALUES (
'2','1', '天和尚海庭前',
'https://haoke-1257323542.cos.ap-beijing.myqcloud.com/ad-swipes/2.jpg',
'2021-3-24 16:36:43','2021-3-24 16:36:37');
INSERT INTO `tb_ad` (`id`, `type`, `title`, `url`, `created`, `updated`) VALUES (
'3', '1', '[奉贤 南桥] 光语著',
'https://haoke-1257323542.cos.ap-beijing.myqcloud.com/ad-swipes/3.jpg',
'2021-3-24 16:38:32','2021-3-24 16:38:26');
INSERT INTO `tb_ad` (`id`, `type`, `title`, `url`, `created`, `updated`) VALUES (
'4','1', '[上海周边 嘉兴] 融创海逸长洲',
'https://haoke-1257323542.cos.ap-beijing.myqcloud.com/ad-swipes/4.jpg',
'2021-3-24 16:39:10','2021-3-24 16:39:13');
Dubbo service provider
<dependencies>
    <dependency>
        <groupId>com.haoke.manage</groupId>
        <artifactId>haoke-manage-dubbo-server-common</artifactId>
        <version>1.0-SNAPSHOT</version>
    </dependency>
</dependencies>
<dependencies>
    <dependency>
        <groupId>com.haoke.manage</groupId>
        <artifactId>haoke-manage-dubbo-server-ad-interface</artifactId>
        <version>1.0-SNAPSHOT</version>
    </dependency>
</dependencies>
# Spring boot application
spring.application.name = haoke-manage-dubbo-server-ad
# database
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://8.140.130.91:3306/haoke\
?characterEncoding=utf8&useSSL=false&serverTimezone=UTC&autoReconnect=true&allowMultiQueries=true
spring.datasource.username=root
spring.datasource.password=root
# hikari settings
spring.datasource.hikari.maximum-pool-size=60
spring.datasource.hikari.idle-timeout=60000
spring.datasource.hikari.connection-timeout=60000
spring.datasource.hikari.validation-timeout=3000
spring.datasource.hikari.login-timeout=5
spring.datasource.hikari.max-lifetime=60000
# packages scanned for Dubbo services
dubbo.scan.basePackages = com.haoke.server.api
# application name
dubbo.application.name = dubbo-provider-ad
dubbo.service.version = 1.0.0
# protocol and port
dubbo.protocol.name = dubbo
dubbo.protocol.port = 21880
# zk registry
dubbo.registry.address = zookeeper://8.140.130.91:2181
dubbo.registry.client = zkclient
@Data
@TableName("tb_ad")
public class Ad extends BasePojo{
private static final long serialVersionUID = -493439243433085768L;
@TableId(value = "id", type = IdType.AUTO)
private Long id;
//ad type
private Integer type;
//title/description
private String title;
//image URL
private String url;
}
package com.haoke.server.api;
import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;
public interface ApiAdService {
/**
* 分页查询广告数据
*
* @param type 广告类型
* @param page 页数
* @param pageSize 每页显示的数据条数
* @return
*/
PageInfo<Ad> queryAdList(Integer type, Integer page, Integer pageSize);
}
package com.haoke.server.mapper;
import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.haoke.server.pojo.Ad;
public interface AdMapper extends BaseMapper<Ad> {}
Pagination configuration
package com.haoke.server.config;
import com.baomidou.mybatisplus.annotation.DbType;
import com.baomidou.mybatisplus.extension.plugins.MybatisPlusInterceptor;
import com.baomidou.mybatisplus.extension.plugins.inner.PaginationInnerInterceptor;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@MapperScan("com.haoke.server.mapper")
@Configuration
public class MybatisPlusConfig {
@Bean
public MybatisPlusInterceptor mybatisPlusInterceptor() {
MybatisPlusInterceptor interceptor = new MybatisPlusInterceptor();
PaginationInnerInterceptor paginationInnerInterceptor = new PaginationInnerInterceptor();
paginationInnerInterceptor.setDbType(DbType.MYSQL);
interceptor.addInnerInterceptor(paginationInnerInterceptor);
return interceptor;
}
}
Implement the business logic
Define the interface:
package com.haoke.server.service;
import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;
public interface AdService {
PageInfo<Ad> queryAdList(Ad ad, Integer page, Integer pageSize);
}
Implement the interface:
package com.haoke.server.service.impl;
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.core.metadata.IPage;
import com.haoke.server.pojo.Ad;
import com.haoke.server.service.AdService;
import com.haoke.server.service.BaseServiceImpl;
import com.haoke.server.vo.PageInfo;
import org.springframework.stereotype.Service;
@Service
public class AdServiceImpl extends BaseServiceImpl implements AdService {
@Override
public PageInfo<Ad> queryAdList(Ad ad, Integer page, Integer pageSize) {
QueryWrapper queryWrapper = new QueryWrapper();
//排序
queryWrapper.orderByDesc("updated");
//按广告的类型查询
queryWrapper.eq("type",ad.getType());
IPage iPage = super.queryPageList(queryWrapper,page,pageSize);
return new PageInfo<>(Long.valueOf(iPage.getTotal()).intValue(),page,pageSize,iPage.getRecords());
}
}
package com.haoke.server.api;
import com.alibaba.dubbo.config.annotation.Service;
import com.haoke.server.pojo.Ad;
import com.haoke.server.service.AdService;
import com.haoke.server.vo.PageInfo;
import org.springframework.beans.factory.annotation.Autowired;
@Service(version = "${dubbo.service.version}")
public class ApiAdServiceImpl implements ApiAdService{
@Autowired
private AdService adService;
@Override
public PageInfo<Ad> queryAdList(Integer type, Integer page, Integer pageSize) {
Ad ad = new Ad();
ad.setType(type);
return this.adService.queryAdList(ad,page,pageSize);
}
}
package com.haoke.server;
import org.springframework.boot.WebApplicationType;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.builder.SpringApplicationBuilder;
@SpringBootApplication
public class AdDubboProvider {
public static void main(String[] args) {
new SpringApplicationBuilder(AdDubboProvider.class)
.web(WebApplicationType.NONE)//非web应用
.run(args);
}
}
<dependency>
    <groupId>com.haoke.manage</groupId>
    <artifactId>haoke-manage-dubbo-server-ad-interface</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>
package com.haoke.api.vo;
import com.fasterxml.jackson.annotation.JsonIgnore;
import lombok.AllArgsConstructor;
import lombok.Data;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@Data
@AllArgsConstructor
public class WebResult {
@JsonIgnore
private int status;
@JsonIgnore
private String msg;
@JsonIgnore
private List<?> list;
@JsonIgnore
public static WebResult ok(List<?> list) {
return new WebResult(200, "成功", list);
}
@JsonIgnore
public static WebResult ok(List<?> list, String msg) {
return new WebResult(200, msg, list);
}
public Map<String, Object> getData() {
HashMap<String, Object> data = new HashMap<String, Object>();
data.put("list", this.list);
return data;
}
public Map<String, Object> getMeta() {
HashMap<String, Object> meta = new HashMap<String, Object>();
meta.put("msg", this.msg);
meta.put("status", this.status);
return meta;
}
}
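Because the raw fields carry @JsonIgnore, Jackson serializes WebResult through the getData()/getMeta() getters, producing a {data:{list:…}, meta:{status,msg}} envelope. The same shape built with plain stdlib maps (a sketch, not project code):

```java
import java.util.List;
import java.util.Map;

public class EnvelopeDemo {
    // Same shape that WebResult.getData()/getMeta() produce after Jackson serialization
    static Map<String, Object> envelope(List<?> list, int status, String msg) {
        return Map.of(
            "data", Map.of("list", list),
            "meta", Map.of("status", status, "msg", msg)
        );
    }

    public static void main(String[] args) {
        Map<String, Object> body = envelope(List.of("ad1"), 200, "成功");
        System.out.println(body.get("meta"));
    }
}
```

This is also why the frontend reads the carousel links from `data.data.list` after axios wraps the HTTP body in its own `data` property.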
package com.haoke.api.service;
import com.alibaba.dubbo.config.annotation.Reference;
import com.haoke.server.api.ApiAdService;
import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;
import org.springframework.stereotype.Service;
@Service
public class AdService {
@Reference(version = "1.0.0")
private ApiAdService apiAdService;
public PageInfo<Ad> queryAdList(Integer type, Integer page, Integer pageSize) {
return this.apiAdService.queryAdList(type, page, pageSize);
}
}
package com.haoke.api.controller;
import com.haoke.api.service.AdService;
import com.haoke.api.vo.WebResult;
import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.CrossOrigin;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
@RequestMapping("ad")
@RestController
@CrossOrigin//允许跨域
public class AdController {
@Autowired
private AdService adService;
/**
* 首页广告位
* @return
*/
@GetMapping
public WebResult queryIndexad(){
PageInfo<Ad> pageInfo = this.adService.queryAdList(1,1,3);
List<Ad> ads = pageInfo.getRecords();
List<Map<String,Object>> data = new ArrayList<>();
for (Ad ad : ads) {
Map<String,Object> map = new HashMap<>();
map.put("original",ad.getUrl());
data.add(map);
}
return WebResult.ok(data);
}
}
Update the request URL in home.js
let swipe = new Promise((resolve, reject) => {
axios.get('http://127.0.0.1:9091/ad').then((data)=>{
resolve(data.data.list);
});
})
Cross-origin issue:
{
"list": [
{
"original": "http://itcast-haoke.oss-cnqingdao.aliyuncs.com/images/2018/11/26/15432030275359146.jpg"
},
{
"original": "http://itcast-haoke.oss-cnqingdao.aliyuncs.com/images/2018/11/26/15432029946721854.jpg"
},
{
"original": "http://itcast-haoke.oss-cnqingdao.aliyuncs.com/images/2018/11/26/1543202958579877.jpg"
}
]
}
type HaokeQuery{
    # paged house-resource query, used by the portal house list
    HouseResourcesList(page:Int, pageSize:Int):TableResult
    # query house-resource info by id
    HouseResources(id:ID): HouseResources
    # home-page carousel ads, used by the portal home page
    IndexAdList: IndexAdResult
}
type IndexAdResult{
list:[IndexAdResultData]
}
type IndexAdResultData{
original: String
}
package com.haoke.api.vo.ad.index;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import java.util.List;
@Data
@AllArgsConstructor
@NoArgsConstructor
public class IndexAdResult {
private List<IndexAdResultData> list;
}
package com.haoke.api.vo.ad.index;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
@Data
@AllArgsConstructor
@NoArgsConstructor
public class IndexAdResultData {
private String original;
}
package com.haoke.api.graphql.myDataFetcherImpl;
import com.haoke.api.graphql.MyDataFetcher;
import com.haoke.api.service.AdService;
import com.haoke.api.vo.WebResult;
import com.haoke.api.vo.ad.index.IndexAdResult;
import com.haoke.api.vo.ad.index.IndexAdResultData;
import com.haoke.server.pojo.Ad;
import com.haoke.server.vo.PageInfo;
import graphql.schema.DataFetchingEnvironment;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import java.util.ArrayList;
import java.util.List;
@Component
public class IndexAdDataFetcher implements MyDataFetcher {
@Autowired
private AdService adService;
@Override
public String fieldName() {
return "IndexAdList";
}
@Override
public Object dataFetcher(DataFetchingEnvironment environment) {
PageInfo<Ad> pageInfo = this.adService.queryAdList(1, 1, 3);
List<Ad> ads = pageInfo.getRecords();
List<IndexAdResultData> list = new ArrayList<>();
for (Ad ad : ads) {
list.add(new IndexAdResultData(ad.getUrl()));
}
return new IndexAdResult(list);
}
}
{
IndexAdList{
list{
original
}
}
}
Reference: https://www.apollographql.com/docs/react/get-started/
npm install @apollo/client graphql
import { ApolloClient, InMemoryCache, gql } from '@apollo/client';
const client = new ApolloClient({
    uri: 'http://127.0.0.1:9091/graphql',
    cache: new InMemoryCache() // required in @apollo/client v3
});
// define the query
const GET_INDEX_ADS = gql`
{
IndexAdList{
list{
original
}
}
}
`;
let swipe = new Promise((resolve, reject) => {
client.query({query: GET_INDEX_ADS}).then(result =>
resolve(result.data.IndexAdList.list));
})
Two issues: Apollo Client sends queries as POST requests, and the browser enforces cross-origin rules, so the endpoint must accept POST and allow CORS:
package com.haoke.api.controller;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import graphql.GraphQL;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.*;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
@RequestMapping("graphql")
@Controller
@CrossOrigin // allow cross-origin requests
public class GraphQLController {
@Autowired
private GraphQL graphQL;
private static final ObjectMapper MAPPER = new ObjectMapper();
@GetMapping
@ResponseBody
public Map<String,Object> graphql(@RequestParam("query")String query){
return this.graphQL.execute(query).toSpecification();
}
@PostMapping
@ResponseBody
public Map<String, Object> postGraphql(@RequestBody String json) throws IOException {
try {
JsonNode jsonNode = MAPPER.readTree(json);
if(jsonNode.has("query")){
String query = jsonNode.get("query").asText();
return this.graphQL.execute(query).toSpecification();
}
}catch (IOException e){
e.printStackTrace();
}
Map<String,Object> error = new HashMap<>();
error.put("status",500);
error.put("msg","查询出错");
return error;
}
}
haoke.graphql
schema {
query: HaokeQuery
}
type HaokeQuery{
    # paginated house listing query - used by the front-end house list
    HouseResourcesList(page:Int, pageSize:Int):TableResult
    # query a house resource by id
    HouseResources(id:ID): HouseResources
    # home page ad images - used by the front-end home page
    IndexAdList: IndexAdResult
}
type HouseResources{
id:ID!
title:String
estateId:ID
buildingNum:String
buildingUnit:String
buildingFloorNum:String
rent:Int
rentMethod:Int
paymentMethod:Int
houseType:String
coveredArea:String
useArea:String
floor:String
orientation:String
decoration:Int
facilities:String
pic:String
houseDesc:String
contact:String
mobile:String
time:Int
propertyCost:String
}
type TableResult{
list: [HouseResources]
pagination: Pagination
}
type Pagination{
current:Int
pageSize:Int
total:Int
}
HouseResourcesListDataFetcher
@Component
public class HouseResourcesListDataFetcher implements MyDataFetcher {
@Autowired
HouseResourceService houseResourceService;
@Override
public String fieldName() {
return "HouseResourcesList";
}
@Override
public Object dataFetcher(DataFetchingEnvironment environment) {
Integer page = environment.getArgument("page");
if(page == null){
page = 1;
}
Integer pageSize = environment.getArgument("pageSize");
if(pageSize == null){
pageSize = 5;
}
return this.houseResourceService.queryList(null, page, pageSize);
}
}
Problem analysis: the parameters in the home page carousel ad query above are hard-coded.
In a real application, the query parameters must come from the front-end request.
https://graphql.cn/learn/queries/#variables
One option is to interpolate the values directly into the request body (POST) or the URL (GET); the drawback is that clients can then fetch arbitrary data simply by editing the query string.
GraphQL has a first-class way to factor dynamic values out of the query and pass them as a separate dictionary. These dynamic values are called variables.
query hk($id:ID){
HouseResources(id:$id){
id
title
}
}
The client sends the request above; the backend must process it and return the matching data.
From the GraphQL execution flow, the GraphQL string passed to the backend is ultimately wrapped into an ExecutionInput object.
GraphQLController
package com.haoke.api.controller;
@RequestMapping("graphql")
@Controller
@CrossOrigin//添加跨域
public class GraphQLController {
@Autowired
private GraphQL graphQL;
private static final ObjectMapper MAPPER = new ObjectMapper();
@GetMapping
@ResponseBody
public Map<String,Object> graphql(@RequestParam("query") String query,
                                  @RequestParam(value = "variables", required = false) String variablesJSON,
                                  @RequestParam(value = "operationName", required = false) String operationName){
    try {
        // deserialize the variables JSON string into a Map; the parameter is
        // optional, so fall back to an empty map when it is absent
        Map<String, Object> variables = (variablesJSON == null || variablesJSON.isEmpty())
                ? new HashMap<>()
                : MAPPER.readValue(variablesJSON,
                        MAPPER.getTypeFactory().constructMapType(HashMap.class, String.class, Object.class));
        return this.executeGraphQLQuery(query, operationName, variables);
    } catch (JsonProcessingException e) {
        e.printStackTrace();
    }
Map<String,Object> error = new HashMap<>();
error.put("status",500);
error.put("msg","查询出错");
return error;
}
@PostMapping
@ResponseBody
public Map<String, Object> postGraphql(@RequestBody Map<String,Object> map) throws IOException {
try{
String query = (String) map.get("query");
if(null == query){
query = "";
}
String operationName = (String) map.get("operationName");
if(null == operationName){
operationName = "";
}
Map variables = (Map) map.get("variables");
if(variables == null){
variables = Collections.EMPTY_MAP;
}
return this.executeGraphQLQuery(query,operationName,variables);
} catch (Exception e) {
e.printStackTrace();
}
Map<String,Object> error = new HashMap<>();
error.put("status",500);
error.put("msg","查询出错");
return error;
}
private Map<String, Object> executeGraphQLQuery(String query,String operationName,Map<String,Object> variables) {
return this.graphQL.execute(
ExecutionInput.newExecutionInput()
.query(query)
.variables(variables)
.operationName(operationName)
.build()
).toSpecification();
}
}
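The defaulting rules in postGraphql (missing query/operationName become empty strings, missing variables an empty map) can be isolated into a small helper. The sketch below uses only the plain JDK; the class and method names are illustrative, not part of the project:

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class GraphQLRequestDefaults {
    // Normalizes a parsed GraphQL request body: missing query/operationName
    // become empty strings, missing variables become an empty map.
    static Map<String, Object> normalize(Map<String, Object> body) {
        Map<String, Object> result = new HashMap<>();
        Object query = body.get("query");
        result.put("query", query == null ? "" : query);
        Object operationName = body.get("operationName");
        result.put("operationName", operationName == null ? "" : operationName);
        Object variables = body.get("variables");
        result.put("variables", variables == null ? Collections.emptyMap() : variables);
        return result;
    }

    public static void main(String[] args) {
        Map<String, Object> body = new HashMap<>();
        body.put("query", "query hk($id:ID){ HouseResources(id:$id){ id title } }");
        Map<String, Object> normalized = normalize(body);
        System.out.println(normalized.get("operationName")); // empty string
        System.out.println(normalized.get("variables"));
    }
}
```

Keeping this logic separate makes the controller body a single call into executeGraphQLQuery regardless of which fields the client sent.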
query HouseResourcesList($pageSize: Int, $page: Int) {
HouseResourcesList(pageSize: $pageSize, page: $page) {
list {
id
title
pic
coveredArea
orientation
floor
rent
}
}
}
{
"pageSize":2,
"page":1
}
import React from 'react';
import { withRouter } from 'react-router';
import { Icon,Item } from 'semantic-ui-react';
import config from '../../common.js';
import { ApolloClient, gql , InMemoryCache} from '@apollo/client';
const client = new ApolloClient({
uri: 'http://127.0.0.1:9091/graphql',
cache: new InMemoryCache()
});
// define the query
const QUERY_LIST = gql`
query HouseResourcesList($pageSize: Int, $page: Int) {
HouseResourcesList(pageSize: $pageSize, page: $page) {
list {
id
title
pic
coveredArea
orientation
floor
rent
}
}
}
`;
class HouseList extends React.Component {
constructor(props) {
super(props);
this.state = {
listData: [],
typeName: '',
type: null,
loadFlag: false
};
}
goBack = () => {
console.log(this.props.history)
this.props.history.goBack();
}
componentDidMount = () => {
const {query} = this.props.location.state;
this.setState({
typeName: query.name,
type: query.type
})
/*axios.post('/homes/list',{
home_type: query.type
}).then(ret=>{
this.setState({
listData: ret.data,
loadFlag: true
})
})*/
client.query({query:QUERY_LIST,variables:{"pageSize":2,"page":1}}).then(result=>{
console.log(result)
this.setState({
listData: result.data.HouseResourcesList.list,
loadFlag: true
})
})
}
render() {
    let list = null;
    if(this.state.loadFlag) {
        // NOTE: the original JSX markup was lost in extraction; this is a
        // reconstruction using semantic-ui-react's Item components
        list = this.state.listData.map(item=>{
            return (
                <Item key={item.id}>
                    <Item.Content>
                        <Item.Header>{item.title}</Item.Header>
                        <Item.Meta>{item.coveredArea} ㎡/{item.orientation}/{item.floor}</Item.Meta>
                        <Item.Description>上海</Item.Description>
                        <Item.Extra>{item.rent}</Item.Extra>
                    </Item.Content>
                </Item>
            )
        });
    }
    return (
        <div>
            <h2>{this.state.typeName}</h2>
            <Item.Group>{list}</Item.Group>
        </div>
    );
}
}
export default withRouter(HouseList);
haoke-manage-api-server
/**
 * Update a house resource
 *
 * @param houseResources JSON payload
 * @return
 */
@PutMapping
@ResponseBody
public ResponseEntity<Void> update(@RequestBody HouseResources houseResources) {
try {
boolean bool = this.houseResourceService.update(houseResources);
if (bool) {
return ResponseEntity.status(HttpStatus.NO_CONTENT).build();
}
} catch (Exception e) {
e.printStackTrace();
}
return ResponseEntity.status(HttpStatus.INTERNAL_SERVER_ERROR).build();
}
haoke-manage-api-server
public boolean update(HouseResources houseResources) {
return this.apiHouseResourcesService.updateHouseResources(houseResources);
}
haoke-manage-dubbo-server-house-resources-interface
ApiHouseResourcesService
/**
 * Update a house resource
 *
 * @param houseResources
 * @return
 */
boolean updateHouseResources(HouseResources houseResources);
Implementation class ApiHouseResourcesServiceImpl:
/**
 * Update a house resource
 *
 * @param houseResources
 * @return
 */
@Override
public boolean updateHouseResources(HouseResources houseResources) {
return this.houseResourcesService.updateHouseResources(houseResources);
}
Update the business service HouseResourcesServiceImpl:
@Override
public boolean updateHouseResources(HouseResources houseResources) {
return super.update(houseResources)==1;
}
BaseServiceImpl
/**
 * Update a record
 * @param record
 * @return number of rows affected
 */
public Integer update(T record) {
record.setUpdated(new Date());
return this.mapper.updateById(record);
}
render: (text, record) => (
    <Fragment>
        {/* the original JSX was lost in extraction; reconstructed sketch */}
        <a onClick={() => this.handleUpdateModalVisible(true, record)}>查看</a>
        {/* 弹窗组件 */}
        <EditResource record={record} reload={() => this.reload()} />
        <a>删除</a>
    </Fragment>
),
reload(){// refresh the current page
const { dispatch } = this.props;
dispatch({
type: 'houseResource/fetch'
});
}
import React from 'react';
import {Card, Checkbox, Form, Input, Modal, Select} from "antd";
import {connect} from "dva";
import PicturesWall from "../Utils/PicturesWall";
const FormItem = Form.Item;
const InputGroup = Input.Group;
const CheckboxGroup = Checkbox.Group;
const { TextArea } = Input;
const formItemLayout = {
labelCol: {
xs: { span: 24 },
sm: { span: 7 },
},
wrapperCol: {
xs: { span: 24 },
sm: { span: 12 },
md: { span: 10 },
},
};
const paymentMethod = [
"",
"付一押一",
"付三押一",
"付六押一",
"年付押一",
"其他"
]
const decoration = [
"",
"精装",
"简装",
"毛坯"
]
const rentMethod = [
"",
"整租",
"合租"
]
const time = [
"",
"上午",
"中午",
"下午",
"晚上",
"全天"
]
const facilities = [
"",
"水",
"电",
"煤气/天然气",
"暖气",
"有线电视",
"宽带",
"电梯",
"车位/车库",
"地下室/储藏室"
]
function isChinese(temp){
    // rough test for CJK characters (detects labels not yet converted to numeric codes)
    const re = /^[\u3220-\uFA29]+$/;
    return re.test(temp);
}
@connect()
@Form.create()/* form fields are bound only when the component is decorated with @Form.create() */
class EditResource extends React.Component{
constructor(props){
super(props);
console.log("==== incoming record ====")
console.log(this.props.record)
this.state={
visible:false,
pics:new Set()
};
}
/* show the edit modal */
showModal = () => {
this.setState({
visible: true
});
};
/* hide the edit modal */
handleCancel = () => {
this.setState({
visible: false,
});
};
handleSave = () => {
const { dispatch, form, record } = this.props;
form.validateFieldsAndScroll((err, values) => {
if (!err) {
// house resource id
values.id = record.id;
// viewing time: map the Chinese label back to its numeric code
if(isChinese(values.time)){
for (let i = 1; i < time.length; i++) {
if(time[i]==values.time)
values.time=i;
}
}
// payment method
if(isChinese(values.paymentMethod)){
for (let i = 1; i < paymentMethod.length; i++) {
if(paymentMethod[i]==values.paymentMethod)
values.paymentMethod=i;
}
}
// rentMethod
if(isChinese(values.rentMethod)){
for (let i = 1; i < rentMethod.length; i++) {
if(rentMethod[i]==values.rentMethod)
values.rentMethod=i;
}
}
// decoration
if(isChinese(values.decoration)){
for (let i = 1; i < decoration.length; i++) {
if(decoration[i]==values.decoration)
values.decoration=i;
}
}
if(values.floor_1 && values.floor_2){
values.floor = `${values.floor_1 }/${ values.floor_2}`;
}
// facilities
if(values.facilities){
values.facilities = values.facilities.join(",");
}
// building info
values.buildingNum = record.buildingNum;
values.buildingUnit = record.buildingUnit;
values.buildingFloorNum = record.buildingFloorNum;
delete values.building;
// pictures
if(this.state.pics.size > 0){
values.pic = [...this.state.pics].join(',');
}else{
values.pic = record.pic;
}
console.log("==== submitted values ====")
console.log(values)
dispatch({
type: 'house/updateHouseForm',
payload: values,
});
setTimeout(()=>{
this.handleCancel();
this.props.reload();
},500)
}
});
};
handleFileList = (obj)=>{
const pics = new Set();
obj.forEach((v, k) => {
if(v.response){
pics.add(v.response.name);
}
if(v.url){
pics.add(v.url);
}
});
this.setState({
pics
})
}
render(){
const {record} = this.props;
const {
form: { getFieldDecorator }
} = this.props;
return (
    <React.Fragment>
        {/* the original JSX was lost in extraction; reconstructed sketch */}
        <a onClick={()=>{this.showModal()}}>编辑</a>
        <Modal
            visible={this.state.visible}
            onOk={()=>{this.handleSave()}}
            onCancel={()=>{this.handleCancel()}}
            destroyOnClose
        >
            {/* form items (Form.Item with getFieldDecorator) omitted */}
        </Modal>
    </React.Fragment>
)
}
}
export default EditResource;
import { routerRedux } from 'dva/router';
import { message } from 'antd';
import { addHouseResource,updateHouseResource } from '@/services/haoke/haoke';
export default {
namespace: 'house',
state: {
},
effects: {
*submitHouseForm({ payload }, { call }) {
console.log("page model")
yield call(addHouseResource, payload);
message.success('提交成功');
},
*updateHouseForm({ payload }, { call }) {
console.log("updateHouseForm")
yield call(updateHouseResource, payload);
message.success('提交成功');
}
},
reducers: {}
};
import request from '@/utils/request';
export async function addHouseResource(params) {
return request('/haoke/house/resources', {
method: 'POST',
body: params
});
}
export async function updateHouseResource(params) {
console.log(params)
return request('/haoke/house/resources', {
method: 'PUT',
body: params
});
}
The modal must be destroyed when it is closed (destroyOnClose), otherwise it interferes with the next update operation.
JDBC timeouts: Hikari caps the wait for a connection at 30s by default; multi-threading is involved and the root cause is still unclear. Workaround:
spring:
  datasource:
    hikari:
      maximum-pool-size: 60
      idle-timeout: 60000
      connection-timeout: 60000
      validation-timeout: 3000
      login-timeout: 5
      max-lifetime: 60000
Analysis:
In the Dubbo provider project (haoke-manage-dubbo-server), move BasePOJO, BaseServiceImpl and vo.PageInfo into a common module and import the shared dependencies there.
The other projects depend on this module and delete their own copies of these classes.
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>haoke-manage-dubbo-server</artifactId>
        <groupId>com.haoke.manage</groupId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>haoke-manage-dubbo-server-common</artifactId>
    <dependencies>
        <dependency>
            <groupId>com.baomidou</groupId>
            <artifactId>mybatis-plus-boot-starter</artifactId>
            <version>3.4.2</version>
        </dependency>
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>8.0.16</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-jdbc</artifactId>
        </dependency>
    </dependencies>
</project>
Implement an image upload feature for the other services to use.
An image upload service needs storage backing; the options are:
vo: the response object returned by the service provider
package com.haoke.api.vo;
import lombok.Data;
@Data
public class PicUploadResult {
    // unique file identifier (antd Upload expects "uid")
    private String uid;
    // file name; here it carries the accessible URL of the uploaded image
    private String name;
    // status: uploading / done / error / removed
    private String status;
    // server response content, e.g. '{"status": "success"}'
    private String response;
}
PicUploadFileSystemService
package com.haoke.api.service;
@Service // Spring-managed bean
public class PicUploadFileSystemService {
    // allowed upload formats
    private static final String[] IMAGE_TYPE
            = new String[]{ ".bmp", ".jpg", ".jpeg", ".gif", ".png"};
public PicUploadResult upload(MultipartFile uploadFile) {
    // validate the file extension; only images may be uploaded
    boolean isLegal = false;
    for (String type : IMAGE_TYPE) {
        if(StringUtils.endsWithIgnoreCase(
                uploadFile.getOriginalFilename(),
                type)) {
            isLegal = true;
            break;
        }
    }
    // build the result object
    PicUploadResult fileUploadResult = new PicUploadResult();
    // illegal format: mark the result as error and return
    if (!isLegal) {
        fileUploadResult.setStatus("error");
        return fileUploadResult;
    }
    String fileName = uploadFile.getOriginalFilename();
    // new file path on disk
    String filePath = getFilePath(fileName);
    // build the image's absolute reference URL
    String picUrl = StringUtils.replace(
            StringUtils.substringAfter(filePath, "F:\\haoke-upload"),
            "\\", "/");
    fileUploadResult.setName("http://image.haoke.com" + picUrl);
    File newFile = new File(filePath);
    // write the file to disk
    try {
        uploadFile.transferTo(newFile);
    } catch (IOException e) {
        e.printStackTrace();
        // upload failed
        fileUploadResult.setStatus("error");
        return fileUploadResult;
    }
    fileUploadResult.setStatus("done");
    fileUploadResult.setUid(String.valueOf(System.currentTimeMillis()));
    return fileUploadResult;
}
private String getFilePath(String sourceFileName) {
String baseFolder =
"F:\\haoke-upload" +
File.separator +
"images";
Date nowDate = new Date();
// yyyy/MM/dd
String fileFolder = baseFolder +
File.separator +
new DateTime(nowDate).toString("yyyy") +
File.separator +
new DateTime(nowDate).toString("MM") +
File.separator +
new DateTime(nowDate).toString("dd");
File file = new File(fileFolder);
if (!file.isDirectory()) {
    // create the directory if it does not exist
    file.mkdirs();
}
// generate a new file name
String fileName =
new DateTime(nowDate).toString("yyyyMMddhhmmssSSSS") +
RandomUtils.nextInt(100, 9999) +
"." +
StringUtils.substringAfterLast(sourceFileName, ".");
return fileFolder + File.separator + fileName;
}
}
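The date-based path scheme in getFilePath can also be expressed with the JDK's java.time instead of Joda-Time. This is a sketch; the base folder and class name are illustrative:

```java
import java.io.File;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.concurrent.ThreadLocalRandom;

public class UploadPathDemo {
    // Builds <base>/images/yyyy/MM/dd/<timestamp><random>.<ext>,
    // mirroring the Joda-Time logic above with java.time.
    static String buildFilePath(String baseFolder, String sourceFileName) {
        LocalDateTime now = LocalDateTime.now();
        String folder = baseFolder + File.separator + "images"
                + File.separator + now.format(DateTimeFormatter.ofPattern("yyyy"))
                + File.separator + now.format(DateTimeFormatter.ofPattern("MM"))
                + File.separator + now.format(DateTimeFormatter.ofPattern("dd"));
        String ext = sourceFileName.substring(sourceFileName.lastIndexOf('.') + 1);
        String name = now.format(DateTimeFormatter.ofPattern("yyyyMMddHHmmssSSS"))
                + ThreadLocalRandom.current().nextInt(100, 9999) + "." + ext;
        return folder + File.separator + name;
    }

    public static void main(String[] args) {
        System.out.println(buildFilePath("/tmp/haoke-upload", "photo.jpg"));
    }
}
```

Sharding the upload directory by date keeps any single directory from accumulating too many files.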
package com.haoke.api.controller;
@RequestMapping("pic/upload")
@Controller
public class PicUploadController {
@Autowired
private PicUploadFileSystemService picUploadService;
/**
* @param uploadFile
* @return
* @throws Exception
*/
@PostMapping
@ResponseBody
public PicUploadResult upload(@RequestParam("file") MultipartFile uploadFile)
throws Exception {
return this.picUploadService.upload(uploadFile);
}
}
The generated link is a URL; nginx must map it to the files on disk.
Edit nginx directory/conf/nginx.conf:
server {
listen 80;
server_name image.haoke.com;
#charset koi8-r;
#access_log logs/host.access.log main;
proxy_set_header X-Forwarded-Host $host;
proxy_set_header X-Forwarded-Server $host;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
location / {
    root E:/idea/graduateProject/code/upload;
}
}
If you do not have permission to modify C:\Windows\System32\drivers\etc\hosts:
Open C:\Windows\System32\drivers\etc\, locate hosts, and grant yourself full permissions; or
open the hosts file and simply save it back to its original path; or
right-click SwitchHosts! and run it as administrator.
# development environment
127.0.0.1 manage.haoke.com
127.0.0.1 image.haoke.com
cd <nginx directory>
start nginx.exe
# stop nginx
nginx -s stop
<dependency>
    <groupId>com.qcloud</groupId>
    <artifactId>cos_api</artifactId>
    <version>5.6.37</version>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-configuration-processor</artifactId>
    <optional>true</optional>
</dependency>
# account appid
tencent.cos.appid=100018187662
# sub-user credentials (redacted; never commit real keys)
tencent.cos.secret-id=<your-secret-id>
tencent.cos.secret-key=<your-secret-key>
# bucket name
tencent.cos.bucket-name=haoke-1257323542
# bucket region
tencent.cos.region-id=ap-beijing
# URL prefix for uploaded files
tencent.cos.base-url=https://haoke-1257323542.cos.ap-beijing.myqcloud.com
package com.haoke.api.config;
import com.qcloud.cos.COSClient;
import com.qcloud.cos.ClientConfig;
import com.qcloud.cos.auth.BasicCOSCredentials;
import com.qcloud.cos.auth.COSCredentials;
import com.qcloud.cos.region.Region;
import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
@Data
@Configuration
@PropertySource(value = {"classpath:tencent.properties"})
@ConfigurationProperties(prefix = "tencent.cos")
public class CosConfig {
private String appId;
private String secretId;
private String secretKey;
private String bucketName;
private String regionId;
private String baseUrl;
@Bean
public COSClient cosClient() {
//1. initialize the credentials
COSCredentials cred = new BasicCOSCredentials(this.secretId, this.secretKey);
//2. set the bucket region; see https://cloud.tencent.com/document/product/436/6224 for region names
ClientConfig clientConfig = new ClientConfig(new Region(this.regionId));
//3. build the COS client
COSClient cosClient = new COSClient(cred, clientConfig);
return cosClient;
}
}
PicUploadTencentService
package com.haoke.api.service;
@Service
public class PicUploadTencentService {
// allowed upload formats
private static final String[] IMAGE_TYPE
        = new String[] {".bmp", ".jpg", ".jpeg", ".gif", ".png"};
@Autowired
private COSClient cosClient; // COS client
@Autowired
private CosConfig cosConfig;
public PicUploadResult upload(MultipartFile uploadFile) {
    // validate the file extension
    boolean isLegal = false;
    for (String type : IMAGE_TYPE) {
        if (StringUtils.endsWithIgnoreCase(
                uploadFile.getOriginalFilename(), type)) {
            // the suffix is an image type, so the file may be uploaded
            isLegal = true;
            break;
        }
    }
    // build the result object
    PicUploadResult fileUploadResult = new PicUploadResult();
    if(!isLegal){
        fileUploadResult.setStatus("error");
        return fileUploadResult;
    }
    // new object key in the bucket
    String fileName = uploadFile.getOriginalFilename();
    String filePath = getFilePath(fileName);
    File localFile = null;
    // upload to Tencent COS through a temporary local file
    try {
        // createTempFile's prefix must not contain path separators,
        // so derive it from the last path segment only
        String baseName = StringUtils.substringAfterLast(filePath, "/");
        localFile = File.createTempFile(
                StringUtils.substringBeforeLast(baseName, "."),
                "." + StringUtils.substringAfterLast(baseName, "."));
        uploadFile.transferTo(localFile);
        localFile.deleteOnExit();
        cosClient.putObject(
                cosConfig.getBucketName(),
                filePath,
                localFile
        );
    } catch (Exception e) {
        e.printStackTrace();
        // upload failed
        fileUploadResult.setStatus("error");
        return fileUploadResult;
    }
    // do NOT shut the client down here: cosClient is a shared Spring bean
    // and must remain open for subsequent uploads
    // result returned to the front end (note the "/" between base URL and key)
    fileUploadResult.setStatus("done");
    fileUploadResult.setName(this.cosConfig.getBaseUrl() + "/" + filePath);
    fileUploadResult.setUid(String.valueOf(System.currentTimeMillis()));
    return fileUploadResult;
}
private String getFilePath(String fileName) {
DateTime dateTime = new DateTime();
return "images/" +
dateTime.toString("yyyy")+
"/" + dateTime.toString("MM") + "/" +
dateTime.toString("dd") + "/" +
System.currentTimeMillis() +
RandomUtils.nextInt(100, 9999) + "." +
StringUtils.substringAfterLast(fileName, ".");
}
}
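File.createTempFile rejects prefixes that contain path separators, so when a temporary file name must be derived from an object key such as images/2021/05/26/xxx.jpg, only the last path segment should be used. A minimal sketch (the class name is illustrative):

```java
import java.io.File;
import java.io.IOException;
import java.io.UncheckedIOException;

public class TempFileDemo {
    // Derives a safe (prefix, suffix) pair from an object key:
    // only the last segment is used, and the extension keeps its dot.
    static File tempFileFor(String objectKey) {
        String baseName = objectKey.substring(objectKey.lastIndexOf('/') + 1);
        int dot = baseName.lastIndexOf('.');
        String prefix = dot > 0 ? baseName.substring(0, dot) : baseName;
        String suffix = dot > 0 ? baseName.substring(dot) : ".tmp";
        try {
            File f = File.createTempFile(prefix, suffix);
            f.deleteOnExit();
            return f;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        File f = tempFileFor("images/2021/05/26/1622000000123.jpg");
        System.out.println(f.getName());
    }
}
```

deleteOnExit keeps the temp directory from filling up with uploaded copies once the JVM exits.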
package com.haoke.api.controller;
@RequestMapping("pic/upload")
@Controller
public class PicUploadController {
@Autowired
private PicUploadTencentService picUploadTencentService;
/*@Autowired
private PicUploadFileSystemService picUploadService;
*/
/**
* @param uploadFile
* @return
* @throws Exception
*/
@PostMapping
@ResponseBody
public PicUploadResult upload(@RequestParam("file") MultipartFile uploadFile)
throws Exception {
//return this.picUploadService.upload(uploadFile);
return this.picUploadTencentService.upload(uploadFile);
}
}
handleFileList = (obj)=>{
const pics = new Set();
obj.forEach((v, k) => {
if(v.response){
pics.add(v.response.name);
}
});
this.setState({
pics
})
}
values.pic = [...this.state.pics].join(',');
If every API call queries the database, the database comes under heavy concurrent pressure, so the interfaces need a cache.
The project uses Redis as a cluster, accessed through Spring Data Redis.
The caching logic therefore belongs in the API service.
# show or configure network interfaces
ifconfig
#pull the image
docker pull redis
#create the containers
docker create --name redis-node01 -v /data/redis-data/node01:/data -p 6379:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-01.conf
docker create --name redis-node02 -v /data/redis-data/node02:/data -p 6380:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-02.conf
docker create --name redis-node03 -v /data/redis-data/node03:/data -p 6381:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-03.conf
#start the containers
docker start redis-node01 redis-node02 redis-node03
#assemble the cluster
#operate from inside redis-node01
docker exec -it redis-node01 /bin/bash
#create the cluster
redis-cli --cluster create 172.17.0.1:6379 172.17.0.1:6380 172.17.0.1:6381 --cluster-replicas 0
Cluster creation fails this way, though each node can still be connected to individually.
172.17.0.1 is the address the host assigns to the docker bridge.
#stop the containers
docker stop redis-node01 redis-node02 redis-node03
#remove the containers
docker rm redis-node01 redis-node02 redis-node03
#remove the redis data directory
rm -rf /data/redis-data
#create the containers
docker create --name redis-node01 -v /data/redis-data/node01:/data -p 6379:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-01.conf
docker create --name redis-node02 -v /data/redis-data/node02:/data -p 6380:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-02.conf
docker create --name redis-node03 -v /data/redis-data/node03:/data -p 6381:6379 redis --cluster-enabled yes --cluster-config-file nodes-node-03.conf
#start the containers
docker start redis-node01 redis-node02 redis-node03
#look up each container's ip address
docker inspect redis-node01 -> 172.17.0.3
docker inspect redis-node02 -> 172.17.0.5
docker inspect redis-node03 -> 172.17.0.6
#operate from inside redis-node01
docker exec -it redis-node01 /bin/bash
#assemble the cluster (every node listens on 6379 inside its own container)
redis-cli --cluster create 172.17.0.3:6379 172.17.0.5:6379 172.17.0.6:6379 --cluster-replicas 0
The cluster is assembled successfully.
Check the cluster info:
root@03adb7fdf0e0:/data# redis-cli
127.0.0.1:6379> CLUSTER NODES
The node IPs inside the cluster are assigned by docker, so external clients cannot reach them.
Docker network types:
Host mode: the container gets no network namespace of its own; it shares the physical host's Network Namespace, including all of the host's ports and IPs. The drawback is that the container is exposed directly on the public network, which is a security risk.
#stop the containers
docker stop redis-node01 redis-node02 redis-node03
#remove the containers
docker rm redis-node01 redis-node02 redis-node03
#remove the redis data directory
rm -rf /data/redis-data
#create the containers
docker create --name redis-node01 --net host -v /data/redis-data/node01:/data redis --cluster-enabled yes --cluster-announce-ip 8.140.130.91 --cluster-announce-bus-port 16379 --cluster-config-file nodes-node-01.conf --port 6379
docker create --name redis-node02 --net host -v /data/redis-data/node02:/data redis --cluster-enabled yes --cluster-announce-ip 8.140.130.91 --cluster-announce-bus-port 16380 --cluster-config-file nodes-node-02.conf --port 6380
docker create --name redis-node03 --net host -v /data/redis-data/node03:/data redis --cluster-enabled yes --cluster-announce-ip 8.140.130.91 --cluster-announce-bus-port 16381 --cluster-config-file nodes-node-03.conf --port 6381
#start the containers
docker start redis-node01 redis-node02 redis-node03
#operate from inside the redis-node01 container
docker exec -it redis-node01 /bin/bash
#8.140.130.91 is the host's ip address
redis-cli --cluster create 8.140.130.91:6379 8.140.130.91:6380 8.140.130.91:6381 --cluster-replicas 0
--name: container name
-v /data/redis-data/node01:/data: map the container's data directory to a host path
-p 6380:6379: port mapping
--cluster-enabled yes: enable cluster mode
--cluster-config-file nodes-node-01.conf: this node's cluster config file
--cluster-announce-ip 8.140.130.91: the public ip announced to the cluster
--cluster-announce-bus-port 16379: the cluster bus port
If the announce ip and bus port are not set, clients fail with
JedisClusterMaxAttemptsException: No more cluster attempts left.
Check the cluster info:
redis-cli
CLUSTER NODES
Test the cluster:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>
<dependency>
    <groupId>redis.clients</groupId>
    <artifactId>jedis</artifactId>
</dependency>
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.6</version>
</dependency>
# redis cluster configuration
spring.redis.jedis.pool.max-wait = 5000
spring.redis.jedis.pool.max-idle = 100
spring.redis.jedis.pool.min-idle = 10
spring.redis.timeout = 10
spring.redis.cluster.nodes = 8.140.130.91:6379,8.140.130.91:6380,8.140.130.91:6381
spring.redis.cluster.max-redirects=5
package com.haoke.api.config;
import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.PropertySource;
import org.springframework.stereotype.Component;
import java.util.List;
/* reads the redis cluster configuration */
@PropertySource(value = "classpath:application.properties")
@ConfigurationProperties(prefix = "spring.redis.cluster")
@Component
@Data
public class ClusterConfigurationProperties {
private List<String> nodes;
private Integer maxRedirects; // maximum number of cluster redirects
}
Reference: https://blog.csdn.net/qq_40091033/article/details/106682199
package com.haoke.api.config;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisClusterConfiguration;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.connection.jedis.JedisConnectionFactory;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.data.redis.serializer.StringRedisSerializer;
/* redis cluster connection configuration */
@Configuration
public class RedisClusterConfig {
@Autowired
private ClusterConfigurationProperties clusterProperties;
@Bean
public RedisConnectionFactory connectionFactory() { // redis connection factory
RedisClusterConfiguration redisClusterConfiguration
= new RedisClusterConfiguration(clusterProperties.getNodes());
redisClusterConfiguration.setMaxRedirects(clusterProperties.getMaxRedirects());
return new JedisConnectionFactory(redisClusterConfiguration);
}
@Bean
public RedisTemplate<String,String> redisTemplate(RedisConnectionFactory redisConnectionFactory){
RedisTemplate<String, String> redisTemplate = new RedisTemplate<>();
redisTemplate.setConnectionFactory(redisConnectionFactory); // set the connection factory
redisTemplate.setKeySerializer(new StringRedisSerializer());   // key serializer
redisTemplate.setValueSerializer(new StringRedisSerializer()); // value serializer
redisTemplate.afterPropertiesSet();
return redisTemplate;
}
}
package com.haoke.api;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.test.context.junit4.SpringRunner;
import java.util.Set;
@RunWith(SpringRunner.class)
@SpringBootTest
public class TestRedis {
@Autowired
private RedisTemplate redisTemplate;
@Test
public void testSave(){
for (int i = 0; i < 100; i++) {
this.redisTemplate.opsForValue().set("key_" + i, "value_"+i);
}
Set<String> keys = this.redisTemplate.keys("key_*"); // list all matching keys
for (String key : keys) {
String value = (String) this.redisTemplate.opsForValue().get(key);
System.out.println(value);
}
}
}
#operate from inside the redis-node01 container
docker exec -it redis-node01 /bin/bash
redis-cli -c -p 6379
Implementing the cache involves two parts:
Cache lookup: decide whether the redis cache hits. For a POST request the query must be read from the body's input stream to build the key; on a miss the request is passed on, but the input stream can only be read once and has already been consumed, so the controller would find no parameters. A request wrapper is therefore needed so the body can be read more than once.
Cache population: read the result returned by the controller and store it in redis, following the AOP idea of post-processing controller responses.
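The cache-hit / cache-populate flow can be sketched independently of Spring, with a plain map standing in for Redis. This is purely illustrative; names and structure are not part of the project:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Supplier;

public class CacheAsideDemo {
    private final Map<String, String> cache = new ConcurrentHashMap<>();
    int backendCalls = 0; // counts how often the "controller" actually runs

    // Cache-aside: return the cached value on a hit, otherwise invoke
    // the backend and store its result for subsequent requests.
    String handle(String key, Supplier<String> backend) {
        String hit = cache.get(key);
        if (hit != null) {
            return hit; // the interceptor short-circuits the request
        }
        String data = backend.get();
        backendCalls++;
        cache.put(key, data); // populated after the controller returns (the AOP step)
        return data;
    }

    public static void main(String[] args) {
        CacheAsideDemo demo = new CacheAsideDemo();
        demo.handle("WEB_DATA_abc", () -> "{\"list\":[]}");
        demo.handle("WEB_DATA_abc", () -> "{\"list\":[]}");
        System.out.println(demo.backendCalls); // the second call is a cache hit
    }
}
```

The interceptor below implements the lookup half of this flow against the real Redis cluster.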
package com.haoke.api.interceptor;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.stereotype.Component;
import org.springframework.web.servlet.HandlerInterceptor;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.Map;
@Component
public class RedisInterceptor implements HandlerInterceptor {
@Autowired
private RedisTemplate<String,String> redisTemplate;
private static ObjectMapper mapper = new ObjectMapper();
@Override
public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
    // PUT and DELETE requests are never cached, so let them through;
    // a POST may be a GraphQL query, which must still be intercepted
    if(!StringUtils.equalsIgnoreCase(request.getMethod(), "GET")){
        if(!StringUtils.equalsIgnoreCase(request.getRequestURI(), "/graphql"))
            return true;
    }
    // try to hit the cache with the computed redis key
    String data = this.redisTemplate.opsForValue().get(createRedisKey(request));
    if(StringUtils.isEmpty(data)){
        // cache miss: let the request through
        return true;
    }
    // cache hit: write the cached data straight to the response
    response.setCharacterEncoding("UTF-8");
    response.setContentType("application/json; charset=utf-8");
    response.getWriter().write(data);
    return false;
}
public static String createRedisKey(HttpServletRequest request) throws IOException {
    String paramStr = request.getRequestURI();
    Map<String,String[]> parameterMap = request.getParameterMap();
    if(parameterMap.isEmpty()){
        // no URL parameters: a POSTed GraphQL request, so read the body.
        // NOTE: the body input stream can only be read once; without a
        // request wrapper the controller will later find it empty
        paramStr += IOUtils.toString(request.getInputStream(), "UTF-8");
    }else{
        paramStr += mapper.writeValueAsString(request.getParameterMap());
    }
    return "WEB_DATA_" + DigestUtils.md5Hex(paramStr);
}
}
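For reference, the key scheme used by createRedisKey (the WEB_DATA_ prefix plus lower-case MD5 hex of URI + serialized parameters or body) can be sketched with the JDK alone. In this sketch, RedisKeyDemo is a hypothetical class name and java.security.MessageDigest stands in for commons-codec's DigestUtils.md5Hex:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class RedisKeyDemo {
    // prefix + hex(md5(uri + serialized params or body)), as in createRedisKey
    public static String createRedisKey(String paramStr) throws NoSuchAlgorithmException {
        byte[] digest = MessageDigest.getInstance("MD5")
                .digest(paramStr.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b)); // lower-case hex, matching DigestUtils.md5Hex
        }
        return "WEB_DATA_" + hex;
    }

    public static void main(String[] args) throws NoSuchAlgorithmException {
        // MD5("abc") is the well-known test vector 900150983cd24fb0d6963f7d28e17f72
        System.out.println(createRedisKey("abc")); // prints "WEB_DATA_900150983cd24fb0d6963f7d28e17f72"
    }
}
```

Hashing the parameter string keeps the Redis key short and uniform no matter how large the GraphQL query body is.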
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.InterceptorRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;
@Configuration
public class WebConfig implements WebMvcConfigurer {
@Autowired
private RedisInterceptor redisInterceptor;
//Register the custom interceptor with the web container
@Override
public void addInterceptors(InterceptorRegistry registry) {
//Every request passes through the interceptor
registry.addInterceptor(redisInterceptor).addPathPatterns("/**");
}
}
paramStr = "/graphql{"query":"query HouseResourcesList($pageSize: Int, $page: Int) {\n HouseResourcesList(pageSize: $pageSize, page: $page) {\n list {\n id\n title\n pic\n title\n coveredArea\n orientation\n floor\n rent\n }\n }\n}","variables":{"pageSize":2,"page":1},"operationName":"HouseResourcesList"}"
The MD5 of this parameter string is
redisKey = WEB_DATA_822d7e70c286f68877cb6759b04498d4
Since Redis holds no key equal to redisKey, the lookup returns data = null.
However, because the interceptor has already read the input stream, and a request's input stream can only be read once, the stream is empty by the time the request reaches the controller, so the controller cannot get the body.
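The read-once behaviour is easy to reproduce with a plain byte stream, and the fix mirrors what the request wrapper below does: cache the bytes once and hand out a fresh stream for every read. A minimal sketch, in which ReadOnceDemo is a hypothetical class and ByteArrayInputStream stands in for the servlet input stream:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ReadOnceDemo {
    // Drain a stream into a String (roughly what IOUtils.toString does; ASCII-only here)
    public static String drain(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        int c;
        while ((c = in.read()) != -1) {
            sb.append((char) c);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        byte[] body = "{\"query\":\"...\"}".getBytes();
        InputStream in = new ByteArrayInputStream(body);
        String first = drain(in);  // the full body
        String second = drain(in); // "" -- the stream is exhausted, just as the controller sees it
        // The wrapper's fix: keep the bytes, serve a fresh stream on every getInputStream() call
        String again = drain(new ByteArrayInputStream(body)); // the full body again
        System.out.println(first.length() + " " + second.length() + " " + again.length());
    }
}
```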
Wrapping the HttpServletRequest
package com.haoke.api.interceptor;
import org.apache.commons.io.IOUtils;
import javax.servlet.ReadListener;
import javax.servlet.ServletInputStream;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletRequestWrapper;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
/**
* Wraps HttpServletRequest so that the request body can be read more than once
*/
public class MyServletRequestWrapper extends HttpServletRequestWrapper {
private final byte[] body;
/**
* Construct a wrapper for the specified request.
*
* @param request The request to be wrapped
*/
public MyServletRequestWrapper(HttpServletRequest request) throws IOException {
super(request);
body = IOUtils.toByteArray(super.getInputStream());//read the input stream once and cache it in the wrapper
}
@Override
public BufferedReader getReader() throws IOException {
return new BufferedReader(new InputStreamReader(getInputStream()));
}
@Override
public ServletInputStream getInputStream() throws IOException {
return new RequestBodyCachingInputStream(body);
}
private class RequestBodyCachingInputStream extends ServletInputStream {
//An input stream backed by the cached body, so the data can be served on every read
private byte[] body;
private int lastIndexRetrieved = -1;
private ReadListener listener;
public RequestBodyCachingInputStream(byte[] body) {
this.body = body;
}
@Override
public int read() throws IOException {
if (isFinished()) {
return -1;
}
int i = body[lastIndexRetrieved + 1] & 0xFF;//mask to 0..255: read() must not return a sign-extended negative byte
lastIndexRetrieved++;
if (isFinished() && listener != null) {
try {
listener.onAllDataRead();
} catch (IOException e) {
listener.onError(e);
throw e;
}
}
return i;
}
@Override
public boolean isFinished() {
return lastIndexRetrieved == body.length - 1;
}
@Override
public boolean isReady() {
// The body is fully buffered in memory, so a read never blocks
return true;
}
@Override
public void setReadListener(ReadListener readListener) {
if (readListener == null) {
throw new IllegalArgumentException("listener cannot be null");
}
if (this.listener != null) {
throw new IllegalStateException("listener has already been set");
}
this.listener = readListener;
try {
if (!isFinished()) {
//the whole body is buffered, so data is available immediately
listener.onDataAvailable();
} else {
listener.onAllDataRead();
}
} catch (IOException e) {
listener.onError(e);
}
}
@Override
public int available() throws IOException {
return body.length - lastIndexRetrieved - 1;
}
@Override
public void close() throws IOException {
lastIndexRetrieved = body.length - 1;
body = null;
}
}
}
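A note on the `& 0xFF` mask in `read()`: InputStream.read() must return the next byte as an int in the range 0..255, or -1 at end of stream. A Java byte is signed, so any byte value of 0x80 or above (every non-ASCII byte in a UTF-8 body, e.g. Chinese text) sign-extends to a negative int unless masked. A tiny illustration, with UnsignedReadDemo as a hypothetical class name:

```java
public class UnsignedReadDemo {
    public static void main(String[] args) {
        byte b = (byte) 0xC8;  // a typical byte of a UTF-8 encoded non-ASCII character
        int unmasked = b;      // sign-extended to -56: violates the read() contract
        int masked = b & 0xFF; // 200: the value read() is required to return
        System.out.println(unmasked + " " + masked); // prints "-56 200"
    }
}
```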
Use a filter to replace the request object with the wrapper
package com.haoke.api.interceptor;
import org.springframework.stereotype.Component;
import org.springframework.web.filter.OncePerRequestFilter;
import javax.servlet.FilterChain;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
/*
* Replace the Request object with the custom wrapper
* */
@Component
public class RequestReplaceFilter extends OncePerRequestFilter {
//Extending OncePerRequestFilter guarantees the filter runs only once per request
@Override
protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response, FilterChain filterChain) throws ServletException, IOException {
if (!(request instanceof MyServletRequestWrapper)) {
request = new MyServletRequestWrapper(request);
}
filterChain.doFilter(request, response);
}
}
As shown, the request has been replaced by the custom wrapper.
Entering GraphQLController, the request now contains the body data.
The cache cannot be populated in the interceptor, because the interceptor never sees the data the controller returns.
Instead, use Spring AOP: intercept the result before it is written out, supply the interception logic ourselves, and there we can take the result data and write it into the cache.
package com.haoke.api.interceptor;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.haoke.api.controller.GraphQLController;
import org.apache.commons.lang3.StringUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.MethodParameter;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.http.MediaType;
import org.springframework.http.server.ServerHttpRequest;
import org.springframework.http.server.ServerHttpResponse;
import org.springframework.http.server.ServletServerHttpRequest;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.servlet.mvc.method.annotation.ResponseBodyAdvice;
import java.time.Duration;
@ControllerAdvice
public class MyResponseBodyAdvice implements ResponseBodyAdvice {
@Autowired
private RedisTemplate<String, String> redisTemplate;
private ObjectMapper mapper = new ObjectMapper();
/*
* Choose which responses this advice cuts across
* */
@Override
public boolean supports(MethodParameter returnType, Class converterType) {
/*
* Cut across every query:
* GET requests, and GraphQL POSTs
* */
if(returnType.hasMethodAnnotation(GetMapping.class)){
return true;
}
if(returnType.hasMethodAnnotation(PostMapping.class) &&
StringUtils.equals(GraphQLController.class.getName(),returnType.getExecutable().getDeclaringClass().getName())){
return true;
}
return false;
}
/*
* The cross-cutting advice: runs just before the response body is written
* */
@Override
public Object beforeBodyWrite(Object body,
MethodParameter returnType,
MediaType selectedContentType,
Class selectedConverterType,
ServerHttpRequest request, ServerHttpResponse response) {
try {
String redisKey = RedisInterceptor.createRedisKey(((ServletServerHttpRequest) request).getServletRequest());
String redisValue;
if(body instanceof String){
redisValue = (String) body;
}else{
redisValue = mapper.writeValueAsString(body);
}
this.redisTemplate.opsForValue().set(redisKey,redisValue,Duration.ofHours(1));
}catch (Exception e){
e.printStackTrace();//log only: a cache failure must not break the response
}
return body;
}
}
Querying the same data a second time now hits the cache.
Check the server-side cluster.
The lecture covers the following case, but all of my mock requests are GETs, so I never hit it myself; it is recorded here for reference.
package com.haoke.api.interceptor;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.commons.codec.digest.DigestUtils;
import org.apache.commons.io.IOUtils;
import org.apache.commons.lang3.StringUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.stereotype.Component;
import org.springframework.web.servlet.HandlerInterceptor;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import java.io.IOException;
import java.util.Map;
@Component
public class RedisInterceptor implements HandlerInterceptor {
@Autowired
private RedisTemplate<String,String> redisTemplate;
private static ObjectMapper mapper = new ObjectMapper();
@Override
public boolean preHandle(HttpServletRequest request, HttpServletResponse response, Object handler) throws Exception {
/*
* The lecture hit an error here with mock requests; its fix was to let every mock request through.
* My problem instead was that mock responses were being written into the Redis cache, so GET requests that carry mock must be let through as well.
* */
if(StringUtils.equalsIgnoreCase(request.getMethod(), "OPTIONS")){
return true;
}
if(request.getRequestURI().startsWith("/mock")){
return true;
}
//Check the request method: PUT and DELETE responses are never cached, so let those requests through
if(!StringUtils.equalsIgnoreCase(request.getMethod(),"GET")){
//A POST to /graphql may be a GraphQL query and must still be intercepted; any other non-GET request is let through
if(!StringUtils.equalsIgnoreCase(request.getRequestURI(),"/graphql"))
return true;
}
//Try for a cache hit: look up the redisKey
String data = this.redisTemplate.opsForValue().get(createRedisKey(request));
if(StringUtils.isEmpty(data)){
//Cache miss: let the request through
return true;
}
//Cache hit: write the cached data straight back as the response
response.setCharacterEncoding("UTF-8");
response.setContentType("application/json; charset=utf-8");
// CORS support (note: browsers reject credentialed requests combined with a wildcard origin)
response.setHeader("Access-Control-Allow-Origin", "*");
response.setHeader("Access-Control-Allow-Methods", "GET,POST,PUT,DELETE,OPTIONS");
response.setHeader("Access-Control-Allow-Credentials", "true");
response.setHeader("Access-Control-Allow-Headers", "Content-Type,X-Token");
response.getWriter().write(data);
return false;
}
public static String createRedisKey(HttpServletRequest request) throws IOException {
String paramStr = request.getRequestURI();
Map<String,String[]> parameterMap = request.getParameterMap();
if(parameterMap.isEmpty()){
//No URI parameters: this is a GraphQL POST, so read the query from the request body
paramStr += IOUtils.toString(request.getInputStream(),"UTF-8");
}else{
paramStr += mapper.writeValueAsString(request.getParameterMap());
}
String redisKey = "WEB_DATA_" + DigestUtils.md5Hex(paramStr);
return redisKey;
}
}