Use the DynamicBatchInsertUtil helper class together with @DSTransactional.
The full code first, in case you just want to grab it:
@Component
public class DynamicBatchInsertUtil {

    private static String defaultDsKey;

    /**
     * Inject the configured primary datasource key into the static field.
     * @Value cannot populate a static field directly, hence this non-static setter.
     *
     * @param defaultDsKey the primary datasource key
     */
    @Value("${spring.datasource.dynamic.primary}")
    public void setDefaultDsKey(String defaultDsKey) {
        DynamicBatchInsertUtil.defaultDsKey = defaultDsKey;
    }

    /**
     * Batch insert, resolving the datasource from the mapper's @DS annotation.
     *
     * @param list        the rows to insert
     * @param mapperClass the mapper class that executes the insert
     * @return the number of rows inserted
     */
    public static int batchInsert(Object list, Class mapperClass) {
        DS ds = (DS) mapperClass.getAnnotation(DS.class);
        // No @DS on the mapper: fall back to the default datasource
        String dsKey = ds != null ? ds.value() : defaultDsKey;
        return batchInsert(dsKey, list, mapperClass);
    }

    /**
     * Batch insert on an explicit datasource, modeled on the @DS source code.
     *
     * @param dsKey       the datasource key
     * @param list        the rows to insert
     * @param mapperClass the mapper class that executes the insert
     * @return the number of rows inserted
     */
    public static int batchInsert(String dsKey, Object list, Class mapperClass) {
        DynamicDataSourceContextHolder.push(dsKey);
        try {
            return batchInsertExecute(list, mapperClass);
        } finally {
            DynamicDataSourceContextHolder.poll();
        }
    }

    public static int batchInsertExecute(Object list, Class mapperClass) {
        SqlSessionFactory sqlSessionFactory = SpringUtils.getBean("sqlSessionFactory");
        try (SqlSession sqlSession = sqlSessionFactory.openSession(ExecutorType.BATCH)) {
            Object mapper = sqlSession.getMapper(mapperClass);
            ((List) list).forEach(((MyBaseMapper) mapper)::insert);
            sqlSession.flushStatements();
            sqlSession.commit();
        }
        return ((List) list).size();
    }
}
1. Put @DS on the Mapper interface. Without it the default datasource is used, or you can pass the key by hand:
DynamicBatchInsertUtil.batchInsert("dsKey", list, Mapper.class);
2. @Value cannot be injected into a static field, so the class is annotated @Component and the value comes in through a non-static setter.
3. How it works is explained at the end.
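As a side note on point 1, the annotation-fallback logic can be exercised in isolation. The sketch below is self-contained: the @DS annotation and the two mapper interfaces are minimal stand-ins I made up for illustration (not the real com.baomidou classes), but resolveDsKey mirrors the fallback in batchInsert(Object, Class) above.

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Minimal stand-in for the real @DS annotation, runtime-retained so reflection sees it
@Retention(RetentionPolicy.RUNTIME)
@interface DS {
    String value();
}

// A mapper that names its datasource explicitly
@DS("slave_1")
interface AnnotatedMapper {}

// A mapper with no annotation, which should fall back to the default
interface PlainMapper {}

public class DsKeyLookup {
    // Same decision as DynamicBatchInsertUtil: prefer the mapper's @DS value,
    // otherwise use the configured primary key
    static String resolveDsKey(Class<?> mapperClass, String defaultDsKey) {
        DS ds = mapperClass.getAnnotation(DS.class);
        return ds != null ? ds.value() : defaultDsKey;
    }

    public static void main(String[] args) {
        System.out.println(resolveDsKey(AnnotatedMapper.class, "master")); // slave_1
        System.out.println(resolveDsKey(PlainMapper.class, "master"));     // master
    }
}
```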
Batch-inserting into two databases:
@DSTransactional
public void testBatchInsert() {
    List list1 = Arrays.asList(InfOrdersmesIn.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().id(SnowflakeIdWorker.getSnowflakeId()).build());
    List list2 = Arrays.asList(EpmMoldMain.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , EpmMoldMain.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , EpmMoldMain.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , EpmMoldMain.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , EpmMoldMain.builder().id(SnowflakeIdWorker.getSnowflakeId()).build()
            , EpmMoldMain.builder().id(SnowflakeIdWorker.getSnowflakeId()).build());
    // insert into database 1
    DynamicBatchInsertUtil.batchInsert(list1, InfOrdersmesInMapper.class);
    // insert into database 2
    DynamicBatchInsertUtil.batchInsert(list2, EpmMoldMainMapper.class);
}
Testing the transaction by forcing a primary-key conflict (the same EpmMoldMain instance, and therefore the same id, is inserted six times):
@DSTransactional
public void testBatchInsertErr() {
    List list1 = Arrays.asList(InfOrdersmesIn.builder().zbguid(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().zbguid(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().zbguid(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().zbguid(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().zbguid(SnowflakeIdWorker.getSnowflakeId()).build()
            , InfOrdersmesIn.builder().zbguid(SnowflakeIdWorker.getSnowflakeId()).build());
    EpmMoldMain epmMoldMain = EpmMoldMain.builder().id(SnowflakeIdWorker.getSnowflakeId()).build();
    List list2 = Arrays.asList(epmMoldMain, epmMoldMain, epmMoldMain, epmMoldMain, epmMoldMain, epmMoldMain);
    DynamicBatchInsertUtil.batchInsert(list1, InfOrdersmesInMapper.class);
    DynamicBatchInsertUtil.batchInsert(list2, EpmMoldMainMapper.class);
}
If you only insert into one database, you can simply put @DS on the method. When the same method inserts into multiple databases, though, @DS is not enough; judging from the interceptor source below, the annotation resolves a single datasource key per method invocation, so the whole method runs against one datasource. (I haven't verified the exact cause, so if you know more, please tell me, thanks!)
Let's look at how @DS works. It is implemented through method interception; here is the relevant source of DynamicDataSourceAnnotationInterceptor:
@Override
public Object invoke(MethodInvocation invocation) throws Throwable {
    String dsKey = determineDatasourceKey(invocation);
    DynamicDataSourceContextHolder.push(dsKey);
    try {
        return invocation.proceed();
    } finally {
        DynamicDataSourceContextHolder.poll();
    }
}
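To make the push/poll pair above concrete, here is a self-contained model of what DynamicDataSourceContextHolder does. The real class lives in the dynamic-datasource library and is more involved; this sketch only reproduces the thread-local stack behavior the interceptor relies on, so nested datasource switches restore the outer key when they finish.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Simplified model of DynamicDataSourceContextHolder: a per-thread stack of
// datasource keys. push() activates a key, poll() restores the previous one.
public class ContextHolderModel {
    private static final ThreadLocal<Deque<String>> KEYS =
            ThreadLocal.withInitial(ArrayDeque::new);

    public static String push(String dsKey) {
        KEYS.get().push(dsKey);
        return dsKey;
    }

    // The key the routing datasource would currently resolve to
    public static String peek() {
        return KEYS.get().peek();
    }

    public static void poll() {
        Deque<String> stack = KEYS.get();
        stack.poll();
        if (stack.isEmpty()) {
            KEYS.remove(); // avoid leaking the ThreadLocal on pooled threads
        }
    }

    public static void main(String[] args) {
        push("db1");
        push("db2");                // nested switch
        System.out.println(peek()); // db2
        poll();
        System.out.println(peek()); // back to db1
        poll();
    }
}
```

The stack is what makes it safe for our batchInsert to push/poll while already running inside a method annotated with @DS: the inner poll restores the outer method's key instead of clearing it.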
So we only need to copy that pattern and adapt it, which gives the two methods below (the full class is at the top):
public static int batchInsert(String dsKey, Object list, Class mapperClass) {
    DynamicDataSourceContextHolder.push(dsKey);
    try {
        return batchInsertExecute(list, mapperClass);
    } finally {
        DynamicDataSourceContextHolder.poll();
    }
}

public static int batchInsertExecute(Object list, Class mapperClass) {
    SqlSessionFactory sqlSessionFactory = SpringUtils.getBean("sqlSessionFactory");
    try (SqlSession sqlSession = sqlSessionFactory.openSession(ExecutorType.BATCH)) {
        Object mapper = sqlSession.getMapper(mapperClass);
        ((List) list).forEach(((MyBaseMapper) mapper)::insert);
        sqlSession.flushStatements();
        sqlSession.commit();
    }
    return ((List) list).size();
}
There are other approaches as well: I've seen a Supplier-based implementation mentioned online, plus other ways I haven't tried. If you've gotten one working and can share it in the comments, that would be great!
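For reference, a Supplier-based variant could look like the sketch below. This is my own guess at the shape of that approach, not code I've run against dynamic-datasource: the thread-local holder here is a local stand-in for DynamicDataSourceContextHolder so the sketch runs on its own, and in real code the Supplier body would call something like batchInsertExecute.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Supplier;

public class DsSupplierUtil {
    // Local stand-in for DynamicDataSourceContextHolder (same push/poll stack idea)
    private static final ThreadLocal<Deque<String>> KEYS =
            ThreadLocal.withInitial(ArrayDeque::new);

    // Run any action on the given datasource key; push/poll always bracket it,
    // so the caller doesn't have to remember the try/finally
    public static <T> T runWith(String dsKey, Supplier<T> action) {
        KEYS.get().push(dsKey);
        try {
            return action.get();
        } finally {
            KEYS.get().poll();
        }
    }

    static String currentKey() {
        return KEYS.get().peek();
    }

    public static void main(String[] args) {
        int rows = runWith("db1", () -> {
            // in real code: batchInsertExecute(list, mapperClass)
            System.out.println("active key: " + currentKey()); // active key: db1
            return 6;
        });
        System.out.println(rows); // 6
    }
}
```

The upside of this shape is that any operation, not just batch inserts, can be switched onto a datasource with one call, e.g. DsSupplierUtil.runWith("db2", () -> mapper.selectById(id)).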