Suppose we want to insert data into two different data sources (DataSource1 and DataSource2) and use MyBatis's BatchExecutor to perform the inserts in batches. Below is an example:

First, configure the two data sources in application.yaml:

spring:
  datasource:
    # Data source 1
    datasource1:
      driver-class-name: com.mysql.cj.jdbc.Driver
      # DataSourceBuilder creates a HikariCP pool by default, which binds "jdbc-url" rather than "url"
      jdbc-url: jdbc:mysql://localhost:3306/test1
      username: root
      password: root
    # Data source 2
    datasource2:
      driver-class-name: com.mysql.cj.jdbc.Driver
      jdbc-url: jdbc:mysql://localhost:3306/test2
      username: root
      password: root

Next, define a DataSource, SqlSessionFactory, and SqlSessionTemplate for each data source:

@Configuration
@MapperScan(basePackages = "com.example.mapper1", sqlSessionTemplateRef = "sqlSessionTemplate1")
public class DataSource1Config {

    // Mark one data source as primary so auto-configured beans that expect a single DataSource can still resolve one
    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource.datasource1")
    public DataSource dataSource1() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    public SqlSessionFactory sqlSessionFactory1() throws Exception {
        SqlSessionFactoryBean sessionFactoryBean = new SqlSessionFactoryBean();
        sessionFactoryBean.setDataSource(dataSource1());
        // If you map SQL in XML files, also call sessionFactoryBean.setMapperLocations(...) here
        return sessionFactoryBean.getObject();
    }

    @Bean
    public SqlSessionTemplate sqlSessionTemplate1() throws Exception {
        return new SqlSessionTemplate(sqlSessionFactory1());
    }
}

@Configuration
@MapperScan(basePackages = "com.example.mapper2", sqlSessionTemplateRef = "sqlSessionTemplate2")
public class DataSource2Config {
    @Bean
    @ConfigurationProperties(prefix = "spring.datasource.datasource2")
    public DataSource dataSource2() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    public SqlSessionFactory sqlSessionFactory2() throws Exception {
        SqlSessionFactoryBean sessionFactoryBean = new SqlSessionFactoryBean();
        sessionFactoryBean.setDataSource(dataSource2());
        return sessionFactoryBean.getObject();
    }

    @Bean
    public SqlSessionTemplate sqlSessionTemplate2() throws Exception {
        return new SqlSessionTemplate(sqlSessionFactory2());
    }
}
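If you also want Spring-managed transactions per data source (not strictly required for the manual-commit batch approach shown below), you would typically register one DataSourceTransactionManager per DataSource as well. A minimal sketch; the bean names are chosen here purely for illustration:

@Configuration
public class TransactionManagerConfig {

    // One transaction manager per data source; bean names are illustrative
    @Bean
    public DataSourceTransactionManager transactionManager1(@Qualifier("dataSource1") DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean
    public DataSourceTransactionManager transactionManager2(@Qualifier("dataSource2") DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }
}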

Then define a Mapper interface for each data source. With the batch executor, each mapper call inserts a single record and MyBatis groups the resulting JDBC statements into batches, so the insert method takes one Data object at a time:

@Mapper
public interface Mapper1 {
    // Illustrative SQL; adjust the table and column names to your schema
    @Insert("INSERT INTO data (id, name) VALUES (#{id}, #{name})")
    void insertData(Data data);
}

@Mapper
public interface Mapper2 {
    // Illustrative SQL; adjust the table and column names to your schema
    @Insert("INSERT INTO data (id, name) VALUES (#{id}, #{name})")
    void insertData(Data data);
}
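The mappers and the service below refer to a Data entity that the original snippet does not define. A minimal sketch; the field names here are assumptions:

public class Data {
    private Long id;
    private String name;

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}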

Next, write a Service class that performs the inserts in batches and asynchronously, which is how millions of records can be loaded efficiently. Note that MyBatis's BatchExecutor is not a class you instantiate directly; it is selected by opening a SqlSession with ExecutorType.BATCH and then flushing and committing every fixed number of records:

@Service
public class DataInsertService {

    private static final int BATCH_SIZE = 1000;

    @Autowired
    @Qualifier("sqlSessionFactory1")
    private SqlSessionFactory sqlSessionFactory1;

    @Autowired
    @Qualifier("sqlSessionFactory2")
    private SqlSessionFactory sqlSessionFactory2;

    @Async
    public void insertData(List<Data> dataList) {
        // Route each record to a data source based on the parity of its id
        List<Data> list1 = new ArrayList<>();
        List<Data> list2 = new ArrayList<>();
        for (Data data : dataList) {
            if (data.getId() % 2 == 0) {
                list1.add(data);
            } else {
                list2.add(data);
            }
        }
        batchInsert(sqlSessionFactory1, Mapper1.class, Mapper1::insertData, list1);
        batchInsert(sqlSessionFactory2, Mapper2.class, Mapper2::insertData, list2);
    }

    // Opens a session backed by MyBatis's BatchExecutor and flushes/commits every BATCH_SIZE records
    private <M> void batchInsert(SqlSessionFactory factory, Class<M> mapperClass,
                                 BiConsumer<M, Data> insert, List<Data> dataList) {
        try (SqlSession session = factory.openSession(ExecutorType.BATCH, false)) {
            M mapper = session.getMapper(mapperClass);
            int count = 0;
            for (Data data : dataList) {
                insert.accept(mapper, data);
                if (++count % BATCH_SIZE == 0) {
                    session.flushStatements();
                    session.commit();
                }
            }
            session.flushStatements();
            session.commit();
        }
    }
}

In the code above, the @Async annotation makes insertData run asynchronously on Spring's task executor. Each call opens a SqlSession backed by the BatchExecutor (ExecutorType.BATCH), flushes and commits every 1,000 records, and routes each record to one of the two data sources based on the parity of its id.
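A single @Async call still processes its whole list on one worker thread. If you really need multi-threaded throughput for millions of rows, the caller can split the list into chunks and submit each chunk as its own asynchronous task. A minimal sketch; the dispatcher class and the chunk size are illustrative additions, not part of the original code:

@Service
public class DataInsertDispatcher {

    private static final int CHUNK_SIZE = 10_000; // illustrative value; tune for your workload

    @Autowired
    private DataInsertService dataInsertService;

    // Splits the full list into chunks and hands each chunk to the async service,
    // so Spring's task executor can process several chunks in parallel
    public void insertInParallel(List<Data> dataList) {
        for (int from = 0; from < dataList.size(); from += CHUNK_SIZE) {
            int to = Math.min(from + CHUNK_SIZE, dataList.size());
            // Copy the sub-list because the async method runs after this loop returns
            dataInsertService.insertData(new ArrayList<>(dataList.subList(from, to)));
        }
    }
}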

Finally, write a Controller class to exercise the service:

@RestController
public class DataInsertController {
    @Autowired
    private DataInsertService dataInsertService;

    @PostMapping("/insert")
    public String insertData(@RequestBody List<Data> dataList) {
        dataInsertService.insertData(dataList);
        return "success";
    }
}
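@Async only takes effect if asynchronous processing is enabled with @EnableAsync. A minimal sketch of the supporting configuration; the pool sizes are chosen here purely for illustration:

@Configuration
@EnableAsync
public class AsyncConfig {

    // Thread pool used by @Async methods; sizes are illustrative and should be tuned for your workload
    @Bean
    public Executor taskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(4);
        executor.setMaxPoolSize(8);
        executor.setQueueCapacity(100);
        executor.setThreadNamePrefix("insert-");
        executor.initialize();
        return executor;
    }
}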

With this, we have an end-to-end example covering multiple data sources, batch inserts, multi-threading, and asynchronous execution.
