Bulk-inserting data in PostgreSQL

Normally, inserting a single row into the database takes roughly tens of milliseconds, which is acceptable in most cases. But when you need to insert thousands, tens of thousands, or even more rows at once, the total time becomes significant. In that situation, a small change to the SQL statement lets you insert large amounts of data in batches.

SQL to insert a single row:

INSERT INTO w008_test_insert(id, is_removed, work_id, taskid, wfid, e2eid, create_by, create_date, update_by, update_date, classid, stdanswer, knowid, iscorrect, classname) VALUES 
(3743246232764939, 0, 'W008', NULL, NULL, NULL, NULL, '2019-03-11 11:18:31.99', NULL, '2019-03-11 11:18:31.99', 2543246232764928, 'stdanswer', 2543246232764929, 10, 'classname');

Result:

Inserting 1,000 rows one statement at a time took about 60 seconds in local testing (the per-statement connect/disconnect and round-trip overhead consumes a lot of time).

Rewritten as a batch insert, with multiple row tuples in a single VALUES clause:

INSERT INTO w008_test_insert(id, is_removed, work_id, taskid, wfid, e2eid, create_by, create_date, update_by, update_date, classid, stdanswer, knowid, iscorrect, classname) VALUES 
(3553246232764939, 0, 'W008', NULL, NULL, NULL, NULL, '2019-03-11 11:18:31.99', NULL, '2019-03-11 11:18:31.99', 2543246232764928, 'stdanswer', 2543246232764929, 10, 'classname'),
(3243246232764939, 0, 'W008', NULL, NULL, NULL, NULL, '2019-03-11 11:18:31.99', NULL, '2019-03-11 11:18:31.99', 2543246232764928, 'stdanswer', 2543246232764929, 10, 'classname'),
(3343246232764939, 0, 'W008', NULL, NULL, NULL, NULL, '2019-03-11 11:18:31.99', NULL, '2019-03-11 11:18:31.99', 2543246232764928, 'stdanswer', 2543246232764929, 10, 'classname'),
(3443246232764939, 0, 'W008', NULL, NULL, NULL, NULL, '2019-03-11 11:18:31.99', NULL, '2019-03-11 11:18:31.99', 2543246232764928, 'stdanswer', 2543246232764929, 10, 'classname'),
(3643246232764939, 0, 'W008', NULL, NULL, NULL, NULL, '2019-03-11 11:18:31.99', NULL, '2019-03-11 11:18:31.99', 2543246232764928, 'stdanswer', 2543246232764929, 10, 'classname'),
(3743246232764939, 0, 'W008', NULL, NULL, NULL, NULL, '2019-03-11 11:18:31.99', NULL, '2019-03-11 11:18:31.99', 2543246232764928, 'stdanswer', 2543246232764929, 10, 'classname');

In testing, inserting 1,000 rows this way took about 0.67 seconds, and inserting 4,000 rows took just over 2 seconds.
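The multi-row VALUES statement above can also be generated programmatically instead of written by hand. The sketch below is a hypothetical helper (not from the original post); it builds one INSERT with DB-API `%s` placeholders, which a driver such as psycopg2 could then execute with a flat parameter sequence, so all rows travel to the server in a single statement:

```python
def build_bulk_insert(table, columns, num_rows):
    """Build a single multi-row INSERT statement with DB-API placeholders:
    INSERT INTO t (a, b) VALUES (%s, %s), (%s, %s), ...;
    """
    if num_rows < 1:
        raise ValueError("num_rows must be >= 1")
    cols = ", ".join(columns)
    # One placeholder group per row, e.g. "(%s, %s)"
    row = "(" + ", ".join(["%s"] * len(columns)) + ")"
    values = ", ".join([row] * num_rows)
    return f"INSERT INTO {table} ({cols}) VALUES {values};"

# Example: a 3-row statement for a few of the columns used above
sql = build_bulk_insert("w008_test_insert", ["id", "work_id", "classname"], 3)
# The statement would then be run once, e.g. cursor.execute(sql, flat_params)
```

Note that very large batches can exceed statement-size or parameter-count limits, so in practice rows are usually chunked into batches of a few hundred to a few thousand per statement.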
