Total replies: 2
2017-04-27 15:53:42 +08:00
Replied to the topic created by wangyu8460958 in Python: How to import 1,000,000 rows of MySQL data in batches with Python
Below is the script I wrote myself. Can it be optimized any further?

#coding=utf-8

import MySQLdb

conn = MySQLdb.connect(host='localhost', port=3306, user='root', passwd='123456', db='google')
cur = conn.cursor()
conn.autocommit(1)

query_sql = "select id from google.Video where created_at < date_sub(now(), interval 1 year);"
insert_sql = "insert ignore into tmp.Video_2017bak (select * from google.Video where id = %s);"
delete_sql = "delete from google.Video where id = %s;"

# Fetch the ids of every row older than one year.
cur.execute(query_sql)
dataList = cur.fetchall()

# Copy the matching rows into the backup table, 100 ids per executemany call.
ids = []
for i, row in enumerate(dataList):
    ids.append(row)
    if (i + 1) % 100 == 0:
        cur.executemany(insert_sql, ids)
        ids = []
if ids:
    # Flush the last, partial batch.
    cur.executemany(insert_sql, ids)

# Delete the same rows from the source table, again 100 ids at a time.
ids = []
for i, row in enumerate(dataList):
    ids.append(row)
    if (i + 1) % 100 == 0:
        cur.executemany(delete_sql, ids)
        ids = []
if ids:
    cur.executemany(delete_sql, ids)

cur.close()
conn.close()
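
(One possible optimization, not from the original thread: instead of pulling every id into Python and issuing one statement per row, the archive-and-delete can be done in fixed-size chunks entirely inside MySQL. The table and column names below come from the script above; the chunk size, loop structure, and use of ORDER BY id / LIMIT are assumptions in this sketch.)

#coding=utf-8
# Sketch only: archive expired rows and delete them in chunks of 1000,
# so no ids ever travel to the Python side. Assumes google.Video.id is unique.
import MySQLdb

conn = MySQLdb.connect(host='localhost', port=3306, user='root', passwd='123456', db='google')
cur = conn.cursor()
conn.autocommit(1)

chunk = 1000
while True:
    # Copy up to `chunk` expired rows into the backup table in one statement;
    # insert ignore lets a re-run skip rows that were already archived.
    cur.execute(
        "insert ignore into tmp.Video_2017bak "
        "select * from google.Video "
        "where created_at < date_sub(now(), interval 1 year) "
        "order by id limit %s", (chunk,))
    # Remove the same rows from the source table, bounded by the same chunk size.
    cur.execute(
        "delete from google.Video "
        "where created_at < date_sub(now(), interval 1 year) "
        "order by id limit %s", (chunk,))
    if cur.rowcount == 0:
        # Nothing left to delete, so all expired rows have been processed.
        break

cur.close()
conn.close()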
2017-04-22 20:49:13 +08:00
Replied to the topic created by wangyu8460958 in Python: How to import 1,000,000 rows of MySQL data in batches with Python
@skydiver I want it to run automatically once a week, which is why I want to do it with a Python script.
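
(For the weekly schedule, a common approach is a cron entry that invokes the script; the paths below are hypothetical, not from the thread.)

# Hypothetical crontab entry: run the archive script every Monday at 03:00
0 3 * * 1 /usr/bin/python /opt/scripts/archive_video.py >> /var/log/archive_video.log 2>&1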