database question
Jan. 27th, 2006 04:05 pm
I'm creating a *huge* table in MySQL, at least 50 gig. I have a C program reading a file from elsewhere, picking out the info I want/need, then stuffing it into MySQL with lots of INSERT INTO table (ra, dec, mag, ...) VALUES (...), where ra and dec are indexed.
I noticed the amount of time it takes to do the inserts increases.
Does anyone have any advice on whether it is faster to create my table with indices and stuff away, or to create the table without indices,
slurp in the 50-75 gig of data, then ALTER TABLE ADD INDEX?
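For what it's worth, a sketch of the load-then-index approach. The column names ra, dec, and mag come from the post; the table name, types, and MyISAM engine (MySQL's default in 2006) are assumptions:

```sql
-- Create the table with no secondary indexes, so inserts just append rows
-- instead of updating index trees on every statement.
CREATE TABLE catalog (
    ra    DOUBLE NOT NULL,
    `dec` DOUBLE NOT NULL,   -- DEC is a reserved word in MySQL, hence the backticks
    mag   FLOAT
) ENGINE=MyISAM;

-- ... bulk-load the 50-75 gig of data here ...

-- Build both indexes afterward in a single pass over the table.
ALTER TABLE catalog ADD INDEX idx_ra (ra), ADD INDEX idx_dec (`dec`);
```

Batching many rows per statement (INSERT INTO ... VALUES (...), (...), (...)) also cuts per-statement overhead considerably compared to one row per INSERT.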
no subject
Date: 2006-01-27 09:23 pm (UTC)
All those DBs also have bulk copy (bcp) mechanisms designed for this case, which are way faster than row-by-row inserts. In fact, now that you mention it, the file you were parsing the other day sounds exactly like it was created by such a tool reading out of a DB. If MySQL has something like that, I'd try to see if you can use it. BCP is generally considered to be the fastest way to get data into a DB.
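MySQL's equivalent is LOAD DATA INFILE (also reachable via the mysqlimport wrapper). A minimal sketch, assuming a tab-delimited file and the hypothetical table/path names below:

```sql
-- Assumes /tmp/catalog.tsv holds tab-separated values, one row per line,
-- in column order ra, dec, mag. The server reads the file directly,
-- which avoids per-row statement parsing entirely.
LOAD DATA INFILE '/tmp/catalog.tsv'
INTO TABLE catalog
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(ra, `dec`, mag);
```

This would replace the C program's stream of INSERT statements; the C code could instead just write the parsed fields out as a TSV file for the load.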
no subject
Date: 2006-01-27 09:29 pm (UTC)
I don't mind it running all weekend.
I'll drop the table, and try again.
no subject
Date: 2006-01-27 10:29 pm (UTC)
no subject
Date: 2006-01-27 10:49 pm (UTC)
to load the data. If it's all loaded by Monday 9 am, I'll call it victory.