gsh: (Default)
[personal profile] gsh
I'm creating a *huge* table in MySQL, at least 50 gig. I have a C program reading a file from elsewhere, picking out the info I want/need, then stuffing it into MySQL with lots of INSERT INTO table (ra, dec, mag, ...) VALUES (...) statements, where ra and dec are indexed.

I've noticed that the inserts take longer and longer as the table grows.

Does anyone have any advice on whether it is faster to create my table with indexes and stuff away, or should I create my table without indexes,
slurp in the 50-75 gig of data, then ALTER TABLE ... ADD INDEX?
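For anyone comparing, here's a rough sketch of the two strategies in MySQL syntax. The table and column names (stars, ra, dec, mag) are stand-ins for the real schema, and DISABLE KEYS only affects non-unique indexes on MyISAM tables:

```sql
-- Option 1: keep the indexes defined but defer their maintenance
-- during the load (MyISAM only; the indexes are rebuilt in one
-- pass when ENABLE KEYS runs).
ALTER TABLE stars DISABLE KEYS;
-- ...run the inserts; batching many rows per statement also helps:
INSERT INTO stars (ra, dec, mag) VALUES
  (266.41683, -29.00781, 14.2),
  (83.82208,  -5.39111,   0.4);
ALTER TABLE stars ENABLE KEYS;

-- Option 2: load into an index-free table, then build both
-- indexes once at the end.
ALTER TABLE stars ADD INDEX idx_ra (ra), ADD INDEX idx_dec (dec);
```

Either way, each multi-row INSERT amortizes parsing and network round-trip cost over many rows, which alone can be a large win over one-row-at-a-time inserts.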

Date: 2006-01-27 10:29 pm (UTC)
From: [identity profile] dckermit.livejournal.com
My experience, also with other applications, is that it's far faster to have the extraction program create a flat file from the "elsewhere" data, then allow the database application to load that flat file in bulk. Sometimes days faster.
