[nycphp-talk] DB for large datasets
Tim Sailer
sailer at bnl.gov
Thu Aug 12 09:38:18 EDT 2004
I'm developing an internal application that takes information about
network traffic, and stores it in tables, currently MySQL, for each
month. A merge table gets queried instead of having to look through
each table. Now, the problem is, we're looking at more than 30M records/mo
and MySQL is just barfing. I'm getting the notorious error 127 from the
table handler, and bad corruption if I am foolish enough to try to delete
from the tables. The backend feeding the database is Perl, and the frontend,
of course, is PHP. My only alternative at this point is to go to another
'robust' database like Postgresql or Oracle. My inclination is Postgresql.
Not having any experience lately with Postgresql, I'm turning to the collective
brainpower of this group. Porting the code from MySQL to Postgresql seems
straightforward. Does Postgresql have something like Merge Tables? What
about performance? What does anyone think the performance loss/gain will be
from the move? It's on a 2.8GHz P4 with 2GB of RAM.
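
In case it helps frame the question, here's a rough, untested sketch of what
I imagine the Postgresql side could look like: a UNION ALL view standing in
for the merge table, queried from PHP with the pg_* functions. The table and
column names (traffic_2004_07, src_ip, etc.) are made up for illustration.

<?php
// Untested sketch -- table/column names are invented for illustration.
// The idea: in Postgresql, a UNION ALL view over the monthly tables
// (or table inheritance) plays roughly the role of a MySQL MERGE table.

$db = pg_connect('host=localhost dbname=netflow user=traffic')
    or die('could not connect');

// One view spanning the per-month tables, so the frontend keeps
// querying a single name, just as it does with the merge table now.
pg_query($db, "
    CREATE OR REPLACE VIEW traffic_all AS
        SELECT * FROM traffic_2004_07
        UNION ALL
        SELECT * FROM traffic_2004_08
");

// The PHP side barely changes.
$res = pg_query($db, "SELECT src_ip, dst_ip, bytes
                        FROM traffic_all
                       WHERE stamp >= '2004-08-01'");
while ($row = pg_fetch_assoc($res)) {
    // render as before
}
?>

If inheritance or something else is the more idiomatic way to get
merge-table behavior in Postgresql, that's exactly the kind of advice
I'm after.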
Thanks,
Tim
--
Tim Sailer <sailer at bnl.gov>
Information and Special Technologies Program
Office of CounterIntelligence
Brookhaven National Laboratory (631) 344-3001