[nycphp-talk] DB for large datasets
Tim Sailer
sailer at bnl.gov
Thu Aug 12 10:30:01 EDT 2004
On Thu, Aug 12, 2004 at 10:14:50AM -0400, Tim Gales wrote:
> Tim Sailer writes:
>
> "...I'm getting the notorious error 127 from the table
> > handler, and bad corruption if I am foolish enough to try to
> > delete from the tables."
>
> Have you considered preprocessing/aggregating the
> information for the most popular queries?
Yup, it's not feasible.
> What I am suggesting is creating some redundant
> and somewhat out of date (perhaps good up to the day before)
> tables which sum up data for the current month.
Some of what I'm storing is data from one of our network sensors that
shows every connection, src and dst IPs. For each month, I'm building
a table that has a unique record for every pair, no matter how many
times the connection was made. When querying for an IP, whether src or dst,
there's just no way of preprocessing that. The combined volume of traffic
for each pair is also stored. It's adding up to quite a few records...
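(For anyone following along, the scheme described above can be sketched
roughly like this. The field names and sample records are hypothetical,
and this is just a sketch of the aggregation, not the actual table
definitions or sensor format:)

```python
from collections import defaultdict

# Hypothetical per-connection records from the sensor: (src_ip, dst_ip, bytes).
events = [
    ("10.0.0.1", "192.168.1.5", 1200),
    ("10.0.0.1", "192.168.1.5", 800),   # same pair connected again
    ("10.0.0.2", "192.168.1.5", 300),
]

# Monthly table: one record per unique (src, dst) pair, with combined
# traffic volume, no matter how many times the connection was made.
monthly = defaultdict(int)
for src, dst, nbytes in events:
    monthly[(src, dst)] += nbytes

# Querying for an IP has to match either side of the pair, which is why
# there's no obvious further preprocessing by a single key.
def rows_for_ip(table, ip):
    return {pair: vol for pair, vol in table.items() if ip in pair}
```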
Tim
--
Tim Sailer <sailer at bnl.gov>
Information and Special Technologies Program
Office of CounterIntelligence
Brookhaven National Laboratory (631) 344-3001