I have no idea about the difference in performance while
recording, but I don't see why you'd have to go through the
file 24 times -- you should be able to write a "separation"
program to post-process the single capture file into 12-24
individual files in one pass. I mean, even a few GB doesn't
take up much room on a 100 GB HD, which I've seen offered
for under $300 a few days ago.
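Just to sketch what I mean by a "separation" program: assuming each sample in the capture file is a fixed-size record tagged with a 16-bit channel number (that record layout, the 512-byte payload size, and the sourceNN.dat output names are all guesses -- adjust them to whatever the collection software actually writes), a single pass over the file with one open output stream per source would look something like this:

```c
#include <stdio.h>
#include <stdint.h>

#define NCHAN 24          /* maximum number of sources */
#define RECORD_BYTES 512  /* payload size per record (assumed) */

/* Read the interleaved capture file once, appending each record's
 * payload to the output file for its channel.  Returns 0 on
 * success, -1 on any I/O or format error. */
int demux(const char *path)
{
    FILE *in = fopen(path, "rb");
    if (!in) { perror("fopen"); return -1; }

    FILE *out[NCHAN] = { 0 };   /* one output stream per source */
    unsigned char payload[RECORD_BYTES];
    uint16_t chan;

    /* Single pass: read the channel tag, then the payload. */
    while (fread(&chan, sizeof chan, 1, in) == 1) {
        if (chan >= NCHAN ||
            fread(payload, 1, RECORD_BYTES, in) != RECORD_BYTES) {
            fclose(in);
            return -1;          /* bad tag or truncated record */
        }
        if (!out[chan]) {       /* open the output file lazily */
            char name[32];
            snprintf(name, sizeof name, "source%02u.dat",
                     (unsigned)chan);
            out[chan] = fopen(name, "wb");
            if (!out[chan]) { fclose(in); return -1; }
        }
        fwrite(payload, 1, RECORD_BYTES, out[chan]);
    }

    for (int i = 0; i < NCHAN; i++)
        if (out[i])
            fclose(out[i]);
    fclose(in);
    return 0;
}
```

Since stdio buffers each stream, the 24 output files get written in reasonably large chunks even though the records arrive interleaved, so the seeking shouldn't be too bad.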
Norm
----- Original Message -----
From: Loren Frank <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Wednesday, October 24, 2001 11:07 PM
Subject: [rtl] another disk write performance question
> Hi all. First off, thanks to those who responded to my previous
> post. Turning on IRQ and DMA for the hard drive solves the problem.
>
> Now I have yet another slightly off-topic disk question. I'm
> collecting data from somewhere between 12 and 24 sources, and in
> the data file produced by the collection software, the sources are
> all interleaved. I need to create separate files, one for each
> source, and I'm wondering if anyone has an idea of how much of a
> performance hit there would be for writing them all out
> simultaneously rather than one at a time. The original data file
> could be as large as a few GBytes, and I don't relish the idea of
> going through it 24 times, once for each source.
>
> Any thoughts?
>
> Thanks,
>
> Loren
> --
> Loren Frank
>
> Postdoctoral Fellow
> Neuroscience Statistics Research Laboratory
> Harvard / M.I.T. Division of Health Sciences and Technology and
> Department of Anesthesia and Critical Care, Massachusetts General
> Hospital
>
-- [rtl] ---
To unsubscribe:
echo "unsubscribe rtl" | mail [EMAIL PROTECTED] OR
echo "unsubscribe rtl <Your_email>" | mail [EMAIL PROTECTED]
--
For more information on Real-Time Linux see:
http://www.rtlinux.org/