Corinna Vinschen wrote:
On Nov 9 00:56, Linda Walsh wrote:
I was running some network bandwidth tests using "dd" (WinXP SP2/Cygwin).
I was timing copies of a 300MB file from local disk to a remote server.
The local computer has enough memory to hold the file in the memory
cache once it is loaded.
I ran through increasing power-of-two block sizes from 512 bytes up to
512MB, at which point dd could, in theory, read and write the whole
file in a single read and a single write.
Essentially (say "s:" is the network drive, mounted as /s):

    i=512
    while [ $i -le $((512*1024*1024)) ]; do    # 512 bytes up to 512MB
        dd bs=$i if=/tmp/input of=/s/Video/output.dat
        i=$((i*2))
    done
Unfortunately, the test /fails/ at or above 64MB with:
dd: writing '/s/Video/output.dat': Resource temporarily unavailable
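A single block at the failing size should be enough to show it; something
like this (count=1 is just my way of keeping it to one write, so treat it
as a sketch):

    # one 64MB block, 64MB being the smallest size that fails here
    dd bs=64M count=1 if=/tmp/input of=/s/Video/output.dat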
I tried this with a 120 MB and a 1.2 GB file, and dd works with all
block sizes up to the tested 512 MB for me.
Could this be a network issue with big block sizes, maybe?
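One way to tell would be to run the same sizes against a local target
first (same sweep, just a local output path; a sketch, assuming /tmp has
the space):

    # same block sizes written to a local file, to rule the network in or out
    for bs in 32M 64M 128M 256M 512M; do
        dd bs=$bs count=1 if=/tmp/input of=/tmp/output.dat
    done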
---
Perhaps... though I just tried it again with 32M, 64M, 128M, 256M, and
512M, using /dev/zero as input (roughly the loop below). That works up
through 256M. I'll have to look at this some more later... strange.
Thanks for the feedback.
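Roughly (count=1 is needed to bound each run when reading from /dev/zero,
so take the exact invocation as a sketch):

    for bs in 32M 64M 128M 256M 512M; do
        dd bs=$bs count=1 if=/dev/zero of=/s/Video/output.dat
    done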
linda