size problem of large file transfer over https

2009-03-26 Thread H.S.
Hello,

I have a large data tar file of 4.4 GB. I have made it available over
https to be downloaded by the recipient. This is on a Debian Sid, 2.6.26
kernel and the partition is ext3.

When the remote user clicks on that download link, his browser is
showing the file size to be only around 130 MB. The client is a Windows
XP machine where the drive is NTFS formatted.

I am not well versed with the Apache server. Has it somehow hit a limit
set in the https server? If not, what gives?

Thanks.

-- 

Please reply to this list only. I read this list on its corresponding
newsgroup on gmane.org. Replies sent to my email address are just
filtered to a folder in my mailbox and get periodically deleted without
ever having been read.





Re: size problem of large file transfer over https

2009-03-26 Thread Boyd Stephen Smith Jr.
In gqg8ru$4j...@ger.gmane.org, H.S. wrote:
I have a large data tar file of 4.4 GB. I have made it available over
https to be downloaded by the recipient. This is on a Debian Sid, 2.6.26
kernel and the partition is ext3.

When the remote user clicks on that download link, his browser is
showing the file size to be only around 130 MB. The client is a Windows
XP machine where the drive is NTFS formatted.

I am not well versed with the Apache server. Has it somehow hit a limit
set in the https server? If not, what gives?

It's probably not a problem with the Apache server.  Instead, it is probably 
an issue with the Windows client.  The size reported by the Apache server is 
probably overflowing a 32-bit unsigned integer.  I'm pretty sure NTFS 
supports 4.4G files, but that doesn't mean that every client (or server) is 
prepared to see a size that large.

Could you post the results of an HTTP HEAD request for the file?  I'm 
particularly interested in what Apache is sending as the value of the 
Content-Length header.  If you don't have another tool in mind for this, 
wget should be able to show you these headers.
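
For example, something like this should print just the response headers
without downloading the body (a sketch; the URL is hypothetical, and curl
is an alternative if you have it installed):

$ wget -S --spider https://your.server/datafile.tar
$ curl -sI https://your.server/datafile.tar | grep -i content-length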

If the Content-Length is correct, it is a problem client-side.  If the 
Content-Length is incorrect, it is a problem server-side.
-- 
Boyd Stephen Smith Jr.            ,= ,-_-. =.
b...@iguanasuicide.net           ((_/)o o(\_))
ICQ: 514984 YM/AIM: DaTwinkDaddy  `-'(. .)`-'
http://iguanasuicide.net/             \_/





Re: size problem of large file transfer over https

2009-03-26 Thread H.S.
Boyd Stephen Smith Jr. wrote:
 In gqg8ru$4j...@ger.gmane.org, H.S. wrote:
 I have a large data tar file of 4.4 GB. I have made it available over
 https to be downloaded by the recipient. This is on a Debian Sid, 2.6.26
 kernel and the partition is ext3.

 When the remote user clicks on that download link, his browser is
 showing the file size to be only around 130 MB. The client is a Windows
 XP machine where the drive is NTFS formatted.

 I am not well versed with the Apache server. Has it somehow hit a limit
 set in the https server? If not, what gives?
 
 It's probably not a problem with the Apache server.  Instead, it is probably 
 an issue with the Windows client.  The size reported by the Apache server is 
 probably overflowing a 32-bit unsigned integer.  I'm pretty sure NTFS 
 supports 4.4G files, but that doesn't mean that every client (or server) is 
 prepared to see a size that large.
 
 Could you post the results of an HTTP HEAD request for the file?  I'm 
 particularly interested in what Apache is sending as the value of the 
 Content-Length header.  If you don't have another tool in mind for this, 
 wget should be able to show you these headers.
 
 If the Content-Length is correct, it is a problem client-side.  If the 
 Content-Length is incorrect, it is a problem server-side.

I gave it a shot myself with my own machine using Iceape browser and I
also see the size as around 132 MB. I am trying from a Debian Testing
machine (ext3 partition).

Here is the request I see if I try using wget:
.
.
.
HTTP request sent, awaiting response... 200 OK
Length: 138256384 (132M) [application/x-tar]
Saving to: `datafile.tar'

18% [==   ] 25,657,344  11.1M/s  ^C
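
As a sanity check on the 32-bit overflow theory: if the tar file's true
size were, say, 4433223680 bytes (about 4.4 GB; the exact figure here is
an assumption for illustration), truncating it to 32 bits gives exactly
the length wget reports:

$ echo $(( 4433223680 % 2**32 ))
138256384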


So something is messed up with the server. How do I go about checking what?

Thanks.





Re: size problem of large file transfer over https

2009-03-26 Thread Boyd Stephen Smith Jr.
In gqgau4$e8...@ger.gmane.org, H.S. wrote:
Boyd Stephen Smith Jr. wrote:
 In gqg8ru$4j...@ger.gmane.org, H.S. wrote:
 I have a large data tar file of 4.4 GB. I have made it available over
 https to be downloaded by the recipient. This is on a Debian Sid,
 2.6.26 kernel and the partition is ext3.

 When the remote user clicks on that download link, his browser is
 showing the file size to be only around 130 MB. The client is a Windows
 XP machine where the drive is NTFS formatted.
 It's probably not a problem with the Apache server.  Instead, it is
 probably an issue with the Windows client.  The size reported by the
 Apache server is probably overflowing a 32-bit unsigned integer.  I'm
 pretty sure NTFS supports 4.4G files, but that doesn't mean that
 every client (or server) is prepared to see a size that large.

 If the Content-Length is correct, it is a problem client-side.  If the
 Content-Length is incorrect, it is a problem server-side.
I gave it a shot myself with my own machine using Iceape browser and I
also see the size as around 132 MB. I am trying from a Debian Testing
machine (ext3 partition).

Here is the request I see if I try using wget:
.
HTTP request sent, awaiting response... 200 OK
Length: 138256384 (132M) [application/x-tar]

So something is messed up with the server. How do I go about checking what?

I am also not an expert in Apache, so this is about as far as I can take 
you.  You should probably scan the Apache documentation for options 
pertaining to big, large, or huge files.  (Or perhaps they are more 
technical and use the phrase 32-bit or 4G.)  While you are doing that I'd 
hit an apache-specific list (and hope someone better versed replies here) 
with your query.

If you don't get any (more) traction on the mailing lists and are fairly 
sure your setup is correct, go ahead and file a bug (if one doesn't already 
exist).
-- 
Boyd Stephen Smith Jr.            ,= ,-_-. =.
b...@iguanasuicide.net           ((_/)o o(\_))
ICQ: 514984 YM/AIM: DaTwinkDaddy  `-'(. .)`-'
http://iguanasuicide.net/             \_/





Re: size problem of large file transfer over https

2009-03-26 Thread H.S.
Boyd Stephen Smith Jr. wrote:
 In gqgau4$e8...@ger.gmane.org, H.S. wrote:
 Boyd Stephen Smith Jr. wrote:

 If the Content-Length is correct, it is a problem client-side.  If the
 Content-Length is incorrect, it is a problem server-side.
 I gave it a shot myself with my own machine using Iceape browser and I
 also see the size as around 132 MB. I am trying from a Debian Testing
 machine (ext3 partition).

 Here is the request I see if I try using wget:
 .
 HTTP request sent, awaiting response... 200 OK
 Length: 138256384 (132M) [application/x-tar]

 So something is messed up with the server. How do I go about checking what?
 
 I am also not an expert in Apache, so this is about as far as I can take 
 you.  You should probably scan the Apache documentation for options 
 pertaining to big, large, or huge files.  (Or perhaps they are more 
 technical and use the phrase 32-bit or 4G.)  While you are doing that I'd 
 hit an apache-specific list (and hope someone better versed replies here) 
 with your query.
 
 If you don't get any (more) traction on the mailing lists and are fairly 
 sure your setup is correct, go ahead and file a bug (if one doesn't already 
 exist).


Right. Thanks for your help. I am going to check the Apache docs now.

Regards.




[work around] Re: size problem of large file transfer over https

2009-03-26 Thread H.S.
Boyd Stephen Smith Jr. wrote:
 In gqgau4$e8...@ger.gmane.org, H.S. wrote:
 Boyd Stephen Smith Jr. wrote:

 If the Content-Length is correct, it is a problem client-side.  If the
 Content-Length is incorrect, it is a problem server-side.
 I gave it a shot myself with my own machine using Iceape browser and I
 also see the size as around 132 MB. I am trying from a Debian Testing
 machine (ext3 partition).

 Here is the request I see if I try using wget:
 .
 HTTP request sent, awaiting response... 200 OK
 Length: 138256384 (132M) [application/x-tar]

 So something is messed up with the server. How do I go about checking what?
 
 I am also not an expert in Apache, so this is about as far as I can take 
 you.  You should probably scan the Apache documentation for options 
 pertaining to big, large, or huge files.  (Or perhaps they are more 
 technical and use the phrase 32-bit or 4G.)  While you are doing that I'd 
 hit an apache-specific list (and hope someone better versed replies here) 
 with your query.
 
 If you don't get any (more) traction on the mailing lists and are fairly 
 sure your setup is correct, go ahead and file a bug (if one doesn't already 
 exist).

Well, I split the tar file into smaller chunks of 680M each, using the
split command:
$ split -d -b 680M /tmp/datafile.tar  datfile

Now I am able to download the files successfully from the Apache server
machine using wget. Apache now reports the correct size: Length:
713031680 (680M).

The only problem is that the remote user uses Windows and will need a
way to join them back together into the tar file. In Linux, one would
just cat them together. In Windows, I searched Google and found that the
following works at a command prompt:
copy /b datfile* datafile.tar /b

where datfilenn are the split files. This was from:
http://elliottback.com/wp/combine-split-files-in-windows/
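
For reference, the Linux-side reassembly is a one-liner (assuming the
parts sit in the current directory and nothing else matches the glob):

$ cat datfile* > datafile.tar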

The tar file can then be opened using 7-Zip on Windows.






Re: [work around] Re: size problem of large file transfer over https

2009-03-26 Thread Rob Starling
On Thu, Mar 26, 2009 at 01:27:27PM -0400, H.S. wrote:
 Well, I split the tar file to smaller chunks of 680M each (using the
 split command).
 $ split -d -b 680M /tmp/datafile.tar  datfile
...
 The only problem is that the remote user uses Windows and will need a
 method to join them together back to the tar file. In Linux, one would
 just cat them together. In Windows, I have searched google and found the
 following will work on a command prompt:
 copy /b datfile* datafile.tar /b
 
 where datfilenn are the split files. This was from:
 http://elliottback.com/wp/combine-split-files-in-windows/

Just be sure that the * expands to the names of the files in the
right order!  If it's just a few, I'd play it safe and just list
'em in the right order.
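
e.g., something like this, with the part names spelled out (hypothetical
names; adjust to whatever suffixes split actually produced):

C:\> copy /b datfile00+datfile01+datfile02 datafile.tar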

--Rob*

-- 
/-\
| If we couldn't laugh we would all go insane   |
|  --Jimmy Buffett,   |
|Changes in Latitudes, Changes in Attitudes |
\-/





Re: [work around] Re: size problem of large file transfer over https

2009-03-26 Thread H.S.
Rob Starling wrote:
 On Thu, Mar 26, 2009 at 01:27:27PM -0400, H.S. wrote:
 Well, I split the tar file to smaller chunks of 680M each (using the
 split command).
 $ split -d -b 680M /tmp/datafile.tar  datfile
 ...
 
 just be sure that the * expands to the names of the files in the
 right order!  if it's just a few, i'd play it safe and just list
 'em in the right order.
 
 --Rob*
 

Of course. The -d option tells split to use numeric suffixes, so the
parts are numbered sequentially. There are only seven parts and they
are listed in the right order.
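
A quick way to eyeball the order (a sketch; this is the listing split -d
would give for seven parts, assuming default numbering from 00):

$ ls datfile*
datfile00  datfile01  datfile02  datfile03  datfile04  datfile05  datfile06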

Thanks.






Re: size problem of large file transfer over https

2009-03-26 Thread Alex Samad
On Thu, Mar 26, 2009 at 12:36:19PM -0400, H.S. wrote:
 Boyd Stephen Smith Jr. wrote:
  In gqg8ru$4j...@ger.gmane.org, H.S. wrote:
  I have a large data tar file of 4.4 GB. I have made it available over
  https to be downloaded by the recipient. This is on a Debian Sid, 2.6.26

[snip]

  If the Content-Length is correct, it is a problem client-side.  If the 
  Content-Length is incorrect, it is a problem server-side.
 
 I gave it a shot myself with my own machine using Iceape browser and I
 also see the size as around 132 MB. I am trying from a Debian Testing
 machine (ext3 partition).
 
 Here is the request I see if I try using wget:
 .
 .
 .
 HTTP request sent, awaiting response... 200 OK
 Length: 138256384 (132M) [application/x-tar]
 Saving to: `datafile.tar'
 
 18% [==   ] 25,657,344  11.1M/s  ^C

what if you point the browser at the directory instead of the file? the
directory listing should give you a file size as well
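
something like this would dump the auto-generated index page, size column
and all (hypothetical URL; assumes directory indexing is enabled):

$ wget -qO- https://your.server/path/to/dir/ | grep -i datafile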

 
 
 So something is messed up with the server. How do I go about checking what?
 
 Thanks.
 
 

-- 
Genius is one percent inspiration and ninety-nine percent perspiration.
-- Thomas Alva Edison




Re: size problem of large file transfer over https

2009-03-26 Thread H.S.
H.S. wrote:
 Boyd Stephen Smith Jr. wrote:
 In gqgau4$e8...@ger.gmane.org, H.S. wrote:
 Boyd Stephen Smith Jr. wrote:
 If the Content-Length is correct, it is a problem client-side.  If the
 Content-Length is incorrect, it is a problem server-side.
 I gave it a shot myself with my own machine using Iceape browser and I
 also see the size as around 132 MB. I am trying from a Debian Testing
 machine (ext3 partition).

 Here is the request I see if I try using wget:
 .
 HTTP request sent, awaiting response... 200 OK
 Length: 138256384 (132M) [application/x-tar]

 So something is messed up with the server. How do I go about checking what?
 I am also not an expert in Apache, so this is about as far as I can take 
 you.  You should probably scan the Apache documentation for options 
 pertaining to big, large, or huge files.  (Or perhaps they are more 
 technical and use the phrase 32-bit or 4G.)  While you are doing that I'd 
 hit an apache-specific list (and hope someone better versed replies here) 
 with your query.
 
 Right. Thanks for your help. I am going to check apache docs now.
 


Well, it looks like I know what the problem is now. The Apache server on
the machine in question is an older one:
apache  1.3.34-4.1

The problem appears with files larger than 2 GB.

Based on http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=156972, the bug
appears to be fixed in apache2. The only solution is to upgrade to apache2.
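
For anyone hitting the same thing, a rough upgrade path on Debian (a
sketch only; apache2 has a different configuration layout than 1.3, so
existing site configuration will need to be redone by hand):

$ dpkg -l | grep apache          # confirm the old apache 1.3 package
$ sudo apt-get install apache2   # apache2 handles files larger than 2 GB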

That clears up the problem I was having earlier in the day.

Thanks to everyone who participated in this discussion.
Regards.
