On Sun, Jan 4, 2009 at 12:36 PM, Amos Shapira <amos.shap...@gmail.com> wrote:

> 2009/1/3 Amos Shapira <amos.shap...@gmail.com>:
> > 2009/1/3 sara fink <sara.f...@gmail.com>:
> >> Recently I searched for diskimage programs and came across this program
> >> http://www.dubaron.com/diskimage/ but I haven't tried it.
> >> That program is for windows.
> >
> > I have access only to linux for that matter.
> >
> >> There are two other programs for Linux which you might consider.
> >> Clonezilla (I wasn't successful with that, due to other reasons, but
> >> it might work for you), and the second option, which I prefer, is g4u.
> >> With g4u you rescue to an ftp server: you upload the files there.
> >> This might help. I will try g4u later today and tell you. I use g4u
> >> to backup ntfs filesystems.
> >
> > My problem is with (what appears to be) corrupt media. Clonezilla
> > and g4u are more like backup programs.
> >
> > That's why I think ddrescue is the right tool - it just keeps trying
> > to read each and every block on the device and skips over blocks which
> > can't be read even after a few retries.
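For reference, this is roughly how such a run can look with GNU ddrescue (a sketch, not from the thread - the device name, image path and map file name here are assumptions):

```shell
# First pass: grab everything that reads cleanly, skip retries for now,
# and record the bad areas in a map file so later runs can resume.
ddrescue -n /dev/sdb disk.img rescue.map

# Second pass: go back and retry only the bad areas, up to 3 times each.
ddrescue -r3 /dev/sdb disk.img rescue.map
```

The map file is what makes the retries cheap: on later runs ddrescue only revisits the blocks it couldn't read before.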
> >
> >>
> >> As for your questions:
> >> What makes you think that if you put another usb hd you will start
> >> from 137 and up?
> >
> > No, I don't think I need another USB hd, I think I need another
> > external IDE->USB enclosure with an IDE/ATA controller which can
> > support 200Gb, to read the last part of my defective disk.
>
> Now that I packed the USB box to give it back to the shop, I noticed
> that it claims to support drives up to 500Gb.
> It also comes with a (Windows?) driver mini-CD.


Was the driver installed? As far as I know, these mini CDs are usually for
old machines running Windows 95 or Windows ME.
Please check what is written on the mini-CD.

>
>
> Could it be that Windows screwed up some parameters on the disk itself
> to disable it from supporting >137Gb?


The 137gb is actually the 28-bit LBA addressing limit of the ATA interface,
not an ntfs limit. Windows XP didn't enable 48-bit LBA (needed for drives
bigger than 137gb) until service pack 1; service pack 2 has it as well.
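The number itself falls out of the arithmetic: 28 address bits give 2^28 sectors of 512 bytes each. A quick check in the shell:

```shell
# 28-bit LBA can address 2^28 sectors of 512 bytes each
echo $((2**28 * 512))            # -> 137438953472 bytes total
echo $((2**28 * 512 / 10**9))    # -> 137 (decimal gigabytes)
```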


> Is there something I can change (especially through the USB cable, as I
> don't have a working P-ATA chassis handy)?
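One thing that could be worth ruling out is a Host Protected Area clipping the visible capacity, since some BIOSes and vendor tools set one. A sketch (assumes hdparm is installed, root access, that /dev/sdb is the drive, and a USB bridge that passes ATA commands through - many cheap ones don't):

```shell
# Compare the drive's current visible sector count with its native
# maximum; if they differ, an HPA is clipping the capacity.
hdparm -N /dev/sdb
```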
>
> >
> >>
> >> To recover file names try to look here:
> >>
> http://forums.getdata.com/computer-data-recovery/90-why-cant-recover-filenames.html
> >>
> >> Maybe their program will help.
> >>
> >> as for identifying the file name in the gz file, might be tricky.
> >> Usually the file name appears in the first line (header). So in
> >> general cat of the
> >
> > I see that the gzip format indeed contains the original file name in
> > the header. I don't see a way to view this using the gzip/gunzip
> > tools. Maybe I'll have to resort to a Perl script for that; I was
> > hoping to be able to avoid that. :(
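For what it's worth, plain gzip may be able to show the stored name after all: if I'm reading its man page right, the -N/--name flag also applies to listing mode (-l). A sketch with made-up file names:

```shell
# gzip stores the original file name in the header when given a real file
echo hello > /tmp/storedname.txt
gzip -f /tmp/storedname.txt

# Rename the archive, then list with -N: the "uncompressed_name" column
# should come from the stored header name, not from the .gz file name
mv /tmp/storedname.txt.gz /tmp/renamed.gz
gzip -lN /tmp/renamed.gz
```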
>
> After reading this I thought (and later confirmed) that using gnu tar's
> "z" argument probably means that the gzip compression function inside
> tar sees an unnamed input stream and therefore won't record the file
> name in the header (fixing this might be a nice feature to add to gnu
> tar - "put filename.tar, i.e. the name less the .gz, or the .tgz name
> converted to .tar, in the file's 'Name' attribute"). The following
> little perl script confirmed this:
>
> #!/usr/bin/perl
>
> use strict;
> use warnings;
> #use Data::Dumper::Simple;
>
> use IO::Uncompress::Gunzip qw($GunzipError);
>
> for my $filename (@ARGV)
> {
>  my $z = IO::Uncompress::Gunzip->new($filename)
>    or die "$filename: $GunzipError";
>
>  my $hdr = $z->getHeaderInfo();
>  my $name = defined($hdr->{Name}) ? $hdr->{Name} : "(undefined)";
>  print "\"$name\"\n";
> #  print Dumper($hdr);
> }
>
> exit 0;
>
> Run it with:
>
> $ find . -name "*.gz" -print0 | xargs -0 ./read-gzip-header.pl
>
> All .gz files retrieved by photorec had an undefined "Name" attribute.
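The same thing can be seen without Perl, straight from the gzip header bytes: byte 3 is the FLG field, and bit 3 (FNAME, value 8) says whether an original-name field follows. A sketch with made-up file names:

```shell
# A .gz made from a named file sets the FNAME bit in the FLG byte...
echo data > /tmp/named.txt
gzip -f /tmp/named.txt
od -An -j3 -N1 -tu1 /tmp/named.txt.gz    # FLG with bit 8 set

# ...while gzip reading a pipe (as inside "tar z") has no name to store
echo data | gzip > /tmp/stream.gz
od -An -j3 -N1 -tu1 /tmp/stream.gz       # FLG with bit 8 clear
```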
>
> Cheers,
>
> --Amos
>
> =================================================================
> To unsubscribe, send mail to linux-il-requ...@cs.huji.ac.il with
> the word "unsubscribe" in the message body, e.g., run the command
> echo unsubscribe | mail linux-il-requ...@cs.huji.ac.il
>
>
