=2048 count=70
gunzip -cd backup_cd2.img.gz | dd of=/dev/sdb bs=2048 seek=70
Hope it helps,
Greetz,
Sebastiaan
On 8 Feb 2001, David Bishop wrote:
> I'm trying to make backups to CD, and of course, have 800-900 megs worth of
> data, compressed. What is the best way to split up large tar or cpio files,
> that will allow them to easily be put back together, booting off of a rescue
> floppy or the like? I don't need any scripts or dir