On Wed, 17 Apr 2013, Beith, Linda <lbe...@rwu.edu> wrote:

Hi folks,

I am new to the list and am hoping someone can provide some
suggestions for a situation we have at my University. We have
had a rather catastrophic loss of all data from one of our Fall
2012 courses on our Sakai open source learning management
server. To compound matters, we have a military student who had
an incomplete in that course and is on deadline to finish his
work and submit his grades or face being dropped from his
academic program.

Since our Sakai instance is hosted by a third-party vendor we
don't have direct access to the application at the server
level, so each month the vendor makes a backup copy of our full
database and encrypts/zips it using GnuPG so we can download
it.  We then decrypt it using the passcode they provide and we
can run stats against the resulting SQL file.

I had a backup file from early December 2012 that I had
downloaded but never opened. I sent the file back to our vendor
in hopes of being able to retrieve the course data

I do not understand.  If you usually just "use the passcode" to
decrypt the backup file, why did you treat the early December
2012 backup differently?  Why should the "vendor" handle this?
Why not you, in the usual way, using the passcode for that
backup file?
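
To spell out what I mean by "the usual way": assuming the passcode
is a symmetric passphrase, I imagine the monthly step is something
like the sketch below (using the file name from the error message
further down; correct me if your procedure differs):

  # my assumption of the usual monthly step: symmetric decryption
  # with the vendor-supplied passcode, then unzip
  gpg --output rwu.dbdump_Nov2012.sql.gz \
      --decrypt rwu.dbdump_Nov2012.sql.gz.gpg
  gunzip rwu.dbdump_Nov2012.sql.gz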

                                                   however when
they tried to unzip/decrypt it, they were not prompted for the
passcode and just got an error:

gpg: can't open 'rwu.dbdump_Nov2012.sql.gz.gpg'
gpg: decrypt_message failed: file open error

If I understand correctly:

1. There was an original file, call it A .

2. gzip was applied to A to get the file A.gz .

3. Then gpg was applied to A.gz to get A.gz.gpg .

I ask

What operation of gpg was applied to produce A.gz.gpg from A.gz ?

Is the "passcode" a PGP public key, or something else?

I have likely not understood how things work, but one natural
way, it seems to me, would be to have the course publish a PGP
public key, and anyone who wanted to send a file to the
teachers/administrators of the course could just use gpg to send
the file, not as cleartext, but encrypted with the public key of
the course.  In this situation, there would be no "passcode"
provided by the vendor, but rather just the course's own public
key, with the course carefully keeping the corresponding private
key private.
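
As a sketch only, with the key's user id invented for illustration,
that arrangement would look like:

  # once, on a machine the course controls
  # (course@rwu.edu is an invented user id)
  gpg --gen-key
  gpg --armor --export course@rwu.edu > course-pubkey.asc
  # hand course-pubkey.asc to the vendor

  # each month, at the vendor
  gpg --import course-pubkey.asc
  gpg --encrypt --recipient course@rwu.edu rwu.dbdump_Nov2012.sql.gz

  # at the course, only the carefully kept private key decrypts
  gpg --output rwu.dbdump_Nov2012.sql.gz \
      --decrypt rwu.dbdump_Nov2012.sql.gz.gpg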


We can't have them redo the backup because it is too late - the
files are no longer on their server. So the only source of the
work is locked in this zipped file. The zipped file is quite
large - over 1 GB so we know there is data there - we just
can't get to it.

The assumption is that something went wrong in the original
encryption of the file. Do you have any idea whether it is
possible to extract the data in this situation?

I appreciate any help or suggestions you can provide,
Linda

Perhaps gpg's encryption failed.  But more likely, it seems to me,
the big file was corrupted in transit.  I have heard, though it
always seemed almost unbelievable, that some http browsers
corrupt files unless the browser is specifically told not to.
It is also the case, and this I readily believe, that many email
stacks corrupt files sent through them.  I believe it readily
because I remember the early, and now traditional, misdesign of
parts of our email system.

I'd try getting hold of the file by transporting it from the
vendor to you with "rsync --rsh=ssh ...".  In my experience rsync
is reliable even for gigabyte files.
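
Concretely, something like the following would both check whether
your copy matches what the vendor has and, if need be, fetch a
fresh one.  The host and path are invented for illustration:

  # at the vendor's end: checksum the file as it sits on their disk
  sha256sum rwu.dbdump_Nov2012.sql.gz.gpg

  # at your end: fetch it with rsync over ssh, then compare checksums
  rsync --rsh=ssh \
      vendor.example.com:/backups/rwu.dbdump_Nov2012.sql.gz.gpg .
  sha256sum rwu.dbdump_Nov2012.sql.gz.gpg

  # quick sanity checks on what you actually received
  file rwu.dbdump_Nov2012.sql.gz.gpg
  gpg --list-packets rwu.dbdump_Nov2012.sql.gz.gpg

If the checksums agree and gpg still cannot read the file, then the
copy originally made by the vendor is itself damaged, which is a
different and harder problem.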

oo--JS.




Linda L. Beith, Ph.D.
Roger Williams University
Director, Instructional Design
One Old Ferry Road, Bristol RI
401-254-3134
Website: id.rwu.edu


