On Friday, 20 January 2017 at 08:19:57 UTC, Chris M. wrote:
I have no idea if this is an issue with D, or OpenSSL, or if I'm just doing something completely wrong. I'm writing a program that will either encrypt or decrypt a string using AES in ECB mode (for a school assignment) and it's giving me a very strange bug.

[...]

    auto encOut = new ubyte[inputStr.length];

    // Encrypt and convert to base64
    AES_set_encrypt_key(aesKey.ptr, aesKey.sizeof * 8, &encKey);
    AES_ecb_encrypt(inputStr.ptr, encOut.ptr, &encKey, AES_ENCRYPT);

Here's the problem. I tried running this without the if-else statements (i.e. encrypting and decrypting all in one run of the program, code below). If I leave in the base64 encoding and decoding, and use decB64 as the input to decrypt, it still doesn't work. However, if I decrypt with encOut directly, or assign encOut to decB64, it somehow works.

My guess:

AES is a block cipher, so the encrypted output is your input length rounded up to a multiple of the 16-byte block size; unless the input length is already a multiple of 16, the ciphertext is longer than the plaintext. You're not getting an out-of-bounds error during encryption because OpenSSL only has a pointer to write to, not a buffer length it can check. The memory behind your buffer is apparently committed, so OpenSSL happily writes the tail of the final block there. That's why decrypting with the same buffer works: AES reads past the end of the slice and finds those bytes again. The base64 round trip fails because encoding only sees the bytes inside the slice, so the tail of the final block is lost. (In case it's not clear, writing and reading past the end of a buffer is really bad.)
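
To make that concrete, here's a compilable sketch of the base64 side of the problem; the 10-byte input length is hypothetical, picked for illustration:

    import std.base64 : Base64;

    void main()
    {
        // Hypothetical 10-byte plaintext: AES_ecb_encrypt writes one
        // full 16-byte block starting at encOut.ptr, but the slice is
        // only 10 bytes long, so the last 6 ciphertext bytes land in
        // the memory past its end.
        auto encOut = new ubyte[10];

        // Base64 only sees the 10 bytes inside the slice, so the tail
        // of the block is silently dropped...
        auto b64 = Base64.encode(encOut);

        // ...and the decoded buffer holds a truncated block that can
        // no longer be decrypted. Decrypting encOut directly "works"
        // only because AES reads the stray bytes back from past the
        // end of the slice.
        auto decB64 = Base64.decode(b64);
        assert(decB64.length == 10); // 6 bytes short of a full block
    }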

I'd expect OpenSSL to have a helper function to calculate the required buffer size for a given input length; failing that, for raw ECB the required size is simply the input length rounded up to the next multiple of AES_BLOCK_SIZE (16 bytes). Use that to allocate the output buffer.
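
A sketch of what that could look like in D, assuming the Deimos OpenSSL bindings (the module name and the zero padding are my choices for illustration; real code would use a proper padding scheme such as PKCS#7):

    import deimos.openssl.aes;

    // Sketch: size the output for raw (unpadded) ECB and encrypt
    // block by block. AES_ecb_encrypt handles exactly one 16-byte
    // block per call, so longer input needs a loop.
    ubyte[] ecbEncrypt(const(ubyte)[] input, ref AES_KEY key)
    {
        // Round the length up to the next multiple of the block size.
        auto outLen = (input.length + AES_BLOCK_SIZE - 1)
                      / AES_BLOCK_SIZE * AES_BLOCK_SIZE;

        // Zero-pad a copy of the input; outside of a toy example,
        // use PKCS#7 so the padding can be removed unambiguously.
        auto padded = new ubyte[outLen];
        padded[0 .. input.length] = input[];

        auto output = new ubyte[outLen];
        for (size_t i = 0; i < outLen; i += AES_BLOCK_SIZE)
            AES_ecb_encrypt(padded.ptr + i, output.ptr + i,
                            &key, AES_ENCRYPT);
        return output;
    }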