On Thursday, 15 September 2016 at 19:03:22 UTC, Darren wrote:
> This is the code I'm using:
> https://dfcode.wordpress.com/2016/09/15/texcodewip/
> (The code for the shaders is at the bottom)
>
> For comparison, this is the code I'm trying to make work:
> http://www.learnopengl.com/code_viewer.php?code=getting-started/textures

I don't have time to try and test this myself, but this looks like a potential problem:

```
auto textureFormat = surface.format.format;
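// problem: textureFormat holds an SDL_PIXELFORMAT_* value, not a GL_* format enum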
glTexImage2D(GL_TEXTURE_2D, 0, surface.format.BytesPerPixel,
             surface.w, surface.h, 0,
             textureFormat, GL_UNSIGNED_BYTE, surface.pixels);
```

You're passing an SDL enum value to OpenGL, which knows absolutely nothing about SDL enum values. Rather than passing surface.format.format directly to OpenGL, you first need to match it to the appropriate GL format enum and pass *that*, which is what I tried to explain in my previous post. This is easily done with a function containing a switch statement that covers the SDL_PIXELFORMAT_* values and returns the equivalent GL_* value.
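For what it's worth, here's a minimal sketch of that mapping, assuming Derelict-style bindings (the function name is mine, and the switch only covers the unambiguous 24-bit byte-array formats; the packed 32-bit SDL formats depend on endianness, so check the SDL docs before adding those cases):

```
import derelict.opengl3.gl3;
import derelict.sdl2.sdl;

// Maps an SDL_PIXELFORMAT_* value to the equivalent GL_* pixel format.
// Extend the switch with whatever formats your loader actually encounters.
GLenum toGLFormat(uint sdlFormat)
{
    switch (sdlFormat)
    {
        case SDL_PIXELFORMAT_RGB24: return GL_RGB;
        case SDL_PIXELFORMAT_BGR24: return GL_BGR;
        default:
            throw new Exception("Unsupported SDL pixel format");
    }
}
```

Then the upload becomes glTexImage2D(..., toGLFormat(surface.format.format), GL_UNSIGNED_BYTE, surface.pixels).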

At any rate, if you're using the same image as the example, you don't need the conversion in this specific case. You can safely ignore surface.format.format and simply pass GL_RGB, which is what the example uses. The SDL->GL conversion only matters for a texture loader that's intended to handle multiple pixel formats.
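Concretely, for the tutorial's 24-bit RGB image the call reduces to the same form the C++ example uses (note the internal format argument is GL_RGB as well, matching the example, rather than BytesPerPixel):

```
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, surface.w, surface.h, 0,
             GL_RGB, GL_UNSIGNED_BYTE, surface.pixels);
```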
