Re: Using OpenGL
On Tuesday, 4 October 2016 at 16:09:34 UTC, Darren wrote: Back again with another little problem that isn't specifically OpenGL related, but is a result of getting such code to work. I actually figured it out; it was my own mistake.
Re: Using OpenGL
Back again with another little problem that isn't specifically OpenGL related, but is a result of getting such code to work.

Code I'm working on: https://dfcode.wordpress.com/2016/10/04/linker-problem/

What I'm learning from: http://www.learnopengl.com/#!Getting-started/Camera and http://www.learnopengl.com/code_viewer.php?type=header&file=camera

The problem is that I'm trying to move the camera code into a module and import it, but when I try to build I get the following error messages:

```
source\app.d(256,30): Error: function 'fpscamera.fpsCamera.processMouseMovement' is not nothrow
source\app.d(243,7): Error: function 'app.mouse_callback' is nothrow yet may throw
```

If I add nothrow to processMouseMovement, like I did for some other functions, I get the following:

```
.dub\build\application-debug-windows-x86-dmd_2071-3EF635850CA47CC4E927BFC9336E0233\ogl.obj(ogl) Error 42: Symbol Undefined _D9fpscamera9fpsCamera13getViewMatrixMxFNdZS4gl3n6linalg21__T6MatrixTfVii4Vii4Z6Matrix
.dub\build\application-debug-windows-x86-dmd_2071-3EF635850CA47CC4E927BFC9336E0233\ogl.obj(ogl) Error 42: Symbol Undefined _D9fpscamera9fpsCamera20processMouseMovementMFNbffhZv
.dub\build\application-debug-windows-x86-dmd_2071-3EF635850CA47CC4E927BFC9336E0233\ogl.obj(ogl) Error 42: Symbol Undefined _D9fpscamera9fpsCamera18processMouseScrollMFNbfZv
.dub\build\application-debug-windows-x86-dmd_2071-3EF635850CA47CC4E927BFC9336E0233\ogl.obj(ogl) Error 42: Symbol Undefined _D9fpscamera9fpsCamera15processKeyboardMFE9fpscamera14CameraMovementfZv
.dub\build\application-debug-windows-x86-dmd_2071-3EF635850CA47CC4E927BFC9336E0233\ogl.obj(ogl) Error 42: Symbol Undefined _D9fpscamera12__ModuleInfoZ
--- errorlevel 5
```

It seems to happen when I make both processMouseMovement and processMouseScroll nothrow to get rid of the compiler errors. If only one or the other is nothrow, I get the nothrow message for that function instead.
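For anyone hitting the same wall later: a nothrow function can still call throwing code as long as the exception is caught inside it, which is usually the easiest way to satisfy a GLFW callback's signature. A minimal sketch (the function name here mirrors the thread, not the real fpscamera module):

```d
import std.format : format;

// Sketch: nothrow means no Exception may escape, but throwing code
// can still be called inside a try/catch.
nothrow void processMouseMovement(float xoffset, float yoffset)
{
    try
    {
        // std.format.format can throw, so it must be wrapped here.
        auto msg = format("offsets: %s, %s", xoffset, yoffset);
    }
    catch (Exception e)
    {
        // Swallow or log; rethrowing would violate nothrow.
    }
}
```

The "Symbol Undefined" errors look like a separate, build-level problem: the missing symbols all live in the fpscamera module, which suggests its object file is not being linked; checking that the file sits where dub expects source files would be my first guess.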
Re: Using OpenGL
On Friday, 16 September 2016 at 01:54:50 UTC, Mike Parker wrote: [snip] Okay, I actually had GL_RGB for those two fields and it didn't work, but I guess I didn't try them again after I fixed the crash issue, because now it works fine. Thanks again for the guidance!
Re: Using OpenGL
On Thursday, 15 September 2016 at 19:03:22 UTC, Darren wrote: This is the code I'm using: https://dfcode.wordpress.com/2016/09/15/texcodewip/ (The code for the shaders is at the bottom.) For comparison, this is the code I'm trying to make work: http://www.learnopengl.com/code_viewer.php?code=getting-started/textures

I don't have time to try and test this myself, but this looks like a potential problem here:

```
auto textureFormat = surface.format.format;
glTexImage2D(GL_TEXTURE_2D, 0, surface.format.BytesPerPixel, surface.w, surface.h, 0, textureFormat, GL_UNSIGNED_BYTE, surface.pixels);
```

You're passing an SDL enum value to OpenGL, which knows absolutely nothing about SDL enum values. Rather than passing surface.format.format directly to OGL, you need to first match it to the appropriate OGL format enum and pass *that*, which is what I tried to explain in my previous post. This is easily done by writing a function containing a switch statement that covers all of the SDL_PIXELFORMAT_* values and returns the equivalent GL_* value.

At any rate, if you're using the same image as the example, you don't need to do that in this specific case. You can safely ignore surface.format.format and simply pass GL_RGB, which is what the example uses. You only need to do the SDL->GL conversion for a texture loader that is intended to load multiple pixel formats.
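The switch-based mapping described above might be sketched like this; the cases are illustrative only, a real loader would cover every format it expects to meet and mind byte order:

```d
import derelict.sdl2.sdl;
import derelict.opengl3.gl3;

// Sketch: translate an SDL_PIXELFORMAT_* value into the nearest
// OpenGL pixel transfer format.
GLenum toGLFormat(uint sdlFormat)
{
    switch (sdlFormat)
    {
        case SDL_PIXELFORMAT_RGB24:
            return GL_RGB;
        case SDL_PIXELFORMAT_BGR24:
            return GL_BGR;
        case SDL_PIXELFORMAT_ABGR8888:
            return GL_RGBA; // byte order as seen on little-endian machines
        default:
            throw new Exception("Unsupported SDL pixel format");
    }
}
```

It would then be used as `glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, surface.w, surface.h, 0, toGLFormat(surface.format.format), GL_UNSIGNED_BYTE, surface.pixels);`.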
Re: Using OpenGL
On Thursday, 15 September 2016 at 02:11:03 UTC, Mike Parker wrote: [snip] Okay, the crashing was my fault, more or less a copy-paste error. The program now runs but has a black rectangle where a texture should be. This is the code I'm using: https://dfcode.wordpress.com/2016/09/15/texcodewip/ (The code for the shaders is at the bottom.) For comparison, this is the code I'm trying to make work: http://www.learnopengl.com/code_viewer.php?code=getting-started/textures
Re: Using OpenGL
On Wednesday, 14 September 2016 at 16:49:51 UTC, Darren wrote: While googling, the idea seemed to be to create an SDL_Surface* and pass that (or surface.pixels) as the last argument for glTexImage2D. Didn't work for me, however. Does anyone have any tips?

I'm driving blind here without any of your code to see, but here's some general advice. The last three arguments to glTexImage2D [1] are what you use to describe the image data you are sending it (aside from 'width' and 'height', which also set the size of the texture being created).

`format` needs to match the format of your image. For most simple things you do with OpenGL, that's going to be GL_RGB or GL_RGBA. If you don't know the format of your image, you can get it from the SDL_PixelFormatEnum value [2] in surface.format.format. Then you'll just need to send the corresponding GL enum to glTexImage2D. Though, be aware that the way SDL and OGL treat color formats may not always correspond in the way you think they should [3].

`type` is almost always going to be GL_UNSIGNED_BYTE. I don't know that SDL_image supports anything else, like floating point formats, anyway.

`data` needs to be a pointer to the pixel data and nothing else (never an SDL_Surface!), so in this case surface.pixels is correct.

If you are getting a crash, check the return value of every image loading function you call to make sure the load was successful. That's the first place I'd look.

[1] https://www.opengl.org/sdk/docs/man/html/glTexImage2D.xhtml
[2] https://wiki.libsdl.org/SDL_PixelFormatEnum
[3] https://bugzilla.libsdl.org/show_bug.cgi?id=2111
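On that last point, a checked loader might be sketched like this (assuming derelict-sdl2's image binding; IMG_GetError is SDL_image's standard error query):

```d
import std.conv : to;
import std.string : toStringz;
import derelict.sdl2.sdl;
import derelict.sdl2.image;

// Load an image and fail loudly rather than handing OpenGL a null surface.
SDL_Surface* loadSurface(string path)
{
    auto surface = IMG_Load(path.toStringz);
    if (surface is null)
        throw new Exception("IMG_Load failed for " ~ path ~ ": "
                            ~ to!string(IMG_GetError()));
    return surface;
}
```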
Re: Using OpenGL
Kind of resurrecting this thread; hope that's okay. I'm working through this tutorial: http://www.learnopengl.com/#!Getting-started/Textures It uses SOIL to load images, but I haven't seen any SOIL bindings in dub. I tried using SDL and SDL_Image, but when the program ran it just crashed. I guess I was doing something wrong. While googling, the idea seemed to be to create an SDL_Surface* and pass that (or surface.pixels) as the last argument for glTexImage2D. Didn't work for me, however. Does anyone have any tips?
Re: Using OpenGL
On Monday, 5 September 2016 at 05:14:56 UTC, Nicholas Wilson wrote: On Saturday, 3 September 2016 at 17:13:49 UTC, Darren wrote: Now I wonder if I can load shaders from separate files (à la http://www.opengl-tutorial.org/beginners-tutorials/tutorial-2-the-first-triangle/). see: https://p0nce.github.io/d-idioms/#Embed-a-dynamic-library-in-an-executable for `import`ing external files at compile time. It's often useful to be able to reload shaders at runtime.
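To make the idiom concrete, a sketch (the directory name `shaders` and the file names are assumptions): with `"stringImportPaths": ["shaders"]` in dub.json, a file can be baked into the executable at compile time, while the runtime-reload workflow Nicholas mentions is plain file I/O:

```d
// Compile-time embedding: the file's contents become a string constant.
enum vertexShaderSource = import("vertex.glsl");

// Runtime loading, for reloading shaders while the program runs.
string loadShader(string path)
{
    import std.file : readText;
    return readText(path);
}
```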
Re: Using OpenGL
On Saturday, 3 September 2016 at 17:13:49 UTC, Darren wrote: On Saturday, 3 September 2016 at 16:07:52 UTC, Mike Parker wrote: [...] The dynamic array! Thank you so much, I changed that in another file and it finally drew the triangle. And I ran your code and it works brilliantly. I should now be in a comfortable position to digest all this information. Can't thank you enough. [...] Yeah, it is nicer to read. Now I wonder if I can load shaders from separate files (à la http://www.opengl-tutorial.org/beginners-tutorials/tutorial-2-the-first-triangle/). See: https://p0nce.github.io/d-idioms/#Embed-a-dynamic-library-in-an-executable for `import`ing external files at compile time.
Re: Using OpenGL
On Saturday, 3 September 2016 at 16:07:52 UTC, Mike Parker wrote: On Saturday, 3 September 2016 at 16:01:34 UTC, Mike Parker wrote: The following compiles, runs, and shows the triangle. It's the code you posted above with the corrected call to glBufferData along with more D style (as I would write it anyway) and less C. The dynamic array! Thank you so much, I changed that in another file and it finally drew the triangle. And I ran your code and it works brilliantly. I should now be in a comfortable position to digest all this information. Can't thank you enough. One thing I overlooked. In lines where a variable is both declared and initialized, like this one: GLFWwindow* window = glfwCreateWindow(...); I normally let the compiler use type inference, as I did in the manifest constant declarations: auto window = glfwCreateWindow(...); IMO, when you're dealing with long or ugly type names, it makes the code look cleaner. Yeah, it is nicer to read. Now I wonder if I can load shaders from separate files (à la http://www.opengl-tutorial.org/beginners-tutorials/tutorial-2-the-first-triangle/).
Re: Using OpenGL
On Saturday, 3 September 2016 at 16:01:34 UTC, Mike Parker wrote: The following compiles, runs, and shows the triangle. It's the code you posted above with the corrected call to glBufferData along with more D style (as I would write it anyway) and less C. One thing I overlooked. In lines where a variable is both declared and initialized, like this one: GLFWwindow* window = glfwCreateWindow(...); I normally let the compiler use type inference as I did in the manifest constant declarations: auto window = glfwCreateWindow(...); IMO, when you're dealing with long or ugly type names, it makes the code look cleaner.
Re: Using OpenGL
On Saturday, 3 September 2016 at 12:40:58 UTC, Darren wrote: I went through another tutorial. Changed the source code and left out the shaders. I get another coloured background but still no triangle. I have a feeling that glBufferData(GL_ARRAY_BUFFER, cast(int)g_vertex_buffer_data.sizeof, cast(void*)g_vertex_buffer_data, GL_STATIC_DRAW);

Your vertices array is declared as a dynamic array. That means vertices.sizeof gives you the size of the *array reference*, not the data it contains. For a static array, you get the cumulative size of the data. See my changes in the code below.

(In the first line, glBufferData wants an int, and .sizeof returns a uint, apparently.)

That's irrelevant in this case. But .sizeof yields a size_t, which is uint on 32-bit systems and ulong on 64-bit systems.

I'm sure there's something simple I'm missing but I just don't have the experience to recognise it.

The following compiles, runs, and shows the triangle. It's the code you posted above with the corrected call to glBufferData, along with more D style (as I would write it anyway) and less C. Comments are inline. Probably best if you copy and paste it into an editor.

```
import std.stdio, std.format;
import derelict.glfw3.glfw3;
import derelict.opengl3.gl3;

// Match the Derelict declaration of GLFW callbacks
// https://github.com/DerelictOrg/DerelictGLFW3/blob/master/source/derelict/glfw3/types.d#L319
extern(C) nothrow
{
    // Setting an error handler will let you get better error messages from GLFW
    // when something fails.
    // http://www.glfw.org/docs/latest/intro_guide.html#error_handling
    void onError(int error, const(char)* msg)
    {
        import std.conv : to;
        try
        {
            // The callback is nothrow, but format is not, hence the try...catch
            errMsg = format("GLFW Error #%s: %s", error, to!string(msg));
        }
        catch(Exception e) {}
    }

    void key_callback(GLFWwindow* window, int key, int scancode, int action, int mode)
    {
        if (key == GLFW_KEY_ESCAPE && action == GLFW_PRESS)
            glfwSetWindowShouldClose(window, GL_TRUE);
    }
}

// This will save the error message from the callback
private auto errMsg = "No Error";

// Use manifest constants rather than const
// https://dlang.org/spec/enum.html#manifest_constants
enum WIDTH = 800;
enum HEIGHT = 600;

// Rather than using multiple string literals with manual newlines, use
// WYSIWYG strings as manifest constants. Looks cleaner (but requires an
// extra step when calling glShaderSource).
enum vertexShaderSource = `#version 330 core
layout (location = 0) in vec3 position;
void main()
{
    gl_Position = vec4(position.x, position.y, position.z, 1.0);
}`;

enum fragmentShaderSource = `#version 330 core
out vec4 color;
void main()
{
    color = vec4(1.0f, 0.5f, 0.2f, 1.0f);
}`;

void main()
{
    DerelictGLFW3.load();
    DerelictGL3.load();

    // Set the error callback before calling glfwInit so that a useful message
    // can be reported on failure
    glfwSetErrorCallback(&onError);

    // Always check for failure
    if(!glfwInit())
        throw new Exception("Failed to init GLFW: " ~ errMsg);

    // Ensure that glfwTerminate is called even when an exception is thrown
    scope(exit) glfwTerminate();

    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    // Optional: Remove deprecated functionality
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, 1);
    glfwWindowHint(GLFW_RESIZABLE, GL_FALSE);

    // Always check for failure
    GLFWwindow* window = glfwCreateWindow(WIDTH, HEIGHT, "LearnOpenGL", null, null);
    if(!window)
        throw new Exception("Failed to create window");

    glfwMakeContextCurrent(window);
    DerelictGL3.reload();
    glfwSetKeyCallback(window, &key_callback);

    int width, height;
    glfwGetFramebufferSize(window, &width, &height);
    glViewport(0, 0, width, height);

    GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);

    // Because I switched the shader source to D strings, a slight adjustment
    // is needed here:
    const(char)* srcPtr = vertexShaderSource.ptr;
    glShaderSource(vertexShader, 1, &srcPtr, null);
    glCompileShader(vertexShader);

    GLint result;

    // Use dynamic arrays for the info logs so that you can always
    // ensure you have enough room. And throw exceptions on failure.
    glGetShaderiv(vertexShader, GL_COMPILE_STATUS, &result);
    if (!result)
    {
        glGetShaderiv(vertexShader, GL_INFO_LOG_LENGTH, &result);
        auto infoLog = new char[](result);
        glGetShaderInfoLog(vertexShader, result, null, infoLog.ptr);
        throw new Exception(format("Failed to compile vertex shader:\n\t%s", infoLog));
    }

    GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
    srcPtr = fragmentShaderSource.ptr;
    glShaderSource(fragmentShader, 1, &srcPtr, null);
```
Re: Using OpenGL
On Saturday, 3 September 2016 at 11:02:11 UTC, Lodovico Giaretta wrote: glGetShaderInfoLog(vertexShader, 512, null, &infoLog[0]); I prefer: glGetShaderInfoLog(vertexShader, 512, null, infoLog.ptr);
Re: Using OpenGL
On Saturday, 3 September 2016 at 11:27:09 UTC, Mike Parker wrote: On Saturday, 3 September 2016 at 11:13:30 UTC, Lodovico Giaretta wrote: Ah! Well, providing error messages is always useful. Now I see your issue: your callback has D linkage, but OpenGL expects a function with C linkage. So you have to put `extern(C)` on your callback declaration. Well, it's GLFW, not OpenGL, but yes, they do need to be extern(C) and also nothrow, as that is how the callback types are declared in Derelict:

```
extern(C) nothrow void key_callback(GLFWwindow* window, int key, int scancode, int action, int mode)
{
    if (key == GLFW_KEY_ESCAPE && action == GLFW_PRESS)
        glfwSetWindowShouldClose(window, GL_TRUE);
}
```

Hey, it worked! Thanks a lot, I know what to do in the future now. Just need to figure out why this triangle isn't showing up and I should be well on my way.
Re: Using OpenGL
On Saturday, 3 September 2016 at 11:13:30 UTC, Lodovico Giaretta wrote: Ah! Well, providing error messages is always useful. Now I see your issue: your callback has D linkage, but OpenGL expects a function with C linkage. So you have to put `extern(C)` on your callback declaration. Well, it's GLFW, not OpenGL, but yes, they do need to be extern(C) and also nothrow, as that is how the callback types are declared in Derelict:

```
extern(C) nothrow void key_callback(GLFWwindow* window, int key, int scancode, int action, int mode)
{
    if (key == GLFW_KEY_ESCAPE && action == GLFW_PRESS)
        glfwSetWindowShouldClose(window, GL_TRUE);
}
```
Re: Using OpenGL
On Saturday, 3 September 2016 at 11:02:11 UTC, Lodovico Giaretta wrote:

```
//glGetShaderInfoLog(vertexShader, 512, null, infoLog);
glGetShaderInfoLog(vertexShader, 512, null, &infoLog[0]);
```

Thank you, I knew I had to do something like this!

```
//glfwSetKeyCallback(window, key_callback);
glfwSetKeyCallback(window, &key_callback);
```

I actually tried this before but it doesn't work. I get the following error:

```
Error: function pointer glfwSetKeyCallback (GLFWwindow*, extern (C) void function(GLFWwindow*, int, int, int, int) nothrow) is not callable using argument types (GLFWwindow*, void function(GLFWwindow* window, int key, int scancode, int action, int mode))
```
Re: Using OpenGL
It's not quite in a practically-usable state yet, but the SDL2 & OpenGL wrapper I'm working on may interest you as an example implementation if nothing else. https://github.com/pineapplemachine/mach.d/tree/master/mach/sdl

I'm going to take a look at this, once I get my bearings, on the merits of the name alone! I'm sure there are a few tutorials that make use of SDL2 that I can use. My hope is that once I know how to set up, learning OpenGL will be more a matter of D-ifying the C code. I'll post the modified code I'm trying to get to work in full below. Some code is commented out because I couldn't make it work. Also, any tips for making the code nicer would be welcome (e.g. would I change the const GLuint into enums?).

```
import derelict.glfw3.glfw3;
import derelict.opengl3.gl3;
import std.stdio;

void key_callback(GLFWwindow* window, int key, int scancode, int action, int mode)
{
    if (key == GLFW_KEY_ESCAPE && action == GLFW_PRESS)
        glfwSetWindowShouldClose(window, GL_TRUE);
}

const GLuint WIDTH = 800, HEIGHT = 600;

const GLchar* vertexShaderSource =
    "#version 330 core\n"
    "layout (location = 0) in vec3 position;\n"
    "void main()\n"
    "{\n"
    "gl_Position = vec4(position.x, position.y, position.z, 1.0);\n"
    "}\0";

const GLchar* fragmentShaderSource =
    "#version 330 core\n"
    "out vec4 color;\n"
    "void main()\n"
    "{\n"
    "color = vec4(1.0f, 0.5f, 0.2f, 1.0f);\n"
    "}\n\0";

void main()
{
    DerelictGLFW3.load();
    DerelictGL3.load();

    glfwInit();
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_RESIZABLE, GL_FALSE);

    GLFWwindow* window = glfwCreateWindow(WIDTH, HEIGHT, "LearnOpenGL", null, null);
    glfwMakeContextCurrent(window);

    DerelictGL3.reload();

    //glfwSetKeyCallback(window, key_callback);

    int width, height;
    glfwGetFramebufferSize(window, &width, &height);
    glViewport(0, 0, width, height);

    GLuint vertexShader = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vertexShader, 1, &vertexShaderSource, null);
    glCompileShader(vertexShader);

    GLint success;
    GLchar[512] infoLog;
    glGetShaderiv(vertexShader, GL_COMPILE_STATUS, &success);
    if (!success)
    {
        //glGetShaderInfoLog(vertexShader, 512, null, infoLog);
        writeln("ERROR::SHADER::VERTEX::COMPILATION_FAILED\n", infoLog);
    }

    GLuint fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fragmentShader, 1, &fragmentShaderSource, null);
    glCompileShader(fragmentShader);
    glGetShaderiv(fragmentShader, GL_COMPILE_STATUS, &success);
    if (!success)
    {
        //glGetShaderInfoLog(fragmentShader, 512, null, infoLog);
        writeln("ERROR::SHADER::FRAGMENT::COMPILATION_FAILED\n", infoLog);
    }

    GLuint shaderProgram = glCreateProgram();
    glAttachShader(shaderProgram, vertexShader);
    glAttachShader(shaderProgram, fragmentShader);
    glLinkProgram(shaderProgram);
    glGetProgramiv(shaderProgram, GL_LINK_STATUS, &success);
    if (!success)
    {
        //glGetProgramInfoLog(shaderProgram, 512, null, infoLog);
        writeln("ERROR::SHADER::PROGRAM::LINKING_FAILED\n", infoLog);
    }

    glDeleteShader(vertexShader);
    glDeleteShader(fragmentShader);

    GLfloat[] vertices = [
        -0.5f, -0.5f, 0.0f,
         0.5f, -0.5f, 0.0f,
         0.0f,  0.5f, 0.0f
    ];

    GLuint VBO, VAO;
    glGenVertexArrays(1, &VAO);
    glGenBuffers(1, &VBO);
    glBindVertexArray(VAO);
    glBindBuffer(GL_ARRAY_BUFFER, VBO);
    glBufferData(GL_ARRAY_BUFFER, vertices.sizeof, cast(void*)vertices, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 3 * GLfloat.sizeof, cast(GLvoid*)0);
    glEnableVertexAttribArray(0);
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindVertexArray(0);

    while (!glfwWindowShouldClose(window))
    {
        glfwPollEvents();
        glClearColor(0.2f, 0.3f, 0.3f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        glUseProgram(shaderProgram);
        glBindVertexArray(VAO);
        glDrawArrays(GL_TRIANGLES, 0, 3);
        glBindVertexArray(0);
        glfwSwapBuffers(window);
    }

    glDeleteVertexArrays(1, &VAO);
    glDeleteBuffers(1, &VBO);
    glfwTerminate();
}
```
Re: Using OpenGL
Thanks for the information. The errors for the tutorial I _was_ trying to make work are as follows:

```
source\app.d(9,5): Error: undefined identifier 'Window', did you mean variable 'window'?
source\app.d(98,12): Error: undefined identifier 'Window', did you mean variable 'window'?
source\app.d(101,14): Error: undefined identifier 'GLBuffer'
source\app.d(104,14): Error: undefined identifier 'Shader'
source\app.d(107,13): Error: undefined identifier 'Program', did you mean variable 'program'?
source\app.d(110,15): Error: undefined identifier 'Attribute'
```

I thought I might have needed another package for these, and gfm seemed to contain what I need in the form of the opengl sub-package. But even after importing that, it only gets rid of the GLBuffer error.

I tried to follow another tutorial. Link to source code: http://www.learnopengl.com/code_viewer.php?code=getting-started/hellotriangle I'm having more success with this one. I pretty much hacked away at this and did my best to convert from C-style code to D. The window gets made and it has the green-ish background, but it's not drawing any triangles. Should I copy/paste the code I'm using in case I made a mistake?
Re: Using OpenGL
On Friday, 2 September 2016 at 20:38:15 UTC, Darren wrote: I'm trying to teach myself OpenGL but I'm not sure how to set it up exactly. I used "dub init" to create a project, I downloaded glew32.dll and glfw.dll and put them in this folder. I added "derelict-gl3": "~>1.0.19" and "derelict-glfw3": "~>3.1.0" to dependencies, and imported them in the app.d file. I add the load() and reload() functions where appropriate (I assume).

You don't need GLEW. It's a library for C and C++ that loads all of the OpenGL functions and extensions available in the context you create. DerelictGL3 does that for you in D. DerelictGL3.load loads the OpenGL DLL and the functions up to OGL 1.1, and DerelictGL3.reload loads all the functions and extensions available in the current context.

I can run a simple window program and it seems to work fine, but I notice that's when all the functions begin with "glfw".

Yes, GLFW is a simple windowing toolkit for creating windows & OpenGL contexts and managing window & input events in a cross-platform manner. Without it, you would need to use the system APIs yourself (such as Win32 on Windows) or another cross-platform library like SDL or SFML.

If I try to follow a tutorial for loading a triangle, I get errors when trying to build. Do I need to link an opengl.dll file, too?

No, you do not need to link to OpenGL. By default, Derelict loads shared libraries at runtime, so you will never have a link-time dependency on the C libraries Derelict binds to. Please post the errors you are seeing.
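The load/reload sequence described above boils down to something like the following sketch (window size and title are placeholders):

```d
import derelict.glfw3.glfw3;
import derelict.opengl3.gl3;

void main()
{
    DerelictGLFW3.load(); // load the GLFW shared library at runtime
    DerelictGL3.load();   // load the OpenGL DLL; functions up to GL 1.1

    if (!glfwInit())
        throw new Exception("glfwInit failed");
    scope(exit) glfwTerminate();

    auto window = glfwCreateWindow(640, 480, "Test", null, null);
    if (!window)
        throw new Exception("Window creation failed");
    glfwMakeContextCurrent(window);

    // Only once a context is current can the remaining GL functions
    // and extensions be loaded.
    DerelictGL3.reload();
}
```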
Re: Using OpenGL (custom port)
David Butler wrote: Is there a difference between a pointer to a D array and an int* in C? How do I convert between the two? Am I even looking in the right place? Thanks for any help! -Dave Butler

There's nothing wrong with your int array. What does GetLastError tell you? According to the spec you can just pass null as the attribs anyway to get a default; try that. The only other thing I can suggest is that maybe you've got the calling convention of PFNWGLCREATECONTEXTATTRIBSARBPROC wrong. But you only need to declare it in an extern (Windows) block.

Just out of curiosity, where did you get the C headers for OpenGL 3? My card doesn't support it, unfortunately, so I've not bothered getting hold of it so far.

--
My enormous talent is exceeded only by my outrageous laziness.
http://www.ssTk.co.uk
Re: Using OpenGL (custom port)
I got it working; it did turn out to be a slight error in the type declaration. I had this:

```
typedef HGLRC function (HDC hDC, HGLRC hShareContext, in int *attribList) PFNWGLCREATECONTEXTATTRIBSARBPROC;
```

And it needed to be:

```
typedef extern (Windows) ...everything else...
```

Whoops. But I can now successfully create a 3.0 context. Yay!

You can get the latest extension headers for OpenGL from http://www.opengl.org/registry/#headers (glext.h, wglext.h, glxext.h). You should only need two of them, depending on the platform (I think), and just use the standard, old-timey gl.h from your system. glext.h is huge, but pretty easy to convert if you have a regex search/replace tool.

div0 wrote: [...]
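In present-day D, alias declarations replace the old typedef, but the lesson carries over unchanged: the calling convention is part of the function pointer type. A sketch (HGLRC declared locally here for self-containment):

```d
version(Windows)
{
    import core.sys.windows.windows : HDC;

    alias HGLRC = void*; // opaque GL context handle

    // extern(Windows) must appear inside the alias, exactly as the
    // poster discovered with typedef.
    alias PFNWGLCREATECONTEXTATTRIBSARBPROC =
        extern(Windows) HGLRC function(HDC hDC, HGLRC hShareContext,
                                       const(int)* attribList);
}
```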
Re: Using OpenGL (custom port)
Moritz Warning wrote: Please also keep in mind that only array literals are \0 terminated.

char/wchar/dchar array literals only, I think. Bye, bearophile
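The caveat is easy to check, and std.string.toStringz is the standard way to guarantee a terminator before handing a D string to C:

```d
import std.string : toStringz;

void main()
{
    // String literals are guaranteed to be followed by '\0'.
    enum lit = "hello";
    assert(lit.ptr[lit.length] == '\0');

    // A string assembled at runtime carries no such guarantee,
    // so convert it before passing it to a C API.
    string name = "tex";
    string path = name ~ ".png";
    const(char)* cpath = path.toStringz;
    assert(cpath[path.length] == '\0');
}
```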
Re: Using OpenGL (custom port)
Oops, I edited part of the code. Just substitute in 'tmpContext' anywhere you see 'context'.