On 11/02/2011 05:46 PM, Alan Coopersmith wrote:
On 11/02/11 04:54, walter harms wrote:
On 01.11.2011 23:42, Alan Coopersmith wrote:
Signed-off-by: Alan Coopersmith <[email protected]>
---
glx/single2.c | 4 +---
1 files changed, 1 insertions(+), 3 deletions(-)
diff --git a/glx/single2.c b/glx/single2.c
index 9884f40..9f8254b 100644
--- a/glx/single2.c
+++ b/glx/single2.c
@@ -351,12 +351,10 @@ int DoGetString(__GLXclientState *cl, GLbyte *pc, GLboolean need_swap)
}
else if ( name == GL_VERSION ) {
if ( atof( string ) > atof( GLServerVersion ) ) {
- buf = malloc( strlen( string ) + strlen( GLServerVersion ) + 4 );
- if ( buf == NULL ) {
+ if ( asprintf(&buf, "%s (%s)", GLServerVersion, string ) == -1) {
string = GLServerVersion;
}
else {
- sprintf( buf, "%s (%s)", GLServerVersion, string );
string = buf;
}
}
I am not sure that falling back to string = GLServerVersion is the
right idea in an OOM condition.
Calling exit(1) would give the system a chance to recover.
I kept the existing handling. The X server should be more robust than the
average program: if it goes down, your whole desktop goes with it, so it's
better to let some other program die first to free memory.
I believe the existing (and new) behavior is correct. The version that
the application (on the other end of the wire) sees will ultimately be
the same either way. The difference is that it will see "1.4" instead
of "1.4 (2.1)", for example.
Reviewed-by: Ian Romanick <[email protected]>