----- Original Message -----
> On Tue, Aug 07, 2012 at 03:43:11PM -0300, Erlon Cruz wrote:
> > From: Erlon Cruz <[email protected]>
> > 
> > Signed-off-by: Erlon R. Cruz <[email protected]>
> > Signed-off-by: Fabiano Fidêncio
> > <Fabiano.Fidê[email protected]>
> > Signed-off-by: Rafael F. Santos
> > <[email protected]>
> > ---
> >  server/glz_encoder.c |    2 +-
> >  1 files changed, 1 insertions(+), 1 deletions(-)
> > 
> > diff --git a/server/glz_encoder.c b/server/glz_encoder.c
> > index 1ec1f9b..f8415a8 100644
> > --- a/server/glz_encoder.c
> > +++ b/server/glz_encoder.c
> > @@ -261,7 +261,7 @@ int glz_encode(GlzEncoderContext
> > *opaque_encoder,
> >      encoder->cur_image.id = dict_image->id;
> >      encoder->cur_image.first_win_seg = dict_image->first_seg;
> >  
> > -    encode_32(encoder, LZ_MAGIC);
> > +    encode_32(encoder, htole32(LZ_MAGIC));

The correct thing to do here is to drop LZ_MAGIC entirely; no one needs it,
and the same goes for the version fields that follow it. If you insist on
keeping them, change each to 1 byte and then there's one less
endianness issue
(http://spice-space.org/page/ProtocolChanges#QUIC_and_GLZ_RGB_version_numbers_should_use_1_byte_per_number)
Y.

> 
> LZ_MAGIC really should be a string that we encode using something like
> encode_array(encoder, LZ_MAGIC, sizeof(LZ_MAGIC)), but we don't have
> such facilities in glz_encoder.c, so why not. Have you tested that a
> ppc-encoded glz stream can be decoded on an x86 client? (Thinking of
> it, this might be the only setup you have tested for now.)
> 
> Christophe
> 
> _______________________________________________
> Spice-devel mailing list
> [email protected]
> http://lists.freedesktop.org/mailman/listinfo/spice-devel
>