Yes - it's taken from the wild (an HTML page on the internet), then turned
into XML, then a table extracted, etc. - so it looks to me like some
non-UTF-8 bytes have got in there somewhere.

That's why I was wondering if there is a way to filter arbitrary text and
make it UTF-8-safe - something like urlencode, but for UTF-8 / plain text. I
don't want to encode all the spaces and normal safe characters - just
whatever weird stuff got into the data.
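As a sketch of the idea (in Python rather than LiveCode, so the function name and approach here are just an illustration, not anything from the thread): you can decode the raw bytes with a lenient error handler, so that any byte sequences that are not valid UTF-8 are replaced with the Unicode replacement character (or silently dropped), while all the normal safe characters pass through untouched.

```python
def utf8_safe(raw: bytes, drop: bool = False) -> str:
    """Return a UTF-8-safe string from arbitrary bytes.

    Invalid byte sequences are replaced with U+FFFD by default,
    or dropped entirely if drop=True.
    """
    return raw.decode("utf-8", errors="ignore" if drop else "replace")

# b"\xe9" is a Latin-1 e-acute, which is not valid UTF-8 on its own.
dirty = b"caf\xe9 latte"
print(utf8_safe(dirty))            # -> 'caf\ufffd latte'
print(utf8_safe(dirty, drop=True)) # -> 'caf latte'
```

This keeps spaces and ordinary ASCII intact and only touches the bytes that would otherwise make a strict UTF-8 consumer (like a JSON encoder) choke.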

On 24 July 2015 at 00:12, Monte Goulding <mo...@sweattechnologies.com>
wrote:

>
> > On 24 Jul 2015, at 7:22 am, David Bovill <david@viral.academy> wrote:
> >
> > I'm placing the text into an array and then using Monte's mergJsonEncode
> > function to decode it. Usually works fine - but in this case it looks
> like
> > the content needs some tidying before I put it into the array.
>
> mergJSON will choke on anything that’s not utf8. Is it possible there’s
> some other encoded data there or something you are doing with the data is
> messing with the encoding?
>
> --
> M E R Goulding <http://goulding.ws/>
> Software development services
> Bespoke application development for vertical markets
>
> mergExt <http://mergext.com/> - There's an external for that!
>
> _______________________________________________
> use-livecode mailing list
> use-livecode@lists.runrev.com
> Please visit this url to subscribe, unsubscribe and manage your
> subscription preferences:
> http://lists.runrev.com/mailman/listinfo/use-livecode