On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
Okay, so I turned on my Windows VM with a different version of FPC and ran
VerifyUnicodeChars with both FPJson and JsonTools. The results are the
same. JsonTools sees the Unicode correctly, and something is wrong when
using FPJson. I don't know what the problem is, but other people are
noticing sim
On Fri, 30 Aug 2019, Bart via lazarus wrote:
On Fri, Aug 30, 2019 at 9:09 PM Bart wrote:
On Windows it prints FALSE, both with 3.0.4 and trunk r42348
It fails on both comparisons (hexadecimal representation of the
returned unicodestrings):
Name: 004A 006F 0065 003F 0053 0063 0068 006
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
I am not sure how, under any circumstances, parsing JSON from a stream
source would be any faster than parsing a string.
If you would check the fpjson code, you'd see why.
You'd also see why there is plenty of room for improvement.
Also wi
For those tracking the unicode issue, could you please verify the problem
does not present in my JsonTools library on compiler revisions and
platforms? I always get true (passed) with my library, but not with any
other library. Here is the relevant test:
function VerifyUnicodeChars: Boolean;
const
On Fri, Aug 30, 2019 at 9:09 PM Bart wrote:
> On Windows it prints FALSE, both with 3.0.4 and trunk r42348
It fails on both comparisons (hexadecimal representation of the
returned unicodestrings):
Name: 004A 006F 0065 003F 0053 0063 0068 006D 006F 0065
Expected: 004A 006F 0065 00AE 0053 006
On Fri, Aug 30, 2019 at 4:04 PM Michael Van Canneyt via lazarus
wrote:
> No idea. I tested with both 3.0.4 and trunk. Both give the same result.
>
> Here are the sources I used:
...
> I test on Linux, but could try Windows.
On Windows it prints FALSE, both with 3.0.4 and trunk r42348
--
Bart
I am not sure how, under any circumstances, parsing JSON from a stream
source would be any faster than parsing a string. Also, with regard to
timing, I am not sure how accurate Now is. For this purpose I've written:
{ Return a time based on system performance counters }
function TimeQuery: Double;
Imp
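The quoted snippet cuts off before the function body. A minimal sketch of what a performance-counter based TimeQuery might look like on Windows, using the standard QueryPerformanceCounter/QueryPerformanceFrequency calls (an assumption for illustration, not Anthony's actual implementation):

```pascal
uses Windows;

{ Hypothetical sketch: seconds derived from the high-resolution
  performance counter. Not the original TimeQuery implementation. }
function TimeQuery: Double;
var
  Freq, Count: Int64;
begin
  QueryPerformanceFrequency(Freq);  { counter ticks per second }
  QueryPerformanceCounter(Count);   { current tick count }
  Result := Count / Freq;           { convert ticks to seconds }
end;
```

Unlike Now, which is derived from the wall clock and has limited resolution, a counter like this is monotonic and suitable for micro-benchmarks; on other platforms clock_gettime(CLOCK_MONOTONIC) would be the usual substitute.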
On 30/08/2019 18:58, Zoë Peterson via lazarus wrote:
What controls what appears in the "Debugger type and path" dropdown in
the Debugger backend preferences?
I have two macOS systems, one running 10.13 and one 10.14 that have
different items listed. The 10.13 one lists fpdebug, GDB, and LLDB
based ones (6 total) and I've been successfully using
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
Alan, oh that's a good idea. I will do that, as well as add a few more
parser libraries as requested by a few people in other, non-mailing-list
threads. I will also try to find out what's going on with the Unicode
strings, as it might be a problem with the compiler.
Michael,
I am on Linux as well, but I
It may be a bug which was fixed in FPC trunk; there was a Unicode issue in
3.0.4.
>
> On my system with FPJson the test is failing on "bank teller \u00Ae ",
> but when using approximately the same code with JSONTools it always
> passes on both "name" and "occupation". What do
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
On my system with FPJson the test is failing on "bank teller \u00Ae ",
but when using approximately the same code with JSONTools it always passes
on both "name" and "occupation". What do you think is going
on?
No idea. I t
On my system with FPJson the test is failing on "bank teller \u00Ae ",
but when using approximately the same code with JSONTools it always passes
on both "name" and "occupation". What do you think is going
on?
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
Michael,
Can you tell me why the second half of this test (N.Items[1].AsUnicodeString)
fails? This is the part that decodes "bank teller \u00Ae ".
The test fails on "Joe®Schmoe", not on "bank teller \u00Ae ".
If you WriteLn the UnicodeCh
Michael,
Can you tell me why the second half of this test (N.Items[1].AsUnicodeString)
fails? This is the part that decodes "bank teller \u00Ae ".
function VerifyUnicodeChars: Boolean;
const
UnicodeChars = '{ "name": "Joe®Schmoe", "occupation": "bank teller \u00Ae
" }';
var
N: TJSONData;
begin
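The quoted snippet is cut off at begin. A hedged reconstruction of how such a test might continue, using fpjson's GetJSON; the expected strings are inferred from the hex dumps quoted elsewhere in the thread, and this is a sketch rather than necessarily Anthony's exact code:

```pascal
uses fpjson, jsonparser;

function VerifyUnicodeChars: Boolean;
const
  UnicodeChars = '{ "name": "Joe®Schmoe", "occupation": "bank teller \u00Ae " }';
var
  N: TJSONData;
begin
  N := GetJSON(UnicodeChars);
  try
    { U+00AE is the registered sign; both values should decode to it,
      one from a literal ® in the source, one from the \u00Ae escape }
    Result := (N.Items[0].AsUnicodeString = 'Joe' + WideChar($00AE) + 'Schmoe')
          and (N.Items[1].AsUnicodeString = 'bank teller ' + WideChar($00AE) + ' ');
  finally
    N.Free;
  end;
end;
```

On the failing Windows setups, Bart's hex dump shows Items[0] coming back as 004A 006F 0065 003F ... , i.e. the literal ® has been replaced by 003F ('?'), which points at a codepage conversion of the source string rather than at the \u escape decoder.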
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
Michael,
I have a hurricane headed my way, but when I'm done evacuating I'll send
you a copy of my test. If you want to make improvements to the test program
to be sure the manner in which I am using the FPJson functions and classes
is co
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
With regards to duplicate key names, some libraries allow the same key
to be parsed more than once, resulting in multiple child nodes with the
same name. Others throw an exception when parsing an object with a
duplicate key name.
The correct way to han
Michael,
I have a hurricane headed my way, but when I'm done evacuating I'll send
you a copy of my test. If you want to make improvements to the test program
to be sure the manner in which I am using the FPJson functions and classes
is correct and send me a revised test program, then that would be
With regards to duplicate key names, some libraries allow the same key
to be parsed more than once, resulting in multiple child nodes with the
same name. Others throw an exception when parsing an object with a
duplicate key name.
The correct way to handle duplicate keys is to overwrite the existing key
when a d
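The overwrite rule described above can be illustrated with a plain string map; this is a sketch of the semantics only, not JsonTools code:

```pascal
uses Classes;

var
  Map: TStringList;
begin
  Map := TStringList.Create;
  try
    { parsing '{ "a": 1, "a": 2 }' under last-one-wins semantics }
    Map.Values['a'] := '1';
    Map.Values['a'] := '2';   { the duplicate key overwrites the first value }
    WriteLn(Map.Values['a']); { a single key "a" remains, holding 2 }
  finally
    Map.Free;
  end;
end.
```

Last-one-wins matches what most JavaScript engines do with JSON.parse, which is why it is a defensible default even though the JSON grammar itself leaves duplicate-name behavior unspecified.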
Yes, JsonTools needs a SaveToFile method if it does not already have one.
It must save formatted JSON with indentation, set by a property or global
variable (the default is usually 2).
SaveToFile must handle Unicode strings, i.e. output them with \u escapes or
the like. Use Unicode escapes for all codes >= 128, because utf8 code
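A minimal sketch of the escaping Alexey asks for, assuming UTF-16 input and escaping every code unit >= 128 so that the output is pure ASCII (a hypothetical helper for illustration, not part of JsonTools):

```pascal
uses SysUtils;

{ Hypothetical helper: escape all non-ASCII UTF-16 code units as \uXXXX. }
function EscapeNonAscii(const S: UnicodeString): AnsiString;
var
  C: WideChar;
begin
  Result := '';
  for C in S do
    if Ord(C) < 128 then
      Result := Result + AnsiChar(C)                       { ASCII passes through }
    else
      Result := Result + '\u' + IntToHex(Ord(C), 4);       { e.g. ® -> \u00AE }
end;
```

Escaping per UTF-16 code unit also handles characters outside the BMP correctly, since JSON represents those as surrogate pairs (two \uXXXX escapes) anyway. A real writer would additionally escape the quote, backslash, and control characters below 0x20.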
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
Alexey,
Currently JsonTools accepts anything that is valid JSON as described on this page:
https://www.json.org/
The only valid constants are: null, true, false
Arrays can contain other arrays and objects, to any reasonable nesting level.
[[[]]] // is a
On Fri, 30 Aug 2019, Anthony Walter wrote:
I've posted a new page that tests the speed and correctness of several
pascal based JSON parsers.
https://www.getlazarus.org/json/tests/
In full disclosure I am the author of the new open source JsonTools
library, and even though my parser seems to
Alexey,
Currently JsonTools accepts anything that is valid JSON as described on this page:
https://www.json.org/
The only valid constants are: null, true, false
Arrays can contain other arrays and objects, to any reasonable nesting level.
[[[]]] // is a valid array
[{},{},[{},{}]] // is a valid array
Objects can
This is very good news, that we have the JsonTools parser now. I may think
of using it in CudaText, i.e. replacing fpJSON with JsonTools.
About the lib: 1) Please add an option to handle // comments in JSON. Yes,
JSON doesn't allow this, but CudaText, SublimeText, and many programs
have JSON configs with
I've posted a new page that tests the speed and correctness of several
pascal based JSON parsers.
https://www.getlazarus.org/json/tests/
In full disclosure, I am the author of the new open source JsonTools
library, and even though my parser seems to be a big improvement over the
other alternatives, m