Hierarchical data (was: Light weight support for JSON)
On Wed, Aug 31, 2022 at 11:11:26AM -0400, Chet Ramey wrote:
> On 8/29/22 2:03 PM, tetsu...@scope-eye.net wrote:
>> It would also help greatly if the shell could internally handle
>> hierarchical data in variables.
> That's a fundamental change. There would have to be a better reason to
> make it than handling JSON.

I've only a little interest in handling JSON natively in bash (jq usually gets me there), but I have a strong interest in handling hierarchical data (h-data) in bash. I admit I've only had a few cases where I've been jumping through hoops to manage h-data in bash, but that's because, once it's clear h-data is a natural way to manage an issue, I would normally handle the problem in perl rather than trying to force clunky constructs into a bash script. In perl I use h-data all the time. I'm sure that if h-data were available in bash I'd be using it all the time there as well.

Chris
Re: Light weight support for JSON
On Wed, 2022-08-31 at 11:11 -0400, Chet Ramey wrote:
> On 8/29/22 2:03 PM, tetsu...@scope-eye.net wrote:
>> It would also help greatly if the shell could internally handle
>> hierarchical data in variables.
> That's a fundamental change. There would have to be a better reason to
> make it than handling JSON.

My contention is not that hierarchical variables should be added just for the sake of supporting JSON. My contention is that handling a format like JSON is, in principle, a very easy problem that the shell should be well-equipped to handle; that the proper way to add functionality to a shell, when possible, is to delegate it to another program; and that writing such a program is made significantly more difficult by the shell's near-inability to communicate or store hierarchical data. That, to me, is the case for providing both features, hierarchical variables and JSON (or a different format with comparable capabilities), within the shell: not just to support doing JSON things, but also to use that functionality to facilitate the creation of tools that solve other problems. I get that you're not sold on the idea, but I wanted to try to make the case for it. I had been working on a set of loadable modules meant as a kind of proof of concept providing these kinds of capabilities; I should really get back to that sometime.
Re: Light weight support for JSON
I am on vacation and just skimmed this long thread, so I might have missed some of the context, but I wanted to throw out that I recently wrote a loadable plugin that, among other things, can convert JSON to and from bash arrays. Anyone interested can check it out at these git repos: https://github.com/junga-com/bg-core and https://github.com/junga-com/bg-core-bash-builtins

The bg-core project is an all-bash implementation. If the compiled bgCore bash loadable from bg-core-bash-builtins is available it runs much faster, but the functionality is the same either way. (BTW, bg-core-bash-builtins is a C project which includes a module, bg_bashAPI.h/c, that is a wrapper intended to make it easier to convert bash code to C -- i.e. create and manipulate shell variables.)

The bg-core library introduces the concept of Classes and Objects to bash. A key feature enabling objects is the concept of 'allocating' bash variables on a 'heap'. The 'heap' just means that heap variables use a global variable name schema with a random component, so that code can create them dynamically without conflict. When an associative array element contains a string that matches the pattern of a heap variable name (e.g. heap__) or an object reference (_bgclassCall ), the object syntax and other aware functions like the toJSON and fromJSON functions recognize it as a heap variable reference, which can be an associative array. The upshot is that you can have nested arrays (elements of arrays can themselves be arrays).

With the ability to have nested arrays, it's very easy to map arbitrary JSON data to bash and vice versa. JSON {} objects map to bash associative arrays, and JSON [] lists/arrays map to bash numeric arrays. Each time a {} or [] construct is encountered while reading JSON data, a new bash array is allocated on the 'heap' (i.e. a new random variable name is created and used to create a global var) and its name is stored in the associative array element of its parent.
I have been using this for a while now and it's all working quite nicely.

--BobG

On 8/31/22 11:11, Chet Ramey wrote:
> On 8/29/22 2:03 PM, tetsu...@scope-eye.net wrote:
>> On 2022-08-29 11:43, Chet Ramey wrote:
>>> On 8/28/22 2:11 PM, Lawrence Velázquez wrote:
>>>> On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote:
>>>>> Wanted to get feedback about the following "extensions" to bash
>>>>> that will make it easier to work with simple JSON object. (...)
>>>>> just to make it easier to build bash scripts that cover the very
>>>>> common use case of submitting REST requests with curl (checking
>>>>> results, etc), and to perform simple processing of JSON files.
>>>> I do not think bash needs to sprout functionality to support every
>>>> data-exchange format of the month.
>>> Loadable builtins are the way to do this
>> In that case could I suggest providing a simplified, stable
>> programming interface for loadable builtins? (I understand this is
>> not a trivial thing, of course.)
> I'm not opposed, but it's not going to be a high priority for me. I
> have more pressing things to do. That is not to minimize this goal; I
> simply put my time into things that impact the maximum number of
> users. If someone wanted to make a pass at defining a useful subset
> of the (quite large) internal bash API, that would be a start.
>> As I understand it (having made a few loadable built-ins myself, but
>> possibly a bit ignorant/hazy on the details) - the process involves
>> a fair bit of hands-on with the nitty-gritty of the version of Bash
>> you're building against. A loadable needs to deal directly with
>> implementation details of how Bash handles variables, open file
>> handles, etc., and resulting code (source and binary) is very
>> sensitive to changes in Bash.
> I don't know. I suppose it depends on the complexity of what you're
> trying to do. I've not had to update the source in the examples when
> moving to a new bash version very often. It's always a good idea to
> rebuild a loadable against the bash version you're going to load it
> into, though.
>> For me personally I was struggling with questions like, how do I
>> make a loadable module that sets a variable? Sounds simple but it
>> seems to require a fair bit of attention to implementation details
>> of the shell to get it right. What if it's a local variable? What if
>> it's an array or hash element? And so on.
> There are individual functions to do all of these things, but it does
> require knowing about them, no doubt.
>> It would also help greatly if the shell could internally handle
>> hierarchical data in variables.
> That's a fundamental change. There would have to be a better reason
> to make it than handling JSON.
>
> Chet
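The heap-variable scheme BobG describes can be sketched in plain bash. (This is an illustration of the idea, not bg-core's actual API; the function and variable names here are invented.) Each nested array gets a generated global name, the parent element stores that name as a string, and a nameref follows it:

```shell
#!/usr/bin/env bash
# Sketch of "heap"-allocated nested arrays (illustrative names only,
# not the real bg-core API). A generated global name stands in for a
# child array; the parent array element stores that name.

heap_counter=0

heap_new() {
    # Result is returned in the global _heap_name: a command
    # substitution would run this in a subshell and lose the declare.
    _heap_name="heap_A$((heap_counter++))_$RANDOM"
    declare -gA "$_heap_name=()"
}

declare -A top

# Allocate a child array on the "heap" and fill it through a nameref.
heap_new
declare -n child="$_heap_name"
child[name]=alice
child[id]=42

# The parent stores only the generated name -- this is what makes the
# nesting possible, since array elements can only hold strings.
top[user]=$_heap_name

# Dereference: follow the stored name with another nameref.
declare -n user_ref="${top[user]}"
echo "${user_ref[name]}"    # alice
```

The key constraint this works around is that bash array elements can hold only strings, so the "pointer" has to be a variable name plus a naming convention that aware code recognizes.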
Re: Light weight support for JSON
On 8/29/22 2:03 PM, tetsu...@scope-eye.net wrote:
> On 2022-08-29 11:43, Chet Ramey wrote:
>> On 8/28/22 2:11 PM, Lawrence Velázquez wrote:
>>> On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote:
>>>> Wanted to get feedback about the following "extensions" to bash
>>>> that will make it easier to work with simple JSON object. (...)
>>>> just to make it easier to build bash scripts that cover the very
>>>> common use case of submitting REST requests with curl (checking
>>>> results, etc), and to perform simple processing of JSON files.
>>> I do not think bash needs to sprout functionality to support every
>>> data-exchange format of the month.
>> Loadable builtins are the way to do this
> In that case could I suggest providing a simplified, stable
> programming interface for loadable builtins? (I understand this is
> not a trivial thing, of course.)

I'm not opposed, but it's not going to be a high priority for me. I have more pressing things to do. That is not to minimize this goal; I simply put my time into things that impact the maximum number of users. If someone wanted to make a pass at defining a useful subset of the (quite large) internal bash API, that would be a start.

> As I understand it (having made a few loadable built-ins myself, but
> possibly a bit ignorant/hazy on the details) - the process involves a
> fair bit of hands-on with the nitty-gritty of the version of Bash
> you're building against. A loadable needs to deal directly with
> implementation details of how Bash handles variables, open file
> handles, etc., and resulting code (source and binary) is very
> sensitive to changes in Bash.

I don't know. I suppose it depends on the complexity of what you're trying to do. I've not had to update the source in the examples when moving to a new bash version very often. It's always a good idea to rebuild a loadable against the bash version you're going to load it into, though.

> For me personally I was struggling with questions like, how do I make
> a loadable module that sets a variable? Sounds simple but it seems to
> require a fair bit of attention to implementation details of the
> shell to get it right. What if it's a local variable? What if it's an
> array or hash element? And so on.

There are individual functions to do all of these things, but it does require knowing about them, no doubt.

> It would also help greatly if the shell could internally handle
> hierarchical data in variables.

That's a fundamental change. There would have to be a better reason to make it than handling JSON.

Chet
-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, UTech, CWRU    c...@case.edu    http://tiswww.cwru.edu/~chet/
Re: Light weight support for JSON
Greg Wooledge writes:
> The standard idiom for this sort of thing is
>
>     eval "$(external-tool)"
>
> This means you need to *trust* the external-tool to produce safe code.

True. And I use that idiom with ssh-agent routinely. But it still strikes me as unnatural.

Dale
Re: Light weight support for JSON
On 2022-08-29 11:43, Chet Ramey wrote:
> On 8/28/22 2:11 PM, Lawrence Velázquez wrote:
>> On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote:
>>> Wanted to get feedback about the following "extensions" to bash that
>>> will make it easier to work with simple JSON object. (...) just to
>>> make it easier to build bash scripts that cover the very common use
>>> case of submitting REST requests with curl (checking results, etc),
>>> and to perform simple processing of JSON files.
>> I do not think bash needs to sprout functionality to support every
>> data-exchange format of the month.
> Loadable builtins are the way to do this

In that case could I suggest providing a simplified, stable programming interface for loadable builtins? (I understand this is not a trivial thing, of course.)

As I understand it (having made a few loadable built-ins myself, but possibly a bit ignorant/hazy on the details), the process involves a fair bit of hands-on with the nitty-gritty of the version of Bash you're building against. A loadable needs to deal directly with implementation details of how Bash handles variables, open file handles, etc., and the resulting code (source and binary) is very sensitive to changes in Bash. For me personally I was struggling with questions like: how do I make a loadable module that sets a variable? Sounds simple, but it seems to require a fair bit of attention to implementation details of the shell to get it right. What if it's a local variable? What if it's an array or hash element? And so on.

I understand the hesitancy to support "format of the month" data formats directly in the shell. Though I think JSON has proven itself to be rather worth supporting, personally: to call it a "format of the month" is almost a bit comical. It's been quite relevant for the last 15 years or so at least.

In principle, "extending" the shell to support some new data structure should be a task we can delegate to an external program. (That is, basically, the ideal of the Unix shell.) In practice, though, if we use an external program to parse a hierarchical format like JSON, it can't effectively communicate the result back to the shell. To do so, it would need to encode the data, and the encoded data would have roughly the same complexity as JSON. For the external tool to actually simplify the process, the shell and the tool would have to share a common language. (JSON could serve as such a language.)

It would also help greatly if the shell could internally handle hierarchical data in variables. (That is a pretty basic problem facing any potential implementation of a JSON parser for Bash: when the data is parsed, what does the parser do with it? It can't store the result in a variable, because variables aren't hierarchical.)
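The "shared encoding" problem described above can be seen in miniature. Any external parser must serialize its result into some line protocol that the shell then re-parses; the tab-separated path/value protocol below is invented purely for illustration, with printf standing in for the external tool:

```shell
#!/usr/bin/env bash
# An "external tool" (stood in for here by printf) flattens
# hierarchical data into TAB-separated path/value lines -- the shared
# encoding that both sides must agree on.
emit_flat() {
    printf '%s\t%s\n' \
        'o.n'   '10.2' \
        'o.s'   'xyz' \
        'a.0.s' 'abc'
}

# The shell side re-parses that encoding into an associative array,
# using the dotted path as the key. Process substitution keeps the
# loop in the current shell so the array survives.
declare -A data
while IFS=$'\t' read -r key value; do
    data[$key]=$value
done < <(emit_flat)

echo "${data[o.s]}"    # xyz
```

Note that the protocol itself has to answer all the questions JSON answers (nesting, escaping of tabs and newlines in values, types), which is the point being made: the encoding ends up roughly as complex as JSON.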
Re: Light weight support for JSON
On 8/28/22 2:11 PM, Lawrence Velázquez wrote:
> On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote:
>> Wanted to get feedback about the following "extensions" to bash that
>> will make it easier to work with simple JSON object. To emphasize,
>> the goal is NOT to "compete" with Python/Javascript (and other full
>> scale language) - just to make it easier to build bash scripts that
>> cover the very common use case of submitting REST requests with curl
>> (checking results, etc), and to perform simple processing of JSON
>> files.
> I do not think bash needs to sprout functionality to support every
> data-exchange format of the month. A loadable module might be okay, I
> guess.

Loadable builtins are the way to do this; you can encapsulate all the functionality you need there without changes to bash itself.

-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, UTech, CWRU    c...@case.edu    http://tiswww.cwru.edu/~chet/
Re: Light weight support for JSON
On 8/28/22 5:50 PM, Yair Lenga wrote:
> First, thanks for taking the time to read and provide your thoughts.
> This is the real value of the discussion. Second: I'm NOT trying to
> argue that there isn't valid use for combining bash/curl/jq, nor do I
> suggest adding JSON as a first-class object to bash
> (Python/node/Perl/Groovy are way ahead ...). I hope to get feedback
> from other readers of the news group that may find this approach
> useful. I'll take the very common use case of using the AWS CLI,
> which does produce a JSON response for most calls. Processing the
> response, while possible with JQ, is challenging to many junior (and
> intermediate) developers. In many cases, they fall into the traps
> that I mentioned above - performance (excessive forking or
> fork/exec), or code that is hard to read (I've seen really bad code -
> combining pipes of JQ/awk/sed). I'm trying to address those cases.
> Almost always they fail to properly handle objects with white space,
> new-lines, etc. To be practical, I'll try to follow the loadable
> extension path, and see how much I can get thru that path.

This is the path I generally recommend. You will have to address the data representation issues through some kind of convention(s), and encapsulating those in a loadable builtin will minimize changes elsewhere.

> Possibly will make sense to continue discussion with a concrete
> implementation. I believe the necessary commands are:

These can all be options to a single `json' loadable builtin.

Chet
-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, UTech, CWRU    c...@case.edu    http://tiswww.cwru.edu/~chet/
Re: Light weight support for JSON
On 8/28/22 8:47 PM, Dale R. Worley wrote:
> The "obvious" way to support Json in Bash would be a utility that
> parses Json and produces e.g. a Bash associative array, and conversely
> a utility that reads a Bash associative array and produces Json. The
> real limitation is that it's difficult to have a subprocess set Bash's
> variables. As far as I know, there's no good idiom for that.

That's why a loadable builtin is the preferred mechanism.

-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, UTech, CWRU    c...@case.edu    http://tiswww.cwru.edu/~chet/
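The limitation Dale and Chet are referring to is easy to demonstrate: each stage of a pipeline runs in its own process, so an assignment made there never reaches the parent shell. A small sketch:

```shell
#!/usr/bin/env bash
# A subprocess cannot set the parent shell's variables. With bash's
# default settings (lastpipe off), `read` below runs in a subshell,
# so its assignment is lost.
unset result
echo "hello" | read -r result
lost=${result-"<unset>"}    # result is still unset here

# Workarounds keep the read in the parent process: here-strings,
# process substitution, or (as suggested in the thread) a loadable
# builtin that sets the variable from inside the shell itself.
read -r result <<< "hello"
echo "$result"    # hello
```

This is exactly why "pipe the JSON into a parser and have it set variables" cannot work as stated, and why the thread keeps circling back to either `eval "$(tool)"` or a builtin.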
Re: Light weight support for JSON
On 8/28/22 5:06 PM, Saint Michael wrote:
> He has a point, though. To have some of the functionality of jq inside
> Bash may be very useful. If he can supply a patch, why not?

Because then it becomes a support and maintenance issue, and a piece of technical debt. A well-encapsulated loadable builtin addresses some of these issues.

-- 
``The lyf so short, the craft so long to lerne.'' - Chaucer
``Ars longa, vita brevis'' - Hippocrates
Chet Ramey, UTech, CWRU    c...@case.edu    http://tiswww.cwru.edu/~chet/
Re: Looking to the future (was Re: Light weight support for JSON)
On 8/29/22 5:48 AM, Martin D Kealey wrote:
> The Shell persists because it has one killer feature: it does double
> duty as a scripting language and as an interactive command language.
> But we're kidding ourselves if we think that no other language could
> fill that gap: Python has a respectable interactive mode, though its
> focus is on objects rather than processes and files; the interactive
> "debugger" console inside Firefox speaks Javascript; and even
> "perl -d" is almost usable.

So, neither of those could fill that gap. What could then?

> As for the future, I believe that if we don't move towards making the
> POSIX sh behaviour a truly optional part of an otherwise-more-sane
> language, we condemn Bash to continued obscurity and eventual
> extinction.

Nah. I think Bash already has too many features over POSIX; anything beyond indexed arrays, indirect expansions, and `${parameter/string/replacement}' is bloat. Besides, who is going to evolve Bash into this "more-sane language"?
Looking to the future (was Re: Light weight support for JSON)
Not that I fundamentally disagree with this (JSON) proposal, but I'd rather see the effort put into support for nested arrays (like ksh has), and generally having a more forward-looking view of Bash as an evolving language.

I would see this proceeding somewhat like the transition from Perl4 to Perl5, where "everything is a string" gets replaced by a functioning polymorphic type system that by default provides what look like the same strings as before, but under the hood uses direct references rather than symbolic indirection. (Yes, that would finally fix "local -n".) Then emitting JSON or YAML becomes simply a matter of defining an output formatter for a structured variable, which could even be set as its default "stringize" function with a suitable "declare" statement.

It used to be that POSIX sh and AWK were the only common languages that could be assumed to be present on all systems. Somewhere along the way AWK fell out of common awareness, leaving just the shell, with its arcane-to-the-point-of-intentionally-harmful misfeatures.

Since 1990, we've moved on. Systems have - quite literally - a million times more storage than when POSIX documented the state of play in 1987. In POSIX-like systems, tools like Python, Ruby, Perl & PHP have become ubiquitous. Moreover, packaging systems allow a script to declare its prerequisites, including its interpreter, so it's no longer necessary to target one "universal" language. Even embedded systems have many gigabytes of storage and generally include a range of tools such as Perl and/or Python, or at least can install them. (Those that don't generally also lack the ability to install *anything*, including shell scripts, which makes the question of "which scripting language" entirely moot.) Supporting new features on *really* old devices is hard to justify, as such devices are being replaced *just to reduce their power consumption*.
The Shell persists because it has one killer feature: it does double duty as a scripting language and as an interactive command language. But we're kidding ourselves if we think that no other language could fill that gap: Python has a respectable interactive mode, though its focus is on objects rather than processes and files; the interactive "debugger" console inside Firefox speaks Javascript; and even "perl -d" is almost usable.

As for the future, I believe that if we don't move towards making the POSIX sh behaviour a truly optional part of an otherwise-more-sane language, we condemn Bash to continued obscurity and eventual extinction. Existing shell scripts aren't going to curl up and die any time soon, but we're at a crossroads: either we admit that Bash has had its day, and stop adding ANY new features, or we make a decision to let it evolve in ways that will still run existing scripts, provide an effective command language, *and* allow new scripts to be written without needing to constantly work around misfeatures that are 35 years past their use-by date. Which is it to be?

-Martin

PS: top of my list of most-hated misfeatures isn't any of the POSIX malapropisms, but rather the fact that we can't write "shopt compatXX" with the XX being the newest Bash version, and be sure that when person A eventually installs a future version of Bash to get its new whiz-bang interactive features, it won't break a script written by person B. I've been told that "shopt compat" is a "short-term measure until the script gets fixed". Implicit in that is the assumption that either the author of a Bash script is supposed to provide eternal on-going preemptive updates (even to users who didn't get the original from them), or that people who install a script must necessarily be capable of diagnosing the weird broken behaviour that emerges when they install a new version of Bash. Yes, both of those are as crazy as they sound, and so is the "short term measure".
Re: Light weight support for JSON
On Sun, Aug 28, 2022 at 08:47:24PM -0400, Dale R. Worley wrote:
> The "obvious" way to support Json in Bash would be a utility that
> parses Json and produces e.g. a Bash associative array, and conversely
> a utility that reads a Bash associative array and produces Json. The
> real limitation is that it's difficult to have a subprocess set Bash's
> variables. As far as I know, there's no good idiom for that.

The standard idiom for this sort of thing is

    eval "$(external-tool)"

This means you need to *trust* the external-tool to produce safe code.
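Concretely, the idiom round-trips a structure through code that the parent shell evals. A minimal sketch using `declare -p` as the "external tool" (the array names are just for illustration):

```shell
#!/usr/bin/env bash
# declare -p serializes an array as bash code; eval'ing that code in
# the parent shell (not in a pipeline!) rebuilds it. This is the
# eval "$(tool)" idiom with declare -p playing the external tool.
declare -A original=([name]=alice [id]=42)

# e.g.: declare -A original=([name]="alice" [id]="42" )
serialized=$(declare -p original)

# Pretend this string crossed a process boundary, then rebuild it
# under a new name (the substitution hits the first occurrence of
# "original", which is the variable name in the declare statement).
eval "${serialized/original/copy}"

echo "${copy[name]}"    # alice
```

The trust problem Greg mentions is visible here: whatever the tool prints gets executed verbatim, so a hostile or buggy emitter can run arbitrary commands in your shell.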
Re: Light weight support for JSON
On Sun, Aug 28, 2022 at 7:47 PM Dale R. Worley wrote:
> The "obvious" way to support Json in Bash would be a utility that
> parses Json and produces e.g. a Bash associative array, and conversely
> a utility that reads a Bash associative array and produces Json. The
> real limitation is that it's difficult to have a subprocess set Bash's
> variables. As far as I know, there's no good idiom for that.
>
> Dale

If your json_util outputs a Bash declare -A statement then you could just eval it to get your associative array. The utility would need to guarantee against code injection in its output. Another issue is that JSON can express things that Bash associative arrays can't, and so then you get into a rat's nest of workarounds.

-- 
Visit serverfault.com to get your system administration questions answered.
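The code-injection concern can be handled on the emitting side by passing every key and value through `printf %q`, so the generated declare statement is inert when eval'd. A sketch (json_util itself is hypothetical; `emit_assoc` below is an invented stand-in for its output stage):

```shell
#!/usr/bin/env bash
# Sketch of how a (hypothetical) json_util could emit an
# injection-safe 'declare -A' statement: every key and value is
# quoted with printf %q so the output contains no shell-active text.
emit_assoc() {
    local name=$1; shift
    local out="declare -A $name=("
    while (( $# >= 2 )); do
        out+="[$(printf %q "$1")]=$(printf %q "$2") "
        shift 2
    done
    printf '%s)\n' "$out"
}

# A value that would run a command if spliced into the output unquoted:
stmt=$(emit_assoc cfg msg 'hi; rm -rf $HOME' n 10.2)
eval "$stmt"

echo "${cfg[msg]}"    # the literal text 'hi; rm -rf $HOME', never executed
```

(This sketch assumes simple keys; a real emitter would also have to reject or escape `]` in keys, which %q does not handle inside a subscript.)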
Re: Light weight support for JSON
The "obvious" way to support Json in Bash would be a utility that parses Json and produces e.g. a Bash associative array, and conversely a utility that reads a Bash associative array and produces Json. The real limitation is that it's difficult to have a subprocess set Bash's variables. As far as I know, there's no good idiom for that. Dale
Re: Light weight support for JSON
First, thanks for taking the time to read and provide your thoughts. This is the real value of the discussion. Second: I'm NOT trying to argue that there isn't valid use for combining bash/curl/jq, nor do I suggest adding JSON as a first-class object to bash (Python/node/Perl/Groovy are way ahead ...). I hope to get feedback from other readers of the news group that may find this approach useful.

I'll take the very common use case of using the AWS CLI, which does produce a JSON response for most calls. Processing the response, while possible with JQ, is challenging to many junior (and intermediate) developers. In many cases, they fall into the traps that I mentioned above - performance (excessive forking or fork/exec), or code that is hard to read (I've seen really bad code - combining pipes of JQ/awk/sed). I'm trying to address those cases. Almost always they fail to properly handle objects with white space, new-lines, etc.

To be practical, I'll try to follow the loadable extension path, and see how much I can get thru that path. Possibly it will make sense to continue the discussion with a concrete implementation. I believe the necessary commands are:

json_read -a data-array -m meta-array -r root-obj
    Parse the stdin input into data-array with the items (as described above), and the meta-array with helper information (length, prop list under each node) - to help iterating.

json_write -v variable -a data-array -m meta-array -r root
    The reverse - generate the JSON for an associative array following the '.' naming convention.

json_add -a data-array [-m meta-array] [-r root] key1=value1 key2=value2 key3=value3
    Helper to add items into an associative array representing a JSON object. Autodetects type, with the ability to force stringification using a format string.

On Sun, Aug 28, 2022 at 3:22 PM John Passaro wrote:
> interfacing with an external tool absolutely seems like the correct
> answer to me. a fact worth mentioning to back that up is that `jq`
> exists.
> billed as a sed/awk for json, it fills all the functions you'd expect
> such an external tool to have and many many more. interfacing from
> curl to jq to bash is something i do on a near daily basis.
>
> https://stedolan.github.io/jq/
>
> On Sun, Aug 28, 2022, 09:25 Yair Lenga wrote:
>> Hi,
>>
>> Over the last few years, JSON data has become an integral part of
>> processing. In many cases, I find myself having to automate tasks
>> that require inspection of a JSON response, and in a few cases,
>> construction of JSON. So far, I've taken one of two approaches:
>> * For simple parsing, using 'jq' to extract elements of the JSON
>> * For more complex tasks, switching to python or Javascript.
>>
>> Wanted to get feedback about the following "extensions" to bash that
>> will make it easier to work with simple JSON object. To emphasize,
>> the goal is NOT to "compete" with Python/Javascript (and other full
>> scale language) - just to make it easier to build bash scripts that
>> cover the very common use case of submitting REST requests with curl
>> (checking results, etc), and to perform simple processing of JSON
>> files.
>>
>> Proposal:
>> * Minimal - Lightweight "json parser" that will convert JSON files
>>   to bash associative array (see below)
>> * Convert bash associative array to JSON
>>
>> To the extent possible, prefer to borrow from jsonpath syntax.
>>
>> Parsing JSON into an associative array.
>>
>> Consider the following, showing all possible JSON values (boolean,
>> number, string, object and array).
>> {
>>   "b": false,
>>   "n": 10.2,
>>   "s": "foobar",
>>   "x": null,
>>   "o": { "n": 10.2, "s": "xyz" },
>>   "a": [
>>     { "n": 10.2, "s": "abc", "x": false },
>>     { "n": 10.2, "s": "def", "x": true }
>>   ]
>> }
>>
>> This should be converted into the following array:
>>
>> # Top level
>> [_length] = 6          # Number of keys in object/array
>> [_keys] = b n s x o a  # Direct keys
>> [b] = false
>> [n] = 10.2
>> [s] = foobar
>> [x] = null
>>
>> # This is object 'o'
>> [o._length] = 2
>> [o._keys] = n s
>> [o.n] = 10.2
>> [o.s] = xyz
>>
>> # Array 'a'
>> [a._count] = 2         # Number of elements in array
>>
>> # Element a[0] (object)
>> [a.0._length] = 3
>> [a.0._keys] = n s x
>> [a.0.n] = 10.2
>> [a.0.s] = abc
>> [a.0.x] = false
>>
>> I hope that the example above is sufficient. There are a few other
>> items that are worth exploring - e.g., how to store the type
>> (specifically, separate the quoted strings vs values so that "5.2"
>> is different than 5.2, and "null" is different from null).
>>
>> I will leave the second part to a different post, once I have some
>> feedback. I have some prototype that I've written in python - POC -
>> that makes it possible to write things like
>>
>> declare -a foo
>> curl http://www.api.com/weather/US/10013 | readjson foo
>>
>> printf "temperature(F) : %.1f Wind(MPH)=%d" ${foo[temp_f]}, ${foo[wind]}
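To get a feel for how the proposed flattening would be consumed, here is the example object written out by hand as the dotted-key associative array the proposal describes (no parser involved; the array is constructed manually, exactly as a `json_read` would be expected to produce it):

```shell
#!/usr/bin/env bash
# The example JSON from the proposal, hand-flattened into the
# dotted-key associative array a parser would produce. Associative
# array keys may contain '.', so the path convention works as-is.
declare -A json=(
    [_length]=6  [_keys]="b n s x o a"
    [b]=false [n]=10.2 [s]=foobar [x]=null
    [o._length]=2 [o._keys]="n s" [o.n]=10.2 [o.s]=xyz
    [a._count]=2
    [a.0._length]=3 [a.0._keys]="n s x"
    [a.0.n]=10.2 [a.0.s]=abc [a.0.x]=false
)

# Direct lookup by path:
echo "${json[o.s]}"    # xyz

# Iterating an object's direct keys via the _keys helper entry:
for k in ${json[o._keys]}; do
    echo "o.$k = ${json[o.$k]}"
done
```

One visible limitation (raised elsewhere in the thread): everything is a string, so false/null/10.2 are indistinguishable from the strings "false"/"null"/"10.2" without the extra type metadata the proposal mentions.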
Re: Light weight support for JSON
He has a point, though. To have some of the functionality of jq inside Bash may be very useful. If he can supply a patch, why not?

Philip Orleans

On Sun, Aug 28, 2022, 3:22 PM John Passaro wrote:
> interfacing with an external tool absolutely seems like the correct
> answer to me. a fact worth mentioning to back that up is that `jq`
> exists. billed as a sed/awk for json, it fills all the functions you'd
> expect such an external tool to have and many many more. interfacing
> from curl to jq to bash is something i do on a near daily basis.
>
> https://stedolan.github.io/jq/
>
> On Sun, Aug 28, 2022, 09:25 Yair Lenga wrote:
>> Hi,
>>
>> Over the last few years, JSON data has become an integral part of
>> processing. In many cases, I find myself having to automate tasks
>> that require inspection of a JSON response, and in a few cases,
>> construction of JSON. So far, I've taken one of two approaches:
>> * For simple parsing, using 'jq' to extract elements of the JSON
>> * For more complex tasks, switching to python or Javascript.
>>
>> Wanted to get feedback about the following "extensions" to bash that
>> will make it easier to work with simple JSON object. To emphasize,
>> the goal is NOT to "compete" with Python/Javascript (and other full
>> scale language) - just to make it easier to build bash scripts that
>> cover the very common use case of submitting REST requests with curl
>> (checking results, etc), and to perform simple processing of JSON
>> files.
>>
>> Proposal:
>> * Minimal - Lightweight "json parser" that will convert JSON files
>>   to bash associative array (see below)
>> * Convert bash associative array to JSON
>>
>> To the extent possible, prefer to borrow from jsonpath syntax.
>>
>> Parsing JSON into an associative array.
>>
>> Consider the following, showing all possible JSON values (boolean,
>> number, string, object and array).
>> {
>>   "b": false,
>>   "n": 10.2,
>>   "s": "foobar",
>>   "x": null,
>>   "o": { "n": 10.2, "s": "xyz" },
>>   "a": [
>>     { "n": 10.2, "s": "abc", "x": false },
>>     { "n": 10.2, "s": "def", "x": true }
>>   ]
>> }
>>
>> This should be converted into the following array:
>>
>> # Top level
>> [_length] = 6          # Number of keys in object/array
>> [_keys] = b n s x o a  # Direct keys
>> [b] = false
>> [n] = 10.2
>> [s] = foobar
>> [x] = null
>>
>> # This is object 'o'
>> [o._length] = 2
>> [o._keys] = n s
>> [o.n] = 10.2
>> [o.s] = xyz
>>
>> # Array 'a'
>> [a._count] = 2         # Number of elements in array
>>
>> # Element a[0] (object)
>> [a.0._length] = 3
>> [a.0._keys] = n s x
>> [a.0.n] = 10.2
>> [a.0.s] = abc
>> [a.0.x] = false
>>
>> I hope that the example above is sufficient. There are a few other
>> items that are worth exploring - e.g., how to store the type
>> (specifically, separate the quoted strings vs values so that "5.2"
>> is different than 5.2, and "null" is different from null).
>>
>> I will leave the second part to a different post, once I have some
>> feedback. I have some prototype that I've written in python - POC -
>> that makes it possible to write things like
>>
>> declare -a foo
>> curl http://www.api.com/weather/US/10013 | readjson foo
>>
>> printf "temperature(F) : %.1f Wind(MPH)=%d" ${foo[temp_f]}, ${foo[wind]}
>>
>> Yair
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote: > Wanted to get feedback about the following "extensions" to bash that will > make it easier to work with simple JSON object. It occurred to me to provide references for previous discussion along these lines, but it turns out there isn't very much of it. This is the most concrete example I found: https://lists.gnu.org/archive/html/bug-bash/2020-12/msg00046.html -- vq
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 4:05 PM, G. Branden Robinson wrote:
> At 2022-08-28T15:52:55-0400, Lawrence Velázquez wrote:
>> On Sun, Aug 28, 2022, at 2:56 PM, G. Branden Robinson wrote:
>>> How about next July, when JSON is as exactly old as the Bourne
>>> shell was when JSON was deployed?
>> I do not find "well *actually* JSON is old too!!!" to be
>> particularly persuasive, either.
>
> It's a perfectly valid rejoinder to a claim that the format is too
> novel to be seriously considered. That you don't regard it as
> persuasive is consistent with your protest not being a rational one
> in the first place

Agree to disagree.

-- vq
Re: Light weight support for JSON
At 2022-08-28T15:52:55-0400, Lawrence Velázquez wrote:
> On Sun, Aug 28, 2022, at 2:56 PM, G. Branden Robinson wrote:
> > How about next July, when JSON is as exactly old as the Bourne shell
> > was when JSON was deployed?
>
> I do not find "well *actually* JSON is old too!!!" to be particularly
> persuasive, either.

It's a perfectly valid rejoinder to a claim that the format is too novel to be seriously considered. That you don't regard it as persuasive is consistent with your protest not being a rational one in the first place; one doesn't get reasoned out of what one wasn't reasoned into.

> I should have foreseen that the offhand "of the month" jab would
> get undue attention compared to my actual objection, which is against
> giving one data format uniquely first-class support. That's on me.

That's right. Don't pad your brief with makeweight objections, particularly when their ultimate weight is feeble.

> Sick burn, pal. Excuse me while I take out my dentures or whatever.

Fortunately for you, I reckon that even with mushmouthed enunciation, the kids won't have any trouble understanding that you want them to get off your lawn. Why stomp on the newbies' ideas with two feet when one will do?

Regards, Branden
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 2:56 PM, G. Branden Robinson wrote: > At 2022-08-28T14:11:25-0400, Lawrence Velázquez wrote: >> I do not think bash needs to sprout functionality to support every >> data-exchange format of the month. > > This sentiment is illustrative of the logarithmic memory scale of > grognards. The Bourne shell was first released as part of Version 7 > Unix in January 1979.[1] 22 years and three months later, in April > 2001, Douglas Crockford and Chip Morningstar sent the first JSON > message.[2] > >> A loadable module might be okay, I guess. > > How about next July, when JSON is as exactly old as the Bourne shell was > when JSON was deployed? I do not find "well *actually* JSON is old too!!!" to be particularly persuasive, either. I should have foreseen that the offhand "of the month" jab would get undue attention compared to my actual objection, which is against giving one data format uniquely first-class support. That's on me. >> Why are people so allergic to just using specific utilities for >> specific tasks, as appropriate? (This question is rhetorical. >> Please do not respond with an impassioned plea about why JSON is >> so special that it deserves first-class shell support. It's not.) > > I won't litigate this point, but your concept of novelty is distorted > beyond any standard reasonable in the computer industry. If we are only > as young as we feel, you must feel geriatric in the extreme. Sick burn, pal. Excuse me while I take out my dentures or whatever. -- vq
Re: Light weight support for JSON
interfacing with an external tool absolutely seems like the correct answer to me. a fact worth mentioning to back that up is that `jq` exists. billed as a sed/awk for json, it fills all the functions you'd expect such an external tool to have and many many more. interfacing from curl to jq to bash is something i do on a near daily basis.

https://stedolan.github.io/jq/

On Sun, Aug 28, 2022, 09:25 Yair Lenga wrote:
> Hi,
>
> Over the last few years, JSON data has become an integral part of
> processing. In many cases, I find myself having to automate tasks that
> require inspection of a JSON response, and in a few cases, construction
> of JSON. So far, I've taken one of two approaches:
> * For simple parsing, using 'jq' to extract elements of the JSON
> * For more complex tasks, switching to python or Javascript.
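The curl-to-jq-to-bash hand-off described above can be sketched like this. The response body is inlined here instead of fetched with curl, and every field name is invented for the example:

```shell
#!/usr/bin/env bash
# Stand-in for: response=$(curl -s "$api_url") -- the JSON shape and
# field names below are made up for illustration.
response='{"temp_f": 71.3, "wind": 12, "station": {"id": "US/10013"}}'

# One jq call extracts all three fields; -r emits raw text and @tsv
# keeps them tab-separated so read can split them safely.
vals=$(jq -r '[.temp_f, .wind, .station.id] | @tsv' <<<"$response")
IFS=$'\t' read -r temp_f wind station_id <<<"$vals"

printf 'temperature(F): %.1f Wind(MPH)=%d station=%s\n' \
    "$temp_f" "$wind" "$station_id"
```

Batching the extraction into a single jq invocation keeps the fork/exec count at one per response, which is the cost the rest of the thread argues about.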
Re: Light weight support for JSON
At 2022-08-28T14:11:25-0400, Lawrence Velázquez wrote:
> I do not think bash needs to sprout functionality to support every
> data-exchange format of the month.

This sentiment is illustrative of the logarithmic memory scale of grognards. The Bourne shell was first released as part of Version 7 Unix in January 1979.[1] 22 years and three months later, in April 2001, Douglas Crockford and Chip Morningstar sent the first JSON message.[2]

> A loadable module might be okay, I guess.

How about next July, when JSON is as exactly old as the Bourne shell was when JSON was deployed?

> Why are people so allergic to just using specific utilities for
> specific tasks, as appropriate? (This question is rhetorical.
> Please do not respond with an impassioned plea about why JSON is
> so special that it deserves first-class shell support. It's not.)

I won't litigate this point, but your concept of novelty is distorted beyond any standard reasonable in the computer industry. If we are only as young as we feel, you must feel geriatric in the extreme.

Regards, Branden

[1] https://minnie.tuhs.org/cgi-bin/utree.pl?file=V7
[2] https://www.toptal.com/web/json-vs-xml-part-1
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 2:29 PM, Yair Lenga wrote: > I do not think that JSON (and REST) are "data exchange format of the > month". Those are established formats that are here to stay. Like YAML. > Those are "cornerstones" of cloud computing/configuration. I do not have to > argue for them, they can speak for themselves. You *do* have to argue why a shell should provide first-class support for them, or for any concrete data format. Shells exist on a much longer time scale than anything you've been talking about and are used in essentially every conceivable computing context. "I think JSON is *extra-special* important though" is not a reason why the shell itself should give it special attention. > As for using external utilities: two main issues: > * Performance - Processing data in bash processes can be 100X times faster > than using external tools. The fork/exec is expensive. And? Are you running jq 100,000 times in a tight loop? > * Readability - Each tool has its own syntax, escapes, etc. The final > result of mixing JQ and bash is not pretty (just lookup jq/bash questions > on stack overflow) Neither is your example. Bash's "type system" (as it were) cannot fully represent all JSON objects, so you're going to end up with some sort of gross approximation regardless. > Having them as a loadable extension seems like a good practical solution. > They do not have to be "built-in". The "csv" loadable provides some precedent for this. It won't solve the representation problem, though. -- vq
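One concrete instance of the representation problem: under the dotted-key flattening proposed in this thread, distinct JSON documents collapse into identical associative arrays. A minimal sketch (the variable names are illustrative):

```shell
#!/usr/bin/env bash
# {"a": {"b": 1}} flattened with dotted keys yields [a.b]=1 ...
declare -A nested=([a.b]=1)
# ... and so does {"a.b": 1}, whose single top-level key happens to
# contain a literal dot. The nesting information is unrecoverable.
declare -A flat=([a.b]=1)

[ "${nested[a.b]}" = "${flat[a.b]}" ] && echo "both documents map to [a.b]=1"
```

Any flattening scheme needs either an escaping convention for dots inside keys or a side channel of type/structure metadata, which is exactly the "gross approximation" being objected to.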
Re: Light weight support for JSON
I do not think that JSON (and REST) are "data exchange formats of the month". Those are established formats that are here to stay, like YAML. They are "cornerstones" of cloud computing/configuration. I do not have to argue for them; they speak for themselves.

As for using external utilities, two main issues:
* Performance - Processing data inside bash can be 100x faster than using external tools. The fork/exec is expensive. To emphasize, the intention is not to build ETL processes with bash - those should still use dedicated tools (or Python, or frameworks).
* Readability - Each tool has its own syntax, escapes, etc. The final result of mixing jq and bash is not pretty (just look up jq/bash questions on Stack Overflow).
* It is not easy to construct valid JSON documents with bash by concatenating strings. Many other tools used for automation have support to ensure correctness. It would be nice to have the same - it would make bash more useful for the proper use cases.

Having them as a loadable extension seems like a good practical solution. They do not have to be "built-in".

Yair

On Sun, Aug 28, 2022 at 2:11 PM Lawrence Velázquez wrote:
> On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote:
> > Wanted to get feedback about the following "extensions" to bash that will
> > make it easier to work with simple JSON object. To emphasize, the goal is
> > NOT to "compete" with Python/Javascript (and other full scale language) -
> > just to make it easier to build bash scripts that cover the very common use
> > case of submitting REST requests with curl (checking results, etc), and to
> > perform simple processing of JSON files.
>
> I do not think bash needs to sprout functionality to support every
> data-exchange format of the month. A loadable module might be okay,
> I guess.
>
> Why are people so allergic to just using specific utilities for
> specific tasks, as appropriate? (This question is rhetorical.
> Please do not respond with an impassioned plea about why JSON is > so special that it deserves first-class shell support. It's not.) > > -- > vq >
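The point about constructing JSON by concatenation is easy to demonstrate: the naive approach breaks as soon as a value contains a quote or backslash, which is why scripts today lean on jq's quoting. A sketch, assuming jq is installed (the `msg` field is invented):

```shell
#!/usr/bin/env bash
msg='he said "hi" \o/'

# Naive concatenation: the embedded quotes and backslash make this
# string invalid JSON, and jq refuses to parse it.
bad="{\"msg\": \"$msg\"}"
jq . <<<"$bad" >/dev/null 2>&1 || echo "concatenated document is invalid"

# jq --arg performs the escaping, producing a valid document that
# round-trips back to the original string.
good=$(jq -cn --arg msg "$msg" '{msg: $msg}')
roundtrip=$(jq -r '.msg' <<<"$good")
[ "$roundtrip" = "$msg" ] && echo "jq-built document round-trips"
```

This is the kind of correctness support the message above is asking bash itself to provide.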
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote: > Wanted to get feedback about the following "extensions" to bash that will > make it easier to work with simple JSON object. To emphasize, the goal is > NOT to "compete" with Python/Javascript (and other full scale language) - > just to make it easier to build bash scripts that cover the very common use > case of submitting REST requests with curl (checking results, etc), and to > perform simple processing of JSON files. I do not think bash needs to sprout functionality to support every data-exchange format of the month. A loadable module might be okay, I guess. Why are people so allergic to just using specific utilities for specific tasks, as appropriate? (This question is rhetorical. Please do not respond with an impassioned plea about why JSON is so special that it deserves first-class shell support. It's not.) -- vq
Re: Light weight support for JSON
Interesting point. Using an (optional) separate array can also address the problem of "types" - knowing which values are quoted, and which ones are not. This can also provide enough metadata to convert a modified associative array back to JSON.

On Sun, Aug 28, 2022 at 9:51 AM Alex fxmbsw7 Ratchev wrote:
> On Sun, Aug 28, 2022, 15:46 Yair Lenga wrote:
>> Sorry for not being clear. I'm looking for feedback. The solution that I
>> have is using python to read the JSON, and generate the commands to build
>> the associative array. Will have to rewrite in "C"/submit if there is
>> positive feedback from other readers. Yair.
>
> ah, cool
> i just have a suggestion, .. to store the keys in a separate array, space
> safe
Re: Light weight support for JSON
On Sun, Aug 28, 2022, 15:46 Yair Lenga wrote:
> Sorry for not being clear. I'm looking for feedback. The solution that I
> have is using python to read the JSON, and generate the commands to build
> the associative array. Will have to rewrite in "C"/submit if there is
> positive feedback from other readers. Yair.

ah, cool
i just have a suggestion, .. to store the keys in a separate array, space
safe
Re: Light weight support for JSON
Sorry for not being clear. I'm looking for feedback. The solution that I have is using python to read the JSON, and generate the commands to build the associative array. Will have to rewrite in "C"/submit if there is positive feedback from other readers. Yair.

On Sun, Aug 28, 2022 at 9:42 AM Alex fxmbsw7 Ratchev wrote:
> did you forget to send the script along ? or am i completely lost
>
> a small thing i saw, a flat _keys doesn't do the job..
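The generate-and-eval approach described above (a helper program prints shell assignments, and the shell evaluates them) can be sketched without the Python side. Here a function with invented sample data stands in for the POC's output, with printf %q doing the quoting so eval stays safe:

```shell
#!/usr/bin/env bash
# Stand-in for the Python POC: print one assignment per flattened key.
# The keys and values are invented sample data.
emit_assignments() {
    printf 'foo[%q]=%q\n' temp_f 71.3
    printf 'foo[%q]=%q\n' wind   12
    printf 'foo[%q]=%q\n' o.s    'two words'
}

# The caller declares the associative array, then evaluates the
# generated assignments; %q ensures spaces and metacharacters in
# values cannot break out of the assignment.
declare -A foo
eval "$(emit_assignments)"

echo "${foo[temp_f]} ${foo[wind]} ${foo[o.s]}"
```

The same pattern works with any generator language; the shell side never changes.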
Re: Light weight support for JSON
On Sun, Aug 28, 2022, 15:25 Yair Lenga wrote:
> I hope the example above is sufficient. There are a few other items that
> are worth exploring - e.g., how to store the type (specifically,
> separating quoted strings from bare values so that "5.2" is different
> from 5.2, and "null" is different from null).

did you forget to send the script along ? or am i completely lost

a small thing i saw, a flat _keys doesn't do the job..

> I will leave the second part to a different post, once I have some
> feedback. I have a prototype that I've written in python - a POC - that
> makes it possible to write things like
>
> declare -A foo
> curl http://www.api.com/weather/US/10013 | readjson foo
>
> printf 'temperature(F): %.1f Wind(MPH)=%d' "${foo[temp_f]}" "${foo[wind]}"
>
> Yair
Light weight support for JSON
Hi,

Over the last few years, JSON data has become an integral part of processing. In many cases, I find myself having to automate tasks that require inspection of a JSON response, and in a few cases, construction of JSON. So far, I've taken one of two approaches:
* For simple parsing, using 'jq' to extract elements of the JSON
* For more complex tasks, switching to python or Javascript.

Wanted to get feedback about the following "extensions" to bash that will make it easier to work with simple JSON objects. To emphasize, the goal is NOT to "compete" with Python/Javascript (and other full-scale languages) - just to make it easier to build bash scripts that cover the very common use case of submitting REST requests with curl (checking results, etc.), and to perform simple processing of JSON files.

Proposal:
* Minimal - a lightweight "JSON parser" that will convert JSON files to a bash associative array (see below)
* Convert a bash associative array to JSON

To the extent possible, prefer to borrow from jsonpath syntax.

Parsing JSON into an associative array.

Consider the following, showing all possible JSON values (boolean, number, string, object, and array).

{
  "b": false,
  "n": 10.2,
  "s": "foobar",
  "x": null,
  "o": { "n": 10.2, "s": "xyz" },
  "a": [
    { "n": 10.2, "s": "abc", "x": false },
    { "n": 10.2, "s": "def", "x": true }
  ]
}

This should be converted into the following array:

-

# Top level
[_length] = 6        # Number of keys in object/array
[_keys] = b n s x o a        # Direct keys
[b] = false
[n] = 10.2
[s] = foobar
[x] = null

# This is object 'o'
[o._length] = 2
[o._keys] = n s
[o.n] = 10.2
[o.s] = xyz

# Array 'a'
[a._count] = 2        # Number of elements in array

# Element a[0] (object)
[a.0._length] = 3
[a.0._keys] = n s x
[a.0.n] = 10.2
[a.0.s] = abc
[a.0.x] = false

-

I hope the example above is sufficient.
There are a few other items that are worth exploring - e.g., how to store the type (specifically, separating quoted strings from bare values so that "5.2" is different from 5.2, and "null" is different from null).

I will leave the second part to a different post, once I have some feedback. I have a prototype that I've written in python - a POC - that makes it possible to write things like

declare -A foo
curl http://www.api.com/weather/US/10013 | readjson foo

printf 'temperature(F): %.1f Wind(MPH)=%d' "${foo[temp_f]}" "${foo[wind]}"

Yair
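To make the proposed layout concrete, here is the example document's flattened form written out by hand (readjson does not exist; this is only the array it would produce), together with a loop that walks nested object 'o' through its _keys entry:

```shell
#!/usr/bin/env bash
# Hand-built version of the array readjson would produce for the
# example document in the proposal above.
declare -A json=(
    [_length]=6 [_keys]="b n s x o a"
    [b]=false [n]=10.2 [s]=foobar [x]=null
    [o._length]=2 [o._keys]="n s" [o.n]=10.2 [o.s]=xyz
    [a._count]=2
    [a.0._length]=3 [a.0._keys]="n s x"
    [a.0.n]=10.2 [a.0.s]=abc [a.0.x]=false
)

# Enumerate the direct keys of nested object 'o' via its _keys entry.
for k in ${json[o._keys]}; do
    printf 'o.%s = %s\n' "$k" "${json[o.$k]}"
done
```

Note the unquoted ${json[o._keys]} relies on the keys being whitespace-free, which is exactly the limitation behind the separate-keys-array suggestion earlier in the thread.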