Re: Looking to the future (was Re: Light weight support for JSON)
On 8/29/22 5:48 AM, Martin D Kealey wrote:
> The Shell persists because it has one killer feature: it does double duty
> as a scripting language and as an interactive command language. But we're
> kidding ourselves if we think that no other language could fill that gap:
> Python has a respectable interactive mode, though its focus is on objects
> rather than processes and files; the interactive "debugger" console inside
> Firefox speaks Javascript; and even "perl -d" is almost usable.

So, neither of those could fill that gap. What could then?

> As for the future, I believe that if we don't move towards making the
> POSIX sh behaviour a truly optional part of an otherwise-more-sane
> language, we condemn Bash to continued obscurity and eventual extinction.

Nah. I think Bash already has too many features over POSIX; anything beyond indexed arrays, indirect expansions, and `${parameter/string/replacement}' is bloat. Besides, who is going to evolve Bash into this "more-sane language"?
Re: Bash Coding style - Adopting C99 declarations
On Mon, 29 Aug 2022 at 09:14, Dmitry Goncharov wrote:
> On Sunday, August 28, 2022, Andreas Schwab wrote:
> > Note that the next revision of the C standard removes support for K&R
> > declarations
>
> The code will continue to compile, won't it?

I expect "gcc -std=c23" will reject such code.

Of course, compilers are unlikely to remove their compatibility modes any time soon, but I would hate to be stuck in a position of requiring "gcc -std=gnu23" to get both new *and* obsolete features at the same time.

-Martin
Looking to the future (was Re: Light weight support for JSON)
Not that I fundamentally disagree with this (JSON) proposal, but I'd rather see the effort put into support for nested arrays (like ksh has), and generally having a more forward-looking view of Bash as an evolving language.

I would see this proceeding somewhat like the transition from Perl4 to Perl5, where "everything is a string" gets replaced by a functioning polymorphic type system that by default provides what look like the same strings as before, but under the hood uses direct references rather than symbolic indirection. (Yes, that would finally fix "local -n".) Then emitting JSON or YAML becomes simply a matter of defining an output formatter for a structured variable, which could even be set as its default "stringize" function with a suitable "declare" statement.

It used to be that POSIX sh and AWK were the only common languages that could be assumed to be present on all systems. Somewhere along the way AWK fell out of common awareness, leaving just the shell, with its arcane-to-the-point-of-intentionally-harmful misfeatures.

Since 1990, we've moved on. Systems have - quite literally - a million times more storage than when POSIX documented the state of play in 1987. In POSIX-like systems, tools like Python, Ruby, Perl & PHP have become ubiquitous. Moreover, packaging systems allow a script to declare its prerequisites, including its interpreter, so it's no longer necessary to target one "universal" language.

Even embedded systems have many gigabytes of storage and generally include a range of tools such as Perl and/or Python, or at least can install them. (Those that don't generally also lack the ability to install *anything*, including shell scripts, which makes the question of "which scripting language" entirely moot.) Supporting new features on *really* old devices is hard to justify, as such devices are being replaced *just to reduce their power consumption*.

The Shell persists because it has one killer feature: it does double duty as a scripting language and as an interactive command language. But we're kidding ourselves if we think that no other language could fill that gap: Python has a respectable interactive mode, though its focus is on objects rather than processes and files; the interactive "debugger" console inside Firefox speaks Javascript; and even "perl -d" is almost usable.

As for the future, I believe that if we don't move towards making the POSIX sh behaviour a truly optional part of an otherwise-more-sane language, we condemn Bash to continued obscurity and eventual extinction.

Existing shell scripts aren't going to curl up and die any time soon, but we're at a crossroads: either we admit that Bash has had its day, and stop adding ANY new features, or we make a decision to let it evolve in ways that will still run existing scripts, provide an effective command language, *and* allow new scripts to be written without needing to constantly work around misfeatures that are 35 years past their use-by date.

Which is it to be?

-Martin

PS: top of my list of most-hated misfeatures isn't any of the POSIX malapropisms, but rather the fact that we can't write "shopt compatXX" with the XX being the newest Bash version, and be sure that when person A eventually installs a future version of Bash to get its new whiz-bang interactive features, it won't break a script written by person B. I've been told that "shopt compat" is a "short term measure until the script gets fixed".
Implicit in that is the assumption that either the author of a Bash script is supposed to provide eternal, ongoing, preemptive updates (even to users who didn't get the original from them), or that people who install a script must necessarily be capable of diagnosing the weird broken behaviour that emerges when they install a new version of Bash. Yes, both of those are as crazy as they sound, and so is the "short term measure".
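For reference, the pinning mechanism under discussion looks roughly like this today; the level shown is only illustrative, and BASH_COMPAT accepts only levels that the installed Bash still supports, which is part of the complaint above:

# Ask a newer Bash to keep the semantics this script was written against.
# BASH_COMPAT (available since Bash 4.3) takes a version such as 4.4 or 44;
# the older spelling of the same request is a compatNN shopt.
BASH_COMPAT=4.4
# shopt -s compat44   # roughly equivalent, on versions that still provide it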
Re: Light weight support for JSON
On Sun, Aug 28, 2022 at 08:47:24PM -0400, Dale R. Worley wrote:
> The "obvious" way to support Json in Bash would be a utility that parses
> Json and produces e.g. a Bash associative array, and conversely a
> utility that reads a Bash associative array and produces Json. The real
> limitation is that it's difficult to have a subprocess set Bash's
> variables. As far as I know, there's no good idiom for that.

The standard idiom for this sort of thing is

eval "$(external-tool)"

This means you need to *trust* the external-tool to produce safe code.
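A concrete, real-world instance of that idiom is ssh-agent, which prints shell assignments on stdout precisely so they can be eval'ed; it also illustrates the trust point, since whatever the tool prints is executed in the current shell:

# ssh-agent -s writes something like:
#   SSH_AUTH_SOCK=/tmp/ssh-XXXXXX/agent.123; export SSH_AUTH_SOCK;
#   SSH_AGENT_PID=124; export SSH_AGENT_PID;
eval "$(ssh-agent -s)"
echo "agent pid: $SSH_AGENT_PID"   # the variable now exists in this shell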
Re: Light weight support for JSON
On Sun, Aug 28, 2022 at 7:47 PM Dale R. Worley wrote: > The "obvious" way to support Json in Bash would be a utility that parses > Json and produces e.g. a Bash associative array, and conversely a > utility that reads a Bash associative array and produces Json. The real > limitation is that it's difficult to have a subprocess set Bash's > variables. As far as I know, there's no good idiom for that. > > Dale > > If your json_util outputs a Bash declare -A statement then you could just eval it to get your associative array. The utility would need to guarantee against code injection in its output. Another issue is that JSON can express things that Bash associative arrays can't and so then you get into a rat's nest of workarounds. -- Visit serverfault.com to get your system administration questions answered.
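A sketch of that approach; json_util and its option are hypothetical stand-ins for whatever tool emits the declaration, and guarding against injection means the tool must quote and escape every key and value it prints:

# Hypothetically, json_util prints a single line such as:
#   declare -A weather=([temp_f]="71.3" [wind]="9" [city]="New York")
eval "$(json_util --to-assoc weather < response.json)"
echo "wind: ${weather[wind]}"

# Caveat: inside a function, an eval'ed 'declare' creates a local variable;
# a tool emitting 'declare -gA ...' avoids that surprise.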
Re: Light weight support for JSON
The "obvious" way to support Json in Bash would be a utility that parses Json and produces e.g. a Bash associative array, and conversely a utility that reads a Bash associative array and produces Json. The real limitation is that it's difficult to have a subprocess set Bash's variables. As far as I know, there's no good idiom for that. Dale
Re: Bash Coding style - Adopting C99 declarations
On Sunday, August 28, 2022, Andreas Schwab wrote:
> On Aug 28 2022, Greg Wooledge wrote:
>
> > On Sun, Aug 28, 2022 at 10:47:38AM -0400, Yair Lenga wrote:
> >> Hi,
> >>
> >> I've noticed Bash code uses "old-style" C89 declarations:
> >> * Parameters are separated from the prototype
> >> * Variables declared only at the beginning of the function
> >> * No mixed declaration/statements
> >> * No block local variables
> >>
> >> intmax_t
> >> evalexp (expr, flags, validp)
> >> char *expr;
> >> int flags;
> >> int *validp;
> >> {
> >> intmax_t val;
> >> int c;
> >> procenv_t oevalbuf;
> >>
> >> val = 0;
> >> noeval = 0;
> >> already_expanded = (flags&EXP_EXPANDED);
> >
> > You're mistaken. What you're seeing is the "K&R" coding style, which
> > predates C89.

This was not really a coding style. This was the C language syntax for defining parameters.

> Note that the next revision of the C standard removes support for K&R
> declarations

The code will continue to compile, won't it?

Regards, Dmitry

> --
> Andreas Schwab, sch...@linux-m68k.org
> GPG Key fingerprint = 7578 EB47 D4E5 4D69 2510 2552 DF73 E780 A9DA AEC1
> "And now for something completely different."
Re: Light weight support for JSON
First, thanks for taking the time to read and provide your thoughts. This is the real value of the discussion. Second: I'm NOT trying to argue that there isn't a valid use for combining bash/curl/jq, nor do I suggest adding JSON as a first-class object to bash (Python/node/Perl/Groovy are way ahead ...). I hope to get feedback from other readers of the newsgroup who may find this approach useful.

I'll take the very common use case of using the AWS CLI, which produces a JSON response for most calls. Processing the response, while possible with jq, is challenging for many junior (and intermediate) developers. In many cases, they fall into the traps that I mentioned above - performance (excessive forking or fork/exec), or code that is hard to read (I've seen really bad code - combining pipes of jq/awk/sed). Almost always they fail to properly handle objects with white space, newlines, etc. I'm trying to address those cases.

To be practical, I'll try to follow the loadable extension path, and see how much I can get through that path. It will probably make sense to continue the discussion with a concrete implementation. I believe the necessary commands are the following (a usage sketch appears after the quoted message below):

json_read -a data-array -m meta-array -r root-obj
Parse the stdin input into data-array with the items (as described above), and meta-array with helper information (length, prop list under each node) - to help with iterating.

json_write -v variable -a data-array -m meta-array -r root
The reverse - generate the JSON for an associative array following the '.' naming convention.

json_add -a data-array [-m meta-array] [-r root] key1=value1 key2=value2 key3=value3
Helper to add items to an associative array representing a JSON object. Auto-detects the type, with the ability to force stringification using a format string.

On Sun, Aug 28, 2022 at 3:22 PM John Passaro wrote:
> interfacing with an external tool absolutely seems like the correct answer
> to me. a fact worth mentioning to back that up is that `jq` exists. billed
> as a sed/awk for json, it fills all the functions you'd expect such an
> external tool to have and many many more. interfacing from curl to jq to
> bash is something i do on a near daily basis.
>
> https://stedolan.github.io/jq/
>
> On Sun, Aug 28, 2022, 09:25 Yair Lenga wrote:
>
>> Hi,
>>
>> Over the last few years, JSON data becomes a integral part of processing.
>> In many cases, I find myself having to automate tasks that require
>> inspection of JSON response, and in few cases, construction of JSON. So
>> far, I've taken one of two approaches:
>> * For simple parsing, using 'jq' to extract elements of the JSON
>> * For more complex tasks, switching to python or Javascript.
>>
>> Wanted to get feedback about the following "extensions" to bash that will
>> make it easier to work with simple JSON object. To emphasize, the goal is
>> NOT to "compete" with Python/Javascript (and other full scale language) -
>> just to make it easier to build bash scripts that cover the very common use
>> case of submitting REST requests with curl (checking results, etc), and to
>> perform simple processing of JSON files.
>>
>> Proposal:
>> * Minimal - Lightweight "json parser" that will convert JSON files to bash
>> associative array (see below)
>> * Convert bash associative array to JSON
>>
>> To the extent possible, prefer to borrow from jsonpath syntax.
>>
>> Parsing JSON into an associative array.
>>
>> Consider the following, showing all possible JSON values (boolean, number,
>> string, object and array).
>> { >> "b": false, >> "n": 10.2, >> "s: "foobar", >> x: null, >> "o" : { "n": 10.2, "s: "xyz" }, >> "a": [ >> { "n": 10.2, "s: "abc", x: false }, >> { "n": 10.2, "s": "def" x: true}, >> ], >> } >> >> This should be converted into the following array: >> >> - >> >> # Top level >> [_length] = 6# Number of keys in object/array >> [_keys] = b n s x o a# Direct keys >> [b] = false >> [n] = 10.2 >> [s] = foobar >> [x] = null >> >> # This is object 'o' >> [o._length] = 2 >> [o._keys] = n s >> [o.n] = 10.2 >> [o.s] = xyz >> >> # Array 'a' >> [a._count] = 2 # Number of elements in array >> >> # Element a[0] (object) >> [a.0._length] = 3 >> [a.0._keys] = n s x >> [a.0.n] = 10.2 >> [a.0.s] = abc >> [a.0_x] = false >> >> - >> >> I hope that example above is sufficient. There are few other items that >> are >> worth exploring - e.g., how to store the type (specifically, separate the >> quoted strings vs value so that "5.2" is different than 5.2, and "null" is >> different from null. >> >> I will leave the second part to a different post, once I have some >> feedback. I have some prototype that i've written in python - POC - that >> make it possible to write things like >> >> declare -a foo >> curl http://www.api.com/weather/US/10013 | readjson foo >> >> printf "temperature(F) : %.1f Wind(MPH)=%d
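A usage sketch of the three proposed commands, exactly as synopsized above; none of these builtins exist yet, the URL and key names are placeholders, and the optional -r root argument is left out for brevity:

# Parse a REST response into a data array plus a metadata array.
# (Process substitution keeps json_read in the current shell, so the
# arrays it creates survive; the right-hand side of a pipeline would not.)
declare -A data meta
json_read -a data -m meta < <(curl -s https://api.example.com/weather/US/10013)

echo "temperature: ${data[main.temp]}"   # dotted keys per the proposal

# Build a small object and serialize it back to JSON text.
declare -A req reqmeta
json_add -a req -m reqmeta city=NYC zip=10013 units=imperial
json_write -v payload -a req -m reqmeta
printf '%s\n' "$payload"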
Re: Light weight support for JSON
He has a point, though. To have some of the functionality of jq inside Bash may be very useful. If he can supply a patch, why not? Philip Orleans On Sun, Aug 28, 2022, 3:22 PM John Passaro wrote: > interfacing with an external tool absolutely seems like the correct answer > to me. a fact worth mentioning to back that up is that `jq` exists. billed > as a sed/awk for json, it fills all the functions you'd expect such an > external tool to have and many many more. interfacing from curl to jq to > bash is something i do on a near daily basis. > > https://stedolan.github.io/jq/ > > On Sun, Aug 28, 2022, 09:25 Yair Lenga wrote: > > > Hi, > > > > Over the last few years, JSON data becomes a integral part of processing. > > In many cases, I find myself having to automate tasks that require > > inspection of JSON response, and in few cases, construction of JSON. So > > far, I've taken one of two approaches: > > * For simple parsing, using 'jq' to extract elements of the JSON > > * For more complex tasks, switching to python or Javascript. > > > > Wanted to get feedback about the following "extensions" to bash that will > > make it easier to work with simple JSON object. To emphasize, the goal is > > NOT to "compete" with Python/Javascript (and other full scale language) - > > just to make it easier to build bash scripts that cover the very common > use > > case of submitting REST requests with curl (checking results, etc), and > to > > perform simple processing of JSON files. > > > > Proposal: > > * Minimal - Lightweight "json parser" that will convert JSON files to > bash > > associative array (see below) > > * Convert bash associative array to JSON > > > > To the extent possible, prefer to borrow from jsonpath syntax. > > > > Parsing JSON into an associative array. > > > > Consider the following, showing all possible JSON values (boolean, > number, > > string, object and array). > > { > > "b": false, > > "n": 10.2, > > "s: "foobar", > > x: null, > > "o" : { "n": 10.2, "s: "xyz" }, > > "a": [ > > { "n": 10.2, "s: "abc", x: false }, > > { "n": 10.2, "s": "def" x: true}, > > ], > > } > > > > This should be converted into the following array: > > > > - > > > > # Top level > > [_length] = 6# Number of keys in object/array > > [_keys] = b n s x o a# Direct keys > > [b] = false > > [n] = 10.2 > > [s] = foobar > > [x] = null > > > > # This is object 'o' > > [o._length] = 2 > > [o._keys] = n s > > [o.n] = 10.2 > > [o.s] = xyz > > > > # Array 'a' > > [a._count] = 2 # Number of elements in array > > > > # Element a[0] (object) > > [a.0._length] = 3 > > [a.0._keys] = n s x > > [a.0.n] = 10.2 > > [a.0.s] = abc > > [a.0_x] = false > > > > - > > > > I hope that example above is sufficient. There are few other items that > are > > worth exploring - e.g., how to store the type (specifically, separate the > > quoted strings vs value so that "5.2" is different than 5.2, and "null" > is > > different from null. > > > > I will leave the second part to a different post, once I have some > > feedback. I have some prototype that i've written in python - POC - that > > make it possible to write things like > > > > declare -a foo > > curl http://www.api.com/weather/US/10013 | readjson foo > > > > printf "temperature(F) : %.1f Wind(MPH)=%d" ${foo[temp_f]}, ${foo[wind]} > > > > Yair > > >
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote: > Wanted to get feedback about the following "extensions" to bash that will > make it easier to work with simple JSON object. It occurred to me to provide references for previous discussion along these lines, but it turns out there isn't very much of it. This is the most concrete example I found: https://lists.gnu.org/archive/html/bug-bash/2020-12/msg00046.html -- vq
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 4:05 PM, G. Branden Robinson wrote: > At 2022-08-28T15:52:55-0400, Lawrence Velázquez wrote: >> On Sun, Aug 28, 2022, at 2:56 PM, G. Branden Robinson wrote: >> > How about next July, when JSON is as exactly old as the Bourne shell >> > was when JSON was deployed? >> >> I do not find "well *actually* JSON is old too!!!" to be particularly >> persuasive, either. > > It's a perfectly valid rejoinder to a claim that the format is too novel > to be seriously considered. That you don't regard it as persuasive is > consistent with your protest not being a rational one in the first > place Agree to disagree. -- vq
Re: Light weight support for JSON
At 2022-08-28T15:52:55-0400, Lawrence Velázquez wrote:
> On Sun, Aug 28, 2022, at 2:56 PM, G. Branden Robinson wrote:
> > How about next July, when JSON is as exactly old as the Bourne shell
> > was when JSON was deployed?
>
> I do not find "well *actually* JSON is old too!!!" to be particularly
> persuasive, either.

It's a perfectly valid rejoinder to a claim that the format is too novel to be seriously considered. That you don't regard it as persuasive is consistent with your protest not being a rational one in the first place; one doesn't get reasoned out of what one wasn't reasoned into.

> I should have foreseen that the offhand "of the month" jab would
> get undue attention compared to my actual objection, which is against
> giving one data format uniquely first-class support. That's on me.

That's right. Don't pad your brief with makeweight objections, particularly when their ultimate weight is feeble.

> Sick burn, pal. Excuse me while I take out my dentures or whatever.

Fortunately for you, I reckon that even with mushmouthed enunciation, the kids won't have any trouble understanding that you want them to get off your lawn. Why stomp on the newbies' ideas with two feet when one will do?

Regards,
Branden
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 2:56 PM, G. Branden Robinson wrote:
> At 2022-08-28T14:11:25-0400, Lawrence Velázquez wrote:
>> I do not think bash needs to sprout functionality to support every
>> data-exchange format of the month.
>
> This sentiment is illustrative of the logarithmic memory scale of
> grognards. The Bourne shell was first released as part of Version 7
> Unix in January 1979.[1] 22 years and three months later, in April
> 2001, Douglas Crockford and Chip Morningstar sent the first JSON
> message.[2]
>
>> A loadable module might be okay, I guess.
>
> How about next July, when JSON is as exactly old as the Bourne shell was
> when JSON was deployed?

I do not find "well *actually* JSON is old too!!!" to be particularly persuasive, either.

I should have foreseen that the offhand "of the month" jab would get undue attention compared to my actual objection, which is against giving one data format uniquely first-class support. That's on me.

>> Why are people so allergic to just using specific utilities for
>> specific tasks, as appropriate? (This question is rhetorical.
>> Please do not respond with an impassioned plea about why JSON is
>> so special that it deserves first-class shell support. It's not.)
>
> I won't litigate this point, but your concept of novelty is distorted
> beyond any standard reasonable in the computer industry. If we are only
> as young as we feel, you must feel geriatric in the extreme.

Sick burn, pal. Excuse me while I take out my dentures or whatever.

--
vq
Re: Light weight support for JSON
interfacing with an external tool absolutely seems like the correct answer to me. a fact worth mentioning to back that up is that `jq` exists. billed as a sed/awk for json, it fills all the functions you'd expect such an external tool to have and many many more. interfacing from curl to jq to bash is something i do on a near daily basis. https://stedolan.github.io/jq/ On Sun, Aug 28, 2022, 09:25 Yair Lenga wrote: > Hi, > > Over the last few years, JSON data becomes a integral part of processing. > In many cases, I find myself having to automate tasks that require > inspection of JSON response, and in few cases, construction of JSON. So > far, I've taken one of two approaches: > * For simple parsing, using 'jq' to extract elements of the JSON > * For more complex tasks, switching to python or Javascript. > > Wanted to get feedback about the following "extensions" to bash that will > make it easier to work with simple JSON object. To emphasize, the goal is > NOT to "compete" with Python/Javascript (and other full scale language) - > just to make it easier to build bash scripts that cover the very common use > case of submitting REST requests with curl (checking results, etc), and to > perform simple processing of JSON files. > > Proposal: > * Minimal - Lightweight "json parser" that will convert JSON files to bash > associative array (see below) > * Convert bash associative array to JSON > > To the extent possible, prefer to borrow from jsonpath syntax. > > Parsing JSON into an associative array. > > Consider the following, showing all possible JSON values (boolean, number, > string, object and array). > { > "b": false, > "n": 10.2, > "s: "foobar", > x: null, > "o" : { "n": 10.2, "s: "xyz" }, > "a": [ > { "n": 10.2, "s: "abc", x: false }, > { "n": 10.2, "s": "def" x: true}, > ], > } > > This should be converted into the following array: > > - > > # Top level > [_length] = 6# Number of keys in object/array > [_keys] = b n s x o a# Direct keys > [b] = false > [n] = 10.2 > [s] = foobar > [x] = null > > # This is object 'o' > [o._length] = 2 > [o._keys] = n s > [o.n] = 10.2 > [o.s] = xyz > > # Array 'a' > [a._count] = 2 # Number of elements in array > > # Element a[0] (object) > [a.0._length] = 3 > [a.0._keys] = n s x > [a.0.n] = 10.2 > [a.0.s] = abc > [a.0_x] = false > > - > > I hope that example above is sufficient. There are few other items that are > worth exploring - e.g., how to store the type (specifically, separate the > quoted strings vs value so that "5.2" is different than 5.2, and "null" is > different from null. > > I will leave the second part to a different post, once I have some > feedback. I have some prototype that i've written in python - POC - that > make it possible to write things like > > declare -a foo > curl http://www.api.com/weather/US/10013 | readjson foo > > printf "temperature(F) : %.1f Wind(MPH)=%d" ${foo[temp_f]}, ${foo[wind]} > > Yair >
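For readers who haven't used that pipeline, a typical curl-to-jq-to-bash round trip looks like this; the URL and field paths are placeholders:

# Pull a single field out of a JSON response; jq -r prints the raw value.
temp=$(curl -s 'https://api.example.com/weather/10013' | jq -r '.main.temp')

# Pull several fields in one jq pass instead of one jq call per field.
read -r temp wind < <(
    curl -s 'https://api.example.com/weather/10013' |
        jq -r '[.main.temp, .wind.speed] | @tsv'
)
printf 'temp=%s wind=%s\n' "$temp" "$wind"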
Re: Light weight support for JSON
At 2022-08-28T14:11:25-0400, Lawrence Velázquez wrote:
> I do not think bash needs to sprout functionality to support every
> data-exchange format of the month.

This sentiment is illustrative of the logarithmic memory scale of grognards. The Bourne shell was first released as part of Version 7 Unix in January 1979.[1] 22 years and three months later, in April 2001, Douglas Crockford and Chip Morningstar sent the first JSON message.[2]

> A loadable module might be okay, I guess.

How about next July, when JSON is as exactly old as the Bourne shell was when JSON was deployed?

> Why are people so allergic to just using specific utilities for
> specific tasks, as appropriate? (This question is rhetorical.
> Please do not respond with an impassioned plea about why JSON is
> so special that it deserves first-class shell support. It's not.)

I won't litigate this point, but your concept of novelty is distorted beyond any standard reasonable in the computer industry. If we are only as young as we feel, you must feel geriatric in the extreme.

Regards,
Branden

[1] https://minnie.tuhs.org/cgi-bin/utree.pl?file=V7
[2] https://www.toptal.com/web/json-vs-xml-part-1
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 2:29 PM, Yair Lenga wrote: > I do not think that JSON (and REST) are "data exchange format of the > month". Those are established formats that are here to stay. Like YAML. > Those are "cornerstones" of cloud computing/configuration. I do not have to > argue for them, they can speak for themselves. You *do* have to argue why a shell should provide first-class support for them, or for any concrete data format. Shells exist on a much longer time scale than anything you've been talking about and are used in essentially every conceivable computing context. "I think JSON is *extra-special* important though" is not a reason why the shell itself should give it special attention. > As for using external utilities: two main issues: > * Performance - Processing data in bash processes can be 100X times faster > than using external tools. The fork/exec is expensive. And? Are you running jq 100,000 times in a tight loop? > * Readability - Each tool has its own syntax, escapes, etc. The final > result of mixing JQ and bash is not pretty (just lookup jq/bash questions > on stack overflow) Neither is your example. Bash's "type system" (as it were) cannot fully represent all JSON objects, so you're going to end up with some sort of gross approximation regardless. > Having them as a loadable extension seems like a good practical solution. > They do not have to be "built-in". The "csv" loadable provides some precedent for this. It won't solve the representation problem, though. -- vq
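The representation problem is easy to demonstrate: associative array values are plain strings, so distinctions JSON cares about disappear, and nesting has no direct counterpart at all:

declare -A j
j[x]=null     # intended as JSON null
j[y]='null'   # intended as the JSON string "null"
j[n]=5.2      # intended as a JSON number
j[m]='5.2'    # intended as the JSON string "5.2"

# Bash stores all four as strings, so the pairs are indistinguishable:
[[ ${j[x]} == "${j[y]}" && ${j[n]} == "${j[m]}" ]] && echo "no type information survives"

# Nested arrays and objects have no native equivalent; they can only be
# flattened into key conventions like the dotted paths proposed earlier.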
Re: Light weight support for JSON
I do not think that JSON (and REST) are "data exchange format of the month". Those are established formats that are here to stay. Like YAML. Those are "cornerstones" of cloud computing/configuration. I do not have to argue for them, they can speak for themselves.

As for using external utilities, two main issues:

* Performance - Processing data in bash processes can be 100X faster than using external tools. The fork/exec is expensive (see the illustration after this message). To emphasize, the intention is not to build ETL processes with bash - those should still use dedicated tools (or Python or frameworks).
* Readability - Each tool has its own syntax, escapes, etc. The final result of mixing jq and bash is not pretty (just look up jq/bash questions on Stack Overflow).
* It is not easy to construct valid JSON documents with bash by concatenating strings. Many other tools that are used for automation have support to ensure correctness. It would be nice to have the same - it will make bash more useful for the proper use cases.

Having them as a loadable extension seems like a good practical solution. They do not have to be "built-in".

Yair

On Sun, Aug 28, 2022 at 2:11 PM Lawrence Velázquez wrote:
> On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote:
> > Wanted to get feedback about the following "extensions" to bash that will
> > make it easier to work with simple JSON object. To emphasize, the goal is
> > NOT to "compete" with Python/Javascript (and other full scale language) -
> > just to make it easier to build bash scripts that cover the very common use
> > case of submitting REST requests with curl (checking results, etc), and to
> > perform simple processing of JSON files.
>
> I do not think bash needs to sprout functionality to support every
> data-exchange format of the month. A loadable module might be okay,
> I guess.
>
> Why are people so allergic to just using specific utilities for
> specific tasks, as appropriate? (This question is rhetorical.
> Please do not respond with an impassioned plea about why JSON is
> so special that it deserves first-class shell support. It's not.)
>
> --
> vq
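To make the fork/exec point concrete: the slow pattern is one external process per field, and the usual fix is a single pass that emits everything the script needs; the field names are placeholders and response.json stands for a flat JSON object:

declare -A vals

# Costly pattern: one jq process (a fork+exec) per key.
for key in temp_f wind humidity; do
    vals[$key]=$(jq -r --arg k "$key" '.[$k]' response.json)
done

# One process total: jq emits key<TAB>value lines, read back in a single loop.
while IFS=$'\t' read -r k v; do
    vals[$k]=$v
done < <(jq -r 'to_entries[] | [.key, (.value|tostring)] | @tsv' response.json)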
Re: bug-bash Digest, Vol 237, Issue 30
On Sun, Aug 28, 2022, at 1:17 PM, Yair Lenga wrote:
> Yes, you are correct - (most/all of) of those examples "K&R".
>
> However, given bash's important role in modern computing - isn't it time to
> take advantage of new language features ?

Why? What benefit would that actually provide?

> this can make code more readable,
> efficient and reliable.

In practice, there's only one person who really interacts with bash's code. If he doesn't think there's a problem with K&R style, then it's not going to change.

> I doubt that
> many users are trying to install a new bash in a system that was
> built/configured 15 years ago.

You might be surprised.

--
vq
Re: Light weight support for JSON
On Sun, Aug 28, 2022, at 9:24 AM, Yair Lenga wrote: > Wanted to get feedback about the following "extensions" to bash that will > make it easier to work with simple JSON object. To emphasize, the goal is > NOT to "compete" with Python/Javascript (and other full scale language) - > just to make it easier to build bash scripts that cover the very common use > case of submitting REST requests with curl (checking results, etc), and to > perform simple processing of JSON files. I do not think bash needs to sprout functionality to support every data-exchange format of the month. A loadable module might be okay, I guess. Why are people so allergic to just using specific utilities for specific tasks, as appropriate? (This question is rhetorical. Please do not respond with an impassioned plea about why JSON is so special that it deserves first-class shell support. It's not.) -- vq
Re: bug-bash Digest, Vol 237, Issue 30
Yes, you are correct - (most/all) of those examples are "K&R".

However, given bash's important role in modern computing - isn't it time to take advantage of new language features? This can make code more readable, efficient and reliable. Users who are using old platforms are most likely using a "snapshot" of tools - e.g., old gcc, make, etc. I doubt that many users are trying to install a new bash on a system that was built/configured 15 years ago. Many Java/Python/C++ projects that want to move forward do it as part of a "major" release, in which they indicate that Java 7 (or Java 8) support will be phased out. Same for C++ and Python.
Re: Bash Coding style - Adopting C99 declarations
On Aug 28 2022, Greg Wooledge wrote: > On Sun, Aug 28, 2022 at 10:47:38AM -0400, Yair Lenga wrote: >> Hi, >> >> I've noticed Bash code uses "old-style" C89 declarations: >> * Parameters are separated from the prototype >> * Variables declared only at the beginning of the function >> * No mixed declaration/statements >> * No block local variables >> >> intmax_t >> evalexp (expr, flags, validp) >> char *expr; >> int flags; >> int *validp; >> { >> intmax_t val; >> int c; >> procenv_t oevalbuf; >> >> val = 0; >> noeval = 0; >> already_expanded = (flags&EXP_EXPANDED); > > You're mistaken. What you're seeing is the "K&R" coding style, which > predates C89. Note that the next revision of the C standard removes support for K&R declarations. -- Andreas Schwab, sch...@linux-m68k.org GPG Key fingerprint = 7578 EB47 D4E5 4D69 2510 2552 DF73 E780 A9DA AEC1 "And now for something completely different."
Re: Bash Coding style - Adopting C99 declarations
On Sun, Aug 28, 2022 at 10:47:38AM -0400, Yair Lenga wrote: > Hi, > > I've noticed Bash code uses "old-style" C89 declarations: > * Parameters are separated from the prototype > * Variables declared only at the beginning of the function > * No mixed declaration/statements > * No block local variables > > intmax_t > evalexp (expr, flags, validp) > char *expr; > int flags; > int *validp; > { > intmax_t val; > int c; > procenv_t oevalbuf; > > val = 0; > noeval = 0; > already_expanded = (flags&EXP_EXPANDED); You're mistaken. What you're seeing is the "K&R" coding style, which predates C89. I'm pretty sure that the decision to continue in this style well past the widespread adoption of C89 was because of a desire to use bash on systems that still had a K&R C compiler. A lot of the oldest GNU projects followed this idea, in order to increase the number of commercial Unix systems on which they could be used.
Re: Light weight support for JSON
Interesting point. Using (optional) separate array can also address the problem of "types" - knowing which values are quoted, and which one are not. This can also provide enough metadata to convert modified associative table back to JSON. On Sun, Aug 28, 2022 at 9:51 AM Alex fxmbsw7 Ratchev wrote: > > > On Sun, Aug 28, 2022, 15:46 Yair Lenga wrote: > >> Sorry for not being clear. I'm looking for feedback. The solution that I >> have is using python to read the JSON, and generate the commands to build >> the associative array. Will have to rewrite in "C"/submit if there is >> positive feedback from others readers. Yair. >> > > ah, cool > i just have a suggestion, .. to store the keys in a separate array, space > safe > > On Sun, Aug 28, 2022 at 9:42 AM Alex fxmbsw7 Ratchev >> wrote: >> >>> >>> >>> On Sun, Aug 28, 2022, 15:25 Yair Lenga wrote: >>> Hi, Over the last few years, JSON data becomes a integral part of processing. In many cases, I find myself having to automate tasks that require inspection of JSON response, and in few cases, construction of JSON. So far, I've taken one of two approaches: * For simple parsing, using 'jq' to extract elements of the JSON * For more complex tasks, switching to python or Javascript. Wanted to get feedback about the following "extensions" to bash that will make it easier to work with simple JSON object. To emphasize, the goal is NOT to "compete" with Python/Javascript (and other full scale language) - just to make it easier to build bash scripts that cover the very common use case of submitting REST requests with curl (checking results, etc), and to perform simple processing of JSON files. Proposal: * Minimal - Lightweight "json parser" that will convert JSON files to bash associative array (see below) * Convert bash associative array to JSON To the extent possible, prefer to borrow from jsonpath syntax. Parsing JSON into an associative array. Consider the following, showing all possible JSON values (boolean, number, string, object and array). { "b": false, "n": 10.2, "s: "foobar", x: null, "o" : { "n": 10.2, "s: "xyz" }, "a": [ { "n": 10.2, "s: "abc", x: false }, { "n": 10.2, "s": "def" x: true}, ], } This should be converted into the following array: - # Top level [_length] = 6# Number of keys in object/array [_keys] = b n s x o a# Direct keys [b] = false [n] = 10.2 [s] = foobar [x] = null # This is object 'o' [o._length] = 2 [o._keys] = n s [o.n] = 10.2 [o.s] = xyz # Array 'a' [a._count] = 2 # Number of elements in array # Element a[0] (object) [a.0._length] = 3 [a.0._keys] = n s x [a.0.n] = 10.2 [a.0.s] = abc [a.0_x] = false - I hope that example above is sufficient. There are few other items that are worth exploring - e.g., how to store the type (specifically, separate the quoted strings vs value so that "5.2" is different than 5.2, and "null" is different from null. >>> >>> did you forget to send the script along ? or am i completly loss >>> >>> a small thing i saw, a flat _keys doesnt do the job.. >>> >>> I will leave the second part to a different post, once I have some feedback. I have some prototype that i've written in python - POC - that make it possible to write things like declare -a foo curl http://www.api.com/weather/US/10013 | readjson foo printf "temperature(F) : %.1f Wind(MPH)=%d" ${foo[temp_f]}, ${foo[wind]} Yair >>>
Bash Coding style - Adopting C99 declarations
Hi, I've noticed Bash code uses "old-style" C89 declarations: * Parameters are separated from the prototype * Variables declared only at the beginning of the function * No mixed declaration/statements * No block local variables intmax_t evalexp (expr, flags, validp) char *expr; int flags; int *validp; { intmax_t val; int c; procenv_t oevalbuf; val = 0; noeval = 0; already_expanded = (flags&EXP_EXPANDED); --- Curious as to the motivation of sticking to this standard for new development/features. Specifically, is there a requirement to keep bash compatible with C89 ? I believe some of those practices are discouraged nowadays. Yair
Re: Light weight support for JSON
On Sun, Aug 28, 2022, 15:46 Yair Lenga wrote: > Sorry for not being clear. I'm looking for feedback. The solution that I > have is using python to read the JSON, and generate the commands to build > the associative array. Will have to rewrite in "C"/submit if there is > positive feedback from others readers. Yair. > ah, cool i just have a suggestion, .. to store the keys in a separate array, space safe On Sun, Aug 28, 2022 at 9:42 AM Alex fxmbsw7 Ratchev > wrote: > >> >> >> On Sun, Aug 28, 2022, 15:25 Yair Lenga wrote: >> >>> Hi, >>> >>> Over the last few years, JSON data becomes a integral part of processing. >>> In many cases, I find myself having to automate tasks that require >>> inspection of JSON response, and in few cases, construction of JSON. So >>> far, I've taken one of two approaches: >>> * For simple parsing, using 'jq' to extract elements of the JSON >>> * For more complex tasks, switching to python or Javascript. >>> >>> Wanted to get feedback about the following "extensions" to bash that will >>> make it easier to work with simple JSON object. To emphasize, the goal is >>> NOT to "compete" with Python/Javascript (and other full scale language) - >>> just to make it easier to build bash scripts that cover the very common >>> use >>> case of submitting REST requests with curl (checking results, etc), and >>> to >>> perform simple processing of JSON files. >>> >>> Proposal: >>> * Minimal - Lightweight "json parser" that will convert JSON files to >>> bash >>> associative array (see below) >>> * Convert bash associative array to JSON >>> >>> To the extent possible, prefer to borrow from jsonpath syntax. >>> >>> Parsing JSON into an associative array. >>> >>> Consider the following, showing all possible JSON values (boolean, >>> number, >>> string, object and array). >>> { >>> "b": false, >>> "n": 10.2, >>> "s: "foobar", >>> x: null, >>> "o" : { "n": 10.2, "s: "xyz" }, >>> "a": [ >>> { "n": 10.2, "s: "abc", x: false }, >>> { "n": 10.2, "s": "def" x: true}, >>> ], >>> } >>> >>> This should be converted into the following array: >>> >>> - >>> >>> # Top level >>> [_length] = 6# Number of keys in object/array >>> [_keys] = b n s x o a# Direct keys >>> [b] = false >>> [n] = 10.2 >>> [s] = foobar >>> [x] = null >>> >>> # This is object 'o' >>> [o._length] = 2 >>> [o._keys] = n s >>> [o.n] = 10.2 >>> [o.s] = xyz >>> >>> # Array 'a' >>> [a._count] = 2 # Number of elements in array >>> >>> # Element a[0] (object) >>> [a.0._length] = 3 >>> [a.0._keys] = n s x >>> [a.0.n] = 10.2 >>> [a.0.s] = abc >>> [a.0_x] = false >>> >>> - >>> >>> I hope that example above is sufficient. There are few other items that >>> are >>> worth exploring - e.g., how to store the type (specifically, separate the >>> quoted strings vs value so that "5.2" is different than 5.2, and "null" >>> is >>> different from null. >>> >> >> did you forget to send the script along ? or am i completly loss >> >> a small thing i saw, a flat _keys doesnt do the job.. >> >> I will leave the second part to a different post, once I have some >>> feedback. I have some prototype that i've written in python - POC - that >>> make it possible to write things like >>> >>> declare -a foo >>> curl http://www.api.com/weather/US/10013 | readjson foo >>> >>> printf "temperature(F) : %.1f Wind(MPH)=%d" ${foo[temp_f]}, ${foo[wind]} >>> >>> Yair >>> >>
Re: Light weight support for JSON
Sorry for not being clear. I'm looking for feedback. The solution that I have is using python to read the JSON, and generate the commands to build the associative array. Will have to rewrite in "C"/submit if there is positive feedback from others readers. Yair. On Sun, Aug 28, 2022 at 9:42 AM Alex fxmbsw7 Ratchev wrote: > > > On Sun, Aug 28, 2022, 15:25 Yair Lenga wrote: > >> Hi, >> >> Over the last few years, JSON data becomes a integral part of processing. >> In many cases, I find myself having to automate tasks that require >> inspection of JSON response, and in few cases, construction of JSON. So >> far, I've taken one of two approaches: >> * For simple parsing, using 'jq' to extract elements of the JSON >> * For more complex tasks, switching to python or Javascript. >> >> Wanted to get feedback about the following "extensions" to bash that will >> make it easier to work with simple JSON object. To emphasize, the goal is >> NOT to "compete" with Python/Javascript (and other full scale language) - >> just to make it easier to build bash scripts that cover the very common >> use >> case of submitting REST requests with curl (checking results, etc), and to >> perform simple processing of JSON files. >> >> Proposal: >> * Minimal - Lightweight "json parser" that will convert JSON files to bash >> associative array (see below) >> * Convert bash associative array to JSON >> >> To the extent possible, prefer to borrow from jsonpath syntax. >> >> Parsing JSON into an associative array. >> >> Consider the following, showing all possible JSON values (boolean, number, >> string, object and array). >> { >> "b": false, >> "n": 10.2, >> "s: "foobar", >> x: null, >> "o" : { "n": 10.2, "s: "xyz" }, >> "a": [ >> { "n": 10.2, "s: "abc", x: false }, >> { "n": 10.2, "s": "def" x: true}, >> ], >> } >> >> This should be converted into the following array: >> >> - >> >> # Top level >> [_length] = 6# Number of keys in object/array >> [_keys] = b n s x o a# Direct keys >> [b] = false >> [n] = 10.2 >> [s] = foobar >> [x] = null >> >> # This is object 'o' >> [o._length] = 2 >> [o._keys] = n s >> [o.n] = 10.2 >> [o.s] = xyz >> >> # Array 'a' >> [a._count] = 2 # Number of elements in array >> >> # Element a[0] (object) >> [a.0._length] = 3 >> [a.0._keys] = n s x >> [a.0.n] = 10.2 >> [a.0.s] = abc >> [a.0_x] = false >> >> - >> >> I hope that example above is sufficient. There are few other items that >> are >> worth exploring - e.g., how to store the type (specifically, separate the >> quoted strings vs value so that "5.2" is different than 5.2, and "null" is >> different from null. >> > > did you forget to send the script along ? or am i completly loss > > a small thing i saw, a flat _keys doesnt do the job.. > > I will leave the second part to a different post, once I have some >> feedback. I have some prototype that i've written in python - POC - that >> make it possible to write things like >> >> declare -a foo >> curl http://www.api.com/weather/US/10013 | readjson foo >> >> printf "temperature(F) : %.1f Wind(MPH)=%d" ${foo[temp_f]}, ${foo[wind]} >> >> Yair >> >
Re: Light weight support for JSON
On Sun, Aug 28, 2022, 15:25 Yair Lenga wrote: > Hi, > > Over the last few years, JSON data becomes a integral part of processing. > In many cases, I find myself having to automate tasks that require > inspection of JSON response, and in few cases, construction of JSON. So > far, I've taken one of two approaches: > * For simple parsing, using 'jq' to extract elements of the JSON > * For more complex tasks, switching to python or Javascript. > > Wanted to get feedback about the following "extensions" to bash that will > make it easier to work with simple JSON object. To emphasize, the goal is > NOT to "compete" with Python/Javascript (and other full scale language) - > just to make it easier to build bash scripts that cover the very common use > case of submitting REST requests with curl (checking results, etc), and to > perform simple processing of JSON files. > > Proposal: > * Minimal - Lightweight "json parser" that will convert JSON files to bash > associative array (see below) > * Convert bash associative array to JSON > > To the extent possible, prefer to borrow from jsonpath syntax. > > Parsing JSON into an associative array. > > Consider the following, showing all possible JSON values (boolean, number, > string, object and array). > { > "b": false, > "n": 10.2, > "s: "foobar", > x: null, > "o" : { "n": 10.2, "s: "xyz" }, > "a": [ > { "n": 10.2, "s: "abc", x: false }, > { "n": 10.2, "s": "def" x: true}, > ], > } > > This should be converted into the following array: > > - > > # Top level > [_length] = 6# Number of keys in object/array > [_keys] = b n s x o a# Direct keys > [b] = false > [n] = 10.2 > [s] = foobar > [x] = null > > # This is object 'o' > [o._length] = 2 > [o._keys] = n s > [o.n] = 10.2 > [o.s] = xyz > > # Array 'a' > [a._count] = 2 # Number of elements in array > > # Element a[0] (object) > [a.0._length] = 3 > [a.0._keys] = n s x > [a.0.n] = 10.2 > [a.0.s] = abc > [a.0_x] = false > > - > > I hope that example above is sufficient. There are few other items that are > worth exploring - e.g., how to store the type (specifically, separate the > quoted strings vs value so that "5.2" is different than 5.2, and "null" is > different from null. > did you forget to send the script along ? or am i completly loss a small thing i saw, a flat _keys doesnt do the job.. I will leave the second part to a different post, once I have some > feedback. I have some prototype that i've written in python - POC - that > make it possible to write things like > > declare -a foo > curl http://www.api.com/weather/US/10013 | readjson foo > > printf "temperature(F) : %.1f Wind(MPH)=%d" ${foo[temp_f]}, ${foo[wind]} > > Yair >
Light weight support for JSON
Hi,

Over the last few years, JSON data has become an integral part of processing. In many cases, I find myself having to automate tasks that require inspection of a JSON response, and in a few cases, construction of JSON. So far, I've taken one of two approaches:
* For simple parsing, using 'jq' to extract elements of the JSON
* For more complex tasks, switching to Python or Javascript.

Wanted to get feedback about the following "extensions" to bash that will make it easier to work with simple JSON objects. To emphasize, the goal is NOT to "compete" with Python/Javascript (and other full-scale languages) - just to make it easier to build bash scripts that cover the very common use case of submitting REST requests with curl (checking results, etc), and to perform simple processing of JSON files.

Proposal:
* Minimal - Lightweight "json parser" that will convert JSON files to a bash associative array (see below)
* Convert a bash associative array to JSON

To the extent possible, prefer to borrow from jsonpath syntax.

Parsing JSON into an associative array.

Consider the following, showing all possible JSON values (boolean, number, string, object and array).

{
  "b": false,
  "n": 10.2,
  "s": "foobar",
  "x": null,
  "o": { "n": 10.2, "s": "xyz" },
  "a": [
    { "n": 10.2, "s": "abc", "x": false },
    { "n": 10.2, "s": "def", "x": true }
  ]
}

This should be converted into the following array:

-

# Top level
[_length] = 6            # Number of keys in object/array
[_keys] = b n s x o a    # Direct keys
[b] = false
[n] = 10.2
[s] = foobar
[x] = null

# This is object 'o'
[o._length] = 2
[o._keys] = n s
[o.n] = 10.2
[o.s] = xyz

# Array 'a'
[a._count] = 2           # Number of elements in array

# Element a[0] (object)
[a.0._length] = 3
[a.0._keys] = n s x
[a.0.n] = 10.2
[a.0.s] = abc
[a.0.x] = false

-

I hope that the example above is sufficient. There are a few other items that are worth exploring - e.g., how to store the type (specifically, separating quoted strings from bare values so that "5.2" is different than 5.2, and "null" is different from null).

I will leave the second part to a different post, once I have some feedback. I have a prototype that I've written in Python - a POC - that makes it possible to write things like

declare -A foo
curl http://www.api.com/weather/US/10013 | readjson foo

printf "temperature(F) : %.1f Wind(MPH)=%d" "${foo[temp_f]}" "${foo[wind]}"

Yair
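Assuming foo has been populated as in the example above (readjson is still the hypothetical helper), iterating the structure would rely only on the _keys and _count bookkeeping entries; the keys used below come from the sample data:

# Walk the direct keys of nested object 'o'.
for k in ${foo[o._keys]}; do
    printf 'o.%s = %s\n' "$k" "${foo[o.$k]}"
done

# Walk array 'a' by index using its element count.
for ((i = 0; i < ${foo[a._count]}; i++)); do
    printf 'a[%d].s = %s\n' "$i" "${foo[a.$i.s]}"
done

# Note: the space-separated _keys list is only safe while keys themselves
# contain no whitespace, as pointed out elsewhere in the thread.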