Re: proposed BASH_SOURCE_PATH
On 26/06/2024 at 14:17, Martin D Kealey wrote: Just to be clear, would this result in $0 and ${BASH_SOURCE[@]:(-1):1} potentially yielding different values? There is no reason it would alter the content of $0, which remains the name of the command involved. The argument vector's index 0 carries a very different semantic from BASH_SOURCE. I don't even understand why BASH_SOURCE would point to anything other than the source path, or the real source path if some option tells it to resolve real paths. -- Léa Gris
Re: proposed BASH_SOURCE_PATH
On 20/06/2024 at 05:25, Oğuz wrote: What I'm saying there is name it main and execute it like `./main'. I'm not against having a variable that's automatically populated with the parent directory of the source script, I just don't need it and it wasn't what we were discussing. Would it be a valid option, then, to make BASH_SOURCE contain the real path in all circumstances? -- Léa Gris
Re: proposed BASH_SOURCE_PATH
On 19/06/2024 at 22:04, Will Allan wrote: Since I find the accepted answer to be overly complex for my needs, I usually just do this:

declare -r SCRIPT_DIR="$(dirname -- "${BASH_SOURCE[0]}")"
source -- "${SCRIPT_DIR}/../lib/foo.sh"
source -- "${SCRIPT_DIR}/../lib/bar.sh"
...

But, I still don’t like it. I have to start off each script with a slow command substitution (subshell) which introduces a variable that I don’t really want, but it’s too slow to do this repeatedly:

source -- "$(dirname -- "${BASH_SOURCE[0]}")/../lib/foo.sh"
source -- "$(dirname -- "${BASH_SOURCE[0]}")/../lib/foo.sh"

Looks like you did not find a proper answer there. Here is a simple one that involves no sub-shell at all and does exactly what your sub-shell version does:

declare -r SCRIPT_DIR=${BASH_SOURCE[0]%/*}

Indeed this does not resolve symbolic links; to do that, you need a sub-shell and it is not system agnostic anymore:

realSource=$(realpath -- "${BASH_SOURCE[0]}") && realScriptDir=${realSource%/*}

-- Léa Gris
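A minimal sketch expanding on the prefix-stripping approach above, assuming the only edge case to guard is a BASH_SOURCE[0] with no slash at all (script invoked from its own directory):

script_dir=${BASH_SOURCE[0]%/*}
# If there was no slash to strip, the expansion leaves the value unchanged,
# so fall back to the current directory.
[ "$script_dir" = "${BASH_SOURCE[0]}" ] && script_dir=.
source -- "$script_dir/../lib/foo.sh"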
Re: REQUEST - bash floating point math support
On 17/06/2024 at 09:17, Koichi Murase wrote: declare -i numvar=${localeFormatted/[!0-9]/.} This would break with negative numbers. I know of no radix separator other than comma or dot. If there are other radix characters to replace, they can be listed in a character class. Let's say there are locales that use ',', ';' or ':':

declare -i numvar=${localeFormatted/[,;:]/.}

In my opinion, the locale-formatted strings should only appear in the strings presented to users, such as in the output of `printf'. It is easier and more maintainable to normalize all the internal representations to be the C floating-point literals. I have been in IT in France for three decades, and I have never ever found any piece of code that uses locale-formatted literals, not even during my CS education. Sometimes one has to deal with data sources with numbers formatted in funny ways, even grouping with illegal characters. It is a known fact that sourced data must be handled with zealous sanitizing. Now in the bash shell context: the usual pitfall with floating-point number parsing comes when some script author tries to parse the output of a command that is not meant to be parsed, that is locale formatted, and doesn't bother to set LC_NUMERIC=C before running this command. A common pitfall is trying to parse the output of the time builtin command. It already breaks in locale-ignorant scripts. This won't change. -- Léa Gris
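A minimal sketch of such sanitizing, assuming the input only uses spaces for grouping and a comma as the radix character (the function name and sample value are illustrative):

normalize_number() {
  local n=$1
  n=${n// /}    # drop space grouping
  n=${n/,/.}    # turn the comma radix into a dot
  printf '%s\n' "$n"
}
normalize_number '-3 141,59'   # prints -3141.59, negative sign preserved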
Re: REQUEST - bash floating point math support
On 16/06/2024 at 23:44, Zachary Santer wrote: How do you propose to take an LC_NUMERIC-formatted floating-point literal and assign it to a variable with the numeric flag or make use of it in another type of arithmetic context? This proposal does not include conversion of locale-formatted literals. Anyway, this could be handled with the existing bash string replacement feature:

localeFormatted=3,1415
declare -i numvar=${localeFormatted/,/.}

-- Léa Gris
Re: REQUEST - bash floating point math support
On 15/06/2024 at 15:29, Koichi Murase wrote: at which point does the conversion happen? The conversion to LC_NUMERIC format only happens during variable expansion outside of a numerical context. The numerical context can be explicit if the assigned variable has a numeric flag; it is implicit within ((numeric context)). There are other numerical contexts in Bash, like string and array indexes, but those may retain only integer support. So this means that the arithmetic expansions $(()) would produce a format incompatible with the arithmetic expression. This seems inconsistent to me, but it might be a valid option if there is no other option. The numerical context with a $ prefix, $(()), indeed implies a string conversion after resolving the computation within. So yes, it would perform the string conversion using the LC_NUMERIC format. Another change is that within a declare statement, or if the assigned variable has a numeric flag, it is considered a numerical context.

# Numerical context of the assigned pi variable allows the
# C format of the 3.1415 literal to be used in the assignment
declare -i pi=3.1415

# give e a numeric flag
typeset -i e
# Because e has a numeric flag, it accepts the dot radix delimiter within
# the assigned value
e=2.71828

# Numerical context: pi / 2 computes to 1.57
# but as it is expanded into a string, it uses
# the comma radix delimiter from LC_NUMERIC=fr_FR.UTF8
str=$((pi / 2))
# Then str=1,57

# This is illegal, because the right-hand side is expanded
# to a string and may not use the correct dot delimiter
declare -i arc=$(( (2 * pi) / 3 ))

# This is a correct assignment
((arc = (2 * pi) / 3))
# Although arc has not been declared with a numeric flag,
# it was assigned within a numerical context

I feel it is better to allow the printf implementations to extend the conversion rather than trying to invent strange and inconsistent behaviors among the arithmetic expressions. I agree this printf limitation is questionable. Anyway, given the slow pace of standard evolution, and to not break the standard, having consideration for the context when manipulating values or expanding these values to a string can greatly help work around this POSIX C restriction. It is really a simple rule: 1. Within a numerical context, floating-point literals use the dot radix delimiter. Internally it could use whatever fits best, like IEEE 754 double-precision binary64. 2. Outside a numerical context, when creating a string representation of a floating-point number, the LC_NUMERIC format applies. -- Léa Gris
Re: REQUEST - bash floating point math support
On 15/06/2024 at 02:49, Koichi Murase wrote: On Fri, Jun 14, 2024 at 16:18, Léa Gris wrote: Another elegant option would be to expand the existing variables' i flag to tell that the variable is numeric rather than integer. Then have printf handle argument variables with the numeric flag as using the LC_NUMERIC=C floating-point format with dot radix point. In that design, how does the `printf' builtin know that the argument string is generated by a specific variable with the numeric flag? The `printf' builtin is a (builtin) command that receives arguments as strings. The arguments do not carry the contexts of how the strings are generated (though some of the exceptions are the array assignments of the assignment builtins). Indeed printf only knows the string value. But Bash knows the variable has a numeric flag when doing the value expansion, so it expands it using the current LC_NUMERIC locale in this specific case.

# Numeric format uses comma as radix separator
LC_NUMERIC=fr_FR.UTF8

# Numeric variable uses the C locale format internally
declare -i floatVar=3.1415

# Printf does not have to know about the variable type
# as it is expanded in the current LC_NUMERIC format
printf '%.2f\n' "$floatVar"
# Prints 3,14

# Same when it is expanded to assign another variable
# stringVar=3,1415
# since the string expansion of a numeric variable uses the
# LC_NUMERIC format
stringVar=$floatVar

# Same rule applies when expanding a numerical context to a string:
echo "$((2 * 3.14))"
# Prints the string 6,28 because LC_NUMERIC=fr_FR.UTF8

# Do not use string expansion to assign a numeric value or it will fail
# with a different locale. Use a numeric context to assign.
# Do:
(( arc = 2 * floatVar ))
# Don't:
# arc=$((2 * floatVar ))
# Note that the numeric flag is not needed within a numeric context

echo "$arc"
# Prints 6.283 because arc has no numeric flag,
# so it is expanded verbatim to string

echo "$((arc))"
# Prints 6,283 because the implied numerical context
# expands the value using the LC_NUMERIC format

-- Léa Gris
Re: REQUEST - bash floating point math support
On 14/06/2024 at 03:41, Martin D Kealey wrote: On Thu, 13 Jun 2024 at 09:05, Zachary Santer wrote: Let's say, if var is in the form of a C floating-point literal, ${var@F} would expand it to the locale-dependent formatted number, for use as an argument to printf or for output directly. And then ${var@f} would go the other way, taking var that's in the form of a locale-dependent formatted number, and expanding it to a C floating-point literal. How about incorporating the % printf formatter directly, like ${var@%f} for the locale-independent format and ${var@%#f} for the locale-specific format? However any formatting done as part of the expansion assumes that the variable holds a "number" in some fixed format, rather than a localized string. Personally I think this would actually be a good idea, but it would be quite a lot bigger project than simply adding FP support. -Martin Another elegant option would be to expand the existing variables' i flag to tell that the variable is numeric rather than integer. Then have printf handle argument variables with the numeric flag as using the LC_NUMERIC=C floating-point format with dot radix point. Expanding the existing i flag would also ensure numerical expressions would handle the same value format. The David method: take down multiple issues with one stone. -- Léa Gris
Re: REQUEST - bash floating point math support
On 06/06/2024 at 11:55, Koichi Murase wrote: Though, I see your point. It is inconvenient that we cannot pass the results of arithmetic evaluations to the `printf' builtin. This appears to be an issue of the printf builtin. I think the `printf' builtin should be extended to interpret both forms of the numbers, the locale-dependent formatted number and the floating-point literals. Another way would be to expand the string representation of floating-point numbers using the locale. Anyway, yes, this POSIX restriction is pointless but causes trouble. Writing locale-agnostic Bash scripts requires some precautions that too many developers tend to ignore, like setting LC_ALL=C before running commands whose output is to be parsed. Using LC_NUMERIC=C printf ... is an even less known precaution, and I have already stumbled on a couple of scripts that break because of this exact oversight. -- Léa Gris
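A minimal illustration of the two precautions, assuming a fr_FR.UTF-8 user locale (the parsed command is only an example):

# 1. Run the command whose output will be parsed under the C locale
load=$(LC_ALL=C uptime)
# 2. Force the C numeric locale so the builtin printf accepts dot-radix arguments
LC_NUMERIC=C printf '%.2f\n' 3.1415   # prints 3.14 under any locale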
Re: REQUEST - bash floating point math support
On 06/06/2024 at 10:29, Koichi Murase wrote: On Thu, Jun 6, 2024 at 15:59, Léa Gris wrote: On 05/06/2024 at 17:09, Koichi Murase wrote: On Wed, Jun 5, 2024 at 21:41, Zachary Santer wrote: Bash could potentially detect floating point literals within arithmetic expansions and adjust the operations to use floating point math in that case. [...] ksh and zsh are already behaving in that way, and if Bash would support the floating-point arithmetic, Bash should follow their behavior. Bash isn't even consistent with the floating point data/input format when using printf '%f\n' "$float_string", as it depends on the LC_NUMERIC locale. Maybe I miss your point, but literals in arithmetic expressions and the printf format are unrelated. They are related in Bash: Bash uses the locale format for floating-point number arguments. It means printf %f will break when the argument contains a floating-point number using a different decimal symbol than that of the current locale in the LC_NUMERIC environment variable. For example, if LC_NUMERIC=fr_FR.UTF8, the decimal separator symbol is a comma (not a dot) and printf %f 0.1 will fail because 0.1 is not recognized as a valid floating-point number argument by printf. The implication of this behaviour on a floating-point implementation is that the values will not be compatible with all system locales, since Bash does not allow a consistent format for the decimal separator symbol when passing floating-point number arguments to printf. -- Léa Gris
Re: REQUEST - bash floating point math support
On 05/06/2024 at 17:09, Greg Wooledge wrote: On Wed, Jun 05, 2024 at 09:57:26PM +0700, Robert Elz wrote: Also note that to actually put floating support in the shell, more is needed than just arithmetic, you also need floating comparisons in test (or in bash, in [[ ) and a whole bunch more odds and ends that aren't obvious until you need them, and they're just not there (like a mechanism to convert floats back into integers again, controlling how rounding happens). Ironically, that last one is the one we already *do* have.

hobbit:~$ printf '%.0f\n' 11.5 22.5 33.5
12
22
34

As long as you're OK with "banker's rounding", printf does it. As long as you are OK with your script breaking because of the LC_NUMERIC locale using a different decimal symbol, like a comma in French. -- Léa Gris
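A minimal sketch of both the failure and the usual workaround, assuming the fr_FR.UTF-8 locale is installed:

LC_NUMERIC=fr_FR.UTF-8 printf '%.0f\n' 11.5
# bash warns "11.5: invalid number" and only the leading "11" is used
LC_NUMERIC=C printf '%.0f\n' 11.5 22.5
# prints 12 and 22 under any locale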
Re: REQUEST - bash floating point math support
On 05/06/2024 at 17:09, Koichi Murase wrote: On Wed, Jun 5, 2024 at 21:41, Zachary Santer wrote: Bash could potentially detect floating point literals within arithmetic expansions and adjust the operations to use floating point math in that case. [...] ksh and zsh are already behaving in that way, and if Bash would support the floating-point arithmetic, Bash should follow their behavior. Bash isn't even consistent with the floating point data/input format when using printf '%f\n' "$float_string", as it depends on the LC_NUMERIC locale.

LC_MESSAGES=C LC_NUMERIC=fr_FR.UTF8 printf '%f\n' 3.1415
bash: printf: 3.1415: invalid number
3,00

Chet explained it is because Bash is following the POSIX norm and C printf rules. Now imagine if Bash introduces floating-point support: you write a Bash script with decimal-point floating-point numbers, and your script is incompatible with systems whose locale uses a decimal comma instead. -- Léa Gris
Re: proposed BASH_SOURCE_PATH
This needs to resolve symbolic links, otherwise the path will be that of the symbolic link rather than that of the actual script file. There are already ways to safely allow sourcing libraries relative to a bash script installation without adding features to Bash:

# Create tests and library folders
mkdir -p ./tests/libs

# Create the main bash script command
cat > ./tests/real_bash_source_dir_test <<'BASH'
#!/usr/bin/env bash
real_source=$(realpath "${BASH_SOURCE[0]}")
real_source_dir=${real_source%/*}
include_path=$real_source_dir/libs
# shellcheck source=./libs/testlib.bash
source "$include_path/testlib.bash" || exit 1 # cannot load library
TestLib::hello
BASH

# Make it executable
chmod +x ./tests/real_bash_source_dir_test

# Create a symbolic link to the executable
ln -frs ./tests/real_bash_source_dir_test ./

# Create the library
cat > ./tests/libs/testlib.bash <<'BASH'
(return 0 2>/dev/null) && [ -n "${BASH_VERSION}" ] || exit 1
TestLib::hello() { printf 'Hello from TestLib\n';}
BASH

# Run the command script from its real install dir
./tests/real_bash_source_dir_test

# Run the command script from its symbolic link in the current dir
./real_bash_source_dir_test

You can see it is able to source its library relative to its real installed path, and it works regardless of whether it is called from a symbolic link or directly from its installed dir. -- Léa Gris
Re: proposed BASH_SOURCE_PATH
On 14/05/2024 at 08:08, Martin D Kealey wrote: I wholeheartedly support the introduction of BASH_SOURCE_PATH, but I would like to suggest three tweaks to its semantics. A common pattern is to unpack a script with its associated library & config files into a new directory, which then leaves a problem locating the library files whose paths are only known relative to $0 (or ${BASH_SOURCE[0]}). This needs to resolve symbolic links, otherwise the path will be that of the symbolic link rather than that of the actual script file:

mkdir ./tests/
cat > ./tests/real_bash_source_dir_test <<'BASH'
#!/usr/bin/env bash
real_source=$(realpath "${BASH_SOURCE[0]}")
real_source_dir=${real_source%/*}
printf 'Real Bash source dir: %s\n' "$real_source_dir"
BASH
chmod +x ./tests/real_bash_source_dir_test
ln -sr ./tests/real_bash_source_dir_test ./

Then you get the same answer in both cases:

./tests/real_bash_source_dir_test
./real_bash_source_dir_test

-- Léa Gris
Re: "${assoc[@]@k}" doesn't get expanded to separate words within compound assignment syntax
On 20/03/2024 at 12:59, Greg Wooledge wrote: As to that, simply declare -p and run that back. That will work in some cases, yes. The problem is that it locks you in to having the same array name and the same attributes (if any) as the original array had. It also runs "declare" which can create issues with scope, if it's done inside a function. Perhaps you wanted to populate an array that was previously declared at some scope other than your own, and other than global (which means adding -g wouldn't work either). I suppose you might argue that you could run declare -p, and then edit out the "declare" and the attributes, and then while you're at it, also edit the array name. I guess that's a possible solution, but it really seems ugly. I agree about the ugliness and all the potential issues with interpreting what is basically code, subject to any kind of expansion, inherited from another unknown source. I think Bash should not try to implement its own serializations with expansion. PHP had the exact same design issue with embedded serialization, and it caused enough issues that all modern code chooses JSON serialisation, and JSON has been made part of PHP. A couple of years ago I read here about a future Bash becoming modular, or something along those lines, with a module for JSON. I'd have preferred for Bash to have some internal support for JSON even if there is no compatible nested structure. But at least expand values as strings and, for anything else, put the JSON text string as the value. But possibly people are overusing Bash and shell for things they were not designed for. In a lot of cases, more general-purpose scripting languages are available on a wide range of systems. When your favorite tool is a hammer, problems tend to look like nails. -- Léa Gris
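A hedged sketch of what is possible today with the external jq tool (not a bash builtin), loading a flat JSON object into an associative array with values expanded as strings; the config.json file name is illustrative:

declare -A cfg
while IFS=$'\t' read -r key value; do
  cfg[$key]=$value
done < <(jq -r 'to_entries[] | [.key, (.value | tostring)] | @tsv' config.json)
declare -p cfg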
Re: Warn upon "declare -ax"
On 04/09/2023 at 14:18, Dan Jacobson wrote: Shouldn't "declare -ax" print a warning that it is useless? There doesn't seem to be any warning system in Bash or other shells. As long as it is not a fatal error condition and errexit is not set, execution continues. There are static analysis tools like ShellCheck which might be extended to warn of such incompatible flags, but that's it. -- Léa Gris
Re: bug#65659: RFC: changing printf(1) behavior on %b
On 02/09/2023 at 07:46, Phi Debian wrote: On Fri, Sep 1, 2023 at 8:10 PM Stephane Chazelas wrote: 2023-09-01 07:54:02 -0500, Eric Blake via austin-group-l at The Open Group: FWIW, a "printf %b" github shell code search returns ~ 29k entries ( https://github.com/search?q=printf+%25b+language%3AShell&type=code&l=Shell ) Ha super, at least some numbers :-), I didn't know we could make this kind of request... thanx for that. 18k results on <https://github.com/search?q=%2Fprintf%5B%5B%3Aspace%3A%5D%5D%2B%5B%22%27%5D%3F%5B%5E%22%27%5D*%25b%2F+language%3AShell&type=code> The actual numbers vary a lot depending on query accuracy. Because a regex is no replacement for a shell language parser, it cannot match all syntactically valid use cases, even with a carefully crafted regex. -- Léa Gris
Re: string substitution broken since 5.2
On 03/11/2022 at 19:50, Chet Ramey wrote: The option is enabled by default. If you want to restore the previous behavior, add `shopt -u patsub_replacement'. Having it enabled by default is not good, because it introduces side-effects for existing scripts. The shell has historically perpetuated legacy features to preserve the function of no-longer-maintained systems and their associated scripts. Are there enough reasons to break this trend and stop preserving backward compatibility with older scripts, by enabling new features that can affect the behaviour of previous code with side-effects? -- Léa Gris
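A minimal sketch of the behaviour change in question, as I understand it (bash 5.2 expands an unquoted & in the replacement string to the matched text unless the shopt is unset):

s=hello
echo "${s/hello/[&]}"         # bash 5.2 default: prints [hello]
shopt -u patsub_replacement
echo "${s/hello/[&]}"         # previous behaviour: prints [&]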
Re: declare -x non-exportable variable types
On 25/02/2022 at 16:49, Chet Ramey wrote: You can't export array variables, period. You can't export attributes, only variable names and values. You still can't export attributes. There is no way to export attributes. Chet, I heard you, I understood it, and I knew it before, while I was writing my message, and still now. It feels like you were either in a bad mood or that I didn't manage to express my remarks and thoughts as clearly as I would have liked. I'm sorry for what happened and I didn't expect to receive such a tense reaction. Now that you and I are, were and still are (I reassure you) in absolute agreement with "Bash variable attributes and/or arrays are incompatible with environment variables" (undisputed fact)... Is it possible that, if these variables are passed explicitly as environment variables with -x or export:

- Either Bash returns an error, because "variable flags are incompatible with the environment, and it's a mistake to export Bash variables with flags", rather than having different behaviours (pass value, nothing, name...) based on the original Bash variable flags/type?
- Or Bash could still "convert" the value as it does now, but in a more consistent way?
- Or the documentation could contain an explicit description of what happens when one tries to export a Bash variable with flags/types (even just documented as: "The result of exporting Bash variables with attributes is indeterminate"), which might be an appropriate clarification?

-- Léa Gris
declare -x non-exportable variable types
declare -x of a variable with an unexportable flag/type is handled quite inconsistently:

$ unset int_value && declare -ix int_value=42 && bash -c 'declare -p int_value'
declare -x int_value="42"
$ unset array && declare -ax array=(foo bar) && bash -c 'declare -p array'
bash: line 1: declare: array: not found
$ unset assoc && declare -Ax assoc=([key1]=foo [key2]=bar) && bash -c 'declare -p assoc'
bash: line 1: declare: assoc: not found
$ unset upper && declare -ux upper='hello' && bash -c 'declare -p upper'
declare -x upper="HELLO"
$ unset lower && declare -lx lower='WORLD' && bash -c 'declare -p lower'
declare -x lower="world"
$ unset str && unset -n ref && declare str=hello && declare -nx ref=str && bash -c 'declare -p ref'
declare -x ref="str"

The inconsistency is that sometimes:
- It exports the variable with its translated value (integer to string, upper, lower).
- It exports nothing for an array or associative array, despite the fact that in Bash an array referenced without an index returns its first element, but not when exported.
- It exports the reference name of a nameref, despite the fact that it could export the value of the referee.

My stance on this is that there is no real consistent way to export variables with incompatible flags, and I wonder if, to be consistent, the declaration statement could error instead when the other variable flags are incompatible with export. For the built-in export or system export command, a consistent conversion of the value sounds like acceptable behavior:

export int_value # export with the string of the value
export array # export with the value of the first element
export assoc # export an empty value (same as referencing an associative array without a [key])
export upper; export lower # export with the converted value
export nameref # export with the value of the referee

This means that the export command would expand the variable value before exporting, but the declare, local and typeset statements would error if flags are incompatible with -x export. -- Léa Gris
Re: Long variable value get corrupted sometimes
On 16/02/2022 at 13:43, Greg Wooledge wrote:

text=$(cat /tmp/foo.txt; printf x)
text=${text%x}

or

read -r -d '' text
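A minimal sketch of why the extra printf x matters, assuming /tmp/foo.txt ends with newlines one wants to keep:

printf 'a\n\n\n' > /tmp/foo.txt
text=$(</tmp/foo.txt); echo "${#text}"              # 1: command substitution strips trailing newlines
text=$(cat /tmp/foo.txt; printf x); text=${text%x}
echo "${#text}"                                     # 4: trailing newlines preserved
IFS= read -r -d '' text < /tmp/foo.txt              # also 4; read returns non-zero here (no NUL delimiter)
echo "${#text}"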
Re: Interesting bug
On 12/02/2022 at 12:23, David Hobach wrote:

function tfunc {
local foo=
foo="$(testCode)" || {echo "foo";}
                     ^ Missing space after the opening brace

Because the code block is missing a space after the opening brace, the opening brace is ignored, but the following closing brace is parsed as the end of the tfunc function code block. The following statements are executed in the global scope, since the function definition block is closed by the faulty syntax above:

cat "$foo" || {
badCode
case $? in
*) exit 1
esac
}
}

The bug is in your code. https://shellcheck.net/ static analysis for shell scripts can help you spot such syntax errors. -- Léa Gris
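A minimal sketch of the corrected function, keeping the testCode and badCode names from the report; note the space after each opening brace and the semicolon or newline before each closing brace:

tfunc() {
    local foo=
    foo="$(testCode)" || { echo "foo"; }
    cat "$foo" || {
        badCode
        case $? in
            *) exit 1 ;;
        esac
    }
}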
empty nameref issue
This is OK, as here:

# declare a nameref without an assigned value
declare -n ref
# use the nameref as an iterator
for ref in foo bar baz
do
  ref='Hello the World!'
done
declare -p foo bar baz

although declare -n ref leaves the ref variable in a limbo state with no value:

$ unset -n ref; declare -n ref; printf %q\\n "$ref"
''

This strangely causes no error and returns an empty string.

$ unset -n ref; declare -n ref; printf %q\\n "${!ref}"
bash: ref: invalid indirect expansion

But trying to expand the value of the nameref itself causes this "invalid indirect expansion" error. This seems counter-intuitive. Intuitively:
- Expanding the value of a nameref without an assigned value should return an empty string.
- Expanding the value of the referred variable, whose nameref is undefined, would return some error.

The other related issue is that this limbo empty state of a nameref is only obtained with an initial `declare -n ref`. There is no way to later clear or assign an empty string to a nameref, short of destroying and recreating the nameref with:

unset -n ref
declare -n ref

-- Léa Gris
Re: Arbitrary command execution in shell - by design!
On 30/10/2021 at 07:41, L A Walsh wrote: On 2021/10/29 12:33, Greg Wooledge wrote: On Fri, Oct 29, 2021 at 11:59:02AM -0700, L A Walsh wrote: How much lameness Chet has introduced into bash to accommodate the wrong users. This is quite unfair. Huh? It's true--look at how functions have to be stored in the environment because someone was able to hack "their own system" where they already have unrestricted shell access. ... Expect to see more of those misplaced security rants now that Bash has become popular with git-bash and bash in Windows environments. Indeed, if a bash script can be abused, it is neither bash's fault nor the script's fault, but that of the system's security policies granting those privileges to the user running the script. -- Léa Gris
Re: Arbitrary command execution in shell - by design!
On 29/10/2021 at 21:33, Greg Wooledge wrote: Making bash *less horrible* to use for programming purposes doesn't qualify as "lameness" in my book. Even if it does "enable" people to use shells for unsuited purposes, I'd still much rather have indexed and associative arrays (bash) than not have them at all (sh). There are several *suitable* tasks for which they are immensely useful. So many good words, Greg. And many thanks to you for your valuable Wiki. Thank you for giving credit where credit is due. If you think there might be security concerns, beyond genuine human errors, this is a red flag to avoid shell scripts and find a more suitable language/tool for the task. -- Léa Gris
Re: Arbitrary command execution from test on a quoted string
On 29/10/2021 at 00:29, Greg Wooledge wrote: On Thu, Oct 28, 2021 at 08:33:22PM +, elettrino via Bug reports for the GNU Bourne Again SHell wrote:

user@machine:~$ USER_INPUT='x[$(id>&2)]'
user@machine:~$ test -v "$USER_INPUT"
uid=1519(user) gid=1519(user) groups=1519(user),100(users)
user@machine:~$

Whoo. This uses a feature that was introduced in bash 4.2. It doesn't cause code injection in bash 4.2, though. It *does* cause code injection in bash 4.3 through 5.1. Adding it to my wiki page. A safe way to replace test -v "$USER_INPUT" would be test "${USER_INPUT@Q}", but it is not backward-compatible with older bash versions. Alternatively, declare -p USER_INPUT >/dev/null 2>&1 will work with much older bash versions. Any other ways which are less bulky and/or more backward-compatible? -- Léa Gris
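A minimal sketch of another common mitigation, assuming the goal is to keep test -v but never hand it untrusted text: validate that the input is a plain variable name first.

if [[ $USER_INPUT =~ ^[A-Za-z_][A-Za-z_0-9]*$ ]] && test -v "$USER_INPUT"; then
  echo "the variable named by USER_INPUT is set"
fi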
declare does not always set variable flags prior to assignment
Found out that the declare statement does not properly set all variable flags before assigning values:

unset arr
declare -i -a arr=(1 2 3)
declare -p arr
declare -ai arr=([0]="1" [1]="2" [2]="3")

this is ok

declare +i -a arr=(hello world)
declare -p arr
declare -a arr=([0]="0" [1]="0")

this is not ok, as the arr assignment was handled as integers. Same issue with an associative array:

unset assoc
declare -i -A assoc=([foo]=1 [bar]=2 [baz]=3)
declare -p assoc
declare -Ai assoc=([foo]="1" [bar]="2" [baz]="3" )
declare +i -A assoc=([yeet]=hello)
declare -p assoc
declare -A assoc=([yeet]="0" )

-- Léa Gris
Re: Request change to output of declare -f funcname
On 02/10/2021 at 18:45, Greg Wooledge wrote: On Sat, Oct 02, 2021 at 06:06:32PM +0200, Léa Gris wrote: Better illustrated, here is how newlines are discarded:

$ sudo bash -c 'echo hello
echo world'
hello
world
$ sudo -i bash -c 'echo hello
echo world'
helloecho world

OK, that's news to me. But that looks like a bug in sudo. Asking bash to change its behavior to work around a bug in some other program seems like misdirected effort. It is not wholly misdirected, since Bash already adds a semicolon after statements where it is optional because they are followed by newlines. Adding a semicolon after the last statement of a command group would not hurt in any circumstance. Now, while reworking the output of declare -f, there could be an option to produce the most compact output, without newlines and without indentation, for the purpose of serializing function declarations, similarly to how declare -p already serializes variable declarations as a compact one-liner, even for arrays. As a corollary, declare -p could have an indented, expanded output that would be useful for arrays, with one element per line rather than space-delimited elements. These changes to the serialization of declare -f and -p would not affect backward or forward compatibility between serialization types. They would make the declare -f output more resilient to newline removal and would allow both compact and indented, verbose formatted outputs to coexist.
Re: Request change to output of declare -f funcname
On 02/10/2021 at 15:09, Greg Wooledge wrote: On Sat, Oct 02, 2021 at 01:41:35PM +0200, Léa Gris wrote:

$ declare -f hello
hello ()
{
echo 'hello';
echo 'world'
}

The issue is that in some circumstances, newline characters may be handled as spaces, making the function declaration invalid. Can you show an example where the output of declare -f is invalid?

hello (){ echo 'hello';echo 'world';}
LC_MESSAGES=C sudo bash -c "$(declare -f hello);hello"
hello
world
LC_MESSAGES=C sudo -i bash -c "$(declare -f hello);hello"
bash: -c: line 2: syntax error: unexpected end of file

Better illustrated, here is how newlines are discarded:

$ sudo bash -c 'echo hello
echo world'
hello
world
$ sudo -i bash -c 'echo hello
echo world'
helloecho world

Or:

$ sudo bash -c "printf %q\\\n \"$(declare -f hello)\""
$'hello () \n{ \necho \'hello you the\';\necho \'world\'\n}'
sudo -i bash -c "printf %q\\\n \"$(declare -f hello)\""
hello\ \(\)\ \{\ \ \ \ \ echo\ \'hello\ you\ \ the\'\;\ \ \ \ echo\ \'world\'\}
Request change to output of declare -f funcname
Currently, declare -f funcname prints the function source ending the last statement/command with a newline only, omitting the semicolon. Let's declare a hello function:

$ hello (){ echo 'hello';echo 'world';}

and now see how it is expanded with:

$ declare -f hello
hello ()
{
echo 'hello';
echo 'world'
}

The issue is that in some circumstances, newline characters may be handled as spaces, making the function declaration invalid. It would be nice if the last semicolon were added, even though it is optional, the same way it is added between both echo statements. Some way to expand the function source in a compact one-line form would be nice as well. It could be a shopt switch or a declare switch; I'd be in favour of a shopt switch, though, as declare is already clogged with many different options. This shopt switch could also control the verbosity/indentation for both the output of declare -p and declare -f. -- Léa Gris
Re: efficient way to use matched string in variable substitution
On 24/08/2021 at 15:09, Mike Jonkmans wrote: This seems to be the fastest:

f12 () { [[ "$1" =~ ${1//?/(.)} ]]; local arr=( "${BASH_REMATCH[@]:1}" ); }

time for ((i=1; i<=1; i++)); do f0 682390; done
real 0m0,296s
user 0m0,296s
sys 0m0,000s

Awesome Mike, would you like to add this answer to SO? It would be very useful there; but I don't want to be wrongly credited for this smart implementation.

time for ((i=1; i<=1; i++)); do f12 682390; done
real 0m0.223s
user 0m0.223s
sys 0m0.000s

Made it into a fancy utility function:

string2array() {
  # Splits the string's characters into the array
  # $1: The input string
  # $2: The output array name
  [[ "$1" =~ ${1//?/(.)} ]]
  # shellcheck disable=SC2178 # shellcheck broken nameref type check
  local -n arr="$2"
  # shellcheck disable=SC2034 # shellcheck broken nameref usage check
  arr=("${BASH_REMATCH[@]:1}")
}

-- Léa Gris
Re: efficient way to use matched string in variable substitution
On 24/08/2021 at 14:06, Greg Wooledge wrote: unicorn:~$ f6() { local i n=${#1} arr; for ((i=0; i See my featured version to also capture spaces and newlines: https://stackoverflow.com/a/68907322/7939871 -- Léa Gris
Re: efficient way to use matched string in variable substitution
On 23/08/2021 at 21:41, L A Walsh wrote: On 2021/08/23 12:10, Greg Wooledge wrote: On Mon, Aug 23, 2021 at 11:36:52AM -0700, L A Walsh wrote: Starting with a number N, is there an easy way to print its digits into an array?

n=988421
# Need extglob for the replacement pattern
shopt -s extglob
# Split string characters into array
IFS=' ' read -r -a array <<<"${n//?()/ }"
# Debug print array
declare -p array

-- Léa Gris
Re: @K transformation
On 21/08/2021 at 00:59, Greg Wooledge wrote: The fact that "${a[@]@K}" expands to a single word is surprising to me. I know someone else already mentioned it in this thread (sorry, I forgot who it was), but it would be nice if there were a similar one that gave a list of multiple words.

unicorn:~$ printf '<%s> ' "${a[@]@Q}"; echo
<'1'> <'2'> <'3'>
unicorn:~$ printf '<%s> ' "${a[@]@U}"; echo
<1> <2> <3>
unicorn:~$ printf '<%s> ' "${a[@]@L}"; echo
<1> <2> <3>
unicorn:~$ printf '<%s> ' "${a[@]@E}"; echo
<1> <2> <3>
unicorn:~$ printf '<%s> ' "${a[@]@K}"; echo
<0 "1" 1 "2" 2 "3">

It really sticks out. Not all expansion transformers return the same number of arguments when expanding an array. Let's test with this script:

#!/usr/bin/env bash

oper=(U u L Q E P A K a)
arr=(foo bar baz qux)

mapfile -t expansions < <(
  printf '"${arr[@]@%s}"\n' "${oper[@]}"
)

for exp in "${expansions[@]}"; do
  eval "set -- $exp"
  if [ $# -gt 1 ]; then
    printf '%s expands into %d arguments:\n' "$exp" "$#"
    for ((i = 1; $#; i++)); do
      printf '%d.\t%s\n' $i "$1"
      shift
    done
  else
    printf '%s expands into a single argument:\n1.\t%s\n' "$exp" "$1"
  fi
  printf \\n
done

The output is:

"${arr[@]@U}" expands into 4 arguments:
1. FOO
2. BAR
3. BAZ
4. QUX

"${arr[@]@u}" expands into 4 arguments:
1. Foo
2. Bar
3. Baz
4. Qux

"${arr[@]@L}" expands into 4 arguments:
1. foo
2. bar
3. baz
4. qux

"${arr[@]@Q}" expands into 4 arguments:
1. 'foo'
2. 'bar'
3. 'baz'
4. 'qux'

"${arr[@]@E}" expands into 4 arguments:
1. foo
2. bar
3. baz
4. qux

"${arr[@]@P}" expands into 4 arguments:
1. foo
2. bar
3. baz
4. qux

"${arr[@]@A}" expands into 3 arguments:
1. declare
2. -a
3. arr=([0]="foo" [1]="bar" [2]="baz" [3]="qux")

"${arr[@]@K}" expands into a single argument:
1. 0 "foo" 1 "bar" 2 "baz" 3 "qux"

"${arr[@]@a}" expands into 4 arguments:
1. a
2. a
3. a
4. a

Now @K would have been useful if it expanded into individual arguments for each entry rather than an eval expression. The @A transformer suffers from the same weirdness, expanding into an eval expression that actually duplicates the feature of declare -p. The @a transformer expands the attributes of the container array for each element, which is just as strange, because individual array elements don't have attributes of their own. -- Léa Gris
Re: @K transformation
On 21/08/2021 at 00:06, Chet Ramey wrote: On 8/19/21 6:37 AM, Léa Gris wrote:

#!/usr/bin/env bash
declare -A assoc=(
  [P]=piano
  [TB]='foldable table'
  ['CH AIR']=chair
)
options=("${assoc[@]@K}")

The best way to clone an associative array is:

declare -A options
eval options=\( "${assoc[@]@K}" \)

The quoting @K performs is eval-safe. Although I was not attempting to clone the associative array, but to turn it into a flat array with interleaved keys and values, each as its own element:

options=( key value next key next value ...k ...v )

-- Léa Gris
Re: EPOCHREALTIME
On 19/08/2021 at 16:41, Eli Schwartz wrote: On 8/19/21 9:41 AM, Léa Gris wrote: The error occurs, one would imagine, during the "convert string to float" stage, after parsing argv or forking to bc or whatever, but *before* passing it as an argument to printf(3). Here, bash is just doing good error checking -- if you used strtof("3.14159265358979323844", NULL) under a fr_FR.UTF-8 locale, it would silently drop everything after the ".", and you would "successfully" print 3,, but bash reports an error message. A programming language shall distinguish between display format and data format. Locale settings are for the display format and shall not interfere with argument parsing, which is the data format, or it creates such data portability issues. This is exactly how I read the note from the POSIX documentation: https://pubs.opengroup.org/onlinepubs/9699919799/utilities/bc.html#tag_20_09_16 The bc utility always uses the ( '.' ) character to represent a radix point, regardless of any decimal-point character specified as part of the current locale. In languages like C or awk, the character is used in program source, so it can be portable and unambiguous, while the locale-specific character is used in input and output. Because there is no distinction between source and input in bc, this arrangement would not be possible. Using the locale-specific character in bc's input would introduce ambiguities into the language. Especially: "In languages like C or awk, the character is used in program source, so it can be portable and unambiguous." printf arguments are program source, even if the argument comes from a variable. All things considered, if you are using floating-point numbers in a shell script, you are clearly not using the right tool for the job, but sometimes options are limited or not under your control. Having a feature implemented in such a way *that it cannot be used reliably or requires heavy work-arounds* (especially if you need to both process floating-point data in a portable format and display it in the locale format)… is just calling for frustration and sorry errors. For the record:

ash -c 'LC_ALL=fr_FR.utf8; printf "Pi: %2.4f\\n" "$(echo "4*a(1)" | bc -l)"'
Pi: 3.1416

bash -c 'LC_ALL=fr_FR.utf8; printf "Pi: %2.4f\\n" "$(echo "4*a(1)" | bc -l)"'
bash: line 1: printf: 3.14159265358979323844: invalid number
Pi: 3,

dash -c 'LC_ALL=fr_FR.utf8; printf "Pi: %2.4f\\n" "$(echo "4*a(1)" | bc -l)"'
Pi: 3.1416

ksh -c 'LC_ALL=fr_FR.utf8; printf "Pi: %2.4f\\n" "$(echo "4*a(1)" | bc -l)"'
Pi: ksh: printf: 3.14159265358979323844: arithmetic syntax error
ksh: printf: 3.14159265358979323844: arithmetic syntax error
ksh: printf: warning: invalid argument of type f
3,

mksh -c 'LC_ALL=fr_FR.utf8; printf "Pi: %2.4f\\n" "$(echo "4*a(1)" | bc -l)"'
Pi: 3,1416

zsh -c 'LC_ALL=fr_FR.utf8; printf "Pi: %2.4f\\n" "$(echo "4*a(1)" | bc -l)"'
Pi: 3,1416

-- Léa Gris
Re: EPOCHREALTIME
On 19/08/2021 at 15:10, hancooper wrote: Thusly, EPOCHREALTIME should not be made to depend on the locale. I have seen many workarounds that complicate rather than simplify something that should be straightforward and germane to direct numeric computation. I agree 100%. It is as frustrating as the printf argument format being dependent on locale settings. This will fail because of the questionable design decision of having a mutable argument format:

LC_NUMERIC=fr_FR.UTF-8; printf 'Pi: %2.4f\n' "$(bc -l <<<'4*a(1)')"

Note how the format indicator still uses a dot, but the argument format's decimal separator is that of the system's locale. Imagine if C++ or Java had methods with different signatures depending on the system locale. You would cry foul. But for Bash, it was decided it was all fine. -- Léa Gris
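A hedged workaround for that specific call, forcing the C numeric locale only for the printf invocation so the dot produced by bc is always accepted:

LC_NUMERIC=C printf 'Pi: %2.4f\n' "$(bc -l <<<'4*a(1)')"   # Pi: 3.1416 under any locale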
Re: EPOCHREALTIME
On 19/08/2021 at 14:43, hancooper via Bug reports for the GNU Bourne Again SHell wrote: Have been using $EPOCHREALTIME but can see that the output is as follows: 1629376497,853634. The utilisation of the comma `,` is very inconvenient for those who want to do time computations. A change towards a period `.` would be the proper way to display the variable. Furthermore, one gets nanosecond precision when using date, but such precision is not possible with EPOCHREALTIME. Thusly, the improvement to nanosecond precision is desirable so as to match the capability of date. Felicitations Han

(LC_NUMERIC=C; echo "$EPOCHREALTIME")

It will use a dot. -- Léa Gris
Re: @K transformation
On 19/08/2021 at 12:09, Koichi Murase wrote:

$ printf '<%s>\n' "${A[@]@A}"
<-A>

The problem of ${A[@]@A} is that it is not so useful when one wants to define a clone associative array with a different name but with the same contents as A. Instead, using ${A[@]@K}, one could do

$ declare -A "B=(${A[@]@K})"

to clone the associative array. It is even possible to save the contents of an associative array in an indexed array as

$ declare -a "saved=(${A[@]@K})"

Hmm..., but for this case, one could actually simply store the contents in a scalar:

$ saved="${A[*]@K}"

The current implementation of @K is pretty much useless for the dialog use case:

#!/usr/bin/env bash

declare -A assoc=(
  [P]=piano
  [TB]='foldable table'
  ['CH AIR']=chair
)

options=("${assoc[@]@K}")
#typeset -p options
#exit

choice="$(
  dialog \
    --backtitle "Test" \
    --title "Test" \
    --menu "Test" \
    0 0 4 \
    "${options[@]}" \
    2>&1 >/dev/tty
)"

In line with "${array[@]}" expanding as multiple arguments, "${assoc[@]@K}" should have expanded into arguments rather than expanding quotes as string content. This would have made the @K expansion usable for dialog. Currently the associative array's key/value pairs need to be added to the options array in a for loop, as:

options=()
for k in "${!assoc[@]}"; do
  options+=("$k" "${assoc[$k]}")
done

-- Léa Gris
feature request array expansion of keys values pairs and for k v loop
Many times I see a use case for expanding both keys and values of an array, and every time it requires a loop. The typical use case is with the dialog command, but there are other use cases where having both key and value expanded would save a loop. Example:

array=([1]=apple [3]=banana [2]=orange)
for k in "${!array[@]}"; do
  v="${array[k]}"
  printf '%s %s ' "$k" "$v"
done
printf \\n

I'd like some syntax to expand both keys and values in a single expansion. Something like an at sign or another symbol meaning both are expanded:

# Expand key/value pairs as distinct arguments
printf '%s ' "${@array[@]}"
printf \\n

or

# Expand key/value pairs as an IFS-joined string
printf %s\\n "${@array[*]}"

Consequently it could allow expanding the for loop with:

for k v in "${@array[@]}"; do
  printf 'Key=%s\tValue=%s\n' "$k" "$v"
done

Although the for loop in this specific case would not be needed, as it could be expanded in one go as:

printf 'Key=%s\tValue=%s\n' "${@array[@]}"

But I figure there are other use cases where iterating key and value would be a QOL improvement over indexing and assigning the value with a statement within the loop. Obviously it would fit equally well with associative arrays. -- Léa Gris
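A minimal sketch of a helper that achieves the interleaved expansion today (the kv_pairs name is purely illustrative):

kv_pairs() {
  # $1: source array or associative array name, $2: destination array name
  local -n __src=$1 __dst=$2
  local k
  __dst=()
  for k in "${!__src[@]}"; do
    __dst+=("$k" "${__src[$k]}")
  done
}

array=([1]=apple [3]=banana [2]=orange)
kv_pairs array pairs
printf 'Key=%s\tValue=%s\n' "${pairs[@]}"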
Re: Issue declaring an array via a variable name
On 15/08/2021 at 02:45, Hunter Wittenborn wrote:

- declare -g "${variable}"=("world" "me")
- declare -g ${variable}=("world" "me")
- declare -g "hello"=("world" "me")

These are invalid because a quoted word like "${variable}" or "hello" followed by =("world" "me") is not parsed as an array assignment. If you want to dynamically compose the declare arguments, the whole argument needs to be either a single string or a literal variable=value(s). So these work, because the argument is a string:

declare -g "$variable=( \"world\" \"me\" )"
declare -g "$variable"'=( "world" "me" )'

Be cautious with dynamic declare statements, because it is as insecure as eval: it will execute statements contained in variables. -- Léa Gris
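A hedged alternative sketch that avoids evaluating text from a variable altogether, using a nameref (the target name should still be validated against a known-good pattern; __target is an illustrative name):

variable=hello
if [[ $variable =~ ^[A-Za-z_][A-Za-z_0-9]*$ ]]; then
  declare -n __target=$variable
  __target=("world" "me")
fi
declare -p hello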
Re: gettext feature request
On 13/08/2021 at 18:14, Jean-Jacques Brucker wrote: On 12/08/2021 at 16:29, Chet Ramey wrote: On 8/11/21 6:35 PM, Jean-Jacques Brucker wrote: Thanks a lot Chet. I still think it would have been more consistent to extend the $'...' syntax. Why would it be more consistent to add translation to a quoting syntax that didn't have it than to post-process translations in a different way? Chet, I'm probably less familiar with the history of shells and bash than you (and others on this mailing list), but it seems to me that: First were the strings: * '...' which did not expand anything * "..." which expanded the syntaxes $VAR and the unfortunate `...` [1] Then the syntax $"..." was introduced, then the syntax $'...' (bash 2.X). As the character '$' means "interprets (synonym: translates) what follows", it seems to me quite consistent that $"..." means "translates the entire following string". Conversely, the syntax $'...' seems to me much less consistent. (Reminder: C predates shells, and bash is itself written in C.) However, the "C-string" feature is very useful (and nowadays probably more used than the translation feature). In absolute terms, if one day we would list all the historical design errors, dare to break some compatibilities, and manage to establish new shell standards (I'm probably dreaming ... but do we ever know?). Then we could have: I would have really loved it if Bash expanded translations with its built-in printf, using a `-g` flag for gettext and positional argument support, instead of expanding string literal translations. It would have allowed something like:

```sh
#!/usr/bin/env bash

# -g Gettext translation of Hello World into variable my_string
printf -v my_string -g 'Hello World!'

# Printout using the gettext-translated format string
printf -g 'String %1$q has %2$d characters\n' "$my_string" "${#my_string}"
```

Using this German language file:

```po
#: a.sh:4
msgid "Hello World!"
msgstr "Hallo Welt!"

#: a.sh:7
msgid "String %1$q has %2$d characters\n"
msgstr "%2$d Zeichen lang ist die Zeichenkette %1$q\n"
```

In default English, the output would be:

```none
String Hello\ World\! has 12 characters
```

In German, the output would be:

```none
11 Zeichen lang ist die Zeichenkette Hallo\ Welt\!
```

-- Léa Gris
Re: gettext feature request
On 24/07/2021 at 20:48, Chet Ramey wrote: On 7/24/21 10:35 AM, Jean-Jacques Brucker wrote: Planning to use the $"string" feature in my bash code made me think too much: https://github.com/foopgp/bash-libs/tree/main/i18n ...what I would really *love* to see in bash is a $'string' feature which doesn't parse any «`» or «$» characters. So you want a translation feature without any further interpretation? Or one just without command substitution? What about an option to enable this variant behavior? What do you think would be a suitable name? It'd be perfectly fine if the non-interpreted locale string were reserved for the format string of the built-in printf, and supported positional arguments. Example:

declare -- str='Hello World!'
declare -i num=42
printf $"string: %1$s, int %2$d\n" "$str" $num

Unchanged gettext string syntax $"string", but without variable and sub-shell expansion, and with %n$ positional marker support when it is used as a printf format string. It would fix the security risks of variable expansion in gettext strings. It would not overlap with the $'c-style string' syntax. It would maintain variable expansion outside the printf format string context, to maintain backward compatibility. -- Léa Gris
Re: Crash on large brace expansion
On 15/07/2021 at 21:23, Greg Wooledge wrote: On Thu, Jul 15, 2021 at 05:28:04PM +0200, Léa Gris wrote: On 15/07/2021 at 16:36, Gabríel Arthúr Pétursson wrote: Hi all, Executing the following results in a fierce crash: $ bash -c '{0..255}.{0..255}.{0..255}.{0..255}' Brace expansion expands all the values in memory. Here you are actually telling Bash to expand 256⁴ or 4294967296, 42 Billion entries. 4.2 billion, but who's counting? :-) Ah yes, my billions are off by tenfold. Here is a single-loop version to absolve me of my arithmetical overstatement:

a=0
while [ $a -lt 4294967296 ]; do
  printf '%d.%d.%d.%d\n' \
    $((a >> 24)) $((a >> 16 & 255)) $((a >> 8 & 255)) $((a & 255))
  a=$((a + 1))
done

-- Léa Gris
Re: Crash on large brace expansion
On 15/07/2021 at 16:36, Gabríel Arthúr Pétursson wrote: Hi all, Executing the following results in a fierce crash: $ bash -c '{0..255}.{0..255}.{0..255}.{0..255}' Brace expansion expands all the values in memory. Here you are actually telling Bash to expand 256⁴ or 4294967296 42 Billion entries. {0..255}.{0..255}.{0..255}.{0..255} expands the whole IPv4 address space. It cannot fit in any current home PC's memory. The crash is due to running out of memory and is wholly expected. Use index loops instead:

for ((a=0; a<=255; a++)); do
  for ((b=0; b<=255; b++)); do
    for ((c=0; c<=255; c++)); do
      for ((d=0; d<=255; d++)); do
        printf '%d.%d.%d.%d\n' "$a" "$b" "$c" "$d"
      done
    done
  done
done

-- Léa Gris
Re: [PATCH] Prefer non-gender specific pronouns
On 07/06/2021 at 14:25, Dima Pasechnik wrote: This forum is technical, not political! Technical decisions might easily have political consequences, you cannot just separate the two. You are turning it backward. This proposal is 100% political and 0% technical. The current patches are nice, so I praise the very positive contributors implementing it with clever avoidance of cleaving cancel-culture custom gender-inclusive grammar. I would not have been that clever. I just let you know I have seen this live a year ago, when cancel-culture advocates infiltrated the board of the French non-profit ISP organization of which I am a member, and managed to get approval for an inclusive rewrite of the association's statutes, with a significant loss of legibility, and clearly for crassly political reasons. This is the main reason I react so strongly and remain very vigilant. This is what is happening here. Now they just ask for mundane changes, and the patch will probably be accepted. They will come again and ask for inclusive grammar, then come again and ask for removal of anything that is even remotely related to slavery (example: master-slave database replication), racial discrimination (please remove blacklist and replace it with exclusion-list), sexual assault, or anything even remotely related. They will not give up until everything is expunged of every trace of anything, until total destruction, or what's left is a bland ghost with nothing left of history and everyone has fought each other. Then they will vanish and destroy another community, another history. -- Léa Gris
Re: Prefer non-gender specific pronouns
On 06/06/2021 at 16:34, Oğuz wrote: Then there is no need to change anything. Exactly. As a woman, I take no offense when documentation illustrates a fictive male character (and as I will illustrate below, in French pronouns agree in gender and number with the object). I am not offended by the wording of the current English Bash documentation either. I am more annoyed by the over-abundance of children's stories in which women are depicted as 1950's good, dedicated and submissive housewives, cooking dinner and taking care of kids. But seriously, the few Bash manual sentences giving a male gender to the illustrative user character are light-years away from a worthy concern to me. I even predict this would get fixed by consensus when more women are involved in IT. Interestingly for the story: in the 1960's and 1970's, when we were more widely seen as housewives, we were more represented in IT, science and engineering overall than today, now that gender equity and equality are accepted modern standards. And no, I can't believe rewriting the Bash documentation to be gender neutral is a good thing, or that it can contribute to evening the balance of gender representation in IT either. What word(s) are used in translations of the manual into languages other than English? Do similar problems exist? In mine, no. Turkish has only one pronoun for male, female, and inanimate. In mine, possessive pronouns are gendered to the possessed target. Example with a current Bash documentation passage that has been subject to these gender-neutral changes, translated into French:

> in a non-writable directory other than his home directory after login,
dans un répertoire autre que son répertoire HOME après connexion,
                              ^^^

The French "son" (his) is masculine because "répertoire" (directory) is masculine in French. The user character's gender is not even mentioned. -- Léa Gris
Re: [PATCH] Prefer non-gender specific pronouns
On 06/06/2021 at 06:35, G. Branden Robinson wrote: Here you go, if you're inclined. Minimally invasive and decidedly non-revolutionary in terms of English lexicon. Your careful patch not using custom grammar is admirable. Although I remain alarmed, because this is work to obey a demand from cancel-culture proponents. With all your clever carefulness in patching this, it remains a rewrite of history motivated by political reasons from a lobbying group of people spreading their damaging delusions everywhere. -- Léa Gris
Re: Prefer non-gender specific pronouns
On 06/06/2021 at 11:33, Ilkka Virta wrote: In fact, that generic 'they' is so common and accepted, that you just used it yourself in the part I quoted above. Either you're acting in bad faith, or you're so confused by your gender-neutral delusion that you don't remember that in normal people's grammar, "they" is a plural pronoun. -- Léa Gris
Re: Prefer non-gender specific pronouns
On 05/06/2021 at 18:47, John Passaro wrote: I can see a couple reasons why it would be a good thing, and in the con column only "I personally don't have time to go through the manual and make these changes". but I'd happily upvote a patch from somebody that does. I can see so many reasons why it would be a bad thing to let cancel-culture adherents slip in here, rewriting the bash documentation with their custom grammar. These insidious gender-neutral rewrites of manuals, books and so many reference texts, movies, songs, are a cancer spreading everywhere. I can't see anyone decently taking offense because the fictive user character in a Bash manual that was written 30+ years ago has a defined, non-neutral gender. With less heated statements, I argue against changing the bash documentation to use gender-neutral wording because:

- This change brings no additional value to the meaning, quality, or intelligibility of the Bash documentation.
- Moreover, it could likely cause damage, by offending people who don't recognize gender-neutral wording applied to characters, even fictive rhetorical ones.

Please do not mix divisive, clearly still non-consensual, heavily debated and politically loaded changes into technical projects like Bash. -- Léa Gris
Feature request: index of index
Currently, to get an index out of an array's list of indexes (for example, the first or last index), you need an intermediary index array:

#!/usr/bin/env bash

declare -a array=([5]=hello [11]=world [42]=here)
declare -ai indexes=("${!array[@]}")
declare -i first_index=${indexes[*]:0:1}
declare -i last_index=${indexes[*]: -1:1}
declare -p array indexes first_index last_index

Which prints:

declare -a array=([5]="hello" [11]="world" [42]="here")
declare -ai indexes=([0]="5" [1]="11" [2]="42")
declare -i first_index="5"
declare -i last_index="42"

It would be convenient to be able to index directly with this syntax:

declare -i first_index=${!array[@]:0:1}
declare -i last_index=${!array[@]: -1:1}

-- Léa Gris
Bash development roadmap
I see periodic feature requests for Bash on this list. They most often miss some background plan or justification beyond a QOL improvement for script coders, so that they can have the same features as in other languages. I always thought that Bash's DNA was tied to orchestrating actions from commands, forming a shell around a Unix kernel, with character streams forming the data backbone for Bash to interact with system commands. The strength of Bash is that it is exactly fit as a shell, and enough POSIX and even Bash versions are available in a wide variety of system environments that, if you write Bash scripts avoiding cutting-edge features, or limit yourself to features with a decade of maturity, you can expect mostly flawless compatibility. So I have concerns about all the requests for implementing new features, especially those that would turn Bash into an all-purpose programming language, losing ground from its designed role in Unix systems. I'd like to see more mid-term or long-term plans to keep Bash relevant ten years from now, with systems evolving toward more event-driven operations and processes exchanging more structured data streams such as JSON or XML. Bash can barely deal with these formats through external parsers, and then struggles to work with the data because it has no built-in internal hierarchical structures for it. I remember Chet mentioning future modules to deal with various formats, and it feels like a sound approach to deal with these structured data formats, but Bash will still struggle to use them with only arrays and associative arrays. I also wonder if it is even realistic to expect Bash to evolve to keep up with more modern data structures and formats, when other scripting languages like Python are increasingly occupying the place of shell scripts. I can see how Perl lost ground: it kept gaining features while losing relevance. What is on Bash's roadmap for the next ten or twenty years for it to remain a relevant tool, or is it going to maintain the status quo as a fall-back scripting tool you expect to be on every system, even older and no-longer-supported ones? -- Léa Gris
Re: Undocumented feature: Unnamed fifo '<(:)'
Le 08/04/2021 à 20:30, felix écrivait : You could have a look: https://f-hauri.ch/vrac/mandelbrot_backgndBc.sh.txt https://f-hauri.ch/vrac/mandelbrot_backgndBc_4macOs.sh.txt Have a look at my POSIX shell version, which is even slightly faster: https://gist.github.com/leagris/59e1b7e72462024b278652696f375e71 There is no need for Bash-specific features; although coproc and other fancy constructs can help, they lack portability. -- Léa Gris
Re: is it a bug that \e's dont get escaped in declare -p output
Le 17/03/2021 à 21:13, Alex fxmbsw7 Ratchev écrivait : hm at least now we know array declare -p formatting would work in workarounds, good to .. :) Instead of:

var=$'1\e[G\e[K2' ; declare -p var

do:

var=$'1\e[G\e[K2' ; printf 'declare -- %s\n' "${var@A}"

And if you want a human-readable dumpvars, then code it. Here it is:

#!/usr/bin/env bash

dump_vars () {
  local ___varname
  for ___varname; do
    local ___varflags=${!___varname@a}
    ___varflags=${___varflags:--}
    if [[ "$___varflags" =~ [Aa] ]]; then
      declare -p "$___varname"
    else
      printf 'declare -%s %s=%q\n' "$___varflags" "$___varname" "${!___varname}"
    fi
  done
}

escapestring=$'1\e[G\e[K2'
escapearray=($'new \n line' $'and \e esc')
declare -A assocarray=([$'1\e[G\e[K2']=$'and \e esc' [$'new \n line']=$'1\e[G\e[K2')
declare -i intvar=42
declare -ai intarray=(-42 666 555)
declare -Ai intassoc=([foo]=123 [$'1\e[G\e[K2']=456 [bar]=789)

dump_vars escapestring escapearray assocarray intvar intarray intassoc

Output:

declare -- escapestring=$'1\E[G\E[K2'
declare -a escapearray=([0]=$'new \n line' [1]=$'and \E esc')
declare -A assocarray=([$'1\E[G\E[K2']=$'and \E esc' [$'new \n line']=$'1\E[G\E[K2' )
declare -i intvar=42
declare -ai intarray=([0]="-42" [1]="666" [2]="555")
declare -Ai intassoc=([$'1\E[G\E[K2']="456" [foo]="123" [bar]="789" )
-- Léa Gris
Re: is it a bug that \e's dont get escaped in declare -p output
Le 17/03/2021 à 20:58, Ilkka Virta écrivait : On Wed, Mar 17, 2021 at 8:26 PM Greg Wooledge wrote: I thought, for a moment, that bash already used $'...' quoting for newlines, but it turns out that's false. At least for declare -p. It would be nice if it did, though. Newlines, carriage returns, escape characters, etc. It does in some cases: $ a=($'new \n line' $'and \e esc'); declare -p a declare -a a=([0]=$'new \n line' [1]=$'and \E esc') I'd expect bash to escape any character not in the POSIX [:print:] class. This is just a quality-of-life improvement, though, since the produced declare statement works as-is. It is not user-friendly, but it is code-friendly and compact if you use declare -p to save variables into a file for a later include (see the sketch below). -- Léa Gris
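The save/restore pattern mentioned above, spelled out (a sketch; the file name is arbitrary):

# save selected variables in a re-sourceable form
declare -p foo bar baz > savedvars.sh

# later, or in another script: restore them exactly,
# non-printable characters and all
. ./savedvars.sh
declare -p foo bar baz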
Re: Changing the way bash expands associative array subscripts
Le 16/03/2021 à 01:12, Chet Ramey écrivait : What do folks think? Please excuse my profanity of mentioning zsh on this list, but I really think feature and behavior convergence can benefit end users in multiple ways, especially when expanding beyond the POSIX sh scope. What direction has zsh taken with expanding associative array indexes? I don't know zsh well, but I remember that in the past some features have been aligned across shells, and it has proven especially helpful for clarity, as an end user, when dealing with non-POSIX shell features. zsh also provides a nice feature to iterate both keys and values in a single loop:

for key value in "${(kv)assoc_array}"; do
  printf '%s -> %s\n' "$key" "$value"
done

which is quite a bit nicer than doing the same in bash:

for key in "${!assoc_array[@]}"; do
  value="${assoc_array[$key]}"
  printf '%s -> %s\n' "$key" "$value"
done

I'd love to see more convergence where this is possible. -- Léa Gris
Re: Behaviour of test -v with assoc array and quote character in key
Le 23/02/2021 à 13:55, Greg Wooledge écrivait : On Tue, Feb 23, 2021 at 12:17:10PM +0100, Alex fxmbsw7 Ratchev wrote: what, sorry, mailing stuff isnt much clear to me, ... its not possible to have a var=\'\] ; assoc[$var] ? It should work for simple assignment and retrieval. You need to add quotes for [[ -v 'assoc[$var]' ]] to work, and math contexts are dodgy as hell. No amount of quoting will make (( 'assoc[$var]'++ )) work. If you want to increment the value of this array element, you'll need to retrieve it into a temporary string variable, increment that, and then copy it back into the array. It works, if you declare your associative array with the A and i flags:

( LANG=C
unset var assoc
var=\'\]
declare -Ai assoc
assoc[$var]=1
assoc[$var]+=1
((assoc['$var']++))
typeset -p assoc
)

Output:

declare -Ai assoc=(["']"]="3" )
-- Léa Gris
Assign read-only variables return code not usable inline
https://ideone.com/iw2pSv

#!/usr/bin/env bash
declare -r r
r=2 || exit 2
echo 'still there with $?='$?', after: r=2 || exit 2'
if ! r='hello'; then exit; fi
echo "still there with \$?=$?, after: if ! r='hello'; then exit; fi"
typeset -p r

Output:

still there with $?=1, after: r=2 || exit 2
still there with $?=1, after: if ! r='hello'; then exit; fi
declare -r r
-- Léa Gris
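For completeness, a sketch (not from the original report) of one way to make the failure testable inline: attempt the assignment in a subshell, so the aborted command list stays confined there and the parent can branch on the subshell's exit status:

declare -r r
# the subshell dies on the read-only assignment error with a non-zero
# status; the parent shell sees that status like any other command's
if ! (r=2) 2>/dev/null; then
  echo "r is read-only, handle it here"
fi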
Re: Behaviour of test -v with assoc array and quote character in key
Le 23/02/2021 à 12:17, Alex fxmbsw7 Ratchev écrivait : what, sorry, mailing stuff isnt much clear to me, ... its not possible to have a var=\'\] ; assoc[$var] ? You can, if assoc is declared as an associative array first:

$ (LANG=C; unset var assoc; var=\'\]; assoc[$var]=hello; typeset -p assoc)
bash: ']: syntax error: operand expected (error token is "']")

but:

$ (LANG=C; unset var assoc; var=\'\]; declare -A assoc; assoc[$var]=hello; typeset -p assoc)
declare -A assoc=(["']"]="hello" )
-- Léa Gris
Re: export loses error
Le 09/02/2021 à 01:05, Lawrence Velázquez écrivait : On Feb 8, 2021, at 5:29 PM, gregrwm wrote:

$ export vim=$HOME/.GVim-v8.2.2451.glibc2.15-x86_64.AppImage
$
$ vimV=$($vim --version)||echo handle error here    #without export, error is captured
fuse: failed to exec fusermount: No such file or directory
open dir error: No such file or directory
handle error here
$
$ export vimV=$($vim --version)||echo handle error here    #with export, error is lost
fuse: failed to exec fusermount: No such file or directory
open dir error: No such file or directory
$

Not a bug. Kind of annoying, sure, but not a bug. https://mywiki.wooledge.org/BashPitfalls#local_var.3D.24.28cmd.29 vq Or this: https://github.com/koalaman/shellcheck/wiki/SC2155 A helpful shell linter would have warned about this pitfall; see the sketch below. -- Léa Gris
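The usual fix suggested by that ShellCheck page, applied to the quoted example (a sketch reusing the names from the quoted mail):

# assign first, so $? reflects the command substitution's status,
# then export the already-set variable separately
vimV=$($vim --version) || echo handle error here
export vimV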
Re: RFE: new syntax for command substitution to keep trailing newlines?
Le 27/01/2021 à 21:21, Alex fxmbsw7 Ratchev écrivait : as well as one newline instead of x, it cuts afik _one_ ending nrwline, not all It removes every trailing newline:

a=$(printf $'hello\n\n\n'); declare -p a

Now if you want to preserve all the newlines, you can append an ASCII SUB character (Ctrl+Z, the old DOS EOF) that is unlikely to be part of a legit string, and strip it afterwards:

a=$(printf $'hello\n\n\n\32'); a=${a%$'\32'}; declare -p a
-- Léa Gris
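Another common idiom for the same problem, not from the mail above: append an ordinary sentinel character inside the substitution itself, then strip it:

# the trailing "x" protects the newlines from command substitution
# trimming; remove it once the value is captured
a=$(printf 'hello\n\n\n'; printf x)
a=${a%x}
declare -p a    # declare -- a=$'hello\n\n\n'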
Re: Feature Request: scanf-like parsing
Le 25/01/2021 à 18:58, Oğuz écrivait : I rarely use eval, but when I do, it works just fine. I can't really agree with the sentiment of your article, sorry. I use eval only when I am sure there is no other, safer way. Meaning, if I can achieve the same with declare foo="dynamically generated content", I opt for the latter, like mapping a whole file into an associative array in one go instead of growing the array line by line in a loop. Because I know it can be done with a declare, and the content is generated with printf %q or jq's @sh filter. Anyway, on systems where a sufficiently advanced Bash is available for these nice advanced bashisms, there are probably better all-purpose scripting languages available as well, like Python or Perl. For obsolete closed-source systems where I have no control over whether a decent Bash version is available, these nice Bash features are not an option anyway, and sometimes, yes, eval will help. -- Léa Gris
Re: Feature Request: scanf-like parsing
Le 22/01/2021 à 19:18, Léa Gris écrivait : Now replace the () with {}, replace the implicit temporary fifo with an implicit temporary file; then you have the same feature but without spawning a sub-shell. Instead of:

tempfile=$(mktemp) || exit 1
trap 'rm -f "$tempfile"' EXIT
compgen -u >"$tempfile"
mapfile -t users <"$tempfile"

You'd have:

mapfile -t users < <{ compgen -u;}
-- Léa Gris
Re: Feature Request: scanf-like parsing
Le 22/01/2021 à 18:55, Greg Wooledge écrivait : It's not hard at all. People just have a deep, almost religious, loathing against creating their own temp files. And yet, these same people are *perfectly* happy if some tool creates a temp file for them -- as long as they don't have to see any of the details or do any of the work. Because handling a temp file properly is already a bit of additional implementation that involves using trap for cleanup. Handling multiple temp files with proper cleanup becomes a complex, unreliable task if implemented with Bash script commands. So if syntactic sugar does it all properly, safely and with proper cleanup, then it is good. You could always implement the equivalent of:

read -r variable < <(command)

by creating a temporary fifo:

fifo=$(mktemp --dry-run)
trap 'rm -f "$fifo"' EXIT
mkfifo "$fifo" || exit 1
compgen -u >"$fifo" &
mapfile -t users <"$fifo"

But I really prefer this way because it is safer and much more reliable:

mapfile -t users < <(compgen -u)

Now replace the () with {}, replace the implicit temporary fifo with an implicit temporary file; then you have the same feature but without spawning a sub-shell. -- Léa Gris
Re: Feature Request: scanf-like parsing
Le 22/01/2021 à 16:11, pepa65 écrivait : I still love the idea of ">>>variable" to direct output into a variable without needing a subshell I'd prefer a syntax based on:

command-list > >(command-list)
command-list < <(command-list)

But with curly braces for the no-sub-shell version:

command-list > >{ command-list;}
command-list < <{ command-list;}

Which could be used to assign the output of a command to a variable without a sub-shell. Example:

mapfile -t users < <{ compgen -u; }   # No sub-shell

Implementation-wise it could be a temporary file in /tmp or /dev/shm rather than a temporary named fifo as with < <(:). -- Léa Gris
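For what it's worth, a partial approximation already exists today (this is not the proposed <{ } syntax, just an existing option): with the lastpipe shell option, the last command of a pipeline runs in the current shell, so a builtin like mapfile can populate a variable directly. It still forks for the producer; only the consumer stays in the current shell:

#!/usr/bin/env bash
shopt -s lastpipe             # effective when job control is off (scripts)
compgen -u | mapfile -t users
declare -p users              # populated in the current shell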
Re: ${a:=b} expands to `b', not `a''s value
Le 20/01/2021 à 13:51, Andreas Schwab écrivait : But that's not the value of the parameter. It is not, since at this point the parameter has no value; the expansion expands to the word after :=, which is the uppercase X here. -- Léa Gris
Re: ${a:=b} expands to `b', not `a''s value
Le 20/01/2021 à 12:16, Oğuz écrivait :

$ declare -l a
$ echo "${a:=X} $a"
X x

This doesn't jive with what the manual says. `-l`: When the variable is assigned a value, all upper-case characters are converted to lower-case. `:=`: If parameter is unset or null, the expansion of word is assigned to parameter. The assignment part: a is assigned X, but it is stored as x in a. The value of parameter is then substituted. The expansion part: the value X is expanded but is not affected by the lowercase transformation flag of a, so it remains X. The fact that an expansion also assigns a value is a questionable design choice, though. If I had to use this, I would just silence the expansion by passing it as an argument to the dummy true or : command: : ${a:=X} Is this a bug or am I missing something here? Then likely not. -- Léa Gris
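The : ${a:=X} idiom mentioned above, spelled out as a tiny sketch:

declare -l a
: "${a:=X}"     # the assignment happens; : discards the expanded value
echo "$a"       # -> x  (lower-cased when it was assigned)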
Re: echo $'\0' >a does not write the nul byte
Le 17/01/2021 à 22:02, Chet Ramey écrivait : On 1/17/21 3:05 PM, h...@artax.karlin.mff.cuni.cz wrote: Description: Command echo $'\0' |od -c writes 000 \n 001 in contrast to echo $'\1' |od -c 000 001 \n 002 The nul byte is not echoed by $'\0'. Repeat-By: echo $'\0' |od -c echo $'\1' |od -c Shell builtin commands obey the same argv conventions as any other Unix program: arguments are null-terminated strings. That means that echo $'\0' echo '' echo "" are all equivalent, and none of them will output a null byte. The only way to output a null byte with a shell built-in is:

printf '\0'

or, non-portably:

echo -ne '\0'

This works because the `\0' here is not a null byte in the argument but an escape sequence interpreted internally by the command, which then writes the null byte itself. -- Léa Gris
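A quick way to verify it (sketch):

printf '\0' | od -An -c    # ->  \0   (one NUL byte, no newline)
echo $'\0'  | od -An -c    # ->  \n   (empty argument, just echo's newline)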
Re: V+=1 doesn't work if V is a reference to an integer array element
Le 14/01/2021 à 16:15, Chet Ramey écrivait : On 1/13/21 4:40 PM, Léa Gris wrote: Le 13/01/2021 à 22:13, Chet Ramey écrivait : The `-i' forces arithmetic evaluation, which makes this expand to the equivalent of `declare -n b=1'. That's an invalid name for a nameref, which you'd see if you used `declare -n b=1' instead. The assignment error causes `declare' to return a non-zero status, but the error message checks the original value, not the result of arithmetic evaluation, and the `a[0]' is valid. I guess I never thought people would try to use integer variables as namerefs, since nameref values can't be digits. Sorry Chet, this does not make sense to me. There is no error message. OK, let's be clear that this sequence of commands from your message is what we're talking about:

unset a b
declare -ai a=(1)
declare -in b="a[0]"
declare -p a b

I get this result:

declare -ai a=([0]="1")
x4: line 4: declare: b: not found

and I explained why there is no variable `b'. Not the same issue, and not exactly the same sequence of commands:

unset a; unset -n b
declare -ai a=(1)
declare -n b="a[0]"   # No i flag at that point for b
typeset -ni b         # Now b also has the int flag alongside the nameref
b+=2                  # This is only allowed if b has the int flag
declare -p a b

As Oğuz wrote: Seems to be another bug. `bind_variable_internal' calls both `assign_array_element' and `make_variable_value' with ASS_APPEND in flags ( https://git.savannah.gnu.org/cgit/bash.git/tree/variables.c#n3140 ), the latter returns 6 (4 + 2), the former adds it to a[0]'s value. -- Léa Gris
Re: V+=1 doesn't work if V is a reference to an integer array element
Le 13/01/2021 à 22:13, Chet Ramey écrivait : The `-i' forces arithmetic evaluation, which makes this expand to the equivalent of `declare -n b=1'. That's an invalid name for a nameref, which you'd see if you used `declare -n b=1' instead. The assignment error causes `declare' to return a non-zero status, but the error message checks the original value, not the result of arithmetic evaluation, and the `a[0]' is valid. I guess I never thought people would try to use integer variables as namerefs, since nameref values can't be digits. Sorry Chet, this does not make sense to me. There is no error message. It looks like you replied about Greg's post. I still wonder how, when performing b+=number, the referenced value of a[0] gets multiplied by 2 with the number then added back to a[0]. If a[0] is 4 and you do b+=2, then a[0] is updated to 10. The integer attribute on the nameref variable just makes the arithmetic assignment var+=value work without causing an error. I don't understand why this exact sequence:

unset a; unset -n b; \
declare -ai a=('4'); \
declare -n b='a[0]'; \
typeset -ni b; \
declare -p a b; \
b+=2; \
declare -p a b

Produces this output for the a array:

declare -ai a=([0]="4")
declare -in b="a[0]"
declare -ai a=([0]="10")
declare -in b="a[0]"

I set the integer attribute on the nameref b after it has been assigned the "a[0]" string, so it keeps its string nameref value but is now also treated as an integer. -- Léa Gris
Re: V+=1 doesn't work if V is a reference to an integer array element
Le 13/01/2021 à 18:49, Greg Wooledge écrivait : On Wed, Jan 13, 2021 at 07:00:42PM +0200, Oğuz wrote: $ declare -n b=a[0] I can't see any documentation that supports the idea that this should be allowed in the first place. -n Give each name the nameref attribute, making it a name reference to another variable. That other variable is defined by the value of name. All references, assignments, and attribute modifications to name, except those using or changing the -n attribute itself, are performed on the variable referenced by name's value. In at least three places there, it says that the "target" of the name reference is a variable. a[0] isn't a variable. Even more weirdness:

echo $BASH_VERSION
5.0.17(1)-release
unset a b
declare -ai a=(1)
declare -in b="a[0]"
declare -p a b
declare -ai a=([0]="1")
declare -in b="a[0]"
b+=1
declare -p a
declare -ai a=([0]="3")
b+=1
declare -p a
declare -ai a=([0]="7")
b+=0
declare -p a
declare -ai a=([0]="14")
-- Léa Gris
Re: Associative array keys are not reusable in (( command
Le 11/01/2021 à 15:42, Léa Gris écrivait : Declare an integer associative array instead:

echo "$BASH_VERSION"
5.0.17(1)-release
declare -Ai aa
x='y[$(date >&2)0]'
aa[$x]=1
declare -p aa
declare -Ai aa=(["y[\$(date >&2)0]"]="1" )
aa[$x]+=1
declare -p aa
declare -Ai aa=(["y[\$(date >&2)0]"]="2" )

And I forgot one more safe use within an arithmetic expression:

safe_arith_index=${x@Q}
declare -p safe_arith_index
declare -- safe_arith_index="'y[\$(date >&2)0]'"
(( aa[$safe_arith_index]++ ))
declare -p aa
declare -Ai aa=(["y[\$(date >&2)0]"]="3" )
-- Léa Gris
Re: Associative array keys are not reusable in (( command
Declare an integer associative array instead:

echo "$BASH_VERSION"
5.0.17(1)-release
declare -Ai aa
x='y[$(date >&2)0]'
aa[$x]=1
declare -p aa
declare -Ai aa=(["y[\$(date >&2)0]"]="1" )
aa[$x]+=1
declare -p aa
declare -Ai aa=(["y[\$(date >&2)0]"]="2" )
-- Léa Gris
Re: declare accept + before =
Le 08/01/2021 à 15:26, Chet Ramey écrivait : On 1/8/21 8:45 AM, Léa Gris wrote: Just curious why it accepts a + before = Try it with a variable that already has a value. OMG I feel stupid now! -- Léa Gris
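For the record, what Chet's hint demonstrates (a quick sketch):

a=hello
declare a+=' world'   # += appends to the existing value
typeset -p a          # declare -- a="hello world"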
declare accept + before =
Just curious why it accepts a + before =:

unset a; declare a+=hello; typeset -p a

bash, version 5.0.17(1)-release wrote:
> declare -- a="hello"
-- Léa Gris
Re: New Feature Request
Le 04/01/2021 à 14:14, Greg Wooledge écrivait : It should be noted that var=$(< file) trims the newlines ending the last line, because it is exactly a sub-shell shortcut syntax for var=$(cat file). Sub-shell $(command list) output is always trimmed. -- Léa Gris
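A short illustration of that trimming (sketch; the file name is made up):

printf 'data\n\n\n' > /tmp/demo.txt
var=$(< /tmp/demo.txt)           # same trimming as var=$(cat /tmp/demo.txt)
printf '%s' "$var" | od -An -c   # -> d a t a   (no trailing newlines left)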
Re: 'Find' inside loop buggy, incorrectly make up dir.
Le 01/01/2021 à 12:12, Budi écrivait : find or bash or anything has made a bit buggy behavior Budi, this is the wrong list for your question because it is 100% unrelated to Bash bugs. Try searching the web, where there are plenty of resources to help with using the find utility to copy or move files. -- Léa Gris
Re: Checking executability for asynchronous commands
On 28/12/2020 at 21:18, Eli Schwartz wrote: if cmd=$(type -P foo) && test -x "$foo"; then foo & else echo "error: foo could not be found or is not executable" fi When you handle such logic within Bash, you have already lost to a race condition: foo may be readable and executable when you test it, but no longer by the time it is actually executed. Bash is full of race conditions, because Bash was never meant as a general-purpose language. Bash is a good command sequencer. Now, if people forget to use wait PID after launching a background command, then shame on them. -- Léa Gris
Re: New Feature Request
On 27/12/2020 at 19:30, Saint Michael wrote: Yes, superglobal is great. Example, from the manual: " Shared Memory Shared memory allows one or more processes to communicate via memory that appears in all of their virtual address spaces. The pages of the virtual memory are referenced by page table entries in each of the sharing processes' page tables. It does not have to be at the same address in all of the processes' virtual memory. As with all System V IPC objects, access to shared memory areas is controlled via keys and access rights checking. Once the memory is being shared, there are no checks on how the processes are using it. They must rely on other mechanisms, for example System V semaphores, to synchronize access to the memory." We could allow only strings or more complex objects, but using bash-language only, an internal mechanism, and also we need to define a semaphore. Is it doable? Maybe you should consider that Bash or the shell is not the right tool for your needs. Bash/shell is designed to sequence commands and programs in a very linear way, and it only deals with character streams. If you need to manipulate complex objects or work with shared resources, Bash is a very bad choice. If you want to stay with scripting, as you already mentioned Python: Python is a far better choice for the features and requirements you describe. -- Léa Gris
Re: [feature request] add associative array support to mapfile
Le 17/12/2020 à 17:58, Léa Gris écrivait : Maybe (it is clearly open to discussion), the associative array mapping could be allowed to: - skip blank/comment-only lines (1) - allow a key without a value - optionally trim a trailing comment (1) I forgot about Bash's built-in regex engine: - allow using a Bash regex with 2 capture groups for key and value. -- Léa Gris
Re: [feature request] add associative array support to mapfile
Le 17/12/2020 à 17:35, Chet Ramey écrivait : One problem I see with it is that it assumes the introduction of word splitting into mapfile, with the burden on the user to ensure that there are exactly two resultant fields. It is just meant to be a replacement for looping on IFS='=' read -r -d '' k v. So yes, it has only two fields per record/line regardless: the first IFS character delimits the key from the value, and the value is allowed to contain any character of IFS except newline, unless the record delimiter is assigned a different character. The incentive behind this feature request is to allow mapping key-value pairs from a character stream/file safely into a Bash associative array with a single built-in command. These days, people are going out of their way to parse key-value pairs out of foreign config files or result streams from jq-processed JSON. Maybe (it is clearly open to discussion) the associative array mapping could be allowed to: - skip blank/comment-only lines (1) - allow a key without a value - optionally trim a trailing comment (1) Anyway, I think some of the above extra parsing would be out of the scope of mapfile, and could be achieved by filtering the stream with sed or another text processor. Optionally, though this would go beyond current Bash shell design IMHO, expansion/filter modules could be allowed to plug into the mapfile command for processing specific file syntaxes, be it JSON, CSV, config file, ini file..., something like: mapfile -m module_name -A assoc_array
Re: mapfile associative array shim Was: Re: [feature request] add associative array support to mapfile
Shorty shim:

! { mapfile -A _ </dev/null; } 2>/dev/null && { mapfile(){ local k v d=$'\n';local -n A=${*: -1:1};[ ${*: -2:1} = -A ]&&{ [ ${*:1:1} = -d ]&&d=${*:2:1};while read -rd "$d" k v||[[ -n $k && -n $v ]];do A[$k]=$v;done;:;}||command mapfile "$@";};}
-- Léa Gris
mapfile associative array shim Was: Re: [feature request] add associative array support to mapfile
Here is an include shim that adds associative array support to mapfile, or falls back to the regular mapfile if the builtin already has that support:

#!/usr/bin/env bash
# mapfile_assoc_shim.bash
! { mapfile -A a </dev/null; } 2>/dev/null && {
  mapfile () {
    local k v d=$'\n'
    local -n A=${*: -1:1}
    [[ ${*: -2:1} = -A ]] && {
      [[ ${*:1:1} = -d ]] && d=${*:2:1}
      while read -r -d "$d" k v || [[ -n $k && -n $v ]]
      do
        A[$k]=$v
      done
      true
    } || command mapfile "$@"
  }
}

Example usage:

. mapfile_assoc_shim.bash
IFS='=' mapfile -d '' -A assoc_null < <(kv_null_stream)
-- Léa Gris
[feature request] add associative array support to mapfile
Here is a Bash model of how associative array support could be added to mapfile, replacing the interpreted while-read loop and its logic with a much more efficient improved mapfile built-in:

#!/usr/bin/env bash

kv_cr_stream () {
  printf 'key1=value1\nkey2=value=2\nkey 3=value3\n'
}

kv_null_stream () {
  printf 'key1=value1\0key2=value=2\0key\n3=value3\0'
}

declare -A assoc_cr=()

# Load associative array from line records
# Proposed new feature:
# IFS='=' mapfile -A assoc_cr
while IFS='=' read -r k v || [[ -n $k && -n $v ]]; do
  assoc_cr[$k]=$v
done < <(kv_cr_stream)

declare -p assoc_cr

declare -A assoc_null=()

# Load associative array from null delimited records
# Proposed new feature:
# IFS='=' mapfile -d '' -A assoc_null
while IFS='=' read -r -d '' k v || [[ -n $k && -n $v ]]; do
  assoc_null[$k]=$v
done < <(kv_null_stream)

declare -p assoc_null

Expected output:

declare -A assoc_cr=([key2]="value=2" [key1]="value1" ["key 3"]="value3" )
declare -A assoc_null=([key2]="value=2" [key1]="value1" [$'key\n3']="value3" )
-- Léa Gris
mysteries with the placeholder variable _
# GNU bash, version 5.0.17(1)-release (x86_64-pc-linux-gnu)
bash -c 'unset _;_=42;echo $_;unset _;: $((_=666));echo "$_"'
666

# ksh version sh (AT&T Research) 93u+ 2012-08-01
ksh -c 'unset _;_=42;echo $_;unset _;: $((_=666));echo "$_"'
42
-3,02546243348e-123

# zsh 5.8 (x86_64-ubuntu-linux-gnu)
zsh -c 'unset _;_=42;echo $_;unset _;: $((_=666));echo "$_"'
666

# dash 0.5.10.2-7
dash -c 'unset _;_=42;echo $_;unset _;: $((_=666));echo "$_"'
42
666

It raises multiple observations: - I thought the placeholder variable _ was a sinkhole like /dev/null. It seems it can get assigned values within arithmetic expressions in bash, dash and zsh. - The weird behavior of ksh is out of scope here; it looks like any number that is not a power of 2 produces a strange output that is not even an integer. - Dash seems to handle _ as a regular variable and accepts assignment, which is probably conformant to the POSIX shell specification. -- Léa Gris
Re: No expansions performed while declaring an associative array using a list of keys and values
Le 11/12/2020 à 14:28, Oğuz écrivait : Nah, this doesn't work either. Would be really useful if it did though. $ declare -a foo=(1 2 ) $ declare -A assoc=("${foo[@]}" 3) $ declare -p assoc declare -A assoc=(["\"\${foo[@]}\""]="3" ) What would have been really useful is expanding mapfile to associative arrays with key${IFS}value records:

declare -A assoc
IFS='= ' mapfile -t assoc < file

which would give:

declare -A assoc=(["key1"]="value1" ["key2"]="value2" ["key3"]=$'{\n "otherkey": "othervalue"\n}' )
-- Léa Gris
Re: No expansions performed while declaring an associative array using a list of keys and values
Le 11/12/2020 à 13:08, Oğuz écrivait : I was trying the new features of bash 5.1 and came across this inconsistent behavior: $ foo='1 2' $ declare -A bar=($foo 3) $ declare -p bar declare -A bar=(["\$foo"]="3" ) $ $ bar+=($foo 3) $ declare -p bar declare -A bar=(["\$foo"]="3" ["1 2"]="3" ) Is there a particular reason to avoid performing expansions in `declare -A bar=($foo 3)'? Oğuz It looks coherent with other Bash-specific constructs, like the associative array key syntax, which does not split a key variable:

foo='1 2'
declare -A bar=([$foo]=3 )

Or double square bracket tests:

foo='1 2'
[[ $foo == '1 2' ]]

Did you try with?:

# possibly originating from read -a or mapfile
declare -a foo=(1 2 )
# Declare an associative array from the key-value array above
declare -A assoc=("${foo[@]}" 3)
-- Léa Gris
Variables declared in arithmetic expressions have no i flag
Should variables automatically created within arithmetic constructs have the integer flag implied?

unset x; ((x = 42)); typeset -p x
> declare -- x="42"

Should it be:

declare -i x="42"

Here are cases where that would make a difference:

unset x; ((x = 42)); x+=624; typeset -p x
> declare -- x="42624"

unset x; declare -i x; ((x = 42)); x+=624; typeset -p x
> declare -i x="666"
-- Léa Gris
No longer receiving nntp feed from here
Could someone look into why this is no longer connected to the bug-bash@gnu.org ML? -- Lea Gris
Re: use of set -e inside parenthesis and conditionnal second command
On 17/11/2020 at 16:00, Ilkka Virta wrote: Now, perhaps that could use a note explicitly saying this also means subshells, even though they may have set -e in effect independently of the main shell. The part explaining subshells (3.7.3 Command Execution Environment) could perhaps also use a mention of that caveat, since it does mention set -e already, though in the context of command substitution. So it is clearly not a bug. The shell's -e or bash's errexit option is a problematic tool with hard-to-understand side effects (see the sketch below). I can't think of a use case where a shell would appropriately exit outright without any cleanup or fallback. This feature deserves a bold disclaimer to think twice before using it rather than explicit error handling. -- Léa Gris
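A minimal sketch of the subshell caveat under discussion (assumed example, not taken from the thread):

#!/usr/bin/env bash
set -e
# Because the subshell is on the left of ||, errexit is ignored inside it,
# even though it sets -e itself: "still here" is printed, the subshell
# exits 0, and the || branch is never taken.
( set -e; false; echo "still here" ) || echo "handled"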
Re: [ping] declare -c still undocumented.
Le 13/11/2020 à 12:47, Chris Elvidge écrivait : But ${var^} still doesn't know that it should apply to the first alpha character in a string. Similar for , and ~. If the first character of the string is a punctuation character, e.g.(, it doesn't work (as I would like it to ). Well, you are diverging from the issue here. I suggest you create a distinct post to discuss your feature request. To Chet, The feature survived from Bash 4.0 to 5.0. Are you still uncertain about the declare -c attribute's status? Anyway, here is what I found in a conversation from 2009: Available here: <https://lists.gnu.org/archive/html/bug-bash/2009-03/msg00152.html> Re: "declare -c" not documented From: Chet Ramey Subject:Re: "declare -c" not documented Date: Thu, 19 Mar 2009 09:37:06 -0400 User-agent: Thunderbird 2.0.0.19 (Macintosh/20081209) Greg Wooledge wrote: The -c option for declare (new in bash 4.0) is not mentioned in either the man page or the "help declare" text. Correct. I'm not sure the capitalization feature will survive. If it makes it into the next version, I will add the documentation. -- ``The lyf so short, the craft so long to lerne.'' - Chaucer Chet Ramey, ITS, CWRU address@hiddenhttp://cnswww.cns.cwru.edu/~chet/ -- Léa Gris
[ping] declare -c still undocumented.
Necroposting for still valid issue: declare -c to capitalize first character of string in variable is still undocumented as of GNU bash, version 5.0.17(1)-release (x86_64-pc-linux-gnu) Happy 10 years 10 months anniversary to the issue: <https://lists.gnu.org/archive/html/bug-bash/2010-02/msg00074.html> On Fri, 12 Feb 2010 09:10:07 +0100, Mikael Fridh wrote: Configuration Information [Automatically generated, do not change]: Machine: x86_64 OS: linux-gnu Compiler: gcc Compilation CFLAGS: -DPROGRAM='bash' -DCONF_HOSTTYPE='x86_64' -DCONF_OSTYPE='linux-gnu' -DCONF_MACHTYPE='x86_64-pc-linux-gnu' -DCONF_VENDOR='pc' -DLOCALEDIR='/usr/share/locale' -DPACKAGE='bash' -DSHELL -DHAVE_CONFIG_H -I. -I../bash -I../bash/include -I../bash/lib -g -O2 -Wall uname output: Linux teheran 2.6.31-19-generic #56-Ubuntu SMP Thu Jan 28 02:39:34 UTC 2010 x86_64 GNU/Linux Machine Type: x86_64-pc-linux-gnu Bash Version: 4.0 Patch Level: 33 Release Status: release Description: declare: usage: declare [-aAfFilrtux] [-p] [name[=value] ...] address@hidden:~$ declare -c moo=moo; echo $moo Moo Repeat-By: run declare -c Fix: document in bash(1), document in usage:, add to builtin help. -- Léa Gris
Lack of documentation about mapfile callback
man bash.1
> When callback is evaluated, it is supplied the index of the next
> array element to be assigned and the line to be assigned to that
> element as additional arguments. callback is evaluated after the
> line is read but before the array element is assigned.

I cannot find a real-life implementation example of the mapfile callback that fits the implied scenario of this behavior of invoking the callback before the last array entry is assigned. What I figured out by experimentation is that, while the last element is not yet assigned to MAPFILE as seen from the callback context, the assignment is effective after the callback returns. Example:

- BEGIN BASH
#!/usr/bin/env bash

callback() {
  echo "Entering Callback"
  printf 'Next index is: %d\n' $1
  printf 'Next entry is: %q\n' "$2"
  printf 'MAPFILE size: %d\n' "${#MAPFILE[@]}"
  typeset -p MAPFILE
  echo "Exiting Callback"
}

mapfile -t -C callback -c 3 <<'EOF'
Entry0
Entry1
Entry2
Entry3
Entry4
Entry5
Entry6
Entry7
Entry8
EOF
- END BASH

And then the output:

- BEGIN OUTPUT
Entering Callback
Next index is: 2
Next entry is: Entry2
MAPFILE size: 2
declare -a MAPFILE=([0]="Entry0" [1]="Entry1")
Exiting Callback
Entering Callback
Next index is: 5
Next entry is: Entry5
MAPFILE size: 5
declare -a MAPFILE=([0]="Entry0" [1]="Entry1" [2]="Entry2" [3]="Entry3" [4]="Entry4")
Exiting Callback
Entering Callback
Next index is: 8
Next entry is: Entry8
MAPFILE size: 8
declare -a MAPFILE=([0]="Entry0" [1]="Entry1" [2]="Entry2" [3]="Entry3" [4]="Entry4" [5]="Entry5" [6]="Entry6" [7]="Entry7")
Exiting Callback
- END OUTPUT

It reveals the oddity of running the callback before the last assignment of the quantum: the first call to the callback sees a MAPFILE with only 2 new entries, while the next two calls each see 3 new entries. There must be a reason or an intended scenario for this implementation, but with so little documentation and no real-world usage example, it is unclear to me. -- Lea Gris
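For what it's worth, one plausible real-world use of the callback that is unaffected by this off-by-one view, since it only looks at the index argument (a sketch; the file name and quantum are made up):

#!/usr/bin/env bash
# report progress on stderr every 1000 lines while slurping a big file;
# $1 is the index of the next element, used here as a rough line counter
progress() { printf 'about %d lines read...\r' "$1" >&2; }
mapfile -t lines -C progress -c 1000 < ./bigfile.txt
printf '\ntotal: %d lines\n' "${#lines[@]}" >&2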
Re: Two states of empty arrays
Le 12/12/2019 à 20:13, Chet Ramey écrivait : >> # Empty array declared without parentheses >> unset myArr >> declare -a myArr >> typeset -p myArr >> echo "${#myArr[@]}" > > This is an unset variable with the array attribute; you have not assigned a value. >> # Empty array declared with parentheses >> unset myArr >> declare -a myArr=() > > This is an empty array variable; you have assigned a value. Thank you and Clint, it makes sense now. I was trying to use the -v test to detect when an array or associative array has been declared, though not necessarily assigned any key/value entries, so as not to error when Bash runs with -o nounset. Like here:

#!/usr/bin/bash
set -o nounset
myArr+=(["key"]="value")

ERR: line 3: key: unbound variable

I can test the type of myArr this way:

if [[ "$(typeset -p myArr 2>&1)" =~ ^declare\ -A ]]; then
  myArr+=(["key"]="value")
fi

But it looks sub-optimal to test the type and declaration of a variable this way. The -v test flag cannot be used because it requires the associative array to contain at least a [key]=value entry, as mentioned in the man bash.1: >-v varname > True if the shell variable varname is set (has been assigned a value). _has been assigned a value_ -- Lea Gris
Two states of empty arrays
Hello, Depending on how an empty array is declared, it is not stored with the same state.

# Empty array declared without parentheses
unset myArr
declare -a myArr
typeset -p myArr
echo "${#myArr[@]}"

output:

declare -a myArr
0

# Empty array declared with parentheses
unset myArr
declare -a myArr=()
typeset -p myArr
echo "${#myArr[@]}"

output:

declare -a myArr=()
0

What is the reason for having different states for empty arrays? -- Léa Gris
Associative array entries order differs from declaration
While dealing with getting the keys of arrays, I found out that associative array keys are not registered in the same order as declared:

#!/usr/bin/env bash

# Declare and populate an associative array a
unset a
declare -A a=( ["one"]="first" ["two"]="second" ["three"]="third" ["four"]="last" )
typeset -p a # show the actual order, which differs from the declaration order

# Show how the chaotic order affects iteration of the array
for v in "${a[@]}"; do
  echo "$v"
done

Output:

declare -A a=([two]="second" [three]="third" [four]="last" [one]="first" )
second
third
last
first

This behavior looks just wrong, and it is just the same if you build the array incrementally:

unset a; declare -A a; a=(["one"]="first"); a+=(["two"]="second"); a+=(["three"]="third"); a+=(["four"]="last"); typeset -p a

Is there a way to control the order of entries in an associative array? What rules apply to the order of entries? -- Léa Gris
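For readers hitting the same surprise: the storage order of an associative array is a hash-table implementation detail, so the usual workaround is to impose an order on the keys at iteration time (a sketch):

declare -A a=([one]=first [two]=second [three]=third [four]=last)
# iterate in sorted-key order (any external sort criterion works here)
mapfile -t sorted_keys < <(printf '%s\n' "${!a[@]}" | sort)
for k in "${sorted_keys[@]}"; do
  printf '%s -> %s\n' "$k" "${a[$k]}"
done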
errata: Explicit variables declaration statements do not resolve back-references from same statement
Found this strange behavior difference in Bash between explicit and implicit declarations of variables. An implicit variable declaration statement resolves back-references to variables from the same statement, whereas an explicit variable declaration statement does not resolve back-references to variables from the same statement. Illustration code:

==== snip code start
#!/usr/bin/env bash

unset a b c
printf $'\nExplicit declarations statements:\n'

printf $'\ntypeset -i a=2 b=$a c="$((a - 1))":\n=> '
typeset -i a=2 b=$a c="$((a - 1))"
printf 'a=%d b=%d c=%d\n' "${a}" "${b}" "${c}"

unset a b c
printf $'\ndeclare a=hello b=world c="$a $b":\n=> '
declare a='hello' b='world' c="${a} ${b}"
printf $"a='%s' b='%s' c='%s'\\n" "${a}" "${b}" "${c}"

unset a b c
printf $'\nImplicit declarations statements:\n'

a=2 b=$a c="$((a - 1))"
printf $'\na=2 b=$a c="$((a - 1))":\n=> '
printf 'a=%d b=%d c=%d\n' "${a}" "${b}" "${c}"

unset a b c
a='hello' b='world' c="${a} ${b}"
printf $'\na=hello b=world c="$a $b":\n=> '
printf $"a='%s' b='%s' c='%s'\\n" "$a" "$b" "$c"
==== snip code end

Output:

bash ./test_declare.sh

Explicit declarations statements:

typeset -i a=2 b=$a c="$((a - 1))":
=> a=2 b=0 c=-1

declare a=hello b=world c="$a $b":
=> a='hello' b='world' c=' '

Implicit declarations statements:

a=2 b=$a c="$((a - 1))":
=> a=2 b=2 c=1

a=hello b=world c="$a $b":
=> a='hello' b='world' c='hello world'

ksh93 resolves explicit back-references with typeset. -- Léa Gris
Explicit variables declaration statements do not resolve back-references from same statement
Found this strange behavior difference in Bash between explicit and implicit declarations of variables. An implicit variable declaration statement resolves back-references to variables from the same statement, whereas an explicit variable declaration statement does not resolve back-references to variables from the same statement. Illustration code:

==== snip code start
#!/usr/bin/env bash

unset a b c
printf $'\nExplicit declarations statements:\n'

printf $'\ntypeset -i a=2 b=$a c="$((a - 1))":\n=> '
typeset -i a=2 b=$a c="$((a - 1))"
printf 'a=%d b=%d c=%d\n' "${a}" "${b}" "${c}"

unset a b c
printf $'\ndeclare a=hello b=world c="$a $b":\n=> '
declare a='hello' b='world' c="${a} ${b}"
printf $"a='%s' b='%s' c='%s'\\n" "${a}" "${b}" "${c}"

unset a b c
printf $'\nImplicit declarations statements:\n'

a=2 b=$a c="$((a - 1))"
printf $'\na=2 b=$a c="$((a - 1))":\n=> '
printf 'a=%d b=%d c=%d\n' "${a}" "${b}" "${c}"

unset a b c
a='hello' b='world' c="${a} ${b}"
printf $'\na=hello b=world c="$a $b":\n=> '
printf $"a='%s' b='%s' c='%s'\\n" "$a" "$b" "$c"
==== snip code end

Output:

Explicit declarations statements:

typeset -i a=2 b=$a c="$((a - 1))":
=> a=2 b=2 c=1

declare a=hello b=world c="$a $b":
=> ./b.sh[10]: declare: not found [No such file or directory]
a='' b='' c=''

Implicit declarations statements:

a=2 b=$a c="$((a - 1))":
=> a=2 b=2 c=1

a=hello b=world c="$a $b":
=> a='hello' b='world' c='hello world'

ksh93 resolves explicit back-references with typeset. -- Léa Gris
Re: Unexpected result of array assignment
On 18/07/2019 14:12, Greg Wooledge wrote: On Thu, Jul 18, 2019 at 10:58:52AM +0200, Henning wrote: eval 'foo=(["key"]="'"${foo["key"]}"' value2")' If you just want to work around the bug, why not do it in the simplest way possible? foo["key"]+=" value2" Of course, you are right. It can also be done safely by storing the intermediate value:

declare -A foo
foo=(["key"]="value1")
declare -p foo
_v="${foo["key"]}"
declare -p _v
foo=(["key"]="${_v} value2")
declare -p foo

Still safer than eval, at the cost of an extra intermediary assignment. And it allows you to insert the old value of ${foo["key"]} anywhere in between; the += string concatenation would not work in that case. -- Léa Gris