I have several text files in which I have introduced shell variables ($VAR1 or $VAR2 for instance).
I would like to take those files (one by one) and save them in new files where all variables would have been replaced.
To do this, I used the following shell script (found on StackOverflow):
while read line
do
    eval echo "$line" >> destination.txt
done < "source.txt"
This works very well on very basic files.
But on more complex files, the "eval" command does too much:
Lines starting with "#" are skipped (eval treats them as comments)
Parsing XML files results in tons of errors (quotes, backticks and other special characters get interpreted by the shell)
Is there a better way to do it? (in shell script... I know this is easily done with Ant for instance)
Kind regards
This question is related to: linux, shell, string-interpolation
If you really only want to use bash (and sed), then I would go through each of your environment variables (as returned by set in POSIX mode) and build a bunch of -e 'regex' expressions for sed from that, terminated by a -e 's/\$[a-zA-Z_][a-zA-Z0-9_]*//g', then pass all that to sed.
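A rough sketch of that sed-based approach (my own illustration, not from the answer): it assumes bash and GNU sed, single-line variable values, and values that contain no '|', '&' or backslashes, and it does not guard against one variable name being a prefix of another.
args=()
while IFS='=' read -r name value; do
    # one pair of substitutions per defined variable, covering both ${NAME} and $NAME
    args+=(-e "s|\\\${$name}|$value|g" -e "s|\\\$$name|$value|g")
done < <(env)
# trailing expression blanks out any variable that was never defined
sed "${args[@]}" -e 's/\$[a-zA-Z_][a-zA-Z0-9_]*//g' source.txt > destination.txt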
Perl would do a nicer job though: you have access to the environment variables through the %ENV hash and you can do executable replacements (s///e), so a single pass can look up each matched variable name directly.
I know this topic is old, but I have a simpler working solution without exporting the variables. It can be a one-liner, but I prefer to split it using \ at the end of each line.
var1='myVar1' \
var2=2 \
var3=${var1} \
envsubst '$var1,$var3' < "source.txt" > "destination.txt"
#        ^^^^^^^^^^^^^   ^^^^^^^^^^^^   ^^^^^^^^^^^^^^^^^
#        which to replace    input           output
The variables need to be defined on the same line as the envsubst call for them to be treated as environment variables.
The '$var1,$var3' argument is optional and restricts the replacement to the specified variables only. Imagine an input file containing ${VARIABLE_USED_BY_JENKINS}, which should not be replaced.
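For illustration (my own example, not from the answer), if source.txt contained:
Hello ${var1}, you have ${var2} items.
Jenkins will fill in ${VARIABLE_USED_BY_JENKINS} later.
then the command above would substitute only $var1 (and $var3, where present), while ${var2} and the Jenkins placeholder pass through untouched:
Hello myVar1, you have ${var2} items.
Jenkins will fill in ${VARIABLE_USED_BY_JENKINS} later.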
Actually you need to change your read to read -r, which will make it ignore backslashes. Also, you should escape quotes and backslashes. So:
while read -r line; do
    line="${line//\\/\\\\}"   # escape backslashes first
    line="${line//\"/\\\"}"   # then double quotes
    line="${line//\`/\\\`}"   # then backticks
    eval echo "\"$line\""
done > destination.txt < source.txt
Still a terrible way to do expansion though.
If you want env variables to be replaced in your source files while keeping all of the non-env variables as they are, you can use the following command:
envsubst "$(printf '${%s} ' $(env | sed 's/=.*//'))" < source.txt > destination.txt
The syntax for replacing only specific variables is explained here. The command above uses a sub-shell to list all defined environment variables and then passes that list to envsubst as the set of variables to substitute.
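As a quick illustration of that sub-shell (my own example), env | sed 's/=.*//' prints just the variable names and the printf wraps each one in ${...}; on a machine where only HOME, PATH and NAME were set it would produce the argument:
${HOME} ${PATH} ${NAME}
which is exactly the SHELL-FORMAT string envsubst uses to decide what to substitute.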
So if there's a defined env variable called $NAME, and your source.txt file looks like this:
Hello $NAME
Your balance is 123 ($USD)
then the destination.txt will be:
Hello Arik
Your balance is 123 ($USD)
Notice that $NAME is replaced and $USD is left untouched.
In reference to answer 2, when discussing envsubst, you asked:
How can I make it work with the variables that are declared in my .sh script?
The answer is you simply need to export your variables before calling envsubst.
You can also limit the variable strings you want to replace in the input using the envsubst SHELL-FORMAT argument (avoiding the unintended replacement of a string in the input with a common shell variable value, e.g. $HOME).
For instance:
export VAR1='somevalue' VAR2='someothervalue'
MYVARS='$VAR1:$VAR2'
envsubst "$MYVARS" <source.txt >destination.txt
Will replace all instances of $VAR1 and $VAR2 (and only VAR1 and VAR2) in source.txt with 'somevalue' and 'someothervalue' respectively.
while IFS='=' read -r name value ; do
    # Print the line if the ${name} placeholder is found
    sed -n '/${'"${name}"'}/p' docker-compose.yml
    # Replace the ${name} placeholder with its value (in place)
    sed -i 's|${'"${name}"'}|'"${value}"'|' docker-compose.yml
done < <(env)
Note: the variable name or value should not contain "|", because it is used as the sed delimiter.
$ export MY_ENV_VAR=congratulation
Put $MY_ENV_VAR in in.txt (you can also use any other ENV variable defined by your system, like $TERM, $SHELL or $HOME on Linux), then run:
$ envsubst "`printf '${%s} ' $(sh -c "env|cut -d'=' -f1")`" < in.txt > out.txt
$ cat out.txt
and you should see "congratulation".
Export all the needed variables and then use a Perl one-liner:
TEXT=$(echo "$TEXT"|perl -wpne 's#\${?(\w+)}?# $ENV{$1} // $& #ge;')
This will replace all the ENV variables present in $TEXT with their actual values. Quotes are also preserved :)
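Applied to the question's files (my own adaptation of the same one-liner, not part of the original answer), it can also read a file directly and write a substituted copy:
perl -wpne 's#\${?(\w+)}?# $ENV{$1} // $& #ge;' source.txt > destination.txt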
Call the perl binary in per-line search-and-replace mode with in-place editing (the -pi), running the Perl code given by -e in the single quotes. That code iterates over the keys of the special %ENV hash, which holds the exported variable names as keys and the exported variable values as the keys' values, and for each key simply replaces any string of the form $<<key>> with its <<value>>.
perl -pi -e 'foreach $key(sort keys %ENV){ s/\$$key/$ENV{$key}/g}' file
Caveat: additional handling is required for cases in which two or more variable names start with the same string, since the shorter name may also match inside occurrences of the longer one ...
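One way to soften that caveat (my own sketch, not from the answer) is to try the longer names first and quote each key with \Q...\E so that any unusual characters in a name are treated literally:
perl -pi -e 'foreach $key (sort { length($b) <=> length($a) } keys %ENV) { s/\$\Q$key\E/$ENV{$key}/g }' file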
envsubst seems exactly like something I wanted to use, but the -v option surprised me a bit. While envsubst < template.txt was working fine, the same command with the -v option was not:
$ cat /etc/redhat-release
Red Hat Enterprise Linux Server release 7.1 (Maipo)
$ envsubst -V
envsubst (GNU gettext-runtime) 0.18.2
Copyright (C) 2003-2007 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Written by Bruno Haible.
As I wrote, this was not working:
$ envsubst -v < template.txt
envsubst: missing arguments
$ cat template.txt | envsubst -v
envsubst: missing arguments
I had to do this to make it work:
TEXT=`cat template.txt`; envsubst -v "$TEXT"
Maybe it helps someone.