[sed] Change multiple files

The following command correctly changes the contents of two files.

sed -i 's/abc/xyz/g' xaa1 xab1 

But what I need to do is change several such files dynamically, and I do not know the file names in advance. I want to write a command that will pick up all the files in the current directory matching xa* and have sed change their contents.

This question is related to sed.

The answers:


You can do it like this, where 'xxxx' is the text you search for and 'yyyy' is what it will be replaced with:

grep -Rn 'xxxx' /path | awk -F: '{print $1}' | xargs sed -i 's/xxxx/yyyy/'
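
A small variation on the same idea: grep's -l option prints only the names of the files that contain a match, which makes the awk step (and the duplicate file names it lets through) unnecessary. Using the same placeholder pattern and path:

# -R: recurse into /path, -l: print each matching file name once
grep -Rl 'xxxx' /path | xargs sed -i 's/xxxx/yyyy/'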

I'm using find for a similar task. It is quite simple: you have to pass its output to sed as the file arguments, like this:

sed -i 's/EXPRESSION/REPLACEMENT/g' `find -name "FILE.REGEX"`

This way you don't have to write complex loops, and it is simple to see which files you are going to change: just run find on its own before you run sed.
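
For example, using the xa* files from the question (the pattern is only illustrative), the dry run and the actual replacement look like this:

# Dry run: see exactly which files sed would receive
find . -name "xa*"

# Apply: the same find expression supplies the file list to sed
sed -i 's/abc/xyz/g' `find . -name "xa*"`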


If you are able to run a script, here is what I did for a similar situation:

Using a dictionary/hash map (associative array) and variables for the sed command, we can loop through the array to replace several strings. Including a wildcard in name_pattern allows replacing in place in all files matching that pattern (it could be something like name_pattern='File*.txt') in a specific directory (source_dir). All the changes are written to the logfile in destin_dir.

#!/bin/bash
source_dir=source_path
destin_dir=destin_path
logfile='sedOutput.txt'
name_pattern='File.txt'

echo "--Begin $(date)--" | tee -a "$destin_dir/$logfile"
echo "Source_DIR=$source_dir destin_DIR=$destin_dir"

# Each key is the string to search for; its value is the replacement
declare -A pairs=(
    ['WHAT1']='FOR1'
    ['OTHER_string_to replace']='string replaced'
)

for i in "${!pairs[@]}"; do
    j=${pairs[$i]}
    echo "[$i]=$j"
    replace_what=$i
    replace_for=$j
    echo " "
    echo "Replace: $replace_what for: $replace_for"
    # Replace in place in every file under source_dir matching the pattern
    # (the pattern is quoted so any wildcard is expanded by find, not the shell)
    find "$source_dir" -name "$name_pattern" | xargs sed -i "s/$replace_what/$replace_for/g"
    # Log every line that now contains the replacement; /dev/null forces grep
    # to print the file name even when only one file matches
    find "$source_dir" -name "$name_pattern" | xargs -I{} grep -n "$replace_for" {} /dev/null | tee -a "$destin_dir/$logfile"
done

echo " "
echo "----End $(date)---" | tee -a "$destin_dir/$logfile"

First, the pairs array is declared; each pair is a replacement rule, so WHAT1 will be replaced with FOR1, and OTHER_string_to replace will be replaced with string replaced, in the file File.txt. In the loop the array is read: the first member of each pair is retrieved as replace_what=$i and the second as replace_for=$j. The find command searches the directory for the file name (which may contain a wildcard) and sed -i replaces, in those same files, what was previously defined. Finally, I added a grep redirected to the logfile to record the changes made in the file(s).

This worked for me with GNU Bash 4.3 and sed 4.2.2, and is based on VasyaNovikov's answer to "Loop over tuples in bash".


Those commands won't work in the default sed that comes with Mac OS X.

From man 1 sed:

-i extension
             Edit files in-place, saving backups with the specified
             extension.  If a zero-length extension is given, no backup 
             will be saved.  It is not recommended to give a zero-length
             extension when in-place editing files, as you risk corruption
             or partial content in situations where disk space is exhausted, etc.

I tried

sed -i '.bak' 's/old/new/g' logfile*

and

for i in logfile*; do sed -i '.bak' 's/old/new/g' "$i"; done

Both work fine.
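
The '.bak' form leaves a backup copy next to every edited file. Assuming you are happy with the result and want to discard those backups, one way to clean them up afterwards is:

# Preview the backups sed created, then delete them
find . -name 'logfile*.bak' -print
find . -name 'logfile*.bak' -delete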


You could use grep and sed together. This allows you to search subdirectories recursively.

Linux: grep -r -l <old> * | xargs sed -i 's/<old>/<new>/g'
OS X: grep -r -l <old> * | xargs sed -i '' 's/<old>/<new>/g'

For grep:
    -r recursively searches subdirectories
    -l prints only the names of files that contain matches
For sed:
    -i edits files in place (note: on OS X the extension argument is required; pass an empty string '' for no backup)
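
If any of the matched file names contain spaces, a safer variant is to pass them NUL-delimited (this assumes GNU grep and xargs, which support the options shown; on OS X you would again need sed -i ''):

# -Z (--null) ends each file name with a NUL byte, and xargs -0 reads that,
# so file names containing spaces survive the pipeline intact
grep -rlZ '<old>' * | xargs -0 sed -i 's/<old>/<new>/g'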

Another more versatile way is to use find:

sed -i 's/asd/dsg/g' $(find . -type f -name 'xa*')

I'm surprised nobody has mentioned the -exec argument to find, which is intended for exactly this type of use case, although it will start one sed process per matching file name:

find . -type f -name 'xa*' -exec sed -i 's/asd/dsg/g' {} \;

Alternatively, one could use xargs, which will invoke fewer processes:

find . -type f -name 'xa*' | xargs sed -i 's/asd/dsg/g'

Or, more simply, use the + variant of -exec instead of ; so that find passes more than one file to each sed invocation:

find . -type f -name 'xa*' -exec sed -i 's/asd/dsg/g' {} +
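
If the file names may contain spaces or other unusual characters, the xargs form can be made safe with NUL-delimited output; both GNU and BSD find/xargs support -print0 and -0:

# find emits NUL-terminated names and xargs -0 consumes them,
# so nothing is split on whitespace
find . -type f -name 'xa*' -print0 | xargs -0 sed -i 's/asd/dsg/g'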

@PaulR posted this as a comment, but people should view it as an answer (and this answer works best for my needs):

sed -i 's/abc/xyz/g' xa*

This will work for a moderate number of files, probably on the order of tens of thousands, but not on the order of millions.
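
The limit comes from the fact that the shell expands xa* into sed's argument list before sed ever runs, and that list is capped by the operating system's maximum argument length. You can check the cap on your machine with getconf:

# Maximum combined size (in bytes) of the argument list and environment
getconf ARG_MAX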