bash
Valeriu147, 2017-06-12 23:13:46

How to find multiple words in multiple files using BASH?

There is this script:

#!/bin/bash

text="hulk hogan,dolph ziggler"
IFS=","
word=( $text )              # split the comma-separated phrases into an array

line=$(ls workdir/*.txt)    # list of files to search
unset IFS

for a in "${word[@]}"; do
    for m in $line; do
        if grep -q "$a" "$m"; then
            echo "$a exists"
            grep "$a" "$m"
        else
            echo "$a does not exist"
            exit 1
        fi
    done
done

I am searching for hulk hogan and dolph ziggler in multiple files. When a phrase is found, the script should report which file contains it and move on. But right now it looks for both phrases in every single file, so as soon as one of them is missing from some file it exits the script entirely. I want it to fail only when one of the phrases is not found in any of the files. For example, if hulk hogan is in first.txt and dolph ziggler is in second.txt, the script should keep running rather than exit.
At the moment it checks each file for both phrases, and if even one file lacks one of them, it leaves the script body, which is not what I need. Please advise; I don't know what else to try.
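
For reference, one way to get the behaviour described above is to ask grep itself whether a phrase occurs in any of the files, and exit only when it occurs in none of them. A minimal sketch, assuming the files live in workdir/ as in the original script:

#!/bin/bash
# Sketch, not the original script: fail only when a phrase is missing from every file.

text="hulk hogan,dolph ziggler"
IFS="," read -r -a words <<< "$text"   # split the comma-separated list into an array

for phrase in "${words[@]}"; do
    # grep -l prints the names of the files that contain the phrase and
    # exits with a non-zero status only when no file matched at all.
    if files=$(grep -l "$phrase" workdir/*.txt); then
        echo "'$phrase' found in:"
        echo "$files"
    else
        echo "'$phrase' was not found in any file" >&2
        exit 1
    fi
done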


2 answers
Max Kostikov, 2017-06-13
@mxms

In general, this is done in one line using find.
The output is a list of files containing any of the searched strings.
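
The exact command is not shown in the answer; one plausible form of such a one-liner, assuming the files sit under workdir/ as in the question, could be:

# Sketch: list every .txt file under workdir/ that contains either phrase
find workdir/ -name '*.txt' -exec grep -lE 'hulk hogan|dolph ziggler' {} +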

Saboteur, 2017-06-13
@saboteur_kiev

Actually, this is done in one command:
grep -r -P "(hulk|hogan|dolph|ziggler)" *.txt
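
A small adaptation for the exact phrases from the question (a sketch, not the answerer's command): keep the full phrases in the alternation with -E, and use -l to print only the names of the matching files:

grep -rlE 'hulk hogan|dolph ziggler' workdir/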
