Please recommend an approach to the following problem.
I have a directory of perhaps 100 HTML files. I want to search through those files and print a list of file names and line numbers where specific text can be found.
I have very little experience with scripting, but would like to learn, and this looks like a fun place to start.
I'm not looking for a script, but rather a recommendation for the correct tool for the job. I know you can do this with dozens of different approaches, but I don't want to read a dozen different books just to find the correct path. So I'm hoping someone can nudge me in the right direction.
This sounds like it might be something that sed is good at. Maybe create a list of file names in a directory, then loop through the list and use sed with regex? Or is there a better approach?
Cheers,
Edit - added context.
I do not need to parse the HTML. I'm just looking for specific text strings, but maybe multiple strings on the same line. Using grep (as suggested by mpan) seems like a very simple approach. (I had no idea grep could look in multiple files with a single command.)
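For the "multiple strings on the same line" case, a minimal sketch (the directory and file names here are made up for illustration):

```shell
# Hypothetical sample data
mkdir -p pages
printf 'foo and bar together\nonly foo here\n' > pages/x.html

# Lines containing EITHER string: pass several -e patterns
grep -rn -e 'foo' -e 'bar' pages

# Lines containing BOTH strings on the same line: chain two greps
grep -rn 'foo' pages | grep 'bar'
# prints: pages/x.html:1:foo and bar together
```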
Last edited by dakota (2025-04-02 17:49:49)
"Before Enlightenment chop wood, carry water. After Enlightenment chop wood, carry water." -- Zen proverb
Since no details or constraints have been provided, only “searching for text” and not “parsing HTML”:
grep -Fir 'text to find' directory_with_files
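Since the goal is a list of file names *and* line numbers, adding `-n` to that command prefixes each match with its line number. A small self-contained sketch (file names invented for the demo):

```shell
# Hypothetical sample data
mkdir -p htmldir
printf '<p>hello world</p>\n<p>other</p>\n' > htmldir/a.html
printf '<div>no match here</div>\n' > htmldir/b.html

# -F: fixed strings (no regex), -i: case-insensitive,
# -r: recurse into the directory, -n: show line numbers
grep -Firn 'hello' htmldir
# prints: htmldir/a.html:1:<p>hello world</p>
```

Output is in `file:line:matching-text` form; files with no match (like b.html above) are simply omitted.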
Well.
That was pretty trivial. I had no idea grep could search multiple files with a single command.
Cheers,
"Before Enlightenment chop wood, carry water. After Enlightenment chop wood, carry water." -- Zen proverb