Bash: finding duplicate lines across two files

Today I needed to find the duplicate lines between two files:

– a list of URLs that were accessed

– a list of category URLs I needed to refresh data for

 

Here's my code snippet:

(This assumes LOG.txt is the file of logged URLs and CATEGORIES.txt is the list of all URLs that would potentially need to be refreshed.)

 

while read -r line; do grep -F -- "$line" LOG.txt; done < CATEGORIES.txt
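Reading with while read -r and quoting "$line" keeps each URL intact, and grep -F stops characters like ?, &, or . from being treated as regex metacharacters.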

 

You can watch it run and save the output to a file at the same time by appending:

| tee OUTPUT.txt
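As an aside, grep can read the whole pattern list from a file itself, which collapses the loop into a single call. A minimal sketch, assuming CATEGORIES.txt holds one URL per line:

grep -F -f CATEGORIES.txt LOG.txt | tee OUTPUT.txt

Here -f CATEGORIES.txt supplies the patterns and -F matches them as fixed strings, so this finds every logged line containing any of the category URLs in one pass.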

 

 
