linux
Sergey Programmer 1C, 2015-10-02 12:15:44

How do I remove duplicate lines in every file in a folder?

I installed Cygwin on Windows. I can delete duplicate lines in a single file like this:
$ awk '!x[$0]++' 1233.txt > 1233-i.txt
or
$ cat 123.txt | sort | uniq > 1233-i.txt
and everything works. But now I want to process every text file in the folder:
$ find . -name "*.txt" -exec cat {}\; | sort {}\; | uniq {}\; > 12345.txt
This find does not work. How do I process all the files? I'm a complete beginner in Linux.


3 answer(s)
Valery Ryaboshapko, 2015-10-02
@spetrov

You can just loop through all the files in the current folder.

for i in *.txt; do sort "$i" | uniq > "$i-sorted"; done
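For reference, the find pipeline from the question can also be made to work: the original merges every file into one stream, but you can run sort | uniq once per file found instead. A sketch with made-up demo file names:

```shell
# Demo setup (hypothetical files, just for illustration):
mkdir -p /tmp/dedup_demo && cd /tmp/dedup_demo
printf 'b\na\nb\na\n' > one.txt
printf 'x\nx\ny\n' > two.txt

# Run sort | uniq separately for each file that find locates:
find . -name '*.txt' -exec sh -c '
  for f in "$@"; do sort "$f" | uniq > "$f-sorted"; done
' sh {} +

cat one.txt-sorted   # a and b, one line each
```

Unlike the loop, this also descends into subfolders.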

Azazel PW, 2015-10-02
@azazelpw

Look at the uniq command.
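Worth noting: uniq only collapses adjacent duplicate lines, so the input has to be sorted first; sort -u does both steps in one command. A quick sketch:

```shell
printf 'b\na\nb\n' > /tmp/uniq_demo.txt
uniq /tmp/uniq_demo.txt      # still b a b: the duplicates are not adjacent
sort -u /tmp/uniq_demo.txt   # a b: sorted first, then deduplicated
```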

Maxim Terentiev, 2015-10-06
@maxitso

fdupes — it compares files by content.
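For context: fdupes finds duplicate files (not duplicate lines) by content, typically run as fdupes -r some_dir. The core idea can be sketched in plain shell with checksums — this is an illustration of the principle, not how fdupes is actually implemented:

```shell
# Demo files (hypothetical names):
mkdir -p /tmp/fdupes_demo && cd /tmp/fdupes_demo
printf 'same\n'  > a.txt
printf 'same\n'  > b.txt
printf 'other\n' > c.txt

# Identical hash implies identical content: sort by hash so duplicates
# become adjacent, then report each file whose hash repeats.
md5sum *.txt | sort | awk '$1 == prev { print $2, "duplicates", prevname }
                           { prev = $1; prevname = $2 }'
```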
