Find duplicate files, request

English support forum

Moderators: Hacker, petermad, Stefan2, white

AnthonyCian
Senior Member
Posts: 265
Joined: 2005-06-16, 01:45 UTC
Location: Thatcher Az. USA

Find duplicate files, request

Post by *AnthonyCian »

Hello:

Some time ago I asked whether it was possible to set your own default settings for Search > Find Duplicate Files. It currently defaults to "same name" and "same size", but I always search with "same size" and "same content", so I have to change the settings every time. I was hoping this would be enhanced in V7. Is it possible to edit the INI file to do this?

Alternatively, there could be an option to remember the last settings used, or something similar. As I recall, the suggestion was well received at the time and put on the wish list.

Anthony
Sombra
Power Member
Posts: 814
Joined: 2005-12-27, 22:23 UTC
Location: Zaragoza, Spain

Post by *Sombra »

Hi, try these steps:
1. Check "same name" and "same size" on the Advanced tab (or whichever combination you prefer, e.g. "same size" and "same content").
2. On the General tab, leave the "Search in:" field empty.
3. Go to the Load/Save tab and save the search as "duplicates" (for example).
4. Make a button with this command: LOADSEARCH duplicates ("duplicates" being the search saved in step 3).
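For step 4 the button can be created via the button bar's configuration dialog, or by editing a .bar file directly. A rough sketch of such a file (the icon file/index and menu text are placeholders, and the exact keys should be checked against your TC version's help):

```ini
[Buttonbar]
Buttoncount=1
cmd1=LOADSEARCH duplicates
button1=wcmicons.dll,22
menu1=Find duplicates (loads saved search "duplicates")
```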
I can read English, but... I write like Tarzan. (sorry)
Samuel
Power Member
Posts: 1930
Joined: 2003-08-29, 15:44 UTC
Location: Germany, Brandenburg an der Havel
Contact:

Post by *Samuel »

I already use this approach very often, with long lists... Is it somehow possible to select (and delete) all entries that are duplicates?

The list may look like this (not only 2 duplicates per group):

Code: Select all

File a
File b (same content as File a)
File c (same content as File a)
----------------
File d
File e (same content as File d)
----------------
File f
File g (same content as File f)
File h (same content as File f)
File i (same content as File f)
----------------
...
Is there a way to automatically select (and remove): (File b, File c, File e, File g, File h, File i)
Last edited by Samuel on 2007-06-12, 12:41 UTC, edited 2 times in total.
MacQ
Junior Member
Posts: 72
Joined: 2004-04-13, 12:54 UTC
Location: Slovenia

Post by *MacQ »

That's what I would also like to know.
Or at least to select, in each group of identical files, all files except one, so that exactly one copy of each duplicate remains unselected.
Then you could just make some corrections if you're not satisfied with the selection.
Lefteous
Power Member
Posts: 9536
Joined: 2003-02-09, 01:18 UTC
Location: Germany
Contact:

Post by *Lefteous »

2Samuel
I don't think that this is a realistic scenario. Maybe I want to keep files b, d, and h because, for example, they are in the same directory. I think it's very difficult to say which of the duplicates should be kept. I don't want to claim that an automatic method is impossible, but it's not that simple.
Samuel
Power Member
Posts: 1930
Joined: 2003-08-29, 15:44 UTC
Location: Germany, Brandenburg an der Havel
Contact:

Post by *Samuel »

2Lefteous
As MacQ stated, he would also need such a selection (so that he can adjust it afterwards if necessary).

I wouldn't care if it were around 10-20 files. But I often have a huge number of them and would prefer an automatic selection.

And I'm also often in the situation that I just need the duplicates removed, no matter in which directory or under which filename...

2Ghisler
It would be great to have a new checkbox "select duplicates after search".

Otherwise I'm thinking about an external AHK solution... But there is no way to figure out where the group separator lines are, so I would have to compare the files again...
Samuel
Power Member
Posts: 1930
Joined: 2003-08-29, 15:44 UTC
Location: Germany, Brandenburg an der Havel
Contact:

Post by *Samuel »

Workaround with an AHK script.

I just had the idea of selecting every second line. That way many duplicates (best case: all of them, worst case: half of them) can be deleted in one pass. (So it's useful for removing files where it doesn't matter which of the duplicates gets removed.)

Ctrl + Alt + A
-> mark every second line in the current file list

Here is the script:

Code: Select all

; ############ only works in TTOTAL_CMD window
#IfWinActive,ahk_class TTOTAL_CMD
^!a::
; ############ is one filelist focused?
ControlGetFocus Focused
if(Focused!="TMyListBox1" && Focused!="TMyListBox2")
{
 return
}

; ############ unselect all currently selected items
; ############ (1075 = WM_USER+51 carries a TC internal command; 524 = cm_ClearAll)
SendMessage,1075,524

; ############ get index of last item (0x188 = LB_GETCURSEL)
SendInput,{End}
SendMessage,0x188,0,0,%Focused%,ahk_class TTOTAL_CMD
ItemLast := ErrorLevel

; ############ go down and mark every second line
SendInput,{Home}
Loop
{
 SendInput,{Down}{Ins}
 SendMessage,0x188,0,0,%Focused%,ahk_class TTOTAL_CMD
 Item := ErrorLevel
 if(Item==ItemLast){
  break
 }
}
return
#IfWinActive
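The best-case/worst-case behaviour of the every-second-line trick can be illustrated with a small simulation (group separator lines are ignored here for simplicity, which is an assumption; the real TC result list contains them):

```python
def mark_every_second(groups):
    """Flatten the duplicate groups into one list and mark every
    second entry, as the Ctrl+Alt+A script above does."""
    flat = [f for group in groups for f in group]
    return [f for i, f in enumerate(flat) if i % 2 == 1]

# Best case: groups of two -- every marked file is a duplicate,
# so one pass removes all of them.
# Worst case: larger groups -- only about half of the duplicates
# are marked per pass, so a second pass is needed.
```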
AnthonyCian
Senior Member
Posts: 265
Joined: 2005-06-16, 01:45 UTC
Location: Thatcher Az. USA

Post by *AnthonyCian »

Hi Sombra
I tried your suggestion and I like it. Never knew about the LoadSearch feature. Thanks a bunch!

Anthony
burg1
Junior Member
Posts: 30
Joined: 2007-07-10, 13:56 UTC

Post by *burg1 »

I agree with Samuel that an option to automatically select the duplicates would be nice. This is a realistic scenario: sometimes it just doesn't matter which of the duplicates is kept, e.g. because you're going to rename them afterwards anyway. So no sophisticated programming logic would be needed: just select, within each group, every file from the second line on, and there you go :-)
eugensyl
Power Member
Posts: 564
Joined: 2004-06-03, 18:27 UTC
Location: România
Contact:

Post by *eugensyl »

Samuel wrote:Is it somehow possible to select (and delete) all entries that are duplicates? [...] Is there a way to automatically select (and remove): (File b, File c, File e, File g, File h, File i)
Why not select all files with the same path (internal command cm_SelectCurrentPath)?
It works like in branch view. With a shortcut it's very easy to use.


Sometimes this helps.


Best wishes,

Eugen
Samuel
Power Member
Posts: 1930
Joined: 2003-08-29, 15:44 UTC
Location: Germany, Brandenburg an der Havel
Contact:

Post by *Samuel »

Sorry, I don't understand your idea.
Nothing is known about the folder structure in general. Sometimes duplicate files are even in the same folder.
eugensyl
Power Member
Posts: 564
Joined: 2004-06-03, 18:27 UTC
Location: România
Contact:

Post by *eugensyl »

Samuel wrote:Sorry, I don't understand your idea.
Nothing is known about the folder structure in general. Sometimes duplicate files are even in the same folder.
Let me be more explicit:

If the duplicate files found are spread over a few directories, then very often, whatever the search conditions were, all the duplicates found in one and the same directory can safely be removed, for various reasons: same generation, backup files, and so on.

In that case, if you accept copying/moving/deleting the duplicate files that share a directory, you can use the selection made by the cm_SelectCurrentPath command, invoked via the menu, the button bar and/or a shortcut.

For example, I search for duplicate mp3 files. Whether or not they have the same name, size or content, it's convenient to select and/or delete only the found duplicates that are in the same directory as the file under the cursor, using the internal command mentioned above.

This command (cm_SelectCurrentPath) can be used in branch view or in the list produced by "Feed to listbox" from the search dialog. With the cm_UnSelectCurrentPath command you can unselect files from the same path, if you prefer to select everything first and keep only the duplicates from one directory selected.

A selection based on a positional rule (every second, third, and so on) depends on how the files are sorted. I think it's safer to select files based on location. Anyway, the duplicates are identical, which means it doesn't matter much which files are removed, except for files belonging to different applications, such as .ini files.

For me this solution is good enough.

Another solution could be:
- select all the (duplicate) files
- save the selection to a file (with paths)
- edit that file as needed
- load the selection back from the file
- invert the selection, if that is how you intended it when editing the file.

BTW, duplicate files in the same directory... hmm... I don't think that's a usual situation, but it is possible with a different name or extension. In that case, what difference does it make which files are removed?


Best wishes,

Eugen
burg1
Junior Member
Posts: 30
Joined: 2007-07-10, 13:56 UTC

Post by *burg1 »

You might want to take a look at this tool (Find Duplicates 1.25a):
http://www.geocities.com/hirak_99/goodies/finddups.html

It offers an "Auto Mark" option which automatically marks all but one file from each group of duplicates for deletion.

Another option allows you to mark all duplicates from a certain folder.

These would be nice features for TC too! :-)
eugensyl
Power Member
Posts: 564
Joined: 2004-06-03, 18:27 UTC
Location: România
Contact:

Post by *eugensyl »

burg1 wrote:You might want to take a look at this tool (Find Duplicates 1.25a) [...] It offers an "Auto Mark" option which automatically marks all but one file from each group of duplicates for deletion. [...] These would be nice features for TC too! :-)
I agree!


Best wishes,

Eugen