Huge number of files

English support forum


devzero
Junior Member
Posts: 14
Joined: 2006-01-17, 12:30 UTC


Post by *devzero »

Total Commander is extremely slow when working with a huge number of files. Typically I have a directory with 30000+ files in it, and it takes 2-5 minutes to enter it.

Is there any workaround for this? Is there any other software I can use to work with this number of files?
gbo
Senior Member
Posts: 329
Joined: 2005-03-31, 19:58 UTC
Location: Lausanne (Switzerland)

Post by *gbo »

This is probably not what you are looking for, but this plugin allows you to separate files into folders. That would improve performance, but I guess your files have to stay in the same folder :(
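
In essence, such a plugin just splits one flat directory into many smaller subfolders (e.g. one per first letter). A rough Python sketch of that general idea (not the plugin itself; the path is a made-up placeholder):

Code: Select all

import os
import shutil

# Rough sketch of the general idea only (not the plugin): move each file of
# a flat directory into a subfolder named after its first character, so no
# single folder ends up holding tens of thousands of entries.
SOURCE_DIR = r"D:\huge_folder"  # made-up placeholder path

# list() first so we don't iterate the directory while modifying it
for entry in list(os.scandir(SOURCE_DIR)):
    if not entry.is_file():
        continue
    bucket = entry.name[0].upper()  # e.g. "R" for "report0001.txt"
    target_dir = os.path.join(SOURCE_DIR, bucket)
    os.makedirs(target_dir, exist_ok=True)
    shutil.move(entry.path, os.path.join(target_dir, entry.name))
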
Gil
Licence #17346

90% of coding is debugging. The other 10% is writing bugs.
Hacker
Moderator
Posts: 13068
Joined: 2003-02-06, 14:56 UTC
Location: Bratislava, Slovakia

Post by *Hacker »

devzero,
Total Commander is extremely slow when working with a huge number of files. Typically I have a directory with 30000+ files in it, and it takes 2-5 minutes to enter it.
Try Configuration - Options - Display - Show symbols to the left of the filename - No symbols.
Is there any other software I can use to work with this number of files?
Well, the command line should be fast. :)

HTH
Roman
Suppose you press Ctrl+F, select the FTP connection (with the saved password), but then, instead of clicking Connect, you drop dead.
devzero
Junior Member
Posts: 14
Joined: 2006-01-17, 12:30 UTC

Post by *devzero »

I'll test the plugin later today; if it can keep the speed up it would be a lifesaver.

I should probably have mentioned that the command line is mostly too slow as well. I had a directory in which I ran
dir /b > ../dir.txt
After running for 15 minutes the txt file was 16 MB and the command was still going.

Doing a
del *.* /Q
took about 10 minutes.

And no, I have no idea exactly how many files that directory contained.
Hacker
Moderator
Posts: 13068
Joined: 2003-02-06, 14:56 UTC
Location: Bratislava, Slovakia

Post by *Hacker »

Maybe it would be better not to have so many files in a dir?

Roman
Suppose you press Ctrl+F, select the FTP connection (with the saved password), but then, instead of clicking Connect, you drop dead.
devzero
Junior Member
Posts: 14
Joined: 2006-01-17, 12:30 UTC

Post by *devzero »

That is unfortunately not always an option for me :(
Lefteous
Power Member
Posts: 9535
Joined: 2003-02-09, 01:18 UTC
Location: Germany

Post by *Lefteous »

2devzero
Two questions:

1) Is opening the directory slow only the first time, or also on repeated openings?
Please report the CPU usage while the directory is being read.

2) What is the performance in Windows Explorer?
devzero
Junior Member
Posts: 14
Joined: 2006-01-17, 12:30 UTC

Post by *devzero »

1) It is slow every time. If I could just wait while it opened the first time and then work with the files normally, it would be OK. The problem is that it is not only slow on first open, but sometimes it seems to "autorefresh" when I select it and takes another several minutes to do so.

2) Explorer is actually much worse.

This was tested on Windows 2000 Advanced Server with the NTFS file system and a fairly fast SCSI disk, although I don't know the exact specs of the disk. The dir contains 38000 files and 30000 dirs.
Lefteous
Power Member
Posts: 9535
Joined: 2003-02-09, 01:18 UTC
Location: Germany

Post by *Lefteous »

2devzero
It is slow every time. If I could just wait while it opened the first time and then work with the files normally, it would be OK. The problem is that it is not only slow on first open, but sometimes it seems to "autorefresh" when I select it and takes another several minutes to do so.
OK, to disable all possible types of refreshes, do the following:
In the main menu, choose Configuration / Change Settings Files Directly.
In wincmd.ini, change/add the following settings:

Code: Select all

NoReRead=ABCDEFGHIJKLMNOPQRSTUVWXYZ\
WatchDirs=0
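; (assumption: NoReRead lists drives that should not be re-read automatically,
;  with \ covering network paths; WatchDirs=0 turns off watching directories
;  for changes)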
Restart Total Commander after applying these settings.

Please add the CPU usage information.
Explorer is actually much worse.
After all, it seems Total Commander is the fastest program of them all.
The dir contains 38000 files and 30000 dirs.
OK, that means we are not talking about 30000 but about 68000 objects in that particular directory.

Another idea:
Maybe you could use a filter like *.jpg if you don't need to see all files at once.
CoolWater
Power Member
Posts: 737
Joined: 2003-03-27, 16:33 UTC

Post by *CoolWater »

Maybe OT:

At this point, the speed of SpeedCommander could be quite interesting, because (as far as I know) SC does not read all entries but only the number of items that would be visible without scrolling. As soon as you scroll the list, both names and icons are read. All non-visible items are empty: no icon, no text... (see: http://blog.speedproject.de/2005/10/26/verkuerzte-baumansicht/ (German blog entry))

TC does it differently: it reads all entries (all names) and extracts icons (if not disabled) only for those that are visible.
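
Just to illustrate the two strategies, a rough Python sketch (purely hypothetical; neither program is actually implemented like this, of course):

Code: Select all

import itertools
import os

def read_everything(path):
    # "TC-style" as described above: read every name up front; icons are
    # then extracted only for the rows that are currently visible.
    return [entry.name for entry in os.scandir(path)]

def read_first_screenful(path, visible_rows=50):
    # "SC-style" as described above: fetch only as many entries as fit on
    # screen; the rest would be read lazily while the user scrolls.
    return [entry.name for entry in itertools.islice(os.scandir(path), visible_rows)]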

Correct me if I'm wrong... ;)

Regards,
CoolWater
devzero
Junior Member
Posts: 14
Joined: 2006-01-17, 12:30 UTC

Post by *devzero »

CPU usage is 10 to 20% for Total Commander; other processes use next to nothing (when TC used 20%, total system load was about 22%) for the first few minutes; in the last 30 seconds TC used 50% (or 100% of one CPU, as this system has 2 CPUs).

Now off to test SpeedCommander...
aguirRe
Junior Member
Posts: 88
Joined: 2003-02-06, 17:33 UTC

Post by *aguirRe »

I don't know if it helps here, but I know there's an NTFS tweak that is supposed to help when you have massive amounts of files/dirs. It's a registry key called NtfsDisableLastAccessUpdate (search Google for more info).
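
For reference, the tweak boils down to a single DWORD value; a rough Python sketch of setting it (administrator rights needed, and as always with registry changes, at your own risk):

Code: Select all

import winreg

# Sketch only: set the DWORD value NtfsDisableLastAccessUpdate = 1 under
# HKLM\SYSTEM\CurrentControlSet\Control\FileSystem (a reboot is usually
# needed before the change takes effect).
key = winreg.CreateKeyEx(
    winreg.HKEY_LOCAL_MACHINE,
    r"SYSTEM\CurrentControlSet\Control\FileSystem",
    0,
    winreg.KEY_SET_VALUE,
)
winreg.SetValueEx(key, "NtfsDisableLastAccessUpdate", 0, winreg.REG_DWORD, 1)
winreg.CloseKey(key)
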
CoolWater
Power Member
Posts: 737
Joined: 2003-03-27, 16:33 UTC

Post by *CoolWater »

devzero wrote: CPU usage is 10 to 20% for Total Commander; other processes use next to nothing (when TC used 20%, total system load was about 22%) for the first few minutes; in the last 30 seconds TC used 50% (or 100% of one CPU, as this system has 2 CPUs).

Now off to test SpeedCommander...
Maybe you should turn off your antivirus program for testing... ;)
Lefteous
Power Member
Posts: 9535
Joined: 2003-02-09, 01:18 UTC
Location: Germany

Post by *Lefteous »

2devzero
CPU usage is 10 to 20% for Total Commander; other processes use next to nothing (when TC used 20%, total system load was about 22%) for the first few minutes; in the last 30 seconds TC used 50% (or 100% of one CPU, as this system has 2 CPUs).
This is very interesting information. It sounds as if "the first few minutes" are spent getting the files and the last 30 seconds displaying them.
devzero
Junior Member
Posts: 14
Joined: 2006-01-17, 12:30 UTC

Post by *devzero »

No antivirus program running on this computer.