[REQ] Tool to determine best copy method settings
I wonder - wouldn't it be great if Christian made a small utility to test various values for the various copy methods, so we wouldn't have to guess the best values for our drives? The tool would simply copy e.g. 50 MB, each time with different settings. These would be customizable (so we can test our own values), but there would also be an automatic mode where e.g. the 50 most promising value combinations would be tested and the best one shown after 10-20 minutes.
What do you think?
TIA
Roman
Just suppose you press Ctrl+F, select the FTP connection (with the saved password), but instead of clicking Connect, you drop dead.
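For what it's worth, the core of such an auto-test would be little more than a timed loop. Here is a minimal sketch in C, assuming a plain buffered copy; the file names and the block-size list are made-up examples, and the caching effects discussed further down are deliberately ignored:

[code]
/* Minimal sketch of the proposed auto-test: copy one test file with several
   block sizes and print the elapsed time for each. File names and the
   block-size list are made-up examples; a real tool would also have to deal
   with the caching effects discussed below. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static double copy_with_block_size(const char *src, const char *dst, size_t block)
{
    FILE *in = fopen(src, "rb");
    FILE *out = fopen(dst, "wb");
    char *buf = malloc(block);

    if (!in || !out || !buf) { perror("setup"); exit(1); }

    clock_t start = clock();       /* with the Microsoft CRT this measures wall-clock time */
    size_t n;
    while ((n = fread(buf, 1, block, in)) > 0)
        fwrite(buf, 1, n, out);
    fflush(out);                   /* make sure stdio has handed everything to the OS */
    clock_t stop = clock();

    fclose(in); fclose(out); free(buf);
    return (double)(stop - start) / CLOCKS_PER_SEC;
}

int main(void)
{
    size_t sizes[] = { 32 * 1024, 64 * 1024, 256 * 1024, 1024 * 1024 };
    for (size_t i = 0; i < sizeof sizes / sizeof sizes[0]; i++) {
        double t = copy_with_block_size("testfile.bin", "D:\\copytest.bin", sizes[i]);
        printf("block %8zu bytes: %.2f s\n", sizes[i], t);
    }
    return 0;
}
[/code]

Run once per candidate setting, repeated a few times and averaged, this already gives the kind of coarse comparison discussed later in the thread.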
- ghisler(Author)
- Site Admin
- Posts: 50475
- Joined: 2003-02-04, 09:46 UTC
- Location: Switzerland
If only it were so simple! The Windows disk cache and the cache of the hard disk itself have a big influence too! Currently the default configuration is maximum safety, but not maximum speed.
Author of Total Commander
https://www.ghisler.com
- StickyNomad
- Power Member
- Posts: 1933
- Joined: 2004-01-10, 00:15 UTC
- Location: Germany
Indeed, such a tool would be very useful - I support that! Manual testing is neither convenient nor accurate, and I suppose the effort to create such an auto-test tool would not be too great.
[EDIT]
2Ghisler(Author)
Hm, but if the user ran the test tool on his own machine, he could at least find an optimal setting for his current Windows configuration, or am I wrong?
Of course safety is an important aspect, but if the default settings are kept as they are now, the user could be pointed to possible issues when tuning the settings with the tool.
[/EDIT]
Christian,
ghisler(Author) wrote: The Windows disk cache and the cache of the hard disk itself have a big influence too!
That's exactly the reason I suggested the test tool. The cache could be circumvented by first writing a few sets of random data to the source disk and then copying them.
Roman
Just suppose you press Ctrl+F, select the FTP connection (with the saved password), but instead of clicking Connect, you drop dead.
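A rough illustration of that random-data idea (file names and sizes are just examples): generate each test set with fresh random content before the timed passes, so a later pass can never be served from data the cache already holds. Writing all test sets up front also reduces the chance that they still sit in the write cache when the timed copies start.

[code]
/* Sketch of the random-data suggestion (illustration only): fill a source
   file with random data so each copy pass reads content the cache has not
   seen before. Paths and sizes are made-up examples. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static void make_random_source(const char *path, size_t total_bytes)
{
    FILE *f = fopen(path, "wb");
    unsigned char buf[64 * 1024];
    size_t written, i;

    if (!f) { perror(path); exit(1); }
    srand((unsigned)time(NULL));
    for (written = 0; written < total_bytes; written += sizeof buf) {
        for (i = 0; i < sizeof buf; i++)
            buf[i] = (unsigned char)rand();   /* non-repeating is enough, no need for crypto */
        fwrite(buf, 1, sizeof buf, f);
    }
    fclose(f);
}

int main(void)
{
    /* Prepare e.g. three independent 50 MB test sets up front,
       one per timed pass, so no pass reuses data from a previous one. */
    make_random_source("testset1.bin", 50u * 1024 * 1024);
    make_random_source("testset2.bin", 50u * 1024 * 1024);
    make_random_source("testset3.bin", 50u * 1024 * 1024);
    return 0;
}
[/code]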
- ghisler(Author)
- Site Admin
- Posts: 50475
- Joined: 2003-02-04, 09:46 UTC
- Location: Switzerland
"What does 'safety' mean here? Will copied data be corrupted when I use other settings?"
If you remember, we had a problem with a specific SCSI controller from Adaptec where the big file copy mode caused corrupted files. The problem was caused by unclearly worded documentation of the Windows API. We subsequently changed the function so that it also works with this specific Adaptec controller, and there have been no other problems since then. Still, that problem has so far kept me from making this copy method the default.
Author of Total Commander
https://www.ghisler.com
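For readers wondering what can go wrong there: the big file copy mode presumably relies on unbuffered Windows I/O, and with FILE_FLAG_NO_BUFFERING the buffer address and every transfer size must be a multiple of the volume's sector size - exactly the kind of constraint that is easy to miss in the documentation. A sketch of the pattern (illustration only, not Total Commander's actual code; the drive letter and block size are examples):

[code]
/* Illustration of unbuffered writing on Windows (not TC's actual code).
   With FILE_FLAG_NO_BUFFERING, the buffer address and every transfer size
   must be multiples of the volume's sector size. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DWORD sectorSize = 0, dummy;
    /* Ask the volume for its sector size; transfers must be multiples of it. */
    GetDiskFreeSpaceA("C:\\", &dummy, &sectorSize, &dummy, &dummy);
    if (sectorSize == 0) { printf("GetDiskFreeSpace failed\n"); return 1; }

    HANDLE h = CreateFileA("C:\\copytest.bin", GENERIC_WRITE, 0, NULL,
                           CREATE_ALWAYS,
                           FILE_FLAG_NO_BUFFERING | FILE_FLAG_WRITE_THROUGH, NULL);
    if (h == INVALID_HANDLE_VALUE) { printf("CreateFile failed\n"); return 1; }

    DWORD blockSize = 2048 * sectorSize;   /* e.g. 1 MiB with 512-byte sectors */
    /* VirtualAlloc returns page-aligned memory, which also satisfies the
       sector-alignment requirement for the buffer address. */
    void *buf = VirtualAlloc(NULL, blockSize, MEM_COMMIT, PAGE_READWRITE);

    DWORD written = 0;
    if (!WriteFile(h, buf, blockSize, &written, NULL))
        printf("WriteFile failed, error %lu\n", (unsigned long)GetLastError());

    VirtualFree(buf, 0, MEM_RELEASE);
    CloseHandle(h);
    return 0;
}
[/code]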
A tool for testing would really be great.
Such a tool should allow the following:
Select a file/folder to be copied.
Select one or more drives/paths to copy to.
Select a range of block sizes to test.
Do the test automatically and log the results.
Optionally delete the test file after the operation.
That way I could easily find the best values for my machine by running the tool overnight.
Right now I would have to copy a file over and over again, probably to many different hard disks, and watch the copying with a stopwatch the whole time.
And I must never forget to adjust the block size. Today I do not even know which block sizes make sense (10240, 10260, 10280? or rather 10240, 11262, etc.).
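One common, though by no means authoritative, answer is to test powers of two in the tens-of-kilobytes to megabytes range rather than steps of a few bytes, since drive and cache behaviour changes with the order of magnitude, not with an extra 20 bytes. A tiny sketch of such a test series:

[code]
/* One possible (not authoritative) test series: powers of two from 32 KB to
   8 MB. The interesting differences show up between orders of magnitude,
   not between block sizes that differ by a few bytes. */
#include <stdio.h>

int main(void)
{
    unsigned long block;
    for (block = 32UL * 1024; block <= 8UL * 1024 * 1024; block *= 2)
        printf("%lu\n", block);   /* 32768, 65536, ..., 8388608 */
    return 0;
}
[/code]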
ghisler(Author) wrote: If only it were so simple! The Windows disk cache and the cache of the hard disk itself have a big influence too!
The average user is not very interested in what influences the copy speed but wants to find out the best setting for copying on his machine, I think.
sheepdog
"A common mistake that people make when trying to design something
completely foolproof is to underestimate the ingenuity of complete fools."
Douglas Adams
- majkinetor !
- Power Member
- Posts: 1580
- Joined: 2006-01-18, 07:56 UTC
- Power Member
- Posts: 556
- Joined: 2006-04-01, 00:11 UTC
The crucial problem here is this: once you create a source file for copying, it is cached, and it stays cached until you reboot. That would influence the results of every test but the first one, so you would have to reboot before each test pass.
Anyway, why bother when everybody knows that the fastest way to copy files in Windows is to use asynchronous file I/O?
LCX-01-B-B-SL, Seasonic S-12 600HT, Intel D975XBX, Core 2 Duo E6300, Zalman CNPS 9500 LED, Corsair TWIN2X2048-6400, e-GeForce 8800 GTX, WDC WD1500ADFD, 2x WDC WD2500YS, Pioneer DVR-111DBK, Viewsonic VP930b, KM-3701, M-BJ69
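For readers who have not used it: asynchronous here means overlapped Windows file I/O, where a read or write is started and the program keeps working while the transfer is still in flight, so reading from the source can overlap with writing to the target. A rough sketch of the pattern (illustration only, not a complete copy routine; the file name is an example):

[code]
/* Minimal overlapped (asynchronous) read on Windows. The read is started,
   the program could do other work (e.g. write the previous block), and only
   then waits for completion. Not a complete copy routine. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE h = CreateFileA("testfile.bin", GENERIC_READ, FILE_SHARE_READ, NULL,
                           OPEN_EXISTING, FILE_FLAG_OVERLAPPED, NULL);
    if (h == INVALID_HANDLE_VALUE) { printf("open failed\n"); return 1; }

    static char buf[256 * 1024];
    OVERLAPPED ov = {0};                       /* Offset 0 = start of the file */
    ov.hEvent = CreateEvent(NULL, TRUE, FALSE, NULL);

    /* The call returns immediately; ERROR_IO_PENDING means the read is in flight. */
    if (!ReadFile(h, buf, sizeof buf, NULL, &ov) &&
        GetLastError() != ERROR_IO_PENDING) {
        printf("ReadFile failed\n"); return 1;
    }

    /* ... a real copy routine would now write the previously read block
       to the target drive while this read is still running ... */

    DWORD got = 0;
    GetOverlappedResult(h, &ov, &got, TRUE);   /* TRUE = wait for completion */
    printf("read %lu bytes asynchronously\n", (unsigned long)got);

    CloseHandle(ov.hEvent);
    CloseHandle(h);
    return 0;
}
[/code]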
"Once you create a source file for copying it is cached and it stays cached until you reboot."
This has been discussed above. There is no reason to reuse the same file.
Roman
Just suppose you press Ctrl+F, select the FTP connection (with the saved password), but instead of clicking Connect, you drop dead.
Indeed, it's not necessary to reuse the file. Using slightly different files would perhaps have some effect on the results, but that's "fine" tuning. What we're talking about is a test that gives you "coarse" tuning, because the differences between the various settings can be quite significant, I think.
It's currently too time consuming, annoying, and unreliable to test these things manually. The effort might be worth it if you'll be using the same drives indefinitely (for several months/years), but in my work we're often using a particular drive for just one or two days, a week tops.
A tool like this would allow me to find the best copy parameters for that drive on the first day in just a few minutes (e.g. a coffee break), and thus speed up operations for the rest of the week.
Support +++