ftp ul/dl many files - some intelligence ?

English support forum


trevor12
Junior Member
Posts: 65
Joined: 2012-12-06, 15:16 UTC
Location: Czech republic

ftp ul/dl many files - some intelligence ?

Post by *trevor12 »

When I upload or download many small files, most of the time is spent on setting up the connection for each individual file, so the speed of the transfer as a whole is drastically low. Would it be possible to add a function that automatically compresses (or just "stores") batches of files ahead of time (depending on the free disk space/RAM of the source PC) and decompresses them on the target, so the overall speed would be higher?
Dalai
Power Member
Posts: 9364
Joined: 2005-01-28, 22:17 UTC
Location: Meiningen (Südthüringen)

Post by *Dalai »

You could try Options > FTP > Compress during transfer (MODE Z) if your FTP server supports it.

Regards
Dalai
#101164 Personal licence
Ryzen 5 2600, 16 GiB RAM, ASUS Prime X370-A, Win7 x64

Plugins: Services2, Startups, CertificateInfo, SignatureInfo, LineBreakInfo - Download-Mirror
gdpr deleted 6
Power Member
Posts: 872
Joined: 2013-09-04, 14:07 UTC

Post by *gdpr deleted 6 »

Compression will likely not help you much here, because (as far as I can tell from your post) the issue is that you transfer a large(-ish) number of small files.

A huge part of the time cost here is not in the actual transfer of the file content itself. Much of the cost is in the time taken to prepare and initialize the transfer of each file between the client and the FTP server. Data compression would not help you with that.

I am not sure whether TC supports concurrent (parallel) uploads. If not, I would suggest using an FTP client that does (pretty much any dedicated FTP client should support that).

Doing parallel uploads (or downloads) of many small files will in all likelihood reduce the time spent on the file transfers in total, as multiple file transfers can be initialized/prepared at the same time in parallel. (In contrast to the actual transfer of file data, this preparation/initialization requires only an insignificant amount of bandwidth and doing this in parallel can thus even be beneficial for low bandwidth connections.)
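For illustration, here is a rough sketch (not a TC feature) of what parallel uploads can look like with a scripted client, using only Python's standard library. The host name, credentials, directories and worker count are placeholders you would have to adapt:

[code]
# Rough sketch: parallel FTP uploads with Python's standard library only.
# Host, credentials, directories and worker count are placeholders; the
# server must actually allow this many simultaneous connections per client.
import threading
from concurrent.futures import ThreadPoolExecutor
from ftplib import FTP
from pathlib import Path

HOST, USER, PASSWORD = "ftp.example.com", "user", "secret"   # placeholders
LOCAL_DIR, REMOTE_DIR = Path("upload_these"), "/incoming"    # placeholders
WORKERS = 4

local = threading.local()   # one FTP control connection per worker thread

def get_connection() -> FTP:
    # Log in once per worker and reuse that connection for all its files,
    # so the per-file setup cost is paid once per worker, not once per file.
    if not hasattr(local, "ftp"):
        local.ftp = FTP(HOST)
        local.ftp.login(USER, PASSWORD)
        local.ftp.cwd(REMOTE_DIR)
    return local.ftp

def upload(path: Path) -> str:
    ftp = get_connection()
    with path.open("rb") as f:
        ftp.storbinary(f"STOR {path.name}", f)
    return path.name

if __name__ == "__main__":
    files = [p for p in LOCAL_DIR.iterdir() if p.is_file()]
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        for name in pool.map(upload, files):
            print("uploaded", name)
[/code]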

But keep in mind that an FTP server can be configured so that it rejects parallel file transfers. If that is the case for your FTP server, you will have to bite the bullet and do the file transfers sequentially, accepting the incurred time overhead...
Dalai
Power Member
Posts: 9364
Joined: 2005-01-28, 22:17 UTC
Location: Meiningen (Südthüringen)

Post by *Dalai »

elgonzo wrote: I am not sure whether TC supports concurrent (parallel) uploads.
TC doesn't support parallel FTP transfers, in any direction.

Regards
Dalai
#101164 Personal licence
Ryzen 5 2600, 16 GiB RAM, ASUS Prime X370-A, Win7 x64

Plugins: Services2, Startups, CertificateInfo, SignatureInfo, LineBreakInfo - Download-Mirror
trevor12
Junior Member
Posts: 65
Joined: 2012-12-06, 15:16 UTC
Location: Czech republic

re

Post by *trevor12 »

OK, thank you for the info. I solve that problem by first packing what I want to transfer, then transferring the archive, and then decompressing it on the target, but it would be convenient if this were done automatically.

Anyway, I tried the FileZilla client and server, which both support multiple connections from the same IP address on a LAN, but the result is not much better than the default transfer (I set a maximum of 100 connections on the FTP server and 32 connections on the client, and I upload to the FTP server from the FTP client).

I also tried the old LeechFTP, which has a limit of 16 concurrent connections, and the results are slightly better than with 32 connections on the FileZilla FTP client.
gdpr deleted 6
Power Member
Posts: 872
Joined: 2013-09-04, 14:07 UTC

Re: re

Post by *gdpr deleted 6 »

trevor12 wrote: OK, thank you for the info. I solve that problem by first packing what I want to transfer, then transferring the archive, and then decompressing it on the target, but it would be convenient if this were done automatically.
If you have some form of admin access to the server, you may be able to automate this.

It might be that the FTP server software on your server provides some facility to start a script/program after file uploads. All you would need to do is to write a script that unpacks the uploaded archive.
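If it does, the unpack script itself can be very small. A minimal sketch, assuming (this is only an assumption - check your server's documentation for its actual hook interface) that the server passes the path of the uploaded file as the first command-line argument:

[code]
# unpack_upload.py - minimal sketch of a post-upload hook script.
# Assumes the FTP server passes the uploaded file's path as argv[1];
# the target directory is a placeholder.
import shutil
import sys
from pathlib import Path

uploaded = Path(sys.argv[1])
target = Path("/srv/ftp/extracted") / uploaded.stem   # placeholder

if uploaded.suffix.lower() == ".zip":
    target.mkdir(parents=True, exist_ok=True)
    shutil.unpack_archive(uploaded, target)
[/code]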

If your FTP server software does not provide such a facility, you could still automate this by writing a program/script on the server that monitors the FTP upload directory (i.e., watches for new files) and unpacks any archive that appears there (or something along those lines).
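A minimal sketch of such a watcher, again with only Python's standard library; the directories and poll interval are placeholders, and a real version should detect incomplete uploads more robustly than the crude size check used here:

[code]
# Minimal sketch: poll the FTP upload directory and unpack any archive
# that appears. Directories and poll interval are placeholders.
import shutil
import time
from pathlib import Path

UPLOAD_DIR = Path("/srv/ftp/incoming")    # placeholder
EXTRACT_DIR = Path("/srv/ftp/extracted")  # placeholder
POLL_SECONDS = 5

seen = set()

while True:
    for archive in UPLOAD_DIR.glob("*.zip"):
        if archive in seen:
            continue
        # Crude completeness check: skip files whose size is still changing.
        size = archive.stat().st_size
        time.sleep(1)
        if archive.stat().st_size != size:
            continue
        target = EXTRACT_DIR / archive.stem
        target.mkdir(parents=True, exist_ok=True)
        shutil.unpack_archive(archive, target)
        seen.add(archive)
    time.sleep(POLL_SECONDS)
[/code]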

trevor12 wrote: Anyway, I tried the FileZilla client and server, which both support multiple connections from the same IP address on a LAN, but the result is not much better than the default transfer (I set a maximum of 100 connections on the FTP server and 32 connections on the client, and I upload to the FTP server from the FTP client).
It sounds like you have an internet connection with a rather low latency. Lucky you :)
(or is this within some LAN environment?)
trevor12
Junior Member
Posts: 65
Joined: 2012-12-06, 15:16 UTC
Location: Czech republic

re

Post by *trevor12 »

Thanks, but my programming knowledge is too poor to create such scripts. Maybe in the future some app developer will create two apps - an FTP client for the first machine and an FTP server for the second machine - which would collaborate to solve this problem that I can currently only solve manually.