Using TC as a download manager?

English support forum

Moderators: Hacker, petermad, Stefan2, white

Hurdet
Power Member
Posts: 704
Joined: 2003-05-10, 18:02 UTC

Using TC as a download manager?

Post by *Hurdet »

Is it possible to use TC to download a file from a URL?
HolgerK
Power Member
Posts: 5408
Joined: 2006-01-26, 22:15 UTC
Location: Europe, Aachen

Post by *HolgerK »

"Net -> FTP New Connection <Ctrl+N>"
Paste the URL into the combobox.

Regards
Holger
Hurdet
Power Member
Posts: 704
Joined: 2003-05-10, 18:02 UTC

Post by *Hurdet »

ty :)
ghisler(Author)
Site Admin
Posts: 50390
Joined: 2003-02-04, 09:46 UTC
Location: Switzerland

Post by *ghisler(Author) »

You can also write all URLs in a TEXT file (ftp or http), and then use menu Net - FTP download from list. If you have installed the OpenSSL DLLs, you can also download https URLs this way.
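For example, a hypothetical download list could look like this (the example.com addresses are placeholders, one URL per line):

Code: Select all

ftp://ftp.example.com/pub/archive.zip
http://www.example.com/files/manual.pdf
https://www.example.com/files/image.jpg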
Author of Total Commander
https://www.ghisler.com
Hurdet
Power Member
Posts: 704
Joined: 2003-05-10, 18:02 UTC

Post by *Hurdet »

good to know, ty.
Mummu
Junior Member
Posts: 12
Joined: 2009-06-25, 15:05 UTC

Post by *Mummu »

Is it possible to download a file from a URL by passing it on the command line?

I tried totalcmd.exe /L=http.......(verylong)..... and Total Commander just crashed.

I know it's possible to use /L=ftp://user:pass@url.... and it works.
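For example (made-up server and credentials):

Code: Select all

totalcmd.exe /L=ftp://user:pass@ftp.example.com/pub/file.zip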
RaoulJ
Junior Member
Posts: 7
Joined: 2008-07-30, 15:13 UTC

Re:

Post by *RaoulJ »

ghisler(Author) wrote: 2014-08-07, 10:00 UTC You can also write all URLs in a TEXT file (ftp or http), and then use menu Net - FTP download from list. If you have installed the OpenSSL DLLs, you can also download https URLs this way.
This is fast!
Thanks!
Raoul
Freeman004
New Member
Posts: 1
Joined: 2024-12-03, 09:22 UTC

Re: Using TC as a download manager?

Post by *Freeman004 »

Hi Guys!

I have been using Total Commander as a download manager for years. I love it, but now I have found myself in a new situation. I'm trying to download thousands of images whose URLs have the following structure, for example:

https://webshop.koelner.hu/cikkkep/?cid=24849&dst=termekadatlap&num=0&ts=1692615372&wtype=1&ext=.JPG

The problem is that when Total Commander downloads it, the resulting file is empty (0 bytes). Any idea how to download images from this kind of URL?

Thanks in advance,
Steve
beb
Power Member
Posts: 579
Joined: 2009-09-20, 08:03 UTC
Location: Odesa, Ukraine

Re: Using TC as a download manager?

Post by *beb »

2Freeman004
Even my browsers show nothing if I just follow the given URL directly.
However, wget downloads it easily.
user-command (usercmd.ini):

Code: Select all

[em_wget_clip]
cmd=pwsh -c si -path env:path -value $env:path';'$env:commander_path\Plugins\app\Get;sv url -value (Get-Clipboard);
param=wget.exe -S -N --no-if-modified-since --no-hsts -U=Mozilla --referer="$url" "$url";pause
user-button:

Code: Select all

TOTALCMD#BAR#DATA
em_wget_clip

wciconex.dll


0
-1
What it does:
- the command takes a URL from the clipboard (assuming a URL was copied there) and downloads it into the active pane.
What it needs:
1. The command is based on cross-platform PowerShell; if your system only has the outdated Windows PowerShell 5.1, change 'pwsh' to 'powershell' there.
2. The wget binary, which can be taken, e.g., from here (only wget.exe is actually required):
https://eternallybored.org/misc/wget/releases/

Question:
Why can't my browser, etc., download the given URL(s)?
My answer:
My tests show that the server requires a referer.
Even in a browser the direct URL is not enough -- you have to actually be on the image's page so the browser can supply the referer on its own; only then does the direct link download the image properly.
In my command, the referer simply equals the original image URL, and that does the trick.
Without the referer, I get a zero-sized download, too.
#278521 User License
Total Commander [always the latest version, including betas] x86/x64 on Win10 x64/Android 10/15
beb
Power Member
Posts: 579
Joined: 2009-09-20, 08:03 UTC
Location: Odesa, Ukraine

Re: Using TC as a download manager?

Post by *beb »

Freeman004 wrote: 2024-12-03, 09:24 UTC I'm trying to download thousands of images...
Assuming the URLs of those thousands of images are gathered in a file such as a *.txt, I use the following approach (which is universal and covers the above example as well):
user-button:

Code: Select all

TOTALCMD#BAR#DATA
em_wget

wciconex.dll
wget

0
-1
user-command (usercmd.ini):

Code: Select all

[em_wget]
cmd=pwsh -c "%commander_path%\Plugins\app\PowerShell\wget.ps1"
param=%WL
PowerShell script (wget.ps1):

Code: Select all

$lap   = [system.diagnostics.stopwatch]::StartNew()

# binary
$env:Path+=";$env:commander_path\plugins\app\Get"

# function
function get {param($arg)
$noname |foreach {if ($url -match $_) {$names = $null} else {$names = '--content-disposition --trust-server-names'}}
$nossl  |foreach {if ($url -match $_) {$ssl   = $null} else {$ssl   = '--no-check-certificate'}}
$noagent|foreach {if ($url -match $_) {$agent = $null} else {$agent = '--user-agent=Mozilla'}}
$noref  |foreach {if ($url -match $_) {$ref   = $null} else {$ref   = '--referer='+"$url"}}
$url   = $url.Trim()
$arg   = @($basic,$names,$ssl,$agent,$ref,$('"'+$url+'"'))
''
"wget {0}" -f ("$arg" -replace '\s+',' ')|Write-Host -f Green
start wget -a $arg -no -wait}

# basic options
$basic = '-S -N --no-if-modified-since --no-hsts'
# other options
# https://eternallybored.org/misc/wget/
# https://eternallybored.org/misc/wget/1.21.4/wget.html

# specials
$noname  = @('sourceforge') # no naming
$nossl   = @()              # no ssl certificate check
$noagent = @('sourceforge') # no user-agent
$noref   = @('sourceforge') # no referer

# define arrays
$urls  = $files = $text = $wget = $clip = @()

# get files from selection, or a file under the cursor if any
# (where $args is defined by %WL input parameter) 
if      ($args){
foreach ($line in [IO.File]::ReadLines($args)){
$file  = $line; $files += $line}}

# get urls from all *.* selected files in the active directory if any, or
# get urls from *.txt file under cursor if any 
if      ($files.count -gt 1){
foreach ($file in $files){
foreach ($url in Get-Content $file){$wget +=$url}}}
else {
if      ($file -like '*.txt'){
foreach ($url in Get-Content $file){$wget +=$url}}}

# get urls from all *.wget files in the current directory if any
# (if there is at least one such file in the selection or under the cursor)
if      ($files -like '*.wget'){
foreach ($file in Get-ChildItem -filter *.wget){
foreach ($url in Get-Content $file){$wget +=$url}}}

# get urls from the clipboard if any
$clip = (Get-Clipboard) -split "[\x0d\x0a]" # cope with line feeds in the clipboard content

# combine urls from each source if any
$urls  = $text + $wget + $clip

# define valid and unique urls if any
$urls  = $urls|Where {($_ -like 'http*')}|Sort|Get-Unique

# download valid and unique urls if any
if    (!($urls)){'no urls found...'|Write-Host -f Magenta}
else            {'downloading...'  |Write-Host -f Yellow}
foreach ($url in $urls){get}

# finalizing
"{0} unique url(s) found and processed for {1:mm}:{1:ss}.{1:fff}" -f $urls.count,$lap.elapsed|Write-Host -f Cyan
''
pause
What it does:
- downloads into the current active pane all the entries listed below:
- all valid URLs found in all (*.*) selected files, regardless of file type, in the active pane (if any), or
- all valid URLs found in a *.txt file under the cursor (if any),
- all valid URLs found in the clipboard (if any).
- there's also a special case reflecting my personal preference for *.wget file processing: if at least one *.wget file is under the cursor or in the selection, all other *.wget files in the active directory (if any) will be found and processed too (*.wget files are the dedicated text files -- the hubs where I keep my collections of URLs to re/download from time to time).

Such a file example (here, downloader of downloaders):

Code: Select all

# https://eternallybored.org/misc/wget/
# https://eternallybored.org/misc/wget/releases/
https://eternallybored.org/misc/wget/releases/wget-1.21.4-win64.zip

# https://gitlab.com/gnuwget/wget2/-/releases
https://github.com/rockdaboot/wget2/releases/download/v2.1.0/wget2.exe

# https://everything.curl.dev/
# https://curl.se/windows/
https://curl.se/windows/latest.cgi?p=win64-mingw.zip

# https://aria2.github.io
# https://github.com/aria2/aria2/releases
https://github.com/aria2/aria2/releases/download/release-1.37.0/aria2-1.37.0-win-64bit-build1.zip

# https://github.com/yt-dlp/yt-dlp/releases
# https://github.com/yt-dlp/FFmpeg-Builds/releases
https://github.com/yt-dlp/yt-dlp/releases/latest/download/yt-dlp.exe
https://github.com/yt-dlp/FFmpeg-Builds/releases/download/latest/ffmpeg-master-latest-win64-gpl.zip
Note: commented lines (# ...) won't be recognized as valid URLs; they are there only for convenience. Only the uncommented URLs will be downloaded.

Edit: fixed some errors.
Last edited by beb on 2024-12-04, 10:05 UTC, edited 6 times in total.
#278521 User License
Total Commander [always the latest version, including betas] x86/x64 on Win10 x64/Android 10/15
Dalai
Power Member
Posts: 9943
Joined: 2005-01-28, 22:17 UTC
Location: Meiningen (Südthüringen)

Re: Using TC as a download manager?

Post by *Dalai »

2beb
wget has a parameter -i (--input-file) which can point to a file from which it can retrieve the URLs. Sure, the clipboard content needs to be saved to a file in that case, but it would result in much less process forking than starting a new instance of wget for each URL.
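For illustration, a single call along those lines might look like this, with urls.txt being a hypothetical file holding the collected URLs (same basic options as in the script above):

Code: Select all

wget -S -N --no-if-modified-since --no-hsts --input-file=urls.txt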
#101164 Personal licence
Ryzen 5 2600, 16 GiB RAM, ASUS Prime X370-A, Win7 x64

Plugins: Services2, Startups, CertificateInfo, SignatureInfo, LineBreakInfo - Download-Mirror
beb
Power Member
Posts: 579
Joined: 2009-09-20, 08:03 UTC
Location: Odesa, Ukraine

Re: Using TC as a download manager?

Post by *beb »

2Dalai
Oh, I hadn't even thought of that.
However, if it matters: all the valid and unique URLs are gathered in the $urls variable before being fed to wget at the final stage (the 'foreach ($url in $urls) {get}' line, which is effectively the last line of the script).
Regarding your remark, an interested user can easily modify the script to save the $urls variable to a temporary file*, which in turn can be fed to wget in the way you mention.
Thank you.

* something like this:
create a temp file containing $urls:
$tmp = [IO.Path]::Combine($env:Temp,'wget_temp.txt')
[IO.File]::WriteAllLines($tmp,$urls)

then pass it to wget with:
-i $tmp
and clean it up afterwards:
$tmp|Remove-Item -Force
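With that in place, the final per-URL loop could be replaced by a single call along these lines (a rough sketch, untested, reusing the script's own $basic options):

Code: Select all

start wget -a @($basic,'--input-file='+$tmp) -no -wait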
#278521 User License
Total Commander [always the latest version, including betas] x86/x64 on Win10 x64/Android 10/15