Make my auto-downloading shell script better

Date: 2022-12-15 19:54:41

So I want to download multiple files from rapidshare. This is what I currently have. I created a cookie by running:

wget \
    --save-cookies ~/.cookies/rapidshare \
    --post-data "login=USERNAME&password=PASSWORD" \
    --no-check-certificate \
    -O - \
    https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi \
    > /dev/null

and now I have a shell script, which I run, that looks like this:

#!/bin/bash
wget -c --load-cookies ~/.cookies/rapidshare http://rapidshare.com/files/219920856/file1.rar
wget -c --load-cookies ~/.cookies/rapidshare http://rapidshare.com/files/393839302/file2.rar
wget -c --load-cookies ~/.cookies/rapidshare http://rapidshare.com/files/398293204/file3.rar
....

I want two things:

  1. The shell script needs to read the files to download from a file.

  2. The shell script should download anywhere from 2-8 files at a time.

Thanks!

2 Solutions

#1


When you want parallel jobs, think make.

#!/usr/bin/make -f

login:
        wget -qO/dev/null \
            --save-cookies ~/.cookies/rapidshare \
            --post-data "login=USERNAME&password=PASSWORD" \
            --no-check-certificate \
            https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi

# Empty rule so the match-anything pattern below never applies to makefiles.
$(MAKEFILES):

# Every other goal is treated as a URL: log in first, then download it,
# appending wget's output to <filename>.log.
%: login
        wget -ca$(addsuffix .log,$(notdir $@)) \
            --load-cookies ~/.cookies/rapidshare $@
        @echo "Downloaded $@ (log in $(addsuffix .log,$(notdir $@)))"

Save this as rsget somewhere in $PATH (make sure you use tabs, not spaces, for the recipe indentation), chmod +x it, and run

rsget -kj8 \
    http://rapidshare.com/files/219920856/file1.rar \
    http://rapidshare.com/files/393839302/file2.rar \
    http://rapidshare.com/files/398293204/file3.rar \
    ...

This will log in, then wget each target. -j8 tells make to run up to 8 jobs in parallel, and -k means "keep going even if a target returned failure".
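
This takes its URLs on the command line, but requirement 1 asks to read them from a file. One way to bridge that (a sketch, assuming GNU xargs; the file name urls.txt is a placeholder) is:

    # urls.txt: one rapidshare URL per line.
    # xargs turns those lines into command-line arguments for rsget.
    xargs -a urls.txt rsget -kj8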

Edit

Tested with GNU Make 3.79 and 3.81.

#2


Try this. I think it should do what you want:

#!/bin/bash

MAX_CONCURRENT=8
URL_BASE="http://rapidshare.com/files/"
cookie_file=~/.cookies/rapidshare

# do your login thing here...

[ -n "$1" ] && [ -f "$1" ] || { echo "please provide a file containing the stuff to download"; exit 1; }

inputfile=$1
count=0
while read -r x; do
  # after launching MAX_CONCURRENT downloads, wait for the whole batch
  if [ "$count" -ge "$MAX_CONCURRENT" ]; then
    count=0
    wait
  fi
  # download in the background so several transfers run at once
  { wget -c --load-cookies "$cookie_file" "${URL_BASE}$x" && echo "Downloaded $x"; } &
  count=$((count + 1))
done < "$inputfile"

# wait for the last batch to finish before exiting
wait
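
To run it (a sketch; the script and list file names are hypothetical), put one entry per line in a file, where each entry is the part of the URL after http://rapidshare.com/files/, since the script prepends URL_BASE:

    # downloads.txt (hypothetical) contains lines like:
    #   219920856/file1.rar
    #   393839302/file2.rar
    ./rsget.sh downloads.txt

Note that, unlike make -j8 above, this script waits for a whole batch of MAX_CONCURRENT downloads to finish before starting the next batch, so one slow file can hold up the other slots.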
