
Schedule task online



#1
Valentino14
Member · 16 posts

Hi,

I'd like to schedule a daily task that consists of visiting a URL. The URL is a direct link to a file, which I download on a daily basis. See the example below:

 

http://www.markets.rbs.nl/NL/ExcelExport.ashx?ListName=ProductList&config=ActiveProducts&st_search=smVUClJgyncg54s%2fXHulQrrA%2blYEamThFGaF9vbZiuJIg20zaZF6lNpn7UgiO%2b1w5X5yJSeb%2brJMuLVMD4lJuKs4HzFyqw2QkyMqQOuSjy8JZLSnGk721HD%2bIy0Nhwb0YIyOzcwVXKZFgAA9NssHoD0ujOsmAwJIX%2bIYEjGlSKe%2batJCRfo3YYJ8cFt81ytLMQQNr5HV6CyfX6UkR04utQ%2bTcc7TqcLKJHDpJcgJexKWarN5Ke5sJm4YFAqzV1BxaVT6JZ8fO0AvQkfkteXrPXsqHR3%2f0kXZ7%2bTMXP4ERlw%3d

 

Currently I'm using Windows Task Scheduler to start my browser and visit the URL (after which the download starts automatically). Although this worked fine until recently, it now fails every time I run another heavy application. Hence, I'd like to find an alternative way of doing the same task without using Windows Task Scheduler.

 

Ideally the file should be saved to a Dropbox, but if it's saved to my computer then that's also fine. Autofilemove is a great tool for downloading to Dropbox, but it contains a lot of functionality that I won't be using, so it's a bit expensive for my purpose.

 

Any thoughts on which application might do the trick?

 

Thanks in advance!

Alexandra

 




#2
SleepyDude
Trusted Helper · Malware Removal · 4,974 posts

Hi Alexandra :welcome:
 
 
Try a batch script and cURL:

  • Download cURL from here (select 32-bit or 64-bit according to your Operating System); use the link Files Only
  • Extract the zip file, open the bin folder inside, and copy curl.exe to the folder where you want to create the script, let's say C:\MyScript
  • Press Windows key + R and type:
notepad c:\MyScript\GetData.cmd
  • Notepad will open; copy and paste the following batch script:
@echo off
setlocal
curl.exe -L -v -J -O "http://www.markets.rbs.nl/NL/ExcelExport.ashx?ListName=ProductList&config=ActiveProducts&st_search=smVUClJgyncg54s%2FXHulQrrA%2BlYEamThFGaF9vbZiuJIg20zaZF6lNpn7UgiO%2B1w5X5yJSeb%2BrJMuLVMD4lJuKs4HzFyqw2QkyMqQOuSjy8JZLSnGk721HD%2BIy0Nhwb0YIyOzcwVXKZFgAA9NssHoD0ujOsmAwJIX%2BIYEjGlSKe%2BatJCRfo3YYJ8cFt81ytLMQQNr5HV6CyfX6UkR04utQ%2BTcc7TqcLKJHDpJcgJexKWarN5Ke5sJm4YFAqzV1BxaVT6JZ8fO0AvQkfkteXrPXsqHR3%2F0kXZ7%2BTMXP4ERlw%3D" >fileout.txt
:: get the name of the file
for /F "tokens=2 delims='" %%f in (fileout.txt) do set FN=%%f
del fileout.txt
:: rename the file to add the appropriate file extension
ren %FN% %FN%.xls
:: copy the file to the Dropbox folder
REM xcopy %FN%.xls c:\Dropbox\...
  • Adjust the last line by removing REM and correcting the path c:\Dropbox\... to the final location where you want to save the downloaded file
  • Save the script file
  • To test the script, double-click GetData.cmd


#3
Valentino14
Topic Starter · Member · 16 posts

Hi SleepyDude,
 
Many, many thanks for your effort to help me on this one! I'd almost given up hope, but this looks quite good :spoton: The solution is quite easy to implement thanks to your script; I don't have any experience in programming such a thing myself.
 
The download works well when I run GetData.cmd as suggested, but I still wonder how to schedule it... Can this also be programmed in cURL, e.g. to run it daily at, let's say, 20:00? Or do I still need Windows Task Scheduler for that? I would prefer not to, so if you have a suggestion for scheduling without Task Scheduler, that would be great.

 

If my computer for some reason happened to be shut down, is there any way the task could be run over the web and save the file to the Dropbox web service? I guess this is tricky, but I'm not sure what cURL is capable of...
 
And what if I subsequently wanted to download a second file in a similar way from a different URL? Could I copy the script below the last line, starting with

curl.exe -L -v -J -O "http://www.new website"

 
And finally, the file is 8 MB rather than the 1.5 MB I get when I download it manually... Not a big problem, and it looks complete when I open it, but is there any logic to why it inflates when I use cURL for downloading?
 
Many thanks again, very impressive and good solution especially if it can be scheduled!!!
 
Alexandra


Edited by Valentino14, 02 May 2014 - 03:55 PM.


#4
SleepyDude
Trusted Helper · Malware Removal · 4,974 posts

> Hi SleepyDude,
>
> Many, many thanks for your effort to help me on this one! I'd almost given up hope but this looks quite good :spoton: The solution is quite easy to implement thanks to your script, I don't have any experience on programming such a thing myself.
>
> The download works well when i run GetData.cmd as suggested, but what I still wonder is how to schedule running this...? Can this also be programmed in cURL, eg to run it daily at let's say 20:00...? Or do I still need Windows Task Scheduler for that? I would prefer not to, so if you have a suggestion to schedule without using Task Scheduler that would be great.

 
The Task Scheduler is fine; just make it run GetData.cmd directly.
Your old setup probably failed because not all programs are made to work in non-interactive mode...
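If you do stick with the Task Scheduler, the task can also be created from the command line with the built-in schtasks tool. A minimal sketch, assuming the script lives in C:\MyScript; the task name is made up:

```bat
rem Create a scheduled task that runs the download script every day at 20:00.
rem "DailyExcelDownload" is just an example task name, and the script path
rem is assumed to be C:\MyScript\GetData.cmd -- adjust both to your setup.
schtasks /Create /SC DAILY /ST 20:00 /TN "DailyExcelDownload" /TR "C:\MyScript\GetData.cmd"
```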
 

> If my computer for some reason would happen to be shut down, is there any way it could be run over the web and save the file to Dropbox webservice...? I guess this is tricky, but I'm not sure what cURL is capable of...?

 
cURL is only a download tool that can be used in scripts.
The Task Scheduler can actually wake up a machine to run a task.
 

> And what if I would subsequently want to download a second file in a similar way from a different URL? Could i copy the script below the last line, starting with
>
> curl.exe -L -v -J -O "http://www.new website"

 

Yes.

 

> And finally, the file is 8mb rather than 1.5mb if I download it manually...not a big problem, it looks complete when I open it but is there any logic in why it should inflate if i use cURL for downloading?
>
> Many thanks again, very impressive and good solution especially if it can be scheduled!!!
>
> Alexandra

 

The link you provided is a little bit tricky. Is there any shorter link you visit before ending up at that big URL for the final download?

I'm not an expert on cURL. Some servers may need to receive the request in a specific way instead of the big URL, for example if you need to fill in a form before the request is sent.
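If the server does expect a form submission (an HTTP POST) rather than a plain GET, cURL can send one with -d. A hedged sketch only; the field names below are illustrative, not the site's actual form fields:

```bat
rem Sketch of an HTTP POST with cURL: -d switches the request to POST and
rem sends the listed fields as form data. The field names here are
rem assumptions for illustration, not the site's real form.
curl.exe -L -d "ListName=ProductList" -d "config=ActiveProducts" -o export.xls "http://www.markets.rbs.nl/NL/ExcelExport.ashx"
```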



#5
Valentino14
Topic Starter · Member · 16 posts

SleepyDude,

 

Thanks, I'll use Task Scheduler to run GetData.cmd then.

 

Last question: there's another link that I can download manually, but not with GetData.cmd. No idea why it's not working; it's exactly the same code as the one that works, but this one doesn't :no: :no:

It's breaking my head.

This is the code that's not working:

@echo off
setlocal
curl.exe -L -v -J -O "http://markets.rbs.de/DE/ExcelExport.ashx?ListName=ProductList&config=ActiveProducts&st_search=smVUClJgyncg54s%2fXHulQrrA%2blYEamThFGaF9vbZiuJIg20zaZF6lNpn7UgiO%2b1w5X5yJSeb%2brJMuLVMD4lJuKs4HzFyqw2QkyMqQOuSjy8JZLSnGk721HD%2bIy0Nhwb05ohkKlWNDokWjPaDOiq5gfojetCqB%2bSkgSdqstiCT6NC00M0N5SuGeTQW6Aw5uVXdJYRb%2fZiplTN7isdtuQDJxi8fv9wp6Dqs9KFQ045X7B3R%2fcYvtsAxlxM5ijoPCmYoxahipUnMLe81fIEETI4ptwcsFlMNk47Cxpo9e6S58mGvKl7IM58bq%2b%2f2bBgYMCG" >fileout.txt
:: get the name of the file
for /F "tokens=2 delims='" %%f in (fileout.txt) do set FN=%%f
del fileout.txt
:: rename the file to add the appropriate file extension
ren %FN% %FN%.xls
:: copy the file to the Dropbox folder
xcopy %FN%.xls c:\MyScript

The website which i'm trying to take the Excel file from, is http://markets.rbs.de/Mini-Futures

 

Hope you can give me another hand with this issue; it's rather important to me and I owe you big time :spoton:

 

Alexandra


Edited by Valentino14, 05 May 2014 - 02:02 AM.


#6
SleepyDude
Trusted Helper · Malware Removal · 4,974 posts

Hi,

 

cURL sometimes has problems determining the name of the file to save; in those cases we need to provide one ourselves to avoid the error.

@echo off
setlocal
cd /d "%~dp0"
curl.exe -L -v -J -o ExcelExport.ashx "http://markets.rbs.de/DE/ExcelExport.ashx?ListName=ProductList&config=ActiveProducts&st_search=smVUClJgyncg54s%2fXHulQrrA%2blYEamThFGaF9vbZiuJIg20zaZF6lNpn7UgiO%2b1w5X5yJSeb%2brJMuLVMD4lJuKs4HzFyqw2QkyMqQOuSjy8JZLSnGk721HD%2bIy0Nhwb05ohkKlWNDokWjPaDOiq5gfojetCqB%2bSkgSdqstiCT6NC00M0N5SuGeTQW6Aw5uVXdJYRb%2fZiplTN7isdtuQDJxi8fv9wp6Dqs9KFQ045X7B3R%2fcYvtsAxlxM5ijoPCmYoxahipUnMLe81fIEETI4ptwcsFlMNk47Cxpo9e6S58mGvKl7IM58bq%2b%2f2bBgYMCG"
:: copy the file to the Dropbox folder
rem xcopy ExcelExport.ashx c:\MyScript

I suspect that the last line (xcopy) in your script is wrong. If you have GetData.cmd in C:\MyScript, the downloaded file is already in that same folder; the idea is to copy the file to your Dropbox folder, and this way have it uploaded to the Dropbox servers and synced to all the computers/devices where you have Dropbox.
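For reference, the intended copy step might look like this; the Dropbox path is an assumption, so use the folder of your own Dropbox client:

```bat
rem After the download, copy the file into the local Dropbox folder; the
rem Dropbox client then uploads it and syncs it to your other devices.
rem The destination path below is a placeholder for your actual Dropbox folder.
xcopy /Y ExcelExport.ashx "C:\Users\Alexandra\Dropbox\"
```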



#7
Valentino14
Topic Starter · Member · 16 posts

Hi SleepyDude,

 

Based on your suggestion, and with some help from the manual, I adjusted the code, which now looks like this:

@echo off
setlocal
cd /d "%~dp0"
curl.exe -L -v -J -o c:/Path/To/Dropbox/"Name $(date +"%m-%d-%y").xls" "http://markets.rbs.de/DE/ExcelExport.ashx?ListName=ProductList&config=ActiveProducts&st_search=smVUClJgyncg54s%2fXHulQrrA%2blYEamThFGaF9vbZiuJIg20zaZF6lNpn7UgiO%2b1w5X5yJSeb%2brJMuLVMD4lJuKs4HzFyqw2QkyMqQOuSjy8JZLSnGk721HD%2bIy0Nhwb05ohkKlWNDokWjPaDOiq5gfojetCqB%2bSkgSdqstiCT6NC00M0N5SuGeTQW6Aw5uVXdJYRb%2fZiplTN7isdtuQDJxi8fv9wp6Dqs9KFQ045X7B3R%2fcYvtsAxlxM5ijoPCmYoxahipUnMLe81fIEETI4ptwcsFlMNk47Cxpo9e6S58mGvKl7IM58bq%2b%2f2bBgYMCG"
: copy the file to the Dropbox folder
rem xcopy ExcelExport.ashx c:\MyScript

As you can see, I'm using -o (not -O) followed by an explicit path name, and hence saving the file directly to Dropbox, which works well. I guess this means I can drop this part of the code...?

cd /d "%~dp0"
[...]
: copy the file to the Dropbox folder
rem xcopy ExcelExport.ashx c:\MyScript

To avoid the issue of existing names that you already signalled, I would like to add a timestamp to make every file name unique. I tried to do so by adding

$(date +"%m-%d-%y")

but this creates an error. Also %F (which I found in the manual) is not working. Do you have any idea where I'm going wrong with this timestamp?

 

As usual, many thanks!!

Alexandra



#8
SleepyDude
Trusted Helper · Malware Removal · 4,974 posts

> Hi SleepyDude,
>
> Based on your suggestion and with some help of the manual I adjusted the code and now looks like:
>
> @echo off
> setlocal
> cd /d "%~dp0"
> curl.exe -L -v -J -o c:/Path/To/Dropbox/"Name $(date +"%m-%d-%y").xls" "http://markets.rbs.de/DE/ExcelExport.ashx?ListName=ProductList&config=ActiveProducts&st_search=smVUClJgyncg54s%2fXHulQrrA%2blYEamThFGaF9vbZiuJIg20zaZF6lNpn7UgiO%2b1w5X5yJSeb%2brJMuLVMD4lJuKs4HzFyqw2QkyMqQOuSjy8JZLSnGk721HD%2bIy0Nhwb05ohkKlWNDokWjPaDOiq5gfojetCqB%2bSkgSdqstiCT6NC00M0N5SuGeTQW6Aw5uVXdJYRb%2fZiplTN7isdtuQDJxi8fv9wp6Dqs9KFQ045X7B3R%2fcYvtsAxlxM5ijoPCmYoxahipUnMLe81fIEETI4ptwcsFlMNk47Cxpo9e6S58mGvKl7IM58bq%2b%2f2bBgYMCG"
> : copy the file to the Dropbox folder
> rem xcopy ExcelExport.ashx c:\MyScript
>
> As you can see, I'm using -o (not -O) followed by explicit pathname and hence save the file directly to the Dropbox, which works well. I guess this means i can drop this part of the code...?
>
> cd /d "%~dp0"
> [...]
> : copy the file to the Dropbox folder
> rem xcopy ExcelExport.ashx c:\MyScript

 
Yes, you can remove the lines above and use -o with the full path and file name, but make sure you put them inside quotes.
 

> To avoid the issue of existing names you already signalled, I would like to add a timestamp to make every file's name unique. I tried to do so by adding
>
> $(date +"%m-%d-%y")
>
> but this creates an error. Also %F (as I found in the manual) is not working. Do you have any idea where I'm going wrong on this timestamp?

 


I suspect you found an example about running cURL on Linux; on Linux the $( ) syntax is used to capture the output of commands...

 

For Windows you have to use something like this:

@echo off
setlocal

:: cut time to only HH:MM:SS
set _TIME=%time:~0,8%
:: replace ':' by '_' because ':' isn't valid in file names
set _TIME=%_TIME::=_%
set FN=%date%-%_TIME%
:: File named: Report_dd-mm-yyyy-hh_mm_ss.xls
curl.exe -L -v -J -o "Report_%FN%.xls" "http://markets.rbs.de/DE/ExcelExport.ashx?ListName=ProductList&config=ActiveProducts&st_search=smVUClJgyncg54s%2fXHulQrrA%2blYEamThFGaF9vbZiuJIg20zaZF6lNpn7UgiO%2b1w5X5yJSeb%2brJMuLVMD4lJuKs4HzFyqw2QkyMqQOuSjy8JZLSnGk721HD%2bIy0Nhwb05ohkKlWNDokWjPaDOiq5gfojetCqB%2bSkgSdqstiCT6NC00M0N5SuGeTQW6Aw5uVXdJYRb%2fZiplTN7isdtuQDJxi8fv9wp6Dqs9KFQ045X7B3R%2fcYvtsAxlxM5ijoPCmYoxahipUnMLe81fIEETI4ptwcsFlMNk47Cxpo9e6S58mGvKl7IM58bq%2b%2f2bBgYMCG"


#9
Valentino14
Topic Starter · Member · 16 posts
SleepyDude, 
 
Thanks for the advice again, this code worked brilliantly! It saves the file with a timestamp to the Dropbox, which were two of the main requirements. The file size is still an issue, however...

When I download manually, the file size is 3.7 MB, versus 39 MB when I use cURL. I've found that this is because all products are downloaded (in a slightly different format), not only the MINIs that I'm after. Here are the steps for a manual download:
 
- Start at www.markets.rbs.de --> http://markets.rbs.d...e.aspx?pageID=3
- Click on Tab Tools --> http://markets.rbs.de/tools
- Click on MINI Finder --> http://markets.rbs.d...aspx?pageID=426
- Click on Excel icon --> http://markets.rbs.d...M58bq+/2bBgYMCG
- After clicking the Excel icon, the download starts automatically, which yields an .ASHX file of about 3.7 MB (all MINIs from RBS Germany).
 
If I run the script using that same last URL, however, it returns the 39 MB file! So all selections are lost, and instead it appears that the whole database is returned! Is there any way to download the 3.7 MB file using this solution?
 
I've looked extensively at different forums to find a solution for this but got a bit lost... I've read, for instance, about HTTP GET vs. POST downloads; is this related? A push in the right direction would be very welcome!!
 
Many thanks again!!!
 
Alexandra


#10
SleepyDude
Trusted Helper · Malware Removal · 4,974 posts

Hi,

 

I think I found the problem: the download URL includes the % symbol, which has a special meaning at the command prompt (it introduces a variable name).

 

Edit the URL and replace every % with %%.
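For example (using a made-up short URL, since the principle is the same for the long one):

```bat
rem In a batch file, % introduces a variable, so literal percent signs must
rem be doubled. An encoded value like abc%2bdef%3d in the browser URL
rem becomes abc%%2bdef%%3d inside the script. The URL below is a placeholder.
curl.exe -L -o data.xls "http://www.example.com/export?st_search=abc%%2bdef%%3d"
```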




#11
Valentino14
Topic Starter · Member · 16 posts

SleepyDude,

 

Indeed this works very well; replacing the single % gives the desired result, amazing!! Not sure how you discovered this...

 

Is there anything similar about the URL below as well? If I run this one in a browser, the downloaded file is also different from the one I get when cURL is used... although it doesn't have any % :headscratch:

 

http://www.ishares.c...ax?fileType=csv

 

Thanks, have an excellent weekend!

 

Alexandra



#12
SleepyDude
Trusted Helper · Malware Removal · 4,974 posts

> SleepyDude,
>
> Indeed this works very well, replacing the single % gives the desired result, amazing!! Not sure how you discovered this...
>
> Is there anything similar about the URL below as well...? If I run this one in a browser, the downloaded file is also different then the one I get when cURL is used... although it doesn't have any % :headscratch:
>
> http://www.ishares.c...ax?fileType=csv

 

I couldn't access that one; the direct link doesn't work. Maybe the site uses cookies, which is probably the reason cURL failed.

 

> Thanks, have an excellent weekend!

 

I wish you the same.



#13
Valentino14
Topic Starter · Member · 16 posts

SleepyDude, 

 

Isn't it possible to use this cURL process to download if a site uses cookies? The site doesn't ask me to accept cookies when I visit it, and it is possible to download the file manually.

 

Isn't there any workaround possible for downloading this file? For example, could this method work in your opinion?

http://kb.kristianre...try&EntryID=150

 

Or the suggestion from the manual at http://curl.haxx.se/docs/manual.html:

curl -c cookies.txt www.example.com

I spent quite some time during the weekend looking for a site with similar info but didn't find any... :smashcomp:

 

Thanks again!!

 

Alexandra 


Edited by Valentino14, 13 May 2014 - 03:31 PM.


#14
SleepyDude
Trusted Helper · Malware Removal · 4,974 posts

Hi,

 

Sorry for the delay, I have been busy...

 

I'm not sure how the website blocks direct linking; it could be using cookies or something else. I did some testing with the cookie options and couldn't make it work, sorry.



#15
Valentino14
Topic Starter · Member · 16 posts

SleepyDude, 

 

I've been playing around and got it to work :D The trick with the cookies.txt export extension from the Chrome Web Store worked pretty well. Let me know if you're interested in the code.
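Roughly, the cookies.txt trick can be sketched like this (the file name and URL are placeholders, not the actual site):

```bat
rem Export the site's cookies from the browser to cookies.txt (e.g. with a
rem cookies.txt export extension), then have cURL send them with -b so the
rem request looks like it comes from a logged-in browser session.
rem The URL below is a placeholder.
curl.exe -L -b cookies.txt -o fundlist.csv "http://www.example.com/products.csv?fileType=csv"
```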

 

Thanks for all the help; I've learned a lot from your tip about cURL.

 

See ya on the next challenge!

 

Alexandra







