How would I go about downloading all the images in a specific web folder if they're not indexed? Like, let's say I wanted all the PA comics from '05; the web folder would be http://www.penny-arcade.com/images/2005/ (yyyymmdd.jpg). How could I go about getting copies of all the image files in that folder? Is it possible to automate?
[Edited on July 28, 2006 at 4:27 PM. Reason: naming convention]
7/28/2006 4:25:28 PM
You could write a PHP script to grab them all. They're named following a convention, so it wouldn't be hard. Unless they have that disabled, like TWW does.
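Something like this would do it; a minimal sketch, untested, assuming PHP with allow_url_fopen enabled (the base URL and yyyymmdd.jpg convention are from the first post, and files get saved to the current directory):

<?php
// Walk every date in 2005 and try to fetch the matching strip.
$base = 'http://www.penny-arcade.com/images/2005/';
for ($month = 1; $month <= 12; $month++) {
    // date('t') is the number of days in the month, so impossible dates are skipped.
    $days = (int) date('t', mktime(0, 0, 0, $month, 1, 2005));
    for ($day = 1; $day <= $days; $day++) {
        $name = sprintf('2005%02d%02d.jpg', $month, $day);
        // @ silences the warning on dates with no strip; false means the fetch failed (e.g. a 404).
        $data = @file_get_contents($base . $name);
        if ($data !== false) {
            file_put_contents($name, $data);
            echo "saved $name\n";
        }
    }
}
?>

The strips don't run every day, so most dates will just 404 and get skipped.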
7/28/2006 4:26:45 PM
There are programs out there that do this. I've never used any personally, so I can't suggest one.
7/28/2006 4:29:05 PM
I used to use a program called Offline Commander to do this. I don't know if it's the best (or even if you can still download it, as I haven't used it in years), but it's worth a shot. Go to download.com and search for "site rippers" and you should find tons of programs that do what you want.
7/28/2006 4:31:04 PM
Excellent, thanks.
7/28/2006 4:32:35 PM
Will firefusk do this? DownThemAll would work if there were an HTML page showing them all.
7/28/2006 5:58:08 PM
curl -f -O "http://www.penny-arcade.com/images/2005/2005[01-12][01-31].jpg"
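(curl expands the bracket ranges itself, and the leading zeros tell it to pad the numbers to match the yyyymmdd names; -O saves each hit under its remote filename. Dates with no strip come back as 404 pages, which plain -O would write out as .jpg files, so -f is in there to make curl skip the error bodies.)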
7/28/2006 6:08:48 PM
BlackWidow will do it, but since they're all in the same folder and named in an organized manner, a script would be much easier.
7/28/2006 6:10:02 PM