Wallpaper Extractor


PuppyKatt

Right-click the image, select Save As..., then save it in Windows as a
.bmp. When you go into your display settings, it will be listed as one
of your wallpapers. No special program required. HTH.


John said:
Is there a program to extract a bunch of wallpapers from websites?
 

John R. Sellers

PuppyKatt said:
Right-click the image, select Save As..., then save it in Windows as a
.bmp.

You missed my point entirely.

I want a program to do this automatically so I won't have to spend all day
doing this manually.
 

MightyKitten

John said:
Is there a program to extract a bunch of wallpapers from websites?

Well, you could of course completely rip a website (with or without an
image extension filter), but that also has some disadvantages:

1) Some sites/hosting providers will mark your attempt as a hacking attempt,
putting you on a blacklist.

2) The only freeware webripper (only freeware for private use, btw) I know of
is http://www.webreaper.net/ and, to be honest, I'm not too keen on this
app. (It does its job, but it nags too much for my taste.)

3) Ripping a webspace will cost the website owner a lot of bandwidth (not
to mention your own bandwidth), so if you do this, please be careful with
what you want to rip.

MightyKitten
 

John R. Sellers

MightyKitten said:
Well, you could of course completely rip a website (with or without an
image extension filter), but that also has some disadvantages:

1) Some sites/hosting providers will mark your attempt as a hacking attempt,
putting you on a blacklist.

2) The only freeware webripper (only freeware for private use, btw) I know of
is http://www.webreaper.net/ and, to be honest, I'm not too keen on this
app. (It does its job, but it nags too much for my taste.)

3) Ripping a webspace will cost the website owner a lot of bandwidth (not
to mention your own bandwidth), so if you do this, please be careful with
what you want to rip.

Good to know. I guess my best bet is to save each wallpaper manually (or
start using Webshots again).
 

Gabriele Neukam

On that special day, John R. Sellers, ([email protected]) said...
I want a program to do this automatically so I won't have to spend all day
doing this manually.

If you only want to collect wallpapers automatically, you might just as
well generate your own - automatically.

There is a program called Starfish that does this job in the
background, at regular intervals which you can set as you like, as well
as the range of colours used and the intended size of the wallpaper.

===========================
Starfish For Windows
Windows wallpaper generator
version 2.0, 6 July 2003
===========================

http://my.en.com/~jwk/starfish_for_windows/

or from sourceforge.net


Gabriele Neukam

(e-mail address removed)
 

Son Of Spy

John said:
Good to know. I guess my best bet is to save each wallpaper manually (or
start using Webshots again).

No it isn't ;^)

There are also HTTrack and Webdownloader, non-nagging site rippers.

But you just want the images...

Picture Downloader (Mihov's): Do you want to download a lot of pictures from
one page, but you don't want to click every thumbnail or link to that
picture and then save it? Mihov Picture Downloader is exactly what you
need. It scans a page that you specify for each and every picture that
is linked from that page. Then it displays all pictures found, and you can
select only the pictures that you would like to download. One click
and the download of all pictures starts. It's that easy!
On My Internet3 page or HERE:
http://download.mihov.com/pd.exe ~263Kb
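As the description says, such a tool boils down to scanning one page for every embedded or linked picture. A minimal Python sketch of that kind of scan (my own illustration of the idea, not Mihov's actual code; `find_image_links` and the sample URLs are made up):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

IMAGE_EXTS = (".jpg", ".jpeg", ".gif", ".png", ".bmp")

class ImageLinkScanner(HTMLParser):
    """Collect every image embedded on, or linked from, a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.found = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # <img src=...> embeds a picture; <a href=...> may link to one
        url = attrs.get("src") if tag == "img" else attrs.get("href")
        if url and url.lower().endswith(IMAGE_EXTS):
            self.found.append(urljoin(self.base_url, url))

def find_image_links(html, base_url):
    scanner = ImageLinkScanner(base_url)
    scanner.feed(html)
    return scanner.found
```

Feed it the HTML of the page you specify, and the result is the list of picture URLs a downloader would then fetch one by one.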

Or for an entire website:

Web Image Collector is a free program you can use to spider Websites for
image files. It provides a handy wizard interface to get the process
started. You can select a site from your Favorites folder, or simply type
in a URL. Then enter optional login information and a destination for the
actual image files. You can choose to stay in the same folder or spider all
available links to whatever level you choose. You can even skip duplicates
and declare minimum and maximum file sizes. The scan produces thumbnail
images of all files found. Just double-click on a thumbnail to view the
image in its associated program. An odd but interesting feature of Web
Image Collector is the inclusion of an animated Microsoft Agent character
that vocally steps you through the setup process...

Naturally, the site has stopped updating or offering this app for download,
however... **I** have a copy. If you prefer this option, go to my website
and contact me... we can work something out...

But please PLEASE don't go back to Webshots 8^P

Cheers!

Son Of Spy
--

Some You Won't Find Anywhere Else...

http://www.sover.net/~wysiwygx/index.html
 

John R. Sellers

Son Of Spy said:
Picture Downloader (Mihov's): Do you want to download a lot of pictures from
one page, but you don't want to click every thumbnail or link to that
picture and then save it? Mihov Picture Downloader is exactly what you
need. It scans a page that you specify for each and every picture that
is linked from that page. Then it displays all pictures found, and you can
select only the pictures that you would like to download. One click
and the download of all pictures starts. It's that easy!
On My Internet3 page or HERE:
http://download.mihov.com/pd.exe ~263Kb

Just tried it. It does the job. Thanks.
 

Exeter

In alt.comp.freeware on Tue, 13 Apr 2004 15:52:03 -0500, "John R.
Sellers" said:
Apparently, I 'spoke' too soon. I put
http://www.visualparadox.com/images/index.htm in the URL space, set it to
download *.jpg, clicked "Load page", and... nothing. It did work on the test
URL.
That is odd; you should have come up with one JPG, since there is *one*
JPG on that page. Take a look at the links: they don't link to the
images, they link to another page.

This is one of the links...
http://www.visualparadox.com/wallpapers/aboveclouds.htm

The actual image is called from that page. The actual image is at...
http://www.visualparadox.com/images/no-linking-allowed/aboveclouds.jpg
Since there is no JPG linked by the list, there is nothing for the
program to find.
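That two-hop layout (index page, wallpaper page, then the actual image) takes a second round of link-following to defeat. A rough Python sketch of the idea, written against a made-up example.com layout mirroring visualparadox's; the `fetch` parameter is any callable that returns a URL's HTML, so this is an illustration, not any particular program's code:

```python
import re
from urllib.parse import urljoin

def collect_jpgs_two_hops(fetch, index_url):
    """Follow each .htm page linked from the index, then collect the
    .jpg files actually referenced on those pages.
    `fetch(url)` must return the page's HTML as a string."""
    links = lambda html: re.findall(r'(?:href|src)="([^"]+)"', html)
    jpgs = []
    for href in links(fetch(index_url)):
        if not href.lower().endswith(".htm"):
            continue  # only descend into the per-wallpaper pages
        page_url = urljoin(index_url, href)
        for inner in links(fetch(page_url)):
            if inner.lower().endswith(".jpg"):
                jpgs.append(urljoin(page_url, inner))
    return jpgs
```

A downloader that only scans the start page for `*.jpg` never reaches the second hop, which is exactly the "nothing found" symptom described above.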

--
 

MightyKitten

Exeter said:
That is odd; you should have come up with one JPG, since there is *one*
JPG on that page. Take a look at the links: they don't link to the
images, they link to another page.
</SNIP>

Might it be that the picture(s) are embedded in an additional frame?
I've seen this kind of 'protection' before (flash game sites are very keen
on this technique, since many Flash Grabbers have difficulties finding
the right flash file*). If Picture Downloader works in the same way, it just
might be fooled by a simple technique like frames.

Therefore, if I really want to have 80% or more of a site, I often just use a
Site Ripper (or an offline browser > sounds nicer) to grab it all and filter
out the junk. Also handy if you need to edit a website for others, filtering
out all the unused junk. Though you will lose any server-based scripting, most
sites I have to edit do not use those techniques.

*
Most of them compare the list of flash movies in the selected frame with the
flash movies found in the Internet Cache directory. Embedding the flash games
in a separate frame the size of the flash movie itself will successfully
hide the flash movie from those kinds of grabbers. Therefore I like Flash
grabbers that just search the Internet cache.

MightyKitten
 

starwars

Exeter said:

That is odd; you should have come up with one JPG, since there is *one*
JPG on that page. Take a look at the links: they don't link to the
images, they link to another page.

This is one of the links...
http://www.visualparadox.com/wallpapers/aboveclouds.htm

The actual image is called from that page. The actual image is at...
http://www.visualparadox.com/images/no-linking-allowed/aboveclouds.jpg
Since there is no JPG linked by the list, there is nothing for the
program to find.

Get Picpluck 2.1 - there are lots of download sites on the web - just
do a Google search for "picpluck". It worked when I tested the
http://www.visualparadox.com/wallpapers/aboveclouds.htm link. You
may need to set it to download pictures only one or two clicks away
if you don't want to spider the whole visualparadox site. It can be
set to not download duplicate-named files unless you set it to avoid
filename collisions.
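The "one or two clicks away" setting is a depth limit on the crawl. A minimal sketch of a depth-limited image crawl with duplicate skipping (my own illustration, not Picpluck's code; `fetch` is any callable returning a URL's HTML):

```python
import re
from urllib.parse import urljoin

IMAGE_EXTS = (".jpg", ".jpeg", ".gif", ".png", ".bmp")

def crawl_images(fetch, url, depth, seen=None):
    """Collect image URLs up to `depth` clicks from the start page,
    skipping pages and images already seen."""
    seen = set() if seen is None else seen
    if depth < 0 or url in seen:
        return []
    seen.add(url)
    found = []
    for link in re.findall(r'(?:href|src)="([^"]+)"', fetch(url)):
        target = urljoin(url, link)
        if target.lower().endswith(IMAGE_EXTS):
            if target not in seen:  # skip duplicate downloads
                seen.add(target)
                found.append(target)
        elif target.lower().endswith((".htm", ".html")):
            found += crawl_images(fetch, target, depth - 1, seen)
    return found
```

With `depth=1` ("one click away") it picks up images on pages linked from the start page but goes no deeper, which is what keeps it from spidering a whole site.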
 

CoMa

starwars said:
Get Picpluck 2.1 - there are lots of download sites on the web - just
do a Google search for "picpluck". It worked when I tested the
http://www.visualparadox.com/wallpapers/aboveclouds.htm link. You
may need to set it to download pictures only one or two clicks away
if you don't want to spider the whole visualparadox site. It can be
set to not download duplicate-named files unless you set it to avoid
filename collisions.

PicPluck 2.1
http://www.winsite.com/bin/Info?500000023952

PicPluck 2.1 is an adware program,
if I'm not mistaken.


/CoMa


--
Conny (CoMa) Magnusson
(e-mail address removed)
http://www.algonet.se/~hubbabub/
ICQ : 1351964
=============================
Don't ask me, I'm making this up as I go...
 

omega

MightyKitten said:
Might it be that the picture(s) are embedded in an additional frame?

No frames, but perhaps close enough to the barrier you might be referencing:
Each jpg is linked from an individual page.

http://www.visualparadox.com/images/index.htm
http://www.visualparadox.com/wallpapers/alienpods.htm
http://www.visualparadox.com/images/no-linking-allowed/alienpods.jpg

(btw, at further click-depth, for pics offered in specific resolutions:
http://www.visualparadox.com/wallpapers/alienpods800.htm
http://www.visualparadox.com/images/no-linking-allowed/alienpods800.jpg)
Therefore, if I really want to have 80% or more of a site, I often just use a
Site Ripper (or an offline browser > sounds nicer) to grab it all and filter
out the junk.

Httrack is a very smart bot.

However, I could not come up with filters to get it to download strictly
the jpgs. No matter how I approached the matter, it was necessary for it
to download the htm's in addition to the jpgs.

IOW, if I wanted alienpods.jpg, I had to also receive alienpods.htm.

An Httrack filter set, with http://www.visualparadox.com/images/index.htm
as the start URL, would go something like this:

-> -*
(Note that first kill-all filter; it is often the best way to begin.)

-> +http://www.visualparadox.com/images/index.htm
-> +http://www.visualparadox.com/wallpapers/*.htm
-> +http://www.visualparadox.com/images/no-linking-allowed/*.jpg

(Slim the retrieval a bit by excluding the alternate resolutions:)
-> -*800.jpg -*640.jpg -*800.htm -*640.htm

The problem is that the filter does not result in what we want. While
it will get the jpgs, it will also waste a lot of bandwidth saving all
the htms to disk.

Yes, I did try various kill filters in the attempt to ask Httrack not to
download the htms. But the result of those tries was that the htms were
not even scanned. (If they are not scanned/read for links, the target
jpgs cannot be found.)

I ran out of all possible ideas on the filters. So I finally decided
to resort to testing with settings in the "Experts Only" tab. I believe
Xavier Roche when he repeatedly advises we not change the defaults there,
that we instead achieve our objectives via the filters ("Scan Rules") tab.
But this time, I decided to give it a go.

On that tab, under "Primary Scan Rule," I chose the setting "Store
non-html files", which excludes storing html files. The result was that all
those htms were no longer appearing in the download folder. Yet, it
seemed just as slow (on my connection, all things are slow; but the point
is that the amount going on seemed unchanged).

So then, on that same "Experts Only" tab, I made one other change. I
deselected "Use a Cache for Updates." This is something I would not
normally do, as it means that you cannot resume a partial download
project, and would have to start from scratch. Yet if your connection
is fast and you also don't plan on later getting updates for a site,
then I suppose it would not be bad.

With these two additional changes to the settings, the result is that
only the jpgs are downloaded.

Greater speed? Less bandwidth? Less workload on the target server? All
I can say is: not that I could observe.[*] I can only report as far as
success for what ends up showing downloaded on the drive.

Conclusion, then: Httrack can handle this project. I don't know if there
is a better method for its settings than the approach I've outlined.
On the one hand, I consider myself no Httrack expert. But on the other,
I sunk my teeth in and tried all manner of variation before coming up
with this final approach.


--
Karen S.

_________
[*] As far as relative bandwidth goes, technical guesswork is a bit more
than I would feel qualified to go into. As far as speed/time gains...
my slow connection precludes my ability to do tests. Besides, one would
have to select a target server with a parallel directory structure,
whose owner one held in enough disdain, before setting into a serious
round of such testing.
 
