https://www.gsa-online.de/download/image_spider_demo.exe
It offers a lot more search engines and, hopefully, a better and more intuitive GUI. Let me know what you think and if you are missing anything from the old version. It will replace the old one if no complaints arrive.
Comments
Good news is I haven't ever demoed this program so I can actually DL it soon and check it out!
I'm +1 on spiritfly's suggestions, and I would also ask for bulk image optimization which removes EXIF information and maybe resizes images into a specified range upon exporting them.
I guess it's the old version, b/c there's a spider in a hammock??
The new version's standalone export button is a very handy idea, and the naming flexibility of the old version is awesome. But please add the ability to change how the program saves files when the filenames are the same. Currently it does so by adding a number in parentheses, like this:
keyword(1)
keyword(2)
A flexible way would be to add a %number% macro so people could choose what to do with it. However, an option to choose the separator type (_, -, |, space) with or without parentheses would do as well, I guess.
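To illustrate the idea, here is a minimal sketch of how a %number% macro could drive collision-free naming. Note that %number% and %keyword% are the hypothetical macros suggested above, not existing Image Spider placeholders, and `make_unique` is an illustrative name:

```python
import os

def make_unique(folder, keyword, template="%keyword%_%number%", ext=".jpg"):
    """Fill in the hypothetical macros and bump %number% until the
    resulting filename does not yet exist in the target folder."""
    n = 0
    while True:
        n += 1
        name = (template.replace("%keyword%", keyword)
                        .replace("%number%", str(n)) + ext)
        if not os.path.exists(os.path.join(folder, name)):
            return name
```

With a template like `%keyword%-%number%` the user gets `keyword-1.jpg`, with `%keyword%(%number%)` the current parenthesis style, so one macro covers both preferences.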
I haven't found a way to set the number of images I want. In the old version there is an option to set "how many search results to parse", but if I set that to 10 the program doesn't stop at 10 images, so I don't really understand that function.
You may consider an import images from hard drive feature to have the ability to rename the already downloaded images.
The ability to add more keywords would also come in handy.
@Deeeeeeee you can download the new beta version from this very page above. The GSA website has the old version.
---
rename options: will get added
---
import from hd: yea sounds like a good feature
---
only 10 images per search...will think about it
Otherwise, my personal preference would be a separate settings window for batch image processing. Something similar to the settings menu, where there are checkboxes for all the filters and a field next to them for the values. Maybe ranges of values, if you want to add randomness to the process. Maybe you could even have save and load preset options there.
Maybe it's just personal preference, but I think that's faster than choosing everything from a pop-up one by one. The pop-up is cool when you are tinkering with one image trying to find the perfect fit, though.
Thanks for the preview window, import, exif and the preset options. I guess the file rename option is coming soon.
Also, there is a "check images" option in the right-click menu, but it does nothing for me, and I can't export only the selected images. It exports all of them.
so only thing left is the rename-option. Any gui-suggestions?
Having maximum px height and px width options with a "keep aspect ratio" checkbox would give us easier control over the final image size.
Everything else works flawlessly.
I did DL the old demo, as the link in the GSA product pages still had this one up.
@Sven, I like the new GUI, but that spider in a hammock was cool! lol Can't you shrink it down real small and stick it on the new page somewhere?
Is it possible at least to change that, or to pick which dimension is more important? Developing for the web, I mostly care about width.
Most image processing programs I've used resize the image automatically based on the tighter constraint. So if the picture was a 3000x2000 image and I entered 650x1000, it took 650 as the maximum allowed width and calculated the other number, which in this case is 433.
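The calculation described above (scale by the tighter constraint, keep aspect ratio) can be sketched in a few lines; `fit_within` is just an illustrative name, not an Image Spider function:

```python
def fit_within(src_w, src_h, max_w, max_h):
    """Return the largest size that fits inside max_w x max_h while
    preserving the source aspect ratio: scale by the smaller factor."""
    scale = min(max_w / src_w, max_h / src_h)
    return round(src_w * scale), round(src_h * scale)
```

For 3000x2000 into a 650x1000 box, the width factor (650/3000) is smaller than the height factor (1000/2000), so the result is 650x433, matching the example above.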
Have you removed the EXIF-remove checkbox, or is it just me who can't find it now?
--
max images: you have that in next update
--
check images will check if that URL delivers a valid one. that's however not showing a message when done, and when you scrape, all images should already be fine. so i will fix that as well in next update
Is it possible to add a URL blacklist to "search by keywords", or at least omit results from pixabay.com when using Google Image Search? I selected only Google Image Search, and it returns quite a few results from Pixabay. The problem is that Image Spider can't download them, or it downloads a grey picture saying "Discover pixabay free images No hotlinking".
If I use the built in pixabay scraper it works fine. It returns different results but at least those work fine.
Thanks for the del function! Could you make the selection bar not disappear and not jump back to the beginning or end of the list after every deleted image?
Thanks!
I've got 12 keywords and Google Image Search selected. When I hit OK it returns results for some of the keywords and stops as if there were no more keywords. When I hit search again, it scrapes some random keywords again, including some of the already scraped ones.
Is there any way to switch between licensed and free-to-use images in Google search?
---
that option is for all images and all keywords. It would stop after adding 10+ images if you enter 10 there even if you have 1000 keywords.
---
google: was that a URL parameter?
-----
Would it be possible to make it go through all the keywords and respect the limiter as a minimum for each keyword? That would make much more sense when scraping for multiple keywords for various projects.
---
I only used it through the GUI in the ScrapeBox image grabber add-on, or rarely in Google directly, but I guess it has a URL parameter as well. This may help (3.2): http://jwebnet.net/advancedgooglesearch.html
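For reference, a usage-rights filter like this is usually passed as a query-string parameter. The sketch below builds such a URL; the `tbs=sur:...` value is an assumption based on how Google Images' usage-rights filter has historically worked, and these undocumented parameters can change or disappear at any time:

```python
from urllib.parse import urlencode

def google_images_url(query, reuse_filter=None):
    """Build a Google Images search URL.

    reuse_filter is an ASSUMED value such as "sur:fc" ("labeled for
    reuse"); Google does not document or guarantee these parameters.
    """
    params = {"q": query, "tbm": "isch"}  # tbm=isch selects image search
    if reuse_filter:
        params["tbs"] = reuse_filter
    return "https://www.google.com/search?" + urlencode(params)
```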
Currently it looks like this:
Imported filename:
crazy-widget.jpg
slow-widget.jpg
Exported filename:
crazy-widget.jpg-some-modification.jpg
slow-widget.jpg-some-modification.jpg
It would be nice to have something like this:
crazy-widget-some-modification.jpg
slow-widget-some-modification.jpg
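The desired behavior above is to splice the modification suffix in before the extension rather than appending it after the full filename. A minimal sketch (the function name `add_suffix` is illustrative, not part of Image Spider):

```python
import os

def add_suffix(filename, suffix):
    """Split off the extension, append the suffix to the stem,
    and re-attach the extension."""
    stem, ext = os.path.splitext(filename)
    return f"{stem}-{suffix}{ext}"
```

So `add_suffix("crazy-widget.jpg", "some-modification")` yields `crazy-widget-some-modification.jpg` instead of the current `crazy-widget.jpg-some-modification.jpg`.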
Also, if you are looking to further expand the filter functions in IS, exposure and zoom functions would come in handy.
Thanks
the other filters are basically a resize + shave thing no?
Yeah, I guess you can skip the zooming. Resize and shave does the job.
I've got some minor feedback for you:
-Wouldn't it be easier to just include the %number% macro in the filename template field? If no %number% is used, IS would default to standard numbering. This would expand the flexibility of renaming, as we could add numbers in the middle of the filename.
-When scraping for images, aborting the process takes an eternity to stop IS, if it stops at all.
-When applying filters, IS slows to a snail's pace. Even entering a number into the boxes takes seconds, let alone using the slider. I don't see any extreme memory or CPU usage, though.
-In the filter windows, could you add next and previous arrows to the preview? Currently there is no way to check how the applied filters would look on the rest of the images. Something similar to the Viewer window would be perfect.
Meanwhile the docu: http://docu.gsa-online.de/image_spider
---
abort: should be faster in next update
---
filter: that's all happening in the main thread, as it has to due to GDI operations. i can't do much of a speedup here, sorry
---
next on filter: you can use tool->use filter and it will work even when not exporting. though i might be able to add a next/prev when in that mode.
My-super-keyword-1-BrandName.jpg
My-super-keyword-2-BrandName.jpg
Although I understand it's kind of specific, as people don't even bother renaming images anymore since they use WordPress anyway. I just thought I could save a step here, but it's fine like this too.
Thanks!
Like: only images "with that string" in the URL.
Only images "with that string" in the image name.
One or all of these options, usable together or separately.
For example, if I want a specific image type on one website:
https://exemple.com/0389/58740389/pics/photo_58740389_avatar_4.jpg
Another thing would be to block some URLs, or make others mandatory, for the spider by mask:
https://exemple.com/pics/photo_58740389_avatar_4.jpg
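The filtering requested above (keep URLs containing a string, block or require URLs by wildcard mask) could look roughly like this; `url_passes` and its parameters are hypothetical, chosen here just to illustrate the request:

```python
import fnmatch

def url_passes(url, required=None, blocked_masks=()):
    """Keep a URL only if it contains the required substring (when
    given) and matches none of the shell-style blocked masks."""
    if required and required not in url:
        return False
    return not any(fnmatch.fnmatch(url, mask) for mask in blocked_masks)
```

For instance, `required="avatar"` keeps only avatar images, while a mask like `*exemple.com/pics/*` blocks everything under that path.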
After scraping is done, it would also be good to be able to select all images or URLs by word.
Just for example:
https://exemple.com/photos/géométrie/?&pagi=2
https://exemple.com/photos/géométrie/?&pagi=[INCREMENT]
Enter the two values, start and end, and the spider visits the whole range without us having to click (in the rare cases where it could not follow pagination itself).
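The [INCREMENT] mask suggested above is easy to expand into a concrete URL list; the token name comes from the suggestion itself and is not an existing Image Spider feature:

```python
def expand_increment(url_mask, start, end, token="[INCREMENT]"):
    """Replace the placeholder token with each value in [start, end]
    to produce the full list of pagination URLs to visit."""
    return [url_mask.replace(token, str(i)) for i in range(start, end + 1)]
```

For example, expanding `https://exemple.com/photos/?&pagi=[INCREMENT]` from 1 to 3 yields the three page URLs ending in `pagi=1`, `pagi=2`, and `pagi=3`.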
For "accepted image type", would it be possible to define the extensions ourselves? (like .gif, .svg ... and all future formats).
I have added support for this in latest update. Read here for more:
http://docu.gsa-online.de/image_spider/scrape_settings#parse_urls
1. When using the Del key to delete images, the highlighted row disappears. It comes back when I hit one of the arrow keys, but it jumps back to the top of the URL list. It would be nice if it could remember the position, the same way the delete radio button works.
2. Clicking the green update button in the bottom right corner shuts down IS and nothing happens.
Thanks!
Any chance of adding more filters to it?
An exposure slider is greatly missed, as that is the proper way to lighten up pictures.
Others like vividness, highlight and shadow settings would be fancy too, but not as crucial.
Would it be possible to optionally load the images into RAM? A checkbox for machines with plenty of RAM, to preview faster which images to keep or delete. Then there would be no need to download the images again after the selection, just save them.
Regarding crawling a site, why do the threads go down to 0 before continuing? Wouldn't it be faster to fill a list from which threads could take URLs without interruption?
Regarding the code samples, I haven't given up yet, but I'm very close to it. I thought they'd be all over the net, as these things are kind of generic when it comes to image processing...