@richardbesson and @judderman - That's the whole point of renaming the engines. When you rename the modified engines, SER doesn't overwrite them. This is all done in step 3.
I uninstalled the old version and reinstalled from your new link. It shows the correct changelog, but the software is still showing 1.0.0.0. Please PM me or tell me how to delete it completely so I can reinstall. Thank you.
I wanted to write this a couple of weeks ago, but forgot all about it. I tried using your software to get the number of indexed pages for some footprints, but I discovered that for footprints in quotes with no indexed pages, it returns the result for the footprint without quotes. Google itself will tell you that there are no results for the search query. Would you be able to make the tool react to this message and return the correct number (0)?
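The fix being requested could be sketched roughly like this: before trusting the parsed result count, check the page for Google's "no results" notice and return 0 when it appears. This is a hypothetical sketch, not the tool's actual code; the marker strings are assumptions based on Google's usual wording and may change.

```python
# Hypothetical sketch: detect Google's "no results" notice so a quoted
# footprint with zero hits reports 0 instead of falling back to the
# unquoted count. Marker strings are assumptions and may change.
NO_RESULT_MARKERS = [
    "did not match any documents",
    "No results found for",
]

def indexed_count(html: str, parsed_count: int) -> int:
    """Return 0 when the page says the exact query had no results,
    otherwise return the count parsed from the page."""
    if any(marker in html for marker in NO_RESULT_MARKERS):
        return 0
    return parsed_count
```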
@s4nt0s Such a nice tool, and thanks so much for fixing the 'no results' issue. I just have one question regarding the methods you mention for extracting footprints.
Can you explain why importing your verified list for a specific platform and extracting the footprints from it is so much less effective than scraping a fresh list and extracting the footprints from that? Correct me if I'm wrong, but the way I see it, you're adding a lot of potentially false footprints this way: from that scraped list, maybe only 5-15% (at best) actually get verified using GSA SER, and the rest is unrelated junk we don't need. If we extract footprints from this unverified URL list, we'll automatically pick up those false footprints, because 85% of the list consists of links that wouldn't get verified, and we're extracting their footprints anyway.
We might actually get more scraping results, because we unlock a lot more potential websites, but won't the submitted/verified ratio of our scraped lists drop like a rock? Wouldn't it then be better to stick to far fewer footprints (using your already verified lists), but have a much higher chance of extracting footprints that return relevant results?
I'm absolutely not saying your technique is bad; I'm a complete noob when it comes to building customized footprints. I'm just explaining how I see it, and I'd like to learn from you and your experience.
@Tixxpff - Yes, I showed that example in the video. It was less effective because, for some reason, Gscraper wouldn't extract the footprints from my imported list, or would return very few footprints. A lot of the time I would end up with none. I should have clarified that better in the video.
By all means that is a great way to do it and if you can get Gscraper to return footprints that way, definitely take advantage of it. On my end, Gscraper would only pull out footprints from lists that it had just scraped.
I need to try it again on the newer versions, maybe it performs better now.
At the end of the day, go with whatever method you feel is best for you. There might even be better ways than what I explained in the video. The vid is there to help you get started, but feel free to do it however you like.
@S4nt0s sorry for the newb question, but if I create a bunch of new modified engines can I then export the data to my servers? Or do I have to load up the tool on the servers and export the data through the tool first?
Just before I spend time, wanted to ask first.
(lack of sleep is giving me a fuzzy head at the moment LOL)
Sorry to bump this, but is it possible to export the modified engines to my other servers? I've been doing each and every engine one by one, and it's taking forever. I can't see an option to export... or is there another way? Plus it's killing my proxies.
Err nope, didn't know that, thanks Santos. Maybe I'm being dumb, but I can't seem to add footprints, only URLs, which I don't want to do. I have over 10k footprints I want to add...
Thanks @Yashar. Wasn't sure if it would work like that.
@Anyonymous - Hmm, I definitely see the next button showing on my end. Installed on 2 different machines using 2 different operating systems and the button shows so I'm not sure what's going on there.
Make sure you grab the version from the original post in this thread.
But didn't you want the files to get updated with the new scripts? If not, you will end up with scripts that don't work due to old code. Did I miss something?
Open Program Files (x86) and find the GSA SER folder, open the Engines folder, rar/zip the engine files you want, and copy them to the same folder on your other servers.