
Use A Random Search Engine

edited March 2013 in Feature Requests
I brought this up in another thread but not sure if Sven got to see it.

There seemed to be some support for the idea.

It's my understanding that if I have, say, 10 Google search engines selected, then SER will attempt to search each footprint/keyword on all 10 of them. Obviously, one option to avoid this would be to select only one search engine. However, as we all know, the problem with that is we would end up being temp-banned very quickly.

Would it not make more sense to have the option for SER to randomly select a different country's google (or other SE like Bing, Yahoo, etc) each time it searches and only conduct the search one time instead of repeating it on multiple google sites? We could then use all google sites and not have to worry about hitting any one of them too often. For those that really want to do the same search on multiple google sites then perhaps another option could be added. e.g. 'Repeat search xxx times' which would then repeat the search as many times as specified by the user, again using a random google site each time. Of course, for those wanting to keep the current way of doing things then that option should be kept available as well.
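The suggested behaviour can be sketched in a few lines. This is a hypothetical illustration, not SER's actual API: the engine list, the `pick_engines` name, and the `repeats` parameter (standing in for 'Repeat search xxx times') are all made up for the example.

```python
import random

# Illustrative engine list; SER's real list comes from its own configuration.
GOOGLE_ENGINES = ["google.com", "google.de", "google.co.uk", "google.fr", "google.se"]

def pick_engines(engines, repeats=1, seed=None):
    """Return `repeats` randomly chosen engines, one per search attempt."""
    rng = random.Random(seed)
    return [rng.choice(engines) for _ in range(repeats)]

# One search, one random engine:
print(pick_engines(GOOGLE_ENGINES))
# 'Repeat search xxx times': each repeat uses a freshly chosen random engine.
print(pick_engines(GOOGLE_ENGINES, repeats=3))
```

Each footprint/keyword would then hit one (or a few) randomly chosen Googles instead of all ten, spreading the load across the whole list.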

Does anyone else here think this would be a good addition?


  • Sven

    Let me explain why this is not making much sense to me.

    All Google search engines added have a special parameter that limits the search results to those coming from that specific country or written in that defined language.

    If a search with "powered by xyz" in it is now performed on the German Google, it will return German sites; on the British Google, it will return British sites. Thinking the content would be the same is not correct.

  • Thanks for your reply Sven. If that is the case then it would be even better as it would allow for much greater link diversity from a wider range of sites from different countries. However I'm not in complete agreement that that is the way it works.

    I am in SE Asia and just did a search for "leave a comment" "dishwasher" and didn't see a single German site appear in the first hundred results.

    The only way I could get German sites to show up in the results was to include a German word in my search, like "leave a comment" "Spülmaschine".

    Furthermore, when I scrape with another program I often have to use a language or country operator if I want the majority of sites to be in, say, German, Swedish or Danish.

    I'm sure there are others apart from me that would be grateful if you can give more consideration to including the aforementioned suggestion in a future update.

  • Sven

    >I am in SE Asia and just did a search for "leave a comment" "dishwasher" and didn't see a single German site appear in the first hundred results.

    Did you just change the domain, or did you also add the proper parameter to the URL showing only results from a particular language/country?

  • You also need to delete your browser's cookies, cache and history because of Google's filter bubble.
  • @Sven. Do you mean as in add "&gl=de" to the url? I just tried that and got the same results as previously.

    @Ozz. I just cleared the cache, cookies and history on my vps and then ran the same test and still got the same results as previously.
  • AlexR (Cape Town)
    @LeeG - you offered a good explanation on this on another forum. Can you add your thoughts here? 

    Wasn't the issue that if you use a country proxy, then no matter which Google you use it will display the results from that country but not give you the ban? I.e. if you use a US proxy it will display US results, but not give you the ban?
  • Sven
    @radikal open the se.dat file and have a look for yourself.
  • Ok having looked at the se.dat file I see, to a certain degree, the difference in the search results when using the parameters that you've set SER up to use. What I still don't get is why it wouldn't make much sense for us to have the option for SER to randomly use a different search engine for each search. I've seen it mentioned elsewhere on this forum that it is wise to select a different set of 5 or 6 search engines for each project so as to not have all the projects hit the same ones all the time.

    As it is at the moment, we are pretty much compelled to select those 5 or 6 random SE's for each project and are then stuck with using those same ones until we go in and change them manually. It is my understanding that we shouldn't select any more than that because 1) we will end up getting quite a few duplicate results and 2) other power users here, like LeeG, seem to have proven that selecting just 5 or 6 will actually help us increase our LpM. That being the case, I can't see why being able to select, say, all Google search engines, and having a couple of options like 'use selected randomly' and 'Repeat search xxx times' wouldn't be of any use to us. I think having that level of control would enable us to tweak our LpM to our hearts' desire.

    Just my 2 cents.
  • ron

    Back in the day (around September or October) I wrote a comprehensive dialogue on repeat search results. Bing owns Yahoo. Google repeats itself many times on islands close to the U.S., providing the exact same results as in the U.S. InfoSpace powers most metacrawlers, etc.

    So no matter what you change up, you will get the same results unless you simply wise up and pay attention to how narrow the search engines have become in providing unique search results.

    All LeeG is saying is that it is a complete waste of time to keep getting the same search engines to spew up the same results. It makes zero sense.

    All I am suggesting is that one day - when you are completely bored - do what I did. Open up a dozen browsers, and type in the same search phrase in Google, Bing, Yahoo, StartPage, Excite, Dogpile, and all the others - and see what happens. You will be shocked. Then you will truly understand.

    Until then, you are simply speculating. Once you see what happens, you will get it, and you will be the biggest trumpeter of what we are telling you.    
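The experiment ron describes can be quantified rather than eyeballed: scrape the top results from two engines and measure how much they overlap. This is a hedged sketch with made-up result lists; in a real test the lists would be the actual top URLs from each engine.

```python
def jaccard(a, b):
    """Overlap between two sets of result URLs: 1.0 = identical, 0.0 = disjoint."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Placeholder top results purely for illustration.
google = ["site-a.com", "site-b.com", "site-c.com"]
yahoo  = ["site-a.com", "site-b.com", "site-d.com"]  # Bing-powered, so largely the same

print(f"Google vs Yahoo overlap: {jaccard(google, yahoo):.2f}")
```

A score near 1.0 across engine pairs would confirm the point being made here: many "different" engines return essentially the same targets.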

  • LeeG (Eating your first bourne)

    Even easier is to download Traffic Travis (free version) and do the test.

    That's when the penny dropped for me, and it gave me a way of proving my own assumptions.

    Find a phrase or keyword you know the top results for, then run a test with it. Or even use one of your own sites, then watch the top twenty positions on all search results.

  • Not sure we are entirely on the same page here.

    @ron..."All LeeG is saying is that it is a complete waste of time to keep getting the same search engines to spew up the same results. It makes zero sense."

    I totally agree. That has been my point all along.

    At the end of the day I basically don't care whether the results are the same, different, or otherwise. My suggestion was all about making more use of all of the search engines, but not all at the same time. The benefits?

    1) More control over how many times an individual search is carried out: 1, 3, 5 or even 100 or so. The choice would be left up to the user. Personally I would be limiting it to 2 or 3. Want to use just one SE? That'll be fine too because it will be a different one each time.
    2) Less chance of getting our proxies banned/temp banned.
    3) Eliminate the task of ever having to change which search engines each project uses.
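Benefit 2) essentially amounts to rate-limiting each engine. A minimal sketch of that idea, assuming a per-engine cooldown; the class name, timings, and engine strings are all hypothetical, not anything SER exposes:

```python
import random
import time

class EngineRotator:
    """Pick a random engine, skipping any queried within `cooldown` seconds.

    Hypothetical sketch: spreading queries out so no single engine is hit
    often enough to get the proxy temp-banned.
    """
    def __init__(self, engines, cooldown=60.0):
        self.cooldown = cooldown
        self.last_used = {e: 0.0 for e in engines}

    def next_engine(self, now=None):
        now = time.monotonic() if now is None else now
        ready = [e for e, t in self.last_used.items() if now - t >= self.cooldown]
        if not ready:
            return None  # everything is cooling down; caller should wait
        engine = random.choice(ready)
        self.last_used[engine] = now
        return engine
```

With a long enough engine list, some engine is always off cooldown, so searches never have to pause and no single Google gets hammered.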

    Should the results differ depending on which country's google you are using then that is fine. If not, then that is fine too. I just want to find targets wherever they are from.

    My thoughts were that the inclusion of this feature would allow us to make better use of all of the SE's, give us a little more control over tailoring SER to our needs, possibly help us cast our nets a bit further and, dare I say it, enable us to tweak our LpM for the better. I'm sure we'd all like some of that wouldn't we?
