I figured it out the exact moment you posted this. I will do a quick mass edit using Notepad++ (backing up the whole universe BEFORE I do so, just in case). Thanks again, LeeG!
GlobalGooglerLeeGOzz: Hey guys, someone mentioned they selected more than the default SEs. My question is, which particular SEs should I select? I'm confused about what to select.
There are a lot of different opinions on what to use, what works best.
I prefer the rule of less is more.
Google, no matter what country you use, can produce up to 100 results per page.
And all countries return the same results, give or take 6 to 8 positions.
Some engines, by default, use the blog search engines; that is something I have personally changed in the engine files so they are not used.
A lot of engines also draw their search results from the main players; there are very few independent engines out there. The independents are Yandex in Russia and Baidu in China, plus a few others.
Yahoo is Bing. Plus there are a load of others that feed off Google and Bing.
My own recommendations would be four to six random googles, plus four to six random google blog search engines.
But that's just me going against what everyone else recommends.
The more obscure the better, so you have less chance of the IPs you use being banned on them for too many search queries.
lrichard2112:
- 10 private proxies with 90 threads has tripled my verified
- use less restrictive options for finding TARGET URLS
- start low on threads (20-40), then every few hours go up a few threads until you reach the point where the computer does not freeze (CPU always at 99% is not good)
- use more keywords so you can get more targets
- experiment with more search engines
- HTML timeout should be a little above the thread count, I believe (100 threads - 105 HTML timeout)
- I usually leave the search time between queries alone
@LeeG - Photoshop was definitely a joke. Even the Germans caught it as such. ;-)
I always read your posts... very informative!
What an epic discussion.
I'm running some SE tests to see how to improve this area.
I'm looking for a system to measure keyword overlap. E.g. "blue widget" vs "lovely blue widget" has a big overlap. Basically, I'm trying to find a way to measure this overlap and only select keywords with a set uniqueness %. Does anybody have any ideas here?
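One way to attack this (not a SER feature, just a sketch): treat each keyword as a set of words and score pairs with Jaccard similarity, then keep a keyword only if it stays at or below a chosen overlap threshold against everything already kept. The threshold value and function names here are made up for illustration.

```python
def jaccard(a: str, b: str) -> float:
    """Token-set Jaccard similarity between two keyword phrases (0..1)."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb)

def filter_unique(keywords, max_overlap=0.5):
    """Keep a keyword only if its overlap with every already-kept
    keyword is at or below max_overlap."""
    kept = []
    for kw in keywords:
        if all(jaccard(kw, k) <= max_overlap for k in kept):
            kept.append(kw)
    return kept

# "lovely blue widget" overlaps "blue widget" by 2/3, so it is dropped.
print(filter_unique(["blue widget", "lovely blue widget", "red gadget"]))
# → ['blue widget', 'red gadget']
```

A uniqueness % is just (1 - overlap) * 100, so "keep keywords with at least 50% uniqueness" maps to max_overlap=0.5.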
If you use Scrapebox to scrape keywords, chances are you will end up with a lot of keywords along those lines.
Adding different Google engines to each project is a great way to learn how many different countries/regions there are, and how many countries are censored with no access to Google.
@LeeG "I'm looking for a system to measure keyword overlap. I.e. "blue widget" vs "lovely blue widget", has a big overlap. Basically, trying to find a way to measure this overlap and only select keyword with a set uniqueness %." Do you have any ideas on how to check/resolve this?
5) What's your custom search time between queries? - default
In less than 24 hours I got around 10,000 verified links with 6 projects + 1 tier each, 12 in total. Btw, before starting any project I recommend first scraping for website lists and importing them. That is how I did it, and believe it or not, I get tons of verified links daily.
Btw... how do I set SER to only search and not submit?
The opposite, to only submit and not search, you can achieve by disabling all SEs and unchecking the two checkboxes (always use keywords to find target sites + also analyse competitors' posts).
The most valuable tip I could give somebody that wants to put their submission count into hyperdrive is:
Set the custom verification on "no limit" projects to "Custom - 7200 minutes".
When you build links on lower tiers that have no limits, you will end up with projects that have tens or hundreds of thousands of submitted links to verify. If you just let GSA verify when it wants to, it will spend considerable time going through this process each day.
If you pace out when GSA does this, your submissions will go through the roof. I'm at 2,000 per hour on just 125 threads.
The best tip: get fast private proxies (PP) with response < 1 sec. I had 40 PP with response > 2 seconds + 360 threads, settings more or less tuned ;-) and got 20k submits + 2k verified every 24 hours. With the same number of PP but response < 1 sec ==> 130k submits + 15-30k verified each day.
360 threads on a dedi is ~50% CPU usage, running other tools in parallel. So you can hit much higher numbers if you want to get crazy here.
TOPtActics, I put my verify time down from 5 sec to 1 sec as you suggested and am getting much better submit/verify rates now... what a "few" seconds can do. Thx!
Thanks for that tip. I had it checked the whole time, thinking it was "safer," but never stopped to realize there isn't much downside to leaving it unchecked because most sites will prevent duplicate registration anyway.
Do you do this with upper tiers as well, or only lower tiers for some reason?
LeeG said:
If you want a quick blast of easy links: on the lower tiers, make sure you don't have a tick in the "don't post to the same domain" option, or however it's worded.
Most websites won't let you register a second time. If it's a forum, you're already registered, etc.
With social bookmarks, SER will log into the accounts you already have and add more links to those, making the accounts look more natural, rather than one bookmark per account.
It's an easy way to add extra links. And if you're using global site lists, from time to time links will be added to those accounts you already have set up.
@ron "Set the custom verification on "no limit" projects to "Custom - 7200 minutes".
That's 5 days. I read in another thread that SER will delete your submitted links after 5 days if not verified (unless you tick 'don't remove URLs'). I'm not sure exactly how it works, but if you set verify to 5 days, chances are SER will delete your submitted links as not verified, and by the time it actually starts verifying links, some of your links may already have been deleted from the submitted list.
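For reference, the arithmetic behind that warning, so you can see how close 7200 minutes runs to the limit:

```python
# Convert the verification intervals discussed above into days.
minutes_per_day = 60 * 24

print(7200 / minutes_per_day)  # 5.0 days -- exactly the reported deletion window
print(3000 / minutes_per_day)  # ~2.08 days -- a comfortable margin inside 5 days
```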
@LeeG - "With social bookmarks, SER will log into the accounts you already have and add more links to those, making the accounts look more natural, rather than one bookmark per account"
1) To do this, don't you have to clear your Target URL History/Cache?
2) Does it do it automatically?
3) Or do you have a set of master accounts that you keep using?
nicerice, the lines you're looking for are:
;0=no, 1=yes, 2=only blog search
use blog search=1
I just delete them myself
I know I sound extra cautious at times.
But some people are beyond help, and giving out the information is like giving them a lighter in a fireworks factory.
You can also add extra footprints to boost searches and results.
I was playing for a day to get things to a stage I was happy with.
Make an edit, run for a good few hours, and so on.
Cut out using Scrapebox or the built-in scraper and just add the footprints straight to those files.
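If you would rather script that mass edit than do it in Notepad++, here is a sketch. It assumes SER's engine definitions are plain-text .ini-style files containing the `use blog search=` line quoted above; the `Engines` path is a guess, so adjust it to your install, and back the folder up first.

```python
from pathlib import Path

# Assumed location of the engine definition files -- adjust to your install.
ENGINE_DIR = Path(r"C:\GSA Search Engine Ranker\Engines")

def disable_blog_search(text: str) -> str:
    """Flip 'use blog search=1' to '=0' (per the file comment: 0=no, 1=yes, 2=only blog search)."""
    return "\n".join(
        "use blog search=0" if line.strip() == "use blog search=1" else line
        for line in text.splitlines()
    )

if ENGINE_DIR.is_dir():
    for ini in ENGINE_DIR.glob("*.ini"):
        original = ini.read_text(encoding="utf-8", errors="ignore")
        updated = disable_blog_search(original)
        if updated != original:
            ini.write_text(updated, encoding="utf-8")
```

The same loop is also an easy place to append extra footprints to those files instead of scraping them in separately.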
1) How many private proxies are you using?
So I can't get more results?
Because I'm only getting 100+ results a week.
@lrichard2112 - Pick any project and open it...It's under the Options tab toward the top of the page under "How To Submit / Verify".
@TOPtActics - Fast private proxies and threads do make a big difference. Delay the verify function and you will have a lot more.
I usually test proxies in my own tools, e.g. UD.
@lrichard2112 - You got it right!
@hyde - The response times are all listed in GSA. Options=>Configure. 2nd column from the end.
To be on the safe side, I would set it at 3000.