Two 50k Verified Domain Lists - Only ~260 verified, then it stops? What am I doing wrong?
Hey guys, in the past 2 days I bought two 50k verified domain lists. One of them is Ron's famous Serlist.
I followed all the PDF instructions and optimizations. I have barely any limits set: no PR filters, no daily limits, nothing. I'm using multiple emails and 90 private proxies.
But when I load the list and select only contextual engines (Articles, Social Networks, Wikis, no junk links), I get about 260 verified links, and then SER just sort of stops...
Am I doing something wrong? It's happening with both 50k lists, so I figure it has to be something on my end.
If anyone can help me in practice so I can blast through all 100k of my purchased lists instead of just a few hundred, I'm willing to pay for private coaching via PayPal to get this sorted out, if someone can use TeamViewer to log in to my VPS and show me how to change the settings. Thanks!
Comments
- Check how many of your proxies are actually working (see the proxy-check sketch after this list).
- How many threads are you running GSA SER on?
- Which captcha-solving service/software? Captcha Breaker? Spamvilla? Depending on how the lists were created, you may need different captcha-solving services. Is it working? Start the project and watch Captcha Breaker's log to see whether it is connecting, receiving, and solving captchas.
- How many retries have you set for your captcha service?
- What types of email accounts do you use? Temporary, public (Yahoo, Outlook), or private (account@yourdomain.com)? I have found that using your own hosted accounts works best. Check this tutorial to learn how to create thousands of them in seconds. And remember to always test your email accounts (see the email-test sketch after this list).
- How many email accounts are you using? (Use 5 to 10 for each project.)
- Any unwanted messages in the log, e.g. lots of "download failed" entries?
-Any "important messages" for the project?
- How do you import the list?
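On the proxy point above: SER has its own proxy tester, but if you want an independent check, here is a minimal stand-alone sketch in Python using the requests library. The proxy addresses, test URL, timeout, and worker count are placeholder assumptions; substitute your own 90 private proxies.

```python
import concurrent.futures
import requests

# Hypothetical proxy list: replace with your own host:port entries.
PROXIES = [
    "123.45.67.89:8080",
    "98.76.54.32:3128",
]

TEST_URL = "https://www.google.com"  # any site that should always respond
TIMEOUT = 10  # seconds; a proxy too slow to answer is as useless as a dead one

def check_proxy(proxy: str) -> bool:
    """Return True if an HTTP GET routed through the proxy succeeds."""
    try:
        r = requests.get(
            TEST_URL,
            proxies={"http": f"http://{proxy}", "https": f"http://{proxy}"},
            timeout=TIMEOUT,
        )
        return r.status_code == 200
    except requests.RequestException:
        return False

if __name__ == "__main__":
    # Test all proxies in parallel so 90 of them don't take 15 minutes.
    with concurrent.futures.ThreadPoolExecutor(max_workers=20) as pool:
        results = dict(zip(PROXIES, pool.map(check_proxy, PROXIES)))
    alive = [p for p, ok in results.items() if ok]
    print(f"{len(alive)}/{len(PROXIES)} proxies responding")
    for p, ok in results.items():
        print(f"{'OK  ' if ok else 'DEAD'} {p}")
```

If a big chunk of the list comes back DEAD, that alone can explain a project stalling out.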
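And on testing email accounts: a minimal bulk login check might look like the sketch below, assuming your mail host allows IMAP over SSL on the standard port 993. The host names, addresses, and passwords are hypothetical placeholders; a login failure is the usual reason SER can't verify links.

```python
import imaplib

# Hypothetical accounts: (imap_host, address, password). Use your own.
ACCOUNTS = [
    ("mail.yourdomain.com", "account1@yourdomain.com", "password1"),
    ("mail.yourdomain.com", "account2@yourdomain.com", "password2"),
]

for host, user, password in ACCOUNTS:
    try:
        # IMAP over SSL on the standard port 993
        conn = imaplib.IMAP4_SSL(host, 993)
        conn.login(user, password)
        # Select the inbox read-only just to prove the mailbox is reachable
        status, _ = conn.select("INBOX", readonly=True)
        print(f"OK   {user}" if status == "OK" else f"WARN {user}: no INBOX")
        conn.logout()
    except (imaplib.IMAP4.error, OSError) as exc:
        print(f"FAIL {user}: {exc}")
```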
Also, note that you will always get fewer links than what's advertised (for a number of reasons; it's just the way it is). And even though it's a 50k list, the contextuals are going to be much, much fewer than that (a quick sanity check on list size is sketched below).
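To see how big a purchased list really is before blaming your settings, assuming it is (or can be exported as) a plain-text file with one URL per line, you can count unique domains in a few lines of Python. The file name here is a hypothetical placeholder.

```python
from urllib.parse import urlparse

# Hypothetical file name: a plain-text export with one URL per line.
LIST_FILE = "verified_list.txt"

domains = set()
with open(LIST_FILE, encoding="utf-8", errors="ignore") as fh:
    for line in fh:
        line = line.strip()
        if not line:
            continue
        # Normalize to the bare domain so www/non-www don't count twice.
        netloc = urlparse(line).netloc.lower()
        if netloc.startswith("www."):
            netloc = netloc[4:]
        if netloc:
            domains.add(netloc)

print(f"{len(domains)} unique domains in the list")
```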
For now, let's leave it at that.
These are not normal numbers; the only solution I found was to re-run the lists over and over to get the most out of them.
People are reporting "download failed" errors and issues with the way SER handles proxies. I don't know whether that could be the cause of your problem.
"continuously try to post to a site even if failed before" should be selected, so SER can try and post to all sites on the list repeatedly.
I don't see anything else that is a major problem. You can import the list multiple times to get as many links as possible, or you can leave the list in Identified and let SER post to it from there.
As I said before, I am noticing that lists have to be run over and over again at the moment to get most of the links.
Hope it helps.
However, over time we have tested both approaches and gotten different results. So it can't hurt to test both ways to make sure you get the maximum links from the list.
Choose the .sl file.
Just repeat that multiple times.
@joland - No, there really is no way to regulate that. What happens in real life is that you have a list, some links get built, and some don't. But then project #2 somehow manages to build links that project #1 didn't. It probably has more to do with the internet connection and the website host than anything else.
And as @gooner said, to extrapolate a little: I highlight, as a group, all projects that have exactly the same engines chosen. For example, my T1's might have Articles, Social Networks, and Wikis; I highlight all of those similar projects, then right-click and import target URLs from sitelist:identified. Then, while they are still highlighted, I repeat the import. I do this about 4x on contextual projects so that there is a rough "link balance" between the number of links in contextual vs. junk tiers, since the list always has way more junk targets.
Generally speaking, you have to be careful not to layer too many imports onto each project, because that can stress SER with too much memory demand. But I do it with contextual projects all the time. Plus, by doing that, I can just let SER run and run without worrying about feeding it enough targets, which leaves me free to focus on building other websites and the other work I need to get done.