Low verification rate
Hello. I have been having mixed success with GSA, which makes me feel I am not using it to its maximum. My setup is 20 shared proxies, 80 threads, and a 120-second HTML timeout. The following errors are what I get from the message log:
Registration failed
already parsed - I see this a lot
download failed - I see this a lot
url passed
matches no new engine
Please, what can I do to increase my verification and submission rates?
Comments
One tip I can give you: if you are duplicating existing projects to create new ones (provided you are targeting different URLs), delete the target URL history before you start posting.
That will remove a lot of the 'already parsed' messages.
Regards
Do you tick the global list when posting with GSA?
Forget the global list until you have a big one, and that is at least 3-6 months out. In the meantime, build a big list. I have been using SER for almost 1.5 years, and I still don't use global lists. I personally don't bother scraping or feeding lists into SER, but to each their own.
If you focus a little on LPM (links per minute) and on identifying the most efficient engines for posting, you won't waste your time creating even more work for yourself by scraping lists. At least I personally do not think it is necessary.
That is correct, you need some history to work with before you make any decisions. That is why it is critical to select all engines with every platform you use, so you get valid data on all engines before you start cutting engines.
I try to use the biggest list I can on any project that does not build links directly to the money site (T1A, T2, T2A, T3, T3A). You want the most generic one- and two-word phrases. You will need to scour the internet to find these, then mash it all together and de-dupe.
For projects linking directly to the money site (T1), use the biggest keyword list you can muster for your niche. Use Google Keyword Planner and stick in as many different 'seed' keywords related to your niche as you can. Export everything into one file and de-dupe (see the sketch below).
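Whichever list you are building, the "mash it together and de-dupe" step is easy to script. Here is a minimal Python sketch; the folder layout and file names are just assumptions for illustration, so adjust them to your own setup:

import glob

# Hypothetical layout: each exported keyword list sits in ./exports/*.txt,
# one keyword per line.
seen = set()
merged = []
for path in sorted(glob.glob("exports/*.txt")):
    with open(path, encoding="utf-8") as f:
        for line in f:
            kw = line.strip().lower()
            if kw and kw not in seen:  # de-dupe case-insensitively
                seen.add(kw)
                merged.append(kw)

with open("keywords_master.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(merged))

print(f"{len(merged)} unique keywords written to keywords_master.txt")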
I typically break keyword batches up into separate 100 - 1000 keyword files, and then use a token in the keyword field in SER so that SER randomly grabs one of those files. That way you don't bog SER down with a gigantic keyword list, which can slow things down. I keep these keyword files in a Dropbox folder, as I found SER was faster grabbing them from there than from the hard drive (at least that was my experience).
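Splitting the master list into those 100 - 1000 keyword batch files is a few more lines of Python. Again, the paths and the batch size of 500 are only assumptions; pick whatever size suits you:

# Split keywords_master.txt into batch files of 500 keywords each
# (hypothetical file names; any size between 100 and 1000 works).
with open("keywords_master.txt", encoding="utf-8") as f:
    keywords = [line.strip() for line in f if line.strip()]

batch_size = 500
for i in range(0, len(keywords), batch_size):
    batch = keywords[i:i + batch_size]
    name = f"keywords_batch_{i // batch_size + 1:03d}.txt"
    with open(name, "w", encoding="utf-8") as out:
        out.write("\n".join(batch))

From memory, the token you then put in SER's keyword field is the spin-file macro (something like %spinfile-path%), but check the macro list in SER's documentation for the exact syntax, as I'm quoting that from memory.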