Low verification rate

Hello. I have been having mixed success with GSA, which makes me feel I am not using it to its maximum. My setup is 20 shared proxies, 80 threads, and a 120-second HTML timeout. The following errors are what I get from the message log:

Registration failed
already parsed (I see this a lot)
download failed (I see this a lot)
url passed
matches no new engine

Please, what can I do to increase my verification and submission rates?

Comments

  • Hi @kindit77, it's hard to say from the information that you've given.

    One tip I can give you: if you are duplicating projects to create new projects (provided that you are targeting different URLs), then you should delete the target history before you start posting.

    This will remove a lot of the 'already parsed' messages.

    Regards
  • Thank you for your response. I would like to know whether you scrape target URLs with Scrapebox or any other scraper to find more target sites, or whether you find GSA SER alone is enough for this.

    Do you tick the global list when posting with GSA?
  • ron SERLists.com

    Forget global lists until you have a big one, and that is at least 3-6 months out. In the meantime, build a big list. I have been using SER for almost 1.5 years, and I still don't use global lists. I personally don't bother scraping or feeding lists into SER, but to each their own.

    If you focus a little bit on LPM and identifying the most efficient engines for posting, you won't waste your time on creating even more work for yourself by scraping lists. At least I personally do not think it is necessary.

  • edited September 2013
    @ron thank you. I have not posted with GSA that much, so I can't identify which platforms are doing well; when I check them, successes are few and far between. I would have to post with GSA for a month more before I can identify them properly. Do you by any means add keywords to SER to find target sites?
  • ron SERLists.com
    edited September 2013 Accepted Answer

    That is correct: you need some history to work with before you make any decisions. That is why it is critical to select all engines on every platform you use, so you get valid data on all engines before you start cutting engines.

    I try to use the biggest list I can on any project that does not build links directly to the moneysite (T1A, T2, T2A, T3, T3A). You want the most generic one- and two-word phrases. You will need to scour the internet to find these things, then just mash it all together and de-dupe.

    For projects linking directly to the moneysite (T1), use the biggest list you can muster for keywords in the niche. Use Google Keyword Planner, and stick in as many different 'seed' keywords related to your niche as you can. Export everything into one file, and de-dupe.
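    The merge-and-de-dupe step above can be sketched outside of SER with a short script (the file names here are placeholders for your own Keyword Planner exports, not anything SER requires):

    ```python
    from pathlib import Path

    def merge_dedupe(paths):
        """Merge keyword files and drop duplicates (case-insensitive),
        keeping first-seen order."""
        seen = set()
        merged = []
        for name in paths:
            path = Path(name)
            if not path.exists():
                continue  # skip exports that aren't there yet
            for line in path.read_text(encoding="utf-8").splitlines():
                kw = line.strip().lower()
                if kw and kw not in seen:
                    seen.add(kw)
                    merged.append(kw)
        return merged

    # Hypothetical exports, one keyword per line, merged into one de-duped file:
    keywords = merge_dedupe(["seed1_export.txt", "seed2_export.txt"])
    Path("keywords_deduped.txt").write_text("\n".join(keywords), encoding="utf-8")
    ```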

    I typically break up keyword batches into separate 100-1000-keyword files, and then use a token in the keyword field in SER so that SER randomly grabs one of those files. That way you don't bog down SER with a gigantic keyword list, which can slow things down. I keep these keyword files in a folder in Dropbox, as I found SER was faster grabbing them from there than from the hard drive (at least that was my experience).
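    Splitting a master list into those smaller files is also easy to script. A minimal sketch (the chunk size of 500 simply falls inside the 100-1000 range mentioned, and the Dropbox path in the comment is an assumption):

    ```python
    from pathlib import Path

    def split_keywords(master_file, out_dir, chunk_size=500):
        """Split one big keyword file into numbered files of
        `chunk_size` keywords each; returns the total keyword count."""
        out = Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        keywords = [k.strip()
                    for k in Path(master_file).read_text(encoding="utf-8").splitlines()
                    if k.strip()]
        for i in range(0, len(keywords), chunk_size):
            chunk = keywords[i:i + chunk_size]
            name = f"keywords_{i // chunk_size + 1:03d}.txt"
            (out / name).write_text("\n".join(chunk), encoding="utf-8")
        return len(keywords)

    # e.g. write the chunks into a Dropbox-synced folder (path is illustrative):
    # split_keywords("master_keywords.txt", r"C:\Users\me\Dropbox\ser_keywords")
    ```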

  • edited September 2013
    @ron This is awesome; your response has opened my eyes to why my copy of SER is slow when I load the 100k most-used keywords on the internet. I will do as you have said, and will try to use the token in the keyword field with Dropbox to achieve what you described above.
  • @ron I have been looking at how to achieve what you wrote below since yesterday and am not able to do so. Can you please explain how I can add a token to GSA so that it grabs keyword files at random from Dropbox?

    I typically break up keyword batches into separate 100-1000-keyword files, and then use a token in the keyword field in SER so that SER randomly grabs one of those files. That way you don't bog down SER with a gigantic keyword list, which can slow things down. I keep these keyword files in a folder in Dropbox, as I found SER was faster grabbing them from there than from the hard drive (at least that was my experience).



  • ron SERLists.com
    Use the spinfolder token in the keyword field. Go to the big Help button and open View Macro Guide; your answers are there.
  • donchino https://pbn.solutions
    @ron - the spinfolder token will use a random file from the folder as content, so if you have 1000 keywords in a file, it will submit them all together as one. I was trying to find a proper macro for this task, but it seems there should be some mix of the spinfolder and spinfile tokens to read a random line from a random file in a folder. Maybe you can specify which macro to use in the keyword field?
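    Outside of SER, the behaviour being asked about (a random line from a random file in a folder) can be sketched like this; it is only an illustration of the desired semantics, not a statement of how SER's own tokens work:

    ```python
    import random
    from pathlib import Path

    def random_keyword(folder):
        """Pick a random .txt file in `folder`, then return one random
        non-empty line from it (or None if nothing is found)."""
        files = list(Path(folder).glob("*.txt"))
        if not files:
            return None
        lines = [line.strip()
                 for line in random.choice(files).read_text(encoding="utf-8").splitlines()
                 if line.strip()]
        return random.choice(lines) if lines else None
    ```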
  • I will look into this now and report back to you.