
Need suggestions to increase verified list

Hi everyone,
First off, let me say that I'm only doing churn and burn at the moment. I used to not care about verified urls, which is why I never let a project verify for weeks. I was simply submitting to the submitted and verified list along with scraping a little, which led to a crazy lpm of around 400-700. However it has occured to me that the majority of these submitted links very useless to me, as I had no idea which submission was successful and I was just going nuts on the same urls the majority of the time.

Let me explain a little about my current setup, which I have been testing for 1½ days now. I've got 10 projects whose options are pretty much identical, though some have word filters and others don't. I've got a powerful dedicated server with more resources than SER can use, which allows me to run 1.5k threads stably (thanks to the recent updates; thanks A LOT, Sven!) with an HTML timeout of 120. I've imported 100k-keyword lists into every project. I've got only 13 private proxies, which I'm using for submitting only; for searching (and submitting as well) I'm using a list of port-scanned public proxies of very good quality, which usually contains anywhere between 200 and 2k Google-passed proxies and is updated every hour. SER re-scrapes this list whenever it drops below 400 active proxies and re-checks them every 25 minutes.
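For anyone curious what that kind of proxy churn looks like outside of SER, here's a minimal sketch of re-testing a public proxy list against Google and keeping only the passing ones. This is not SER's internals, just an illustration; the list URL, thresholds, and the simple captcha check are all assumptions for the example:

```python
import concurrent.futures
import requests

PROXY_LIST_URL = "http://example.com/proxies.txt"  # hypothetical hourly-updated list
MIN_ACTIVE = 400          # re-scrape when the pool drops below this
TEST_TIMEOUT = 10         # seconds per proxy test

def google_passes(proxy: str) -> bool:
    """Return True if Google answers a plain query through this proxy."""
    try:
        r = requests.get(
            "https://www.google.com/search",
            params={"q": "test"},
            proxies={"http": f"http://{proxy}", "https": f"http://{proxy}"},
            timeout=TEST_TIMEOUT,
        )
        # A 200 without a captcha marker is treated as "Google-passed" here;
        # a real checker would be stricter about ban/captcha detection.
        return r.status_code == 200 and "captcha" not in r.text.lower()
    except requests.RequestException:
        return False

def refresh_pool() -> list[str]:
    """Fetch the raw list and test every proxy in parallel."""
    raw = requests.get(PROXY_LIST_URL, timeout=30).text.split()
    with concurrent.futures.ThreadPoolExecutor(max_workers=100) as pool:
        results = pool.map(google_passes, raw)
    return [p for p, ok in zip(raw, results) if ok]
```

The "check every 25 minutes" behaviour is then just a loop that re-runs refresh_pool() on a timer whenever the pool size falls below MIN_ACTIVE.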

Below is a screenshot of what my options look like for pretty much all my projects. My idea was to submit only to the verified list, so as not to waste power on URLs that had only been submitted and never verified. The reason I chose to do this was to force SER into doing more scraping. I'm also planning on manually changing the project status to verify (Active (V)) for a couple of hours every day or every other day. I just finished verifying the links I had built over 36 hours, and it gave me about 30k verified URLs in total. My LpM is also at a little less than 100 now.

[Screenshot: project options]

I'd like to know if you can recommend doing anything differently to get more verified URLs. My intention is not to build thousands of verified URLs on the exact same blog post, but rather to post and verify a few links on as many targets as possible. I'd also like your opinion on removing links after the first verification try. The content I'm using is _extremely_ spun garbage, so I'm certain it would be rejected by any manual review, which is why I chose to remove links after the first verification attempt. Also, I realize more private proxies would be an obvious improvement, but I simply cannot afford enough of them to fully utilize all my threads, and I'm pretty sure my public proxies are already doing a very good job.

I apologize for the wall of text, but I wanted to get as much relevant information into the post as possible. I hope this can also be of some use to someone who is struggling out there. I'd also love some healthy discussion to further improve all our experiences with SER.

Comments

  • Get yourself a copy of GScraper or Scrapebox. Use footprints + keywords to scrape for targets and import them into SER (see the sketch after this thread for what those queries look like). You will see a great improvement over your current setup.
  • Should have added that I'm already in the process of scraping for targets with Scrapebox. I'm also scraping external links from blog comments, which seems to give great results.

    Also I have realized that I have written waaay too much in my post. Really doesn't seem fair to expect anyone to read through all that, but thank you man.
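To make the footprints + keywords suggestion above concrete, here is a rough sketch of the kind of query list a scraper like Scrapebox or GScraper gets fed. The footprints shown are just common generic examples, not a vetted list, and the keyword file stands in for the 100k-keyword lists mentioned in the original post:

```python
import itertools

# Generic platform footprints (examples only, not a vetted list).
FOOTPRINTS = [
    '"powered by wordpress" "leave a reply"',
    'inurl:guestbook.php',
    '"powered by vbulletin"',
]

# In practice this would be loaded from the big keyword file.
KEYWORDS = ["gardening tips", "home insurance", "dog training"]

def build_queries(footprints, keywords):
    """Combine every footprint with every keyword into one search query each."""
    for fp, kw in itertools.product(footprints, keywords):
        yield f'{fp} "{kw}"'

with open("queries.txt", "w") as f:
    for q in build_queries(FOOTPRINTS, KEYWORDS):
        f.write(q + "\n")
```

The resulting queries.txt is what gets loaded into the scraper; the harvested URLs are then imported into SER as target URLs.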