
VpM huge jump from 450 to 700+

Hi! I am running GSA SER, Xrumer, Scrapebox, and GSA Captcha Breaker at the same time on my dedicated server (64 GB RAM, 250 private proxies).

Dedicated Server: (Windows Server 2012)

Proxies: (new batch, bought yesterday)

Issue (verified links):

I am getting a lot of new blog comment links (I don't post from verified lists; I scrape 24/7), and these keep coming up.

They stay live for 5-10 minutes (auto-approve) and then disappear.

I posted from the dedicated server and checked from my home computer: the link was live at first, but gone soon after.

Now, the problem is that I can't disable this engine, because General Blogs as a whole gives me a LOT of my links (that, trackbacks, and image comments mainly), and I don't want to lose those.

Here are some:

I had around 400-500 VpM, and now it has spiked to 700+ because of these links (easily verified but short-lived; they post quickly with no captcha, which is probably why).

They all have the footprint in common: "designed by mythemes and using free dns"

So if that string is found on a page, can I have GSA automatically skip posting to it? I'm not sure how to set this up.

Can anyone help me? Thanks.
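(As a rough illustration of what the OP is asking for: GSA SER's "Skip sites where the following words appear" filter is the built-in way to do this, but a scraped URL list could also be pre-filtered offline before importing it. The function and file names below are hypothetical, not part of GSA.)

```python
# Hypothetical offline pre-filter for a scraped URL list: drop any page
# whose HTML contains the trap footprint before importing into GSA SER.
# Function names and the list-file workflow are illustrative only.
import urllib.request

FOOTPRINT = "designed by mythemes and using free dns"

def contains_footprint(html: str) -> bool:
    """Case-insensitive check for the trap footprint in page HTML."""
    return FOOTPRINT in html.lower()

def keep_url(url: str, timeout: float = 10.0) -> bool:
    """True if the page loads and does NOT contain the footprint."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            html = resp.read(65536).decode("utf-8", errors="ignore")
    except OSError:
        return False  # drop unreachable pages as well
    return not contains_footprint(html)

def filter_url_list(in_path: str, out_path: str) -> None:
    """Copy only the URLs that pass keep_url() into a new list file."""
    with open(in_path) as fin, open(out_path, "w") as fout:
        for line in fin:
            url = line.strip()
            if url and keep_url(url):
                fout.write(url + "\n")
```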


  • ronron
    edited June 2014
I gave it a shot to see if the 10 blog engines had any reference to "dns" or "mythemes" in their footprints, but came up empty. It has to be picking up on some other footprint; the only candidate I see is "Notify me of followup comments via e-mail", and that only just occurred to me.

I would still look at the footprints in those 10 blog engine files, or at least copy them into Notepad so they are all staring you in the face while you analyze the example you gave. You might also want to look at that example's source code: maybe the footprint isn't visible on the rendered page and is hidden in the HTML. That is my best guess.

    If it was picking up on a footprint, then you could just delete it from that engine file.
  • edited June 2014
Apparently these domains are some kind of trap. I was under the impression that including "" in the global filter should avoid them altogether?

Now I see that even with:
it is still not working, alas, as they posted anyway.

    And apparently there are at least 4 domains with those traps.

@Sven, could you shed some light on the matter? Why won't the filter work? And does the "Skip sites where the following words appear" option use more resources overall?
  • Also looks a bit like YAD to me....
Thanks ron. I did a bit of research, and there are quite a few of these. I enabled the option to not post on sites where those words appear, and GSA used a HUGE amount of memory, started lagging, and posted more slowly.

Not sure if there is a way, but can this be done more efficiently? I buy new proxies from different providers every month, because my 250 get Spamhaus-listed towards the beginning of each month.
  • Sven
The next version will stop General Blog Comments from submitting to these sites, and adds a new engine that "abuses" them for article posting.
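
ronron's suggestion above of searching the engine files for a stray footprint can be sketched as a quick script. The engine directory path below is only an assumption about a default install location; point it at your own Engines folder:

```python
# Sketch: scan GSA SER engine definition files (*.ini) for a suspected
# footprint string. The directory path used by a real install varies;
# the example path in the comment below is an assumption.
from pathlib import Path

def find_engines_with(needle: str, engine_dir: Path) -> list[str]:
    """Return the names of engine files whose text contains `needle`."""
    needle = needle.lower()
    hits = []
    for ini in sorted(engine_dir.glob("*.ini")):
        text = ini.read_text(encoding="utf-8", errors="ignore").lower()
        if needle in text:
            hits.append(ini.name)
    return hits

# Example (assumed path):
#   find_engines_with("notify me of followup comments",
#                     Path(r"C:\GSA Search Engine Ranker\Engines"))
```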