
Am I being too picky when creating backlinks? 24 hours 0 verified...

Monk29 China
edited March 31 in Need Help

Hi everyone, I'm a new user of GSA SER; I installed the demo version yesterday.

 

I use 10 emails (all tested successfully), 95 proxies (all tested successfully), CapSolver with enough credits, and 200 threads to run a project with the article, blog comment, forum, microblog, social network, web 2.0, and wiki engines.

 

For search engines I picked only German and English ones, and set the project to skip all sites in languages other than German.

 

Under Filter URLs I didn't skip anything below or above any PR, and I didn't change the types of backlinks to create; everything is left at the defaults (except unticking wiki article and forum post).

 

With these settings, I ran the project for almost 24 hours and got only 16 submitted and 0 verified.

 

One thing strikes me as very strange: I have only spent $0.03 in CapSolver. Is that normal?

 

What did I do wrong? Did I miss something? Thank you.

Comments

  • Hi backlinkaddict, thanks for your reply.

Now I have 32 submissions, and most of them are tagged "awaiting account verification". I switched "when to verify" to 1 hour.

I am not sure what you mean by "I wouldn't just tick off every engine in a group either."

Do you mean that under "search engines to use" I should tick every one?

I don't understand what the point would be; my site is in German, so I would guess that using German and English search engines is enough.

But I am going to take your advice and tick all the engines; I have nothing to lose anyway :D 
  • sickseo London, UK
    Before you can expect to see large numbers of verified links, you will need to spend some time building up a working site list. You can shortcut this by buying a ready-made list from one of the vendors in the marketplace.

    Personally, I separate the two tasks of link building and scraping/testing for working sites. Letting GSA SER do both at the same time makes both processes run slower.

    For scraping, 95 proxies isn't going to get you very far, as search engines that block proxies (Google) will likely block all 95 of them within 24 hours. For link building, 95 proxies is plenty.

    Sure, you can scrape other search engines, but then you have no guarantee that the scraped sites are indexed in Google. You'll need a separate process in place to check for this. If the sites aren't indexed in Google, then your backlinks won't index in Google either, which means your link building will have zero impact on Google rankings.
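    As a rough illustration of what such a separate indexation check could look like (this is a sketch, not a GSA SER feature; the `site:` query syntax is standard search-engine syntax, but the `is_indexed` callable is a placeholder you would wire to your own SERP scraper or an indexation API):

    ```python
    from urllib.parse import urlparse

    def site_query(url: str) -> str:
        """Build a `site:` query that tests whether a URL's domain
        has pages indexed (feed it to your own SERP scraper)."""
        return f"site:{urlparse(url).netloc}"

    def filter_indexed(urls, is_indexed):
        """Keep only URLs whose domain passes the supplied check.
        `is_indexed` is a placeholder callable, e.g. backed by an
        indexation API or a manual scrape of the results page."""
        return [u for u in urls if is_indexed(site_query(u))]
    ```

    Running the filter over a scraped list before importing it into a project would keep the non-indexed sites out of the link-building run.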
  • Monk29 China
    edited April 1
    I didn't say that!
    I said I wouldn't just tick off every engine in a group either.

    When you tick an engine group, it's going to tick them all, which you likely won't want.

    For example, if you were to tick social bookmarks, there is no point in just ticking every one, as there are a few that don't work, so you're just wasting resources on them. 

    You need to open it up and select which engines you want in that category; my intended point was to be selective.

    And you're right: if you're in Germany, it's likely best to stay within the IP and country ranges, and the languages that support that, as well as having related sites linking to you, as best you can.


    And no, there are 3,000 search engines in there; I would never suggest ticking them all either!

    LOL... sorry, my bad; English is not my first language and I misunderstood your point. 

    I feel like I'm seeing Mr. White's angry face :D

    I have switched it back to German and English only.
  • Invest in a link list or try to scrape yourself. Leaving the scraping to GSA always ends up like this for me due to proxy banning, etc. At the very least, set a five-minute wait between search queries in the GSA options. 

  • @backlinkaddict Totally agree with the parsing update part.
  • Monk29 it's no problem, just want to make sure you get the formula correct, Jesse: no chili P :D 

    KeywordKing, a 5-minute wait time? I never tried that. Would it help if other users are still hammering those engines, or if a whole part of a proxy network gets blocked for a while and your proxies are part of it... hmmm

    I see "proxy may be banned on query" in the log a lot, and this is often not true: it may report no results on the page when there are many. But in 2024, even a simple Python scraper (or a browser extension) could continue and grab those links, as well as solve the captchas that pop up and scroll when needed if the page is AJAX. 

    So I'm not sure what the issue is.

    Pretty sure A-Parser will do these things. Not sure what's going on where some good Google-anonymous and custom-tested proxies are not parsing SERPs. :# It's more than a few programs, and there's still no great solution.

    Have you ever thought of using Hrefer? By the way, do you have any idea what happened to GScraper? I still use GScraper, but I don't know whether it's up to date or not. Back in the day, I used to scrape like crazy with proxies from Proxygo (a BHW seller). Lately, GScraper offered their own proxy server for a huge price, I guess $60/month or something, then it vanished, and now I only use link lists, no more scraping. It's worth the time for me.
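    The "simple Python scraper" point earlier in the thread, continuing past a misleading "no results" report and grabbing whatever result links are actually on the page, can be sketched with nothing but the standard library (the markup assumed here is generic, not any specific engine's SERP structure):

    ```python
    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        """Collect absolute hrefs from anchor tags in a fetched
        results page, skipping relative navigation links. A generic
        heuristic, not tied to any particular search engine."""

        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href", "")
                if href.startswith("http"):
                    self.links.append(href)

    def extract_result_links(html: str) -> list:
        """Return every external link found in the page source."""
        parser = LinkExtractor()
        parser.feed(html)
        return parser.links
    ```

    A scraper built this way judges the page by the links it actually contains rather than trusting a "no results" heuristic, which is the behavior the comment above argues for.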