
Am I being too picky when creating backlinks? 24 hours 0 verified...

Monk29 China
edited March 31 in Need Help

Hi everyone, I'm a new user of GSA SER; I just installed the demo version yesterday.

 

I use 10 emails (all tested successfully), 95 proxies (all tested successfully), CapSolver with enough credits, and 200 threads to run a project with article, blog comment, forum, microblog, social network, web 2.0 and wiki engines.

 

For search engines, I picked only German and English ones, and set it to skip sites in all languages other than German.

 

In Filter URLs, I didn't skip anything below or above any PR, and I didn't change the types of backlinks to create; everything is as default (except that I unticked wiki article and forum post).

 

With these settings, I ran the project for almost 24 hours and got only 16 submitted and 0 verified.

 

I found one very strange thing: I've only spent $0.03 on CapSolver. Is that normal?

 

What did I do wrong? Did I miss something? Thank you.

Comments

  • You only have 19 submissions?

I'm not 100 percent sure the web 2.0 scripts are updated, and now they're locked so I can't adjust them.

Wikis likely won't get you too many.

Some of the articles and comments might take some time; they can be moderated or never verified.

I would tick the option to delete an email once the verification link is found; that way you're not rechecking already-verified emails.

Your CapSolver spend is low because you haven't made many submissions, so it seems you didn't solve many captchas.

I wouldn't just tick off every engine in a group either.

I guess your settings need to be adjusted, or you're not parsing any good links to post to.

Maybe to test, use a backlink of a backlink or something, tick some easy engines like shorteners and social bookmarks (Pligg), and see if links start coming in, then adjust from there.
  • Hi backlinkaddict, thanks for your reply.

Now I have 32 submissions, and most of them are tagged "awaiting account verification". I switched "when to verify" to 1 hour.

I'm not so sure what you mean by "I wouldn't just tick off every engine in a group either."

Do you mean I should tick every engine in "search engines to use"?

I don't understand what the point would be. My site is in German, so I'd guess using German and English search engines would be enough.

But I'm going to listen to you and tick all the engines; I have nothing to lose anyway :D
  • I didn't say that!
    I said I wouldn't just tick off every engine in a group either.

When you tick an engine group, it's going to tick them all, which you likely won't want.

For example, if you were to tick social bookmarks, there's no point ticking every one, as there are a few that don't work, so you're just wasting resources on them.

You need to open the group up and select which engines you want for that category; being selective was my intended point.

And you're right: if you're targeting Germany, it's likely best to stay within IP and country ranges and languages that support that, as well as have related sites linking to you, as best you can.


And no, there are 3,000 search engines in there; I would never suggest ticking them all either!

  • sickseo London, UK
Before you can expect to see large numbers of verified links, you will need to spend some time building up a working site list. You can cheat by buying a ready-made list from one of the vendors in the marketplace.

Personally, I separate the two tasks of link building and scraping/testing for working sites. Letting GSA SER do both at the same time will make both processes run slower.

For scraping, 95 proxies isn't going to get you very far, as search engines that block proxies (Google) will likely block all 95 of those proxies within 24 hours. For link building, 95 proxies is plenty.

Sure, you can scrape other search engines, but then you have no guarantee that those scraped sites are indexed in Google. You'll need a separate process in place to check for this (a rough sketch of such a check is below). If the sites aren't indexed in Google, then your backlinks won't index in Google either, which means your link building will have zero impact on Google rankings.
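    For what it's worth, that index check can be scripted outside SER. Below is a minimal Python sketch; the search URL format and the "no results" phrase are assumptions about Google's HTML, and you'd want to route it through your own proxies, since unproxied queries get blocked fast:

```python
# Minimal sketch: does a site: query for the domain return results?
import urllib.parse
import requests

def is_indexed(domain, proxy=None):
    query = urllib.parse.quote("site:" + domain)
    url = "https://www.google.com/search?q=" + query
    headers = {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}
    proxies = {"http": proxy, "https": proxy} if proxy else None
    resp = requests.get(url, headers=headers, proxies=proxies, timeout=30)
    # Crude heuristic: Google's empty SERP contains this phrase.
    return "did not match any documents" not in resp.text

# Keep only scraped domains that appear indexed before posting to them.
targets = ["example.com", "example.org"]
print([d for d in targets if is_indexed(d)])
```
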
  • I believe a lot of these parsers need updates very badly, unfortunately, especially for Google.

Some of them come back with nothing on a search, saying "no results match", when clearly there are 60,000 related matching results; or a reCAPTCHA pops up on the query, or even randomly when scrolling.

All of these workflows and errors need to be taken into consideration and programmed into the parsing bots.

I totally agree on splitting scraping and posting if you can, and if you have no list, you're going to need to buy one or run the program for a while and see what you come up with. But I bet right now the debugger is filled with mostly "proxy banned" messages, likely due to the proxies.

I wonder if there's a plan to add this, or if we just have to program our own parsers for G, scrape through Brightdata, or use an API or something for these cases.

I don't have a great solution that doesn't involve programming a new parser myself, or trying to make a more optimized one in Zenno.

I'm stuck on what to do. Hmmm.

Either way, for certain sites I think Zenno scripts may be the answer altogether: make an FCS Networker-like interface/workflow that can be accessed via API, or just have the built links pushed off to a file and imported into GSA.

@sickseo I looked more into that BAS, really interesting, but I couldn't easily see how to split the screens apart. There's no way to work nicely with it all on one screen, and I couldn't find a way to split them.

Like in Zenno, you can put the browser on one screen, the traffic log, the element tree, etc. on another, and the script on another; it's a nice workflow.

BAS seemed all crammed together. I'm sure if I spent more time I could figure it out, but like you I'm more comfortable in Zenno Project Maker, so learning something new may be a waste of time. SMS verification will likely become very handy moving forward for some "tasks".
  • Monk29 China
    edited April 1
    (Quoting backlinkaddict's reply above about being selective with engines.)

LOL... sorry, my bad. English is not my first language, so I misunderstood your point.

I feel like I'm seeing Mr. White's angry face :D

    I have switched it back to German and English only.
  • Invest in a link list or try scraping yourself. Leaving the scraping to GSA has always ended up like this for me, due to proxy banning etc. At the least, give it five minutes between search queries in GSA's options (a rough sketch of that pacing idea is below).
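    As an illustration of that pacing idea done outside GSA, the sketch below round-robins a proxy pool so each individual proxy only fires a query roughly every five minutes. The proxy list is a placeholder, and fetch_serp is a stand-in for whatever scraper you actually use:

```python
# Sketch: space out search queries so each proxy gets ~5 minutes of rest.
import itertools
import time

PROXIES = ["1.2.3.4:8080", "5.6.7.8:8080"]  # placeholder proxy pool
PER_PROXY_DELAY = 5 * 60                    # seconds of rest per proxy

def paced_scrape(queries, fetch_serp):
    pool = itertools.cycle(PROXIES)
    # With N proxies, a global gap of DELAY/N gives each proxy ~DELAY rest.
    gap = PER_PROXY_DELAY / len(PROXIES)
    results = {}
    for q in queries:
        results[q] = fetch_serp(q, proxy=next(pool))
        time.sleep(gap)
    return results
```
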

  • @backlinkaddict Totally agree on the parsing-update part.
  • Monk29 it's no problem, just want to make sure you get the formula correct, JESSE, no chili P :D

KeywordKing, a 5-minute wait time? I never tried that. Would it help if other users are still hammering the same engines, or if a whole part of a proxy network gets blocked for a while and your proxies are part of it? Hmmm.

I see "proxy may be banned on query" a lot in the log, and it's often not true. It might say there are no results on the page when there are many, but in 2024 even a simple Python scraper or a browser extension could continue and grab those links, solve the captchas that pop up, and scroll when needed if it's AJAX (see the sketch after this post).

So I'm not sure what the issue is.

Pretty sure A-Parser will do these things. Not sure what's going on when some good Google-anonymous, custom-tested proxies are not parsing SERPs :# It's more than a few programs, and there's still no great solution.
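    To make the "even a simple Python scraper could continue" point concrete, here is a hedged sketch that pulls whatever result links are actually in the SERP HTML instead of trusting the "no results" banner. The /url?q= pattern targets Google's basic (no-JS) HTML and should be treated as an assumption to verify against live markup; captcha solving and AJAX scrolling are left out:

```python
# Sketch: parse result links from the raw SERP HTML, ignoring the banner.
import re
import requests
from urllib.parse import quote

def extract_result_urls(serp_html):
    raw = re.findall(r'/url\?q=(https?://[^&"]+)', serp_html)
    return [u for u in raw if "google." not in u]  # drop Google's own links

def fetch_and_parse(query, proxy=None):
    url = "https://www.google.com/search?q=" + quote(query)
    proxies = {"http": proxy, "https": proxy} if proxy else None
    html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"},
                        proxies=proxies, timeout=30).text
    # Deliberately ignore any "no results" message and parse regardless.
    return extract_result_urls(html)
```
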
  • I would love to program a workflow, but I wouldn't know the inner workings needed to add it to SER; it would likely just become a tool I run from the command line in Visual Studio and import into SER.

That's kinda what I've been doing: optimizing footprints, scraping URLs myself with them, then importing and seeing which ones are SER-friendly (roughly the workflow sketched below).
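    That workflow is simple enough to script. A rough sketch, where the file names are just examples: combine engine footprints with niche keywords into scrape queries, run them through your scraper, then import the resulting URLs into SER:

```python
# Sketch: build footprint x keyword scrape queries for an external scraper.
import itertools

with open("footprints.txt", encoding="utf-8") as f:
    footprints = [line.strip() for line in f if line.strip()]
with open("keywords.txt", encoding="utf-8") as f:
    keywords = [line.strip() for line in f if line.strip()]

with open("queries.txt", "w", encoding="utf-8") as out:
    for fp, kw in itertools.product(footprints, keywords):
        # e.g.: "Powered by Drupal" "auto tuning"
        out.write('%s "%s"\n' % (fp, kw))
```
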
  • (Quoting backlinkaddict's reply above about proxy bans and SERP parsing.)

Have you ever thought of using Hrefer? By the way, do you have any idea what happened to Gscraper? I still use Gscraper, but I don't know whether it's up to date or not. Back in the day, I used to scrape like crazy with proxies from Proxygo (a BHW seller). Later, Gscraper offered their own proxy server for a huge price, I guess $60/month or something, then it vanished, and now I only use link lists, no more scraping. It's worth it for the time saved.
  • Hrefer is interesting, yes.

    G-Scraper apparently moved to another multiverse, I guess. Or bouncing between a few :D

I actually got a "list" I'm testing now for this reason.

I think a lot of these parsers will "vanish" or become useless if they're not updated. As workflows and search engines change, updates need to come quickly for software parsers to keep working effectively.

I mean, the free Serposcope will parse searches and solve captchas for keyword and backlink checking, and so will the free SEO PowerSuite Rank Tracker, so it's possible, and already baked into some tools: ignore Google's "nothing here" reply and keep parsing URLs anyway, click "show omitted results", handle other errors, notice reCAPTCHAs and solve them, etc. (a small sketch of the omitted-results trick is below).
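    For the omitted-results part specifically, a scraper doesn't need to "click" anything: Google's search URL accepts a filter=0 parameter that, assuming it still behaves this way, disables the duplicate collapsing, so you can request the unfiltered page directly:

```python
# Sketch: build an unfiltered SERP URL instead of clicking "show omitted results".
from urllib.parse import urlencode

def serp_url(query, page=0):
    params = {
        "q": query,
        "num": 100,           # results per page
        "start": page * 100,  # paging offset
        "filter": 0,          # include normally-omitted results
    }
    return "https://www.google.com/search?" + urlencode(params)

print(serp_url('"Powered by WordPress" "Kommentar hinterlassen"', page=1))
```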