
Low Submissions and Very Low Verifications

edited September 2014 in Need Help
This is a continuation of this thread but will include screenshots.

I have gone thru these posts:

Inofficial GSA Search Engine Ranker FAQ
Identify Low Performance Factors

LOADS of good suggestions.  Most of which I think I incorporated.

VPS Setup:
CheapWindowsVPS.com

1 CPU
1 GB RAM
20 semi-dedicated proxies

I know my VPS isn't spectacular but I don't think it's the issue.

Project Stats:

[screenshot]

I've stopped/started this project a few times as I changed settings, but overall the list of Verified sites has never gone above 6 in 4 days of running.  A few hundred sites have been submitted to, but they all ultimately fail the verification checks (bad login, bad password, etc.).

GSA Settings:

[screenshot]

I am using 20 semi-private proxies.  I've seen a few "banned" messages go by, but those are the public proxies used for searches.  I checked the IPs and they are not mine.  I have tested them with Scrapebox as well without issues.

Project Settings:

What I want to do is post to:

  • Articles
  • Web 2.0
  • Wikis

PR 3+, with 100 outbound links or fewer, on English-language sites that I imported or that GSA found using United States search engines.

This is Tier 1 or my version of Tier 1 anyway.  I would like higher quality sites initially.

[screenshot]

Thing is, this project will never really pause because it never gets that many verifications in a day.

How to Get the Target URLs:

I had "always use keywords" turned on at one point.  I had over 300k keywords.  Didn't make any difference but I've left it off.

[screenshot]

I checked United States for my SE selection.  I used English before but found it was using other SEs from what appears to be other countries.  Maybe good, maybe bad.  In any case, I narrowed it down to the USA.

This is my ONLY project so far, so maybe the "use URLs" options are irrelevant.

Scheduled Posting:

[screenshot]

Not sure what to do here so I did nothing.

Filter URLs:

Probably too small to see in the screenshot, but here's what is set (a rough pre-filter sketch follows below):

* PR 3+
* Skip sites with 100+ outbound links
* Skip unknown PR
* Avoid posting to just an IP with no domain
* Use bad word list
* Skip sites with the following languages: all except EN (English itself was checked there at one point by mistake!  Since unchecked it)

[screenshot]
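Out of curiosity, here is roughly what those filters boil down to if you pre-filter a scraped list yourself before importing it. This is just a minimal Python sketch, not how SER applies the filters internally; the PR check is left out (it needs an external API), the file names are placeholders, and the thresholds simply mirror my settings above.

```python
import re
import urllib.request
from urllib.parse import urlparse

MAX_OUTBOUND = 100  # mirrors "skip sites with 100+ outbound links"

def looks_ok(url):
    """Rough pre-filter mirroring the project filters above (minus the PR check)."""
    host = urlparse(url).hostname or ""
    # "avoid posting to just an IP with no domain"
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host):
        return False
    try:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    except Exception:
        return False  # unreachable targets are not worth importing
    # crude outbound-link count: anchors pointing off-site
    links = re.findall(r'href=["\'](https?://[^"\']+)', html, re.I)
    if sum(1 for link in links if urlparse(link).hostname != host) > MAX_OUTBOUND:
        return False
    # crude language check: the declared lang attribute, if any, should be English
    m = re.search(r'<html[^>]*\blang=["\']?([A-Za-z-]+)', html, re.I)
    if m and not m.group(1).lower().startswith("en"):
        return False
    return True

with open("scraped_targets.txt") as src, open("filtered_targets.txt", "w") as out:
    for line in src:
        url = line.strip()
        if url and looks_ok(url):
            out.write(url + "\n")
```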

Email Settings:

I see nothing about my email account not working.  I've tested it.  It's not banned or deleted or disabled.

[screenshot]

Summary:

Am I approaching GSA the wrong way?  Trying to be uber careful about high PRs and a very limited selection to post to?
Is that really the root cause?  Too restrictive?

I am using Scrapebox to go out and build my seed lists to import as target URLs.  I am using just the footprints and various keyword files.
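For reference, the footprint-plus-keywords part is really just merging the two lists into search queries before handing them to the scraper. A minimal sketch of that merge (the file names and the quoting style are my own placeholders, not anything exported from SER or Scrapebox):

```python
import itertools

# Hypothetical inputs: one platform footprint per line, one keyword per line
with open("footprints.txt") as f:
    footprints = [line.strip() for line in f if line.strip()]
with open("keywords.txt") as f:
    keywords = [line.strip() for line in f if line.strip()]

# Every footprint/keyword pair becomes one search query for the scraper
with open("queries.txt", "w") as out:
    for footprint, keyword in itertools.product(footprints, keywords):
        out.write(f'{footprint} "{keyword}"\n')

print(f"{len(footprints) * len(keywords)} queries written")
```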

Any guidance here is most appreciated.  I've looked at many other threads and tried to tweak my project accordingly.

Comments

  • What captcha breaker did you buy? As far as I know, GSA Captcha Breaker cannot bypass the reCAPTCHAs on Web 2.0 and article sites...
  • I am using GSA Captcha Breaker with DBC as a 2nd option.

    Went thru a pretty good video on how to configure GSA Captcha breaker and when to toggle to send to secondary services.  I can see it's working.

    Also went thru the SERLists guys recommendations (PDFs when you subscribe to their newsletter).  It had a few things I'd not considered.
  • GSA Captcha Breaker Settings:

    My project is set up to ask all services (in this case the only service in GSA is its own Captcha Breaker).  I don't want it to skip hard-to-solve captchas.

    [screenshot]

    I have set up GSA to use this software.

    [screenshot]

    And finally, in GSA Captcha Breaker, if it can't solve a captcha, it will send it to DBC.

    [screenshot]

    I have gone thru and toggled anything that uses reCAPTCHA.  So if there is a very low success rate, like 8% or less, I just send that to DBC.  Kind of a cool feature of the software: it tries to break the captcha, and if it can't, it sends it to DBC; in some cases it will just send them there by default.
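    The routing is basically a fallback chain: try the local solver first, and only pay for the outside service when it fails or when the captcha type is known to have a terrible solve rate. A rough sketch of the idea (the solver functions are empty placeholders, not CB's or DBC's actual APIs):

```python
LOW_SUCCESS_TYPES = {"recaptcha"}  # captcha types routed straight to the paid service

def solve_with_cb(image_bytes, captcha_type):
    """Placeholder for the local Captcha Breaker attempt (returns text or None)."""
    return None

def solve_with_dbc(image_bytes):
    """Placeholder for the paid DeathByCaptcha fallback (returns text or None)."""
    return None

def solve(image_bytes, captcha_type):
    # Known-hard captcha types skip the local solver entirely
    if captcha_type in LOW_SUCCESS_TYPES:
        return solve_with_dbc(image_bytes)
    # Otherwise try the free local solver first...
    answer = solve_with_cb(image_bytes, captcha_type)
    if answer:
        return answer
    # ...and only fall back to the paid service when it fails
    return solve_with_dbc(image_bytes)
```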

    I do see things moving in this software.  So it's breaking stuff. 

    Having 1 project with tight controls does mean your LpM is going to be low.

    My next step is to scrape and get my seed lists a lot bigger.. import target URLs and let GSA sort and hopefully post to some.
  • Update:

    Using a Hotmail account that was created with the FCS Networker tool.  When I got curious and looked inside, 50% of the messages were in the Junk folder.

    Turns out it's pretty painful to turn the junk filter off; you have to make a filter instead, and the web interface for that is totally outdated.

    I did my best to set it up.  But I'm thinking it might be better to use Yahoo or just create a mailbox on the domain in question, which I own.
  • gooner SERLists.com
    Yeah, better to buy emails with the junk filter already fixed.
    Yahoo or mail.ru accounts are working well.
  • I added a 2nd account from the domain I own.  Nothing gets sent to Junk.
    Even with the Outlook.com rule I still found "account detail" emails in the Junk folder!?

    Thinking Outlook.com/Hotmail.com accounts are not the answer here.

    Either my own domain, a catch-all, Yahoo, or some email provider where you can turn off Junk completely.

    I really think that has a negative effect on the number of verifications if GSA thinks it never got the email.
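    One way to check that theory is to look at the mailbox directly and compare the inbox against Junk. A quick IMAP sketch with placeholder credentials (folder names vary by provider, and whether SER ever looks beyond the inbox is an assumption on my part, not something I've confirmed):

```python
import imaplib

HOST, USER, PASSWORD = "imap-mail.outlook.com", "me@example.com", "password"  # placeholders

def count_matches(folder, subject_word="confirm"):
    """Count messages in a folder whose subject mentions a confirmation."""
    imap = imaplib.IMAP4_SSL(HOST)
    imap.login(USER, PASSWORD)
    status, _ = imap.select(folder, readonly=True)
    if status != "OK":
        imap.logout()
        return 0
    status, data = imap.search(None, f'SUBJECT "{subject_word}"')
    imap.logout()
    return len(data[0].split()) if status == "OK" else 0

# If the second number is large, the junk filter is eating the verification mail
print("Inbox:", count_matches("INBOX"))
print("Junk: ", count_matches("Junk"))
```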

    That being said, I submitted well over 80 sites this morning.
    Zero verified yet. 

    So far.. GSA running for a week with 7 verifications.

    Not really worth it.  If you aren't going to use this to create mass backlinks, it's much easier and probably safer to just go buy a backlink packet and manually create them.  I mean.. 7 in a week?

    Not giving up just yet. 
  • ron SERLists.com

    @mda1125 - Your pain is killing me:

    [screenshot]

    Will you please get some yahoo email accounts. You are making this too complicated. You don't want to do this with emails from your own domains. They will get blacklisted, and you will have very few links. You need about 10 yahoo emails in each project, and change them once per month.

  • I'll get 10 Yahoo emails, add them to the project and report back.
  • Forgot I had a Banditim.com account with enough credits to easily get 50 Yahoo email accounts.

    I took 10 from that list.. and imported them GSA style into the project. 
    Tested all.. all successful.

    Project now has 10 Yahoo accounts.. all verified and successful

    [screenshot]

  • Alright... submissions going at a good clip now.
    Did 2 scrapes and imported well over 50k "unverified" sources using footprints.
    You'd think I'd get at least 1 verified link out of 50k.

    We'll see... next step is to pay somebody to go thru the logs and see what's up.

    [screenshot]
  • edited September 2014
    Welcome to my world :D. And it has been that way for months now.
    I recently read the whole SERLists guide too and still can't pass 10 LPM.
    When I scrape targets with GScraper I get around 4 LPM and lots of "no engine matches" (which is normal, I believe; GScraper scrapes lots of random things along the way), and when I let SER scrape it usually comes back with 000/000 results.

    For me it is also hard to run campaigns normally as I'm not getting 20-30 verified contextual links. Or should I use some crap engines too? Blog comments seem to work OK, but the articles, socials and wikis are stubborn.
  • ron SERLists.com

    @mda1125 - I kind of missed the part about GScraper - sorry I misunderstood. My image was from using our verified lists, not from a raw scrape.

    The whole scraping process (using external tools) is difficult, and a lot of it is not intuitive. This is one area where you will really need to invest some research, because it really makes a difference.

    Ideally, and if you can afford it, I think it is smarter to have 2 servers, 2 SER, 2 CB. Let one server scrape the targets - and process them. Its only job is to sort through all the crap to find verified links. You use bogus URLs in these projects - not your actual projects. As you build that verified list, you can then use it on server 2 for your real projects.

    The SERLists team and I do this on a much larger scale in order to provide the size of lists that we do - a whole bunch of servers. The point is that you can only find a limited number of targets with one server. If what it finds is enough, then great. If not, then you need more servers.

    This point is *exactly* why I teamed up to take this on. Not so much because of the expense (although it is a fairly high number), but because of all the moving parts. It's like a production process in a factory, haha. Except with computers.

    The bottom line is that this isn't a SER issue you are having, but rather a scraping efficiency issue. I think @gooner already mentioned footprints. I would spend more time understanding how to scrape better. I honestly believe that is the problem.

  • "The bottom line is that this isn't a SER issue you are having, but rather a scraping efficiency issue"

    Agreed.  I mean, I'm scraping a bunch of non-verified links based on footprints and then importing into GSA.

    On top of that, I have some very strict rules, so it really makes it hard to get an actual verified post.

    People here are using GSA to create thousands of backlinks from many sources.  I'm trying to create a few higher-quality ones from 3 sources.  So common sense says it's going to be much harder for me.

    Spot on about the scraping issue.  I think that really is the crux of the problem.  I might have 50k targets to submit to, but who knows what they are until GSA can identify the platform; on top of that, they might not even meet my requirements, or the scripts might fail.

    Really nice info in this thread.  I really do appreciate the assistance.

    FYI.. I signed up for your SERLists.com and the PDFs on how to configure a project were REALLY GOOD!  I took a lot from that and incorporated as much as I could that made sense.
  • How to Scrape using External Tools -- @Gooner pretty much told me what to do!

    Guess it's time to get 1000 keywords, pick a footprint.. go for it..
  • ron SERLists.com
    edited September 2014

    @mda1125 - There you have it. It was apples to oranges all along.

    I like your approach targeting quality. You have a real business (I checked out your profile before I made my last post), and you don't want to screw it up with cheap links. They are great for lower levels, but not at the top...unless you filter them.

    Great links are hard to come by, and it's like panning for gold. But it is totally worth it. SER is one arrow in your SEO quiver. Don't forget PBN's, paid links, web2.0's, social signals and the like.

    A ton of work went into those tutorials, so we really do appreciate the compliment on the free material! ^:)^

  • The Free Tutorials over at SERLists.com are some of the best I've seen on project configuration!

    Highly recommend them.  I wish I'd found them a bit sooner.

    If I run a scrape of ~137,000 keywords using the Dolphin footprint... is that insane?  Given that I'm completely new, I don't want to overload my server and I don't want to get my proxies banned (they're semi-dedicated, shared with 3 others).  Trying to be reasonable here (quick math below).

    Hoping out of that search, I can get a few decent Verified submissions.
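    Back-of-the-envelope on whether 137k queries is insane for 20 proxies (the per-query delay is just a guess on my part, only to get an order of magnitude):

```python
keywords = 137_000        # queries = keywords x 1 footprint
proxies = 20              # my semi-dedicated proxies
delay_per_query = 5       # seconds between queries on the same proxy (assumed)

queries_per_proxy = keywords / proxies
total_hours = queries_per_proxy * delay_per_query / 3600
print(f"{queries_per_proxy:.0f} queries per proxy, ~{total_hours:.1f} hours of scraping")
# -> 6850 queries per proxy, ~9.5 hours of scraping
```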
  • YES!

    [screenshot]

    Pathetic to most but.. PR 3+. 
    Just took a long time to slog thru an unverified list. 

    I'm not calling it game over here, but thanks to the assistance of those on this thread, I think I'm on the right track.  There might be more tweaks on the way, but overall, without an actual high-PR verified list to work with, there's going to be a lot of guesswork, bad sites, unrecognized sites and such.

    Calling it a night and letting GSA do its thing.  But again... many thanks for the suggestions here.  I think they really got a few things ironed out and reset my expectations.
  • gooner SERLists.com
    ^^ Looking good, glad you got everything sorted out :)
  • Just added a demo account of ReverseProxies OCR (recommended by SERLists.com)

    So it goes CB --> ReverseProxies OCR

    I've also toggled a few other services to just bypass CB and go to RP OCR; since CB has such a low success rate on them, I want to see if this service can do better.
  • Updated Captcha Settings:

    [screenshot]

    A bit overkill ... but I'll say one thing.  I never see "captcha failed" anymore!
    The last setting is to notify the user if all else fails but when I sit and watch... that never happens.
  • edited October 2014
    Proxies:

    Now using 100 semi-dedicated proxies from @mexela.  So far, no download socket errors or "banned from X" messages, and I rarely see email lockout issues.  My average URL/s is somewhere in the range of 147 to start and then tapers down to 65 thru the night.  Best customer support was Buyproxies and initially the fastest was Proxyrack, but Mexela's proxies are doing the job and they are responsive.

    Not to mention the discount was unheard of; nobody came close.  Hope it lasts for the lifetime of the subscription; otherwise, it's on par with the rest of the services.
  • I've been debating where to submit...

    Currently: Articles/Web 2.0/Wikis

    Considering: Forums/Blogs/Document/Social Bookmarks/Social Networks/Microblogs

    [screenshot]
    I am not doing any tier link building.  Links go to money sites, period.  If I were a real person, I'd tweet about a blog post, not a forum post that talks about a blog post.  But I digress; this isn't about the pros and cons of tiered link building.  It's about what makes a good place to put a link.

    If the domain of anything is PR 5 and the outbound links are under 100, that to me is a good enough indicator.

    By allowing myself to post to more properties per project, I can get more links.

    Articles/Web 2.0/Wikis alone, without doing any tiers, is really limiting.  I need to open up the places I post to.

  • bestimtoolz High PR WEB 2.0 posting service - affordable!
    1. Yahoo accounts - I switched to them from Hotmail some time ago and never had problems with verifications
    2. Disable proxies for verifications and email checking
    3. Enable "continuously post to a site even if it failed before"

  • 1.  Totally.  So far, no issues yet.  The occasional auth error (which isn't anything but an SER hiccup) happens sometimes, but to date Yahoo has been working great.

    2.  Disabled for verifications, but I have them on for email checking.  I do not want 1 IP checking into all these accounts.  Looks a bit spammy.  Easier to have my proxies do the round-robin checking (rough sketch after item 3).

    3.  If I were running a verified list, I'd agree.  Otherwise you end up with a lot of failures, or sites that have failed for a reason.  Depends on the list you are using.  My list is raw (I scrape), so if a site fails, I move on.  Maybe it was down, maybe it has some good anti-spam measures, who knows... if I paid for a verified list and were using it exclusively, then I would agree that setting would probably be the best bet.
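    By round robin I just mean each mailbox check goes out through the next proxy in the rotation instead of one fixed IP, roughly like this (the proxy addresses and accounts are made up):

```python
from itertools import cycle

proxies = cycle([
    "203.0.113.10:8080",   # example addresses, not my real proxies
    "203.0.113.11:8080",
    "203.0.113.12:8080",
])

emails = ["acct1@yahoo.com", "acct2@yahoo.com", "acct3@yahoo.com", "acct4@yahoo.com"]

for email in emails:
    proxy = next(proxies)  # each check goes out through a different exit IP
    print(f"checking {email} via {proxy}")
```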
  • Just for testing purposes, I opted to give the SEOSpartans "catchall" service a try.  Can't really comment yet, but from the little test I did against one of SERLists.com's Blue lists, the thing was flying.  Way too soon to tell, but I like the concept of a catch-all option without having to go thru my own host, which I know would question what I am doing.
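    The appeal of a catch-all is that any made-up address at the domain is deliverable to a single mailbox, so account creation never runs out of fresh email addresses. Roughly (the domain below is a placeholder, not the SEOSpartans setup):

```python
import secrets

DOMAIN = "example-catchall-domain.com"  # placeholder catch-all domain

def fresh_address():
    """Any random local part still lands in the one catch-all mailbox."""
    return f"{secrets.token_hex(6)}@{DOMAIN}"

for _ in range(5):
    print(fresh_address())
```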
  • I love SER and all the options.  You can run this thing automatically with so few failures.  I've given in to scraping and building my own lists.  Until I get tired of that or can justify that expense, I am not turning enough of a profit to even justify what I'm currently doing.

    I own GSA SEO Indexer but am not so sure it works well enough, and the sites it submits to (that I've seen flash by) are not my cup of tea.  I don't know what I expected, so I'm being a bit judgmental without real justification.

    Current Costs:

    $40 VPS (160 SSD; 4 core; 4GB)
    $37.50 100 semi proxies
    $15.97 for Captchatronix
    $12.39 for InstantLinkIndexer
    $4.95 for Serpbook
    $6.95 SEOSpartans Catch-All Email

    Total: $117.76/month

    Next question... what can I live without, or replace with resources I already have (Market Samurai does rank tracking)?

    • Can't live without a VPS (while SER can run on my home computer, it's really really inconvenient)
    • Can't live without proxies (just obvious)
    • I could live without a rank tracker as I have one and there are many options
    • I could live without catch-all emails, as BanditIM does an excellent job with Yahoo emails, but I have to say SEOSpartans just blasts thru the verifications in my little test
    • I could live without an indexing service since Google will pick up some links anyway - natural is always good.  Yes, some will never get indexed and some will get indexed months later if they are still there, but does it really matter?  I've ranked sites without pinging and indexing before just fine.  The links that matter get indexed, the ones that don't, don't, and some get indexed 7 months after I create them.  I own GSA SEO Indexer anyway; I should reconsider using it.  It may not help as much, but it's a resource I already own and it probably can't hurt either.

    Not sure I can live without some type of reCAPTCHA service though.  I do not create many links per project, so it's very possible for me to put the project on Pause, fire it up when I have spare time, and manually enter captchas if they can't be solved.  But over time, as projects get bigger or I get looser with the options, it could be a problem.  It's done a nice enough job so far that I rarely ever get prompted for captchas of any kind while watching SER do its job.

    Price Adjusted:

    $40 VPS (160 SSD; 4 core; 4GB)
    $37.50 100 semi proxies
    $15.97 for Captchatronix

    Total: $93.47/month

    Savings of $24.29

    Next step is to rank stuff using GSA SER only and get local clients beyond the few that I currently have.

  • ron SERLists.com
    If you are not making a lot of links, I would do it from home. The first year I did it from home, and I was making a lot of links 24/7/365. It just got to be too much.

    If you are using an indexing service, do not bother with GSA Indexer, as that won't help you any extra.

    So if you just use SER, and you are not making a ton of links, I say the best savings is to get rid of the VPS.
  • @Ron Any particular thoughts on using a catch-all email service? Seems like you would have the potential for unlimited accounts per project with the spin, and be unable to control how many accounts, say, get created. I didn't really consider dropping the VPS, mostly because I use it for Scrapebox target list harvesting, which can run for a while. Plus, importing some 400,000 targets makes SER run for hours. Got to say though, you nailed the largest expense. I'm trying to build links manually at first... to a site that is hand built and already indexed. Then just keep the higher-authority links flowing, forget it, and do another money site. Eventually I'd like 100 projects running, creating a few links a week, just to keep things flowing. Thanks for the insight. I was nickel-and-diming it.
  • ron SERLists.com
    I wanted to use a catchall, but my host at the time was making it difficult so I took the path of least resistance.

    I forgot you were scraping. Scrapebox tends to run fairly light on resources, so I could imagine running a scraper and light linkbuilding from a home PC. What absolutely kills your internet connection is GSA Indexer, so as long as that is not in the mix, you can probably still get by at home.

    At a certain point, when you have enough websites up, you'd better be making money. And if you're making money, you get a VPS or dedi. Personally I would get a dedi. For just a hair more, you will jump from 100 horsepower to 300 horsepower plus a turbo. That part of it is a no-brainer, and remember I said that, as VPSs tend to not be that powerful. Forget the specs. It all sounds like a big deal but it's not. Get the FR3 from Green Cloud and don't look back. $80/mth. You always want Xeon. So when you are ready, make that move.
  • It costs me ~$16.00/month if I leave my PC on 24/7, and it's just an i5/4GB laptop that would be doing the job. How much memory does SER really need? For roughly the same price, I can go for a 2-core 2 GHz, 2 GB VPS. When I first started messing around with SER, it was running pretty well on 1 GB! At the moment I have a 4 GB VPS, but it's basically asleep except for the times when I import some mammoth list. Makes me think I could downgrade the current VPS to the equivalent of what I would run at home, until such a time as I can take advantage of the FR3 option! Great suggestions, Ron. Really making me rethink the strategy.