Low Submissions and Very Low Verifications

Comments

  • On Deck:

    Saved 20% on the VPS (it's been helpful when traveling)

    Probably will eliminate the Index services.
    Considering removing all captcha services except GSA CB and defaulting to "Ask User". Half the time I'm watching anyway. I don't make very many links, so I can run it when I have the time.

    Strongly considering using that Catch-All email service. I don't get a lot of email issues, but there are a few "Auth errors" that I don't think I'd ever see on SEO Spartans' service. Also considering just using 1 email account per project.

    Biggest cost is the VPS. @ron is right. If you don't make many links, why not just run it manually when you have the time? Why have a VPS if you don't need to build several links a day or have loads of projects? Maybe I'll grow to that point, but it is an expense that is, in theory, not required to get great results.
  • ron SERLists.com
    Just to make sure you know, you can connect to your home PC while traveling as long as it is left on and you have the right software to access it. After all, there is no difference between a VPS and your home machine...they are both PCs.
  • edited October 2014
    Added SEO Spartans to all projects (started getting some email auth errors but the accounts were fine). Just testing it a bit more. Verified all submissions till they ran to 0.

    Not sure this is really necessary.  I'm starting to think I want 1 email per 1 project.  I'm not really keen on 5+ articles/blogs/web 2.0 properties created per account all on the same domain.  Less links overall but I'm not looking for sheer numbers. 

    Will see how this pans out.  Strongly leaning towards using a single Yahoo account per project.  Easier to keep an eye on it if anything goes wrong.
  • Updated:

    Created a Catch-All on my local web host.  Verified it is working.  Setup the CRON job to remove everything after 7 days. 

    Given that I'm barely running more than 2 projects looking for PR 3+ properties, the # of emails I see is very low.  Should fly under the radar.
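
    For reference, a minimal sketch of that cleanup step, assuming the catch-all drops each mail as a file in a directory (the path and filenames here are hypothetical; adjust to your host's layout):

    ```python
    import time
    from pathlib import Path

    def clean_mailbox(maildir: str, days: int = 7) -> int:
        """Delete mail files older than `days` days; return how many were removed."""
        cutoff = time.time() - days * 24 * 3600
        removed = 0
        for f in Path(maildir).glob("*"):
            # Only touch regular files whose last-modified time is past the cutoff
            if f.is_file() and f.stat().st_mtime < cutoff:
                f.unlink()
                removed += 1
        return removed

    # A CRON entry would run this daily, e.g.:
    # 0 3 * * * python3 /path/to/clean_catchall.py
    ```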
  • edited October 2014
    Update to Finding Target Sites:

    While my Identified list is now over 2.5M sites... I added back in the SE function without "always use keywords."  Added back SEs from the United States/United Kingdom/Canada

    SER will randomly (or so it seems) find sites via the Global Lists I've selected and insert search engine queries over time as well.  I've seen it use generic keywords as well as the list of long tail keywords the project uses.  Won't be fast, mind you, but slow and steady, constantly looking even when I don't add new targets to post to.

    I will say this...

    If I use my current identified list and select everything and forget the PR filter among others.. the # of Verified links is pretty good.  But when you start adding in filters, language, PR of the domain, # of outbound links, etc... it's a bear.  The links aren't crappy but they are few and far between.
  • Update on How to Get a Good Verified List from Scrapes:

    Thanks to this post: https://forum.gsa-online.de/discussion/8168/the-right-way-to-import-verified-site-lists

    and @ron filling in the gaps, I think I finally figured out how to properly import my scrapes.  This thread really made it clear why you want dummy projects (or multiple dummy projects), why to select all the engines, how to import the target URLs, and how to configure the projects to stop once they run out of URLs.

    Now I feel a bit more confident that when I fire up a real project and select "Verified Only", it might be more efficient.
  • Update:

    Setup 5 Dummy Projects (1 currently running).  Imported scrapes from "footprints" found via Articles/Web 2.0s/Wikis.  Have over 2.5M URLs, split into 400k lists.  Dummy Project 1 is currently using these imported targets to build verified links.  Everything selected.  No filters.

    2 Real Projects.  Currently set to use the Global Verified List only.

    So far... I'm seeing real progress for the 1st time using a scraped list.  No more "Identify and Sort" for me!
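
    The splitting step above can be sketched in a few lines; a throwaway helper, assuming the raw scrape sits in one big text file (the filename and chunk prefix are hypothetical):

    ```python
    from pathlib import Path

    def split_list(src: str, prefix: str = "targets-", chunk: int = 400_000) -> list[str]:
        """Split a one-URL-per-line file into chunk-sized files for dummy project imports."""
        lines = Path(src).read_text().splitlines()
        out = []
        for i in range(0, len(lines), chunk):
            name = f"{prefix}{i // chunk:02d}.txt"
            Path(name).write_text("\n".join(lines[i:i + chunk]) + "\n")
            out.append(name)
        return out
    ```

    Reading a 2.5M-line list into memory is fine on any modern box; each resulting file can then be imported into one dummy project.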
  • Higher Quality Links from Raw Scrapes:

    Now when I import a scraped list, my submissions/verifications are way up!  Makes sense though.  No PR, no filters, post away.  But here's the catch.

    My Tier 1 projects are using sites that probably use ReCaptcha.  So if I just blast away to get a verified list using only GSA CB, I potentially skip a lot of valid links.

    Here is my experiment. 

    Ask all services to fill captchas, but tweak it enough so that my raw list doesn't use up costly resources.  LpM will go down but the verified list might improve.

  • edited October 2014
    Updated:

    Submissions are great.. Verifications are okay.  However, Tier 1 properties are pathetic.  There just isn't much that qualifies as PR 3+ in the lists I'm scraping. 

    I do think SER using the scraping functions and a ton of keywords did better.  I have loads of low quality sites to post to but nothing in this scrape that is substantial.

    I know stuff is out there.  I just am not in the right place to fish.  I have to agree that with my tight settings on what I want to achieve with Tier 1, it's not going to be very good.  But not impossible and I'm not quite ready to quit yet.
  • edited October 2014
    Close to Capitulation:

    The prevailing attitude is that if you scrape a raw list, you check everything and turn off filters (for the most part) and see what you get Verified.  That becomes gold.

    Looking back at my 1st ~8000 verified links, it's crap.  Loads of N/A/PR 0 blog comments, index, directory... nothing I really want to build links from for any site.

    Made me change the settings for my 2nd scrape.  I removed blog comments, indexer, directory, pingbacks, redirects, exploits...   So far the submit/verify ratio is less than stellar, which quite frankly is to be expected.  Did I really think I was going to automate my way to high-quality backlinks?  Nobody ever said SER was specifically for that anyway.  I have an entire FCS Web 2.0 network that I need to give some TLC to as well.

    I want to be very clear that it's not SER, it's me.  It's really a very cool product.  Maybe I just haven't learned it well enough to make it do what I want.

    My expectations are a bit high and very unrealistic.

    You don't just scrape with some footprints and expect to post to 50 dofollow/contextual high authority sites by the time you get back from the gym.

    The other option could be to buy lists; I figure these guys will have something, and probably more than I am able to get on my own with my limited resources.


  • OctoberFest:

    Signed up for 1linklist.com

    I have to say... it's too expensive for me to stay on it long-term (can't justify the cost), but for $55 a month, the # of identified and verified sites you get on a daily basis is unreal!

    Forums are weak (way too new to have any solid info) but the lists from previous days are there.  There are loads of zips to download on a per-platform basis, or just the Verified links.

    I ran some last night and it just blazed through them.  Given that I run a tight ship over here, I still managed to get 3 links per project that meet the criteria.

    I can't imagine how many links you'd get for a Tier 2 or Tier 3 project.  Thousands is my best guess.

  • Current Costs:

    $32 VPS (160 SSD; 4 core; 4GB)
    $20 Webhosting
    $14.95 Webhosting
    $37.50 100 semi proxies
    $15.97 for Captchatronix
    $12.39 for InstantLinkIndexer
    $4.95 for Serpbook
    $6.95 SEOSpartans Catch-All Email
    $8.00 BanditIM Text Captcha
    $55.00 1LinkList.com

    Total: $207.71/month

    Proposed Changes:

    $20 Webhosting
    $37.50 100 semi proxies
    $15.97 for Captchatronix
    $4.95 for Serpbook
    $6.95 SEOSpartans Catch-All Email
    $8.00 BanditIM Text Captcha

    Total: $93.37
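
    A quick sanity check of the arithmetic, summing the figures exactly as listed in the two lists above:

    ```python
    # Monthly costs in dollars, copied from the lists above.
    current = [32.00, 20.00, 14.95, 37.50, 15.97, 12.39, 4.95, 6.95, 8.00, 55.00]
    proposed = [20.00, 37.50, 15.97, 4.95, 6.95, 8.00]

    print(round(sum(current), 2))                  # 207.71
    print(round(sum(proposed), 2))                 # 93.37
    print(round(sum(current) - sum(proposed), 2))  # 114.34 saved per month
    ```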

    Summary:

    I love 1LinkList's options... thousands of identified and verified links per day.  The project I am using it on now for a test is still banging away.  I'll never run out of links for projects.  While I like the idea, I also like the idea of finding my own links.  $660 a year is a lot to pay if you aren't making $2k a month.

    @Ron suggested running on a home pc for now.  That saves $32 until I can truly justify the expenses.  Box should arrive today.

    I'll miss a lot of viable links if I don't have a ReCaptcha breaker and I am not sitting in front of the screen to catch it all.  I can ditch the other (human) services.  It's hard to use SER as an automated tool if all you use is CB.  But it's a fine product and really gets some awesome stuff.  Might just bite the bullet and use it exclusively and see if I get anything decent from my scrapes.

    No real need for Serpbook (although I don't see that they have actually charged me?).  I have other free methods of checking my SERP position.

    Love the Catch-All service.  It's pennies really.  But my host does it all, and I can get a domain for my projects and use that.  Save myself some money over the course of a year.  Or just use Yahoo email addresses.  Although I will say, with SEO Spartans, I've never seen any auth errors like I used to see on Yahoo.  Those are gone.  Then again, with the other catch-all I am using on my own host, I never see those errors either.

    Slowly making changes.  Toning down the costs.  Making my own lists is tough but I think it's a valuable skill to have and might help me understand SER better.
  • xeroxias United States
    You could save some money on web hosting (depending on what kind of website you are running):
    http://www.webhostingtalk.com/forumdisplay.php?s=ae3837c0fa711589245d8ad6bd580ab5&f=4
    ^ I use cheap hosting services from this site for my smaller projects!