
How to use GSA SER verified lists to maximum effect?

I bought a few verified lists for GSA SER and added them all to my identified folder.
Here are the stats of my identified folder:
[screenshot: identified folder stats]
I ran the list twice through a project, but after verifying the links 3 times and removing the bad ones I got only 1,129 verified.
I was using only article and wiki links for this project.
I have CB, with DeathByCaptcha as backup.
20 shared private proxies, used only for posting.
I set "no" to domains with the same IP, and no PR or OBL limits.

Can anyone tell me why I couldn't get more links?

Comments

  • edited May 2014
    @sampath You should remove duplicate domains first. There's no point parsing through 1 million duplicate domains.
    Second: did you buy a private list from here, or some cheap $5 ones from Fiverr?
    The ones on Fiverr are crap (actually they are public lists that you can find on forums).
  • A 186k verified article sites list... I don't think that's possible. Someone scammed you, or your list has duplicates.
  • First export your site list to Scrapebox and remove duplicate domains, then test again. 1,129 is quite low. I hope you didn't pay a fortune.
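    A minimal sketch of that dedupe step in Python, for anyone who wants to do it outside SER or Scrapebox. It assumes the exported site list is a plain text file with one URL per line; the file names are made up:

        # Keep one URL per domain (www stripped) from an exported site list.
        from urllib.parse import urlparse

        seen, kept = set(), []
        with open("identified_export.txt", encoding="utf-8", errors="ignore") as f:
            for line in f:
                url = line.strip()
                if not url:
                    continue
                # Tolerate lines that lack a scheme.
                netloc = urlparse(url if "://" in url else "http://" + url).netloc
                domain = netloc.lower().removeprefix("www.")
                if domain and domain not in seen:
                    seen.add(domain)
                    kept.append(url)

        with open("identified_deduped.txt", "w", encoding="utf-8") as f:
            f.write("\n".join(kept))
        print(f"{len(kept)} unique domains kept")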
  • @gopo2k - I did remove duplicates in GSA SER before posting it here. I bought the lists from the GSA forums and BHW, which include the red list, blue list, Travor's list and a few others.

    @satyr85 - No one scammed me lol. All the lists are genuine.

    @DonCorleone - Removed duplicates using GSA SER only!
  • goonergooner SERLists.com
    @sampath - Maybe you would be better off separating the lists.

    For example, put 1 list in failed.
    Another list in submitted
    Another list in identified

    Then you can set certain projects to use certain lists and know which lists are performing well for you.

    I would suggest there is a problem with your proxies or setup if you're getting such a low count.

    I can run one of our lists for 15 mins and get more verified than that.
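    One way to script that separation, assuming each purchased list ships as per-engine .txt files and that the destination folders are the ones you pointed SER at for its site lists (all paths below are invented placeholders):

        # Copy each purchased list into a different SER site list folder so
        # projects can be pointed at them individually.
        import shutil
        from pathlib import Path

        lists = {
            "red_list":  Path("C:/SER/site_lists/failed"),     # list 1 -> "failed" slot
            "blue_list": Path("C:/SER/site_lists/submitted"),  # list 2 -> "submitted" slot
        }

        for name, dest in lists.items():
            dest.mkdir(parents=True, exist_ok=True)
            for src in Path("downloads", name).glob("*.txt"):  # per-engine .txt files
                shutil.copy(src, dest / src.name)
                print(f"{name}: copied {src.name} -> {dest}")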
  • @gooner - Please add me on Skype.
    Skype ID: sampathreddy484

    I would like to talk to you about this, as well as about PBNs.
  • goonergooner SERLists.com
    I don't really use Skype mate. But feel free to send me a PM or start a thread.
    Cheers
  • ronron SERLists.com

    Like @gooner said, separate the lists. I was a smarty pants and combined lists, and I should have just grabbed a gun and shot myself as that would have been more fun. All it takes is one old list mixed in with a good list. Mix shit with lemonade and all you get is liquid shit.

    First figure out which list is worthwhile. Separate the lists as suggested. Clear all project Target URL Caches, then import target URLs from Sitelist - only one of the lists - and let it run, and keep notes. You will figure out the answer pretty darn fast.

  • @ron that doesn't make any sense. Why in the world would a combined list yield fewer results?
  • ronron SERLists.com
    edited May 2014

    Well, what I thought was that I would import several lists into one folder, which works fine by the way. (You have to understand that I had all of them separated to begin with, so this was no big deal, just an experiment.) Then I deleted all non-contextual platforms in this combined site list to create a mega contextual list. I did it for the hell of it, but also to separate contextuals from junk in case I ever needed it.

    What I found was that old non-performing lists created a huge drag on performance. A much larger effect than I would have expected. It basically beat the most recent list (which was great) into a piece of crap because I mixed in too many old targets that weren't working any more.

    See, that's the thing. I test all sorts of weird stuff. I get an idea and I see if it works. Well anyway, it turned out to be a bad idea, lol.

  • @ron & @gooner - Thanks, I tried adding the list to failed with a completely new project and got 80 article and wiki links in around 10 minutes. But I'm not sure whether some of those links are the same as the ones from the first project.

    Do you guys use any captcha service other than CB?
    @gooner - How can you tell that my proxies might not be working?

    Can I use shared proxies for posting ONLY in GSA, even if they are banned by Google?
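    For the "are some links the same" worry, a quick overlap check is enough. This sketch assumes both projects' verified URLs were exported to plain text files, one URL per line (file names invented):

        # Count how many verified URLs from the second run already appeared
        # in the first run's export.
        def load(path):
            with open(path, encoding="utf-8", errors="ignore") as f:
                return {line.strip().rstrip("/") for line in f if line.strip()}

        first = load("project1_verified.txt")
        second = load("project2_verified.txt")
        overlap = first & second
        print(f"{len(overlap)} of {len(second)} URLs from run 2 were already in run 1")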
  • I think a good sign that proxies are failing is the "download failed" error messages. You can save your log to a file and then go back and check it.
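    If you do save the log to a file, a rough scan like this can tally the failures. The exact log wording varies by SER version, so treat the match string and the ip:port pattern as assumptions to adjust:

        # Tally "download failed" lines in a saved SER log, grouped by the
        # proxy ip:port if one appears in the line.
        import re
        from collections import Counter

        hits = Counter()
        with open("ser_log.txt", encoding="utf-8", errors="ignore") as f:
            for line in f:
                if "download failed" in line.lower():
                    m = re.search(r"\d{1,3}(?:\.\d{1,3}){3}:\d+", line)
                    hits[m.group(0) if m else "unknown"] += 1

        for proxy, count in hits.most_common(10):
            print(proxy, count)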
  • With any list that you purchase, or even for periodic cleaning of your own existing verified list, I suggest you use paid captcha solving to eliminate (or at least almost completely eliminate) captcha failures as a possible cause of posting failures. Remember, with OCR you only have a 30-50% chance of solving, so you are greatly diminishing your chances of posting successfully to targets that were previously verified by the list seller. Paid solving removes that variable and gives you a "true" standard for how many targets are still actually postable, not accounting for hosting/registration closure issues.

    I do this periodically with my own verified list of contextual targets.
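    The arithmetic behind that point, using the 186k figure from earlier in the thread (the paid-solver rate is an assumption for illustration):

        # Expected postable targets after the captcha step alone.
        targets = 186_000            # claimed list size from the thread
        for rate in (0.30, 0.50):    # OCR solve range quoted above
            print(f"OCR at {rate:.0%}: ~{targets * rate:,.0f} targets survive the captcha step")
        print(f"Paid solver at 90% (assumed): ~{targets * 0.90:,.0f} targets survive")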
  • ronron SERLists.com
    edited May 2014
    @sampath - You will probably laugh at this, but it is the truth...

    I get the failed proxy message all the time. I just totally ignore it. I check all 30 proxies regardless of how they test in SER, and just let them run. I no longer care if they show all those weird messages in the log. And I still get 250-400 LPM.

    My only advice is not to worry about the proxies. Sure, check them in KM, SB, SEOSpyglass and whatever else you have, to make sure they are not truly dead. But if they pass in those other software packages, just let them run.
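    An independent check along those lines can also be scripted. Here is a minimal sketch using the Python requests library; the test URL, timeout and proxy entries are arbitrary placeholders:

        # Rough liveness check for HTTP proxies outside of SER.
        import requests

        proxies = ["1.2.3.4:8080", "5.6.7.8:8080"]  # your ip:port pairs

        for p in proxies:
            try:
                r = requests.get(
                    "https://www.bing.com",
                    proxies={"http": f"http://{p}", "https": f"http://{p}"},
                    timeout=10,
                )
                print(p, "OK" if r.ok else f"HTTP {r.status_code}")
            except requests.RequestException as exc:
                print(p, "FAILED:", type(exc).__name__)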
  • @ron - I have 5 verified lists, including your red and blue lists. How can I use them, since we only have identified, failed and submitted available in GSA SER?

    I followed the settings you suggested in the PDF, but I still couldn't get more contextual links.
    Now how do I use all my lists? I did remove dups. Is there any way I can use them to get more contextual links?

    Thank you very much for your time.
  • ronron SERLists.com
    @sampath - It is really hard to answer that question unless you start posting screenshots of your settings. It could be any one of a hundred different settings that could be raining on your parade.
  • @ron - [screenshot: project options, part 1]
    [screenshot: project options, part 2]

    I mostly use the same settings for all my tier 1 contextual links. But I couldn't get more than 2k contextual links from all the verified lists that I bought.
  • ronron SERLists.com
    edited May 2014
    1. Don't have "Skip" on the captcha solving - click it until you get 'Random'

    2. Uncheck "Skip hard to solve captchas". I personally have changed my view on how I use GSA-CB (I assume you have it also). I now have all captchas checked in CB. I no longer have a minimum (like 20%) checked in CB; I now leave it unchecked. It used to be that GSA-CB slowed down when it tried to solve harder captchas. Now it flies like greased lightning. I have it try to solve everything, with no restrictions, and here is a screenshot of how fast it goes:
    [screenshot: CB solving speed]
    So my point is: let CB run unobstructed. If you use other recaptcha solvers, then follow whatever settings they suggest. But if you use just CB, then try what I said.

    3. Uncheck "Use URLs linking on same verified URL..." No need for that.

    4. I'm assuming these are settings for a T1. You only have Article and Wiki. I would throw in Social Networks as that is a very legitimate contextual platform. Do not expect blazing speed if all you are running is contextuals.

    5. You may be on a home PC or a VPS. The speed varies tremendously on these types of systems. You may have a slow internet connection or a weak VPS. Honestly, there are a ton of variables. The guys and gals with the best speeds have:
    a) A ton of projects
    b) A large number of projects with no link limits
    c) High internet connection speed
    d) A dedi
    e) A damn good fresh list

    I am not saying you cannot enjoy high speed at home or anything like that. I have hit 200 LPM at home. Just remember that there are many, many variables.
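    For anyone comparing their own numbers to the LPM figures in this thread, it is just links divided by minutes of run time (the run length below is invented for illustration):

        # LPM = links / minutes of run time.
        def lpm(links: int, minutes: float) -> float:
            return links / minutes

        # e.g. the OP's 1129 verified over an assumed 2-hour run:
        print(f"{lpm(1129, 120):.1f} LPM")   # ~9.4, versus the 250-400 quoted above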
  • @ron -
    So far, my CB stats:
    [screenshot: CB statistics]

    4. Thanks, I will add Social Networks to my tier 1 links from now on. How many tier 1 links do you normally build for each project - approximately how many links are you able to get?

    5. I am using a dedi and the lists I bought.

    Is it a good option to use only verified lists, or should I scrape lists on my own?
    I don't want high LPM; I want more contextual links, because I think they are the links that really matter for Google.

    Thank you again!

  • How are social network links contextual links? Social network links = social network profile links???
    Which engines in Social Networks are contextual?
  • @ron - do you guys index tier 2 links?
    If yes, how do you send tier 2 links to an indexing service, given that tier 2 includes comment links which we don't want indexed?
  • @sampath, it's always good to keep increasing the size of your verified list, so yes, you should also collect lists yourself.

    Either right-click in the platform section within your project and choose "uncheck engines that use no contextual links", or change the type of backlinks to create within your project > options > filter URLs.

    Why would you ever build a link if you don't want it indexed? That doesn't make any sense. I'd just send everything to my indexing services.
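    If you still wanted to split comment links out before sending to an indexer, a rough filter over an export is straightforward. This sketch assumes the verified URLs were exported as tab-separated "url<TAB>engine" lines, which is an assumption about the export format, not SER's guaranteed layout:

        # Split a tier 2 export into indexable URLs and comment links.
        keep, skip = [], []
        with open("tier2_verified.txt", encoding="utf-8", errors="ignore") as f:
            for line in f:
                parts = line.strip().split("\t")
                if len(parts) < 2:
                    continue
                url, engine = parts[0], parts[1].lower()
                (skip if "comment" in engine else keep).append(url)

        with open("to_indexer.txt", "w", encoding="utf-8") as f:
            f.write("\n".join(keep))
        print(f"{len(keep)} URLs to index, {len(skip)} comment links skipped")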
  • @fakenickahl - how come social networks and social bookmarks count as contextual links, when as far as I can see they give profile links?
  • edited May 2014
    @sampath because some social network and bookmark engines ALSO create contextual links. I'll quote myself, slightly edited: "change the type of backlinks to create within your project > options > filter urls."