Why Is My Custom GSA SER Identified List Not Working Well?

Hello, I have a list of 170k identified URLs, and for some reason GSA SER isn't giving me many verified links. I'm getting around 10 verified links per day...

Here's what I did:
  1. Used Scrapebox to scrape 3 million URLs using GSA footprints
  2. Removed duplicate URLs and duplicate domains (see the dedupe sketch after this list)
  3. Ran the list through the GSA SER platform identifier
  4. Ended up with 170k identified URLs
  5. Ran the identified URLs through GSA SER for one day
  6. Checked stats, and only had about 10 verified URLs
  7. Had all the engines selected
  8. Had "identified URLs" selected
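For anyone wanting to reproduce step 2 outside Scrapebox, here is a minimal Python sketch of the dedupe (exact-URL first, then one URL per domain). The file names are placeholders, and Scrapebox's own "Remove Duplicate URLs" and "Remove Duplicate Domains" tools do the same job:

```python
# Dedupe by exact URL, then keep only the first URL seen for each domain.
# File names are placeholders for wherever your harvest was exported.
from urllib.parse import urlparse

seen_urls, seen_domains, kept = set(), set(), []

with open("scraped_urls.txt", encoding="utf-8", errors="ignore") as f:
    for line in f:
        url = line.strip()
        if not url or url in seen_urls:
            continue
        seen_urls.add(url)
        domain = urlparse(url).netloc.lower()
        if domain.startswith("www."):
            domain = domain[4:]
        if domain and domain not in seen_domains:
            seen_domains.add(domain)
            kept.append(url)  # first URL seen on this domain wins

with open("deduped_urls.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(kept) + "\n")

print(f"{len(kept)} domains kept out of {len(seen_urls)} unique URLs")
```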
Things I noticed:
  1. About 99.99% of my verifications were unsuccessful.
Tools I am using:
  1. GSA CB
  2. GSA SEO Indexer
  3. GSA SER
  4. Scrapebox
I am wondering why my custom list won't yield any good results...

Comments

  • s4nt0s Houston, Texas
    Are you getting submissions but no verifications, or no submissions either? Are you using good email accounts?
  • BroMichael United States
    Yeah, I'm getting some submissions and no verifications. And yeah, I'm using my own e-mail accounts made with my hosting provider.
  • BroMichael United States
    I'm getting a lot of these too...

    [screenshot]
  • BroMichael United States
    [screenshot]
    I unchecked the circled options, and I'm not getting the captcha errors anymore, so I did fix one problem.
  • s4nt0s Houston, Texas
    If you aren't getting any verifications but plenty of submissions, it's probably the emails causing the problem.

    For the option that says "if a form field can't be filled - skip", click on the word "skip" and switch it to "choose random".
  • BroMichael United States
    I just updated my e-mails with a fresh batch, but I'm still getting barely any verifications... maybe my identified list just really sucks...

    I'm going to build a new list with GSA SER's built-in scraper instead of Scrapebox and see if that works...


  • BroMichael United States
    Just tested out a new list of URLs scraped with GSA SER and I'm still getting no verifications... I'm starting to think maybe it is my e-mails.
  • BroMichael United States
    I'm going to scrape a new list of URLs with Scrapebox and purchase the e-mail wizard later today. Hopefully that helps...


  • No one can help you. I have found so many threads like this, and I have the same problem: 50K submitted and fewer than 10 verified.
    Although my LPM is 200, my VPM is only 1-2 at the beginning. After running for some hours I get no more VPM at all; sometimes it's one verified link in 10 minutes, sometimes one per hour (guestbook links and blog comment links).
    I have been asking for the reason for two months with no answer until now. s4nt0s will answer your question, but it's no use:
    he will ask you some questions; if you don't use proxies, he will say it's a proxy problem;
    if you don't have a VPS, he will ask you to try a VPS, or say it's a list problem, etc. After you try all the suggestions, you still won't have solved the problem.

    I have found so many of the same questions in this forum, but none of them ever get resolved in the end.
  • Sorry for @'ing you, s4nt0s.
  • Identified lists usually give poor results (in my experience anyway). That's because they are fresh sites, and most of your list could contain sites that are monitored and have anti-spam measures in place (like Akismet). Just because SER knows what script they are using doesn't mean SER will be able to post to it.

    Only verified lists give good results, because they have been posted to before with SER.

    Like others have said, it could be email issues - test them in SER; if they pass, then it's not the emails.
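    A minimal sketch of such a check done outside SER, assuming the accounts allow IMAP access; the host names and credentials here are placeholders, and SER's own built-in email test remains the authoritative check:

```python
# Rough IMAP sanity check: can each account log in, and is its inbox reachable?
# Hosts and credentials below are placeholders, not real accounts.
import imaplib

accounts = [
    ("mail.example.com", "user1@example.com", "password1"),
    ("mail.example.com", "user2@example.com", "password2"),
]

for host, user, password in accounts:
    try:
        with imaplib.IMAP4_SSL(host) as imap:
            imap.login(user, password)
            status, data = imap.select("INBOX", readonly=True)
            print(f"{user}: OK, {data[0].decode()} messages in INBOX")
    except (imaplib.IMAP4.error, OSError) as e:
        print(f"{user}: FAILED ({e})")
```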
  • What I have learned from testing is that 90% of it is having a good verified list of your own; the remaining 10% is proxy, email, and settings-related issues.
  • @BroMichael - with good options like the ones my post lays out here:

    https://forum.gsa-online.de/discussion/16676/triple-digit-lpms-on-1-running-campaign-thanks-to-ron-ozz-and-others/p1

    Sure, it's about LPM and VPM, but I get great VPM too with this setup and layout.

    Having a good list is also killer. 
  • s4nt0s Houston, Texas
    @yans - Not sure what you're talking about. If you don't use proxies, it is a proxy problem, and I've never told anyone they have to buy a VPS. I try to help people as much as I can. :)
  • BroMichael United States

    • Well, I scraped a fresh batch of URLs with Scrapebox, and this time I didn't filter them. I just left the list the way it was, duplicates and all...
    • RuFFCuT said, "Just because SER knows what script they are using doesn't mean SER will be able to post to it." So I didn't bother to use the platform identifier.
    • So, instead of using the platform identifier, I just imported the URLs into my project via the clipboard.
    • Bingo!
    For some reason, GSA SER doesn't give a sh*t if your list isn't filtered... it seems to like a URL list that's uncut, unfiltered and a little bit quirky.

    So far, I've gotten 63 verified links in less than an hour... woot, success!

    Thanks for the help guys!


  • @BroMichael

    What you saw isn't really an issue; it's common behavior with scraped targets. I myself have scraped over 2M targets with Gscraper and imported them directly into a SER campaign. What I got in the end was a couple hundred new verified links.

    This is perfectly fine; the verification rate for scraped targets is usually very, very low. You should keep the scraper running in the background and keep importing to SER on a daily basis. That way you will end up with a good amount of fresh verified links on a weekly basis, maybe 20K+ unique verified URLs.
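    A minimal sketch of that daily feed, assuming the scraper drops its output into a text file; all file names here are placeholders, and SER just needs the resulting import file:

```python
# Keep only targets SER hasn't been given yet, log them, and write a fresh
# import file for SER. File names are placeholders for your own layout.
import os

MASTER = "already_imported.txt"  # every URL ever handed to SER
TODAY = "todays_scrape.txt"      # today's raw scraper output
OUT = "import_into_ser.txt"      # fresh targets for SER to consume

imported = set()
if os.path.exists(MASTER):
    with open(MASTER, encoding="utf-8", errors="ignore") as f:
        imported = {line.strip() for line in f if line.strip()}

fresh = []
with open(TODAY, encoding="utf-8", errors="ignore") as f:
    for line in f:
        url = line.strip()
        if url and url not in imported:
            imported.add(url)  # also dedupes within today's file
            fresh.append(url)

with open(OUT, "w", encoding="utf-8") as f:
    f.write("\n".join(fresh) + "\n")

with open(MASTER, "a", encoding="utf-8") as f:
    for url in fresh:
        f.write(url + "\n")

print(f"{len(fresh)} new targets ready for import")
```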
  • BroMichael United States
    Good point kashifM, 20K+ verified URLs would be nice. I'm going to purchase GSA PI next week, so I'll see how things work out.
  • I'm a total newbie... I'm running 500 threads for Gscraper on my VPS and averaging 2,200 URLs/minute, and I just wanted to find out if this is good. I set up Gscraper to feed off SER's and GSA Proxy Scraper's saved proxies. I am not using any private proxies. I am also feeding additional public proxies from the No Hands Proxies software into the setup, so the working proxies I get to use in Gscraper average over 1,000, and I set the list to refresh every 15 minutes. Am I on the right track here? I just need input from more experienced people like you.
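    For reference, a rough sketch of the kind of liveness check those proxy tools run on each refresh; the test URL, timeout, and file names are arbitrary placeholders:

```python
# Filter a list of public HTTP proxies (ip:port, one per line) down to the
# ones that answer a test request. URL, timeout and file names are placeholders.
import urllib.request

TEST_URL = "http://www.example.com/"

def proxy_works(proxy: str, timeout: float = 10.0) -> bool:
    handler = urllib.request.ProxyHandler({"http": f"http://{proxy}"})
    opener = urllib.request.build_opener(handler)
    try:
        with opener.open(TEST_URL, timeout=timeout) as resp:
            return resp.status == 200
    except Exception:
        return False

with open("public_proxies.txt", encoding="utf-8") as f:
    proxies = [line.strip() for line in f if line.strip()]

alive = [p for p in proxies if proxy_works(p)]

with open("working_proxies.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(alive) + "\n")

print(f"{len(alive)}/{len(proxies)} proxies responded")
```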
  • Let me show you this - that's what serious harvesting with Gscraper looks like. With 1-2k LPM you won't get anywhere and you won't harvest anything reasonable. You need private proxies for posting and a paid proxy service for harvesting, or your results will be far from perfect.
  • sysco32 Skopje
    @satyr85 That might look like serious harvesting... getting close :P Not going to play the whose-is-bigger game. Also, don't forget a regular person is not going to buy hundreds of proxies/threads to harvest links for his own use, as it is very expensive. We're talking about a good setup with several VPSes; private, shared, and public proxies; multiple GSA SER and CB licenses; plus a scraper program... $1K or more.
  • edited May 2015
    @satyr85 ... wow! That's some serious harvesting you've got going there, extremely intimidating for a newbie like myself. That's some commercial-grade stuff, but like @sysco32 mentioned, for personal use it far exceeds my needs. With that many links, I'd have to start selling them off, which is not why I got into this to begin with. However, I love to sponge off pros like you guys, so at least I know I'm in the right place. I'm completely new to this, so bear with me. I intend to learn from you guys, and who knows, maybe in a few months to years I'll have the same bragging rights as @satyr85.
