
Low Submissions and Verifications: Campaign Review requested

edited April 2014 in Need Help
Hey everyone,  :)

I have been using GSA for the past 3 weeks and have had mixed results. I am following Ozz's Tiered Linking Concept to build my backlinks. To date, the number of submissions and verifications has been pretty low for my Tier 1, 2 and 3. Submissions for the secondary links have been pretty high, but verifications are still low.

[screenshot]

For my Tier 1, I manually wrote an article and spun it using TBS, following Matthew Woodward's article on advanced spinning. I used Content Foundry to generate the content for Tier 2, Tier 3 and the secondary links pointing at all of my tiers. GSA is presently running on the Solid SEO VPS Newbie plan. Within 1-2 days of starting the campaign, the red triangles shown above appeared. They say there are no targets to post to, which I find strange given how few submissions there have been.

Funnily enough, all my emails are working fine and none are blacklisted. They are new emails, as per Sven's advice in this thread. I also found that, despite using fresh emails, most of my submissions are showing as 'Awaiting Account Verification'.

[screenshot]

------------------------------------------------

Below are the settings for my Tier 1:

Data Settings: the engines selected are contextual do-follow.

[screenshot]

Article Manager Settings:

[screenshot]

Options Settings:

[screenshot]

15/852 checked (below): I selected only UK and US.

[screenshot]

Proxy Settings: I am using 30 private (semi-dedicated) proxies from Buy Proxies. Currently, all are working.

[screenshot]

Email Settings: I am using 30 separate emails for each Tier.

[screenshot]


Settings for Submission:

[screenshot]

Advanced Settings:

[screenshot]

I just started using Lindexed for indexing. I am still scratching my head as to what I am doing wrong... :(

It doesn't matter to me whether this website fails despite my best efforts. What is important is knowing what went wrong so I don't repeat the same mistakes. Any feedback would be greatly appreciated.

Regards,
Tyler

Comments

  • Any feedback, guys?
  • goonergooner SERLists.com
    You've got OBL filters and PR filters; those will slow you down a lot. I would remove the OBL filter altogether.

    You've selected "Use URLs linking on same verified URL" - I tested this once and it didn't work very well; it filled up my target cache with crap. So maybe try removing that.

    Under "Types of backlinks to create" you might want to tick "Article-wiki" as you have selected Wiki's.

    On another note, you have your links being sent to the GSA indexer; you should choose "other indexer" if you are using Lindexed.

    Maybe try those settings and see how it goes.
  • Hi Gooner,

    Your reputation precedes you. I am grateful that a pro like you took the time to respond to my query.

    I have made the required changes. Currently my LPM has dropped to 0.03... :( ...no clue why. Yesterday it was over 30. I hope to buy one of the Blue/Red SER Lists; hopefully that will help.

    Could you tell me why these red triangles keep popping up even when my articles are fresh? If you are using Kontent Machine for Tier 2 and Tier 3 content, how frequently do you update/refresh the content? Weekly, monthly?
  • goonergooner SERLists.com
    Hi @tyler,

    Thanks for the kind words, but really not necessary... Everybody tries to help out here, which is why this is such a great forum :)

    With regard to the red triangles, what message do you see?

    With the LPM, I was just offering general advice to improve how SER works for you, but it's been maybe 5 or 6 months since I scraped with SER. So probably I am not the best person to advise you in that particular circumstance.

    Of course, my advice will be to buy lists because I am a list seller. But I can tell you genuinely that using verified lists will improve your LPM and your total verified/submitted stats.

    Aside from the lists I offer, there are some very reputable sellers on this forum who also offer very good site lists. @Trevor_Bandura and @donaldbeck are two that spring to mind, and there are others too.

    @donaldbeck also offers a video tutorial on scraping if you prefer to scrape your own URLs with Scrapebox/Gscraper etc.

    SER is the best posting software available, no doubt about it. But it is not the best scraping software. So you will always get better productivity if you use tools only for what they are best at.

    Hope that helps.
  • In your article manager settings you are not adding a link.
  • Hey Brumnick,

    I agree. I am not adding a link as I have manually included contextual links in the article itself.

    Hi Gooner, with regard to the red triangles, I see the following message:
    No targets to post to (maybe blocked by search engines, no url extraction chosen, no scheduled posting).

    I don't know why this is the case. In the Options tab, under search engines to use, I have selected by country: UK and USA only (15 in total). I have put 800 keywords in the 'Keywords' field of the Data tab.

    There is another thing I should mention. The niche I am targeting is a small one. I used Content Foundry to scrape for content, but the results were not nearly as plentiful as they would be if I scraped for, say, weight loss.

    What else could I do to improve the results?

  • ronron SERLists.com
    • Use all English speaking search engines. It is more like 150. You need more search engines.
    • You also need a big keyword list to scrape. I use 100,000 generic keywords. Search on the internet. Just get a big ass group of generic words, the more generic the better.
    • Check the boxes "Continuously try to post..." and "Try to always place an URL...".
    • Uncheck all of that public proxy crap on the left side of the second proxy screen (options) - you don't use public proxies.
    • Get rid of the PR filter - it is killing you. If you must use it, only use PR1.
    • Quit writing to submitted in Main Options. You should only be writing to verified.
    • Check "Always use keywords to find target sites".
    • Uncheck "Use URLs linking on the same..."
    Then report back...

  • The bullet points really help, Ron. O:-)

    A few questions... sorry if they sound basic...

    • Selecting All English gives me 131 engines.
    • With respect to the point "Quit writing to submitted in Main Options. You should only be writing to verified.", should I untick Submitted in the settings shown in the pic below?
    • [screenshot]
    • I actually used the Google Keyword Tool to get 800 (the maximum) keywords relevant to my niche. I suppose that is not enough. Would it be OK to put these keywords into Scrapebox's built-in keyword scraper and scrape for more, i.e. start with the 800 words and scrape until I have about 100,000? (I am a bit new to scraping. :| )
    • When I check "Always use keywords to find target sites", a pop-up comes up which says 'a lot of potential sites would get skipped... etc.' Is it OK to ignore this? If I press continue anyway, another pop-up appears, which is shown below...
    [screenshot]

    Do I need to uncheck "Use URLs from global site lists" here?
  • edited April 2014
    • "With respect to the point, "Quit writing to submitted in Main Options. You should only be writing to verified.", should I untick the Submitted (Settings) in the pic below ?"

    He's talking about the lists to post to within your project options, right below your search engine selection. Also, I personally keep submitted ticked in the advanced options (not within the project) for statistics. When you've got a good amount of verified and submitted URLs, you can compare the two with a tool listed under tools in the advanced options, to see which engines aren't giving any results so you can disable them.

    • "When I check "Always use keywords to find target sites", a pop-up comes up which says, 'a lot of potential sites would get skipped...etc.'...Is it ok to ignore this? If I press continue anyway, another pop-up comes which is shown below..."

    It's just fine to ignore both of these. The potential sites that would get skipped can still be caught if you do a quick scrape for just the footprint without any keywords, if I'm not mistaken. The second warning is telling you that SER will use targets which are possibly not niche relevant. I really wouldn't worry about this unless I were doing blog comments.

    About the keywords: it sure sounds like a good idea to use keywords that are as relevant to your niche as possible, but the problem is that generic keywords are much more likely to appear on a potential target than something like "very fast scooter". Therefore generic keywords are much more likely to give you more targets per scrape.


    Also, I have to make a point about the number of keywords to use. I mean no disrespect to those who keep saying it, but it's just insane to use 100k keywords.

    The footprint list I'm using in GScraper has 1,027 footprints. If I merge this with 10k keywords, I'll get 10,270,000 search queries. Let's say I'm using a 10 second wait time between search queries.

    10,270,000 * 10 seconds / 60 / 60 / 24 / 365 = 3.26 years to go through each and every search query. Now I'm pretty damn sure that SER chooses a random search query every time it scrapes, so even after a year with 10k keywords you'd still have pretty good odds of hitting a search query that hadn't already been _SUCCESSFULLY_ scraped. If you were to scrape only Google along with a few other search engines with large indexes, you'd need a higher wait time between search queries. Mine was 60 seconds with 21 search engines, and I was still getting banned all the time with 100 dedicated private proxies. With a 60 second wait time, I'd have enough search queries for 19.54 years with 10k keywords.
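
    If anyone wants to sanity-check the arithmetic, here is a rough Python version of the same calculation (nothing SER-specific is assumed; it simply pairs every footprint with every keyword and assumes queries run one at a time with a fixed wait):

        # Back-of-the-envelope check of the scrape-time estimate above.
        FOOTPRINTS = 1027                      # footprints in the gscraper list
        KEYWORDS = 10_000                      # keywords merged with the footprints
        SECONDS_PER_YEAR = 60 * 60 * 24 * 365

        def years_to_exhaust(wait_seconds):
            """Years needed to issue every footprint+keyword query once."""
            total_queries = FOOTPRINTS * KEYWORDS
            return total_queries * wait_seconds / SECONDS_PER_YEAR

        print(FOOTPRINTS * KEYWORDS)             # 10270000 queries
        print(round(years_to_exhaust(10), 2))    # 3.26 years at a 10 second wait
        print(round(years_to_exhaust(60), 2))    # 19.54 years at a 60 second wait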

    Please let me know if my math is horribly wrong, and I'll quickly eat my words. I just want to challenge the common practice of using so many keywords, as I believe it's slowing down SER.
  • ronron SERLists.com
    @fakenickahl - I should have expanded my answer on the keywords. I don't store any in SER. I use this token in the keyword field:

    %spinfolder-C:\Users\Administrator\Dropbox\kwspins%

    I then have the 100,000 keywords split into 4,000 files with 250 keywords each, and they are stored in Dropbox. A very fast and efficient method...
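
    For illustration, the splitting step could be scripted in a few lines of Python (this is just a sketch, not the exact tool used; keywords.txt is a placeholder name for a one-keyword-per-line list, and the output folder matches the macro path above):

        # Split a one-keyword-per-line file into 250-keyword chunk files
        # that the %spinfolder-...% macro can then pick from at random.
        from pathlib import Path

        source = Path("keywords.txt")   # placeholder: your full keyword list
        out_dir = Path(r"C:\Users\Administrator\Dropbox\kwspins")
        chunk_size = 250                # keywords per file

        out_dir.mkdir(parents=True, exist_ok=True)
        keywords = [k.strip() for k in source.read_text().splitlines() if k.strip()]

        for i in range(0, len(keywords), chunk_size):
            chunk = keywords[i:i + chunk_size]
            (out_dir / f"kw_{i // chunk_size + 1:04d}.txt").write_text("\n".join(chunk))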
  • edited April 2014
    @ron, that sure is a great solution to my concerns about slowing down SER. I never even considered doing this. However, I still believe it's overkill that just about everyone is telling new people to use 100k keywords if they're not also telling them to do as you do.
  • goonergooner SERLists.com
    edited April 2014
    When I used to scrape with SER (many moons ago), I used @ron's method with several million keywords in many languages. I found this was the best method after experimenting with other amounts.

    Not totally sure why, but I guess the more KWs you have, the lower the chance of SER randomly picking one you've already used.
  • ronron SERLists.com
    @fakenickahl - It's something I tested a long time ago. If you use my system, it works great, trust me on this one ;)
  • edited April 2014
    I'm with ron/gooner on batching the keywords into chunks. I have a 500k list that's split into smaller files, pulling from Dropbox. In fact, everything that can be 'macro'd' is. Add in Santos's Footprint Tool and you have a very powerful setup. If I could be bothered, I'd learn how to scrape, but currently everything is running sweetly for me and rankings are following suit, so I still let SER scrape. I honestly believe that making SER ultra-efficient is the key (again, thanks to ron, gooner and others for various tips throughout the forum). I'm sure that if I did scrape then I'd see incrediblererer results, but it just sounds like a hassle/time-sapping game.

    OP - increase your time between searches from 10 seconds to 30. I don't think anyone has mentioned that. It will help if you have lots of threads/projects.

  • edited April 2014
    Ron,

    I would be happy if you could point out whether or not I am doing anything wrong. :)
    --------------------------
    1. I did a bit of research on generic keywords and came across a guy who is sharing about 3,000 generic seed keywords.
    2. I copied these keywords into the keyword scraper of Scrapebox and scraped for more keywords, around 100,000 in total.
    3. I copied these 100k keywords in as shown below. I suppose I can also add my 800 niche-relevant keywords alongside the generic ones.
     [screenshot]

    Did I get it right up to here? :| If I do this, I suppose GSA will slow down.

    "...I then have the 100,000 keywords split into 4,000 files with 250 keywords each...."

    How can I do that? Sounds interesting... Can GScraper/Scrapebox be used to accomplish this?
  • ronron SERLists.com
    Look for @kaykay's free file splitting tool. Use the searchbox on the website - you'll find it. 
  • Split the files, then right-click the keyword box, choose the %spinfolder% macro and point it at the folder of split files; from then on, SER will pick a random file, and random keywords within that file.

