
My First Tier Project - Low LpM

Hey guys,
I'm trying to set up my first tier campaign using GSA SER and Scrapebox. I scrape links in Scrapebox and then import them into GSA SER.
First off, I already read a ton of posts around here before asking this.

I'm having problems with the LpM and VpM on my campaign... 
LpM = 1~2
VpM = 0.74

Running on a home server: Win7 32-bit, 4 GB RAM, Intel i3 dual core, SSD... monitoring resources, everything looks fine. 100 Mbps internet.
Using 20 shared proxies

GSA Threads = Tried between 15 and 100, nothing helped
Search Engines = All English and Portuguese
PR check and PR filters = disabled
Captcha services = GSA 1st, DBC 2nd
HTML timeout = tried 10, 45 and 60s
Can try to repost even when failed
No language nor country filter
5 emails per project/site
Tier 1 = WEB2.0, Wiki, Articles and Social
Tier 2 and 3 = Comments, Web2.0, Wiki, Articles, Social, Guestbook, Forum, Pingback, Trackback and so on

I have around 3~4 link/anchor-text pairs per site, two articles per site, 4 emails per site...

Any ideas on what I'm missing? I'd really appreciate it if someone could point me in the right direction; I don't know where else to look.



  • Are you sure you have enough target keywords added?
  • I scrape keywords in Scrapebox and import them into GSA SER (250~700 keywords)
  • shaun

    You have a few things set up wrong, mate.

    Firstly, you don't need any search engines ticked; that's going to slow SER down since you are scraping with Scrapebox.

    The URLs you scrape with Scrapebox should have duplicates removed and then be put into your SER identified folder.
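
    Scrapebox has its own dedupe tool for this step, but for anyone curious what it amounts to, here is a minimal Python sketch of the same exact-duplicate removal (the URLs are made-up examples; this is not Scrapebox's or SER's actual code):

    ```python
    # Remove exact duplicate URLs from a scraped list, keeping first occurrence,
    # before dropping the list into SER's identified folder.

    def dedupe_urls(urls):
        """Return the URLs with exact duplicates removed, order preserved."""
        seen = set()
        unique = []
        for url in urls:
            url = url.strip()
            if url and url not in seen:
                seen.add(url)
                unique.append(url)
        return unique

    # Example scraped list (hypothetical URLs for illustration):
    scraped = [
        "http://example.com/guestbook",
        "http://example.com/guestbook",      # exact duplicate, dropped
        "http://another.example/blog/post",
    ]
    print(dedupe_urls(scraped))  # keeps the two unique URLs
    ```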

    You then need to set up projects in SER whose sole job is to verify these links. They literally churn through your scraped links 24/7 verifying them. The job of these projects is not to rank anything, just to build your verified folder. Set up a project, remove everything in the targets part of the options tab, then tick the identified folder. Make sure these projects are only using GSA CB, NOT DBC, or you will just be throwing your money away. With these projects there is little point in letting it post to unsuccessful target URLs again, as they are probably duds; some do slip through, but in my experience it works better without that ticked.

    Next you set up the projects to rank your sites. With these projects you have everything disabled in the target URLs part of the options tab apart from the verified folder. This means they are only trying to post to URLs that SER has posted to before, which increases your success rate and LPM. As SER has already posted to these URLs at least once, these projects can have "post to same site again" ticked.

    Your thread count is a whole different story and depends on a lot of things, same with your HTML timeout, so don't worry about those for now.

    Untick Web 2.0, as SER does not handle them well and they will slow everything down. Some people use wikis on T1 but I don't; if you check back on your wiki links after a few weeks, they are usually gone.

    SER uses a lot more resources to post to sites where it posts an article, so I would remove article, wiki, social network and Web 2.0 from your T2.

    In all honesty, you are wasting money at this stage using DBC credits with SER. If you are having problems with ReCaptcha, pay for the cheapest reverse proxy OCR package instead; it will save you money and serve you better.

    I am presuming you are using the footprints from SER to help target your scrapes and not just scraping random URLs?
  • @shaun
    Wow, great feedback man, thanks a lot.

    I started to implement some of your tips fairly successfully... Yet:
    1 - I'm not using custom footprints in GSA SER. In fact, I have some difficulty understanding how to find footprints to use, even in Scrapebox.

    2 - Should I check "use verified URLs from another project - Tier/linkwheel" in the project options? It's not very clear what it does or when I should use it, and the URLs field goes blank when it is checked.

    So, I run a spammy project not to rank, but to get verified links, right? Then I import these links into my actual projects, with no search engines enabled. That's it?

    Thanks again,

  • About "use verified URLs from another project"... is it something like this: one project uses the verified URLs of another, which in turn uses the verified URLs of another?

    Or this is totally wrong?

  • shaun

    You don't need custom footprints to start; the regular SER ones are fine. Click the Options button in the SER main window, click Advanced, click Tools, select "search online for URLs", select the footprints you want from the window, copy them over to a text file one at a time, and then merge that text file with your keyword list in Scrapebox.
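
    Scrapebox's merge feature does this for you, but conceptually it is just a cross product of footprints and keywords. A small Python sketch of that merge step (the footprints and keywords below are invented examples, not SER's actual footprint list):

    ```python
    # Pair every footprint with every keyword to produce one search query per
    # combination, mimicking Scrapebox's keyword/footprint merge.

    def merge_footprints(footprints, keywords):
        """Return one 'footprint "keyword"' query per combination."""
        return [f'{fp} "{kw}"' for fp in footprints for kw in keywords]

    # Hypothetical footprints and keywords for illustration:
    footprints = ['"powered by wordpress"', 'inurl:guestbook']
    keywords = ["dog training", "puppy care"]

    queries = merge_footprints(footprints, keywords)
    print(len(queries))  # 2 footprints x 2 keywords = 4 queries
    ```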

    Yeah, that's how "use verified URLs of another project" works; that is setting up the URLs to build links to. Your target links to build links on come from your verified folder.

    You run the spammy project to get verified links. You have to make sure that in the general SER options you have ticked the tick box next to the verified folder location; the other three don't need to be ticked. As the verified folder is ticked, SER will automatically save every verified URL it gets to this folder. Your ranking projects are set up to automatically pull their target URLs from this folder, so you don't have to import anything. One thing you do need to do is remove duplicate URLs/domains once a day. There are a few advanced tricks you will be able to do in the future to tweak this, though, as I currently run pretty strict checks on my verified folder and then send what I want to the failed folder and have everything pulled from there.
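
    SER has a built-in tool for that daily duplicate-domain cleanup. For what the "remove duplicate domains" step actually does, here is a rough Python sketch (the URLs are made-up examples; this is not SER's code):

    ```python
    # Keep only one URL per domain in a verified list, like SER's
    # "remove duplicate domains" cleanup on the verified folder.
    from urllib.parse import urlparse

    def dedupe_by_domain(urls):
        """Keep the first URL seen for each domain, order preserved."""
        seen = set()
        kept = []
        for url in urls:
            url = url.strip()
            domain = urlparse(url).netloc.lower()
            if domain and domain not in seen:
                seen.add(domain)
                kept.append(url)
        return kept

    # Hypothetical verified list for illustration:
    verified = [
        "http://blog.example.com/post-1",
        "http://blog.example.com/post-2",   # same domain, dropped
        "http://wiki.example.org/page",
    ]
    print(dedupe_by_domain(verified))  # one URL per domain survives
    ```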

  • sage In Front Of My Computer
    @shaun Don't we need to scrape URLs that are related to our niche? Please let me know if I'm wrong, but using Scrapebox with footprint + keywords should be the best.

    Like what @odissey2010 is experiencing, I also had the same problem: LPM is very low. I started thinking about buying a list out there, but haven't done it yet.

    Please advise :)
  • shaun
    @sage No, you don't just scrape for general keywords.

    Although I used to use premium lists, I now think they are a scam; I made this post to explain it....

    Have you implemented the steps I have said in this thread?
    How long have you been using SER?
    How many projects do you currently have?
  • sage In Front Of My Computer
    @shaun I have been running GSA SER for about 2 months, but I didn't spend all of that time running campaigns. I started from scratch, so I bought GSA SER + GSA CB + 20 dedicated proxies + 5 semi-dedicated proxies + WAC to spin articles, and used those 2 months to learn everything.

    I have only 3 projects, which are really just for dummy testing:
    1 project to rank a YouTube video URL
    2 projects to rank local websites.

    Result so far: nothing has ranked even on page #10 in Google search. I checked the backlinks with Ahrefs; so far only 3-8 backlinks for each of those projects.

    LpM was very low. I bought a URL list once from Fiverr; the LpM was indeed very high, but in fact it didn't give me any backlinks, even though I used instantlinkindexer to index the verified list.

    Thanks, I have read the post at the link you provided, and I have commented on the other thread you joined too.
  • Hi guys,

    I managed to rank the website to the first page on Google, then I noticed the keyword I was ranking for has only 200 searches/month hahaha, and high competition. Damn noob mistakes. I'm getting a little traffic from it. Also, I can't get better than 6th~8th place. I'm creating more backlinks with no luck.

    During this month I've run into some more doubts... I try to solve them by myself, but I'm not always successful at it, and sometimes it's just better to ask.

    - How often should I change my articles? Should I use the same article for all projects if I'm ranking them for the same thing?

    - I have a verified list that contains about 60k links right now. Should I use this verified list to post for other projects? If I use these links on a new website it's a lot faster... With this verified list I get high LpM and VpM.

    So you guys can review what I'm doing, check, advise, and see if I'm on the right track:

    - I want to rank a Money Website.
    - I'm not messing with the Money Website directly; I'm not creating any direct links to these websites with any tools.
    - I created 5 blogs so far. I purchased 3 articles from a freelancer to post on each blog. Quality content, niche related.
    - Note that I'm using fresh domains. I'm on a small budget, so I couldn't buy an old domain with high DA. (Should I?)
    - On each blog, one of these high quality posts contains a link to Money Website, with the keyword I want to rank for on the anchor text.
    - Taking care: every blog has a different IP and a different theme. So far I purchased the domains from the very same registrar; I don't know if 5 domains already trigger a "warning".
    - I'm creating links with GSA to each one of these blogs, automatically, with no manual process.
    - Pointing directly at the blogs, I'm using contextual follow AND nofollow URLs.
    - On GSA, to each one of these blog projects, I have tiered like this:
    All good?
    Should I create more blogs?
    The next keywords I'm aiming for have around 20k searches/month and high competition.
    To scrape lists to post, I should NOT use general keywords, right? I have to scrape keywords specific to each project? Or can I just use random ones?

    Thanks again 
  • I would throw in (my opinion) that I would not use SER to build Tier 1 links. I build Web 2.0 links with other software and use SER to build Tiers 2 and 3, which it does extremely well. I also have my own PBN that I have been building for the last 8 years. I have no issues ranking this way for my keywords, and my money site is well protected.