
Low LPM using

Hi,

I purchased GSA SER a month ago. I'm using Captcha Breaker as the 1st captcha service and 2captcha as the 2nd. I have also bought a verified list from http://www.autoapprovemarketplace.com/

My LPM is only 0.44.


I basically want to create 2nd tier bookmarking links pointing to the web 2.0 properties I created with RankerX, so I uploaded these tier 1 links into GSA SER.

I have uploaded 8 verified web 2.0 links for 2nd tier bookmarking. I need at least 10 links per URL, which means 80 2nd tier links in total.

Here is what I did in the project settings:

(screenshot: project settings)

GSA SER settings:

(screenshot: GSA SER settings)

Proxy settings:

(screenshot: proxy settings)

Please advise: how can I improve my LPM/VPM?
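
(Editor's note, for context: LPM and VPM are just rates, links and verified links per minute of run time, so 0.44 LPM works out to roughly 80 links over about three hours. A rough sketch of the arithmetic, with illustrative numbers only; this is plain Python, not a GSA SER API.)

```python
# Rough sketch of what LPM/VPM represent; the numbers below are illustrative only.

def links_per_minute(links: int, elapsed_minutes: float) -> float:
    """LPM: links built divided by run time in minutes."""
    return links / elapsed_minutes

# At 0.44 LPM, building 80 second-tier links takes roughly three hours:
print(80 / 0.44)                   # ~182 minutes
print(links_per_minute(80, 182))   # ~0.44 LPM
```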


 Thank you in advance!!

Neha

 

 





Comments

  • @Sven @shaun Please check my settings above and let me know what I should change to get a good LPM/VPM.

    Waiting for your reply.

    Thanks,
    Neha
  • Try unchecking drip feed.  Just choose verified instead of submissions.

    Also, your screenshots are too tiny for my eyes :( so I couldn't go through all of them ;)
  • Using more threads can help, depending on your server. Test 500, then more if you're not crashing. If you crash, then use the scheduler.
  • @Forthe300 I will try 500 threads and update you here.

    @jhonnyshadow sorry for the bad screenshots. I have created new ones now. Please check them.


  • shaun https://www.youtube.com/ShaunMarrs
    SER is pretty bad with social bookmarks, I would imagine that's your problem.

    Duplicate the exact same project, change its URL to some random made-up one so it's just a test project, disable the social bookmark platforms, then enable articles, wikis and social networks just to compare the LPM.

    Another thing: GSA CB is a really good bit of kit, but IMO there's no point in having it try a captcha 5 times; it's just slowing you down. In my experience, if CB doesn't get the captcha the first time then it probably won't, so I only ever have it try once.

    Turn off your custom wait time between search queries; as you are using a premium list, this is just wasting your time. From what I can see you don't have any search engines enabled anyway, so I'm not sure if SER will actually try to activate threads to search or not.

    I would also drop 2captcha for generic CMS links; they really don't last long enough these days. I have a bunch of pages showing signs that they are only getting the benefit of links after 5 months or so now. You can see my analytics screenshots in this post over on BHW. Everything has slowed right down now that Penguin 4 is real time.

    Most of my SER links are now dead for those sites, with a few select CMS ones that I used 2captcha on still sticking. I think SERE is probably the best way to go if you want automation. I posted my retention check for a SERE blast in this thread over on BHW a few months back. I cancelled my VPS at the start of last month so I can't check the projects now, but I checked them before ending my sub, and very few links from the 30-day period had been removed, so that's pretty good retention for automated links. I also have a theory about link uptime being important now, with juice being turned off if the page with a link to your site goes on and offline too much, like many SER platforms do.

    I can't remember if I said it on here or on BHW, but I think the days of auto-generated content are gone now. All of my new stuff uses either 100% human-written articles on self-hosted domains or manually created web 2.0 accounts, with some human-spun articles being used on a few SER CMSs and SERE platforms. There was a thread on BHW about a week back about a guy who was using nothing but auto-generated content on his PBN, and it got slapped by one of the October rollouts. I posted a while back comparing the effects on my own sites that used 100% human content versus the ones with auto-generated content. I can't find the post, but it's somewhere in my journey thread on BHW. The stuff using 100% unique articles outperformed the auto-generated content massively, with most of the auto-generated stuff either staying at its initial entry position in the SERPs or falling over time with no bounce back.
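
(Editor's note: the kind of retention check mentioned above can be scripted against an exported list of verified URLs. A rough sketch, assuming a plain text file with one backlink URL per line and the Python `requests` library; the file name and target URL below are placeholders, not anything from this thread.)

```python
# Rough retention-check sketch: given a text file of previously verified
# backlink URLs, report how many pages are still live and still contain a
# link to the target site. File name and target URL are placeholders.
import requests

TARGET = "https://example.com"      # the site the backlinks should point to
URL_FILE = "verified_urls.txt"      # one backlink URL per line

def still_live(url: str) -> bool:
    """Return True if the page loads and still references the target site."""
    try:
        resp = requests.get(url, timeout=15)
    except requests.RequestException:
        return False
    return resp.status_code == 200 and TARGET in resp.text

with open(URL_FILE) as f:
    urls = [line.strip() for line in f if line.strip()]

alive = [u for u in urls if still_live(u)]
print(f"{len(alive)}/{len(urls)} links still live "
      f"({100 * len(alive) / max(len(urls), 1):.1f}% retention)")
```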
  • @shaun Okay, thanks for the reply. Could you please suggest any software for 2nd tier bookmarking? I was using BookmarkingDemon previously: http://bookmarkingdemon.com/

    The BookmarkingDemon team is planning to discontinue it.

    https://gyazo.com/f5a89583f2be4dbd11bd1c8279d8e719

    Waiting for your response.

    Thanks,
    Neha
  • shaun https://www.youtube.com/ShaunMarrs
    I don't use bookmarks so I haven't researched other tools, sorry.
  • @shaun Okay, is SEREngines.com good? Do you have any idea how many links we will get from this service?
  • shaun https://www.youtube.com/ShaunMarrs
    It's the best web 2.0 creator I am currently aware of. You get as many links as you want from it, but they come from a pool of about 20 domains, if I remember correctly.
  • neha319 Settings seem fine. Maybe it's the targets you are using. Try to re-identify them.
    And how many verified bookmarks are in your list?




    @shaun - I can understand what you are saying about GSA SER not being good for bookmarking, but SER can still post to those pre-verified targets, as it has already done before, right?


  • shaun https://www.youtube.com/ShaunMarrs
    @Maxhosting not 100% sure what you mean mate tbh. I have never had much success with social bookmarking in SER and preferred articles, social networks and wikis.
  • @shaun knows this best, and I just reread his post: test and test. Also, if a campaign is running like shit, just delete it and rebuild. You would be surprised by the things that happen when you forget to check something and it works better; other times that same trick won't work. Change is the name of the game. Good luck.
  • shaun - I feel like my English is shit ha ha :dizzy:

  • shaun https://www.youtube.com/ShaunMarrs
    A thing you have to factor in is how long the created link will stay alive. It's taking longer and longer for a link to pass link juice. The problem with links created on domains you don't own is that there is a very high chance they will go offline before Google can award any link juice for them to pass up your chain.

    So you have the problem of a limited number of social bookmarking links anyway, and then the uptime has to be factored in. My advice would be to do something like this and use SER to automate it. I posted this update in my BHW journey thread a few days back; some of those pages are using the bridge network concept, but it's automated with Scrapebox rather than SER. Traffic and income are on track to double this month, so I am happy with the progress.