
What am I doing wrong with GSA Ranker? It does not want to work properly!

edited May 2014 in Need Help

So I bought GSA Ranker a couple of months ago, and because I was so busy back then I didn't have much time left over to explore the software. But about a month ago I started learning the software inside and out, and at the same time I watched all the tutorials, read the manuals, forum discussions, etc.

 

But after all that, I can't get GSA to work...

My LpM is very low, maybe 1 if I'm lucky, and there are no contextual links built or verified.

I have no site list or verified site list to import, and I don't want to use one.

I can't even find targets to post to!

I tried to scrape my own list with ScrapeBox: I scraped 500 article targets and imported them into GSA, but only got 90 submitted and none verified, plus all the articles are stuck "awaiting activation", whatever that is.

 

I read on BlackHatWorld that one guy is building every type of link in GSA except document sharing and videos, scraping and posting with GSA only, and his LpM is somewhere between 400 and 500! He is not using a verified list…

He is using only 40 proxies: 20 shared and 20 private.

And a keyword list that contains 50k keywords.

He is creating around 100k links a day.

I also want to spam at that scale.


Here are my settings for GSA.

My submission settings:

[IMG]http://i61.tinypic.com/2pyulv6.jpg[/IMG]


My captcha settings (I use Captcha Sniper):

[IMG]http://i60.tinypic.com/212fyvd.jpg[/IMG]


My indexing settings (I use Indexification):

[IMG]http://i58.tinypic.com/103682s.jpg[/IMG]


My filter settings:

[IMG]http://i61.tinypic.com/2a8gi7p.jpg[/IMG]


My advanced settings:

[IMG]http://i62.tinypic.com/344pkqx.jpg[/IMG]


My proxies (I use 10 semi-dedicated proxies from BuyProxies):


[IMG]http://i58.tinypic.com/29m2i9u.jpg[/IMG]

[IMG]http://i59.tinypic.com/xptycz.jpg[/IMG]


And now my project settings.

My data:

[IMG]http://i58.tinypic.com/219yn90.jpg[/IMG]


Article manager:

[IMG]http://i61.tinypic.com/7131j6.jpg[/IMG]


Options

How to submit and verify:

[IMG]http://i60.tinypic.com/2kikie.jpg[/IMG]


How to get target URLs (I use the United States as country and English as language):

[IMG]http://i58.tinypic.com/9au2i8.jpg[/IMG]


Scheduled posting:

[IMG]http://i57.tinypic.com/v4rxgm.jpg[/IMG]


Filter URLs:

[IMG]http://i57.tinypic.com/2lljejq.jpg[/IMG]


Email verification: I use two Gmail accounts with the dot trick; they have POP3 enabled and spam-folder checking allowed.

[IMG]http://i61.tinypic.com/2cqgo48.jpg[/IMG]


And I use Captcha Sniper with 90% success, though this time I forgot to start it; even so it gets 70%.

[IMG]http://i58.tinypic.com/20t1ncg.jpg[/IMG]

I don't use anything for ReCaptcha.


Maybe my PC is the problem?

[IMG]http://i60.tinypic.com/rtzvxy.jpg[/IMG]


If I break everything down, I would ask:

1. How many emails do I need with the dot trick to support 100k links a day of mass spam?

2. How do I get target URLs? Which search engines do I need to use, and how many?

3. I have a 700k-keyword list for scraping.
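On question 1: the dot trick works because Gmail ignores dots in the local part of an address, so every dot placement is an alias of the same inbox; a local part with n characters gives 2^(n-1) distinct aliases. Here is a minimal sketch of generating them (the function name, `limit` parameter, and the gmail.com domain are my own illustration, not anything from SER):

```python
from itertools import product

def dot_variants(local, limit=10):
    """Generate Gmail 'dot trick' aliases of a local part.

    Gmail ignores dots, so every variant delivers to the same inbox.
    A local part with n characters has 2 ** (n - 1) possible variants.
    """
    gaps = len(local) - 1
    out = []
    for bits in product([0, 1], repeat=gaps):
        # Optionally insert a dot into each gap between characters.
        parts = [local[0]]
        for ch, bit in zip(local[1:], bits):
            if bit:
                parts.append(".")
            parts.append(ch)
        out.append("".join(parts) + "@gmail.com")
        if len(out) >= limit:
            break
    return out

# A 7-character local part has 2**6 = 64 aliases in total.
```

So two real accounts can back quite a few project "emails", though the provider still rate-limits the one underlying inbox behind all its aliases.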


Please help... I would be forever thankful.

I know that GSA works, and I know for sure that the problems are all of my own making.




Comments

  • I bet he's not posting to unique domains. 

    Use 10 emails per project.

    Use Santos's footprint tool and Google some footprints, then modify the engines so you can get more targets.

    Split your 700k keyword list into smaller lists of 50k each, and use the %spin folder% macro to pull random keywords from your desktop (keep SER as light as possible).


  • Buy a list or generate one, then see if you get some results.
  • OK, I will use 10 emails per project, but who is Santos? And why do I need the footprints? I'm not going to scrape for links using tools like ScrapeBox. Doesn't GSA already have built-in footprints?

    In the project settings, under "how to get target URLs", which engines do I use, and how do I modify the engines? I don't understand that part.

    And yes, I am going to split the keyword list into 50k-keyword files.

    What do you mean by "use the %spin folder% macro to pull random keywords from your desktop (keep SER as light as possible)"?

    Do you mean I have to merge the 50k keyword files with %keyword%, and run GSA as light as possible, which means not having indexing services checked, etc.?
  • https://forum.gsa-online.de/discussion/7692/free-software-ser-footprint-editor-easily-add-footprints-to-ser-engines/p1 You adjust the footprints that SER uses so you're aiming at targets other users aren't. SER scrapes, so you need to feed it more info.

    Split your 700k file into 50k-line .txt files. Put those files into a folder on your desktop, then in the Keyword section of SER use the %spin folder% macro. This stops you bloating SER with 700k words and helps it scrape and find targets faster. The same goes for all content within each project: use %spin folder% and keep all of your content for titles, bookmarks, articles, etc. on your desktop.

    Use an indexing service.
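    The split-into-50k-files step above is easy to script; a minimal sketch (the function name and output filename pattern are mine, chosen for illustration):

    ```python
    import os

    def split_keywords(src_path, out_dir, chunk_size=50_000):
        """Split one large keyword file into chunk_size-line .txt files.

        Returns the number of chunk files written; point SER's
        %spin folder% macro at out_dir afterwards.
        """
        os.makedirs(out_dir, exist_ok=True)
        count = 0
        with open(src_path, encoding="utf-8") as src:
            while True:
                # Take up to chunk_size lines at a time from the source file.
                lines = [line for _, line in zip(range(chunk_size), src)]
                if not lines:
                    break
                count += 1
                name = os.path.join(out_dir, "keywords_{:02d}.txt".format(count))
                with open(name, "w", encoding="utf-8") as out:
                    out.writelines(lines)
        return count
    ```

    Streaming line by line like this keeps memory flat even on a 700k-line file.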
  • OK, thanks, I will see if that helps.

    Thanks again.
  • The answers are all easily found using the search function. Look up threads related to increasing LpM.

    Nine months or so ago I too was stuck at only about 1 LpM, 4 if I was REALLY lucky :P

    I've been getting a steady 200s since a few months ago.

    It took a couple of months of tweaking.

    There is no single setting; just search the common threads related to LpM and you will start to see the trends, after you've thoroughly tested each setting yourself one by one, of course.
  • It looks like I need GScraper to get and extract the footprints.
    The only tools I have are ScrapeBox, GSA Ranker, and Captcha Sniper.
  • You don't need any of that. Just let GSA scrape; I got 200 LpM with the default footprints and GSA scraping.
  • What are your settings?