@kijix84 They no longer have PR saved with the URL. I quit doing that some months ago because no one was using PR anyway.
@loopline OK, and does the subscription service have a limit on the number of subscribers before you won't sell it to any more people? I don't see that mentioned in the initial post.
@kijix84 It does have a limit, sort of: there's a limit per group. But I have the system set up so I can clone it as needed and produce multiple clusters of servers, which produce multiple separate lists. I have more than one technique for building the lists, so while there could be some bleed-over between lists, they are more or less unique.
I am working on redoing the recaptchas. I'm going to try out Blazing SEO's recaptcha solving, but I may just discontinue the recaptcha list. Almost no one uses it or the identified list.
I'm not sure "good success" and "recaptcha" really go in the same sentence, but sure, they are probably as good as the rest.
I have been having good luck with ReverseProxies OCR; I would try that one. I don't know why people wouldn't want the recaptcha targets. They can be better targets.
@kijix84 I think it's that people don't understand. I send out at least two emails plus sales material explaining things, but I still get questions, and even people having issues who let me log in to their server. I see people with things set up wrong: they don't have recaptcha solving, or have the identified list set up wrong, or just don't understand what it's for.
I mean, there is a lot going on, and SER has a lot of options, but in surveys people want what Captcha Breaker can solve and that's about it.
@kijix84 I don't know, I don't track it. The identified list is going away in the very near future, probably within the next week. It's useless now.
I had some users that wanted to parse it and use recaptcha etc., but now that I have a recaptcha list and I'm redoing it, I already go through the entire identified list with Captcha Breaker multiple times, then turn around and go through it with the recaptcha/text captcha solving. So there is really no point wasting resources on it. All the resources and effort will shortly be going into the recaptcha/text captcha list.
Verified links come from a diverse range of niches, as do the recaptcha and text captcha verified lists.
They are true verified links. I no longer offer an identified list. It was pointless; I already comb all lists with Captcha Breaker many times over.
Then I take whatever doesn't pass and comb over it for targets that may use recaptcha or text captcha. So there isn't really a point in giving out an identified list when I have a server farm that has already hit it from every angle.
I don't check or limit PR with my verified links, as all projects vary in need. You can set up those filters in GSA and it will automatically post only to targets from my list that meet your filters.
You can import directly if you want, but I give a full video tutorial of how to hook up the live sync to your projects.
Live sync is ideal for many reasons, a few of which are that my system finds new targets all day every day. Those targets are synced out to users within minutes of being found and your GSA can start posting to them right away.
Using live sync, your GSA automatically picks up URLs from my list, scans them against any filters you have, and then creates links on them, so there's no need to keep importing new lists.
I remove dead targets daily as well, and then monthly I completely reset the entire list and build it from scratch. So by using live sync your GSA will always have the freshest list, up to the minute.
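The daily dead-target cleanup described above boils down to a liveness check over the list. A minimal sketch in Python (the `is_alive` heuristic and function names are mine, not the actual system):

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen
from urllib.error import URLError

def is_alive(url, timeout=10):
    """HEAD-request a target; treat any 2xx/3xx response as alive."""
    try:
        req = Request(url, method="HEAD", headers={"User-Agent": "Mozilla/5.0"})
        with urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except (URLError, OSError, ValueError):
        return False

def prune_dead_targets(urls, check=is_alive, workers=20):
    """Return only the targets that pass the liveness check, keeping order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(check, urls))
    return [url for url, ok in zip(urls, results) if ok]
```

The `check` parameter is injectable, so the pruning logic can be exercised without hitting the network.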
I'm working on an Amazon affiliate niche. GSA SER/Scrapebox can scrape keyword-related links, but you provide a general verified list. How do I find targets for my niche to submit to? If I use the "keywords must be present on" option, does it read the keywords from the Data/keyword field? How do I keep submissions from your non-niche verified links on-topic for my niche? Can you elaborate?
You can add filters in GSA that, for some platforms, post only to pages that contain your keywords. But many platforms, like article, wiki, etc. (basically any contextual links, plus indexer links and some other platforms) create a unique/new page for your content and link to go on.
So it won't have your keyword on it, but with contextual sites you can build whatever content you want, all about your niche, and then include a link on the page as well.
For the others you can still build some links. It's not natural for all links to a site to come from niche-related sites, because people link randomly. You want to look natural, so some random links are a good thing, and you can filter the rest. Also, for contextual platforms where you create the content, you can write the niche content yourself.
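A "keywords must be present on the page" style filter like the one discussed here is essentially fetch-and-match. A rough sketch, assuming a whole-word match against the raw HTML (the function names and user agent are illustrative, not GSA's internals):

```python
import re
from urllib.request import Request, urlopen

def has_niche_keywords(html, keywords):
    """True if any keyword appears as a whole word in the page HTML."""
    text = html.lower()
    return any(re.search(r"\b" + re.escape(kw.lower()) + r"\b", text)
               for kw in keywords)

def page_matches(url, keywords, timeout=10):
    """Fetch a target page and apply the keyword test to its HTML."""
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urlopen(req, timeout=timeout) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    return has_niche_keywords(html, keywords)
```

Note this only ever matches existing page text, which is exactly why it excludes contextual platforms: the page for your content doesn't exist until after you post.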
You can add filters in GSA that work with some pages to post to pages that contain your keywords
That's what I understood. So if I buy verified links from you, enable Options / "Always use keywords to find target sites", and put my keywords in the keyword field, will it submit only to related links? Am I understanding that correctly?
That setting is only for scraping; you need a setting further down the options tab. But the same caveat I gave above applies: if you try to filter to posting only on pages that are about your niche or have your keyword on them, you will lose basically all your contextual targets, which are among the best links you can get.
I would like to be successful with my T1 campaign. If I buy your verified links pack, will it contain PR > 8 contextual links? Or are there low-quality verified links in it?
I don't track PR, given that it's shut down. You can use GSA's PR product or PR Jacker to guesstimate PR and then set that filter in GSA.
But I don't know exactly what would be in the list. There are definitely quality domains in there, and there are low-quality domains; I just add everything, I don't discriminate. So you can set GSA to post only to what you want.
But if you kill off contextuals and only look for pages with your keyword and PR 8, yet want to be successful in T1, then I would not recommend my list, or any list for that matter. These sites are going to be few, as in VERY few, and far between.
If I keep "Use Yandex TIC as PR" and "Skip sites with a PR below" enabled, does that work?
Your pack is a bulk verified link list, but in my SER I have a daily submission/verification limit of 20. If I import your verified links into a project, will submissions still respect that daily limit of 20 verified? I want to control my link velocity.
I check all my metrics in Scrapebox, so I don't know if it works in GSA. I presume it does; Sven is a good programmer, and I don't know why it would not work.
I don't really understand your question, but I think you're asking if my list will work with your 20-per-day limit, and the answer is yes. GSA SER still controls everything with my link list, so it will use all the filters and limits you set.
But based on what you're saying, I don't think my list service is right for you, and I would advise you not to buy it. I'm not out to get people's money; if a service isn't a good fit I'll readily say so. I don't think my service is a good fit for you; I don't think any list service is. You should scrape for your own targets, since you plan to use really strict filters.
I would also recommend you get text and recaptcha solving for your needs in general.
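A per-day cap like the 20-per-day limit discussed above is just a counter that resets each calendar day. A minimal sketch of the behavior (the class and its structure are illustrative, not SER's internals):

```python
import datetime

class DailyLimiter:
    """Allow at most `limit` actions per calendar day, resetting at midnight."""

    def __init__(self, limit, clock=datetime.date.today):
        self.limit = limit
        self._clock = clock          # injectable for testing
        self._day = clock()
        self._count = 0

    def allow(self):
        """Return True and count the action if still under today's cap."""
        today = self._clock()
        if today != self._day:       # new day: reset the counter
            self._day, self._count = today, 0
        if self._count >= self.limit:
            return False
        self._count += 1
        return True
```

The point of the seller's answer is that this gate sits in SER itself, so it applies the same whether targets come from scraping or from an imported list.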
Thank you for your valuable response. You respect your customers, and all of your answers are very tidy and respectful. Congratulations; please keep up your specialized work.
Well, it sounds like you want only very high PA/DA type links, and on top of that you want to filter out ones that don't have your words on them. By the time you add in all the filters you want, you're probably going to get very few links, if any, from my list. That's why I wouldn't recommend it.
You would do better to just let GSA scrape for your keywords with your platform filters already applied. I still don't think you are going to get very many links, but a few great links is still good. Just my opinion.
I'm using the GSA scraper, but when I use filters it doesn't generate high-quality links, so I need to purchase verified links. I'm studying it a little now.
I have one more doubt.
1) What is the main role of the private proxy: "search query" or "submission"?
2) Should public proxies be used for verification? (Is there any chance of leaking my IP?)
3) Where does IP leakage matter: search query, submission, verification, or PR check?
4) If GSA SER is scraping at the moment and I also import verified links into the project, will both work at the same time?
Submissions are the "main" role, but you can use them for scraping if you do it correctly.
Yes you need to "test and verify" that a public proxy is anonymous, but GSA will do this.
It's important for search queries and submissions. You can't get PR any more, and GSA and PR Jacker won't care about your IP. Verification doesn't require a proxy either.
Yes, you can scrape your own targets and use verified links at the same time.
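The "test and verify" anonymity check mentioned above amounts to fetching an IP-echo service through the proxy and comparing the address it reports against your real one. A rough sketch (the echo URL, helper names, and JSON shape are assumptions; ipify is just one public echo service):

```python
import json
from urllib.request import ProxyHandler, build_opener

def echoed_ip_via_proxy(proxy_url,
                        echo_url="https://api.ipify.org?format=json",
                        timeout=15):
    """Ask an IP-echo service what address it sees when we connect via the proxy."""
    opener = build_opener(ProxyHandler({"http": proxy_url, "https": proxy_url}))
    with opener.open(echo_url, timeout=timeout) as resp:
        return json.loads(resp.read().decode())["ip"]

def proxy_is_anonymous(echoed_ip, real_ip):
    """Anonymous only if the address the remote side sees is not your real IP."""
    return echoed_ip.strip() != real_ip.strip()
```

GSA runs an equivalent check for you, but this is the idea: a transparent proxy passes your real address through, an anonymous one does not.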
1) How do I make niche-related comment posts with GSA SER/Scrapebox? Please suggest the best guides or YouTube links.
2) What should I tick under "Type of Backlinks to create" for my T1 – contextual (Article/Wiki/Web 2.0/Social networks)?
1) Not sure I follow... are you asking how to make a niche-related comment for the site you're posting on, or for your site? Or just how to post on sites related to your niche?
For contextual sites, just create content about your niche. Otherwise you can use the filter in GSA to skip sites that don't have your desired niche words on them.
2) That's more of an SEO strategy, which isn't something I go into here; I leave it up to you. But if in doubt, stick with contextual niche-related links and then build other link types higher up in your tiers.