
How Do Article Submissions work?

Hi, I have just scraped a list of 200k article sites using a footprint, keywords, and ScrapeBox.
After removing duplicate URLs, 67k URLs remained.

My First Query is this:

Do I need to trim the URLs to root for GSA SER to register and submit articles, or are the URLs fine as they are?

My Second Query is:

Should I include any kind of resource box or copyright notice at the foot of each (unspun) article? The article already contains links to my site.

My Third Query is:

I am using:
Captcha Sniper
Then DeathByCaptcha
30 purchased semi-private/shared proxies
A single catch-all email address

And still, out of 67k I am only getting 13 verified submissions (although the process is only two-thirds of the way through).

Still, 13 is very low, I would think?

Here are 3 images of my:
Main Options
Project Options
Email Verification Settings

Also note that e4ent0s got me to this point, which is working great (and constant) as far as scraping goes. So this isn't a reflection on him and his great input.

image (Main Options)

image (Project Options)

image (Email Verification Settings)

Any and all advice and tips much appreciated.

PS: I thought I would ask before experimenting myself, in order to add any advice to the experiments to come.

Much appreciated and all the best to you.






Comments

  • edited August 2015
    1. No
    2. No
    3. Most people here use CB instead of Captcha Sniper.
    Don't use DBC unless you 100% know what you are doing and are an experienced user, or you will put $ in the toilet.

    Did you scrape 200k sites or 200k URLs using SB? Probably 200k URLs, not sites. I scrape 200k URLs every 30 seconds - that should tell you how fast and how much you need to scrape to get a decent list.
    Btw, you removed duplicate URLs. For contextual links you need to remove duplicate domains, not URLs.

    It looks like you are new to this and don't know/understand how many things work, so read the forum, tutorials, etc.

    Edit:
    13 verified is probably OK for 200k scraped URLs. We don't know what engines you tried to scrape; that's also important. Some engines give 1 verified per 10 unique harvested domains, some give 1 verified per 200-300 unique harvested domains.
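    The "remove duplicate domains, not URLs" step can be sketched in Python - a hypothetical snippet (not part of SB or SER; the function name is made up, and extracting the domain from the hostname is a simplification - a public-suffix library would handle registered domains more accurately):

```python
# Hypothetical sketch: reduce a scraped URL list to one URL per domain,
# which is what contextual/article engines need (one account per site,
# not one per URL).
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    seen = set()
    kept = []
    for url in urls:
        # Use the hostname (minus "www.") as the domain key.
        domain = urlparse(url).netloc.lower().removeprefix("www.")
        if domain and domain not in seen:
            seen.add(domain)
            kept.append(url)
    return kept

urls = [
    "http://example.com/articles/post-1",
    "http://www.example.com/articles/post-2",  # same domain, dropped
    "http://another-site.net/submit",
]
print(dedupe_by_domain(urls))
# -> ['http://example.com/articles/post-1', 'http://another-site.net/submit']
```

    ScrapeBox's own "Remove Duplicate Domains" feature does the same job without any scripting.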
  • Thanks for the tips. Much appreciated.
    Yes, I'm new.
    200k URLs, not domains.

    I scraped the big 3 plus the deeper web.

    To be honest, I'd rather ask questions than spend hours studying, reading posts, and watching videos when, after all that, the methods produce no better results than the default settings I have gathered/set so far.

    E.g. I got more from your reply here than I would from several posts or videos that affect basically nothing.

    CB is something I will most likely use in the future.

    Can you use both together?

    CS was the cheapest option to get me started.



  • You can use both together (CB and CS) but it won't give you much benefit. If you are new to all this, better to buy a verified list. I recommend these lists: https://forum.gsa-online.de/discussion/7660/over-400k-ser-verified-lists-mth-custom-cb-definitions-custom-ser-engines
  • Trevor_Bandura (267,647 NEW GSA SER Verified List)
    Try these settings

    image

    image

    image

    image




  • Thanks for the responses. Much appreciated.

    I'm trying those settings now, but to be honest, I think that running at 300 threads plus the 60 for proxies is more than my connection can handle.

    I actually forgot to mention that my connection is poor.

    Having 40 semi-private proxies, I thought up to 80 threads (considering the 2:1 ratio I read somewhere in here).

    But I think 40-50 is more than likely because of the bad connection.

    Now, having thought about it and seen the results I have been getting (the new settings above seem to work better), I think that all is pretty well, and it's just my connection that's stopping me from raising the thread count and going for it.

    That's using GSA to scrape as well.

    Due to the same issue, I cannot scrape enough URLs with ScrapeBox to make scraping worthwhile.

    So after fixing my connection, which is due in October (moving providers to fibre), I should be pretty much ready to go.

    I'll test my old defaults and the new settings @Trevor_Bandura gave until that time.

    So thanks, gents. I really appreciate you taking the time to answer my newbie queries.

    But I think I have seen and read enough now to know a little bit about what's going on and how to solve things.

    NOW FOR THE VERIFIED LISTS.

    The lists you link to above seem to be very small. 400k? Or am I wrong?

    400k URLs would be especially small for guys with no connection issues running dedicated servers and VPSs, although from where I am looking, with my poor connection, 400k is quite large, just because of the lack of resources to scrape that number, etc.

    But thanks, I may try one month at least to test.

    Thanks. Muchos Appreciato.


      
  • Trevor_Bandura
    @hawk007 Ya, I guess I forgot to mention in my post: don't use 300 threads unless you have enough proxies and also a good connection. I always recommend about 3-4 threads per proxy max. Others use more, but for me that works best, from all the testing that I have done.
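    As a rough illustration of the ratio described above (a hypothetical helper, not a SER setting; the function name and default multiplier are my own assumptions):

```python
# Sketch of the 3-4 threads-per-proxy rule of thumb: pick a thread cap
# from the proxy count rather than guessing a big number.
def max_threads(num_proxies, threads_per_proxy=3):
    return num_proxies * threads_per_proxy

print(max_threads(30))     # 30 semi-private proxies at 3:1 -> 90 threads
print(max_threads(40, 4))  # 40 proxies at the upper 4:1 ratio -> 160 threads
```

    On a weak connection, the connection itself, not the proxy count, may be the binding limit, as the thread below discusses.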
  • Thanks, Trevor. You enlightened me a lot on how it actually works, so now I can solve my issues and get going.

    Thanks. 
  • Trevor_Bandura
    No problem. If you have any other questions, you can PM me or even hit me up on Skype. I'd be more than happy to help if I can.
  • I do have another query, but I'll open a new post for it, for forum organization purposes. That way I get varied responses, which helps in the learning process - seeing other people's perspectives.

    Thanks.