
Small GSA SER research (global list vs. normal active): old Scrapebox vs. new Scrapebox

Guys, I want to give a little insight into some research I've done on GSA SER.
Basically, I'm running into some issues, and I hope someone can clarify a few things I'm currently experiencing.


Before I start with everything: I previously used a nulled Scrapebox version for my links (yes, I know, I'm evil, bla bla).
And when I scraped with it, out of a couple million links (1-2 million), only about 100-300k were duplicates.

The second thing I want to add: when I ran GSA PI on that list, the platform was detected for almost 80-90% of the links.
Meaning I had a good list to use in GSA SER to create backlinks.

The other thing is the LPM/VPM. It was just insane: between 800 and even 1200 LPM (500 threads got me 800, and when I used 800+ threads it went up to 1200 LPM).
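
As a quick sanity check, those rates line up with the totals below; this is plain unit conversion, nothing tool-specific (a minimal Python sketch, using only the numbers I reported):

    # Links per minute (LPM) extrapolated over a multi-day run.
    def total_links(lpm, days):
        return lpm * 60 * 24 * days

    print(total_links(800, 2))   # 2,304,000 -> ~2.3M links in 2 days at 800 LPM
    print(total_links(1200, 2))  # 3,456,000 at 1200 LPM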

So here's what happened after that.
I bought the original Scrapebox because a friend told me he scraped 100 times faster than me. Seriously, the nulled version scraped 200 links per second max, while he said he scraped 6k links per second.


Ever since I bought the original Scrapebox, I get literally garbage lists.
I scraped 78 million links in one night. GSA PI duplicate removal removed 60+ million, and I was left with 11 million links.

That's not the only issue:
when I ran GSA PI, out of the 11 million links, over 9.8 million were undetected platforms.
So I was left with 2-3 million links for GSA SER.

Hold on, hold on... that's not all!

I created 10 projects (I always do that: duplicate my projects with fresh emails).
I grouped one set of 10 projects as "search active".
I made another 10 projects and grouped them as "scrapebox list".

So basically I've got 20 projects working on the exact same URL list, exact same keywords, everything exactly the same (except fresh emails/proxies).
The only difference is that I set one group to "active" and the other group to "global site list" with my 2-3 million links from Scrapebox.


So here are the results after 2 days:
[image: results of both project groups after 2 days]

The image explains everything.
GSA searching for its own targets gets better, faster, and more results than my list.

Before, with my own list, it was the exact opposite: in the same number of days (2-3) I had built 2.3 million links to my site.

Can anyone clarify what the issue could be?
Is GSA SER stuck on something?
Is the original, current Scrapebox version a piece of garbage compared to the old nulled one?

What am I doing wrong here?

Hope someone can clarify...

Who am I?
Sir Doon, at your service, bringing you personal tests, stats, and results for the lazy fucks who don't want to do them themselves.

Comments

  • Doon (Netherlands)
    Let me rewind everything here again.


    Nulled Scrapebox
    • (4-5 hours) to get a 2+ million list, with only 100-300k duplicates
    • (6-7 hours) in GSA PI to identify platforms, leaving a decent 2.3 million list (almost all links had their platform identified)
    • GSA SER: in 2-3 days it could blast through the 2.3 million links at 800-1200 LPM


    Today's test results:

    Original Scrapebox

    • (1 day / 24 hours) to get a 78 million list, minus 65 million duplicates!
    • (98 hours) in GSA PI to identify an 11 million list, where 9.8 million were undetected platforms, leaving me a roughly 3 million list for GSA SER
    • GSA SER: in 2-3 days it could only create 4k verified links
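
    Putting both runs side by side (a quick back-of-the-envelope in Python, using only the approximate figures above; nothing here is newly measured):

        # Duplicate rate and average SER throughput for both runs.
        def dup_rate(scraped, duplicates):
            return duplicates / scraped

        def avg_lpm(links, days):
            return links / (days * 24 * 60)

        # Nulled run: ~2M scraped, up to ~300k dupes, 2.3M links in ~2 days
        print(f"{dup_rate(2_000_000, 300_000):.0%}")      # ~15% duplicates
        print(f"{avg_lpm(2_300_000, 2):.0f} LPM")         # ~800 LPM

        # Original run: 78M scraped, ~65M dupes, 4k verified in ~2 days
        print(f"{dup_rate(78_000_000, 65_000_000):.0%}")  # ~83% duplicates
        print(f"{avg_lpm(4_000, 2):.1f} LPM")             # ~1.4 LPM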



    <:-P
  • Doon (Netherlands)
    P.S.!!! Another thing I forgot to add: I ran Scrapebox the entire night, because if I stopped it after an hour or 2 (which I did several times), it sometimes removed like 90% of a 7 million list and sometimes left me with only 30k links!

    What the hell is going on here?
  • LOL, exactly the same issue I ran into when I was trying to build my own lists.
  • It might be removing duplicate domains as well as duplicate URLs.
  • Hinkys (SEOSpartans.com - Catchalls for SER - 30 Day Free Trial)
    Are you sure your old nulled version isn't only scraping the first 100 results (as opposed to the 1000 in the new one)?

    'Cause that's exactly what those original-vs-nulled Scrapebox numbers look like. If the nulled one was only pulling 100 results per keyword, you would get far fewer duplicates, but also fewer links at the end.

    And that would also explain the GSA PI part, because when you're pulling 1000 results per keyword, you get a much lower percentage of identified links (but more identified links in total).
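
    Here's a toy model of why pulling deeper results inflates the duplicate rate (just an illustration; the 15M pool size is a made-up assumption, not a measured figure). If your footprints can only ever reach a limited pool of URLs, the more raw results you pull, the more often you hit the same URLs again:

        import math

        def expected_unique(pool_size, draws):
            # Expected unique URLs when `draws` results are sampled
            # roughly uniformly from `pool_size` reachable URLs.
            return pool_size * (1 - math.exp(-draws / pool_size))

        POOL = 15_000_000  # hypothetical pool of URLs the footprints can reach

        for label, draws in [("~2M raw scrape", 2_000_000),
                             ("~78M raw scrape", 78_000_000)]:
            uniq = expected_unique(POOL, draws)
            print(f"{label}: {uniq/1e6:.1f}M unique, "
                  f"{(draws - uniq)/draws:.0%} duplicates")
        # ~2M raw  -> ~1.9M unique, ~6% duplicates (close to the 100-300k reported)
        # ~78M raw -> ~14.9M unique, ~81% duplicates (close to the 65M reported)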

    Now it certainly wouldn't explain the SER part because that one just doesn't make sense.

    2.3 million vs 4k?

    There's no way in hell that scraping with different versions of SB would have such a drastic impact on the number of verified links. Hell, if you were to do 2 scrapes with completely different keywords and completely different footprints, you still wouldn't get anywhere close to that much difference in verified links at the end.

    As far as your first post / test goes, try directly importing the list into all your projects rather than using the global site list; I found I got much better performance that way in the past.

    And also make sure that each project builds links to a different site (my suggestion is a made-up URL).
  • Doon (Netherlands)
    As far as your first post / test goes, try directly importing the list into all your projects rather than using the global site list; I found I got much better performance that way in the past.

    Could you please explain this, maybe with screenshots?

    I know there are 2 ways of importing a list, and I don't know exactly which one you mean in that comment.

    @hinkys
  • Doon (Netherlands)
    I also opened "remaining target URLs". It took a while to open, and the whole dedicated server froze for a moment, lol.
    But when it opened, I saw 3.8 million links. Sooo, could it be that GSA SER just ain't doing what it's supposed to do? Because to me that looks like the entire list.
  • Hinkys (SEOSpartans.com - Catchalls for SER - 30 Day Free Trial)
    @Doon
    I meant importing them by selecting all projects and then Right Click -> Import Target URLs (rather than using the "Use URLs from global site list" option)
  • Doon (Netherlands)
    I did it the way you explained, but shouldn't that be on "global site list" only? Or should it be on "active"?

  • Hinkys (SEOSpartans.com - Catchalls for SER - 30 Day Free Trial)
    @Doon
    Well, as far as I know, on Active it's going to go through the remaining target URLs first, and after that try to get/find more target URLs if you have any options for that enabled (the "How to get target URLs" section under the project's options).

    When on Global site list, it's going to ignore the current remaining target URLs and just use... well... the global site list. The same thing (or at least that's how I understand it) as setting the project to "Active" but only enabling the "Use URLs from global site list if enabled" option (and disabling everything else under the "How to get target URLs" section) as a way to get target links.
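
    In code terms, my mental model of that behavior looks roughly like this (a sketch of how I understand it, NOT SER's actual internals; every name below is made up for illustration):

        from dataclasses import dataclass, field

        @dataclass
        class Project:
            status: str                           # "Active" or "Global site list"
            remaining_targets: list = field(default_factory=list)
            global_site_list: list = field(default_factory=list)
            other_sources: list = field(default_factory=list)  # e.g. search engines

        def next_target(p: Project):
            if p.status == "Active":
                # Active drains the remaining-target-URLs cache first...
                if p.remaining_targets:
                    return p.remaining_targets.pop(0)
                # ...then falls back to whatever "How to get target URLs"
                # sources are enabled (search engines, global lists, etc.).
                for source in p.other_sources:
                    if source:
                        return source.pop(0)
                return None
            # "Global site list" skips the remaining targets entirely
            # and reads straight from the enabled global lists.
            return p.global_site_list.pop(0) if p.global_site_list else None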


  • jamessaylor (California)
    I can only harvest 100k links from Scrapebox.
    I see you can harvest 78 million links. That's a huge URL list.
    How can you scrape 78 million links with Scrapebox?
    How many keywords do you use?
  • sage (In Front Of My Computer)
    @jamessaylor, I would also ask how many proxies he used. I tried this before with only 20 proxies, and proxies die easily when scraping with Scrapebox; maybe you need at least 100-200 proxies to get millions of links.
  • Doon (Netherlands)
    I've got 1,000 now and I had 1,100 before: 100 from Blazing, 1,000 from **bigsecret**.