What is the expected life span of a GSA SER list?

As a newbie to GSA, I purchased a SER list (the red one) for $49. What is the life span of one of these, meaning when will I have to replace it with another?

Comments

  • Accepted Answer
There is no fixed life span; @sven is brilliant with updates!
  • catchallseo www.catchallseo.com <- Disposable Emails NO MORE!
    @craigbal1

    If the list is sold to 50+ users: 1-2 weeks, most of the time...
  • craigbal1 United Kingdom
    What, for $49? You must be joking. Can anybody else shed any light on this?
  • Tim89 www.expressindexer.solutions
    Accepted Answer
    Lists can be used over and over and over, but the effectiveness of a list will deteriorate as time goes by due to it being spammed. It's quite impossible to give an exact time frame, or even a general time frame, unfortunately.

    You could try your best to keep the list clean and working by checking whether the root domains are still in Google's index (or whichever search engine you're targeting), so you can weed out the domains that have been de-indexed and work only with indexed domains. If the main root domain isn't indexed in the search engines you're working with, then there's no point building a link from one of its inner pages.
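The pruning step Tim89 describes can be sketched in a few lines of Python. This is a minimal sketch, not part of SER or Scrapebox: the function names are mine, and the set of de-indexed domains is assumed to come from your own `site:domain.com` checks.

```python
from urllib.parse import urlparse

def root_domain(url):
    """Extract the root host from a URL, stripping a leading 'www.'."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def prune_deindexed(urls, deindexed_domains):
    """Drop every URL whose root domain has fallen out of the index.
    `deindexed_domains` is assumed to come from your own site: checks."""
    dead = {d.lower() for d in deindexed_domains}
    return [u for u in urls if root_domain(u) not in dead]

kept = prune_deindexed(
    ["http://www.alive.com/guestbook", "http://dead.com/post/1"],
    ["dead.com"],
)
# kept == ["http://www.alive.com/guestbook"]
```

Run this periodically over your verified list and you only ever post to inner pages whose root domain still counts.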
  • craigbal1 United Kingdom
    Thanks, that sounds more promising. By the sounds of it, maybe I should build my own. Any advice on building my own list, for future reference? Point me in the right direction; I have Scrapebox. Do I need anything else?
  • Tim89 www.expressindexer.solutions
    edited March 2015 Accepted Answer
    It is fairly easy to scrape targets yourself, and it's good that you have Scrapebox; that's a starting point.

    You just need some footprints and some keywords, and then it's literally a click-and-wait game...

    If you need some help with footprints, for now you can grab the ones that come preinstalled in SER; they'll do. As for getting targets, here's how to do it:

    1. Options > Advanced > Tools > Search Online for URLs.

    2. Select whichever platforms you want to scrape for here; they'll get added to the main window as you're adding them.

    3. Once you've added all of your platforms, copy them and paste them into a text file for future use. Then import them into Scrapebox, get a nice big keyword list, merge the two together using Scrapebox, and scrape! It's that easy.

    Moving on to the next level of scraping would be creating your own custom footprints so you can yield a better verified ratio: find out which elements/footprints of a page SER likes, then rinse and repeat... You can also scrape using only keywords related to your niche, but this will bring fewer results, depending on your niche's popularity.

    You're also going to need some proxies. You don't need to go for expensive private proxies; there's a service by BanditIM which is good. You would need to install Scrapebox v2 to get it to work properly.
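The footprint/keyword merge in step 3 above is just a cross-product of the two lists. Here's a quick Python sketch of what Scrapebox's merge feature produces; the example footprints and keywords are made up for illustration.

```python
from itertools import product

def merge_queries(footprints, keywords):
    """Cross-merge every footprint with every keyword into search
    queries, mimicking Scrapebox's list-merge feature."""
    return [f'{fp} "{kw}"' for fp, kw in product(footprints, keywords)]

footprints = ['"Powered by WordPress"', 'inurl:guestbook']
keywords = ["fishing", "golf"]
queries = merge_queries(footprints, keywords)
# 2 footprints x 2 keywords = 4 queries ready to feed to your scraper
```

The list grows multiplicatively, which is why a "nice big keyword list" matters: 1,000 footprints merged with 10,000 keywords gives 10 million unique queries.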
  • craigbal1 United Kingdom
    This is absolutely brilliant, cheers.
  • shaun https://www.youtube.com/ShaunMarrs
    Accepted Answer
    My experience with purchased lists is around the four-week mark; by the six-week mark they are all but dead. It's amazing what taking that $49 and getting a VPS, some proxies, and learning how to scrape your own list can do.
  • Accepted Answer
    @craigbal1 It's about what you do with the list and how many people are using it; other than that, you can use the list for a lengthy period of time, mate.
  • dwwwb 94941
    edited March 2015 Accepted Answer
    @Tim89 Great stuff. Love Scrapebox. I would add that GScraper with their proxy service is worth the investment. This is good until tomorrow (not an affiliate link):

    http://www.gscraper.com/gscraper-for-2015.php
  • Tim89 www.expressindexer.solutions
    Accepted Answer
    @dwwwb Sure, GScraper works well too.
  • dwwwb 94941
    Accepted Answer
    @Tim89 How do you like the Bandit proxies with Scrapebox? Which plan would be needed for 24-hour-type scrapes?
  • Tim89 www.expressindexer.solutions
    Accepted Answer
    @dwwwb The 50-port plan, currently at $52.50, would suffice. It works up to 500 threads, and the proxies rotate every 10 minutes. Actual recommendations would be pretty vague, because everyone can adjust their number of threads and delay per proxy/port in their scraping tool, which increases or decreases how quickly the proxies burn out.

    I'd suggest starting with the 50-port plan though; for one installation running 24/7 it'll scrape you a ton of URLs without having to wait the full 10 minutes for dead/burned proxies to rotate. Experimenting with your own setup is the only real way to find out the cost-to-yield ratio.
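As a rough illustration of the numbers above (50 ports, up to 500 threads), the per-port thread budget is simple arithmetic. This sketch is illustrative only, not a BanditIM recommendation:

```python
def threads_per_port(ports, total_threads):
    """Even thread budget per proxy port; pushing more threads
    through each port just burns the proxies out faster."""
    return total_threads // ports

budget = threads_per_port(ports=50, total_threads=500)
# budget == 10 threads per port
```

Tuning the delay per port against this budget is the "experimenting with your own setup" part: lower delay means more URLs per hour but more burned proxies per 10-minute rotation window.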
  • Tim89 www.expressindexer.solutions
    edited March 2015
    I agree with @shaun... Scraping really doesn't require much effort, nor does it require an amazing VPS; it can even easily be run in the background while running SER. If your budget can stretch a little further, GSA Platform Identifier makes it even easier to process raw scrapes into identified site lists that import straight into SER, which saves even more time.

    - Cheap VPS (or your current SER VPS) = $40 - $70 depending on your VPS plan
    - Proxies = $30 - $50
    - Scrapebox or GScraper, whichever you go with = $97 / $68 one-off payment
    - GSA SER PI = $57 one-off payment

    That's a recurring cost of around $70 for a lifetime of scraping millions and millions of potential targets; it's really not that expensive.
  • @Tim89 Thanks! I currently break up 100 buyproxies to scrape and feed other tools. I guess I'll try these on a third machine to test head to head and see what the results are. Once configured, do they need to be changed out manually every month?
  • Tim89 www.expressindexer.solutions
    @dwwwb When you purchase the subscription, you're given static IPs which you import into your scraping tool once... These are personal IPs that only you will have; the rotating IPs are funneled/routed through your static IP:PORT.

    So no, you will not have to change them out; they run on their own and refresh by themselves. But yes, definitely test them to see if they work with your setup requirements.
  • dwwwb 94941
    Accepted Answer
    @Tim89 That sounds like a better solution. Not having to monitor and change them out constantly would free up valuable time for creating good content and building sites. Does this work for any application using an ip:port:user:pass configuration?
  • Tim89 www.expressindexer.solutions
    Accepted Answer
    Yes, they should. I would recommend contacting @BanditIM for further information and setup procedures for additional tools though, just to make sure they will work with the tools you need them to work with.
  • dwwwb 94941
    Accepted Answer
    Thanks, much appreciated bro.
  • BlazingSEO http://blazingseollc.com/proxy
    Accepted Answer
    @dwwwb - We simply give you a list of static proxies that are ip:port and that authorize to your IP. @loopline, a respected list seller on this forum, uses my proxies to build his lists... he made a video to show the incredible speed.


  • dwwwb 94941
    Accepted Answer
    @BanditIM Just bought 50 ports to test with. Seems to work well with GSA. Still figuring out different aspects of Scrapebox; there are some tools that don't function normally.
  • Accepted Answer
    As someone who purchases that actual list: you will see declining performance from the list, as one would expect. That list gets beaten to hell pretty quickly. The first week, expect excellent results with high verification rates; the second week, good results; and then just toss the list, as you will be spinning your wheels at that point. Remember the whole reason for buying a list is to spam high tiers with massive links; it is a time saver not to have to scrape your own list. But as others have already said, you need to get into the habit of scraping your own lists. I build my own tier 1 manually, use the SER list for tier 2, and blast with my own lists at tier 3. Seems to work for me. Good luck.
  • Accepted Answer
    Agreed. @BanditIM's are the best proxies.
  • BlazingSEO http://blazingseollc.com/proxy
    Accepted Answer
    Just here to let everyone know that tonight we will be doubling our pool size for proxybandit users, and by the middle of next week we will have our "fail-safe" algorithm implemented, which will ensure that 90%+ of the proxies you use are alive (not necessarily Google-passed, but simply alive... though nearly all of our proxies pass Google anyway).
  • Accepted Answer
    @viking

    Not sure why you're doing it that way, but it really looks like a broken strategy. The way you're doing it now, your tier 2 will consist of spammed-to-death domains, and because of that it will end up almost completely broken within a couple of weeks.

    Regarding the OP's question, I can't comment on that since I don't use bought lists anymore, but my own scraped lists lose about 10% of their targets each week. I think that's pretty normal.

    I can also vouch for Proxybandit. I'm currently using the 25-port package and I'm up to a solid 300 URLs/s. I'm using their ultra-fast web scraper as well for the advanced-operator footprints.
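The roughly 10% weekly attrition mentioned above compounds, so a scraped list shrinks faster than it first appears. A one-line Python sketch (the starting figure is hypothetical):

```python
def targets_left(start, weekly_loss, weeks):
    """Targets remaining after `weeks` of compounding weekly attrition."""
    return round(start * (1 - weekly_loss) ** weeks)

remaining = targets_left(start=10_000, weekly_loss=0.10, weeks=4)
# remaining == 6561, i.e. roughly a third of the list is gone in a month
```

That compounding is why continuous scraping beats any one-off list, bought or self-built: you have to keep topping the list up just to stand still.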


  • Accepted Answer
    @rogerke Oops, my bad. I meant to say my own list for tier 2 and the SER list for tier 3. You are right, the way I wrote it is not a good way to go.

    Re: @BanditIM, I use 20 private proxies and they are rock solid. I never have an issue with them.
  • catchallseo www.catchallseo.com <- Disposable Emails NO MORE!
    @craigbal1

    If you are willing to spend the time and resources to make your own list, that is the best option, rather than buying a verified list.

    However, I am using a monthly subscription service by @IdentifiedURLs which scrapes links for me every 4 hours and gets them IDENTIFIED as working platforms for GSA SER. It automatically injects the links into the folder on my server which I have set as my IDENTIFIED URLs folder.

    Please read MY POST to discover how the service works; it helped me accumulate 8,000 unique domains from contextual engines.
  • Accepted Answer
    ^ I can second sabahat. IdentifiedURLs is a whole different ball game. At around $35 recurring, it's definitely cheaper than what you would incur if you were to scrape yourself.
  • craigbal1 United Kingdom
    I agree, time to start scraping myself.
  • Can I scrape using just SER?