
GSA SER Millions of Link List Exchanges

Hi, I have more than 50 million links in my lists,

but I never say enough!! :-bd (GSA links greediness @-))

So, for those who have lists of at least 1 million links and want to exchange them for another 1 million:
you get 2 million links by doing nothing!! \m/

Then contact me on Gmail (ridofacne.com@gmail.com).

The lists I have come with high VpM and LpM, and OBL less than 50!!

Comments

  • Well, now they have become 60+ million =D>

    The newly exchanged links:

    Category - Article............: 447079
    Category - Blog Comment.......: 4312215
    Category - Directory..........: 43699
    Category - Document Sharing...: 46
    Category - Exploit............: 331615
    Category - Forum..............: 179570
    Category - Guestbook..........: 312303
    Category - Image Comment......: 136519
    Category - Indexer............: 753267
    Category - Microblog..........: 4483
    Category - Pingback...........: 701472
    Category - Referrer...........: 10840
    Category - RSS................: 2454
    Category - Social Bookmark....: 6253
    Category - Social Network.....: 126241
    Category - Trackback..........: 577796
    Category - Unknown............: 2239776
    Category - URL Shortener......: 832999
    Category - Video..............: 8246
    Category - Web 2.0............: 4713
    Category - Wiki...............: 110913
    -------------------------------
    Total.........................: 11142499
  • Trevor_Bandura 267,647 NEW GSA SER Verified List
    What do you have after you remove all duplicate URLs and Domains?
  • There's no point in wasting links by removing duplicate domains, because comments can be posted on many pages of the same domain, so you would lose a lot of links by removing duplicate domains.

    As for removing duplicate URLs, that removed about 3 million from the new lists I received (the 11,142,499).

    But my own lists already had duplicate links deleted, so, as I said, 50 million.
  • Trevor_Bandura 267,647 NEW GSA SER Verified List
    SER does not remove duplicate domains from the comment engines.
  • I scrape using Scrapebox, so I thought you meant deleting duplicates from there.

    Well, do you have lists to exchange, or are you just arguing about how much I deduplicate? As I said before, I have a very small number of duplicates, and some .sl files don't have any duplicates at all, not even duplicated domains.
  • shaun https://www.youtube.com/ShaunMarrs
    Are you up for trading specific platforms?

    Also, I delete duplicate URLs and domains from my verified folder on a daily basis; I personally believe it is good practice.

    Finally, how are you confirming these lists? The sheer number of article, social network, and wiki targets makes me suspicious. I am not trying to say you are lying about your stats, but have you ever run a purging project in SER, where you run your whole list through a project to see what actually works?
  • Well, sometimes I use the scraped lists for high VpM, but the list will take a long time.

    I import one .sl file and the targets will be over a million, and it takes days to finish a platform (and it creates over 100k indexer (junk) links). It takes a looong time, and the indexer platform starts first, before the other platforms; I don't know why.

    By the way, I am starting with new engines (only for gov/edu), and one of my customers scraped over 3.9 million in the first 3 hours.

    You can find my engines (here).
  • edited November 2015
    "and as i said before i have very little number of duplicates and some sl files dont have any duplicates at all not even dublicated domains"
    [image]
  • mashafeeq iraq
    edited November 2015
    @rogerke lol nice one
  • shaun https://www.youtube.com/ShaunMarrs
    You can just disable the indexer platform, mate, and it will skip it. Also, what Rogerke is trying to point out is that your total list will have duplicates. Just because the site lists don't have duplicates within them individually, once you extract them all, the combined folder will.

    Also, that is not a verified list you have; essentially, right now it is just a bunch of site lists that at one point someone claimed were verified.

    What you need to do is extract ALL of the site lists you have to one folder, say identified for example. Then remove both duplicate URLs and duplicate domains. Don't worry about losing blog or image targets; SER knows not to get rid of them. Then set up a bunch of projects whose only source of target links is that folder. You then tick your verified folder button in settings, so SER saves the verified links to that folder, and activate these projects.

    They will burn through the list over the course of a week or two and confirm exactly how many of those sites are still valid. You will be astounded at how many fail now for a number of reasons.
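
    The dedupe step described above (remove duplicate URLs, then duplicate domains, across all the extracted site-list files) can be sketched in Python. This is only an illustration of the idea, not SER's actual implementation: the function name is hypothetical, it assumes plain-text files with one URL per line, and note that SER itself skips domain dedupe for comment engines, which this naive sketch does not.

```python
from pathlib import Path
from urllib.parse import urlparse

def dedupe_sitelists(folder):
    """Collect URLs from every .txt site-list file in `folder`,
    drop exact duplicate URLs, then keep one URL per domain."""
    seen_urls, seen_domains, kept = set(), set(), []
    for txt in sorted(Path(folder).glob("*.txt")):
        for line in txt.read_text(errors="ignore").splitlines():
            url = line.strip()
            if not url or url in seen_urls:
                continue  # blank line or exact duplicate URL
            seen_urls.add(url)
            domain = urlparse(url).netloc.lower()
            if domain in seen_domains:
                continue  # duplicate domain, keep only the first URL
            seen_domains.add(domain)
            kept.append(url)
    return kept
```

    The deduped result can then be fed back into SER projects as the sole target source, as described above.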

  • Yep, I do remove the duplicates from all of them.

    The .sl file can be created with 7-Zip:

    I put all the text files in the folder, remove duplicates, then zip it again as an .sl file.

    I have identified lists, but the verified ones are valid and can be used in XRumer, Scrapebox poster, GSA SER, GScraper, and Sick Submitter.

    for me verified = submit(able)
    identified = just scraped using scrapebox then pf them

    By the way... I think I will close the thread.
    I might start selling gov/edu lists, so exchanging might harm my business.


    :-h
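
    The repack step described above (dedupe each text file, then zip the folder back up as an .sl file) can be sketched in Python, assuming the .sl container is an ordinary zip archive, as the 7-Zip workflow in the post implies. The function and file names are hypothetical:

```python
import zipfile
from pathlib import Path

def pack_sitelist(folder, out_sl):
    """Dedupe each .txt site-list file's lines in place, then pack
    the folder into a zip archive written to `out_sl`."""
    folder = Path(folder)
    for txt in sorted(folder.glob("*.txt")):
        lines = txt.read_text(errors="ignore").splitlines()
        # dict.fromkeys preserves first-seen order while deduping
        unique = list(dict.fromkeys(l.strip() for l in lines if l.strip()))
        txt.write_text("\n".join(unique) + "\n")
    with zipfile.ZipFile(out_sl, "w", zipfile.ZIP_DEFLATED) as z:
        for txt in sorted(folder.glob("*.txt")):
            z.write(txt, arcname=txt.name)
```

    Whether SER accepts such an archive as a valid .sl import is not verified here; this only mirrors the dedupe-then-zip steps the poster describes.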
  • shaun https://www.youtube.com/ShaunMarrs
    Also the below is pointless to have in your list...

    Category - Unknown............: 2239776

    I guess you could run it through GSA PI and then through GSA SER, but in my opinion it's a waste of time.
  • Yep, I delete that junk.

    By the way, only the contextual lists might be worth something;

    all the others are junk.
  • shaun https://www.youtube.com/ShaunMarrs
    Just to clarify the quote....

    "for me verified = submit(able)
    identified = just scraped using scrapebox then pf them"

    So it is actually a list of submittable sites? That makes a lot more sense with the numbers. It's not actually a list of confirmed sites that SER has posted to successfully and has a verified link for?
  • With private proxies, all sites that get a successful submission get verified, and 1% don't.

    With public proxies, about 5% of submitted sites get verified, and most of them get the message "unknown submission state". I thought you were an expert in GSA, but it seems you are not.

    Identified is just the sites that have been sorted into platform categories; a lot of them may have no form at all or no engine match, and a lot of them get skipped, so it's junk.
  • shaun https://www.youtube.com/ShaunMarrs
    I use a highly pruned version of the list, with the stats in the linked image. I am willing to trade it with you, but I only want specific platforms in return; I have no use for 70-80% of the platforms SER can post to. Also, a friend on BlackHatWorld has asked me to tell you that your site is returning an index error when people try to access it.

    The stats below show the list with duplicate URLs and duplicate domains removed.
