
Help with contextual link LPM

I have been running GSA for a week now with 2 tiers. The first tier built only 150 links during that week, which is very low. I need to raise the LPM for articles, directories, web 2.0s and social bookmarks.

How can I do so?

Comments

  • Brandon  Reputation Management Pro
    If I remember correctly you posted another thread about the same topic. Contextuals will never give a high LPM, but 150 links in a week is still extremely low. You probably need more (or better) proxies, a better list (I never scrape with SER), or a better server (unlikely).
  • It is not the proxies or the server, as tier 2 made 15k links in that same week. What do you scrape links with? Scrapebox?
  • LPM on T1 contextuals is always low until you've built up a good list. You can buy a list, or scrape with Scrapebox or Gscraper.
  • As low as ZERO LPM?? I keep increasing keywords, emails and proxies with no result. I select engines by language and run 24/7 at 0.05 - 0.01 LPM.

    This is going nowhere...
  • I still can't get any links built for contextuals.
  • It's probably your proxies getting burnt for PR checking and/or making search queries, and because you're scraping on search engines that aren't giving you any results.
  • 1. Export all footprints for contextual targets to a file.
    2. Import them into Gscraper (paid version).
    3. Check the index count for all footprints. Remove the footprints with less than 5k results and save them to file 1, then remove the footprints with less than 10k results and save them to file 2, remove those with less than 25k results and save them to file 3, remove those with less than 50k results and save them to file 5, remove those with less than 100k results and save them to file 6, then export all footprints still left in Gscraper to file 10 (a scripted sketch of this bucketing and the later dedup appears at the end of this post).
    4. For the footprints from files 1-6, scrape with 10-20% of Gscraper's default keywords - that should be enough to find most of the targets those footprints can give.
    5. For the footprints from file 10, do a full scrape using the default Gscraper keywords or your own keyword list.
    6. Remove duplicates from all the scrape result files using Gscraper.
    7. Import the results of all scrapes into Gscraper using "list import", remove duplicate domains, and save to a file.
    8. Import into GSA for identification.
    9. Post.
    10. Repeat from step 4.

    Scrapebox is a joke compared to Gscraper when it comes to Google scraping, so don't use it. Here is my very short comparison between the two (same proxies, same footprints, etc.):

    Note: Use the Gscraper paid proxy service for scraping - it's the best publicly available Google-passed proxy service. You will need X weeks to perform a big scrape.
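
    A minimal Python sketch of the bucketing in step 3 and the domain dedup in step 7 - not Gscraper's own routine, just an illustration. It assumes you have already exported a tab-separated footprint/result-count file and a plain-text file of scraped URLs, and it reads "save to file N" as saving the footprints removed at each cut; the file names and thresholds simply mirror the steps above.

        from urllib.parse import urlparse

        # Thresholds from step 3; file numbering follows the post above.
        BUCKETS = [(5_000, "file1.txt"), (10_000, "file2.txt"), (25_000, "file3.txt"),
                   (50_000, "file5.txt"), (100_000, "file6.txt")]

        def bucket_footprints(counts_path):
            """Split footprints into files by Google index count.
            counts_path is assumed to be tab-separated: footprint<TAB>count."""
            with open(counts_path, encoding="utf-8") as fh:
                rows = [line.rstrip("\n").split("\t") for line in fh if line.strip()]
            for threshold, out_name in BUCKETS:
                dropped = [fp for fp, c in rows if int(c) < threshold]
                rows = [(fp, c) for fp, c in rows if int(c) >= threshold]
                with open(out_name, "w", encoding="utf-8") as out:
                    out.write("\n".join(dropped))
            # Whatever survives every cut is the "file 10" full-scrape set.
            with open("file10.txt", "w", encoding="utf-8") as out:
                out.write("\n".join(fp for fp, _ in rows))

        def dedupe_by_domain(urls_path, out_path):
            """Keep one URL per domain before importing into GSA (step 7)."""
            seen, unique = set(), []
            with open(urls_path, encoding="utf-8") as fh:
                for line in fh:
                    url = line.strip()
                    if not url:
                        continue
                    domain = urlparse(url).netloc.lower()
                    if domain not in seen:
                        seen.add(domain)
                        unique.append(url)
            with open(out_path, "w", encoding="utf-8") as out:
                out.write("\n".join(unique))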
  • @satyr85 Thanks a bunch for the detailed explanation of how to use Gscraper for building a list. I was thinking "oh, another $60" until I saw the proxy subscription is an additional $60, and then I saw I'd need X weeks. I'm sure going to write it down as a future solution, but I'm a little low on $$ resources until the projects are finished, and several weeks would set me back a lot. Much thanks - I will bookmark your solution.

    @fakenickahl That is probably it, but what confuses me is that the console doesn't give any indication that my IPs are dead. I will check with another set of IPs to see. Thanks!
  • @satyr85 I'll give that a try also, thanks. I purchased Gscraper a while ago but haven't really put it to good use yet.
  • Meanwhile, if anybody would like to help by looking at my GSA settings and console messages through TeamViewer, maybe they can figure out what's wrong. I would greatly appreciate it, as I have run out of ideas.
  • gooner  SERLists.com
    edited May 2014
    @blackhatpriest - There are at least 2 cheaper proxy solutions on BHW. I've tested 1 of them and it was faster than the Gscraper proxies. That one was $20/month - RedProxy - maybe that's a cheaper option for you.
  • @gooner Thanks, I will get on it.
  • How come you guys are able to scrape through private proxies in something like Gscraper? I tried and got all 60 proxies burnt out in minutes (none working now to scrape in Google lol) at just 100 threads.
  • @gooner I don't seem to find the thread you're talking about. Can you provide a link please?
  • It could also be filters. Are you filtering PageRank, Language, Country, Links on Page, etc?

    Nothing wrong with filters, but they kill LPM.
  • gooner  SERLists.com
    @blackhatpriest - I don't have the links saved, sorry. BHW has a proxy section, I think - I found them in there.

    Or search "redproxy" or "red-proxy" maybe and you should find it. You can also visit their website: http://red-proxy.com/


  • @Satans_Apprentice I filter PageRank to 0+. As for country filters, I have tried everything, country based and language based, and it all leads to nothing.
    @gooner Thanks again.
  • It would seriously help, before I take another step, if someone checked my GSA state through TeamViewer. I would greatly appreciate it.
  • @gooner Sorry for asking again. I just visited red-proxy.com and I see proxies identified as "scraping proxies" and "captcha proxies". What defines a proxy as good only for scraping, and how are they different from other types of proxies?
  • gooner  SERLists.com
    I don't know mate, ask their support and I'm sure they'll tell you.
  • @gooner
    Faster over what time period? Proxies from BHW are fast, and you can scrape with them at good speed for X, maybe XX minutes - after that the proxies start dying and your speed drops lower and lower toward zero. Compare how many targets you can scrape in 48 hours with Gscraper proxies vs BHW proxies.

    Sellers on BHW check proxies with Scrapebox to see whether they are Google-passed or not. Google-passed proxies = good for scraping.
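
    (A rough Python sketch of what such a "Google passed" test amounts to - not Scrapebox's actual check - assuming a plain HTTP proxy in ip:port form, an arbitrary test query, and Google's "unusual traffic" block page as the failure marker.)

        import requests

        def is_google_passed(proxy, timeout=10):
            """Return True if a Google search request gets through the proxy
            without hitting the block/CAPTCHA page."""
            proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
            try:
                r = requests.get(
                    "https://www.google.com/search",
                    params={"q": "test"},
                    proxies=proxies,
                    headers={"User-Agent": "Mozilla/5.0"},
                    timeout=timeout,
                )
            except requests.RequestException:
                return False  # dead or unreachable proxy
            return r.status_code == 200 and "unusual traffic" not in r.text.lower()

        # Example: keep only the Google-passed proxies from a candidate list.
        candidates = ["1.2.3.4:8080", "5.6.7.8:3128"]  # placeholder proxies
        print([p for p in candidates if is_google_passed(p)])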
  • Should I run Gscraper on the same VPS as GSA?
  • gooner  SERLists.com
    edited May 2014
    @satyr85 - They are updated twice daily. I use a lot of inurl scrapes and the Gscraper proxies are notoriously bad with those, so for me, over 24 hours, the BHW proxies can scrape 2-3 times more URLs.
  • Depends on how many Google-passed proxies you have. My Gscraper uses 30-60% of the CPU on my Xeon 1245v2 dedicated server at 1500 threads, but I have 3-6k Google-passed proxies 24/7, and that's why the CPU usage is so high. Fewer working proxies means lower CPU usage. Btw, how much do you pay for your VPS?
  • I have a 3-core CPU, 2.5 GB RAM VPS that I got for $40.
  • edited May 2014
    You know what, fck it, I'm in... I wasn't going to do this, but something changed my mind.
    You want 100-200k URLs a minute from port-scanned proxies? That's me.
    I'm also ranked #1 on Google for "gscraper proxies".

    Sales thread: http://www.blackhatworld.com/blackhat-seo/proxies-sale/544337-gscraper-proxies-2-3k-per-day-life-time-subscription.html - feedback: http://www.blackhatworld.com/blackhat-seo/itrader.php?u=21623

    BHW's longest-selling proxy seller: almost 6 years, around 1000 subs and 0 refunds.
    The proxies I provide for Gscraper will not work in Scrapebox, but they will beat the hell out of the Gscraper proxies.
    My Scrapebox service uses different proxies; the two sets never meet, as one set won't work on the other.
    I've had quite a few people leave the Gscraper proxies and come to me.

    [screenshot]
  • edited May 2014
    @proxygo

    My public proxies - maybe I should start selling :P
    [screenshot]
  • goonergooner SERLists.com
    @proxygo - You will only sell to BHW members who have a high post count.
    I don't know about others, but I can't be bothered to jump through hoops and silly criteria for the sake of a $20 service.


  • edited May 2014
    Is that all? Let me post you this from 1 of my subscribers.
    I believe some of you here know trevorb, so you know I'm not making it up.
    And these will not work in Scrapebox, so they don't get over-burned.

    [screenshot]



  • edited May 2014
    @proxygo - Are you going to sell your proxies to us or not? You little shit.

    I'd like to try them out. I steal other Hrefer users' proxies right now and use another paid service (not Gscraper).