No progress for several days

My project is now at least two, maybe three, weeks old. For the past four days it has stayed at 3 Follow and 2 Nofollow links. I'm wondering whether this means all the available links have been exhausted. I am working on another article spin to add to the Article Manager in my project. Will SER go back through potential link sites because I have added a new article? What do I do if my project doesn't add any more links?
Thanks,

Comments

  • Has anyone else had SER just stop progressing? I'm wondering if this product was worth the effort. It's been 5 days now with no new backlinks created. I have "Article", "Social Bookmark", "Social Network", "Web 2.0", and "Video" submissions, all the US engines checked, and about 40-50 keywords. I went through tutorials on all the options and think I have them set correctly. I can't figure out why SER just keeps running without creating any backlinks. In total, I have 73, but it has been at that number for 5 days.
  • edited July 2014
    Can you tell us more about your hardware and proxies? How are you finding targets?

    I've generated at least 20k unique domains in my verified list over the last 2 weeks, and that's small-time stuff.
  • 40-50 keywords? You need a tad more than that mate :)
  • 20K new backlinks in 2 weeks?? I selected an option to have my proxies increased by 100 every time they drop below a certain point, but I do see a lot of messages in the Message pane that say "Proxy might be blocked". I have 499 emails for the project.

    One thing did happen overnight. Last night I rewrote my Description 250. I had spun it manually before, but now I have The Best Spinner, so I rewrote it with TBS. Maybe it is a coincidence, but I gained 4 more backlinks overnight. I am in the middle of spinning another article with TBS, but it is taking some time, and I'm wondering if that will raise my backlinks too. Still, I'm flabbergasted that I am here struggling with my puny 77 links while you are mopping up with 20K in the same amount of time.

    Could it be that my 40-50 keywords are just too low? My website is for selling fine art that I paint (I'll be making another project for my jewelry), and I thought I had just about exhausted the number of keywords that could be used. But I looked at Google keywords for painting and there are several hundred listed; I could use all of those. Do you think that number of keywords is what is slowing me down?

    As to how I am finding targets, I assume SER is looking for targets under Article, Web 2.0, Social Bookmark, Social Network, and Video, then using my keywords to locate them. The Message pane keeps spewing out information, like it is looking. My number of submitted links keeps climbing to around 50 or 60, but then it drops back to about 20 without any of the submitted links being verified. Does any of this give you a clue about why I am doing so poorly?
  • I was checking on how I do my proxies. There are a few checkboxes that I had not been able to get information on. One is "Proxies Should Resolve Domain to IP"; I don't have that checked. Should it be? "Test Proxies" is set to "All good only" and "Public/Private". "Check if Anonymous" is not checked. I search for new proxies every 60 minutes if I have fewer than 100 proxies. "Try Using Proxy-Keep-Alive" is unchecked. I remove bad proxies after 1000 minutes. "Automatically disable public proxies when they appear to be down" is checked; the one for private proxies is not. Could any of my settings account for why I am getting so many "Proxy might be blocked" messages, if those messages are even a problem?
    Thanks,
  • Find a list of 10k-100k generic or popular keywords and add them to your project; 40-50 keywords in a probably thin niche will get you nowhere.

    "Proxy might be blocked" comes up when you're scraping search engines, right? If so, it just means that either your search terms have no results for them or your proxies are getting banned. Also, I'm pretty damn sure that you have already scraped every link possible with your 40-50 keywords.
  • edited July 2014
    What I was mostly asking for is:

    are you using private proxies
    are you using a vps or a decent computer with fast internet connection (more than 20 mbps)

    Here's my verified list from the last few months. This is small time. There are guys out there who blow this out of the water way faster because they know how to scrape and what to try to post to in the first place.

    I scrape with Gscraper and only focus on the types of CMS that are worth trying to post to (high success rates). I'm really just starting to get good at building lists. It's taken a lot of effort and experimenting to learn this, and there is still much more effort I need to put in. SER is not a waste unless you waste it.

    image

    Some of this might have come from a free verified list I imported, but most of it was junk and didn't work anyway, so I don't think it's still in there; I'm not sure. Almost all the articles and other contextual link types in there are from scraping and posting.


  • fakenickahl, I am very interested in your answer about the keywords. If I use generic keywords, won't SER try to enter my spun information into irrelevant websites? Does that even matter? If my articles are about fine art subjects, don't I need to have them inserted into some kind of art-based platform? Or maybe that doesn't matter either.

    On a similar note, the way I am typing in keywords is one at a time from a list in an SEO app that gives me 100 keywords at a time based on a keyword I enter. There's no way to highlight the entire list, so I enter them one at a time. I assume that to input something like 10K keywords I would have to do some serious copy/paste. What are you doing to find so many keywords? I tried the old Google keywords app, but now one has to sign up for AdWords at a cost to access it. So the only option I can think of is the SEO app on my web host. Any input on how to gather a bunch of keywords at a time would be greatly appreciated.
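  • (Not SER-specific, but if you can export each 100-keyword batch from your SEO app to its own text file, a few lines of script will merge them into one deduplicated list you can paste into a project in a single go. A minimal sketch in Python; the file names are made up for illustration:)

    ```python
    def merge_keyword_files(paths, out_path):
        """Merge several keyword list files into one deduplicated file,
        one keyword per line, preserving first-seen order."""
        seen = set()
        with open(out_path, "w", encoding="utf-8") as out:
            for path in paths:
                with open(path, encoding="utf-8") as f:
                    for line in f:
                        kw = line.strip().lower()
                        if kw and kw not in seen:
                            seen.add(kw)
                            out.write(kw + "\n")
        return len(seen)  # number of unique keywords written

    # e.g. merge_keyword_files(["painting_keywords.txt", "jewelry_keywords.txt"],
    #                          "keywords_merged.txt")
    ```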
  • the_other_dude, I am solely using the proxies I get by filling out the proxy information on the "Submission" > "Options" menu. I don't remember having bought proxies or anything like that, unless my memory is failing me. I'm just relying on SER to supply the proxies, so I think they are all public proxies. Should I be buying private proxies somehow? My computer has a 100 Mbps cable connection, and I have a quad-core processor running Windows 7. I don't notice any drag on the system from running SER.
  • You need to get some shared private proxies. Use these in SER to scrape the search engines and post. Don't use public proxies; you are just slowing things down and getting banned by the search engines.

    If you get shared proxies this will increase your ability to find targets to post to.

    Eventually you should also buy Gscraper or Scrapebox and learn how to scrape yourself; then you can post a lot more.
  • Could you steer me in the direction of where I would get the private proxies?
  • edited July 2014
    Proxies listed by cherub here:


    are all viable solutions... except maybe Fineproxies (they are sold to and used by so many people that it's more appropriate to call them public, in my opinion).
  • I looked at BuyProxies and they have packages such as 30 proxies for $25 per month and 100 proxies for $75 per month, which is big money. I have no idea how many proxies I need, although based on my SER setting of new proxies every 60 minutes if under 100, it seems like I would need a lot more than 100, at a prohibitive cost. How many proxies do you buy per month? Is that sufficient?
  • edited July 2014
    Personally I am not using my private proxies for scraping so I am not the right person to ask. I can't help you there, sorry.

    But maybe a cheaper route would be to use Gscraper ($68) or Scrapebox ($57) plus some public proxy packages to scrape separately ($30+/month), and then use the scraped links with as few as 10 private proxies in SER. Or buy a small number of private proxies and make "precision scrapes" with good footprints in Scrapebox/Gscraper.
  • I'm using 30 semi-dedicated proxies from buyproxies.org. I scrape with Gscraper, not SER.
  • the_other_dude, @Nikodim,
    Are you saying that SER is not precise enough in its scraping, so that I would need more than 10 private proxies per month if I didn't use Scrapebox or Gscraper? On a similar note, I can't find any place on the "Submissions Options" page where I can tell SER to use private proxies that I have bought. How is that done?
  • Personally I've never used the built-in scraping; I just suggested two options besides SER scraping that are inexpensive and viable in my opinion.

    I think it's totally OK to scrape with SER; it's just that I have no expertise in that field, so I can't really comment. I think that if you tweak SER's built-in scraper enough (i.e. use multiple search engines to delay IP bans, use less aggressive footprints, etc.), it can easily give the same if not better results, maybe even with a low number of proxies.
  • edited July 2014
    I don't know how to use SER to scrape. Any time I tried it, my IPs got banned from search so rapidly that I had no targets to post to. I already know how to use Gscraper and Scrapebox, so I just didn't feel like learning.

    You're going to need way more than 10 private proxies; you need at least 30.
  • Thanks. I will look into buying 30 proxies per month. Do you think that will stop all of my "Proxy might be blocked on --" and "Proxy blocked on --" messages?

    Also, can someone tell me how to import my new proxies into SER? I don't see any obvious way on the "Submissions Options" page.
    Thanks,

  • This guy has lots of good videos that will help you out with settings and other things. To import proxies, on the submission options menu just click "Configure", then add your proxies from a file or the clipboard.

    Adding proxies will stop the banned notifications, but for how long I don't know. I don't use SER to scrape.

  • Great video. I went to BuyProxies.org and bought 20 proxies: 10 for scraping, and 10 marked public for page-rank checking, as the video indicated. But when I ran SER again, it was a dog. I ran a test of all my new proxies against Google and every one came back "not working". I'm very irked and have put in a support ticket to BuyProxies.
    Has anyone else had this problem?
    On a similar note, I bought shared private proxies by mistake when the video said to buy dedicated private proxies. If BuyProxies gets me working proxies, then next month I will get dedicated ones.
  • The answer is apparently that I did not use a username and password, but I don't understand what that means. I noticed that the Add Proxy screen said "Import from file(login:password@host.port)", but I didn't import my proxies from a file that had a username and password. I was only able to copy and paste my proxies from the BuyProxies site into a txt file, and that is how I imported them. Can anyone who uses BuyProxies tell me the procedure for getting my proxies and adding them with a username and password?
    Thanks,
  • Did you try host:port:login:password? Try that.

    10 proxies for PR checking and scraping is probably not going to be enough, but I guess it won't hurt to try.
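  • (Side note: that host:port:login:password line is easy to check or reshape with a few lines of script. A minimal sketch in Python, assuming exactly four colon-separated fields with no colons inside the password; the sample proxy values are placeholders:)

    ```python
    def parse_proxy_line(line):
        """Split a 'host:port:login:password' line into its four fields."""
        host, port, login, password = line.strip().split(":")
        return {"host": host, "port": int(port),
                "login": login, "password": password}

    def to_at_format(line):
        """Rewrite a proxy line into the login:password@host:port shape
        that some import dialogs describe."""
        p = parse_proxy_line(line)
        return f"{p['login']}:{p['password']}@{p['host']}:{p['port']}"

    # e.g. to_at_format("1.2.3.4:80:myuser:mypass") -> "myuser:mypass@1.2.3.4:80"
    ```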
  • The problem is that I don't know where I'm supposed to add the host:port:login:password information, or where to get it from. I did have to create login information for the BuyProxies site; does that have anything to do with it? What host and port are we talking about? Where do I get this information?
  • the_other_dude,
    Where do I get this host:port:login:password information? And how do I use it with the import command for my file?
  • I just took another look at my proxies file. At the end of each IP address line, my username and password are included. Does this satisfy the host:port:login:password format? If it does, then the proxies are still not working. I'm still waiting for help from BuyProxies.
  • Okay, I think I may have figured this out. The file they gave me has login:password attached to the end of each IP line. However, the import options for SER all indicate that they also need the @host.port information, which is not in the file that BuyProxies gave me. Can anyone confirm that I am on the right track?
  • Yes, don't worry about the format from BuyProxies. I import mine from them as host:port:login:password, and that's just how it works for some reason. Once you have done that, you can check them against Google and see that they are very fast and working.
  • @the_other_dude: the file they gave me has IP:80:login:password. It does not have a host. Could that be why none of them are working? Do the files that you get from them have a host? I have imported them again and they are still not working.
  • Host = IP. You can import from file or clipboard.


    image
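  • (If you want to sanity-check an authenticated proxy outside of SER's built-in tester, here is a rough sketch using only the Python standard library. It assumes an HTTP proxy with basic auth; the host, port, and credentials in the usage line are placeholders:)

    ```python
    import urllib.request

    def proxy_works(host, port, login, password, timeout=10):
        """Try fetching Google through the proxy; True only on HTTP 200."""
        proxy = f"http://{login}:{password}@{host}:{port}"
        opener = urllib.request.build_opener(
            urllib.request.ProxyHandler({"http": proxy, "https": proxy}))
        try:
            with opener.open("http://www.google.com", timeout=timeout) as resp:
                return resp.status == 200
        except Exception:
            return False

    # e.g. proxy_works("1.2.3.4", 80, "myuser", "mypass")
    ```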
