I don't know if you have implemented analytics identifier comparison, but I thought analyzing HTTPS certificates could be a good option, still with the aim of tracking down a competitor.
If an analytics or HTTPS match is found between two sites, a color could be used to highlight it.
Latest update has SSL/Cert details added in FullView
"analyzing https certificates could be a good option."
Could you explain this part and how it might be used? Is it to compare SPECIFICS of SSL certs, like the quality of the certificates? I'm confused, and I think this might help me. Thanks!
Well, I'm not aware that this is a ranking factor, but maybe at some point it is.
If a certificate is from Let's Encrypt, it might not be as valuable as one from Google itself. So maybe it's a factor, but with the tool you can at least guess why a site ranks better than yours...maybe it has to do with that.
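A minimal sketch of how certificate details could be pulled for that kind of comparison (Python standard library only; the domain is a placeholder, and this is not necessarily how the tool does it):

import socket
import ssl

def cert_details(domain, port=443):
    # Open a TLS connection and read the peer certificate.
    ctx = ssl.create_default_context()
    with socket.create_connection((domain, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=domain) as tls:
            cert = tls.getpeercert()
    issuer = dict(pair[0] for pair in cert["issuer"])
    sans = [value for key, value in cert.get("subjectAltName", []) if key == "DNS"]
    return issuer.get("organizationName", "?"), sans

issuer, names = cert_details("example.com")
print("Issuer:", issuer)        # e.g. "Let's Encrypt" vs. a paid CA
print("Covered names:", names)  # shared SAN entries can hint at shared ownership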
@Deeeeeeee also sorry for not answering your private messages...will try to go through that tomorrow.
@Sven: No Worries! I can see how busy you've been. I mean, you're always creating new programs and adding features to existing stuff, but I guess the COVID situation, or just timing, has you in triple-time mode that will bring GAME CHANGERS!
@Kaine: Do you mean that we can find out which T1s are owned by the same SEO companies, much like you can use G00G AdSense IDs to find a company operating multiple sites? Still not getting it!
Yes, that's it: you can find which of your competitors owns several sites for a query that interests you.
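A rough sketch of that shared-ownership check on the analytics/AdSense side, assuming simple regex scraping (the ID patterns and domain list are illustrative only, not what the tool actually does):

import re
import urllib.request
from collections import defaultdict

ID_PATTERNS = [
    re.compile(r"UA-\d{4,10}-\d{1,4}"),   # classic Google Analytics ID
    re.compile(r"G-[A-Z0-9]{6,12}"),      # GA4 measurement ID
    re.compile(r"ca-pub-\d{10,20}"),      # AdSense publisher ID
]

def found_ids(domain):
    # Download the homepage and collect every tracking ID in its HTML.
    html = urllib.request.urlopen("http://" + domain, timeout=10).read()
    text = html.decode("utf-8", "ignore")
    return {m for pattern in ID_PATTERNS for m in pattern.findall(text)}

owners = defaultdict(set)
for domain in ["competitor-a.com", "competitor-b.com"]:  # e.g. the top-10 results
    for tracking_id in found_ids(domain):
        owners[tracking_id].add(domain)

for tracking_id, domains in owners.items():
    if len(domains) > 1:  # the same ID on two sites suggests the same owner
        print(tracking_id, "->", sorted(domains))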
@Sven I just found something that could greatly increase the value of your software and that will interest many customers; I'll send you a PM.
Small bug: when we quit and reopen the tool, the "Show Data from" section inside settings resets every time. I have also tried to add proxies (alive ones that I'm using for other tools, all successfully tested), but it won't scrape anything and shows blank results; removing them and refreshing makes it scrape the top 10 instantly and calculate everything.
I'm using the domdetailer API, but the stats are not showing up anymore: http://prntscr.com/sw3z4z The API is still working fine in their app and I still have credits.
I don't know if it's normal, but I have tested using 20 dedicated proxies; 19 of 20 got banned instantly.
What do you think of an option to revisit a page with a Google user agent to detect cloaked pages? Sometimes the ranking of a page can't be understood without this information.
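Something like this could work as a first pass (a sketch only; the user-agent strings are real, the URL is a placeholder, and a naive byte comparison will false-positive on any dynamic content):

import urllib.request

def fetch(url, user_agent):
    # Request the same URL while pretending to be a given client.
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    return urllib.request.urlopen(req, timeout=10).read()

BROWSER = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
GOOGLEBOT = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

url = "http://example.com/"
if fetch(url, BROWSER) != fetch(url, GOOGLEBOT):
    print("Responses differ -> page may be cloaked")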
I'd like to know how relevant my site's pages are to each other. I'd like to link them together based on topical relevance, but first I need to know which ones are actually related.
Is this already possible or would it be possible to implement it?
You can create a "custom set" (TOOLS->Competitor Research (full)->Custom Set) where you e.g. add all your sites in one and check them against a given keyword. It would show you where this keyword is used and what to optimize on each page.
If that's not what you mean, or if you can describe what you would like to click to get what you want, I can try adding it.
The thing is, I don't exactly know what I want; technically it's rather a vague idea. I was hoping we could brainstorm something for this.
The problem is that I want to silo a site.
Theory:
To create a silo I need at least 4-6 pages around the same topic. If the topic is widgets, then the subtopics can be blue, red, yellow, pink, and green widgets. This is pretty easy.
Optimally, the pages are already optimized for a given keyword and are easy to relate to each other.
Practice:
But in reality this is not always the case, especially if the pages are written by a non-SEO person or just without SEO in mind.
So I'd like the tool to:
1. Determine what the page is about.
2. Compare the rest of the pages on the site for how relevant they are to the selected page.
3. Do this in bulk, comparing every page to every other page, then presenting relevancy scores in a matrix.
4. An option could be implemented to see page relevancy to a given keyword. Google does this already when you search for: domain.com keyword.
So apart from option 4, I have no idea how this can be implemented or what exactly should be used to determine the relation between the pages and given keywords. Google already uses keyword buckets, and there are other sites that help find relevance between keywords, but I don't know how such data could be used, or even better, determined by the software itself.
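For points 2 and 3, one conceivable approach is a TF-IDF cosine-similarity matrix; here is a sketch assuming scikit-learn is available (the page texts are placeholders and the threshold is a guess):

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

pages = {                      # URL -> extracted visible text of the page
    "/blue-widgets": "blue widgets are the most popular widget color ...",
    "/red-widgets":  "red widgets cost more than other widgets ...",
    "/contact":      "call us or fill in the form to reach our office ...",
}

urls = list(pages)
tfidf = TfidfVectorizer(stop_words="english").fit_transform(pages.values())
matrix = cosine_similarity(tfidf)  # matrix[i][j] = relevancy of page i to page j

for i, a in enumerate(urls):
    for j, b in enumerate(urls):
        if i < j and matrix[i][j] > 0.2:  # threshold is arbitrary
            print(a, "<->", b, round(float(matrix[i][j]), 2))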
I think it's much too hard to code (and nobody has said how you could do it). The best you can get is to find a keyword identical to another page's title, for example, but that is enough to link 2 pages. First crawl the whole website, then generate all the possible links between pages.
Yes, I'm aware that I am looking for the magic "rank my site" button, but one can dream.
Linking the pages together by title or keywords is easy and straightforward, but that's not always possible. Plus, Google groups phrases together based on the same root word.
Yes, the tool should crawl each page and determine what the page is about by weighing keyword density, n-grams, link positions, link anchors, site structure...
The software would then need good linguistic knowledge to understand what it is talking about (in all languages). I don't know how this could be approached; it would take semantic dictionaries and some sort of AI to do the job like a search engine.
In my opinion, the only realizable thing is to find similar words between the pages in order to link them... the problem is word splitting, conjugation, and spelling; that's why I was talking about the titles.
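On the conjugation/spelling problem: a stemmer is one standard way to collapse word forms before matching titles. A minimal sketch, assuming NLTK is installed (the titles are made up):

from nltk.stem.snowball import SnowballStemmer

stemmer = SnowballStemmer("english")  # stemmers also exist for French, German, ...

title_a = "Optimizing Blue Widgets"
title_b = "Blue widget optimization tips"

stems_a = {stemmer.stem(word) for word in title_a.lower().split()}
stems_b = {stemmer.stem(word) for word in title_b.lower().split()}

# "widgets"/"widget" and "optimizing"/"optimization" collapse to the same stems,
# so the overlap below suggests the two pages are linkable.
print(stems_a & stems_b)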
I imagine Sven would be able to code something if it is well defined. Semantic optimization/silos are probably the things you have to work on more than ever to please Google, and customers would probably be interested in this type of functionality.
"Sometimes the ranking of a page can't be understood without this information."
Edit: @Sven Do you think it's possible to have a button that, when cloaking is detected, opens a browser page to see this version?