Source: SEOSignalsLab

Pick His Brain!

I’d like to introduce one of our members, Edward Kubaitis, for our next ‘Pick His Brain’ session, and I want to thank him for his participation.

Edward Kubaitis is the creator of CORA, a popular SEO correlation analyzer tool.

CORA has been discussed in numerous forums, FB groups, and SEO communities with many loyal followers.

If you have any questions regarding CORA or his SEO process, feel free to pick his brain.

Here are the rules.

1) I’ll let the thread go on until he asks me to stop. Theoretically, this thread can continue until the Facebook stock value goes to zero.

2) Please, no snarky remarks. I will not tolerate any intentional negativity. We are here to learn from each other’s success and strategies.

3) Please do not PM him and bother him. If you have a private question, ask for his permission on this thread when appropriate.

#PickHisBrain


Can we expect Cora GMB version in near future?

Yes. A local SEO version of Cora is the third most requested feature.

It would take about 6 months to properly build a version 1. Maybe a little less if we got lucky on getting things right the first time, but that is often hard to do.

I want to start developing a Local Cora after I finish Cora for Writers and my Rank Tracker.

Are Cora for Writers and the Rank Tracker separate from the Cora base software, or will they be bundled within it?

Rank Tracker might be separate; not decided. Cora for Writers will be bundled in for existing subscribers and might be sold a la carte as a “Cora Lite.”

Any strategy to recover from the recent Aug/Sept updates?

Most updates I have lived through usually involved a shift in which factors matter the most.

The general pattern is that sites that are negatively impacted are generally short on the new factors that matter most.

When the Medic update came out, Barry Schwartz let me analyze his survey data of all the people impacted and their keywords.

This pattern was very true for the Medic update. Kyle Roof also had a good observation about missing trust signals and contact info that looked plausible too.

Would you mind sharing some of the most powerful trust signals that are likely to move the needle?

Email, phone, terms link, privacy link, proper use of trademarks and copyrights, contact links, hours of operation, SSL, etc.

Basically showing you are accountable for the content on the page.
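
To make that checklist concrete, here is a minimal sketch (not part of Cora) of how you might scan a page for a few of those trust signals. The signal list, regular expressions, and URL are assumptions for the example only.

```python
import re
import urllib.request

# Hypothetical patterns for a few of the trust signals listed above.
TRUST_SIGNAL_PATTERNS = {
    "email":        r"mailto:|[\w.+-]+@[\w-]+\.[\w.]+",
    "phone":        r"tel:|\(\d{3}\)\s*\d{3}[-.\s]?\d{4}",
    "privacy link": r"href=[\"'][^\"']*privacy[^\"']*[\"']",
    "terms link":   r"href=[\"'][^\"']*terms[^\"']*[\"']",
    "contact link": r"href=[\"'][^\"']*contact[^\"']*[\"']",
}

def audit_trust_signals(url: str) -> dict:
    """Return a rough present/absent map for each trust signal on one page."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
    found = {name: bool(re.search(pattern, html, re.IGNORECASE))
             for name, pattern in TRUST_SIGNAL_PATTERNS.items()}
    found["ssl"] = url.startswith("https://")  # SSL is checked from the URL scheme
    return found

if __name__ == "__main__":
    print(audit_trust_signals("https://example.com/"))  # hypothetical page
```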

What’s your response to correlation naysayers?

Most of the time when people throw “correlation is not causation” in my face it is to justify their lack of use of math and evidence in their practice.

I usually ask them to show me their case study and guess what… they don’t have one.

If you want to be an SEO Shaman I really don’t mind.

We use correlation for clues into what is influencing the sort order.

This reduces a field of possible things to work on from about 600 down to 2 or 3 dozen.

In general this makes our guesses much better than your guesses which is why it works.
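
To make the idea concrete, here is a minimal sketch of a rank-correlation pass, assuming you already have factor measurements for the top 100 results. The factor names and data below are invented for the example; this is not Cora’s actual code.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical data: one row per result (positions 1..100), one column per on-page factor.
rng = np.random.default_rng(0)
positions = np.arange(1, 101)
factors = {
    "word_count":     3000 - 20 * positions + rng.normal(0, 300, 100),
    "title_keywords": rng.integers(0, 3, 100),
    "h2_count":       rng.integers(0, 12, 100),
}

# Correlate each factor with SERP position. A strong negative rho means
# "more of this factor goes with a better (lower-numbered) position".
scores = {}
for name, values in factors.items():
    rho, _ = spearmanr(values, positions)
    scores[name] = rho

# Keep only factors whose |rho| clears a threshold: the field of ~600 candidates
# collapses to the short list worth working on.
shortlist = {name: round(rho, 2) for name, rho in scores.items() if abs(rho) > 0.3}
print(shortlist)
```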

How do you deal with the ever-changing SERP movements of comparison sites especially when a correlation analysis is only a snapshot?

This is a 3D chess question.

Let me touch upon a couple of the many aspects of it.

Short answer.

It takes 5 minutes to run a new updated report.

That is pretty much realtime data for SEO.

Dimension one: You are stack ranked by your competition. Google isn’t manually picking which page ranks where for each keyword.

Google is aware of over 200 trillion URLs. So wrap your mind around the notion that Google doesn’t rank your site.

Your keyword competitors do.

Dimension two: Even for a keyword like “Google”, Google will only show around 500-600 results.

There are millions of pages about Google.

It appears there may be a volatile middle where Google tests content from outside the primary index.

So unless you are the best performing content you only get to timeshare your spot in keywords like that.

Dimension three: If you don’t tune your meta description for your target keywords then Google may pick a snippet off the page.

Google can then go into A/B testing of snippets, and your ranks can change based solely on the snippet Google displays: up 5, down 5, up 5, down 5.

If you let Google pick the snippet site-wide, you can expect a ton of volatility in your SEO.

And if we take it into a fourth dimension… your website isn’t the only website making moves.

At any moment you may be playing against 3, 4, 9, or more players simultaneously.

As to how I deal with it… I measure often and take small bites. Rome wasn’t built in a day.

Prioritize on things most likely to help the situation.

How do you factor in the off-page elements when comparing?

Me personally.

I sprinkle in off-page from time to time. Most of the quick wins in SEO tend to be on-page fixes.

Sometimes on-page isn’t enough; then I will turn on the off-page APIs and look at backlinks, referring domains, social signals, etc.

I don’t use them all the time because we know backlinks correlate, top sites have a lot of them, and in general they don’t change often by significant amounts.

So when the on-page data leaves questions, I like to pull in more of the story with off-page.

How long after implementation of Cora suggestions can you expect to see the needle move?

First you need to have the skill to act on the recommendations and second you have to have enough motivation to make the changes.

If those two requirements are met then Cora customers generally see improvements around 2 to 10 days after reducing their deficits.

The clock only starts once you see your changes in the Google cache.

Google has to be aware of the change to start the timer.

It isn’t 100% either. Going from 64 to 12 is often a lot easier than going from 5 to 4, so every situation can be different.

Every niche is different too.

Some keywords have to contend with directories.

Some niches largely only compete on off-page, like casinos. But a bias for action is a key ingredient for success.

How is it possible to analyze a batch of keywords with Cora in a row? I need to analyze hundreds of keywords every month but have to type in every single keyword, which takes so much time. How can I run a bulk analysis for a set of keywords?

I can’t automate Cora. It would violate a core term of service with Google and I don’t want to do that.

It is the term regarding automated search queries.

Cora is a web browser for SEOs. One human makes one manual search, THEN Cora takes over from there.

There are other businesses working on automation tools, but I suspect Google doesn’t like me and I don’t want to give them reasons to take action.

Just wanted to preface this by saying I’m a huge supporter of your work and an avid Cora user, and I do not plan on changing that. However, what would you say is the largest difference between Cora and bench marketer? (I’ve always wondered about the similarities.)

I don’t mean to slight them, but I haven’t heard of bench marketer so I honestly don’t know.

Is PoP still required if you have a subscription to Cora?

No. POP is a totally different tool. I like POP, by the way.

POP only measures about a dozen or so factors, but those factors were scientifically tested and proven to matter.

So POP is a great tool for quick wins. It is built in a way where you pick the websites you want to emulate.

Cora is totally different. Cora analyzes the top 100 results and uses math to tell you which factors among hundreds appear to influence rankings the most and how much you need of each to be competitive.

Many people say they complement each other. I would imagine where they agree is especially potent intel.

What made you create Cora? What were your motivations behind it?

I found all the tools on the market lacking. I needed 3 projects in Rank Tracker Pro to support one client, and they all shared the same keywords.

Link-Assistant told me twice in person that they weren’t going to fix that.

I was spending most of every month collecting data in tools that weren’t telling me what I needed to know.

I had one client fire me because, as he saw it, why did he need an SEO when a Moz subscription was $20 at the time?

So I built the tools I always wanted. I love my work. Best decision I ever made.


My mission is to empower SEOs and to legitimize the industry with math and science and to deliver real advantage in a way that makes good SEOs even better.

I hope nobody else ever has to walk the shitty road I went down. Math and Science are hopefully a cure for that.

I have a post that ranks in spots 2-4 for a number of related keywords (say 10) but rarely #1. It has been sampled as a snippet. Would you still follow what Cora says, despite it comparing against 100 spots, or just try to push it with links?

Cora only uses the top 100 to figure out which factors appear to influence rankings, but by default Cora only looks at the top 3 and the page 1 average to tell you how much of each factor you need.

There is a setting in Cora called the “Deficit Strategy” that lets you change this behavior. If you want to compare only to the top 1 or 2, you can.

There are about a dozen different options to use in different situations.

If a directory site is in the top 3, would I be best advised to refer to POP (leaving out the directory site) and compare the suggestions of the two programs?

In Cora you can ban sites, so you can see the analysis without certain sites if you want.

Are you planning to get your own SEO show in the future?

A number of people want me to reboot SEO Fight Club. If I can get 40 likes on this comment I’ll do it.

Measurement of internal links and the anchor text of said internal links, is this available in Cora?

There are a couple internal link factors via the off page APIs but Cora does not crawl websites on its own.

SageMaker or, honestly, regression analysis in TensorFlow would do the trick, but you’d have to create the training model first with regard to classification (think of it as object recognition / classification).

I already use regression in a few places. I convert the factor data into line equations for the community shared data.

Storing the equations instead of the data saves a ton of transfer time and storage.
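
As a rough illustration of the “store the equation, not the data” idea, here is a sketch with NumPy. The factor values are made up, and this is not Cora’s internal format.

```python
import numpy as np

# Hypothetical measurements: SERP position vs. the measured value of one factor.
positions = np.array([1, 2, 3, 5, 8, 10, 20, 50, 100])
word_counts = np.array([2900, 2750, 2600, 2400, 2100, 2000, 1500, 900, 400])

# Fit a line: word_count ~ slope * position + intercept.
slope, intercept = np.polyfit(positions, word_counts, deg=1)

# Instead of shipping the raw arrays, you share two numbers per factor.
shared = {"word_count": {"slope": round(float(slope), 2), "intercept": round(float(intercept), 2)}}
print(shared)

# Whoever receives the equation can reconstruct an estimate for any position.
estimate_at_position_3 = shared["word_count"]["slope"] * 3 + shared["word_count"]["intercept"]
print(round(estimate_at_position_3))
```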

Will you, in the near future, offer pay-as-you-go (per report) pricing, since some of us only have one website? People offering per-report services on third-party selling sites would then be cut out.

This already exists. I let subscribers sell reports, and there are some great providers on Fiverr and elsewhere.

Can you recommend someone who follows your Cora reports for single pages at a time? Specifically for small businesses that don’t have the budget… or, better yet, a thorough video walk-through of how to do the entire roadmap.

Any tool like Cora or POP is going to tell you how much you need of various factors.

It is still up to you as the SEO to figure out how best to do it.

I wouldn’t want to play favorites, and I don’t know everybody’s pricing, but I am sure that if you did a shout-out in your favorite SEO group saying you are looking for SEO services from someone who uses Cora, you could probably get some quotes.

If all you need is the report you can go to Fiverr, but if you need someone to do the changes too, then what you need is to hire an SEO that uses Cora.

I read your article ranting about what SEOs do, and I sympathize with your potential client-prospect. But for ecommerce, with all of the filters and configurability (e.g. think auto parts site), what is the best technical approach to manage all of the variations/personalizations that a page can go through, to ensure Google just ‘sees’ the one base page? http://seos.blog/wp-content/uploads/2018/11/email-1.png

Without getting into the weeds…

Ignoring parameters in GSC, rel=nofollow, and canonical tags are your best white-hat tools.

In extreme cases, cloaking sorts and filters so Google can’t crawl them might be needed.

Usually stores that run amok are custom built, and the engineers are too elite to ask an SEO basic questions about avoiding common pitfalls.
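
One generic way to approach the “one base page” goal is to compute a canonical URL by stripping the sort and filter parameters before emitting the canonical tag. The sketch below is only an illustration; the parameter names are assumptions and will differ per platform.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical parameters that merely sort or filter an existing category page.
NON_CANONICAL_PARAMS = {"sort", "order", "color", "size", "price_min", "price_max", "page_size"}

def canonical_url(url: str) -> str:
    """Return the base category URL with sort/filter parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in NON_CANONICAL_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

# The value you would put in <link rel="canonical" href="..."> on every variation.
print(canonical_url("https://shop.example.com/brakes?sort=price&color=red&brand=acme"))
# -> https://shop.example.com/brakes?brand=acme
```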

These are fantastic insights. I feel overwhelmed by the number of things I know I should do but don’t have the time to do.

We all have been there.

It takes about 2 years of full time SEO to learn everything I know.

And about a third of that knowledge is evergreen. So you should be challenging the validity of half your SEO knowledge every year.

My SEO checklist has 2500 checkpoints separated into 61 categories and some of it I would call dated info at this point.

Can you recommend a Shopify theme that you have come across that is SEO friendly?

I cannot.

I think Shopify is actually OK in that, in general, it does more good than harm in SEO, but it is very hard to address SEO issues in it.

I would recommend you find an SEO that specializes in Shopify because that individual would give you a better answer than I can.

In regard to the word count recommendations on the overview: is this the best word count for a page (generated from the top 100)?
Also, the Cora page tool: can that be used to create a template that could be handed to writers, who would then build a page around the Cora-generated page that is produced?

The default settings in Cora will recommend the largest value from results 1, 2, and 3, and the page 1 average.

The overall maximum is generally much larger. Cora’s word count is a conservative word count.

Most other tools report more because Cora doesn’t include 1-2 letter words.

So in your experience (as a general rule) should we be targeting the page one average (even though conservative) or the maximum from the top 3?

Take all four factor measurements from results 1-3 AND page 1 average.

Whichever of those four is largest is the goal.

In the Cora way of doing things we refer to that as the Practical Maximum.
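
A tiny sketch of that Practical Maximum arithmetic, with invented numbers; Cora computes this for you, so this is only to show the rule.

```python
# Hypothetical word-count measurements for one factor.
result_1 = 1850
result_2 = 2200
result_3 = 1600
page_1_average = 1900

# The goal is the largest of the top-3 values and the page-1 average.
practical_maximum = max(result_1, result_2, result_3, page_1_average)
print(practical_maximum)  # 2200
```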

From an SEO standpoint, is it better to have fewer categories and no sorts/filters? Although the user experience would be less than ideal, as users are scrolling through fewer pages but with more products per page… Thanks for answering all our questions.

I don’t think that is better. I think that Google doesn’t need to crawl the same category 18 times because of sorts and filters.

Google should crawl the category once to find all the product pages and then move on to the next category.

When Google gets into infinite loops because of sorts and filters, only 5-10% of your website gets crawled, because eventually Googlebot gives up after finding the same pages endlessly.

So handling sorts and filters properly will get you more pages indexed and better organic traffic.

I have an app that allows NoIndex + NoFollow. It will just apply NoIndex and NoFollow – will not get indexed by search engines – no links will be followed.

The item will still be searchable through your store’s search.

Is the ability to track “what changed” during an algorithm update as simple as having one Cora result before and one after? As in, not daily up until the update, but from any point since the previous update? Curious about how to start my “diff” setups.

You want a good report history for your most important terms. Sometimes it is a real debate just figuring out when the update happened.

Google often rolls things out slowly. You should run your most important keywords at least weekly.

From an SEO and speed standpoint, which is better to use: a “load more products” button or infinite scroll? The latter slows page speed; the former adds pagination.

There are limits to Googlebot’s JavaScript execution time on a page.

In general Googlebot runs about 1.2 seconds of CPU for JavaScript for a page render.

So I prefer old-fashioned pagination links and rel next/prev, because I can verify complete Googlebot crawls in the web logs.
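
For the “verify complete Googlebot crawls in the web logs” step, something along these lines can work, assuming a common combined log format. The log path, URL pattern, and page count below are placeholders.

```python
import re
from collections import Counter

LOG_FILE = "access.log"                                       # placeholder path
PAGINATED = re.compile(r"GET /category/widgets\?page=(\d+)")  # placeholder URL pattern
EXPECTED_PAGES = 12                                           # pages the category actually has

seen = Counter()
with open(LOG_FILE, encoding="utf-8", errors="ignore") as fh:
    for line in fh:
        if "Googlebot" not in line:        # crude user-agent filter
            continue
        match = PAGINATED.search(line)
        if match:
            seen[int(match.group(1))] += 1

missing = sorted(set(range(1, EXPECTED_PAGES + 1)) - set(seen))
print(f"Googlebot hit {len(seen)} of {EXPECTED_PAGES} paginated pages; missing: {missing}")
```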


What language or languages did you use to build Cora? Python, Java, C++, or C#?

Java

I have noticed Ahrefs backlinks is one of the metrics you use. Have you considered using something like Majestic to pull backlink or referring-domain topical relevance into the mix, finding a correlation between the topical niche of a keyword and the topical relevance of the referring domains?

I tried to integrate Majestic, but their contract terms are stupid.

They want to make API integrators financially liable if not enough Majestic users use your integration.

Ahrefs and SEMrush gave me no strings and free API units… boggles my mind.

I took it to the top with Majestic; I guess they feel their customers don’t need cool integrations.

Maybe if some Majestic customers complain they’ll change their minds.

I know a number of Majestic customers have switched to Ahrefs over it, and a larger number have simply started with Ahrefs because of it.

So I am really speechless with how shortsighted their program is.

According to your experiments/experience, which backlink is stronger: (a) a backlink from a post with relevant organic traffic, or (b) a backlink from an authority site (niche relevant) but without organic traffic to the post?

A link is a link to me. It is hard enough to get an honest link these days that I’m not going to say no to one.

I’ve built empires with exact match sitewide footer links before but I only did it with websites I controlled.

If you use spammy links you should point them at Web 2.0 sites or blogs you control and then from there to your money sites.

Obey the first law of SEO and never make a mess you can’t clean up.

I’ve seen Cora is not considering Ahrefs’ page traffic value as a ranking factor. Are you going to consider it as a factor?

Heck yes! It wasn’t part of the API when I integrated it, but if it is there now I’ll happily add it.

What are your thoughts about creating lots of service landing pages to attack multiple keywords? I am asking about creating multiple landing pages regardless of where they are being linked from.

I would keep the main navigation simple and focused and link the keyword pages from a site map menu in the footer.

I think most local service businesses don’t have much choice.

A plumber with a 15-mile service area probably has hundreds of basic keyword-location combinations.

Blogging about each of them is ridiculous because they are practically identical.

So I would target one keyword per page and use a site map menu in the footer to cross link.

They are dual-purpose pages because doing this should increase your Quality Score for AdWords, and if you have to explain the pages to Google, that is what you say.
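
To show the scale involved, here is a trivial sketch that enumerates keyword-plus-location page slugs for a service business. The services and towns are invented for the example.

```python
# Hypothetical services and towns for a plumber with a 15-mile service area.
services = ["drain cleaning", "water heater repair", "emergency plumber", "sewer line repair"]
towns = ["springfield", "shelbyville", "ogdenville", "north haverbrook", "capital city"]

# One keyword per page: every service/town combination gets its own landing page,
# cross-linked from a site map menu in the footer rather than the main navigation.
slugs = [
    "/" + f"{service} {town}".replace(" ", "-") + "/"
    for service in services
    for town in towns
]

print(len(slugs))   # 20 pages from just 4 services and 5 towns
print(slugs[:3])
```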

What are you referring to as a “site map menu in the footer”? An actual nav menu in the footer or the sitemap itself?

A site map can be organized as a multilevel menu. So just structure it as a jQuery menu in the footer of the page.

This lets you keep the main navigation clean and simple.

Have you seen cases where correlation edits just don’t work? Is it niche specific? What do you attribute this to (if you’ve ever seen it)?

Every case is different and I have to investigate to give you my best answer on your specific situation.

Sometimes there are bigger problems in play than the ones you are choosing to fix.

Did you change enough? Are there huge off page deficits?

Do you tune the page for multiple terms? Is this an indexing problem versus a tuning problem?

What would be your next steps if following the CORA recommendations DIDN’T improve rankings after a suitable period of time?

I would look for other large deficits that we may be contending with and I would also look for any complicating factors.

If you only did on page check the off page.

Is Google crawling the whole website? Spot-checking the Google cache to make sure Google is picking up the changes is an important first step.

Keep in mind that going from 77 to 14 is generally easier than going from 4 to 3.

So did you do enough to really get ahead of the competition? And when all else fails, get a consultation.

Ask another expert if they see something in the report or situation you might be missing.

My subscribers are welcome to get consultations from me from time to time. Just have to schedule it.

How often do you recommend running Cora for your target keywords? Do you run it every month, bi-weekly, or weekly?

If things typically change monthly I’d run it weekly. If things typically change weekly then I’d run it daily.

So it depends how aggressive your niche is and how important the information is to you.

The important part is that you maintain an archive over time so you can do diffs from before and after noteworthy Google updates.

You’ll be very thankful you have an archive when an update happens that you need to react to.
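
A minimal sketch of the kind of before/after diff the archive enables, assuming each archived report has been reduced to a factor-to-value mapping in JSON. The file names and format are assumptions, not Cora’s export format.

```python
import json

def diff_reports(before_path: str, after_path: str) -> dict:
    """Return the factors whose measured values changed between two archived reports."""
    with open(before_path, encoding="utf-8") as fh:
        before = json.load(fh)
    with open(after_path, encoding="utf-8") as fh:
        after = json.load(fh)

    changed = {}
    for factor in sorted(set(before) | set(after)):
        old, new = before.get(factor), after.get(factor)
        if old != new:
            changed[factor] = {"before": old, "after": new}
    return changed

# Usage: snapshots saved weekly, compared around a suspected update date.
# print(diff_reports("report_2018-09-20.json", "report_2018-10-04.json"))
```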

Am I wrong in arranging these as strong > b > em > cite > u > i for on-page factors?

It changes from keyword to keyword because it is based on what the competition is doing. And recent tests hint that diversity of factors is the strongest factor of all.

(a) Does proper use of <section> tags matter in SEO? (b) Does using content under a tag affect SEO? I’ve seen this kind of code on alltop (dot) com (please refer to this screenshot).

If you make a hundred test pages and test a different HTML tag on each one, Google does sort them.

And every time you repeat this test you get the same top 12, but after a certain point they all start to be roughly equivalent to just a keyword mention on the page.

So outside of the top tags like titles and headings, etc., what you are striving for is diversity.

So it is probably valuable for diversity and keyword mentions.

For part (b) of your question, I think it does affect your overall TF-IDF score for relevancy, so it may very well help.

This effect may explain why we sometimes see keywords in HTML comments, script tags, class names, and obscure HTML tags correlate with rankings.

Those kinds of mentions by themselves may not be factors, but in aggregate they may affect term frequency, keyword density, and overall diversity.
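
As a rough way to see how mentions in comments, class names, and script blocks could add up, here is a sketch that counts a keyword across the full HTML source versus the visible text only. The parsing is deliberately crude, and the page and keyword are hypothetical.

```python
import re
import urllib.request

def keyword_counts(url: str, keyword: str) -> dict:
    """Count a keyword in the whole HTML source and in an approximation of the visible text."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")

    # Visible-text approximation: strip scripts, styles, comments, then remaining tags.
    visible = re.sub(r"<(script|style)[^>]*>.*?</\1>", " ", html, flags=re.S | re.I)
    visible = re.sub(r"<!--.*?-->", " ", visible, flags=re.S)
    visible = re.sub(r"<[^>]+>", " ", visible)

    pattern = re.compile(re.escape(keyword), re.IGNORECASE)
    return {
        "whole_source": len(pattern.findall(html)),     # includes comments, classes, scripts
        "visible_text": len(pattern.findall(visible)),  # roughly what a reader sees
    }

# print(keyword_counts("https://example.com/", "widgets"))  # hypothetical page and keyword
```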

Does Google pick up image metadata like EXIF data? A few days ago I downloaded a stock image, and when I hovered the mouse over it on my desktop it showed some tags. Later I investigated and opened that image in Notepad++ and searched for those tags; yes, they were there. Then one thing came to mind: why not search for “http” in the image? Yes, it was there, along with adobe (dot) dom; a (dot) ch domain was there. Then I looked into this domain on Ahrefs and I was shocked: this website was getting backlinks from that stock image website. But my question is, does Google recognize these backlinks?

For EXIF data, I know Google claims to ignore it. But Google has been wrong before.

I personally have not tested it so I don’t know. But it sounds like a great experiment you could run and share to make big waves in the SEO community!

Do you have an analysis of a specific domain so we can see it at work?

If you mean example Cora reports there are 200 of them at the bottom of the page at http://seos.blog/seo-correlation-in-action/

