Googlebot can cache data and make requests differently than a regular browser does. As a result, it does not see the full benefits of crawling over HTTP/2.
As for report processing time, it can take a considerable while. As Mueller explained, taking action may require "some time", not just a day or two.
"I have talked to many SEO specialists from large enterprises about their businesses, and their answers differed. These companies have different opinions on why they disavow links.
In general, the difficulty is that Googlebot is not a browser, so it does not get the same speed gains that a browser sees when HTTP/2 is in use.
I don't think that running frequent audits makes sense because, as you noted, we successfully ignore such links, and if we see that the links are organic in nature, it is highly unlikely that we will apply manual actions to a website.
If your links are being ignored by the Penguin algorithm, there is nothing to worry about."
It is important to remember that disavowing links can lead to a drop in a site's positions in search results, since webmasters often disavow links that actually help the website rather than harm it.
Therefore, link audits are needed only if there have been violations in the site's history.
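For context, links are disavowed by uploading a plain-text file to Google's disavow tool: one domain or URL per line, with `#` marking comments. The domains and URLs below are placeholders, not real examples of harmful sites:

```text
# Disavow every link from an entire domain
domain:spammy-links-example.com

# Disavow a single specific page
https://another-site-example.com/paid-links.html
```

Entries prefixed with `domain:` cover all pages on that host, which is why audits should be careful: one overly broad line can discard links that were helping the site.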