Tag Archives: search-console



Discovering new links and managing them efficiently is an important part of SEO services, and SEOs have long used Google's special `link:` search operator to find these links.

Over time, SEOs and webmasters have cut back on using the link operator. Google has long advised webmasters to rely on the Search Console link report instead, since the operator shows only a "sample" of a site's links.

In the past week, Bill Hartzer pointed out that this special link operator wasn't working on many large websites, including www.google.com. However, it still seemed to work on other large websites such as www.yahoo.com.

This suggests that Google may have removed the link operator from its search engine entirely. However, Google's Gary Illyes denied this on his Twitter handle.

Contrary to that, the image below shows that a link: query for google.com returns no results.


Whereas the same query for yahoo.com still returns results on Google.


The issue was also raised with Google via email, but the reply repeated the same message Gary Illyes had shared on Twitter.

Interestingly, the link operator was removed from Google's search operators help page a few weeks ago.

Google again addressed the issue by stating that, instead of using the link operator, webmasters should use Google Search Console. The matter is still under review at Google's end, and hopefully Google will clear the air in future communications.

For more info on this topic, write to our experts at sales@ebrandz.com and we’ll get back to you right away.



Globally hailed as the Holy Grail of the SEO community, Google's Webmaster Guidelines have received yet another update. The guidelines now include a string of revised content along with newly added tools and features.

The most visible change is that the guidelines are no longer a single long page; they now use click-to-expand content, which even Google itself does not appear to index. The revised content is divided into two sections:

(A) General Guidelines

(B) Quality Guidelines

Of the two sections, Google has revised the General Guidelines content while leaving the Quality Guidelines unchanged. Apart from a substantial amount of new content, there is little else notable in the new General Guidelines; some of the existing content has simply been revisited and rephrased.

Here's an image depicting the guidelines with the click-to-expand feature.


For more info, write to us at sales@ebrandz.com.


Earlier this year, in June, Google started sending warning messages asking business owners to verify their Google My Business accounts. Business owners can also log in on their own and verify their business listings. If they don't, Google may remove their business listing from Google Maps.

Now, Google has issued the same kind of warning to business owners again.

The other day, Google posted a fresh warning, this time on its social networking platform Google+. It says you must now log in at least once a year; if you fail to do so within a year, Google will notify you via email, and if you still don't log in, Google may remove your business listing altogether.

The email notification from Google reads:

“If you’re a business owner and you haven’t logged into your Google My Business account in over a year, you may receive an email from us soon asking you to sign in and confirm your business information. Just follow the steps in the email by simply logging into your Google My Business dashboard, then checking to make sure your information is up to date and submitting any changes if necessary. If your account remains inactive after receiving a notice from us, then it could run the risk of being de-verified, or in rare cases, removed from Google Maps.”

If you have recently received a similar email notification from Google, don't forget to log into your account at least once a year, or you may have to face the consequences.

Alternatively, if you need any help, email sales@ebrandz.com or call 1-888-545-0616 (toll-free) for assistance. And don't forget to ask for a FREE SEO Audit Report.


It's not a new update from Google but a rush of new warnings that Googlebot cannot access your CSS and JS files. These fresh warnings were issued via Google Search Console. Here's an image.


What do these warnings actually mean?

Through its guidelines, Google has repeatedly advised webmasters to unblock their CSS and JS files. If you still have them blocked, the Fetch and Render tool now warns you about it. Google renders the page from a user's perspective, so blocking these files can have a big impact. These warnings are not penalty notifications, as some may think, but resolving the issue is still important.

So, how do you fix these issues?

Log in to your Google Search Console account and go to your site's dashboard. Click Google Index > Blocked Resources and check whether Search Console shows anything under "Pages affected".


Now, click on the domain under the Host column to see all the files blocked from crawling. Most likely you will see theme or plugin .css and .js files, which are essential for displaying the site. If so, you need to edit your site's robots.txt file. This applies to almost all WordPress blogs and a few other popular CMSs.


If you do not see any blocked resources for your site right away, you can use the Fetch as Google feature instead. Here's how:

Click Crawl > Fetch as Google to submit a fetch and render request, which completes in a few seconds, and see how Google sees (renders) your site. Then use the robots.txt Tester to find out which line of your robots.txt file is blocking the bots from accessing your site's CSS and JS files.
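If you'd like to sanity-check your robots.txt outside of Search Console, the same matching logic can be sketched with Python's standard library. The rules and URLs below are hypothetical examples, typical of older WordPress installs, not taken from any real site:

```python
import urllib.robotparser

# Hypothetical robots.txt rules of the kind that trigger these warnings.
robots_txt = """\
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# A theme script caught by the Disallow rules above.
print(parser.can_fetch("Googlebot",
                       "https://example.com/wp-includes/js/jquery/jquery.js"))  # False
# An ordinary page, which remains crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/about/"))  # True
```

If `can_fetch` returns False for a CSS or JS URL, Googlebot is being blocked from that resource and you will keep seeing the warning until the rule is changed.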


How do you fix the CSS and JS warnings by editing the robots.txt file?

If robots.txt sounds new to you, don't worry; it's common parlance in SEO. Most WordPress websites and blogs already block "wp-includes" or "wp-content" via robots.txt. A simple fix is to remove those lines from the robots.txt file, which should resolve most of the CSS- and JS-related warnings.
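As an illustration (the exact rules vary from site to site), an older WordPress robots.txt might contain something like the following; deleting the two Disallow lines lets Googlebot reach the theme and plugin assets:

```
User-agent: *
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
```

After editing, you can re-run Fetch as Google to confirm the page now renders with its styles and scripts.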

If you are looking for answers from Google itself, you can watch a video Matt Cutts posted back in 2012 on why you should not block JS and CSS files on your web pages.

Still have questions? Let our professionals help you out. Request a quote now. Alternatively, you can get in touch with us at 1-888-545-0616 (Toll-Free) or email sales@ebrandz.com.