This blog is about Google Search Console and how we can use it to do better things with our website.
First, let's understand why we need a website in the first place. In today's world everyone wants more than they have, and why not wish for more? More is good; it keeps a good vibe. We all want everything to be at our fingertips, and our clients expect the same: we need to be available to them through a website. A website is also an add-on earning, since Google gives us the opportunity to earn from ads.

Now, anything we have, we need to maintain, and that is where Google Search Console comes in. Surprised? Yes, I said Google Search Console will help you with this: it is a free service to maintain, monitor, and troubleshoot your site. With Google Search Console you can do a lot more, and I will tell you what and how.
Google Search Console offers tools that let us:
- Confirm that Google Search can find and crawl your site.
- Fix indexing problems on your website.
- View Google Search traffic data for your website: how often your site appears in Google Search, which searches bring it up, how often searchers click, and more.
- Receive alerts when Google encounters indexing, spam, or other issues on your site.
- See which sites link to your site.
- Troubleshoot issues with AMP, mobile usability, and other Search features.
Google Search Console is useful for everyone: developers, marketers, administrators, SEO specialists, and even owners. An owner can be a company that has a website, or a newbie like me who runs a website for part-time earning (hey, I can still call myself the owner of my lil company). There are so many things we can do with Google Search Console, so let's start with one of the more important ones:
How to fix indexing coverage issues in Google Search Console?
This works for a website, a blog, or an article. I will take my very own example here. I got a notification from Google saying my pages got blocked by robots.txt, and I got really worried, because robots.txt is the file that controls which parts of your site Google is allowed to crawl, so it affects what can show up in Google Search.

Go to your Google Search Console and check the status of the Index coverage report. Since there is no error in my URLs, my console shows zero errors. You can also manually add rules for a URL in robots.txt; if you have any doubt about that, you can reach out to me, and I will try to make another video on it.

Coming back to our topic: my Google Search Console is showing 35 warnings. Since my blog has just 24 pages, it's a surprise that it shows warnings for 35 pages. So I took a deep dive into the "Indexed, though blocked by robots.txt" status. Looking at the URLs, it is clear that the ones blocked by robots.txt are pages that do not contain any content of their own; they only link to other posts, which is not what we want indexed. So these pages are meant to be blocked, since there is no content on them. This is something you need not worry about; it is all taken care of by Google, as Google knows everything. Happy coding, cya, bbye!
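If you want to check for yourself whether a given URL is blocked by your robots.txt, Python's standard library can do it. Here is a minimal sketch, assuming a hypothetical robots.txt that blocks tag and archive pages; the rules and URLs below are made-up examples, not taken from my actual blog:

```python
# Sketch: check which URLs a robots.txt blocks, using only the
# standard library. The robots.txt content and URLs are hypothetical.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /tag/
Disallow: /archive/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A real content page is allowed to be crawled...
print(parser.can_fetch("Googlebot", "https://example.com/my-first-post"))  # True
# ...while a link-only tag page is blocked by the Disallow rule.
print(parser.can_fetch("Googlebot", "https://example.com/tag/seo"))        # False
```

This is exactly the kind of check behind the "Indexed, though blocked by robots.txt" warning: the blocked tag/archive pages here would fail `can_fetch`, while the actual posts pass.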