Technical SEO Tips And Advice For A Better Blog

This article shows you what to do to succeed with technical SEO in 2017. Some of these tips have worked for quite a while, while others are fairly new and address the most recent changes in search engine algorithms.

Check That Your Most Important Resources Are Crawlable

Simply looking at your robots.txt file may not be enough to check whether your website is crawlable. Robots.txt does a great job of preventing robots from crawling and indexing certain pages, but there may be other reasons why some of your pages get blocked. This is why you should use an SEO crawler to make sure search engines can access all pages of your website.
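
As a quick first sanity check before running a full crawler, you can test individual URLs against robots.txt rules with Python's standard library. This is only a sketch: the rules, domain, and paths below are made-up placeholders, and in practice you would point the parser at your live robots.txt with `set_url()` and `read()`.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules; on a real site, use
# parser.set_url("https://example.com/robots.txt") and parser.read() instead.
rules = """
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot may crawl the blog, but not the admin area
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/login")) # False
```

A crawler does the same check at scale, across every URL it discovers on your site.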

Keep in mind that Google can now render pages the way modern browsers do. In 2017, it’s therefore not enough to ensure crawlers can access your pages; they also need to read your CSS and JavaScript files. If your CSS files can’t be crawled, Google may not see your pages properly, and their styleless version may be illegible. Similarly, if you want Google to index your dynamic content, you need to give it access to your JS files.
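
A common culprit is a blanket rule that blocks resource directories. A robots.txt along these lines (the paths are illustrative, and the `*`/`$` wildcards are extensions supported by Googlebot rather than part of the original standard) keeps CSS and JavaScript fetchable even while other areas stay blocked:

```
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
Disallow: /private/
```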

If you need to audit a website that relies heavily on JavaScript or is built with AJAX, look for a crawler that can render JavaScript, such as WebSite Auditor or Screaming Frog.

Get Rid Of Your “Thin Content” Or Prevent It From Getting Indexed

If your website is old, you may already have a lot of low-quality content posted and indexed. A few years ago, the common trend was to publish 500-word articles, so very few people bothered writing longer posts. Today, that length is a huge footprint for ‘thin content’, so I’d do my best to get rid of these pages as soon as possible.

Check out the SERPs, and take a look at the blogs that rank well today. They all have very long articles, with lots of useful information.

This means you need to find all your thin content and deal with it. If you don’t want to delete these posts and pages outright, you can add a “noindex” tag to prevent search engines from showing them in the SERPs.

I would personally delete these pages rather than “noindex” them, because de-indexing can take a very long time, and I just don’t want them on my sites.

If these short pages contain high-quality content, you shouldn’t delete them. Instead, consider adding more information to make them more substantial. Once you’ve expanded them, you can submit the URL through Google Search Console to have it indexed again.
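
If you go the “noindex” route, the tag is just a one-line meta element in the page’s head:

```html
<!-- Place in the <head> of each thin page you want kept out of the index -->
<meta name="robots" content="noindex">
```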

Use Internal Linking To Your Advantage

Do you struggle to get links from high-authority websites? Then leverage your internal linking before anything else. If you’re starting a blog, do this from the beginning; it will save time in the long term.
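
In practice, internal linking simply means pointing descriptive anchor text at your own related posts. A minimal example (the URL and wording are made up):

```html
<!-- Inside a post, link a related article with a descriptive, keyword-bearing anchor -->
<p>Before auditing crawlability, make sure you have
  <a href="/blog/xml-sitemap-guide">set up your XML sitemap</a> correctly.</p>
```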

Take A Closer Look At Your Sitemap

You know that sitemaps are extremely important for SEO. They inform search engines about the structure of your website, speeding up the discovery of your pages. Here’s what you should be looking for:

– Freshness: Make sure your XML sitemap is updated each time you add new content.

– Cleanness: Remove all garbage from your sitemap: 4XX pages, non-canonical and redirected URLs, and pages marked “noindex”. If your sitemap is chock-full of such pages, search engines may decide to ignore it completely. You can check your sitemap for errors in Google Search Console, under Crawl > Sitemaps.

– Size: The sitemap protocol caps a single file at 50,000 URLs, so going bigger is pointless (large sites split URLs across multiple sitemaps with an index file). In practice, your sitemap should be much smaller than this. Many SEO experts say that reducing the number of URLs in your sitemap may get it crawled more often and more effectively.
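
To make the checklist concrete, here is a minimal, well-formed sitemap parsed with Python’s standard library. The URLs and dates are placeholders; a real cleanness check would also fetch each `<loc>` and verify it returns 200 and isn’t redirected or marked “noindex”.

```python
import xml.etree.ElementTree as ET

# A tiny example sitemap; a single real sitemap file may hold up to 50,000 URLs
SITEMAP = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2017-03-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/technical-seo-tips</loc>
    <lastmod>2017-03-05</lastmod>
  </url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(SITEMAP)

# Collect every <loc> so each URL can be checked for freshness and status
urls = [u.findtext("sm:loc", namespaces=NS) for u in root.findall("sm:url", NS)]
print(len(urls), "URLs found:", urls)
```

Keeping `lastmod` accurate is what gives search engines the freshness signal described above.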

Always Include Relevant Keywords In The Alt Tag Of Your Images

Google can’t fully interpret the content of an image, so it relies on metadata, including alt text, to understand and rank images. This is why your alt text should always include the most relevant keywords while accurately describing the image.
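
For example (the filename and wording here are illustrative):

```html
<!-- Descriptive, keyword-bearing alt text beats generic filler like alt="image1" -->
<img src="/images/xml-sitemap-structure.png"
     alt="Diagram of an XML sitemap structure for technical SEO">
```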

Follow these tips and enjoy a better, more rounded blog.

About Cormac Reynolds

Tech journalist, copywriter and lover of all things gadget, Cormac Reynolds has covered the whole technology spectrum at one time or another and remains a geek at heart. When he’s not working, he’s swotting up on a book or thinking about his next project. Contact him @Brightoncormac

12 Responses to Technical SEO Tips And Advice For A Better Blog

  1. Faelen says:

    Thanks for the tip about thin content. I’m surprised that 500 words is now thin, though of course natural and organic websites will have longer articles.

    The rules continue changing for SEO. One thing that is constant is providing value to readers.

  2. Fred Pinto says:

    Things have changed over time. Content needs to be well optimized to reach the top of the search engines, along with a little decent off-page SEO.

  3. SEO is a great way of maximizing the rank of your website.

  4. Very well explained article, Reynolds, but my question is: are Google algorithm rules required for better SEO?

  5. Thanks for sharing, Cormac. Very practical and useful advice for SEO. A quick but valuable read; glad I came across it.

  6. JindaTheme says:

    Hi, Cormac

    Thanks for sharing a good article, but are there any techniques for WordPress users?

  7. Thanks for sharing such a beautiful article with us. I hope you will share some more information about technical SEO tips. Please keep sharing.

  8. Really very helpful and informative article. Thanks for sharing these awesome tips on SEO.

  9. apwebacademy says:

    How do we reduce the number of URLs? A brief explanation would help. Thank you so much.