A Bit About Technical SEO

Of the many SEO tactics that can be deployed when publishing a website, some must be understood before all others. For a website's pages to have any chance of ranking in the search engines, the search engines must be able to crawl and index the site efficiently. The technical optimizations are in some ways the easiest part of publishing a site: most of them require neither above-average writing ability nor the managing of merchandise. Learning these basic elements will not guarantee a site a page-one Google ranking, but Google will assuredly know the content on your site.

Publish an HTML Sitemap

HTML Sitemap pages are, for the most part, regular pages on a site containing text links to every page on the website. Around the turn of the century, these pages were used by visitors looking for content; today they are mostly used by the crawling bots of search engines to find and save all the content being published. Most basic content management systems (CMS), such as WordPress, can create and manage these pages automatically.
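
A minimal sketch of what such a page might look like; the page names and URLs here are hypothetical:

    <!-- sitemap.html: a plain page of links for visitors and crawling bots -->
    <h1>Site Map</h1>
    <ul>
      <li><a href="/about/">About Us</a></li>
      <li><a href="/services/">Services</a></li>
      <li><a href="/blog/">Blog</a></li>
      <li><a href="/contact/">Contact</a></li>
    </ul>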

Publish an XML Sitemap

Different from HTML Sitemaps, the XML versions have never been used by site visitors to locate content on a website. Because these files are for machines, they are allowed to be unattractive for people to read. There are basic formatting rules, defined by the Sitemap protocol at sitemaps.org, that must be followed when publishing these files. Special XML Sitemap tip: once you have published the Sitemap file, open a Firefox browser session and go to the address of the file. If there is a formatting problem with the file, an error message will be displayed.
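
A minimal valid file under that protocol looks like the sketch below; the URL and date are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-01-15</lastmod>
      </url>
    </urlset>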

Publish a Robots.txt File

A robots.txt file is a text file that informs any bots arriving at a site which files to crawl, or more importantly, which files on a website's server should not be crawled. In practice, these files are mostly used to stop search spiders from crawling specific directories and designated files. The big warning here: if you are not 100% sure, do not exclude any files from being crawled, and just leave the directives empty.
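
A typical sketch of the file, which lives at the root of the domain; the directory names here are hypothetical:

    # Rules below apply to all bots
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

    # The location of the XML Sitemap may also be declared here
    Sitemap: https://www.example.com/sitemap.xml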

Increase the Speed

The best way to speed up a site is to reduce the amount of code being used. The less code there is, the faster search engine bots can crawl a site, and the faster a site is crawled, the more pages the bots can crawl per visit. That is only half the benefit of a speedier site: Google has acknowledged for some time that page load speed is included in the ranking algorithm.

Special site speed tip: use Google's free PageSpeed tool, which tests the speed of a site and provides advice to improve its performance.
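
Much of that advice concerns how code is delivered, not just how much of it there is. One common fix is to keep scripts from blocking the page render; a small sketch, assuming a script file named main.js:

    <!-- "defer" downloads the script without blocking
         the rest of the page from rendering -->
    <script defer src="/js/main.js"></script>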

No Text in Images or Flash

While Google has grown increasingly adept at crawling many types of code, the bots still have no ability to crawl and understand words included within images or within Flash media. That can be a tough one. Many marketing or design people love to use images that contain text or to publish Flash messaging. If you must publish this horrible content, then insist that the text also be included some other place on the same page.
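
At a minimum, give every such image an alt attribute so the bots get a text equivalent of the words baked into the graphic; the filename and wording here are hypothetical:

    <img src="/images/summer-sale-banner.jpg"
         alt="Summer sale: 20% off all merchandise through August">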

Advanced Tech SEO

Each of the items in this category is for more experienced SEO strategists to deploy. A novice web publisher can learn to do these things, but I would recommend saving a good bit of time and effort and employing the services of a more experienced SEO. At a minimum, you should know to ask for these technical SEO practices. Also know that there are many people out there professing SEO competence who get these tactics wrong and hurt websites. Make sure to get references before asking for this work.

  • Publish a responsive design so that each page displays properly on desktop as well as on smartphones (see the combined sketch after this list)

  • Use Schema markup when possible to better inform the search engines what type of content is on each page; an example appears in the sketch after this list. Special markup tip: this code can also increase the click-through rate a site gets when published in the search results.

  • Use canonical tags and pagination markup to better teach the search engines which pages should be included in their indexes, as shown below. This is of particular use for pages with dynamic URLs.
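
A combined sketch of all three tactics as they might appear in a page's head section; the URLs and article details are hypothetical:

    <head>
      <!-- Responsive design: let the page scale to the device width -->
      <meta name="viewport" content="width=device-width, initial-scale=1">

      <!-- Canonical tag: tell the engines which URL to index when
           dynamic parameters create duplicate versions of this page -->
      <link rel="canonical" href="https://www.example.com/widgets/">

      <!-- Pagination markup: point the engines at the neighboring
           pages in a paginated series -->
      <link rel="prev" href="https://www.example.com/widgets/?page=1">
      <link rel="next" href="https://www.example.com/widgets/?page=3">

      <!-- Schema markup (JSON-LD): declare what type of content this is -->
      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": "A Bit About Technical SEO",
        "author": { "@type": "Person", "name": "Example Author" }
      }
      </script>
    </head>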

While many SEO professionals might consider this information very basic, it is very important. There are still far too many websites getting these things wrong, costing their webmasters thousands of potential readers or customers. If you are going to the time and cost of publishing a website, make sure to get the most from all that effort.
