Part 4: SEO (Search Engine Optimization)
I’m sure you’ve heard the term SEO or Search Engine Optimization. You may think that this is some magical formula applied to your website to bring it up on the first page of every search engine. Sorry, no magic – just art and science.
The art is first knowing what people are looking for that you can provide. If you manufacture large industrial furnaces, for example, optimizing your site for terms like “melting point of metal” is not your best option. Terms like “sintering furnace,” “industrial furnace,” etc. are better choices. Keep in mind that the more terms you try to optimize for, the more you dilute the keyword density of each.
Not all traffic is good traffic. You want to optimize for the traffic that is most likely going to convert to a new customer – not a student doing a school paper on the effects of heat on metal.
Start by creating a list of keyword phrases that you would like your website to rank for in search queries. Order them from most important to least. Give the list to your web designer/marketer, who will research it and find the most popular keyword phrases actually being used to find what you’re offering.
Once you know the terms that you need to target, the designer will weave these terms into your website content – body text, title tags, headlines and subheads, meta tags, etc. The content must read smoothly and not appear to have keywords “stuffed” into it.
Your designer will also have to plan how to handle content that is being removed from your website. He or she must set up 301 redirects for any pages that have been removed. If your site URLs are going to change for any reason, then a 301 redirect needs to be set up for every URL. An example would be a site at mydomain/home.html, where all the pages end in .html, that then moves to WordPress. The new URLs would be mydomain/home/, without any .html in the URL.
A 301 redirect tells Google and other search engines that the page has moved permanently, so they can replace the old URL with the new one in their indexes. Allow about three weeks for Google to update all of its datacenters; other search engines can take longer.
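As a sketch of what your designer might set up, here is how the .html-to-WordPress move above could be handled on an Apache server in an .htaccess file. This assumes Apache with mod_rewrite enabled; the exact rules will depend on your hosting setup, so treat it as an illustration rather than a copy-and-paste fix:

```apache
# Redirect every old .html URL to its new trailing-slash equivalent.
RewriteEngine On

# e.g. /home.html -> /home/  (R=301 = moved permanently, L = last rule)
RewriteRule ^(.+)\.html$ /$1/ [R=301,L]
```

A single pattern like this covers every .html page at once; pages that moved to entirely different URLs would each need their own redirect rule.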
A robots.txt file can also help Google and other search engines understand which URLs should not be crawled. URLs you don’t want Google to crawl are listed in the robots.txt file. These could be files in a protected directory that you don’t want to give public access to, pages that are going to redirect, etc. Keep in mind that robots.txt is a request, not a lock: truly sensitive files need real access controls.
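A minimal robots.txt might look like this. The directory names are hypothetical placeholders; the file itself goes at the root of your site (e.g., mydomain/robots.txt):

```text
# robots.txt - applies to all crawlers
User-agent: *

# Don't crawl the protected directory
Disallow: /private/

# Don't crawl old pages that now redirect
Disallow: /old-pages/
```

Everything not listed under a Disallow line remains open to crawlers by default.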
In part 5 we are going to discuss the new website launch and cleanup.