Monday 21 April 2014

Create a Search Engine Spider Friendly Site


The whole purpose of having a business website is to attract visitors who will ultimately become clients or customers. For most websites, the number one source of referral traffic is Google, because it's the first place most people go when they're looking for just about anything. If you want your website to be successful, it needs to rank well in Google. The Google search algorithm is complex, but we do know that its data comes from search engine spiders that continually crawl the web from link to link to determine where sites should rank.

If you want your site to rank well, it’s necessary to create a website that is search engine spider friendly. Here’s how to do so:

Create backlinks and content:
Search engine spiders travel from link to link. The number of inbound links pointing to your site and the amount of content you've produced across the web correlate with how often the spiders visit. If your site is stagnant and has few inbound links, the spiders may pass right over it. That's why SEO link building is so important. A great way to make sure your site always looks "fresh" is to add a blog that is updated regularly.

Optimize titles and headings:
The title is the most important meta element for SEO purposes. Write a unique, keyword-rich title for every page of content. If every title is simply the name of the company or brand, it isn't telling the spider what that page is about. Put the unique keywords first and the brand name second, if it's needed at all. Within the page content, always use heading (H) tags. The main heading of the page is typically an H1, while subheadings might be H3 or H4. Heading tags stand out from the rest of the content and tell the spider that the text they contain is important.
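
For instance, a quick way to spot brand-name-only or duplicate titles is to crawl a handful of pages and print each one's title and main heading. The sketch below is a minimal illustration in Python; the URLs are placeholders, and it assumes the requests and beautifulsoup4 packages are installed.

```python
import requests
from bs4 import BeautifulSoup

# Placeholder URLs - swap in pages from your own site.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services/seo",
    "https://www.example.com/blog/link-building-basics",
]

seen_titles = set()
for url in PAGES:
    soup = BeautifulSoup(requests.get(url).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    h1 = soup.find("h1")

    print(url)
    print("  title:", title or "MISSING")
    print("  h1:   ", h1.get_text(strip=True) if h1 else "MISSING")

    # A repeated title usually means the page is only labeled with the brand name.
    if title and title in seen_titles:
        print("  warning: duplicate title")
    seen_titles.add(title)
```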

Submit an up-to-date sitemap:
Once the site is up and running, it's best practice to submit a sitemap to Google through Google Webmaster Tools. This gives the search engine spiders the ability to find all of the pages of your site. For a larger website, it may make sense to use a sitemap index, essentially a sitemap of sitemaps. If you are constantly adding and removing pages (as an e-commerce site does), it may make sense to resubmit the sitemaps weekly so that new pages are crawled and indexed more quickly.
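
A sitemap is simply an XML file in the sitemaps.org format that lists the URLs you want crawled. As a minimal, hypothetical illustration, the Python sketch below writes one out; the page list and output path are placeholders.

```python
from datetime import date
import xml.etree.ElementTree as ET

# Placeholder URLs - in practice these would come from your CMS or database.
PAGES = [
    "https://www.example.com/",
    "https://www.example.com/services/seo",
    "https://www.example.com/contact",
]

# Root element uses the standard sitemaps.org 0.9 namespace.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in PAGES:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page
    ET.SubElement(url, "lastmod").text = date.today().isoformat()

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```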

Internal linking:
For SEO and usability purposes, a website must have a strong internal linking structure. This helps the spider understand the layout of your site and what you have to offer on different pages.

Redirects:
When a spider lands on an error page, it comes to an immediate halt because there is nowhere left to go, and you want it to keep crawling the site. If a page is no longer available or relevant, don't just let requests for it hit a 404 error page. Instead, use a 301 redirect to send the spider (and your visitors) to the most similar live page.
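
How you set up the redirect depends on your server or framework. As one hypothetical illustration, here is a minimal Python/Flask sketch that maps a few retired URLs to their closest live pages with a 301; the routes and the mapping are made up for the example.

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping of retired URLs to their closest live equivalents.
REDIRECTS = {
    "/old-services": "/services",
    "/2013-promo": "/current-offers",
}

@app.route("/<path:old_path>")
def handle_unknown(old_path):
    target = REDIRECTS.get("/" + old_path)
    if target:
        # 301 tells the spider the move is permanent, so it follows the link
        # and keeps crawling instead of stopping at a dead end.
        return redirect(target, code=301)
    # Only fall back to a 404 when no similar page exists.
    return "Page not found", 404

if __name__ == "__main__":
    app.run()
```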
