SEO 101 - Start with the Basics
The Newest SEO Strategies - Be Aware of the Changes in Online Marketing.
SEO is an evolving art, a science, or, depending on how you look at it, a bit of both. Over time the search engines have refined the way in which they grade websites and determine which ones should be given the highest standing in the SERPs (search engine results pages).
What may have worked 3 years ago to push a website to the top of the search engines will not work the same today, and most likely will not work the same a year from now.
There is no way of knowing precisely how Google, Yahoo, or Bing calculates its SERPs, but a solid strategy, proven through trial and error, does exist.
For the purposes of this article I will only discuss those techniques that would be considered ‘white hat’, or you might say ‘officially approved’ by the Internet community.
There are many sources out there perpetuating the idea that you cannot get indexed quickly or rank near the top using only ‘white hat’ SEO. This is simply not true. In many cases site owners actually damage their ranking status and lower their SERP position by trying some ‘quick’ ‘black hat’ trick to push their website into the top position overnight.
In some cases there could be a temporary increase in page rank and SERP position for some of the ‘black hat’ strategies that exist out there but in the long run those ‘quick ranking’ tools are only an illusion. Over time the search engines weed out the ‘trash’ and give high rank and good SERP placement to those websites that genuinely provide quality content, outstanding service, or fill a vacant ‘niche’. Usually, it is the websites that manage to accomplish all three of these requirements that sit in the number one position.
Ok everyone the lecture is over. Let’s start talking about the things that we can do to get our website ranked high and stay at the top.
First and foremost is the basic design and structure of the website itself. A carefully planned, user-friendly, logical, and straightforward landing page or home page is critical. Remember, this is the page that first-time visitors usually see first, and it is the one the search engines’ ‘bots’ see first as well.
‘You never get a second chance to make a first impression.’
Actually, I tend to believe that the search engines are far more forgiving than a typical Internet customer. The search engine’s bots will keep coming back to crawl your site no matter how badly it sucks. Your website could be full of broken links, scripting errors, spelling errors and grammatical mistakes, and the bot doesn’t care at all. Well, actually it does care; it reports all of these things back to search engine HQ.
Because it is only a bot, and bots do not have feelings, the bot does not hold a grudge for long and in a few days or sometimes weeks, the bot will come crawling back to you again (and yes, the pun is intended).
In all seriousness, however, if a new ‘human’ guest lands on your site for the first time and runs into those same kinds of issues: broken links, scripting errors, misspelled words or poorly worded content pages, they are more than likely not going to take the time to come back and see if the website has been updated or corrected. That particular ‘potential customer’ has just become a ‘statistic’ (definitely in the red and not in the black).
You will hear a lot of talk about ‘meta tags’ and ‘meta descriptions’. These are words or phrases included in the website’s code that are not displayed on the page itself, only read by the search engines. The meta description, however, is something many people see quite frequently but probably didn’t know what they were looking at when they saw it.
Google ‘Search Engine Optimization’ and look closely at the top results. What is placed in a web page’s meta description is normally what the search engine will display in the area just below the website’s title in the SERP.
Using plain, descriptive English in the meta description on each page of the website will result in a good-looking SERP listing.
Meta keywords also deserve a mention. Meta keywords are the keyword phrases that best target the specific page they are used on. These keywords should be targeted and related to the content contained within the page body. (Be aware, though, that most major search engines now give the meta keywords tag little or no weight.)
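As a sketch, the head section of a page for the roofing contractor example used later in this article might look like the following (the business name, title, and wording here are made up purely for illustration):

```html
<head>
  <title>Roofing Contractor in San Antonio | Example Roofing Co.</title>
  <!-- The meta description is what the search engine normally displays
       just below the title in the SERP listing -->
  <meta name="description"
        content="Example Roofing Co. provides residential and commercial roofing contracting services in San Antonio, Texas.">
  <!-- Meta keywords: targeted phrases related to this page's body content -->
  <meta name="keywords"
        content="roofing contractor, roofing contracting, San Antonio roofing">
</head>
```

Note that none of this markup is visible on the rendered page itself; it lives in the page’s code for the search engines to read.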
Keyword density refers to the ratio of how many times a specific keyword phrase appears within a body of content to the total number of words in that content; for example, a phrase used 5 times in a 500-word article has a density of 1%. In the early days of SEO this number was very critical to ranking high in the SERPs. Today it is much less important, and while the appearance of a targeted keyword within article content still matters, keyword phrases should be used in a natural-sounding context and not ‘overloaded’ or ‘stuffed’ into the content.
Other practices that should be avoided completely are the use of hidden content and the use of content that is unrelated to the overall website’s subject matter. Embedding very frequently searched keywords that have no relationship to the actual website was once a common and effective tool to increase SERP rank. It does not work today, so do not be tempted to try it, no matter how persuasively an Internet marketer may pitch it to an unwary website owner.
Use the H1 header tag to focus each web page on a particular keyword phrase. Embedding the keyword phrase that best summarizes the content of a particular web page in the H1 header tag helps the search engine organize the content of your website and rank the importance of the content of the pages. And, of course, if you have chosen a topically relevant and appropriate Keyword Phrase, it will help your Website guest know they have found the right page.
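For example, a page targeting a roof-repair phrase (a hypothetical page, used here only for illustration) might open like this:

```html
<!-- One H1 per page, built around that page's target keyword phrase -->
<h1>Residential Roof Repair in San Antonio</h1>
```

The H1 tells both the search engine and the human visitor, at a glance, what the page is about.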
Maintaining a structured, easy-to-navigate, uncluttered and not overly complicated web design results in lower bounce rates. A confusing or disorganized web design will turn off most Internet surfers very quickly. The typical web surfer expects to find a website with information that is easy to find, well organized, and truly informative.
It can be highly useful to actually lay out the design of a proposed website on paper before any computer design work begins. A graphical ‘flowchart’ also helps keep things organized and makes the design much easier to understand for anyone who was not around during the conception of the design but will eventually be involved in the design, coding, or implementation of the final project.
A flowchart is simply a graphical representation, using directional lines, symbols, and pointer arrows, that depicts a website’s menu flow, page order, and logic path.
In designing any website, probably the most important thing to keep in mind during the actual development of the content is to include as much useful, high-quality, informative material about the business, industry, or focal point of the website as possible. The more original content that appears within the body of the website, the better the search engines will like it. Your target market will appreciate it as well.
In addition to highly descriptive content describing the products and services a website provides, a glossary of terms related to the industry the website specializes in is a good addition, as is a FAQ (frequently asked questions) page.
When it comes to SEO, there can never be too much information included in a website, as long as it is all completely original, unique content (or, where appropriate, references are duly noted).
Original content means just that: copying pages of data from Wikipedia, reusing content pages found on other websites, or ‘lifting’ excerpts directly from other sites simply will not work. Plagiarized content taken from other sources without express consent can also be a source of legal concern.
This is one area where Google in particular is extremely good at detecting copied content, and it will give no SEO advancement for such material included in one’s website.
If a business owner is not able to author original content, it would be advisable to hire someone to write it. Original content is just as important as, if not more important than, any other aspect of achieving good SEO results.
Additionally, there are many so-called ‘article submission’ websites that host uploaded articles written about almost any subject. In fact, it is very possible that you are reading this article on an article submission site at this very moment.
The purpose, in SEO terms, of submitting articles to these types of websites is twofold: first, popular article submission sites have thousands of visitors each day, which is a huge potential audience for the submitted articles; and second, as an author one is allowed to create a signature that can include a link directing traffic back to the website of the author’s choice.
Posting articles with content that relates to the target website and then including links that have the keyword phrase tagged as the anchor text either embedded within the article or as part of the author's signature, is an excellent way of building good high quality back-links.
Blogs are another great place to submit articles where there is a good chance for a posted or submitted article to be viewed by a large audience. Blogs also allow the author of the article to receive feedback through comments left by readers of the blog.
This brings us to the topic of links, and more specifically ‘back-links’. The term back-link refers to a hypertext link, located on an external website, that references or points back to one’s own website. A ‘back-link’ is made up of two parts:
- The ‘URL’ of the destination site (or the site the link is pointing to).
- The ‘anchor text’ which is made up of the ‘keyword phrase’.
An example of a ‘back-link’ written in HTML code would look like the following:
<a href='http://www.yourwebsiteurl.com'>Your Keyword Phrase</a>
When displayed, ‘Your Keyword Phrase’ would appear as an active link pointing to the ‘url’ that follows the ‘href’ in the link statement above.
In the early days of the Internet, back-linking was the holy grail of SEO. One merely needed to acquire many, many links pointing back to one’s website to achieve a very high position in a search engine’s results. At that time it was mainly the quantity of back-links that determined a website’s PR, not the quality of the incoming links.
Those days, I’m afraid, are long gone. The search engines are much ‘smarter’ today, and the number of back-links is far less important than their ‘quality’ in determining whether a back-link provides any SEO benefit at all.
To help prevent link farming and other types of link manipulation, the ‘nofollow’ attribute was created. When the ‘nofollow’ attribute is applied to a link, it directs the search engine to award no PR or ‘PageRank’ benefit to the destination site, no matter how high the ‘PageRank’ of the referring site may be.
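Continuing the back-link example from earlier, the attribute is applied inside the same anchor tag:

```html
<!-- A normal link: can pass PageRank benefit to the destination site -->
<a href='http://www.yourwebsiteurl.com'>Your Keyword Phrase</a>

<!-- The same link with nofollow: the search engine is directed to
     award no PageRank benefit to the destination -->
<a href='http://www.yourwebsiteurl.com' rel='nofollow'>Your Keyword Phrase</a>
```

Both links look identical to the human visitor; only the search engines treat them differently.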
The use of this attribute has been only somewhat effective in reducing the spamming and ‘spamdexing’ used to try to boost SERP position. Still, the practice of simply blasting back-links out across the Internet is on the decline as its effectiveness diminishes.
There is however, some benefit to having a good deal of back-links inbound to the target site. Today it is more a matter of how to acquire numerous good high-quality back-links than simply an exercise in who can accumulate the most.
At this point it becomes necessary to discuss PR, or PageRank. PageRank is a method of grading a website’s ‘authority’ or importance on the Internet. It was developed primarily by researcher Larry Page (hence the name ‘Page’Rank) at Stanford University as part of a research project.
Page and fellow researcher Sergey Brin later founded Google and took the technology with them, and it remains a basis of the Google search engine and search tools to this day.
A website’s PageRank is a numerical weight ranging from 0 to 10, assigned based upon, among other things, the quantity and the PageRank of the external websites that link to the target site. A website with many inbound links from websites that themselves have high PageRank will in turn be given a high PageRank as well (at least, that was the way it worked before the implementation of the nofollow attribute).
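For the curious, the simplified formula from Page and Brin’s original Stanford paper can be sketched as follows; Google’s live algorithm uses vastly more signals than this:

```latex
PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \dots + \frac{PR(T_n)}{C(T_n)} \right)
```

Here T_1 through T_n are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a ‘damping factor’ usually set around 0.85. (This raw score is not the 0-to-10 number itself; the public figure is widely believed to be derived from it on a roughly logarithmic scale.)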
According to Google’s website, PageRank is calculated by ‘considering more than 500 million variables and 2 billion terms.’
Needless to say, understanding an algorithm this complex exactly will never be possible, but there are still many things that can be done to boost a site’s PR and help improve its placement in the SERPs.
We have discussed the importance of a high quality website design, the need for very informative original content, article submissions, back-linking and PageRank. This would be a good time to begin discussing the process of determining keywords and keyword phrases.
Selecting the correct keywords and keyword phrases to focus the SEO efforts on can be the determining factor as to whether or not it will be possible to successfully achieve effective SEO. It is very important to take the time to do some research on competition and keyword popularity.
In general the most common terms that describe a business would be the best place to start. For instance if you own a business that provides roofing contracting services keywords that you might consider focusing on might be ‘Roofing Contracting’ or ‘Roofing Contractor’.
Google provides a tool that displays the average number of searches for selected keyword phrases; you can find it by searching Google for ‘keyword tool’. After entering the keyword or keyword phrase and pressing the ‘search’ button, a results page will display how many times the keyword(s) has been searched for, on average, each month.
Also displayed are many other similar phrases that closely match the keywords searched for. Take the time to scroll through this list; many good alternative phrases can be found this way. Choosing the phrase that has the most searches and most closely matches the website’s line of business is a solid, practical approach.
If you liked this article it is available for download free here on our site: Free Downloads
Find out how our San Antonio SEO specialists can get your website to the top of Google for your keyword phrases.
Other articles of interest: SEO - Why Do I Need it for my Business Website.
Other articles of interest: How to Hire an SEO Company - 10 Rules.
Other articles of interest: Organic SEO - The Google 'Sweet Spot'.
This is our fellow SEO expert Rasmus Koelln, who is a strong resource on the same subject.