Website SEO

Site URLs and Server Technology

  • People who build and run websites have a wide array of technology to choose from. Some popular programming platforms include:
  1. Hypertext Preprocessor (PHP)
  2. Active Server Pages (ASP)
  3. JavaServer Pages (JSP)
  • Each of these platforms has its own language, but they all serve pages in HTML format. From a search engine optimization perspective, there is no advantage in choosing one over the other. You can choose among the platforms based on cost of operations: hiring developers, designers, programmers, and webmasters.
  • If you are building a new website or undergoing a redesign and restructuring, you should follow the W3C recommendations and build a website without revealing the server-side scripting language. This means instead of:
  1. example.com/blue-widgets.html
  2. example.com/candy.php
  3. example.com/cars.asp
  • Your pages’ URLs should look like:
  1. example.com/blue-widgets/
  2. example.com/candy/
  3. example.com/cars/
  • Concealing the scripting language this way allows you to move from one technology to another without altering your site’s URLs or its underlying structure (see the sketch below).
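  • To make this concrete, here is a minimal sketch of technology-neutral URLs, assuming a Python/Flask backend (the framework and page names are illustrative choices, not part of this guide):

```python
# Minimal sketch: clean URLs that never expose the underlying technology.
# Flask and the page names here are illustrative assumptions.
from flask import Flask

app = Flask(__name__)

# Serves example.com/blue-widgets/ -- no .php/.asp/.jsp extension leaks out.
@app.route("/blue-widgets/")
def blue_widgets():
    return "<h1>Blue Widgets</h1>"

# If the backend later moves to PHP, ASP, or JSP, this URL can stay the same.
@app.route("/candy/")
def candy():
    return "<h1>Candy</h1>"

if __name__ == "__main__":
    app.run()
```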

Canonicalization

  • It’s possible to serve your website under both http://www.example.com and http://example.com. However, many search engines will see both and consider it duplicate content (the same content under two URLs), and may penalize your site accordingly. To avoid this:
  1. Pick either http://example.com or http://www.example.com and use it consistently.
  2. Configure your web server to 301 redirect all traffic from the style you are not using to the style you are using (see the sketch below).
  • For more information on this issue, see Matt Cutts’ URL canonicalization advice.
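  • A minimal sketch of that 301 redirect, again assuming Flask; the canonical host below is a hypothetical pick:

```python
# Minimal sketch of host canonicalization (assumes Flask; domain hypothetical).
from flask import Flask, redirect, request

app = Flask(__name__)

CANONICAL_HOST = "www.example.com"  # the style you chose to use

@app.before_request
def enforce_canonical_host():
    # Permanently (301) redirect example.com/... to www.example.com/...,
    # preserving the requested path.
    if request.host != CANONICAL_HOST:
        url = request.url.replace(request.host, CANONICAL_HOST, 1)
        return redirect(url, code=301)
```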

Static URLs and Dynamic URL Parameters

  • In many cases programming implementations use parameters instead of static URLs. A URL with a parameter will look like this:
  • example.com/page/?id=widget
  • While a static URL will look like this:
  • example.com/page/widget/
  • In most cases search engines have the ability to index and rank both formats. However, best practices advise the use of static URLs over dynamic ones (see the sketch after this list), for reasons like:
  1. You produce cleaner, easier-to-understand URLs.
  2. You capture the keyword value that a parameter would otherwise hide.
  • As you start to add more than one parameter to a URL, search engines have a harder time indexing it properly.
  • URLs which are not in the index will never rank or drive traffic from search engines.
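  • As a sketch of the static form, here is the same page exposed under a keyword-bearing path instead of a query parameter (the Flask route and names are assumptions for illustration):

```python
# Minimal sketch: a static, keyword-bearing URL instead of a query parameter.
from flask import Flask

app = Flask(__name__)

# Dynamic form: example.com/page/?id=widget  (keyword hidden in a parameter)
# Static form:  example.com/page/widget/     (keyword is part of the URL)
@app.route("/page/<slug>/")
def page(slug):
    return f"<h1>{slug}</h1>"
```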

Keywords in URLs

  • Generally speaking, it’s beneficial to have keywords contained within your URL structure. Having the keyword in your URL helps search engines understand what your page is about and also helps users know what they are likely to find on the page. Consider these two examples and see which you find more useful:
  1. example.com/123467/9876/
  2. example.com/images/tulips/

Delimiters in URLs

  • Delimiters are used in URLs to separate words.
  1. The best practice is to use a hyphen to separate words (see the slug helper below).
  2. Search engines can understand other characters, such as the underscore, but the hyphen is preferred for human usability.
  3. For more information, see Matt Cutts’ discussion on dashes vs underscores.
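  • A minimal slug helper illustrating the hyphen rule (the function is a hypothetical example, not a required tool):

```python
# Minimal sketch: lowercase a title and join its words with hyphens.
import re

def slugify(title: str) -> str:
    # Replace every run of non-alphanumeric characters with a single hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

print(slugify("How to Paint a Room"))  # -> how-to-paint-a-room
```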

JavaScript and Flash

  • Search engines are very limited as to what forms of information they can read and interpret.
  1. Currently, they understand text-based content which is directly on the page.
  2. If your web application relies on JavaScript, Flash, or some other non-text form of displaying information, search engines will have a difficult time interpreting the content of your pages.
  3. Consequently, your search engine rankings will suffer.

Design, Page Structure, and CSS

  • Website layout and design usually have more of an impact on usability and marketing. However, there are some SEO concerns to be aware of.
  1. While proper semantic markup and W3C code compliance never hurts, it’s not a requirement if you want to have a high-ranking page in search engines.
  2. Care should be taken when building pages to eliminate as many errors as possible.
  3. A page with several hundred coding errors is much more likely to trip up a search engine spider than one with fewer errors.
  4. Using proper standards and markup usually means pages are laid out in a more logical fashion.
  5. Using CSS makes it possible to put the main body of a page’s content first.
  • Otherwise the top banner and any side navigation appear first.
  1. Search engines still place some weight on the text that comes first on your pages.
  2. More and more website owners are using a Content Management System (CMS) to build their sites.
  3. Using these programs forces you to separate content from presentation, which usually results in cleaner and more streamlined code.
  4. Additionally, a CMS makes it much easier to build and maintain mid- and large-sized websites.

On-Page SEO Factors and Considerations

  • On-page SEO factors deal with the elements that are on the actual web page. Links from other sites are off-page factors.
  1. Most professional SEOs consider the title element the strongest on-page SEO factor, so it’s important to pay attention to it.
  2. You want a title that is short and eye-catching and that works in your most important keywords.
  3. Make sure your title still reads cleanly; do not use an unintelligible keyword-stuffed title, as the title displays in the search engine listing for your website.
  4. Include your site name in your title for branding purposes.
  5. Whether to place your website name at the front or end of the title can be decided by personal preference.
  6. If you are a large company or well-recognized brand, such as Coca-Cola or Ford, you can place your name at the beginning of the page title. This lets you build on the trust in your brand.
  7. Smaller or less well-known companies should place their names near the end of the title, so that a searcher’s focus goes to the keywords first (see the sketch below).
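  • A toy sketch of that brand-placement rule (the names and separator are hypothetical):

```python
# Minimal sketch: big brands lead with their name, smaller sites lead
# with keywords. Brand names and the "|" separator are illustrative.
def page_title(keywords: str, brand: str, well_known: bool) -> str:
    return f"{brand} | {keywords}" if well_known else f"{keywords} | {brand}"

print(page_title("Classic Soft Drinks", "Coca-Cola", well_known=True))
# -> Coca-Cola | Classic Soft Drinks
print(page_title("Blue Widgets for Sale", "Acme Widgets", well_known=False))
# -> Blue Widgets for Sale | Acme Widgets
```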

Meta Keywords and Descriptions

  • These factors are largely ignored by search engines due to abuse in the past.
  1. In some cases having identical keywords and descriptions across an entire website has been shown to be a slightly negative factor in ranking.
  2. The meta description will appear under the title when your website shows up in a search engine result.
  3. Therefore, create a unique description that is well-written and eye-catching.

Headlines and Page Headings

  • Page headings (also known as H tags) are structural elements used to divide a page into meaningful sections.
  1. They number from H1 through H6, with H1 being the most important and H6 being the least.
  2. Your page should only have one H1 tag.
  3. You can use as many other H2-H6 tags as you want, as long as you don’t abuse them by keyword stuffing.
  4. Many people have their H1 match their title tag.
  5. You can make them different, which allows you to use a wider array of keywords and to create more compelling entries for humans.
  6. By default H tags are large and bold. You can use CSS to make them appear however you’d like on a page.

Bold and Italics

  • Bolding and italicizing text doesn’t impact search engine rankings. Use these styles for visual or other formatting reasons, not to affect your standing in search engines.

Internal Anchor Text and Links

  • Anchor text refers to the words that are clickable in a link. Internal anchor text consists of the words that link to other parts of your own site.
  1. Anchor text is one of the mechanisms search engines use to tell what a page is about.
  2. If you link to a page with the words blue widgets, you are telling search engines that the page on the other end of the link is about blue widgets.
  3. By using consistent or similar anchor text every time you link to a page, you give search engines a better understanding of what that page is about.
  4. Avoid anchor text that doesn’t contain keywords (e.g., anchor text that reads “click here”) whenever possible; a checker sketch follows this list.
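  • Here is a minimal checker that flags generic internal anchor text, using only the Python standard library (the generic-phrase list and sample HTML are assumptions):

```python
# Minimal sketch: flag links whose anchor text carries no keywords.
from html.parser import HTMLParser

GENERIC = {"click here", "here", "read more", "more"}  # illustrative list

class AnchorAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_link = False
        self.href = ""
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.href = dict(attrs).get("href", "")
            self.text = []

    def handle_data(self, data):
        if self.in_link:
            self.text.append(data)

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False
            anchor = " ".join("".join(self.text).split()).lower()
            if anchor in GENERIC:
                print(f"Generic anchor text {anchor!r} -> {self.href}")

AnchorAudit().feed('<a href="/blue-widgets/">click here</a>')
# -> Generic anchor text 'click here' -> /blue-widgets/
```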

Content Considerations

  • Content refers to the pages and articles on your website, excluding your template. Creating content that is entertaining, interesting, educational, informative, funny, or compelling in some other way is the best way for you to encourage people to visit your site frequently, link to you, and ultimately improve your rankings. The more unique and interesting your content is, the more value it has to your site’s visitors. For example, there are millions of websites about iPods. There are very few websites about putting your iPod in a blender, smashing your iPod, or hacking your iPod to do new things. Think of your content as your point of differentiation.
  • Most content falls into three different categories: boilerplate, news, and evergreen.

Boilerplate Content

  • Boilerplate content is general information.
  1. Your about us page, testimonials, contact information, privacy policy, and terms of service constitute boilerplate content.
  2. These pages exist to help website visitors get to know you, learn to trust you, and feel comfortable sharing information or making a purchase from you.

News Content

  • News content is content that has a short-term lifespan.
  1. News pages can remain relevant for a few hours, or even a few months.
  2. Eventually people stop searching for the term and the page will get very little traffic after that.

Evergreen Content

  • Evergreen content is content which has a long lifespan. An example of long term content is How to Paint a Room.
  1. Techniques for painting your interior rooms aren’t going to change much for the foreseeable future.
  2. The number of people who are searching to learn how to paint a room will remain fairly constant from one year to the next.
  • Most websites have a blend of news and evergreen content. This will vary from one industry to the next. Gadgets and technology websites will have more news content, whereas websites about William Shakespeare will have more evergreen content.

Marketing

  • Creating good content is only part of the equation. Once you have the good content, you have to make sure other people know about it, read it, link to it, or tell the people they know about it. New websites will have to engage in more proactive marketing.
  • Marketing a website is no different from marketing a business. You have to advertise, send out press releases, engage in viral or word-of-mouth campaigns, or visit other places and tell them about your website. Though a complete marketing plan demands its own guide, the key goals of any website marketing plan should include:
  1. Getting people to visit your website.
  2. Getting people to link to your site.
  3. Convincing visitors to tell others about your website.
  4. Encouraging people to regularly come back to your pages.

Link Building and Link Development

  • Links are the primary method a search engine uses to discover your website, and a key factor in its rankings. Links help search engines determine how trustworthy and authoritative your website is, and they also help search engines figure out what your website is about.
  1. Links from trusted authoritative websites tell search engines that your website is more reliable and valuable.
  2. Links from websites like CNN, The New York Times, and The Wall Street Journal are more valuable than links from your local neighborhood garage or realtor.
  3. Search engines also look at the anchor text (words that link to your website).
  4. When someone links to you with the words blue widget they are telling the search engines you are about the words blue, widget, and blue widget.
  5. Links also increase in value over time.
  6. The longer a link has been in place, the more effective it is in passing along trust, authority, and ranking power to your website.

Directories

  • Once your website is built you want to try and acquire links from as many trusted sources as possible in your particular industry. Getting links from websites that are related to your industry is usually more helpful than getting links from websites that are not related to your industry, though every link helps.
  1. One of the first places many people start building links is from directories.
  2. Most directories have a fee for inclusion.
  3. Look for directories that are charging fees because they review each site before deciding whether to accept it.
  4. Don’t join a directory that lets in every site that applies; you want one that keeps out low-quality sites.
  5. To see if a directory is worth the review fee, check to see how much traffic they are going to send you.
  6. To evaluate potential traffic, check to see if the directory page is listed for its particular search term.
  7. If the directory is listed, this is usually a good indicator it will send you traffic.
  8. If the directory does not rank well for its term, check to see if it’s listed in the search engine index, and how recently it was crawled.
  • You can check the last crawl date by clicking on the cache link on the search engine result page.
  1. Pages that are in the index and have been crawled frequently are usually more trusted and will pass some of that value to you.

  2. Pages that are not in the index or have not been crawled recently are usually not worth the review fee.

Press Releases

  • Press releases are usually used to get the attention of journalists or industry news websites, magazines and periodicals.
  1. Many press release websites have relationships with search engine news feeds, so using them can be a very effective way to put your website in front of the right people.
  2. Most press release websites do not pass along any link value; they simply act as pointers to your website.
  3. If a journalist, news website or blogger sees your press release and writes about you, you may get a link from them.
  4. Consider press releases in light of how much traffic and secondary links they can bring; ignore the link from the press release service.

Content and Article Syndication

  • Content and article syndication websites allow you to publish your content on other sites. In exchange for the free content these sites are willing to provide you with a backlink.
  1. Most of these article syndication sites are like press release sites in that they do not pass any link value, but instead act only as link pointers.
  2. To decide if this strategy should be a part of your marketing and link-building plan, look at the most popular articles in your category and see how well they rank and how much traffic they are likely to drive.
  3. You can also use article syndication sites to identify third-party websites that would be interested in publishing other articles from you.

Link Exchanges, Reciprocal Links, and Link Directories

  • Exchanging links with other related websites is a good practice, if it makes sense for your users. Creating link directories with hundreds of links to other websites that are of very little or no use to the user is a bad practice and may cause search engines to penalize you.
  1. If the link has value to visitors of your website and you would place the link if search engines didn’t exist, then it makes sense to put up the link.
  2. If creating the link is part of a linking scheme where the primary intent is to influence search engines and their rankings then don’t exchange the link.

Paid Links and Text Link Advertising

  • Paying for links and advertising can be valuable, as long as you follow search engine guidelines.
  1. If a link is purchased for the advertising value and traffic it can deliver, search engines approve of the link.
  2. If the link is purchased primarily for influencing search engine rankings it is in violation of Google guidelines and could result in a penalty.
  3. If you want to buy or sell text link advertising without violating Google guidelines, look for implementations that use a nofollow attribute, JavaScript links, or an intermediate page blocked from search engine spiders.

Viral and Word of Mouth Marketing

  • Creating content that is viral in nature and gets you word-of-mouth marketing can help you acquire links. This process is often called linkbaiting.
  1. Content created for this purpose is often marketed on social media sites like Digg, del.icio.us, and Stumbleupon.
  2. As long as your content becomes popular naturally, without artificial or purchased votes, you will be within search engine guidelines.

Blogs and Social Media

  • Blogs are a relatively new form of website publishing. Their content is arranged, organized, and published in a dated, journal-style format. Blogs typically have a less formal, conversational style of writing, and are designed to help website owners and publishers interact more with their customers, users, or other publishers within their community.
  1. The journal and conversational format of blogs usually makes it much easier to gain links from your community.
  2. You must create content that members of that community value and are willing to link to.
  3. For a blog to be truly successful the authors must participate in the community and publish frequently.
  4. If this behavior doesn’t mesh with your company culture, creating a blog is not going to be effective for you.

Social Media

  • Social media and bookmarking sites like Digg, del.icio.us, and Stumbleupon have community members who function almost like editors. They find and vote on web pages, stories, articles, videos or other content that is interesting or engaging.
  1. Most social media or bookmarking sites are looking for new content on a regular basis.
  2. The frequent publishing demands of blogs also require a constant flow of new material.
  3. To get the most out of social media you must become involved in the community and submit stories from other sources, not just from your website.
  4. Each social media website has its own written and unwritten rules. Learn these before submitting stories.
  5. Every community frowns upon attempts to “game” the voting procedure. Avoid tactics such as voting rings and paid votes that artificially influence the voting mechanism.

Analytics and Tools

  • Once your website is up and running you will want to know how many people are coming to your site, how they are getting there, what pages they are viewing when they arrive and how long they are staying. For this you will need a website analytics package.
  1. There are a wide variety of analytics packages, ranging in cost from free to several hundred thousand dollars each month.
  2. Each analytics package measures data in its own way, so it’s not uncommon for two programs to have slightly different results from the same set of data.
  3. Additionally, each package provides a different level of detail and granularity, so you should have some idea what you are looking for before purchasing a package.
  4. The two main methods of implementation are log files and JavaScript tracking.
  5. The most commonly used analytics package is Google Analytics.
  • See Mahalo’s introductory guide to using Google Analytics for more information.

Linking Strategy

  • A good overall linking strategy is to slowly acquire links from as many trusted sources as possible, with a wide variety of anchor text, pointing to both your home page and sub-pages. If, over a short period of time, you gain too many links with similar anchor text, from a few low-trust websites, to a limited number of pages, you create an unnatural linking profile, and search engines will penalize or filter your website for such behavior.

Common SEO Problems

  • You can build a website with great content and institute an effective marketing plan, yet still be foiled by technical issues. Here are some of the most common problems:

Robots.txt File

  • A robots.txt file communicates which pages or sections of your website you want search engines to crawl.

  1. A common mistake is blocking search engine spiders from a section, or an entire site, that you want indexed.
  2. You can learn how to create a robots.txt file from Google’s guidelines.
  3. Google’s Webmaster Central has a tool to let you verify that your robots.txt file is performing how you’d like (see the sketch below).
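  • One way to sanity-check a robots.txt before deploying it is the Python standard library’s parser; the rules and URLs below are hypothetical:

```python
# Minimal sketch: verify a robots.txt does what you intend.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Make sure the pages you want indexed are actually crawlable.
print(rp.can_fetch("*", "https://www.example.com/blue-widgets/"))  # True
print(rp.can_fetch("*", "https://www.example.com/admin/"))         # False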

Response and Header Codes

  • When your web server serves a page there is a special code that tells the browser or spider the status of the file served.
  1. A 200 response code means the page serves normally.
  2. If not configured correctly, some web servers will serve a 200 code even when a file is missing.
  3. This creates a problem when search engines index a lot of blank pages.
  4. A 404 is the response code when a page or file doesn’t exist.
  5. To improve usability, set up a custom 404 page with a message explaining what happened, a search box, and links to popular pages from your website (see the sketch below).
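  • A minimal sketch of such a custom 404 page, assuming Flask (the page content is a placeholder):

```python
# Minimal sketch: return a real 404 status with a helpful custom page,
# rather than serving missing files with a misleading 200.
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    body = (
        "<h1>Sorry, that page doesn't exist.</h1>"
        '<form action="/search"><input name="q" placeholder="Search"></form>'
        '<p>Popular pages: <a href="/blue-widgets/">Blue Widgets</a></p>'
    )
    # The explicit 404 status is what keeps spiders from indexing blanks.
    return body, 404
```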

Duplicate Content

  • The content from any page should only exist on one URL.
  1. If the same content exists under multiple URLs, search engines will interpret this as duplicate content.
  2. Search engines will then make a best guess as to which URL is the right one for your content.
  3. If this condition is true for a large share of your pages, your website may be judged low quality and be filtered out of the search results.

Duplicate Titles

  • Every page of your website should have a unique title. When a search engine sees duplicate titles, it will try to judge which page is better and eliminate the other from the index (see the sketch below).
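  • A minimal sketch of catching duplicate titles across your pages (the page data is hypothetical; in practice it would come from a crawl of your site):

```python
# Minimal sketch: spot duplicate <title> values across a set of pages.
from collections import Counter

titles = {
    "/": "Acme Widgets",
    "/blue-widgets/": "Blue Widgets | Acme Widgets",
    "/candy/": "Acme Widgets",  # duplicate of the home page title
}

counts = Counter(titles.values())
for url, title in titles.items():
    if counts[title] > 1:
        print(f"Duplicate title {title!r} on {url}")
```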

Duplicate Meta Descriptions

  • If a large number of pages have identical or very similar meta descriptions, these pages may be filtered for low quality and excluded from the index.

Poor or Low Quality Content

  • In an attempt to create a large number of pages very quickly, many people employ automated solutions that end up generating pages with fill-in-the-blank or gibberish content. Search engines are getting better at catching this and filtering these sites from the index.

Blackhat SEO and Spamming

  • Some people engage in tactics or methods that violate search engine guidelines to achieve higher rankings.
  1. They can employ a wide variety of tactics, including (but not limited to):
  • Keyword stuffing
  • Link spamming
  • Paid linking
  • Artificial link schemes
  • Sneaky or deceptive redirects
  2. If you employ a tactic that seems to involve tricks or is done primarily to manipulate search engines and artificially inflate rankings, you are engaging in blackhat SEO or spamming. This has repercussions for any site you work with, and should be avoided.
  3. For more detailed information, review Google’s guidelines.

Conclusion

  • Good SEO takes time, as you need to develop great content and a strong community voice. But this is just what the Internet needs: high-quality pages that provide a valuable service to users.