An important aspect of SEO is making your website easy for both users and search engine robots to understand. Although search engines have become increasingly sophisticated, they still can’t see and understand a web page the way a human can. SEO helps the engines figure out what each page is about and how it may be useful for users. Imagine you post a picture of your family dog online. A human might describe it as “a black, medium-sized dog, looks like a Lab, playing fetch in the park.” The best search engine in the world, on the other hand, would struggle to understand the photo at anywhere near that level of sophistication. How do you make a search engine understand a photograph? Fortunately, SEO allows webmasters to provide clues that the engines can use to understand content. In fact, adding proper structure to your content is essential to SEO. Understanding both the abilities and limitations of search engines allows you to properly build, format, and annotate your web content in a way they can digest. Without these clues, a website can be all but invisible to search engines.
The Limits of Search Engine Technology
The major search engines all operate on the same principles, as explained in Chapter 1. Automated search bots crawl the web, follow links, and index content in massive databases. They accomplish this with dazzling artificial intelligence, but modern search technology is not all-powerful. There are numerous technical limitations that cause significant problems in both inclusion and rankings. We’ve listed the most common below.
Problems Crawling and Indexing
- Online forms: Search engines aren’t good at completing online forms (such as a login), and thus any content contained behind them may remain hidden.
- Duplicate pages: Websites using a CMS (Content Management System) often create duplicate versions of the same page; this is a major problem for search engines looking for completely original content.
- Blocked in the code: Errors in a website’s crawling directives (robots.txt) may lead to blocking search engines entirely.
- Poor link structures: If a website’s link structure isn’t understandable to the search engines, they may not reach all of a website’s content; and even content that does get crawled may be deemed unimportant if it is only minimally exposed through links.
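To illustrate the “blocked in the code” problem above: a single misplaced directive in robots.txt can hide an entire site from the engines. The file below is a hypothetical sketch, not taken from any real site:

```text
# A stray rule like this blocks ALL crawlers from the ENTIRE site:
User-agent: *
Disallow: /

# What was probably intended -- block only one private directory
# and leave the rest of the site crawlable:
User-agent: *
Disallow: /private/
```

Because `Disallow: /` matches every URL on the domain, one extra character in this file is the difference between normal crawling and total invisibility; it is worth re-checking these directives whenever a site launches or is redesigned.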
Problems Matching Queries to Content
- Non-text content: Although the engines are getting better at reading non-HTML content, text embedded in rich media formats is still difficult for them to parse. This includes text in Flash files, images, video, audio, and plug-in content.
- Uncommon terms: Text that is not written in the common terms that people use to search. For example, writing about “food cooling units” when people actually search for “refrigerators.”
- Language and internationalization subtleties: For example, “color” vs. “colour.” When in doubt, check what people are searching for and use exact matches in your content.
- Incongruous location targeting: Targeting content in Polish when the majority of the people who would visit your website are from Japan.
- Mixed contextual signals: For example, the title of your blog post is “Mexico’s Best Coffee” but the post itself is about a vacation resort in Canada which happens to serve great coffee. These mixed messages send confusing signals to search engines.
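Several of the problems above come down to the same fix: give the engines readable text that matches what the page is actually about. Returning to the family-dog photo from earlier, here is a minimal, hypothetical HTML sketch (the filenames and wording are invented for illustration) in which the title, the image’s filename and alt text, and the surrounding copy all send the same signal:

```html
<!-- Hypothetical example: title, filename, and alt text all reinforce
     one topic, giving the engines textual clues about content they
     cannot "see" for themselves. -->
<title>Black Lab Playing Fetch in the Park</title>
...
<img src="black-lab-playing-fetch.jpg"
     alt="A black, medium-sized Lab playing fetch in the park">
```

The same principle applies to the “mixed contextual signals” problem: if that blog post is really about a Canadian vacation resort, the title, headings, and body copy should say so consistently rather than leading with “Mexico’s Best Coffee.”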
Make Sure Your Content Gets Seen
Getting the technical details of search engine-friendly web development correct is important, but once the basics are covered, you must also market your content. The engines by themselves have no formulas to gauge the quality of content on the web. Instead, search technology relies on the metrics of relevance and importance, and it measures those metrics by tracking what people do: what they discover, react to, comment on, and link to. So you can’t just build a perfect website and write great content; you also have to get that content shared and talked about.
Take a look at any search results page and you’ll find the answer to why search marketing has a long, healthy life ahead. There are, on average, ten organic positions on a search results page. The pages that fill those positions are ordered by rank, and the higher your page appears, the better your click-through rate and ability to attract searchers. Results in positions 1, 2, and 3 receive much more traffic than results farther down the page, and considerably more than results on deeper pages. Because so much attention goes to so few listings, there will always be a financial incentive to improve search engine rankings. No matter how search may change in the future, websites and businesses will compete with one another for this attention, and for the user traffic and brand visibility it provides.
Constantly Changing SEO
When search marketing began in the mid-1990s, manual submission, the meta keywords tag, and keyword stuffing were all regular parts of the tactics necessary to rank well. In 2004, link bombing with anchor text, buying hordes of links from automated blog comment spam injectors, and the construction of inter-linking farms of websites could all be leveraged for traffic. In 2011, social media marketing and vertical search inclusion are mainstream methods for conducting search engine optimization. The search engines have refined their algorithms along with this evolution, so many of the tactics that worked in 2004 can hurt your SEO today. The future is uncertain, but in the world of search, change is a constant. For this reason, search marketing will continue to be a priority for those who wish to remain competitive on the web. Some have claimed that SEO is dead, or that SEO amounts to spam. As we see it, there’s no need for a defense other than simple logic: websites compete for attention and placement in the search engines, and those with the knowledge and experience to improve their website’s ranking will receive the benefits of increased traffic and visibility.