SEO – Where is the Silver Bullet?

original article written by Steve Smith Feb 22, 2012

Have you seen the emails? “Guaranteed first page on Google® in one week”, “SEO Expert, free consultation”, “Secret of Search Engine Rankings revealed”, or variations on these themes. There are thousands of companies and individuals around the world taking advantage of the fact that the average person does not really understand, nor need to understand, all the facts about Search Engine Optimization (SEO) on their website. By claiming some sort of “silver bullet” to achieve search engine glory, they play upon people’s desire to be successful. It’s not a lot different from those who pump up stocks or, to be a little more brutal, the “snake oil salesmen” of old. I am not saying they are all like that but, sadly, many of them are.

Before getting into the details of why most of these claims won’t work, let’s take a step back and examine what is really meant by Search Engine Optimization. First off, a common misunderstanding is that paid advertising is the same as SEO. It’s not. Paid advertising will get you listed in search results even if your site is not search optimized at all. The position of your listing and the frequency with which it is shown depend on how much you are willing to pay every time someone clicks on your ad; this is referred to as the pay-per-click (PPC) rate. True Search Engine Optimization deals with things you can do to your website to help it achieve good search result placement (preferably on the first page of results) without having to pay on a per-click basis. Normally this has to do with things like the quality of content on your site, the positioning of specific keywords to help the search engines identify what you are about, adding specific types of tags in the right places, simple straightforward navigation, and the number of pages that link to your page. There are many more things the search engines look for, but those are some of the more common ones.
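To make a couple of those on-page elements concrete, here is a small Python sketch of how a tool might check a page for its title, meta description, and main heading. This is purely illustrative, not a real ranking checker; the sample page content is invented, and which signals actually matter (and how much) is known only to the search providers.

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects a few common on-page signals: the <title> text,
    the meta description, and any <h1> headings."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1s = []
        self._in_title = False
        self._in_h1 = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self._in_h1 = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif self._in_h1 and data.strip():
            self.h1s.append(data.strip())

# A made-up sample page, just to show the elements in place.
page = """
<html><head>
  <title>Hand-Made Oak Furniture | Smith Woodworks</title>
  <meta name="description" content="Custom oak tables and chairs, built to order.">
</head><body>
  <h1>Hand-Made Oak Furniture</h1>
</body></html>
"""

checker = OnPageChecker()
checker.feed(page)
print(checker.title)             # often shown as the result headline
print(checker.meta_description)  # often shown as the result snippet
print(checker.h1s)
```

Notice how the title, description, and heading all reinforce the same keywords; that kind of consistency is exactly what makes it easy for a search engine to identify what the page is about.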

Over the years Google® and other search providers have refined the ways they index and rank websites, and they will continue to do so in the future. That means achieving good search engine rankings is not a one-time event; it requires ongoing “tweaks” to the site to make sure it stays current with the latest ranking logic. Just as the search providers continue to make improvements, there is an army of programmers out there seeking ways to circumvent their logic, in other words ways to cheat the search engines in order to achieve ranking. Over the years the search providers have figured out most of these so-called “black hat” techniques and have put in place ways to penalize the sites that employ them. But the black-hatters are always looking to stay one step ahead, and for a while it might work, and then one day, boom, the rankings are gone and the site that employed those techniques descends into search ranking purgatory. That’s why these instant-gratification strategies seldom work for long. There’s another major problem as well: in order to effectively optimize your site, an SEO expert will have to have full access to the code on your site. There are at least two problems with this. First, they could potentially place things in your code you really don’t want, or, in a worst-case scenario, compromise the security of your site. Second, most web developers don’t appreciate giving up their original code to people they don’t know; it’s like asking a famous songwriter or artist to give away all their secrets. Be very careful about who you give access to your site and be sure you understand just what they intend to do.

OK, so now that I’ve probably raised some alarms about what not to do, let’s talk about some things we can do to optimize your site effectively, and legitimately, for the long term. First, let’s try to get a basic understanding of how exactly a search engine gets information from your site. The key is realizing that at its core a search engine follows links: as pages link to other pages, the search engine “spider” or “web-bot” crawls along the threads created by the links between pages and sites and makes notes of the interesting things it finds along the way. It also looks at how many other places link to a specific page. It deduces that the more incoming links are found for a page, the more popular it must be, and therefore starts to assign a level of importance to that page. The next thing it looks at is what the page itself contains. This is where it can get interesting.
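The crawl-and-count idea above can be sketched in a few lines of Python. The tiny “web” here is completely made up, and a real spider would fetch and parse live pages rather than read a dictionary, but the core loop is the same: follow links outward from a starting page and tally how many inbound links each discovered page has.

```python
from collections import Counter, deque

# A toy "web": page -> list of pages it links to. In reality a spider
# discovers these by fetching each page and parsing its links; this
# structure is invented purely for illustration.
WEB = {
    "home":     ["about", "products"],
    "about":    ["home"],
    "products": ["home", "widget", "gadget"],
    "widget":   ["products", "gadget"],
    "gadget":   ["products"],
    "blog":     ["home", "widget"],   # no page links here, so a spider never finds it
}

def crawl(seed):
    """Follow links breadth-first from a seed page, tallying how many
    inbound links each discovered page has -- a crude popularity signal."""
    inbound = Counter()
    seen = {seed}
    queue = deque([seed])
    while queue:
        page = queue.popleft()
        for target in WEB.get(page, []):
            inbound[target] += 1          # note the incoming link
            if target not in seen:
                seen.add(target)
                queue.append(target)
    return inbound

print(crawl("home").most_common())
```

Two things worth noticing: the “products” page ends up looking most important because the most pages link to it, and the “blog” page is never seen at all because nothing links to it. That is why incoming links matter not just for ranking but for getting indexed in the first place.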

Another important thing to understand is that (web) spiders are primarily interested in textual content. While they can acknowledge that other types of content (images, videos, Flash, etc.) exist, they can’t effectively interpret what it is without some “helper” text to describe what they are looking at. Spiders also have complex logic for determining which text on a page best describes what the page is all about. This goes back to my earlier comment about effective keywords and positioning them in the right places on a page. Over time spiders have gotten much better at interpreting our languages and just what it is we are trying to say. As I said initially, they have also gotten much better at figuring out the little tricks that we humans like to play.
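The image example above is easy to demonstrate. This crude Python sketch (the image files and descriptions are invented) reads a snippet of markup the way a text-only crawler would: the only thing it can say about each image is whatever “helper” text its alt attribute provides.

```python
from html.parser import HTMLParser

class AltTextReader(HTMLParser):
    """A very crude stand-in for a spider: for each <img> tag, all it
    can index is the alt attribute's helper text, if any."""
    def __init__(self):
        super().__init__()
        self.images = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            self.images.append(attrs.get("alt", "(no description available)"))

reader = AltTextReader()
reader.feed('<img src="p1.jpg" alt="Oak dining table, seats six">'
            '<img src="p2.jpg">')
print(reader.images)
```

The first image is indexable by its description; the second is effectively invisible to a text-only crawler. The same principle applies to captions and transcripts for video and other non-text content.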

As the web spiders are out gathering information, they send it all back to home base, which in this case is one or more massive data centers operated by the search providers. All this data is compiled, compared, analyzed, and ranked, and then eventually passed to all the other connected data centers that provide search services. Some people have a vision of Google® as one massive complex housing thousands of computers, and it is, but those computers are spread throughout various parts of the world, helping provide delivery speed and redundancy. At any given time it’s highly likely that many of them are not completely in sync, as different data centers are updated at different times. Ever wondered why one day you search for a topic and get one set of results and the next day you get a different set? Chances are you are being directed to a different data center, based on overall system load.

Admittedly this is a highly simplified view of how spiders and their accompanying data centers work. The actual details of how many of the final determinations are made are closely guarded by the search providers, but hopefully that has given you some idea of how the process works at a high level. The only way to understand how to improve one’s own results is to understand the method and criteria by which they are being assessed. As you may also be starting to see, this is not a process that will necessarily happen quickly, especially on more complex sites with lots of pages. We typically see changes in ranking after periods ranging from several weeks to several months, and while these times have recently been improving, it used to take much longer. You can start to see, however, how claims of “top ranking in 24 hours” and the like are just not realistic, unless of course you are willing to pay for placement (online ads).

Now that you have a better understanding of how SEO works, it’s time to put the next steps in place. First, we can help you come up with a customized SEO strategy based on your site and your goals as a business. Second, we implement those elements as part of the web design (or re-design). Third, we measure and monitor results to see if things are moving in the right direction, and where things need to be tweaked, those tweaks are put in place. As you can see, it’s not a one-time event; it’s an ongoing process of continual monitoring and updating as conditions and the whims of the search providers change. There are always ebbs and flows, and as I’ve mentioned before, sometimes the search engines can be a bit temperamental. It’s a bit like steering a large ship: you can’t make major course changes every time the sea conditions change; you have to anticipate and make small incremental changes to get back on track.