As a web developer and operations manager, I know what a tedious process search engine optimization is. It's important to consider SEO from the development phase of any website, from the titles, descriptions, and copy down to how your page files are named.

Throughout this article you'll see that I refer to search engine bots as if they are human. It's almost as if they are. In reality they are a set of complex algorithms that make up a protocol for an automated process: reading your website, evaluating how genuine it is, and rating its importance. Search engine developers are constantly adding to these algorithms, making the bots more complex and more fool-proof, and adding to their human-ness.

During the development phase, it's important to pay close attention to every aspect of every page file, beginning with its name. For example, if we are creating a page about what makes Cancun a great vacation spot, we would rather call the file "What-Makes-Cancun-Mexico-A-Great-Vacation-Spot.htm" than "canc_vaca_spt.htm" or something to that effect. I say this because search engines, especially Google, index not only meta tags and content but file names, too. That makes the file name searchable along with everything inside the file, and it can earn you brownie points with Googlebot when generic terms like "cancun" and "vacation" appear in the tags, copy, title, and file name. It shows the bots that your page is genuine.

You've already seen me use the term "copy" a few times. "Copy" is the SEO term for the words in your page's content: more copy equals more words. Search engine bots love to see a ton of copy; the more they have to read, the more they like your page. However, it's also important for that copy to be as relevant to the subject of your page as possible.
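The file-naming advice above can be sketched in a few lines of Python. This is just an illustration of the convention, not anything the search engines provide; the `seo_filename` helper and its choices (hyphens, capitalized words) are my own.

```python
import re

def seo_filename(title, extension=".htm"):
    """Turn a page title into a descriptive, hyphen-separated file name."""
    # Keep only letters and numbers, then join the words with hyphens.
    words = re.findall(r"[A-Za-z0-9]+", title)
    return "-".join(word.capitalize() for word in words) + extension

print(seo_filename("What makes Cancun, Mexico a great vacation spot?"))
# What-Makes-Cancun-Mexico-A-Great-Vacation-Spot.htm
```

The point is simply that the descriptive name carries the same keywords as the title, while a cryptic abbreviation like "canc_vaca_spt.htm" carries none.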
Bots are also picky about how many times you use certain keywords and phrases; capping keyword frequency is one means of preventing abuse of the algorithm.
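To make the frequency idea concrete, here is a minimal sketch of how you might measure how often a keyword appears relative to the total copy on a page. The `keyword_density` function is my own illustration; real bots use far more sophisticated signals.

```python
import re

def keyword_density(text, keyword):
    """Fraction of words in the text that match the given keyword."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

copy = "Cancun is a great vacation spot. Book your Cancun vacation today."
print(round(keyword_density(copy, "cancun"), 2))
# 0.18
```

A page where one phrase makes up an outsized share of the words reads as stuffing rather than genuine copy, which is exactly what the bots are watching for.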
It should remain in the intentions of every web developer to please these bots and stick to the rules. Google publishes updated guidelines on how to optimize your site for its bots, and because Google is the leader in the industry, those guidelines also serve as an industry standard, since all the other engines attempt to mimic the Googlebot.
Bots can penalize your page for trying to trick them. For example, it became very common for certain enterprising developers to insert keyword-rich text into their pages that made no grammatical sense but served to inflate the count of those words and phrases. They would put the text in a discreet location on the page, shrink the font size, and set the text color to match the background color to keep it out of plain view. This helped in the short run, but when Google caught on, the bots were programmed to detect the trick, and millions of sites were penalized for it.
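The kind of check that catches this trick can be sketched as a toy heuristic: flag text whose color matches the background, or whose font is too small to read. This is purely illustrative; the `looks_hidden` function is my own, and real crawler checks are far more elaborate.

```python
def looks_hidden(text_color, background_color, font_size_px):
    """Flag text that is invisible or near-invisible to human visitors.

    Toy heuristic: same color as the background, or a font too small
    to read. Real search engine checks are much more sophisticated.
    """
    return text_color.lower() == background_color.lower() or font_size_px <= 2

print(looks_hidden("#FFFFFF", "#ffffff", 12))  # True: white-on-white text
print(looks_hidden("#000000", "#ffffff", 12))  # False: normal visible text
```

Once a bot can see the rendered styles, white-on-white keyword blocks are trivial to spot, which is why the trick stopped working.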
Another aspect of search engine optimization is keeping your content updated and fresh. This is a more recent development, with the advent of RSS feeds, blogs, and content management systems, all of which make updating content much easier. When the Googlebot sees that a page has been updated since its last index, it tends to revisit and reindex the site much more frequently, adding to your overall organic ranking. Even better than replacing your content with new content is simply adding to the old content, increasing the amount of copy. Blogs and many CMSes make this simple, as their archive features keep all of the old content in indexable archives.
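One concrete way to advertise freshness to crawlers is the <lastmod> field of an XML sitemap, which is part of the sitemaps.org protocol that the major engines support. Below is a minimal sketch of building one sitemap entry; the `sitemap_entry` helper and the example URL are my own illustration.

```python
from datetime import date
from xml.sax.saxutils import escape

def sitemap_entry(url, last_modified):
    """Build one <url> entry for an XML sitemap, advertising freshness
    via the <lastmod> tag from the sitemaps.org protocol."""
    return (
        "<url>"
        f"<loc>{escape(url)}</loc>"
        f"<lastmod>{last_modified.isoformat()}</lastmod>"
        "</url>"
    )

print(sitemap_entry("http://example.com/blog/cancun.htm", date(2008, 5, 1)))
# <url><loc>http://example.com/blog/cancun.htm</loc><lastmod>2008-05-01</lastmod></url>
```

Updating <lastmod> whenever you add copy gives the bot an explicit signal that there is something new to index, instead of making it guess.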
That's all for now; I'll have more on this another time.