My first reaction last week to schema.org was pure excitement. “Finally, the semantic Web is going to make a real difference in the world of search,” I gushed.
I’ve been spicing up my markup with Microformats for years. Any chance I have to add more semantics to a webpage, I’ll take.
A couple of months ago—impatient with the wait for the semantic Web to hit search—I started playing around with SPARQL to query RDFa datasets from DBpedia. I’ve known for several years that the semantic Web is the future, and I’m beyond psyched that I can start incorporating more semantics (real semantics, on a macro level) into projects I work on. Woohoo!
I just have to forget about RDFa if I want the VIP treatment from Google, Bing, and Yahoo.
What about my beloved DBpedia, the jewel of semantic data knowledge bases? How is it going to deal with this? According to Christian Bizer, DBpedia might begin publishing Microdata with its next release, mapping DBpedia’s ontology to Schema.org’s with OWL.
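To make the Microdata-versus-RDFa distinction concrete, here’s a hedged sketch of how the same fact could be expressed in both syntaxes (the person and values are illustrative; the attribute names are what matter):

```html
<!-- schema.org Microdata: the syntax the search engines now favour -->
<div itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Tim Berners-Lee</span> is the
  <span itemprop="jobTitle">Director</span> of the W3C.
</div>

<!-- RDFa 1.1: equivalent semantics, different attributes -->
<div vocab="http://schema.org/" typeof="Person">
  <span property="name">Tim Berners-Lee</span> is the
  <span property="jobTitle">Director</span> of the W3C.
</div>
```

Both snippets assert the same structured data; the announcement is about which attribute vocabulary gets preferential treatment, not about what can be expressed.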
What about Wikipedia itself, though? How do they feel about these changes? It certainly affects their vast implementation of structured data.
And why the proprietary format again?
Granted, the W3C moves slowly
Yes, we’re all aware of the W3C’s painfully… slow… process of going from drafts to recommendations and standards, but to be fair, certain browser vendors (do I need to name names?) are even slower to adopt them. It’s difficult to observe the adoption of Web standards in the wild if a huge chunk of the market doesn’t even implement the specifications.
Internet Explorer. Okay, there, I said it. Like I had to. How ironic is it that they’re one of the W3C’s 324 members? Discuss amongst yourselves ;)
Stepping away from my enthusiasm for search 3.0—at least for a minute or two—because something deserves attention:
As great as it is that structured data will really get some recognition in the world of search, wouldn’t it have been a good idea in the spirit of the open Web to get some public opinion on what to include/not include in schema.org?
Why Google wants control of the semantic Web
My observation is that with Google’s attempted (and many failed) advances into the world of social media, now is a good time to gain some control over the semantic Web. The social space opens up a whole new application for semantic technologies. Their also-ran +1 button will probably benefit from a ubiquitous schema.org and help in their competition with Facebook.
The schema.org news is sudden, and though I’ll gladly play along with my new Web 3.0 toys, I’m really hoping that Mr. BYG (Bing, Yahoo & Google—you heard it here first, folks ;) ) will listen to the dev community and take public opinion seriously.
Remember Web 2.0—two-way conversations—before quickly moving on to Web 3.0
Competitive search visibility can make or break a business, and that fact has many sales teams drooling for a piece of the action.
I’ve seen salesmen’s jaws drop when learning of the huge profits attainable by selling a service that is actually legitimate, and won’t land them in legal hot water. Yes, it’s fair to say that SEO is the salesman’s wet dream.
Pitching SEO like a real
Whether the unlucky target of the stereotypical, greasy salesman is the recipient of a cold call proposing Web services, or has been passed to the sales team of one of tens of thousands of fly-by-night, cookie cutter SEO operations, the tone is the same: overconfident, aggressive, and fast talking. Very fast talking, because we all know that coffee is for closers.
Salesmen love simple metrics
That sales guy in the back room doesn’t know what it takes for a Web presence to succeed, so he needs easy to grasp, easy to demonstrate concepts for his sales pitches. He knows enough about SEO to sound convincing. He’s a specialist in tapping into client emotions, and uses the right lingo to cast illusions of SEO supremacy.
But his prospects often know even less about search visibility than he does. Perfect.
Instead of complicating an SEO sales pitch with confusing details, seemingly logical illusions work best to gain the confidence of his mark. After all, it’s much easier to dazzle a naive—and I mean that in the least insulting way— prospect by presenting rudimentary performance indicators than it is to try describing the search visibility benefits of Web usability, structure, semantic markup, content strategy, content marketing and Web analytics (let alone understand them).
So, dear consumer, without further ado, here are the 3 favorite lies that the SEO salesman loves to sell you on:
Lie #1: You need hundreds more backlinks!
Backlinks play a major role in search visibility; it’s true, we all know that. The role they play, however, isn’t a part of a numbers game, but a quality game.
Novice SEOs/pro salesmen love the directory link package because there’s no work involved (it’s outsourced), the markup is huge (sometimes 10x), and they can easily use false logic to demonstrate its value.
All a salesman has to do to prove his supposed point is open a backlink checker and compare your domain with that of a competitor that happens to have thousands more backlinks. It doesn’t matter if those links are mostly site-wide hotlinked images, or from scraper sites, or anything else. Only the numbers are meant to dazzle.
The truth is, hundreds of directory links can’t compete with even 2 or 3 backlinks from authoritative websites, no matter how relevant he says those few hundred directories are.
Certain directories do help with a website’s search results, especially for visibility in business listings, but they number much fewer than even 100, and for true organic listings, their value is negligible.
You are the weakest backlink!
And so, dear client, for whom I want (and can do) the best:
Do you really want me to spend a few hundred dollars outsourcing some directory submissions to India or anywhere else and charge you $1,000 to $2,000 for a worthless “service”? Say it with me: “No, I don’t.”
What do you want? You want me to add value to your site with content strategy and improvements to usability and structure. Compelling, original, useful content teamed with a cunning marketing strategy is the best proactive method of acquiring the types of links that actually help rankings. But that actually takes planning and talent; sometimes a problem for the sales team turned “SEO Company”.
The next time a client tells me that the other SEO company says they’ll get them 500 links, and asks how many I can get them, I’m going to curl up into a ball and gently cry myself to sleep.
Lie #2: The pages of your site should pass W3C validation!
Fool’s gold at the end of the rainbow
Snake oil salesmen love this one.
Running a webpage through an HTML validator is so easy that a monkey could probably do it.
Any syntax errors in the page’s markup result in a big red banner displaying their number, as opposed to a page that passes validation, which boasts a reassuring “Congratulations” over a green banner. So official looking! “We must fix these errors at once, right? Then the search engines will love us, right? Right?” Wrong.
Now don’t get me wrong here. I’m a huge fan of the World Wide Web Consortium and Web standards. I’ve been manually coding websites according to strict Web standard guidelines for several years, and Web pages I code pass validation. I can assure you though, that passing W3C validation isn’t part of my SEO strategy.
Many Web standard practices do help search visibility, but valid HTML isn’t an accurate measure of Web standard compliance. Far from it, actually. The validator is merely a syntax check, alerting the developer to deprecated HTML elements and errors.
Semantic HTML and website accessibility, on the other hand, are examples of Web standard recommendations that add value to on-page SEO, but they’re much more difficult to understand, achieve, and demonstrate, so they have less visual impact on a sales pitch.
Lie #3: Top rankings are all that matter!
First place garbage is still garbage
Let’s imagine that the outfit our eager little salesman works for is somehow able to achieve rankings that are anywhere near competitive or have conversion potential. Are their clients’ sites being optimized in a way that complements content strategy, or do keywords get thrown into content like ink flicked against a canvas?
Competitive search visibility is worth only as much as the number of visitors that take action (and hitting the back button isn’t the type of action I’m talking about).
If visitors are greeted with keyword-crammed, unintuitive, valueless marketese and fluff, they’ll likely bounce off the site and never return. Yeah, it can be really lonely at the top.
Many other lies are told by SEO-wannabe salesmen, but few bring their companies as much profit for so little work as the ones I mentioned above.
Lie #3 (“Top rankings are all that matter!”), with its keyword-littered garbage content and title tags, takes up most of the company’s time (when they can pull themselves away from the cold calls), but they can’t really get around that one. They need to get some first-page results for their poor clients to avoid being sued.
Just about the time the elation of first-page results (for what are usually less-than-ideal keyphrases) wears off, the client starts to realize what an atrocity their site has become.
What we can do as professionals
Business owners, desperate for more business, sometimes lose sight of the big picture regarding their Web presence. They become so enamored with the idea of beating their competition in the natural search results that they lose focus on everything else. This is the perfect time for snake oil salesmen to swoop down and catch their prey.
Remind your clients that the best SEO is practically invisible, and that done properly, it won’t harm their brand; it will help it. Remind them that SEO tactics worth their salt for long-term search visibility also improve their site’s usability, not detract from it. Show them data and metrics, but encourage them to hold on to their common sense. Remind them that if something doesn’t look right, it probably isn’t.
Remind them of the value their site should offer, and how far that can go to helping both their rankings and their bottom line.
Wanna-be SEO types (you know, the ones that would be better off in boiler rooms) love to pitch W3C validation as being crucial to SEO for one simple reason: it’s easy to demonstrate to a client that a competitor’s page has 376 errors, and then compare it to the soothing green “Congratulations” of some other page that passes validation with no errors.
The visual impact of “Congratulations, no errors” from the W3C can go a long way toward leaving a great impression on trusting, and sometimes gullible, clients. Whether valid pages factor into even half a percentage point of search engine algorithms is highly speculative.
Web standards, on the other hand, offer great value to SEO efforts, especially the W3C recommendations that enhance accessibility and semantics. However, Web standard markup goes well beyond passing a simple validation test against a strict or, in some cases, transitional doctype.
So Why Validate at All?
Valid HTML is just one result of adhering to Web Standards. If your page passes validation, it means that in your quest to meet strict standards, you didn’t accidentally use deprecated tags or make any syntax errors. Passing validation does not measure adherence to Web Standard coding practices. I’ll say it again: Passing validation does not measure adherence to Web Standard coding practices.
W3C standard validation is nothing more than a syntax check, not a measure of how web standards compliant a Web page is (In case you didn’t get the subtle hint from the end of the previous paragraph).
So Then How Do You Measure Web Standard Coding Practices?
Web standard markup is mostly about separating content (text, images and other media embedded within the proper HTML elements) from presentational data (which should be restricted to external cascading style sheets, or CSS). Presentational data consists of any code that’s necessary to alter the original appearance of an HTML file – from Arial font-families to z-index – and all attributes in between.
The only way to measure the adherence to Web standards of a Web page is to understand what Web standards really mean, and to look at the page’s source code for yourself.
Separating content from presentational data alone isn’t guaranteed to help search engine optimization efforts, but it is a step toward the SEO benefits of HTML semantics.
For example, a page that uses tables for layout purposes is going against the fundamental rule of Web Standards. Table based page layouts weave presentational data throughout a page’s source code, adding a tremendous amount of code bloat, and have all sorts of other nasty effects, notably much more time-consuming and costly site redesigns.
However (and this is important), a table-based layout will pass strict HTML validation. Do you see where this is going? Tables are allowed in strict, Web Standard markup; it’s up to you to use them appropriately.
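As a quick illustration (a contrived sketch, not code from any real site), this layout table contains no syntax errors and would sail through strict validation, even though it uses a table purely presentationally:

```html
<table>
  <tr>
    <td colspan="2">Masthead goes here</td>
  </tr>
  <tr>
    <td>Sidebar links</td>
    <td>Main content</td>
  </tr>
</table>
<!-- Perfectly valid markup; semantically, it's still a layout table. -->
```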
Now I’m not saying that using tables for layout will necessarily have a huge impact on SEO efforts, but poor semantic choices of other elements just may.
The paragraph tag will obviously not cause a syntax error in your markup; it’s one of the most basic HTML elements. But if you erroneously used a paragraph tag in place of a secondary heading, or vice versa, the validator wouldn’t know the difference. Would it make sense to use a secondary heading in place of every paragraph? Of course not. Visually, paragraph and secondary heading fonts could look similar, but they have very different meanings (semantics), and this is where the benefits of Web Standards for search engine optimization become clearer.
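For instance (a hypothetical snippet; the class name is made up), both of these lines validate, but only one tells a search engine that the text is a subtopic heading:

```html
<!-- Presentational: styled to look like a heading, but it's just a paragraph -->
<p class="big-bold">Our Services</p>

<!-- Semantic: a genuine secondary heading, a structural cue search engines can use -->
<h2>Our Services</h2>
```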
Do I even need to make a case for missing alt attributes? I didn’t think so.
Search engines, similar to other visitors of your site, need as many structural cues as possible to have the best possible chance of cutting their way through your content with ease.
- Tables are semantically inappropriate for page layout; they’re suitable for tabular data. Spreadsheet stuff – not mastheads, sidebars or footers. Aside from adding a lot of unnecessary markup bloat to pages and driving up the ratio of code to content on a page, the detriment to SEO is probably negligible – on a small scale; however, for sites with tens or hundreds of thousands of pages of content, I wouldn’t leave it to chance. Especially at the development stage of a site. Going back to properly recode three hundred thousand pages of markup can reach a point of diminishing returns pretty quickly.
- Secondary headings are semantically inappropriate for holding content more suitable for paragraph tags
- Navigation links are effectively lists of related links, and belong in HTML list items
- Strong and emphasized text add clarity and meaning to your content
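The navigation point above, for example, might look something like this (the URLs and `id` are placeholders):

```html
<!-- Navigation is effectively a list of related links: -->
<ul id="nav">
  <li><a href="/">Home</a></li>
  <li><a href="/services/">Services</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```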
Individually, none of these semantic choices will have a huge impact on a site’s rankings; but overall, a site that is lean on code, has clear, structural markup, and embeds a minimum level of semantics into its documents will be much easier for search engine algorithms to understand and properly index.
Understanding which elements of Web Standard markup add to the search engine friendliness of a site comes with front-end development experience. To some, that pretty validation button might seem like a neat and convenient way to measure the “quality” of a page’s source code, but the only way to measure quality in this case is through understanding.
Incidentally, the main difference between transitional and strict Web standard markup is that certain elements and attributes have been deprecated (phased out) from strict HTML standards. In other words, if you use deprecated HTML tags, those tags will earn you syntax errors when validating against a strict doctype.
Tags such as font and center have been omitted from strict standards because they have no semantic value; they’re purely presentational elements – and presentational data should be reserved for external stylesheets, not markup.
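A quick before-and-after sketch (the class name is made up):

```html
<!-- Deprecated, presentational tags: syntax errors under a strict doctype -->
<center><font face="Arial">Welcome!</font></center>

<!-- Semantic markup, with presentation moved out of the document -->
<h1 class="welcome">Welcome!</h1>
<!-- In an external stylesheet:
     .welcome { font-family: Arial, sans-serif; text-align: center; } -->
```

Same appearance, but the markup now says what the text *is* (a top-level heading), while the stylesheet says how it *looks*.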
Don’t take the easy way out!
It’s easier to show a client 0 validation errors than it is to explain Web Standards, semantics, tableless layouts and the separation of content from presentational data. Okay, you have me there.
Then again, you could just bookmark this article and give the link to your more inquisitive clients when the need arises ;)
Montreal’s Springboard SEO prioritizes usability as well as findability—for maximum online profits.