My first reaction last week to schema.org was pure excitement. “Finally, the semantic Web is going to make a real difference in the world of search,” I gushed.
I’ve been spicing up my markup with Microformats for years. Any chance I have to add more semantics to a webpage, I’ll take.
A couple of months ago, impatient with the wait for the semantic Web to hit search, I started playing around with SPARQL to query RDFa datasets from DBpedia. I’ve known for several years that the semantic Web is the future, and I’m beyond psyched that I can start incorporating more semantics (real semantics, on a macro level) into projects I work on. Woohoo!
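In case you’re curious what that playing around looks like, here’s a sketch of the kind of query I mean. The SPARQLWrapper package and DBpedia’s public endpoint are my own choices for illustration, not anything official; adapt to your setup.

```python
# A minimal sketch of querying DBpedia with SPARQL from Python.
# Assumptions: the third-party SPARQLWrapper package
# (pip install sparqlwrapper) and DBpedia's public endpoint.

DBPEDIA_ENDPOINT = "http://dbpedia.org/sparql"

def build_abstract_query(resource: str, lang: str = "en") -> str:
    """Build a SPARQL query for the abstract of a DBpedia resource."""
    return f"""
        PREFIX dbo: <http://dbpedia.org/ontology/>
        PREFIX dbr: <http://dbpedia.org/resource/>
        SELECT ?abstract WHERE {{
            dbr:{resource} dbo:abstract ?abstract .
            FILTER (lang(?abstract) = "{lang}")
        }}
    """

def fetch_abstract(resource: str) -> str:
    """Send the query to DBpedia and return the first matching abstract.

    Requires network access and the SPARQLWrapper package.
    """
    from SPARQLWrapper import SPARQLWrapper, JSON  # third-party dependency
    sparql = SPARQLWrapper(DBPEDIA_ENDPOINT)
    sparql.setQuery(build_abstract_query(resource))
    sparql.setReturnFormat(JSON)
    bindings = sparql.query().convert()["results"]["bindings"]
    return bindings[0]["abstract"]["value"]

# e.g. fetch_abstract("Semantic_Web") returns the English abstract
# of DBpedia's Semantic Web entry (network required).
```

Nothing fancy, but being able to pull structured facts out of Wikipedia-derived data with a dozen lines is exactly why I’m psyched.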
I just have to forget about RDFa if I want the VIP treatment from Google, Bing, and Yahoo.
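For anyone who hasn’t seen the two syntaxes side by side, here’s the same statement marked up both ways. This is my own illustrative snippet, not taken from schema.org’s docs, though `Person`, `name`, and `affiliation` are real schema.org terms.

```html
<!-- The same data in Microdata (what BYG now favors)... -->
<div itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Tim Berners-Lee</span> works at the
  <span itemprop="affiliation">W3C</span>.
</div>

<!-- ...and in RDFa (what I'd been using) -->
<div vocab="http://schema.org/" typeof="Person">
  <span property="name">Tim Berners-Lee</span> works at the
  <span property="affiliation">W3C</span>.
</div>
```

The semantics are nearly identical; it’s the attribute vocabulary (and the search engines’ preference) that differs.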
What about my beloved DBpedia, the jewel of semantic data knowledge bases? How are they going to deal with this? According to Christian Bizer, DBpedia might begin publishing Microdata with its next release, mapping DBpedia’s ontology to Schema.org’s with OWL.
What about Wikipedia itself, though? How do they feel about these changes? It certainly affects their vast implementation of structured data.
And why the proprietary format again?
Granted, the W3C moves slowly
Yes, we’re all aware of the W3C’s painfully… slow… process of going from drafts to recommendations and standards, but to be fair, certain browser vendors (do I need to name names?) are even slower to adopt them. It’s difficult to observe the adoption of Web standards in the wild if a huge chunk of the market doesn’t even implement the specifications.
Internet Explorer. Okay, there, I said it. Like I had to. How ironic is it that they’re one of the W3C’s 324 members? Discuss amongst yourselves ;)
Stepping away from my enthusiasm for search 3.0—at least for a minute or two—because something deserves attention:
As great as it is that structured data will really get some recognition in the world of search, wouldn’t it have been a good idea in the spirit of the open Web to get some public opinion on what to include/not include in schema.org?
Why Google wants control of the semantic Web
My observation is that with Google’s attempted (and mostly failed) advances into the world of social media, now is a good time to have some control over the semantic Web. The social space opens up a whole new application for semantic technologies. Their also-ran +1 button will probably benefit from a ubiquitous schema.org and help in their competition with Facebook.
The schema.org news is sudden, and though I’ll gladly play along with my new Web 3.0 toys, I’m really hoping that Mr. BYG (Bing, Yahoo & Google; you heard it here first, folks ;)) will listen to the dev community and take public opinion seriously.
Remember Web 2.0 (two-way conversations) before quickly moving on to Web 3.0
Google has to get past one main sticking point if they hope to stop sucking at social media. They need to think about user experience before trying to figure out how social signals can best improve search.
If the main motivation for creating a social platform is to improve organic ranking algorithms through social signals, it’s destined to fail.
It’s as if Google is so desperate to get into the social media game that they’re choking from performance anxiety.
From one failure to another, they just can’t seem to get it right: Google Wave, Google Answers, Orkut, Knol, Google Buzz, Dodgeball, and many of their other platforms have failed to make the grade.
Google’s latest foray into the social milieu is Google +1. It’s similar to Facebook’s ‘Like’, and votes for websites are intended to improve the quality of Google’s search results. Social votes are popular, but who wants to share a search result before they ever click on the link to begin with?
After striking out in so many areas—including search—Google should be concentrating on giving their users a great social experience. They need this now more than ever.
If Google could develop just one social media platform that pleases its users first, with less focus on retrieving user data, raking in more Adwords revenue, or search integration, they’d be halfway there.
With all the money Google has poured into dead-end social strategies and misguided acquisitions, you’d think they could afford a loss-leader.
I, along with everyone else, have used exact match domain names in the past, but for the most part (with some minor exceptions) I’ve abandoned them in favour of more brandable solutions. If an exact match domain and brandability coincide, then great, I’m obviously all for it.
If you’re looking for quicker, stickier indexing, an exact match can, as it stands, carry you farther, faster, and cheaper. But for how long? When the day comes that Google blows your house of cards down, you’d better have a serious backup plan.
Losing a spot to an exact match domain often means nothing
While online marketing is in many ways different from traditional marketing, both share some common traits on the path to success, and one of them is opportunity. Another is competence.
If a competitor ranks above me with an exact match domain, but the site is garbage, it can make my compelling, well structured, usable site look that much better in comparison. If my competitor’s site isn’t garbage, maybe it deserves to be there, and I can stop looking for a scapegoat to my indexing dilemma.
In my opinion, the debate over the fairness of exact match domains is moot. It’s really a question of which exact match domains represent a brand, and which are clearly taking advantage of search engine favoritism while adding little real value to search results.
If your company’s name is New York Bus Tours, and you’ve scored the domain name, you deserve top placement. Whether people are searching for companies that provide your service, or for your actual brand, it’s clear that you should be somewhere at the top.
Until search algorithms refine their evaluation of exact match domains, let’s keep in mind (at the risk of being told to shove my quotations book up my you-know-what) that often in life, it isn’t what happens to you, it’s how you react to it.
Don’t like the success of that exact match domain sitting atop your placement on page 1 of Google? You can cave, grumbling that “if you can’t beat ‘em, join ‘em,” and buy a domain with underscores instead of dashes, or something-even-more-spammy.com (no, it doesn’t exist, but give it time). Or you can add more value to your site, to your business, and to your long-term digital presence.
Before the SEO world gets its collective panties in a knot about Google Instant’s potential for putting us all out of work, it would be a good idea to remember why people use search engines in the first place: we want information suited to our specific needs.
… “By predicting your search and showing results before you finish typing, Google Instant can save 2-5 seconds per search.”
Are those couple of seconds really saving time if results are too broad? I don’t see on-the-fly results as an enhancement; rather, as just another distraction en route to optimal search results. The real beauty of search is in specificity.
If searchers want to save a second in the query stage of search, they can easily omit a word or two from their search terms; but this is obviously counterproductive, since a lack of search refinement costs time in the long run.
Google continues with another supposed benefit:
Instant Results: Start typing and results appear right before your eyes. Until now, you had to type a full search term, hit return, and hope for the right results. Now results appear instantly as you type, helping you see where you’re headed, every step of the way.
Suggestions can occasionally be helpful; however, if someone is searching on Google for ‘cheap car insurance in Detroit’, they aren’t going to stop at cheap car or cheap car insurance just because Google is streaming results on a keystroke-by-keystroke basis for shorter queries. Search is all about the longtail.
In my opinion, Google Instant encourages sheep mentality: how will Google know what people are searching for if, out of laziness, we click on their suggestions?
Okay, I’ll state the obvious
The more results that Google can throw in front of searchers, the more opportunity they have to display sponsored results. Enough said.
Why I’m not worried
Push marketing is quickly becoming a thing of the past, and in my opinion, Google Instant is unwanted noise—a distraction that I’ll be glad to have the option of ignoring or opting out of as an able search engine user.
And I don’t think I’m alone.
My favorite Caribbean restaurant’s website can be hard to find, even when searching for it by name and city together. I first discovered their site last year by looking at their take-out menu after ordering a delicious meal.
When I visit a site that holds some interest for me – and it lacks basic functionality such as an HTML title – I’ll often look at the bottom of the page for a link to the site’s creator. If it’s a web company, I’ll take a look at the services they offer.
I followed the link I found to a small web design outfit that advertises “Search engine submission” at the bottom of their services page.
Submitting your website to the search engines is the first step to getting found and increasing your website traffic!
If you want people searching on Google, Yahoo! or MSN to find your website – the first step is Search Engine Submission. Search engine submission is the process of getting your website included in the various search engines’ databases. If you’re not listed – there’s little chance of being found!
Don’t wait for valuable exposure – Get Listed Today!
We submit your site to all the major search engines every month for $20 per month. We also Guarantee 7 day listing in Google, Yahoo! or MSN.
Let’s just break this down, shall we?
Submitting your website to the search engines is the first step to getting found and increasing your website traffic!
Opening a text editor (or in this company’s case, FrontPage?) is also a first step to getting found and increasing web traffic. Do you charge a couple of hundred dollars to open the text editor? Do you charge $200 to close your HTML tags as well? </sarcasm> <- This one is free.
If you want people searching on Google, Yahoo! or MSN to find your website – the first step is Search Engine Submission.
You just said that. Overcompensate much?
Search engine submission is the process of getting your website included in the various search engines’ databases.
Wrong. Search engine submission is the process of submitting your website to the search engines. But whatever.
If you’re not listed – there’s little chance of being found!
Nice way to manipulate the perception of what’s needed to achieve search visibility.
Of course if a site isn’t “listed”, there’s no chance of being found in search. If you aren’t listed, you aren’t listed. Thanks for that insight. But what do listings have to do with search engine submission?
Try: There’s little chance of being found if you aren’t showing up for search terms relevant to your site.
Don’t wait for valuable exposure – Get Listed Today!
Okay, now they’re leaning into their scam a bit harder. Exposure. What they’re saying is: submission implies listings, and listings imply exposure. Therefore, if you want search engine exposure, all you need to do is submit? Smooth.
We submit your site to all the major search engines every month for $20 per month.
Say what? So not only are you charging $240 a year to knock on Google’s door to tell them you exist (and maybe Yahoo and Bing), but you’re going to tell them every month? While you’re at it, why don’t you open up every .jpg and .png from the site in Photoshop, re-save them exactly as they are, and re-upload them every month.
Time well spent.
These people are looking for residuals on search engine submission. I’m getting angry now.
We also Guarantee 7 day listing in Google, Yahoo! or MSN.
Does that mean listed in 7 days, or for 7 days? Either way, even if this search engine submission scam were worth anything: in search, the only guarantee is that there are no guarantees.
I cringe at the thought of anyone paying for this type of bogus service.
Talking about its indexing process, Google says:
We add thousands of new sites to our index each time we crawl the Web, but if you like, you may submit your URL as well. Submission is not necessary and does not guarantee inclusion in our index. Given the large number of sites submitting URLs, it’s likely your pages will be found in an automatic crawl before they make it into our index through the URL submission form.
Search engine submission scams aren’t as widespread as they were a few years ago; others have taken their place and are regularly used to prey upon naive site owners.
It isn’t my intention to create FUD. If you’re looking for a competent SEO company, ask for references, examples of past work, and educate yourself on at least the basics of web visibility before jumping into bed with any company.
Thanks to Rishil Lakhani for inspiring this piece.
Rishil, I almost linked to your site with search engine submission scams. How ironic would that have been.
Many SEOs have coding (as opposed to marketing) backgrounds, and enjoy hours on end of alone-time in front of the computer. The stereotypical B movie computer nerd, glued to the monitor in mom’s basement and surrounded by Coke cans, isn’t typically known for his social skills.
Not that B movie stereotypes dominate our industry, but I think you see what I’m getting at.
The ass-kissing, social climbing, or mutual m*sturbating nature of “link building” can be a real turn off to someone attracted to the more technical aspects of search. But the harsh reality is that building an online business is in many ways similar to running a brick and mortar operation; you need to develop professional relationships beyond your clientele if you want to succeed with your online ventures.
With some social skills (or at least interaction), your chances of discovering mutually beneficial opportunities and partnerships increase exponentially.
I’ll admit that I’d like to see Google put less value on incoming links. If unethical schemes for exaggerating a site’s worth were reserved to on-site tactics, some of us wouldn’t feel so bitter about the indexing advantages acquired by those willing to chance paying for links.
With all cards closer to – if not on – the table, quality content and site architecture would take on even greater importance in establishing a visible web presence.
But I digress.
For the time being, the right types of links matter – a lot. So let’s accept it. For now.
Visualize your Web site as a shop that you opened on the edge of town. If enough reliable tenants were to vouch for you, the landlord would probably trust you enough to rent you a choice spot, closer to the center of town.
With some connections and networking, chances are you’ll find a more visible section of online real estate from which to run your business; especially if you continue to nourish your inner geek’s appetite for the more technical SEO skillsets.
Without warning, Google has removed PageRank data from the ‘Diagnostics’ section of their Webmaster Tools (WMT). The majority of the SEO community once considered PageRank to be the quintessential metric to track, but the last few years have seen a steady decrease in the little green bar’s popularity.
Webmaster Trends Analyst Susan Moskwa commented in a recent thread on Google Webmaster Central that PageRank data was removed from WMT simply because they felt it was silly to display data that Big G has been trying to wean webmasters off for quite some time.
“We’ve been telling people for a long time that they shouldn’t focus on PageRank so much; many site owners seem to think it’s the most important metric for them to track, which is simply not true. We removed it because we felt it was silly to tell people not to think about it, but then to show them the data, implying that they should look at it. :-)”
Moskwa concluded her brief, but to-the-point comment with a link to Google’s Webmaster Help FAQ on crawling, indexing & ranking that stresses:
“…worry less about PageRank, which is just one of over 200 signals that can affect how your site is crawled, indexed and ranked. PageRank is an easy metric to focus on, but just because it’s easy doesn’t mean it’s useful for you as a site owner. If you’re looking for metrics, we’d encourage you to check out Analytics, think about conversion rates, ROI (return on investment), relevancy, or other metrics that actually correlate to meaningful gains for your website or business”
I agree 100%. PageRank isn’t the link popularity metrics panacea that it once might have been. But as Barry Schwartz points out, why, then, is PageRank data still displayed in Google’s Toolbar? Too silly for Google Webmaster Tools, but not too silly for Google Toolbar? What gives, Google?
Barry then goes on to ask:
“… how many people have the Google Toolbar installed compared to those who use Google Webmaster Tools? I assume a fraction of those use Google Webmaster Tools.”
Barry offers a possible explanation:
“Google cannot remove PageRank from the Toolbar, it is too much of their branding. No matter how much Matt Cutts and the Google search quality and webmaster trends team want it removed, I cannot see Google’s executives allowing it.”
I partially agree here. Yes, PageRank is a big part of Google’s branding, but this branding has made its mark primarily on search marketers and webmasters, at best. I don’t think Google would be too worried about hurting its brand by removing a once-relevant link popularity metric, especially if the majority of experienced search marketers have long since accepted that PageRank offers little if any value as an actionable or meaningful metric.
Marketing Pilgrim’s Andy Beal made a comment that’s as humorous as it is true:
“The problem is, Google’s not yet ready to remove the PageRank score from the toolbar installed on hundreds of millions of web browsers. This really leads you to conclude that the role of PageRank has been reduced to nothing more than a comfort blanket for SEO noobs.”
PageRank is Dead – Long Live PageRank?
My take here is that Google is “giving notice,” and perhaps PageRank is officially on its way out, one step at a time. Or this is what they’d have us believe: one less road map on a huge “let’s game the search engines” safari.
I, for one, hope PageRank sticks around – at least in the shadows somewhere – for one reason only: that little green bar doesn’t do many things, but one thing it does do really quickly is indicate whether a site is suffering from a serious indexing problem. Andy feels the same way:
“I only use it as an early warning that a site is not behaving in Google’s index. Any green means ‘go.’ No green, means there’s something to investigate.”
Hang in there PageRank. It never was easy being green.
Okay, we like to find the hidden meaning behind what Google says (read: poke fun at Google), right?
Yesterday, Google’s non-profits team, Google Grants, published a post on their official blog about a training session they held recently in Washington, D.C. Their blog entry details the material that was taught to campaign managers on how to move sites up in natural search results.
We’ve taken this opportunity to test the beta version of our soon to be patented Google PR Cynic translation application (GCTA), AKA Goognic™, on the Google Grants Team‘s recent post.
Below, each excerpt from the Google Grants Team’s post is followed by its Goognic™ translation.
At a recent non-profit training held in our D.C. office, I got the chance to teach a group of issue campaign managers the basics of “search engine optimization” (SEO), or how to earn a spot for your content that is closer to the top of Google’s natural (left-hand side) search results
At a not-immediately-profitable training session, our Google Propaganda Team got the chance to explain how to organize website content in order to help the successful targeting of Adwords campaigns.
It was a rewarding experience because we were able to take what’s often a technical conversation and make it feel like something everyone could (and should) do.
Our strategy here was two-pronged. From those who achieve results by following the guidelines, Google shall reap the rewards of better organized sites added to the index (we’ll profit when they switch to Adwords, once their site gets buried on page 6 a couple of “updates” down the line). Those who are overwhelmed by the whole “SEO thing” will realize there is really only one way to go. Did I mention Adwords?
Indeed, when most people hear the words “search engine optimization,” they figure it’s too technical for them or that it doesn’t apply to them. But if you’re running a long-term education or awareness campaign, you need to know how to improve the chances that interested users will find your information through natural search results. It’s just as important as learning how to use your Google Grant effectively.
See previous section. Oh and by the way. The Adwords store called. They want your rankings back.
Fortunately, much of what you can accomplish with SEO doesn’t require any programming or technical skills, but it does require a big-picture awareness of your issue. Because ultimately, you’re not trying to rise to the top of any one search results page, but rather to make your site more relevant to the whole search picture, which means designing your site, sections, and sub-pages with the most high-demand search terms related to your issue in mind
Don’t be intimidated by all this SEO stuff, because if after all your hard work your site still doesn’t rank, well hey, that’s ok, because an Adwords campaign will probably work REAL sweet now!
Doing well in high-demand search results pages requires that you first know what search terms or keywords are most popular. Take concepts and terms you discuss on your site and test them against related terms using tools like Google’s Keyword Tool and Insights for Search. Make sure you’re developing individual pages centered around what people are looking for, using the language they use
Let’s get to know some of the tools you’ll be needing to run your first PPC campaign! Sktool, Analytics, Google’s Keyword Tool. Mmmmm, do you smell what the Goog is cooking?
Use these high-demand keywords where they accurately describe your content, especially in page titles, section headings, and in URLs. If you have lots of images or interactive graphics, make sure your most important content appears in text too, because the Googlebot doesn’t read images.
Googlebot has been able to “read” images for over a year. Nobody’s seemed to notice so far, so we’ll hold out a little longer from telling you, we don’t want to have to deal with curtailing a landslide of image sculpting. Well not until we endorse it first. (Ok, that was cheap, sorry, I couldn’t resist)
Finally, understand that the number and quality of other sites that link to your content determines much of your ranking in search results. Make sure you know the other online players on your issue, and encourage them to link to you. Starting a blog or Twitter feed is a great way to keep users abreast of the latest updates to your site and encourages them to link to you too
We’ll be acquiring Twitter soon. Get ready to transfer all your Twitter profiles to your Google accounts, suckahhhhs!
Ok ok, maybe I went a little far this time. But how could I resist? Google has a working++ business model, and I respect that. They’ve done many great things for the search industry, and will continue to do so, while making a profit (go figure). But who says we can’t entertain the troops in the meantime?
In case you passed on clicking through to the Google Grants site, here are the slides from the recent training session. Enjoy!
In what is arguably the biggest SEO news so far this year, Matt Cutts announced yesterday that using nofollows is no longer a solution for preventing the loss of a site’s or page’s link juice, and hasn’t been for over a year!
When the rel=nofollow attribute was introduced in 2005, it was meant as an annotation for not “vouching” for a link. Virtually all forum and blog pages have nofollow attributes associated with visitor generated content, as a means of instructing search engines not to follow (crawl) these untrusted user comments or guestbook entries. Not long after the introduction of rel=”nofollow”, we learned to minimize leakage of our sites’ total allocated PageRank by ‘sculpting’ PR with the attribute, pushing it to more important pages of our sites. We can now cross this technique off the list.
In Google, nofollow links don’t pass PageRank and don’t pass anchor text. However, we now find out through Matt Cutts (who else?) that the link juice assigned to an outgoing nofollow link is no longer conserved and divided among the other links on the page; it simply evaporates.
Old PageRank Algorithm
2 separate cases of a page with “x” amount of available link juice.
As a somewhat simplified example: in the original PageRank algorithm, a page of PR10 would have passed PR2 each to 5 regular links (fig. 1). The same page would have passed PR2.5 each to 4 regular links and PR0 to the nofollow link in fig. 2.
New PageRank Algorithm
Page with “x” amount of available link juice
As you can see in fig. 3, nofollowing a link no longer passes extra juice through to the remaining live links. Many SEOs are now considering cutting down substantially on outgoing links, or going back to previous PR sculpting methods such as:
- Embedding robots.txt-blocked iframes containing certain links
- Embedding Java, Flash or other non-parseable applications to contain certain links
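The before-and-after arithmetic from the figures above can be sketched in a few lines. Like the figures, this is a deliberately simplified model of my own; the real algorithm involves damping factors and iterative computation.

```python
# Simplified sketch of how a page's available link juice is divided
# per link, before and after the nofollow change Matt Cutts described.

def juice_per_link_old(page_pr: float, total_links: int, nofollowed: int) -> float:
    """Old behavior: nofollowed links were ignored in the division, so
    their share was redistributed among the remaining followed links."""
    followed = total_links - nofollowed
    return page_pr / followed

def juice_per_link_new(page_pr: float, total_links: int, nofollowed: int) -> float:
    """New behavior: juice is divided among ALL links on the page; the
    share assigned to nofollowed links evaporates instead of being
    redistributed (nofollowed is kept for a parallel signature)."""
    return page_pr / total_links

# A PR10 page with 5 links, one of them nofollowed (figs. 2 and 3):
#   old model: 10 / 4 = 2.5 to each of the 4 followed links
#   new model: 10 / 5 = 2.0 to each followed link; 2.0 evaporates
```

That evaporating share is exactly why sculpting with nofollow no longer buys you anything.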
Many SEOs are disillusioned by the fact that using internal nofollows was advocated as best practice by the powers that be at the Big G, and they now feel they’re being told the opposite. There will be a lot of speculating, calculating, and theorizing in the SEO community on this one in the upcoming weeks. I’ll be back with news on this one soon enough, because I know there’ll be some.
Bing was originally scheduled to launch June 3, 2009; however, Microsoft’s “Decision Engine” went live today, and aims to compete with Google and divert its market share with $80 million in marketing.
In announcing the search engine on May 28, 2009, Microsoft CEO Steve Ballmer said Bing (AKA Kumo) hopes to help users receive the information they’re searching for faster, and that the decision engine goes beyond search to help customers deal with information overload.
When we set out to build Bing, we grounded ourselves in a deep understanding of how people really want to use the Web. Bing is an important first step forward in our long-term effort to deliver innovations in search that enable people to find information quickly and use the information they’ve found to accomplish tasks and make smart decisions.
The main feature that I notice from the start is that aside from regular search results, certain queries offer categorized search results in the sidebar as well as related search, as seen below.
Also of interest: each thumbnail in the results of video searches will play the first 30 seconds of the video on mouseover, giving you a sneak preview before clicking the link. The world of search engine optimization just got another toy to play with – or rip apart, depending on who’s sitting in front of the toy box.