Before the SEO world gets its collective panties in a knot about Google Instant's potential for putting us all out of work, it would be a good idea to remember why people use search engines in the first place: we want information suited to our specific needs.
… “By predicting your search and showing results before you finish typing, Google Instant can save 2-5 seconds per search.”
Are those couple of seconds really a saving if the results are too broad? I don't see on-the-fly results as an enhancement; I see them as just another distraction en route to optimal search results. The real beauty of search is in specificity.
If searchers want to save a second in the query stage of search, they can easily omit a word or two from their search terms; but that is obviously counterproductive, since a lack of search refinement costs time in the long run.
Google continues with another supposed benefit:
Instant Results: Start typing and results appear right before your eyes. Until now, you had to type a full search term, hit return, and hope for the right results. Now results appear instantly as you type, helping you see where you’re headed, every step of the way.
Suggestions can occasionally be helpful; however, if someone is searching Google for 'cheap car insurance in Detroit', they aren't going to stop at 'cheap car' or 'cheap car insurance' just because Google is streaming results on a keystroke-by-keystroke basis for shorter queries. Search is all about the long tail.
In my opinion, Google Instant encourages sheep mentality: How will Google know what people are searching for if out of laziness, we click on their suggestions?
Ok, I’ll state the obvious
The more results that Google can throw in front of searchers, the more opportunity they have to display sponsored results. Enough said.
Why I’m not worried
Push marketing is quickly becoming a thing of the past, and in my opinion, Google Instant is unwanted noise—a distraction that I’ll be glad to have the option of ignoring or opting out of as an able search engine user.
And I don’t think I’m alone.
According to pcmag.com’s John C. Dvorak, SEO is killing the internet.
Is this guy for real?
Or is John pcmag.com's answer to Andy Rooney? (Whiny voice:) "Have you ever noticed how SEO is killing the Internet?"
We all know that controversy is a great way to attract readership, but come on, didn’t you already squeeze all the ranty goodness out of SEO in February?
I think I'm embarrassed for John Dvorak more than I am disturbed by his ignorance.
Using John's logic, after reading his poorly written and badly researched article on pcmag.com, we could surmise that 'blogging is killing the Internet' – but that would obviously be a ridiculous conclusion. The medium isn't what kills; it's the careless misinformation transmitted through the medium. That's the death of a thousand cuts.
The SEO industry is saturated with snake-oil salesmen desperately trying to jump on the industry's bandwagon for one very good reason: there is real value in real SEO – value to site owners, and value to the Internet as a whole. Professional search engine optimization involves a holistic approach to search visibility that encompasses Web usability, accessibility, and Web standards to support compelling content – not unethical trickery.
Are there countless blackhat search engine optimization methods being used to push sites up in the rankings? You betcha. Is the Internet saturated with spammy sites that push Viagra, Acai berry, Shamwows, and instant riches at every turn? I think we know the answer to that one. Does this imply that all reputed SEO firms reach into a bag of spam and cloaking to get their clients on the first page of the search engine results pages? Negative on that Houston.
Most of the time, blackhat SEO is used on throwaway domains, designed for fast money, and then ditched in favor of the next project. This has little to do with SEO formulated for long-term web presence. Is mechanical engineering crap as well because there are so many rip-off mechanics?
Don’t bite the tag that feeds you
Looking at the source code of the page this ridiculous post sits on (as well as other areas of the site), it's evident that pcmag.com has itself implemented measures to ensure some level of search visibility. Does John C. Dvorak object to pcmag.com's efforts to get his oh-so-important opinion properly indexed on the Internet?
In February 2009, Dvorak wrote another uninformed SEO rant on pcmag.com: SEO Fiascoes: The Trouble with Search Engine Optimization. What he was struggling (unsuccessfully) to explain was that the keyword meta tag is useless because of its history of keyword spam:
“…Tags, stored as such, are the modern equivalent of the metatags once used on crude HTML pages. They don’t work and are a stupid exercise in futility…since the search engines all stopped looking at metatags—and that was the end of that until tags reappeared, for some reason…”
The fact that John refers to the keyword meta tag as "metatags" is another dead giveaway (of many) that he couldn't possibly have spent less time researching this sloppy example of journalism at its worst. It's almost comforting to read a couple of his other views on:
- Open sourced software:
…I’m not sure where this is all headed, but it’s kind of like the Open Source movement. It relies on a large and vague group of mavens…
- Semantic Web:
…this one promoted by the “social media is everything” crowd in alliance with the “semantic Web is the future” dingbats…
Here’s a revealing quote:
…I’ve complained about it before but it’s too late to do anything about it except moan more…
Know thyself, John.
Without warning, Google has removed PageRank data from the 'Diagnostics' section of its Webmaster Tools (WMT). The majority of the SEO community once considered PageRank the quintessential metric to track, but the last few years have seen a steady decline in the little green bar's popularity.
Webmaster Trends Analyst Susan Moskwa commented in a recent thread on Google Webmaster Central that PageRank data was removed from WMT simply because they felt it was silly to display data that the Big G has been trying to wean webmasters off for quite some time.
“We’ve been telling people for a long time that they shouldn’t focus on PageRank so much; many site owners seem to think it’s the most important metric for them to track, which is simply not true. We removed it because we felt it was silly to tell people not to think about it, but then to show them the data, implying that they should look at it. :-)”
Moskwa concluded her brief, but to-the-point comment with a link to Google’s Webmaster Help FAQ on crawling, indexing & ranking that stresses:
“…worry less about PageRank, which is just one of over 200 signals that can affect how your site is crawled, indexed and ranked. PageRank is an easy metric to focus on, but just because it’s easy doesn’t mean it’s useful for you as a site owner. If you’re looking for metrics, we’d encourage you to check out Analytics, think about conversion rates, ROI (return on investment), relevancy, or other metrics that actually correlate to meaningful gains for your website or business”
I agree 100%. PageRank isn't the panacea of link popularity metrics that it once might have been. But as Barry Schwartz points out, why, then, is PageRank data still displayed in Google's Toolbar – too silly for Google Webmaster Tools, but not too silly for Google Toolbar? What gives, Google?
Barry then goes on to ask:
“… how many people have the Google Toolbar installed compared to those who use Google Webmaster Tools? I assume a fraction of those use Google Webmaster Tools.”
Barry offers a possible explanation:
“Google cannot remove PageRank from the Toolbar, it is too much of their branding. No matter how much Matt Cutts and the Google search quality and webmaster trends team want it removed, I cannot see Google’s executives allowing it.”
I partially agree here. Yes, PageRank is a big part of Google’s branding, but this branding has made its mark primarily on search marketers and webmasters, at best. I don’t think Google would be too worried about hurting its brand by removing a once-relevant link popularity metric, especially if the majority of experienced search marketers have long since accepted that PageRank offers little if any value as an actionable or meaningful metric.
Marketing Pilgrim's Andy Beal made a comment that's as humorous as it is true:
“The problem is, Google’s not yet ready to remove the PageRank score from the toolbar installed on hundreds of millions of web browsers. This really leads you to conclude that role of PageRank has been reduced to nothing more than a comfort blanket for SEO noob. “
PageRank is Dead – Long Live PageRank?
My take here is that Google is "giving notice", and perhaps PageRank is officially on its way out, one step at a time. Or that's what they'd have us believe – one less road map on a huge 'let's game the search engines' safari.
I, for one, hope PageRank sticks around – at least in the shadows somewhere – for one reason only: that little green bar doesn't do many things, but one thing it does do really quickly is indicate whether a site is suffering from a serious indexing problem. Andy feels the same way:
“I only use it as an early warning that a site is not behaving in Google’s index. Any green means ‘go.’ No green, means there’s something to investigate.”
Hang in there PageRank. It never was easy being green.
Okay, we like to find the hidden meaning behind what Google says (read: poke fun at Google), right?
Yesterday, Google Grants, Google's non-profits team, published a post on their official blog about a training session they held recently in Washington D.C. Their blog entry details the material taught to campaign managers on how to move sites up in natural search results.
We've taken this opportunity to test the beta version of our soon-to-be-patented Google PR Cynic translation application (GCTA), AKA Goognic™, on the Google Grants Team's recent post.
Below, each excerpt from the Google Grants Team's post is followed by its Goognic™ translation.
At a recent non-profit training held in our D.C. office, I got the chance to teach a group of issue campaign managers the basics of “search engine optimization” (SEO), or how to earn a spot for your content that is closer to the top of Google’s natural (left-hand side) search results
At a not-immediately-profitable training session, our Google Propaganda Team got the chance to explain how to organize website content in order to help the successful targeting of Adwords campaigns.
It was a rewarding experience because we were able to take what’s often a technical conversation and make it feel like something everyone could (and should) do.
Our strategy here was two-pronged. From those who achieve results by following the guidelines, Google shall reap the rewards of better-organized sites added to the index (we'll profit when they switch to Adwords, once their site gets buried on page 6 a couple of "updates" down the line). Those who are overwhelmed by the whole "SEO thing" will realize there is really only one way to go. Did I mention Adwords?
Indeed, when most people hear the words “search engine optimization,” they figure it’s too technical for them or that it doesn’t apply to them. But if you’re running a long-term education or awareness campaign, you need to know how to improve the chances that interested users will find your information through natural search results. It’s just as important as learning how to use your Google Grant effectively.
See previous section. Oh and by the way. The Adwords store called. They want your rankings back.
Fortunately, much of what you can accomplish with SEO doesn’t require any programming or technical skills, but it does require a big-picture awareness of your issue. Because ultimately, you’re not trying to rise to the top of any one search results page, but rather to make your site more relevant to the whole search picture, which means designing your site, sections, and sub-pages with the most high-demand search terms related to your issue in mind
Don’t be intimidated by all this SEO stuff, because if after all your hard work your site still doesn’t rank, well hey, that’s ok, because an Adwords campaign will probably work REAL sweet now!
Doing well in high-demand search results pages requires that you first know what search terms or keywords are most popular. Take concepts and terms you discuss on your site and test them against related terms using tools like Google’s Keyword Tool and Insights for Search. Make sure you’re developing individual pages centered around what people are looking for, using the language they use
Let's get to know some of the tools you'll be needing to run your first PPC campaign! Sktool, Analytics, Google's Keyword Tool. Mmmmm, do you smell what the Goog is cooking?
Use these high-demand keywords where they accurately describe your content, especially in page titles, section headings, and in URLs. If you have lots of images or interactive graphics, make sure your most important content appears in text too, because the Googlebot doesn’t read images.
Googlebot has been able to "read" images for over a year. Nobody's seemed to notice so far, so we'll hold out a little longer before telling you; we don't want to have to deal with curtailing a landslide of image sculpting. Well, not until we endorse it first. (Ok, that was cheap, sorry, I couldn't resist.)
Finally, understand that the number and quality of other sites that link to your content determines much of your ranking in search results. Make sure you know the other online players on your issue, and encourage them to link to you. Starting a blog or Twitter feed is a great way to keep users abreast of the latest updates to your site and encourages them to link to you too
We’ll be acquiring Twitter soon. Get ready to transfer all your Twitter profiles to your Google accounts, suckahhhhs!
Ok ok, maybe I went a little far this time. But how could I resist? Google has a working++ business model, and I respect that. They've done many great things for the search industry, and will continue to do so, while making a profit (go figure). But who says we can't entertain the troops in the meantime?
In case you passed on clicking through to the Google Grants site, here are the slides from the recent training session. Enjoy!
In what is arguably the biggest SEO news so far this year, Matt Cutts announced yesterday that using nofollows is no longer a solution to preventing loss of a site’s or page’s link juice, and hasn’t been for over a year!
When the rel="nofollow" attribute was introduced in 2005, it was meant as an annotation for not "vouching" for a link. Virtually all forum and blog pages apply nofollow to visitor-generated content, as a means of instructing search engines not to trust the links in user comments or guestbook entries. Not long after the introduction of rel="nofollow", we learned to minimize leakage of our sites' total allocated PageRank by 'sculpting' PR with the attribute, pushing it toward the more important pages of our sites. We can now cross this technique off the list.
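As a minimal sketch of the annotation in practice (the helper name is hypothetical), here's how a blog template might mark up an untrusted, user-submitted link:

```python
import html

def render_comment_link(url, text):
    """Render a user-submitted link with rel="nofollow" so search
    engines know the site isn't vouching for the destination.
    (Hypothetical helper, for illustration only.)"""
    return '<a href="{}" rel="nofollow">{}</a>'.format(
        html.escape(url, quote=True), html.escape(text))

print(render_comment_link("http://example.com/?a=1&b=2", "my site"))
# -> <a href="http://example.com/?a=1&amp;b=2" rel="nofollow">my site</a>
```

The escaping matters as much as the attribute: comment spam is untrusted input, so the URL and anchor text are HTML-escaped before being dropped into the page.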
In Google, nofollow links don't pass PageRank and don't pass anchor text. However, we now find out from Matt Cutts (who else?) that the link juice assigned to an outgoing nofollow link is no longer conserved and divided among the other links on the page.
Old PageRank Algorithm
2 separate cases of a page with “x” amount of available link juice.
As a somewhat simplified example: under the original PageRank algorithm, a page with PR10 would have passed PR2 to each of 5 regular links (fig. 1). The same page would have passed PR2.5 to each of 4 regular links and PR0 to the nofollow link in fig. 2.
New PageRank Algorithm
Page with “x” amount of available link juice
As you can see in fig. 3, nofollowing a link no longer passes its extra juice through to the remaining live links. Many SEOs are now considering cutting down substantially on outgoing links, or going back to previous PR sculpting methods such as:
- Embedding robots.txt-blocked iframes containing certain links
- Embedding Java, Flash or other non-parseable applications to contain certain links
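The old-versus-new split from the figures above can be sketched as a toy calculation. This is a deliberate simplification for illustration only; the real algorithm includes a damping factor and many other signals:

```python
# Toy model of the per-link PageRank split described in figs. 1-3.
def juice_per_link(page_pr, links, old_algorithm):
    """Return the PR passed to each *followed* link on a page.

    links is a list of booleans: True = followed, False = nofollow.
    """
    followed = sum(1 for f in links if f)
    if followed == 0:
        return 0.0
    if old_algorithm:
        # Old behavior: nofollow links were excluded from the split,
        # so their share flowed to the remaining followed links.
        return page_pr / followed
    # New behavior: the split counts every link on the page, and the
    # share assigned to nofollow links simply evaporates.
    return page_pr / len(links)

# fig. 1: PR10 page, 5 regular links -> PR2 each
print(juice_per_link(10, [True] * 5, old_algorithm=True))             # 2.0

# fig. 2: PR10 page, 4 regular + 1 nofollow, old algorithm -> PR2.5
print(juice_per_link(10, [True] * 4 + [False], old_algorithm=True))   # 2.5

# fig. 3: same page, new algorithm -> PR2 each; the nofollowed
# link's PR2 is lost rather than redistributed
print(juice_per_link(10, [True] * 4 + [False], old_algorithm=False))  # 2.0
```

Under the new behavior, adding a nofollow link to a page costs you juice outright, which is exactly why the sculpting workarounds listed above are back on the table.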
Many SEOs are disillusioned by the fact that using internal nofollows was advocated as best practice by the powers that be at the Big G, and now feel they're being told the opposite. There will be a lot of speculating, calculating, and theorizing in the SEO community on this one in the coming weeks. I'll be back with news on this soon enough, because I know there'll be some.
Forbes Media has released the results of its "Ad Effectiveness Survey", revealing the digital marketing preferences of the senior marketing executives polled.
The survey, published yesterday, was conducted between February 19, 2009 and March 19, 2009 in order to better understand behaviors and beliefs regarding digital marketing, and to predict areas of growth and weakness in the industry over the next six months.
Some of the highlights of the study
- Search Engine Optimization, Email and e-newsletter marketing are by far the 3 leading methods of digital marketing among respondents.
- Ad networks were the most unpopular, with 50% of respondents stating that the results did not meet their expectations.
- The tools considered most effective for generating conversions were SEO (48%), email and e-newsletter marketing (46%), and PPC/search marketing (32%).
- In the coming six months, half of the respondents expect that viral marketing (54%) and SEO (50%) will likely see the biggest increases. Ad networks see the highest percentage of expected decreases (52%).
Forbes Media includes Forbes and Forbes.com, the #1 business site on the Web, reaching 18.6 million people monthly. The results of the "Ad Effectiveness Survey" are available at www.forbes.com/adinfo/research.html.
Bing was originally scheduled to launch on June 3, 2009; however, Microsoft's "Decision Engine" went live today, aiming to compete with Google and divert some of its share with $80 million in marketing.
In announcing the search engine on May 28, 2009, Microsoft CEO Steve Ballmer said Bing (AKA Kumo) hopes to help users receive the information they're searching for faster, and that the decision engine goes beyond search to help customers deal with information overload.
When we set out to build Bing, we grounded ourselves in a deep understanding of how people really want to use the Web. Bing is an important first step forward in our long-term effort to deliver innovations in search that enable people to find information quickly and use the information they’ve found to accomplish tasks and make smart decisions.
The main feature I notice from the start is that aside from regular search results, certain queries offer categorized search results in the sidebar, as well as related searches, as seen below.
Also of interest: each thumbnail in video search results plays the first 30 seconds of the video on mouse-over, giving you a sneak preview before clicking the link. The world of search engine optimization just got another toy to play with – or rip apart, depending on who's sitting in front of the toy box.
Chrome Brought Us More Speed
Features such as hidden class transitions, dynamic code generation, and precise garbage collection help Chrome outperform its peers by about 2:1 in speed. Benchmark tests compared the browser's speed with that of Safari, Firefox 3, Internet Explorer 7, and Internet Explorer 8.
Chrome Brought Us More Stability
As with many others, my main interest in Chrome lay in the fact that it's a multi-threaded browser. Single-threaded browsers must be completely restarted if a problem site crashes your current tab or window, but with Chrome's Task Manager, not only can you see which sites are using the most resources – including memory, processor, and data transfer – but you can also terminate problem threads, saving you from having to restart your browser in these cases.
Is Chrome Really Ready to Lose Its Training Wheels?
4 days ago, on December 11 – only 100 days after Google released the beta version – the Chrome browser was officially stripped of its beta label. By now, your beta version will have been automatically updated to v1.0, bringing you the improvements and bug fixes afforded by the last 104 days of user feedback and automatic crash-report analysis.
Chrome v.1 even faster
Other improvements in Chrome’s Official Release:
- Improved bookmarking features (a top users request)
- A more user-friendly privacy control panel
- Improved video and audio plug-in support
So the Bugs Are Mostly Fixed – But Where's the Rest of the Browser?
I abandoned Internet Explorer as my browser of choice years ago in favor of the much more standards-compliant Firefox and Opera. They were more secure, faster (once loaded), and offered all-around better development tools. Enter Firefox extensions. If you haven't used any of the many Firefox extensions – the Web Developer Toolbar, for example – you're missing out. Not just bells and whistles: some serious functionality exists in hundreds of Firefox extensions.
I'm sure Chrome will eventually support the addition of useful extensions, and who knows, maybe even outdo Firefox in that department one day; but no RSS reader? In my opinion, Chrome isn't ready to be freed of its beta status.
If there’s any question in your mind whether social media is merely a passing trend or a major consideration in any internet marketing campaign, wonder no longer.
Last month, more than 200 senior advertising and market research executives attended the sold-out Industry Leader Forum – "Transforming Research. Are You Listening?" – held by the Advertising Research Foundation (ARF). The ARF's Research Transformation initiative aims to enable members to stay ahead of the curve in a fast-changing, consumer-driven world.
The event, which took place in New York on Oct 29, focused primarily on methods of tracking the ubiquitous online discussions of brands, companies, products and services that numerous social media web sites and platforms host. Bob Barocci, President and CEO of The ARF, shed some light on several of the newer terms being used by advertising researchers, such as ‘listening pipes’, ‘storytelling’, ‘inspiration’, ‘content masters’ (referring to millennials), ‘consumer backyard’ and ‘brand backyard’.
Case histories of "listening" in action were presented by General Mills, MTV, Sony Electronics, and Unilever. Obama pollster Joel Benenson revealed how public perceptions were gathered in the president-elect's leading-edge electoral campaign.
The ARF's Research Transformation Council members are:
- Joel Benenson – Founding Partner & President, Benenson Strategy Group – Co-Founder, iModerate, & pollster for Barack Obama
- Jonathan Carson – President, International, Nielsen Online
- Kim Dedeker – VP, External Capability Leadership-Global Consumer & Market Knowledge, Procter & Gamble
- Jeff Flemings – SVP, Renaissance Planning, VivaKi
- Gayle Fuguitt – VP, Consumer Insights, General Mills
- Stephen Kim – Senior Director, Microsoft Branded Experiences and Entertainment, Microsoft Advertising
- Michael Perman – Senior Director, Levi Strauss
- Eric Salama – Chairman and CEO, Kantar
- Patti Wakeling – Senior Manager, Media Insights, Unilever
Pete Blackshaw, Executive Vice President of the Digital Strategic Services group at Nielsen Online, gave a presentation on the "Six Signals of Listening to the Unprompted Voice of the Consumer." Pete is a recognized expert in interactive marketing, word of mouth, and consumer understanding, and originally coined the term consumer-generated media (CGM). See Pete's summary of the highlights from October's event in his video below.
The ARF's next Forum will be a one-day workshop from 8:00 AM to 6:00 PM on January 27, 2009 at the Bently Reserve in San Francisco. Confirmed speakers include:
- Kim Dedeker (Procter & Gamble)
- Joel Benenson (Benenson Strategy Group)
- Michael Perman (Levi Strauss)
- Pete Blackshaw (Nielsen Online)
Very exciting stuff!
Yesterday, Google launched SearchWiki, the biggest news in Web 2.0 since sliced Wikipedia. Once logged into your Google account, SearchWiki allows you to move search results up or out of Google's index, for your own personalized results on return visits to the Goog. As well as allowing users to edit, reorder, and remove search results to their liking, SearchWiki allows public commenting on search results, letting others know their opinions of individual web sites [insert scary music here]. Google's reasoning here is to make it easier for you to find the results that best suit your needs, with these custom-indexed results stored in your Google Account.
Well, for those of us with hearts already 100% dedicated to Google, we’ll now have to find other parts of ourselves to dedicate to our beloved search behemoth.
Of course, these pseudo-bookmarked, tailored search results fit nicely into our relatively recent, present day social-media-heavy virtual existence. In the same vein as Del.icio.us, Stumbleupon, Digg, Sphinn, Reddit, Technorati, and countless others, Google now allows us to share our thoughts amongst each other – on the good, bad, and the ugly of all the sites in the Interverse. But wait. Google tells us that
"The changes you make only affect your own searches." Well, we'll see how long it takes Google to revise that statement, because if those changes did affect public indexing, we might never have to leave Google for online bookmarking or social-networking communities at all. The comments you leave, however, will be public.
I don't know how I'd feel if Google did incorporate the voting system into their results. Actually, I think I do. Personally, I prefer to surf recommendation engines such as StumbleUpon, or other social networking sites such as Digg, when I feel like browsing social media.

The way I see it, the thing that sets the internet apart from all other forms of media is that the "hits" don't necessarily prevail in search engines; instead, the long tail of media, including the "misses", has as much of a chance of producing results in the SERPs – as long as the results are relevant. Granted, "relevant" results – while being based on indexing algorithms – do rely on some forms of indirect user input. One site linking to another, for example, counts as a vote in the eyes of Google's PageRank algorithm. I just don't know if I'm ready to wholeheartedly welcome the fact that 'what mainstream internet deems to be the best results' could affect my Google experience in such a direct way. In any case, for now at least, the changes we make only affect our own searches.
So Google, you still do it for me, but…
As Googey-baby states:
SearchWiki also is a great way to share your insights with other searchers. You can see how the community has collectively edited the search results by clicking on the ‘See all notes for this SearchWiki’ link.
I don't remember ever seeing an online graffiti-style feedback system that wasn't chock-full of guerrilla marketing. It'll be interesting to see how this one plays out.