Before the SEO world gets its collective panties in a knot about Google Instant’s potential for putting us all out of work, it would be a good idea to remember why people use search engines in the first place: we want information suited to our specific needs.
… “By predicting your search and showing results before you finish typing, Google Instant can save 2-5 seconds per search.”
Are those couple of seconds really saving time if the results are too broad? I don’t see on-the-fly results as an enhancement; I see them as just another distraction en route to optimal search results. The real beauty of search is in specificity.
If searchers want to save a second in the query stage of search, they can easily omit a word or two from their search terms; but this is obviously counterproductive, since a lack of search refinement costs time in the long run.
Google continues with another supposed benefit:
Instant Results: Start typing and results appear right before your eyes. Until now, you had to type a full search term, hit return, and hope for the right results. Now results appear instantly as you type, helping you see where you’re headed, every step of the way.
Suggestions can occasionally be helpful; however, if someone is searching on Google for ‘cheap car insurance in Detroit’, they aren’t going to stop at ‘cheap car’ or ‘cheap car insurance’ just because Google is streaming results on a keystroke-by-keystroke basis for shorter queries. Search is all about the long tail.
In my opinion, Google Instant encourages sheep mentality: how will Google know what people are searching for if, out of laziness, we click on their suggestions?
Ok, I’ll state the obvious
The more results that Google can throw in front of searchers, the more opportunity they have to display sponsored results. Enough said.
Why I’m not worried
Push marketing is quickly becoming a thing of the past, and in my opinion, Google Instant is unwanted noise—a distraction that I’ll be glad to have the option of ignoring or opting out of as an able search engine user.
And I don’t think I’m alone.
Without warning, Google has removed PageRank data from the ‘Diagnostics’ section of their Webmaster Tools (WMT). The majority of the SEO community once considered PageRank to be the quintessential metric to track, but the last few years have seen a steady decrease in the little green bar’s popularity.
Webmaster Trends Analyst Susan Moskwa commented in a recent thread on Google Webmaster Central that PageRank data was removed from WMT simply because they felt it was silly to display data that the Big G has been trying to wean webmasters off of for quite some time.
“We’ve been telling people for a long time that they shouldn’t focus on PageRank so much; many site owners seem to think it’s the most important metric for them to track, which is simply not true. We removed it because we felt it was silly to tell people not to think about it, but then to show them the data, implying that they should look at it. :-)”
Moskwa concluded her brief but to-the-point comment with a link to Google’s Webmaster Help FAQ on crawling, indexing & ranking, which stresses:
“…worry less about PageRank, which is just one of over 200 signals that can affect how your site is crawled, indexed and ranked. PageRank is an easy metric to focus on, but just because it’s easy doesn’t mean it’s useful for you as a site owner. If you’re looking for metrics, we’d encourage you to check out Analytics, think about conversion rates, ROI (return on investment), relevancy, or other metrics that actually correlate to meaningful gains for your website or business”
I agree 100%. PageRank isn’t the link popularity metrics panacea it once might have been. But as Barry Schwartz points out, why then is PageRank data still displayed in Google’s Toolbar – too silly for Google Webmaster Tools, but not too silly for Google Toolbar? What gives, Google?
Barry then goes on to ask:
“… how many people have the Google Toolbar installed compared to those who use Google Webmaster Tools? I assume a fraction of those use Google Webmaster Tools.”
Barry offers a possible explanation:
“Google cannot remove PageRank from the Toolbar, it is too much of their branding. No matter how much Matt Cutts and the Google search quality and webmaster trends team want it removed, I cannot see Google’s executives allowing it.”
I partially agree here. Yes, PageRank is a big part of Google’s branding, but this branding has made its mark primarily on search marketers and webmasters, at best. I don’t think Google would be too worried about hurting its brand by removing a once-relevant link popularity metric, especially if the majority of experienced search marketers have long since accepted that PageRank offers little if any value as an actionable or meaningful metric.
Marketing Pilgrim’s Andy Beal made a comment that’s as humorous as it is true:
“The problem is, Google’s not yet ready to remove the PageRank score from the toolbar installed on hundreds of millions of web browsers. This really leads you to conclude that the role of PageRank has been reduced to nothing more than a comfort blanket for SEO noobs.”
PageRank is Dead – Long Live PageRank?
My take here is that Google is “giving notice”, and perhaps PageRank is officially on its way out, one step at a time. Or at least this is what they’d have us believe – one less road map on a huge ‘let’s game the search engines’ safari.
I, for one, hope PageRank sticks around – at least in the shadows somewhere – for one reason only: that little green bar doesn’t do many things, but one thing it does do really quickly is indicate whether a site is suffering from a serious indexing problem. Andy feels the same way:
“I only use it as an early warning that a site is not behaving in Google’s index. Any green means ‘go.’ No green, means there’s something to investigate.”
Hang in there PageRank. It never was easy being green.
Okay, we like to find the hidden meaning behind what Google says (read: poke fun at Google), right?
Yesterday, Google’s non-profits team, Google Grants, published a post on their official blog about a training session they held recently in Washington, D.C. Their blog entry details the material that was taught to campaign managers on how to move sites up in natural search results.
We’ve taken this opportunity to test the beta version of our soon-to-be-patented Google PR Cynic translation application (GCTA), AKA Goognic™, on the Google Grants Team’s recent post.
Below, each passage the Google Grants Team wrote is followed by its Goognic™ translation.
At a recent non-profit training held in our D.C. office, I got the chance to teach a group of issue campaign managers the basics of “search engine optimization” (SEO), or how to earn a spot for your content that is closer to the top of Google’s natural (left-hand side) search results
At a not-immediately-profitable training session, our Google Propaganda Team got the chance to explain how to organize website content in order to help the successful targeting of Adwords campaigns.
It was a rewarding experience because we were able to take what’s often a technical conversation and make it feel like something everyone could (and should) do.
Our strategy here was two-pronged. From those who achieve results by following the guidelines, Google shall reap the rewards of better-organized sites added to the index (we’ll profit when they switch to AdWords, once their site gets buried on page 6 a couple of “updates” down the line). Those who are overwhelmed by the whole “SEO thing” will realize there is really only one way to go. Did I mention AdWords?
Indeed, when most people hear the words “search engine optimization,” they figure it’s too technical for them or that it doesn’t apply to them. But if you’re running a long-term education or awareness campaign, you need to know how to improve the chances that interested users will find your information through natural search results. It’s just as important as learning how to use your Google Grant effectively.
See previous section. Oh, and by the way: the AdWords store called. They want your rankings back.
Fortunately, much of what you can accomplish with SEO doesn’t require any programming or technical skills, but it does require a big-picture awareness of your issue. Because ultimately, you’re not trying to rise to the top of any one search results page, but rather to make your site more relevant to the whole search picture, which means designing your site, sections, and sub-pages with the most high-demand search terms related to your issue in mind
Don’t be intimidated by all this SEO stuff, because if after all your hard work your site still doesn’t rank, well hey, that’s ok, because an Adwords campaign will probably work REAL sweet now!
Doing well in high-demand search results pages requires that you first know what search terms or keywords are most popular. Take concepts and terms you discuss on your site and test them against related terms using tools like Google’s Keyword Tool and Insights for Search. Make sure you’re developing individual pages centered around what people are looking for, using the language they use
Let’s get to know some of the tools you’ll be needing to run your first PPC campaign! Sktool, Analytics, Google’s Keyword Tool. Mmmmm, do you smell what the Goog is cooking?
Use these high-demand keywords where they accurately describe your content, especially in page titles, section headings, and in URLs. If you have lots of images or interactive graphics, make sure your most important content appears in text too, because the Googlebot doesn’t read images.
Googlebot has been able to “read” images for over a year. Nobody’s seemed to notice so far, so we’ll hold out a little longer before telling you; we don’t want to have to deal with curtailing a landslide of image sculpting. Well, not until we endorse it first. (Ok, that was cheap, sorry, I couldn’t resist.)
Finally, understand that the number and quality of other sites that link to your content determines much of your ranking in search results. Make sure you know the other online players on your issue, and encourage them to link to you. Starting a blog or Twitter feed is a great way to keep users abreast of the latest updates to your site and encourages them to link to you too
We’ll be acquiring Twitter soon. Get ready to transfer all your Twitter profiles to your Google accounts, suckahhhhs!
Ok ok, maybe I went a little far this time. But how could I resist? Google has a working++ business model, and I respect that. They’ve done many great things for the search industry, and will continue to do so, while making a profit (go figure). But who says we can’t entertain the troops in the meantime?
In case you passed on clicking through to the Google Grants site, here are the slides from the recent training session. Enjoy!
In what is arguably the biggest SEO news so far this year, Matt Cutts announced yesterday that using nofollows no longer prevents the loss of a site’s or page’s link juice – and hasn’t for over a year!
When the rel=nofollow attribute was introduced in 2005, it was meant as an annotation for not “vouching” for a link. Virtually all forum and blog pages have nofollow attributes associated with visitor-generated content, as a means of instructing search engines not to follow (crawl) these untrusted user comments or guestbook entries. Not long after the introduction of rel=”nofollow”, we learned to minimize leakage of our sites’ total allocated PageRank by ‘sculpting’ PR with the attribute and pushing it to more important pages of our sites. We can now cross this technique off the list.
In Google, nofollow links don’t pass PageRank and don’t pass anchor text. However, we now find out through Matt Cutts (who else) that the link juice assigned to an outgoing nofollow link is no longer conserved and divided among the other links on the page in question.
Old PageRank Algorithm
2 separate cases of a page with “x” amount of available link juice.
As a somewhat simplified example: in the original PageRank algorithm, a page of PR10 would have passed PR2 each to 5 regular links (fig. 1). The same page would have passed PR2.5 each to 4 regular links and PR0 to the nofollow link in fig. 2.
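That old division of juice can be sketched in a few lines of Python. This is only an illustration of the simplified numbers from the figures above, not Google’s actual algorithm:

```python
def juice_per_followed_link_old(page_juice, followed_links):
    """Old (pre-change) behavior, simplified: nofollowed links were left
    out of the split, so the available juice was divided only among the
    links that were actually followed."""
    return page_juice / followed_links

# Fig. 1: PR10 page with 5 regular links -> PR2 each.
fig1 = juice_per_followed_link_old(10, 5)

# Fig. 2: same page, 4 regular links + 1 nofollowed link -> PR2.5 each
# to the 4 followed links, and PR0 to the nofollowed one.
fig2 = juice_per_followed_link_old(10, 4)
```

In other words, every link you nofollowed made each remaining followed link a little stronger – which is exactly why sculpting worked.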
New PageRank Algorithm
Page with “x” amount of available link juice
As you can see in fig. 3, nofollowing a link no longer passes extra juice through to the remaining live links. Many SEOs are now considering cutting down substantially on outgoing links, or going back to previous PR sculpting methods such as:
- Embedding robots.txt-blocked iframes containing certain links
- Embedding Java, Flash or other non-parseable applications to contain certain links
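For contrast, the new behavior shown in fig. 3 can be sketched the same way (again a simplified illustration, not Google’s actual algorithm): juice is split across all links on the page, and the share assigned to the nofollowed link simply evaporates.

```python
def juice_per_link_new(page_juice, total_links):
    """New behavior, simplified: juice is divided across ALL links on
    the page, followed or not; the share that lands on nofollowed
    links is lost rather than redistributed."""
    return page_juice / total_links

# Fig. 3: PR10 page with 4 regular links + 1 nofollowed link.
share = juice_per_link_new(10, 4 + 1)  # PR2 per link
passed_on = share * 4                  # PR8 actually flows out
evaporated = share * 1                 # PR2 vanishes with the nofollow
```

So under the new model, the nofollow costs the page PR2 outright instead of boosting the remaining links.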
Many SEOs are disillusioned by the fact that using internal nofollows was advocated as best practice by the powers that be at the Big G, and now feel they’re being told the opposite. There will be a lot of speculating, calculating, and theorizing in the SEO community on this one in the upcoming weeks. I’ll be back with news on this one soon enough, because I know there’ll be some.
On March 4, Matt Cutts responded to questions from concerned SEOs and webmasters regarding Google’s apparent shift towards pushing known brands in the Search engine results.
The basic observation made by the SEO community at large is that since February 2009, search results for generic terms such as “car” or “laptop” have been favoring the big brand names more than ever before. Obviously, branded companies boast digital marketing budgets and consumer awareness that greatly surpass those of lesser-known brands, but many, including Aaron Wall, speculate that this month saw an unprecedented spike in the rankings of brand heavyweights.
In his video response, Matt starts by explaining that while there was a recent change in the algorithm, it was not a major update, but rather just one of the 300–400 simple algorithm changes that Google performs annually. Some of the staff over at the Googleplex have nicknamed this algorithm tweak “Vince’s Algorithm Change”, in reference to one of the Google engineers extensively involved in its completion.
Matt goes on to explain that the concept of “Brand” does not really exist in Google’s indexing system. Instead, the notion of trust (TrustRank) is strengthened, relevant inbound links are discussed as usual, and Matt also touches on some of the other standard Google guidelines.
Take a look at the video. Oh, and for some humor, watch as Matt accidentally mixes his words at 2:48 in. “The um, net update, the net uh upshot of this change is pretty simple…”. Just kidding Matt!
…off to strengthen my brand.
Chrome Brought Us More Speed
Features such as hidden class transitions, dynamic code generation, and precise garbage collection help Chrome outperform its peers by about 2:1 in speed. Benchmark tests compared the browser’s speed with that of Safari, Firefox 3, Internet Explorer 7, and Internet Explorer 8.
Chrome Brought Us More Stability
As with many others, my main interest in Chrome lay in the fact that it’s a multi-threaded browser. Single-threaded browsers must be completely restarted if a problem site crashes your current tab or window, but with Chrome’s Task Manager, not only can you see which sites are using the most resources – including memory, processor and data transfer – but you can also terminate problem threads, saving you from having to restart your browser in these cases.
Is Chrome Really Ready to Lose Its Training Wheels?
Four days ago, on December 11 – only 100 days after Google released the beta version – the Chrome browser was officially stripped of its beta label. By now, your beta version will have been automatically updated to v1.0, bringing you the improvements and bug fixes afforded by 100 days of user feedback and automatic crash report analysis.
Chrome v.1 even faster
Other improvements in Chrome’s Official Release:
- Improved bookmarking features (a top users request)
- A more user-friendly privacy control panel
- Improved video and audio plug-in support
So the Bugs Are Mostly Fixed – But Where’s the Rest of the Browser?
I abandoned Internet Explorer as my browser of choice years ago in favor of the much more standards-compliant Firefox and Opera. They were more secure, faster (once loaded), and had all-around better development tools. Enter Firefox extensions. If you haven’t used any of the many Firefox extensions – for example, the Web Developer Toolbar – you’re missing out. Not just bells and whistles: some serious functionality exists in hundreds of Firefox extensions.
I’m sure that Chrome will eventually support the addition of useful extensions, and who knows, maybe even outdo Firefox in that department one day; but no RSS reader? In my opinion, Chrome isn’t ready to be freed of its beta status.
Yesterday, Google launched SearchWiki, the biggest news in Web 2.0 since sliced Wikipedia. Once you’re logged into your Google account, SearchWiki allows you to move search results up in, or out of, Google’s index for your own personalized results on return visits to the Goog. As well as allowing users to edit, reorder, and remove search results to their liking, SearchWiki allows public commenting on search results, letting others know their opinions on individual websites [insert scary music here]. Google’s reasoning here is to make it easier for you to find the results that best suit your needs, with these custom-indexed results stored in your Google Account.
Well, for those of us with hearts already 100% dedicated to Google, we’ll now have to find other parts of ourselves to dedicate to our beloved search behemoth.
Of course, these pseudo-bookmarked, tailored search results fit nicely into our present-day, social-media-heavy virtual existence. In the same vein as Del.icio.us, StumbleUpon, Digg, Sphinn, Reddit, Technorati, and countless others, Google now allows us to share our thoughts with each other – on the good, the bad, and the ugly of all the sites in the Interverse. But wait. Google tells us that:
“The changes you make only affect your own searches.”
Well, we’ll see how long it takes Google to revise that statement, because if those changes did affect public indexing, we might never have to leave Google for online bookmarking or social-networking communities at all. The comments you leave, however, will be public.
I don’t know how I’d feel if Google did incorporate the voting system into their results. Actually, I think I do. Personally, I prefer to surf recommendation engines such as StumbleUpon, or other social networking sites such as Digg, when I feel like browsing social media. The way I see it, the thing that sets the internet apart from all other forms of media is that the “hits” don’t necessarily prevail in search engines; instead, the long tail of media, including the “misses”, has as much of a chance of producing results in the SERPs – as long as the results are relevant. Granted, “relevant” results – while being based on indexing algorithms – do rely on some forms of indirect user input. One site linking to another, for example, counts as a vote in the eyes of Google’s PageRank algorithm. I just don’t know if I’m ready to wholeheartedly welcome the fact that ‘what mainstream internet deems to be the best results’ could affect my Google experience in such a direct way. In any case, for now at least, the changes we make only affect our own searches.
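That “links as votes” idea is the heart of the original PageRank formula, and a toy version fits in a few lines of Python. This is a teaching sketch over a made-up three-page graph, nothing resembling Google’s production system:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power iteration over a {page: [outbound links]} graph."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if not outs:  # dangling page: spread its rank everywhere
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                for out in outs:  # each link is a "vote" for its target
                    new_rank[out] += damping * rank[page] / len(outs)
        rank = new_rank
    return rank

# 'b' and 'c' both vote for 'a', so 'a' ends up with the highest rank.
ranks = pagerank({"a": ["b"], "b": ["a"], "c": ["a"]})
```

Notice that ‘c’, which nobody links to, ends up near the rank floor – the “vote” has to come from somewhere.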
So Google, you still do it for me, but…
As Googey-baby states:
SearchWiki also is a great way to share your insights with other searchers. You can see how the community has collectively edited the search results by clicking on the ‘See all notes for this SearchWiki’ link.
I don’t remember ever seeing an online graffiti feedback system that wasn’t chock-full of guerrilla marketing. It’ll be interesting to see how this one plays out.
Yesterday evening, Google announced the release of their Search-based Keyword Tool (beta) (SBKT), a nice little addition to their ever-expanding suite of free internet marketing and keyword research applications. Google’s SBKT suggests terms that are semantically related to the content of any provided URL – ones that aren’t currently part of an AdWords campaign associated with that particular site.
If you don’t run an AdWords campaign on the website you’re doing KW research for, the tool can still come in handy by providing you with a list of related terms – similar to Google’s regular Keyword Tool, but with somewhat broader yet highly relevant results. If, however, you are logged into your AdWords account when you perform the search, SBKT will display only the keywords that you aren’t already advertising for.
For each keyword or keyphrase displayed in the results, columns showing monthly search volume, competition, and suggested bids are offered, as they are in some of Google’s other tools. Some extremely valuable information is offered in the data column that displays what percentage of the time you’re showing up in search ad spots for the AdWords campaign you’re running.
SEOs will be glad to know that Google says the Search-based Keyword Tool doesn’t generate keyword ideas from AdWords accounts associated with any websites, and that data is derived from
aggregated and anonymous Google search data from Google users in several different countries.
Comments, as usual, are welcomed. Let us know what you think of Google’s Search-based Keyword Tool!
Until this week, the closest Google had ever come to publishing anything on recommended search engine optimization practices was the well-known and rather vague Google Webmaster Guidelines. Just the fact that they used the jack-of-all-trades-but-master-of-none term ‘webmaster’ hints at the rather limited value of the document to anyone who’s been involved with the industry for, say, a few days. The guidelines are a great starting point for someone new to web development, but aside from advice and hints posted by Matt Cutts on his and others’ blogs, the public has never had any Googficial (how’s that for a word – come on, Oxford and Webster’s, I dare you) documentation on how to get better website visibility in Google.
Two days ago, however, Google’s SEO Starter Guide appeared on the Official Google Webmaster (shudder – there’s that word again) Central Blog. In terms of information value, I’d say the guide is a step up from Google’s Webmaster Guidelines, but not by much. New to SEO or web development and need a reliable source of information on how to make your site more search engine friendly? Need information you can trust, since you’re new to the game? Well, here it is. Which brings me to the real value of these guidelines as far as SEOs and Internet marketers go.
Trust. We know that experience with the basic SEO practices outlined in the SEO Starter Guide – and far beyond them – is what really makes the difference in a competent Internet marketing professional, and if there is even a single piece of information in this guide that you aren’t familiar with, you might want to reassess your worth to your clients. However, the basics of SEO, now officially outlined by Google in their guide, will help take some of the mystery out of SEO for business owners wondering whether what they’re paying for is actually worth it. Nothing in this guide is new to us, except that Google has finally put its stamp of approval on the most basic SEO principles that we’ve all been using for a long time – and they finally refer to it as SEO as well. Potential clients often want to understand what basic steps you’re taking to help their internet presence in exchange for their hard-earned money. Just the fact that Google has official documentation on the basics of SEO is a step in the right direction. Thanks, Google.
By now it’s widely known that Barack Obama’s superior internet marketing campaign surpassed McCain’s online strategy by fostering an efficient online community early on. Both campaigns took advantage of online behavioral targeting, using cookies set on visitors’ browsers to track what types of sites they visited and displaying targeted ads to them on subsequent visits. BarackObama.com’s much higher traffic was complemented by social media platforms such as Facebook, MySpace, YouTube, and wikis to organize volunteers, as well as by outreach tailored to various demographics: text messaging for younger voters and succinct emails for older ones.
Obama’s web team not only raised incredible sums in $30 and $50 increments, but also maximized their fundraising efforts by running multivariate conversion tests to optimize donations and minimize bounce rates.
Obama’s Donations page utilized the free Google Website Optimizer to test which of several t-shirt gift variations performed best on donations of $30 or more – and, on the site’s home page, displayed variations of campaign images in order to measure bounce rates.
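The core of that kind of test is just comparing conversion rates between variations. Here’s a minimal, stdlib-only sketch using a two-proportion z-test – the donation counts below are hypothetical, and Website Optimizer’s own statistics were more involved than this:

```python
from math import erf, sqrt

def conversion_ab_test(conv_a, n_a, conv_b, n_b):
    """Compare two variations by conversion count (normal approximation).
    Returns (rate_a, rate_b, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical numbers: variation A converts 120 of 1000 visitors,
# variation B converts 150 of 1000.
rate_a, rate_b, p = conversion_ab_test(120, 1000, 150, 1000)
```

A small p-value suggests the difference between the two t-shirt (or image) variations is unlikely to be chance, which is exactly the signal you’d use to pick a winner.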
It seems that in political campaign Internet marketing, change is here to stay!