Sorry I haven’t made any posts in a while…we recently took on a big project AND moved hosts on SEO Book (it’s currently on a speedy quad-core server), and I wanted to keep activity to a minimum around the time of the move.
Google recently announced adding breadcrumbs to the search results for some sites that offer hierarchical breadcrumbs in their navigation. The display looks like so:
Each breadcrumb is a clickable link to the associated page (which could increase traffic to the target site in some cases), but the initial implementation is a bit sloppy for a couple of reasons:
- Google’s initial implementation shows the hierarchy (and places more emphasis on it) rather than listing the current page…the net effect is making the result look less relevant UNLESS the breadcrumbs are really tightly associated with each other and/or the site covers a small, tight niche
- when people look at the search results, they scan them and match patterns. The absence of the current page hurts perceived relevancy, and even when a search keyword appears in the breadcrumb it is not highlighted
As an example of how far astray the above 2 points can go, check out the following listing for Joost’s great WordPress SEO guide.
While seeing the site structure might be nice…the exact reason people use search is that they don’t want to drill down through someone’s site structure…they want the most relevant thing shown in the search results.
So did Google do this for relevancy? It is hard to believe they did given that they don’t list the current page and employ no bolding.
Perhaps they want to make the results harder to scrape? Or they wanted to give advertisers even more options with the ads (many new ad formats hit the organic search results first)? Or maybe, as John Andrews mentioned, “Google would LOVE to eliminate the URL altogether. Just another try…”
Do I recommend using breadcrumbs? Historically I have, but if Google does not fix the above issues it will likely end up costing publishers some perceived relevancy, and in some cases I might not recommend using them except on small sites or those with tight and descriptive breadcrumb structures. And on larger sites they might make more sense on category listing pages than on item detail pages.
Posted by R.W. Casandra Date: Wednesday, November 25, 2009
LinkedIn has opened up its platform to developers. It can be accessed at developer.linkedin.com.
“Over fifty million users entrust their professional identities and relationships with LinkedIn, helping build LinkedIn into the largest global professional network today,” says LinkedIn’s Adam Nash. “However, professionals around the world use a wide variety of applications and Web sites to get their work done, and they have spoken loud and clear that they want the ability to leverage their professional networks wherever they work.”
Now developers can integrate LinkedIn into their business applications. LinkedIn’s developer site has APIs and widgets.
“Over the past months, LinkedIn has supported integrations with some of the most prominent and critical software applications in the enterprise,” says Nash. “Partnerships with companies like IBM, Blackberry (Research in Motion), and most recently Microsoft, have given us time to invest in both functionality and scalability of the platform.”
Developers interested in using LinkedIn in their apps need only fill out a form at the site. The LinkedIn platform leverages the open OAuth standard, so integrations should be that much simpler.
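For developers curious what an integration might look like, here is a rough sketch of signing a LinkedIn API call with OAuth in Python. It assumes the third-party requests_oauthlib package, and the endpoint path and all credentials below are placeholders for illustration only – check developer.linkedin.com for the real details.

```python
# Minimal sketch of calling a LinkedIn REST endpoint with OAuth 1.0a.
# The endpoint URL and all credentials are illustrative placeholders,
# not LinkedIn's documented values.
from requests_oauthlib import OAuth1Session

linkedin = OAuth1Session(
    client_key="YOUR_API_KEY",                 # issued when you register your app
    client_secret="YOUR_API_SECRET",
    resource_owner_key="MEMBER_ACCESS_TOKEN",  # obtained via the OAuth flow
    resource_owner_secret="MEMBER_ACCESS_TOKEN_SECRET",
)

# Hypothetical profile call; the exact path comes from LinkedIn's API docs.
response = linkedin.get("https://api.linkedin.com/v1/people/~")
print(response.status_code, response.text[:200])
```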
On a related note, Twitter client TweetDeck is already utilizing the LinkedIn platform. They just announced that you can view or take action on your LinkedIn network updates from within the TweetDeck application.
It should be interesting to see the kinds of apps that start taking advantage of LinkedIn’s APIs. This could turn out to be a very significant event for increasing business networking, and even matching prospective job candidates with jobs.
Posted by R.W. Casandra Date: Tuesday, November 24, 2009
In this day and age, you pretty much can’t ignore mobile users. The rate at which consumers are accessing the web via mobile devices is growing rapidly, largely thanks to the increasing popularity and production of smartphones.
Just having a mobile site isn’t even enough. Sure, it’s a great start, but you have to start thinking about a mobile site just as you would a regular site. Can people find it? Just because you have a good ranking in Google does not mean that your mobile site has a good ranking in Google’s mobile search engine, or is even indexed at all.
Google recently shared a few important tips for making sure your mobile site is being indexed in Google’s Mobile Search.
1. Create a mobile sitemap and submit it to Google so Google knows it exists. This can be done using Google Webmaster Tools, just like with a regular sitemap.
2. To make sure Googlebot-Mobile can access your site, allow any User-agent to access it.
“You should also be aware that Google may change its User-agent information at any time without notice, so it is not recommended that you check if the User-agent exactly matches ‘Googlebot-Mobile’ (which is the string used at present),” says Jun Mukai, a software engineer on Google’s mobile search team. “Instead, check whether the User-agent header contains the string ‘Googlebot-Mobile’. You can also use DNS Lookups to verify Googlebot.”
3. Check that your mobile-friendly URLs’ DTD (Document Type Definition) declaration is in an appropriate mobile format such as XHTML Mobile or Compact HTML. A rough sketch of all three checks follows below.
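Here is a rough sketch, in Python, of what those three tips can look like in practice. The file contents, hostnames, and tokens are illustrative assumptions, not values taken from Google’s post.

```python
import socket

# Tip 1: a minimal mobile Sitemap (the mobile namespace and the empty
# <mobile:mobile/> tag follow Google's mobile Sitemap format).
MOBILE_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://m.example.com/</loc>
    <mobile:mobile/>
  </url>
</urlset>
"""

# Tip 2: check for the substring rather than an exact match, then verify
# the crawler with a reverse DNS lookup followed by a forward lookup.
def is_googlebot_mobile(user_agent, remote_ip):
    if "Googlebot-Mobile" not in user_agent:
        return False
    try:
        host = socket.gethostbyaddr(remote_ip)[0]
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        return socket.gethostbyname(host) == remote_ip
    except OSError:
        return False

# Tip 3: serve a mobile DTD such as XHTML Mobile Profile on mobile URLs.
XHTML_MOBILE_DOCTYPE = ('<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN" '
                        '"http://www.wapforum.org/DTD/xhtml-mobile10.dtd">')
```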
If you run both a regular site and a mobile version of it, there is a possibility that the wrong version will show up in the wrong search results. There are ways you can prevent this.
“When a mobile user or crawler (like Googlebot-Mobile) accesses the desktop version of a URL, you can redirect them to the corresponding mobile version of the same page,” explains Mukai. “Google notices the relationship between the two versions of the URL and displays the standard version for searches from desktops and the mobile version for mobile searches.”
If you do use a redirect, you should make sure the content on the corresponding URLs matches as closely as possible, because Google watches for sites that abuse the practice to try to boost their rankings. Google says this should be avoided at all costs, so you can probably expect to be penalized for such an action.
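For illustration, here is a minimal sketch of the redirect approach as a tiny WSGI app. The hostnames and the list of mobile User-agent tokens are assumptions, not an official or exhaustive list.

```python
# A minimal sketch of redirecting mobile visitors from a desktop URL
# to its mobile equivalent. Hostnames and tokens are invented examples.
MOBILE_TOKENS = ("Googlebot-Mobile", "iPhone", "Android", "Opera Mini")

def redirect_mobile(environ, start_response):
    user_agent = environ.get("HTTP_USER_AGENT", "")
    path = environ.get("PATH_INFO", "/")
    if any(token in user_agent for token in MOBILE_TOKENS):
        # Send mobile visitors (and Googlebot-Mobile) to the equivalent
        # mobile URL; the content on both URLs should stay in sync.
        start_response("302 Found", [("Location", "http://m.example.com" + path)])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html><body>Desktop version</body></html>"]
```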
Another way you can make sure a user is pointed to the right version of your site is simply to provide a link. In fact, that is what Google itself does. If you access the mobile version of Google, you will find a link to the desktop version.
Still another way is to switch content based on the User-agent, so mobile users automatically see the mobile version and desktop users see the desktop version, even though both are accessing the same URL.
Google warns, however, that if you use this method, there is a chance that if you fail to configure your site correctly, it could be mistaken for cloaking, which you can be penalized for.
“To remain within our guidelines, you should serve the same content to Googlebot as a typical desktop user would see, and the same content to Googlebot-Mobile as you would to the browser on a typical mobile device,” says Mukai. “It’s fine if the contents for Googlebot are different from the one for Googlebot-Mobile.”
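Here is a similarly hedged sketch of the User-agent switching approach: the same article content is rendered into either a desktop or a mobile template, so each crawler sees what its corresponding users would see. The templates and tokens are invented for illustration.

```python
# Sketch of serving one URL with two presentations: identical content,
# different markup, chosen by User-agent. Tokens are examples only.
DESKTOP_TEMPLATE = "<html><body><h1>{title}</h1><p>{body}</p></body></html>"
MOBILE_TEMPLATE = (
    '<!DOCTYPE html PUBLIC "-//WAPFORUM//DTD XHTML Mobile 1.0//EN" '
    '"http://www.wapforum.org/DTD/xhtml-mobile10.dtd">'
    "<html><body><h1>{title}</h1><p>{body}</p></body></html>"
)
MOBILE_TOKENS = ("Googlebot-Mobile", "iPhone", "Android", "Opera Mini")

def render_page(user_agent, title, body):
    is_mobile = any(token in user_agent for token in MOBILE_TOKENS)
    template = MOBILE_TEMPLATE if is_mobile else DESKTOP_TEMPLATE
    # Same content either way -- only the markup changes.
    return template.format(title=title, body=body)
```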
Have you taken the necessary steps to ensure you are being indexed in Google’s mobile search engine? Have you been left out due to cloaking-related confusion? Discuss here.
Posted by R.W. Casandra Date: Tuesday, November 24, 2009
Google’s Matt Cutts discussed how the search engine handles sites that are “in the cloud” and how their listings are affected. Matt’s explanation was a response to the following user-submitted question:
Can moving my website to “the cloud” harm my listings? Say my server’s in Germany and I move the website to Google’s App Engine or Amazon S3. Does this harm my listings for German results – or is it enough to set the “geographic target” in GWT to Germany?
Matt broke the question down into separate parts to answer them. First, he took on the part about moving a site to “the cloud” harming the users’ listings. His answer for this is basically that Google doesn’t even know if your site is in the cloud, so it can’t use that information to affect listings.
“We don’t know what is happening on the side of your web server. Your web server could be running Perl, PHP, Python, or Ruby on Rails,” said Cutts. “All we know is what the web server returns. So your web server could be running code that would go talk to Amazon’s cloud or Appspot or anywhere else in the cloud, but we wouldn’t even know that. We don’t even know whether a page is dynamically created or statically created. All we know is what the web server sends back.”
He says if your site is talking to the cloud behind the scenes, there is no way for any search engine or bot to know about that. Watch the video to hear Matt’s explanation of the second part of the user’s question.
Posted by R.W. Casandra Date: Monday, November 23, 2009
In the past, many SEOs have referred to the results on the left side of the page as the organic search results and the pay-per-click / AdWords results as those on the right side of the page. As Google has grown more aggressive with promoting vertical/universal search, I think a better way of defining the portions of the search result page is ABOVE THE FOLD and BELOW THE FOLD.
As recently as yesterday, Google stripped the phone numbers off of non-sponsored map listings, even if you were doing a navigational search! That suggests the primary role of the maps is filler content (rather than utility).
Update: it looks like Google claimed the phone number removal was a bug, but the timing is odd: the bug appeared at the same time they started selling premium local ads that appear on the regular search results.
So let’s redefine these search result pieces as they are…
- AdWords Ads: the ads at the top of the search results and those which run down the right rail of the search results.
- Universal Search Results: filler stuff to put in the search results to a.) drive the organic search results lower down the page, while b.) driving additional incremental click volume to other Google properties which display ads.
- Organic Search Results: the results on the search result page that are determined algorithmically and appear below the fold. On some larger monitors a listing or 2 from this category may appear above the fold, at least for the time being.
In the future A LOT of verticals (movies, music, books, news, ecommerce, travel, etc.) are going to look more and more like local, where Google in some cases has at least 15 ads above the fold AND filler pushing down the organic search results…quietly building a backdoor portal that sends Google the second click if they were not able to monetize the first one.
To me this screams the importance of working the tail of search, because the more obscure a search query is the greater the risk to Google if they pollute it with junk from vertical search databases.
As Google gets stingier with their traffic that will increase the importance of relationship development and lead capture, as well as developing distribution channels outside of Google.
This new search result layout also highlights the importance of being #1 for your most important keywords…if only 1 result is going to show above the fold, then there is little point in being #2. So that will really help/force you to decide which words are practical to target and which are not. If you have some valuable #3 or #4 listings, you had better start marketing them today before they end up below the fold tomorrow.
The last important thing this search result signals is the importance of increasing conversion rates and lifetime customer value…if/when search becomes pay-to-play in your market, will you still be able to compete? If not, what can be done to help bridge that gap?
Posted by R.W. Casandra Date: Monday, November 23, 2009
YouTube’s Partner Program has, as a general rule, allowed in only content creators who produce original and heavily viewed clips on a consistent basis. This was a smart and safe approach. But it’s an approach that also excluded a lot of very popular one-off videos, and YouTube’s now seeking to correct the problem.
A post on the YouTube Biz Blog announced this afternoon, “[W]e’re extending the YouTube Partnership Program to include individual popular videos on our site. Now, when you upload a video to YouTube that accumulates lots of views, we may invite you to monetize that video and start earning revenue from it.”
This move makes sense for several reasons. First, you have to consider how content creators will react. The ones who are contacted by YouTube will no doubt be thrilled to receive money. The ones who aren’t will at least know that the possibility exists, and may make more and better videos as a result. That, in turn, should benefit the average YouTube user.
Then there’s the corporate perspective. Google has been trying to monetize YouTube for years, and by extending the Partner Program, it should be able to sell more ads and bring the site closer to profitability.
Anyway, here’s one last detail that’s both a drawback and a hint at the next step: the YouTube Biz Blog post stated, “For now individual video partnerships are available only in the United States, but we hope to roll these out internationally soon.”
Posted by R.W. Casandra Date: Tuesday, August 25, 2009
We all know that social media is “where it’s at” these days. People are spending more and more of their time on social networking sites. Many are checking their Facebook pages and Twitter accounts before even checking their email (or even getting out of bed in some cases).
Real-time search, while still in its infancy (if not in utero), is on the rise, and people are searching for up-to-the-minute, what’s-happening-right-now results for many of their everyday queries. Real-time search isn’t a replacement for Google, it’s a complement. That’s why Google knows it needs to gravitate in its direction and offer results that are as fresh as possible, particularly when relevant.
Google already has a “sorted by date” feature (under “recent results”) in its list of search options that users can use to customize their search experience. The jury is still out on how frequently these search options are and will be used, but that option’s there, and chances are that it will get better at indexing fresh content. Chances are also that more and more people will realize that option is available. It hasn’t been around that long yet.
People aren’t just searching on Google and the traditional search engines. They’re searching on social networks too. You know about Twitter’s real-time search, and Facebook recently rolled it out too. Facebook also acquired FriendFeed, which utilizes pretty much every other social network out there.
I’ve already written about why social media is only going to become more important to search, but it’s not just about search. It’s about the way people obtain, absorb, and relay information. They’re doing it on Twitter. They’re doing it on Facebook. They’re doing it on MySpace, and they’re doing it on plenty of other networks (and believe it or not, they’re still doing it through email too).
These are the reasons your content should be as shareable as possible. If you want more people to see it, word of mouth is just as important as search. Social media is the word of mouth of the web.
Include buttons and icons that make your content easy to share on social networks. “Post to Facebook,” “Retweet,” “Digg,” etc. are all buttons that can put your content a click away from going viral. Obviously the content has to be up to par for this to work.
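As a rough illustration, a share button usually boils down to a link into each network’s share endpoint with your article URL filled in. The URL patterns below are assumptions based on the formats commonly used at the time (Facebook’s sharer.php, Twitter’s status prefill, Digg’s submit page), so confirm them against each network’s current documentation.

```python
# Sketch of building share links for an article. URL patterns are
# illustrative assumptions, not guaranteed current endpoints.
from urllib.parse import urlencode

def share_links(article_url, title):
    return {
        "facebook": "http://www.facebook.com/sharer.php?" + urlencode({"u": article_url}),
        "twitter": "http://twitter.com/home?" + urlencode({"status": f"{title} {article_url}"}),
        "digg": "http://digg.com/submit?" + urlencode({"url": article_url, "title": title}),
    }

print(share_links("http://example.com/post", "Example post"))
```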
This shareability can work in your favor down the road as well as in the present. Even if an article is a month or a year old, if it is still relevant, someone may happen across it and tweet it or share it some other way. This will not only place your content within the streams of that person’s followers/friends, but also on the radar of any related real-time searches taking place.
Real-time search isn’t just about what’s happening right now. It’s also about what people are talking about right now. It’s up to you to provide content that people will still be talking about later. Giving easy access to sharing features will only ease the way.
How important do you think real-time search is to the future of online marketing? Share your thoughts.
Posted by R.W. Casandra Date: Tuesday, August 25, 2009
Retweeting is a phenomenon that has taken the Twitter world by storm. The concept began when somebody added the letters “RT” to somebody else’s tweet and posted it as their own. The idea caught on on a massive scale, and now there are services that utilize retweeting as the basis of their entire purpose. “Some of Twitter’s best features are emergent—people inventing simple but creative ways to share, discover, and communicate. One such convention is retweeting,” says Twitter Co-founder Biz Stone.
As a Twitter user, what is your opinion of the concept of retweeting? Share with WebProNews readers.
Disclaimer: If you are not a Twitterer, you may be unfamiliar with the concept of retweeting. Basically, when someone updates their status on Twitter, that is called a tweet. When someone likes that status and wants to share it with others, they will typically add “RT” (for ReTweet) and the original user’s name, and post the same update. This is usually done with tweets containing links, so naturally it provides a good, viral means of link exposure.
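For the curious, the manual convention amounts to little more than string formatting. Here is a toy sketch (the handle and text are made up) that builds an “RT”-style tweet and trims it to the 140-character limit.

```python
# Sketch of the manual "RT" convention: prefix the original author's
# handle and trim to Twitter's 140-character limit. Example data only.
def build_retweet(author_handle, original_text, limit=140):
    retweet = f"RT @{author_handle}: {original_text}"
    if len(retweet) > limit:
        retweet = retweet[:limit - 1] + "\u2026"  # ellipsis marks truncation
    return retweet

print(build_retweet("webpronews", "Retweeting is taking the Twitter world by storm http://example.com/article"))
```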
Tweetmeme has been around for a while, offering a service to content providers, where they can add a button onto an article page that lets a reader easily tweet a link to that article on Twitter. It then counts these tweets, which become retweets, just like similar buttons you’ve probably seen for Digg. The more retweets that are registered on that button, the more interesting the content looks at first glance. The reason for this is that theoretically, if a user sees the article has 2,000 tweets, as opposed to 2, they can assume that a lot of people found the article interesting or informative, and will be more likely to continue reading. It’s kind of like the concept behind comments. Articles that display a large amount of comments are likely to catch readers’ eyes for the same reason. The Huffington Post discussed this concept in a recent interview with WebProNews.
This week, a company called Mesiab Labs launched a service that is practically identical to Tweetmeme, at Retweet.com. Obviously, this company is hoping to cash in on the popular concept, while injecting a powerful brand to go along with it. The timing of this is interesting because Twitter recently announced its own retweeting plans in an initiative called “Project Retweet,” which will presumably see a retweet button at Twitter.com (many consider this long overdue), and retweet functionality right in the Twitter API, opening up a lot more retweeting possibilities in third-party Twitter apps.
But back to why retweeting is useful to businesses. The attention-grabbing effect of the retweet button on a piece of content is just one aspect. Another is, of course, the promotion the content provider sees from a substantial number of retweets. They’re viral by nature, and in the best-case scenario, they can drive a ton of traffic to the content.
Famed blogger Robert Scoble started an interesting discussion on FriendFeed about which is better: the retweet or the “like” feature on Facebook and FriendFeed. While I’m not going to get into all of the reasons why one is better than the other, Scoble and other participants in the conversation made a number of good points about the pros and cons of retweets. Let’s look at some of those.
Pros:
- Retweets are viral
- Retweets show up as top-level items in FriendFeed
- As opposed to a Facebook “like,” a retweet is shared with everyone
- Retweets typically give credit to sources
- While giving credit to sources, retweets can lead to relationships
- Substantial numbers of retweets can say a lot about the quality of content
- Retweets can inspire further conversation
- Retweets can be good for branding
- Retweets can easily be shared across multiple networks, like Twitter, FriendFeed, Facebook, etc.
- Retweets can provide followers with additional value in quality content
Cons:
- It’s hard to provide a list of the things you’ve retweeted, as Scoble mentions; by contrast, people can see your “likes” on FriendFeed
- Retweeting creates what many people consider to be “noise” on Twitter
- Twitter’s 140 character limit
- Some people consider retweeting to be like copying other people’s work for your own gain, though this concept is heavily disputed
A recent study from Pear Analytics found that about 8.70% of the tweets it researched were retweets. In some of the more web-oriented circles, this probably even seems quite low. Without a doubt though, Twitterers are retweeting tweets like there’s no tomorrow. Obviously businesses can see value in this, especially if they provide some kind of content that they would like to see shared.
As always, it comes down to providing quality content – the old “content is king” cliché. Even as the web has evolved, that simple fact remains true. If you provide something interesting, people will share it.
Scoble’s whole “Retweet vs. Like” concept is an interesting one in itself. We have certainly seen Facebook make numerous changes to its interface that seem to move the network closer to the realm of Twitter. You have to wonder if Facebook will eventually incorporate some kind of retweet-like functionality itself.
Posted by R.W. Casandra Date: Sunday, August 23, 2009
At the Search Engine Strategies conference in San Jose, WebProNews attended the session on how SEO can help save the publishing industry, a quite interesting topic, considering the controversy the industry has been experiencing of late. Do you think SEO can help publishers save their businesses? Share your thoughts here.
The session looked at challenges, tactics, and opportunities unique to online publishers. It covered solutions for technical obstacles, duplicate content and CMS issues, writing keyword-rich headlines, training the editorial staff, and updating the publishing culture from print to online. Essentially, the session was designed to educate participants on how to save jobs by leveraging SEO, driving traffic, and putting ad dollars back in publishers’ pockets, as described by SES.
Liesel Kipp, VP Global Head of Product Management at Thomson Reuters, shared four tips:
1. Show the value of SEO
2. Data is the key to your success
3. Set goals and show how you will beat them.
4. Evangelize, evangelize, evangelize.
Kipp says Reuters was able to increase its visitors by 500% in 5 years, and that you have to constantly talk about search and SEO. According to Kipp, relationship building is critical, and you should talk about your successes and failures.
BusinessWeek Search Marketing Manager Ulli Muenker offered some more tips on the subject:
1. Spread the SEO Excitement in Editorial.
- Get high-level buy-in
- Find SEO champions in the editorial team
- Create peer relationships to overcome skepticism
- Show projected traffic increase
- Show competitors’ search traffic results
- Demonstrate the before-and-after effect on page traffic
2. Conduct Regular Training
- Run regular individual and small group training sessions
- Train the trainer for new hires
- Engage external SEO editorial consultant
- Limit group training to 10-12
- Create a relaxed environment with cookies, lunch and learning
- Give them what they need to learn
3. Make Editorial Part of the Success
- Create SEO-friendly article headlines. Online headlines are different from print headlines. Write straightforward headlines. No puns, sarcasm, or jokes online. It just doesn’t work! Just bring in keywords so that people understand the message.
- Write keyword-rich sub-headlines under the headline, including keywords, synonyms, and derivatives.
- Use keyword-rich link text when linking to other internal pages, and check the connecting landing page’s keywords. (A rough sketch of these editorial tips follows below.)
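As a toy illustration of that editorial guidance (not anyone’s actual CMS code), here is a sketch that assembles a keyword-led headline, a synonym-carrying sub-headline, and a keyword-rich internal link. All of the names and strings are invented.

```python
# Toy sketch: keyword-first headline, synonym in the sub-headline,
# keyword-rich anchor text for an internal link. Example data only.
def article_head(keyword, synonym, summary, related_url, related_keyword):
    headline = f"<h1>{keyword}: {summary}</h1>"
    sub_headline = f"<h2>{synonym} \u2013 what it means for readers</h2>"
    internal_link = f'<a href="{related_url}">{related_keyword}</a>'
    return "\n".join([headline, sub_headline, internal_link])

print(article_head(
    keyword="Chicago property taxes",
    synonym="Cook County tax assessments",
    summary="rates rise for 2009",
    related_url="/news/chicago-property-tax-guide",
    related_keyword="Chicago property tax guide",
))
```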
The Atlanta Journal-Constitution SEO Manager Allison Fabella offered these tips:
- Location, location, location. In your sections, front-load your title tags with the location, such as “Cobb County News / ajc.com”. The same goes for meta descriptions, URLs, headlines, and sub-headlines. Also, use H1 and H2 tags.
- It is critical that your CMS is set up to implement these tips. This is key to your success. There are a lot of CMSes out there… make sure your SEO team approves. Once you purchase your CMS, make sure you stay involved. This may make you unpopular. Also, make sure sitemaps are part of your requirements.
- Sitemaps are your newspaper’s best friend. Sitemaps help you get around structural roadblocks built into bad site architecture. Use both web sitemaps and news sitemaps (Google News). Group your sitemaps into different sections, and include no more than 50,000 stories in each one. Also, follow the sitemap protocols; they make a less-than-perfect sitemap more perfect! (A rough sketch follows below.)
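As a rough sketch of the grouping idea (file names, hostnames, and story URLs are invented), here is one way to split stories into per-section sitemaps and tie them together with a sitemap index while respecting the 50,000-URL ceiling mentioned above.

```python
# Sketch: per-section sitemaps plus a sitemap index, capped at the
# 50,000-URL limit per file. All URLs and names are hypothetical.
MAX_URLS_PER_SITEMAP = 50000

def build_sitemap(urls):
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls[:MAX_URLS_PER_SITEMAP])
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</urlset>\n")

def build_sitemap_index(sitemap_urls):
    entries = "\n".join(f"  <sitemap><loc>{u}</loc></sitemap>" for u in sitemap_urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{entries}\n</sitemapindex>\n")

sections = {
    "sports": ["http://www.example.com/sports/story-1", "http://www.example.com/sports/story-2"],
    "business": ["http://www.example.com/business/story-1"],
}
sitemaps = {name: build_sitemap(urls) for name, urls in sections.items()}
index = build_sitemap_index(f"http://www.example.com/sitemap-{name}.xml" for name in sections)
```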
Tribune SEO Director Brent Payne talked about Twitter for media companies. He said there are 4 account types that publishers should set up. They are:
- RSS feed – Do not follow people back from this account, follow your own accounts.
- Get your celebrities involved. Make it a job requirement to have a Twitter profile. Most of our broadcast personalities are required to make 4-5 social connections per day.
- Let employees Tweet. “I am an example of that. I have the second highest Twitter account of employees at the Tribune.” Talk to them about legal issues and ground rules, but encourage them to do it. Understand that mistakes happen from time to time. But do not endorse these Twitter accounts as official voices of the company.
- Building a persona. Tribune created ColonelTribune, which is actually tweets from 4 or 5 of us. Create a character that your audience can connect with personally. Spend time creating a decent avatar. This is our best Twitter account, with 300,000 followers!
Payne says you then need to promote your Twitter profiles. One way to do this, which the Chicago Tribune did, is to recreate your masthead with writers’ Twitter names instead of the reporters’ actual names. He also says to use Twitter directories, especially big ones like Twellow and Wefollow.
He also says to engage the locals (Twellow’s TwellowHood feature is a great way to find them, by the way – my words, not his). He suggests having a Tweetup and inviting top journalists or TV personalities, as well as top referrers and bloggers. He also recommends taking a lot of pictures for “longer promotional shelf-life”. “Don’t buy the alcohol,” he warns, though. Trouble could arise.
Finally, Marshall Simmonds of the New York Times and Define Search Strategies says to define “the almighty tag.” He says they ask their editors to “enhance” titles for SEO. They want to see links off the domain in order to become a resource and an authority. He also said journalists don’t tend to have linking in their heads, and that it’s OK to link out.
A couple more interesting items Simmonds shared include:
- “We pushed back our registration wall to 8 clicks and crawlers to 5 clicks. Google quit crawling the New York Times in 2005. Yahoo crawled our registration page 5 million times. They literally kept crawling it.”
- “If you are not keeping in constant communication with your IT Department they are going to screw it up. It is a constant issue. There is also the problem with template roll-backs. We put a lot of check lists in front with the IT Department. This goes for marketing as well. The Ad Department is eventually going to try to sell an advertisement that is going to hurt search traffic as well.”
That about does it for that session. Some very interesting tips on SEO education for publishers. Stay tuned to WebProNews for further coverage of the Search Engine Strategies conference.
Is lack of strong SEO tactics a big contributor to online publishing woes? We’d love to know what you think.
Posted by R.W. Casandra Date: Sunday, August 23, 2009
Google appears to be testing breadcrumbs in some search results, at least in some areas. If you are unfamiliar with the term breadcrumbs, it refers to the hierarchical display commonly used in site navigation. For example: Home Page>Product Page>Product A Page.
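For illustration, a breadcrumb trail like the one above is usually just an ordered list of pages rendered as links. Here is a minimal sketch (page names and URLs are invented; real markup conventions vary by site).

```python
# Sketch of rendering a hierarchical breadcrumb trail as linked
# navigation, separated by the HTML-escaped ">" character.
def render_breadcrumbs(trail):
    """trail is an ordered list of (label, url) pairs, root first."""
    links = [f'<a href="{url}">{label}</a>' for label, url in trail]
    return " &gt; ".join(links)

print(render_breadcrumbs([
    ("Home Page", "/"),
    ("Product Page", "/products/"),
    ("Product A Page", "/products/a/"),
]))
```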
Do you utilize breadcrumbs on your site? Comment here.
Several bloggers have noticed Google displaying these types of breadcrumbs in various places in seemingly random results to some queries. Rob Hammond provided one screen shot of this in action.
Leo Fogarty provided another, which showed the breadcrumbs displayed in a different position within the search result.
Google’s use of breadcrumbs appears to only be a test, and a limited one at that. Google has talked repeatedly about sites having good site architecture in the past. This allows Google to more easily and quickly crawl sites.
Bing acknowledges this too. Rick DeJarnette of Bing Webmaster Center recently said, “You can have great content and a plethora of high quality inbound links from authority sites, but if your site’s structure is flawed or broken, then it will still not achieve the optimal page rank you desire from search engines.”
If Google begins incorporating the breadcrumbs display as in the above tests, on a mainstream level, that will be all the more reason to clean your site architecture up, at least in the navigation area. Site architecture certainly goes beyond this, but it is a key part of usability anyway.
Have you seen breadcrumbs show up in Google results? What do you think about the idea? Share your thoughts.