

Saturday, April 20, 2013


The Atlantic


The Guardian


"The new scheme is being operated by Freedom Registry, the company which operates a similar .TK system for Tokelau – the tiny cluster of coral atolls in the South Pacific with a population of less than 2,000 – but which is now the most popular domain name in the world, with more active domain name registrations than Russia and China combined."


Starting this July. Domainers, start your dictionaries.


Google Webmaster Central


Mistake 1: rel=canonical to the first page of a paginated series
Imagine that you have an article that spans several pages:

example.com/article?story=cupcake-news&page=1
example.com/article?story=cupcake-news&page=2
and so on

Specifying a rel=canonical from page 2 (or any later page) to page 1 is not correct use of rel=canonical, as these are not duplicate pages. Using rel=canonical in this instance would result in the content on pages 2 and beyond not being indexed at all.
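For paginated series like this, Google's guidance at the time was to mark the pages up with rel="next"/"prev" link elements instead (or to point rel=canonical at a single view-all page, if one exists). A minimal sketch for page 2 of the hypothetical cupcake article:

```html
<!-- On example.com/article?story=cupcake-news&page=2 -->
<head>
  <link rel="prev" href="http://example.com/article?story=cupcake-news&page=1">
  <link rel="next" href="http://example.com/article?story=cupcake-news&page=3">
  <!-- A rel=canonical here should point at a view-all version of the
       article (if you publish one), never at page 1. -->
</head>
```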


The New York Times


"Fake followers are typically sold in batches of one thousand to one million accounts. The average price for 1,000 fake followers is $18, according to one study by Barracuda Labs. Mr. Stroppa and Mr. De Micheli said some sellers bragged that they made $2 and $30 per fake account. A conservative estimate, they said, was that fake Twitter followers offered potential for a $40 million to $360 million business."


ZDNet


"More than two thirds, or 67.8 percent, of South Korean smartphone users changed their devices last year – making South Korea the country that does so the most."


The Guardian

The Channel 4 documentary on dogging (see Urban Dictionary…) had one person state that they loved Lynx, the deodorant. Lynx took it in stride:



"Rather than panicking about the negative publicity, the official Lynx Effect Twitter account tweeted "Good choice of fragrance over on @Channel4 – guaranteed to get a bit more attention, whatever the situation..! #DoggingTales"


It then followed up the next morning with a spoof photo of Lynx's social media team holding "crisis meetings" wearing masks similar to the doggers in the documentary."



Photoimaging Manufacturers and Distributors Association


"While Facebook, Twitter, and Flickr remove embedded information like copyright notes, the name of the creator, the description and more, the results show that other social networks like Google+ or Tumblr protect photographers' data better."


Full test results at http://www.embeddedmetadata.org/social-media-test-results.php


The Guardian


"Iran's minister for information and communications technology, Mohammad Hassan Nami, announced this week that his country was developing what he described as an "Islamic Google Earth" to be called Basir (spectator in Farsi) which will be ready for use "within the next four months". [...]


"We are doing our best to launch the Islamic Google Earth in the next four months as an Islamic republic's national portal, providing service on a global scale," he added.


"On the surface, Google Earth is providing a service to users, but in reality security and intelligence organisations are behind it in order to obtain information from other countries," Nami said."


Pew Research


"While many teens have a variety of internet-connected devices in their lives, the cell phone has become the primary means by which 25% of those ages 12-17 access the internet. Among teens who are mobile internet users, that number rises to one in three (33%). Among teen smartphone owners, 50% say they use the internet mostly via their cell phone."


The New York Times


"It accuses Google of using the Android software "as a deceptive way to build advantages for key Google apps in 70 percent of the smartphones shipped today," said Thomas Vinje, the lead lawyer for Fairsearch Europe, referring to Android's share of the smartphone market.


For example, phone makers that agree to use Android – and that also want Google applications like YouTube – face contractual requirements to place those applications and other Google-branded applications in prominent positions on the mobile device's desktop, Mr. Vinje said."


Now that I could get behind: a case against crapware. Don't stop at Android! Think of PCs sold with trial versions of Office and shortcuts to eBay (eBay! a shortcut!) on your desktop. Away, away you go!

Written by Ruud Hein

My paid passion job at Search Engine People sees me applying my passions and knowledge to a wide array of problems, ones I usually experience as challenges.


People who know me know I love coffee.


Ruud Hein


View the original article here


It is amazing how little some online marketers know about their landing pages. Like a tourist in a foreign country, a marketing newbie often gets a standard set of conversion optimization tips that normally include: "unclutter-unclutter-unclutter", "use compelling titles and images", etc.


But is there more to these tips? Well, the more one learns about conversion optimization, the more one realizes it's a well without a bottom, and the ideas for things you can do with your website are endless.


At the same time, it is important to know where to look when improving a landing page, because, for a month-old site owner, analyzing their pages' performance can be similar to a fairy hunt – one needs to know what they are chasing in the first place.


Hence this post, which you can use as a landing page vision check, to see whether your view of your page's potential and its possible problems is clear.


Are you using the right keywords?


You have probably optimized your page for the keywords which, in your opinion, people will use to look for your page. But then, are they really the right keywords?


Look into your analytics and see what keywords searchers actually used to find your site. Did anyone search for man in a green shirt, which happens to be the title of one of your images? If so, you should probably use image titles and alt texts that suit your page's topic better.


Another thing to look into is whether your page is attracting visitors with the right searcher intent. Did anyone look for night clubs Hong Kong, only to end up on your page about a Hong Kong golf club? Luckily, there are technologies like Zenya that help you discover keywords with just the right searcher intent.


Are you visible to the search engines?


Even if you check your rankings when your page goes live, this doesn't mean it will keep ranking the same for good. Search engine algorithms change, new services appear (think Google's Knowledge Graph), new players come to the market, etc. So, it's important to keep an eye on your search engine visibility at all times.


Besides, it is also important to make sure that the page from your site that ranks for a particular keyword is actually the "right" landing page that should be ranking for it. And, we have just put together a quick tutorial on how you can check that with Rank Tracker.


Do they click on your page in the SERPs?


When your page is displayed in the search results, how often do people click on it? You can find this out by checking out the number of times your page was served in the SERPs (impressions) and the number of clicks it received (clicks) in Google Webmaster Tools.



If the difference between the two is drastic, consider making your page's title and meta description catchier (but not spammy!) – just avoid moving around or removing your target keywords, if possible.
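Since click-through rate is just clicks divided by impressions, a quick script can flag pages whose listings underperform. A minimal sketch – the page data is made up for illustration, and the 3% threshold is an arbitrary assumption, not a Google benchmark:

```python
# Compute SERP click-through rate (CTR) from Webmaster Tools-style data.
# All figures below are invented for illustration.
pages = [
    {"url": "/article", "impressions": 12000, "clicks": 240},
    {"url": "/pricing", "impressions": 8000, "clicks": 400},
]

CTR_THRESHOLD = 0.03  # arbitrary cut-off for "needs a catchier snippet"

for page in pages:
    ctr = page["clicks"] / page["impressions"]
    flag = "  <- consider a catchier title/description" if ctr < CTR_THRESHOLD else ""
    print(f"{page['url']}: CTR {ctr:.1%}{flag}")
```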


What does your page look like on mobile?


Do you know how many people have visited your page from mobile devices? You can see this in Google Analytics, just check under Mobile and see what your bounce rates from mobile are:



If your mobile bounce rates are way higher than your non-mobile ones, see how your page renders across different devices. You can use a tool Google developed for this purpose.


How high are your bounce rates?


Web marketing newbies often ask: what's a good bounce rate? However, the right question to ask probably is: what's a bad bounce rate? The answer to the latter would be "100%". Everything else pretty much depends.


Over time, you're likely to determine an average value that would define your perfect bounce rate and take it from there. Then, if your bounce rates go up, this indicates a problem. To get to the core of it, check the landing page performance indicators further in this post.
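Bounce rate itself is simply single-page sessions divided by all sessions, which makes it easy to compare segments side by side. A tiny sketch – the session counts are invented for illustration:

```python
def bounce_rate(single_page_sessions, total_sessions):
    """Bounce rate = sessions that viewed only one page / all sessions."""
    return single_page_sessions / total_sessions

# Invented numbers: compare mobile vs. desktop for the same landing page.
mobile = bounce_rate(340, 400)
desktop = bounce_rate(180, 600)
print(f"mobile {mobile:.0%} vs desktop {desktop:.0%}")
```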


Is your page copy scaring people away?


One of the reasons people leave your site immediately upon arriving could be that they don't like your page's copy. First off, people will barely read copy written stream-of-consciousness (that is, with no subheadings, no bullet points, etc.)


Second, there's a fine line between copy that's lively and copy that's overly aggressive. Better to avoid what I call SHOUTING graphical means, and avoid making unrealistic claims, such as that your product or service is "the best in the world", unless it really is. (I also recommend reading this post on the terms to avoid when describing yourself on LinkedIn – I think many of those points apply to page copy as well.)


Is your selling proposition that unique?


The secret to winning your visitors' hearts and minds often lies in the ability to stand out from the crowd. This is why marketers believe any online offer should contain a unique selling proposition. In other words, you should be uniquely positioned in the market compared to your competitors.


I like how Mint breaks down their unique offer:



Can they see your call to action?


A lot has been said about the call to action on the Internet: that it has to be a really big orange (!) button, that you should split test different designs and wordings, etc. This is all very true. Just one thing I'd like to add: do not be afraid to repeat your call to action as many times as appears natural. Some people may dislike buttons or dislike the color orange – you never know. So, provide additional links that lead visitors to perform the desired action(s) throughout the page.


Is your navigation straightforward?


I once came across a site that seemed to have many links on its home page, but those "links" were actually underlined text that did not lead anywhere. Unfortunately, I can't remember the name of the site, but the fake links quickly got annoying and I felt like leaving, even though the site was otherwise rather good.


To get insight into how visitors interact with your site's links, images, text, forms and other parts, you can use click tracking software (like MouseFlow, for example).


Do you provide shortcuts to checkout?


As I said earlier, not all people will want to read your entire copy (even if it's well-structured). So, can they see clearly where to go to get what they came for?


For example, here is what Site Build It! did: they put a breadcrumb-style outline of the landing page right at the top. If one wants, they can just go ahead and take SBI! for a test drive right away:



Then, at the end of the shorter version of the page, there is another start-trial button and a bright yellow line that invites you to click if you'd like to read more about the product:



Does your signup section look like a tax return form?


When it comes to forms on a page, the general rule is: the shorter, the better. I've mostly been noticing forms that ask for one's name and email these days. However, sometimes you might want to exclude a poorly qualified segment of subscribers who are unlikely to become paying customers or otherwise bring you value. If that's the case, it's wise to include more fields in your form – but keep in mind that you will probably get fewer signups.


Does your page inspire trust?


Perhaps you've seen Amit Singhal's list of questions a site owner should ask about their site. Some of these questions concern trust:

Would you trust the information presented in this article?
Would you be comfortable giving your credit card information to this site?
For a health related query, would you trust information from this site?

By honestly answering these questions about your page, you should get an idea of the level of trust it inspires in people.


When visitors are about to give you their personal data or money, they want to be sure that (A) it's safe and (B) they'll get the value they expect. Hence, include proof that your company is real (address, phone number, etc.), add testimonials and, if possible, a money-back guarantee.


Why do they actually leave?


Now that you have looked at all of the above metrics, how do you tell which one is the reason people don't do what you want them to do on your site? Even though there is no way to know this for sure, you can get closer to solving the mystery by looking at your abandonment rates in Google Analytics (you'll find them under Conversions -> Goals).


This way you can see what Goals on your page do not get completed and try to figure out why.
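The abandonment rate at each funnel step is just the share of visitors who entered that step but never reached the next one. A minimal sketch – the step names and counts are invented for illustration:

```python
# Abandonment per funnel step: 1 - (visitors who reached the next step /
# visitors who entered this step). Counts below are invented.
funnel = [("Landing page", 1000), ("Signup form", 420), ("Confirmation", 300)]

for (name, entered), (_, remained) in zip(funnel, funnel[1:]):
    abandonment = 1 - remained / entered
    print(f"{name}: {abandonment:.0%} abandoned before the next step")
```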


Are you Web 2.0 compatible?


Does your page have sharing buttons? If it doesn't, get some. You can use the old-style design or get something "sexier" than that.


Web 2.0 compatibility also means that you host your videos on YouTube. In most cases, it's better than hosting videos on your own site, because YouTube videos are easier to rank on both Google and YouTube, and YouTube lets people comment on, share and add videos to playlists in a familiar way.


Do you have a positive ROI?


The ultimate question to ask about your landing page performance would be: does it drive positive return on investment (ROI) in the end? If it doesn't, you might want to see how it is connected to other elements of your Internet marketing campaign and analyze them together.


For instance, paid ads could be cannibalizing your organic traffic, but how do you know that? I really like this method offered by Brad Geddes, which helps you to figure it out.


Conclusion


Have you tested your landing page vision using the 15 questions? There is just one more thing I'd like to leave you with: don't take any tip from any Web marketer for granted – they may be judging purely from their own experience. Always test things to find out what works on your site and for your audience.


For example, some marketers say pop-ups are a no-no. I don't agree. I've seen pop-up forms, chats and other types of pop-ups work out for many website owners. So, don't jump to conclusions and don't believe anything before you test it and see the results with your own eyes.


Image credit: foshydog via Flickr.com


Alesia Krush is a blogger and digital marketer at Link-Assistant.Com, home to the industry's best SEO and SMM tools. The company's most recent initiative has been the release of the revolutionary BuzzBundle social media management tool that helps Web marketers manage their brand's reputation and establish presence at social networks, blogs, forums, Q&A sites and other social platforms.



Friday, April 19, 2013

Best Practices for Promoting Infographics


With thousands of infographics posted on a daily basis, it can take a lot of work to get your latest creation the online attention it deserves. This can be problematic if you've invested the time, money and resources and the infographic fails in promotion. As the owner of both an infographic design agency and an infographic submission website, I've seen my share of promotion wins and promotion flops. While not all infographics have what it takes to make it in even the best of promotion circumstances, here are some tips for success:

1. Choose a Topic that Makes an Impact

It may sound trite, but a headline can make or break your infographic. Salesy topics will get you nowhere, because webmasters will rarely publish someone else's advertisement for free. Bloggers want to post infographics that will get their readers talking, and even better, commenting on their site. They want topics that provide that "A-ha" moment, make people laugh, incite debate, or shed light on something nobody knew about. Some of the most widely shared infographics online have topics that impacted their audience:

2. Design Makes a BIG Difference

A topic can be amazing, but if the design is not properly executed, your infographic will have a hard time getting shared online. Infographics are meant to tell a story visually, so people don't want a reading assignment: they want eye candy. Additionally, the design should be well-organized and original. There are too many infographics released today using stock photos, stock icons, or templates. Most blogs won't post them because it's obvious that the design was done quickly and only for links. Bloggers and webmasters care about what they post to their sites, and those that will provide quality backlinks or social shares expect quality content. If you're unwilling to spend time on creating something eye-catching and amazing, it's an insult to ask webmasters of any caliber to post it to their site.

3. Identify an Outreach List

Once you have an amazing infographic, the next step is to identify a list of blogs and websites that you think will post your design. This list should include sites that your core audience often visits as well as blogs that have posted similar infographics in the past. Be sure to identify sites where you think your infographic would be of genuine interest to the readers. The list should include a minimum of 100 webmasters, as well as contacts for all of the infographic submission sites that exist.

If you're promoting multiple infographics, be sure to create a unique outreach list for each one. Getting backlinks from the same sites over and over again will create a subpar backlink profile and devalue your infographics.

4. Mondays Are the Best Days

Once you have your outreach list ready, it's time to promote your infographic. After promoting over 1,000 viral infographics, we've learned that Mondays are the best days to launch a new design. These are the days when many webmasters plan their content for the week and are more apt to read a pitch email. On the other hand, don't post during a holiday week: the odds of your pitch email being read go way down, as many bloggers have day jobs keeping them busy during a short workweek, or they may take long vacations during holidays.

5. Don't Spam Anyone

This is another tip that may sound trite, but it's amazing how few people take it to heart. SubmitInfographics.com receives over 25 infographic submissions every day. Of those, only one or two emails seem like the sender actually took their time to speak to our reviewers directly. Most send canned emails, some just include a link to an infographic, and even others include a salutation like "Hello INSERT NAME HERE" – to be clear, they are actually leaving the words "INSERT NAME HERE" instead of filling in the blanks of their own form letter.

In a world where email has become the main form of communication, recognizing spam is now second nature for most. So, rather than cutting a few corners and trying to reach out to all of your contacts in one email, take the time to write up something unique to stand out. Based on our own experience, less than 4% of people are actually doing this, so you have a far better chance of getting your infographic posted if you just take the time to write a personal email to each webmaster on your contact list.

6. Follow Up

Since you're already taking the time to write a personal email, it's also good to follow up. Wait a week from when you first sent an email and reach out to those who didn't respond. It's possible they were out of the office, or that your email was mistakenly flagged as spam. A personal follow-up shows them that you are a real person who feels their readers would be genuinely interested in the infographic you are promoting.


In the end, running an infographic promotion is similar to most content marketing: there's a great deal of things you can do to set it up for success, but there are a variety of factors outside of your control that may lead to less than ideal results. Still, integrating these best practices into your infographic promotion strategy can increase your chances at a more successful campaign. Start today and watch the results!



Most people in SEO have been in this situation.

You're on a call with a client, discussing changes that need to be made to the site to make it SEO-friendly, when you hear the words that make your blood run cold and your hand start to reach for aspirin.

"Wait, I need to talk to my developer."

It doesn't need to be this way.

Developers come in all shapes and sizes, and trust me – this isn't a post to rag on them (they're vitally important to the SEO process, too!). In an ideal world, the developer may be part of the same team as you, such as in an agency, or you may actually be the developer. This is great because communication can be easily facilitated and everyone can be on the same page from day one. You can be involved in the site building process or can give advice along the way. But when the developer is someone not in the same communication loop as you, this can cause quite the problem.

Developers who also do SEO (good, bad, or otherwise) may not be keen to work with someone else, and sometimes, that's justified. Think about how you would feel if another SEO was brought into the picture for consulting or on-site work. Would that make you feel better, or would you feel like you were threatened?

But to put the pride and protectiveness of the site aside, there is something that needs to be recognized: you both are working toward helping your client – you BOTH benefit from working together well.

So, when you are faced with the challenge of working with an outside developer, here are some things you can do to make both of your lives easier – and your client's campaign more successful:

Adopt a positive mindset. It seems like the minute you hear that someone else is involved, you automatically go into attack mode or become overly defensive. Instead, take a deep breath and realize that your client is going to make those kinds of business decisions. It's not up to you to do that, and sometimes that does mean there will be an extra cook in the kitchen. However, be positive. You both have the same goal – help the client – and you can work with each other to make that happen. Plus, *gasp* you could LEARN from each other. Walk into the project with the intention of working together to achieve your common goals. Be friendly. Be helpful. Be easy to work with.

Create an open line of communication. This is the reason why so many developer and SEO relationships go sour. I'm a firm believer in making sure everyone involved in a project (including the client and developer) is on the same page. Schedule group conference calls, CC each other on emails – do whatever it takes, but keep communication open and ongoing so that everyone knows what's happening at all times, which brings us to the next point.

Be upfront about expectations. Your job has a scope and the developer's job has a scope. Work together to avoid creating extra work – and extra frustration – for each other. Define your roles in the project and make sure the client understands who he or she needs to go to with a question or a change request. Work with the developer on understanding what each of your jobs entails, and make sure the client knows that too. Be respectful (and encourage respect) of each other's jobs. You both contribute to the client's success, but it does make things easier when everyone knows what he or she is or isn't supposed to do.

Cultivate a collaborative environment. Working with the same client means that YES, you CAN work together! Think of yourself as part of a team. You want to work with the developer and the developer wants to work with you, so create opportunities for that to happen. Ask for his or her opinion or feedback. Have brainstorming or strategy sessions together, rather than apart. Foster an environment where everyone is contributing and feels like they're doing something meaningful. This makes a huge difference.

Going through steps like these can sometimes yield a great partnership between SEO and development, but there will also be cases where you can't really do much to make this relationship less rocky because the other person isn't willing to go through the same steps. I'm an optimist, so I think that in most cases, the developer you're working with wants the same things you do: to work together, to work well, and to help the client achieve his or her goals. Developers are people too! Be open, honest, communicative, and committed – the rest will follow.

Have other suggestions for making the developer and SEO relationship easier? Leave a comment with your tips!

Written by Mandy Boyle

Mandy Boyle gets her daily fix of copywriting as the SEO Team Leader at Solid Cactus. She is also a published freelance writer and was probably a baker in another life. Cupcakes, anyone?

Mandy Boyle




You dream about building links purely on the awesomeness of your website. And maybe you've earned some. I have, and it feels great to be recognized. But let's be honest – not every website can bring something new and special to the marketplace that's going to earn purely organic recognition.

SEOs crave natural links that are voluntarily given. Yet for all the talk of how quality content begets links, they rarely build themselves at a rate that actually impacts a site's search result placement. And that's especially true if the target site serves a highly competitive niche. Millions of perfectly good businesses have difficulty differentiating themselves online and earning some of the exposure that larger marketing budgets always seem to monopolize. Here's how to gain links and greater exposure by being a proactive publisher.

Spend some time meeting and interacting with your prospective hosts before you ever pitch them. Even though cheap link building services still blast guest posting requests because someone won't stop paying them, there is no site worth your content that'll ever respond to a cold, robotic pitch. Every experienced white hat SEO has helped sites outrank competitors with just a tiny fraction of their links. That happens by focusing on authoritative sites that really matter to the business, and getting to know the folks behind them. You know, like networking in the "real" world, only with blog comments, social media interactions and personal emails.

No one cares what you're selling. It doesn't matter. All that matters is whether you can provide value for that site's audience in an authentic way. Naturally, you'll create content from your own area of expertise, but it shouldn't be about your products or services. And when you create an article, infographic, SlideShare or video before you choose a site for it, you're much less likely to find a great placement. Obviously, whether another site will want to link to your YouTube video shouldn't stop you from producing it, and having a large stable of content makes it easier to audition as a credible contributor. However, creating content that is purpose-built for the target site will almost always get a better reception.

Timely content is great. Advance content is divine. A seed vendor could offer content on when the last frost is expected in certain areas a month or two ahead of time. A business consultant could provide a roundup of upcoming business events. See, guest blogging and content marketing aren't just about coming up with ideas that you hope readers might enjoy, but rather, delivering actionable content you know they'll need, before they need it. Now that's a great way to make host sites look good. Of course, the post you share may simply be a small recap of a more complete resource hosted on your own site. So work on your new ideas with future events in mind.

Once you get going, see whether you can stretch an article into two, three or more posts. Sometimes, the content comes easier that way, rather than grasping for brand new topics. A high quality series comes pre-packaged with value for the host, and showcases your value for an audience that likely contains prospective clients for you. If you can't manage a series, a recurring contributor role on a popular site is just as good. Content in the right places can earn significant referral traffic.

Site owners have been pitched so much spam and for so long now that some don't even entertain guest post requests anymore. Reaching out to bloggers (or other related sites that happen to have blogs) requires you to come with your content ready. Unless you or your site's brand has the street cred to get a guest spot from a simple ask, you'd best show your prospective host exactly what you think they might like to publish. If you've been present in their online orbit, the pitch won't be cold anyway, but simply asking for a guest spot communicates no value proposition. Pointing to related works can also help to guide a discussion in asking the host what he or she could really use.

Some site owners will ignore a guest blogging request, but happily publish a sponsored post. That's not necessarily a bad thing. Targeted sponsored posts may not be an immediate boon to SEO, but if they send worthwhile traffic, so what? Over time, I've earned many organic links, social media mentions and great relationships with site owners that began with nothing more than a sponsored post. If you go into it assuming you need one type of content and link, you're needlessly restricting your options. You might offer to record a podcast or to interview the site owner for your own site.


You can always get what you want, so long as you focus on giving others what they want first. Being a proactive publisher puts you in control of constructing a durable online presence that other site owners will happily help you build.


If you liked this post, you might also enjoy Here's Why We Tell You To Blog


Mike Sobol is Partner and Co-Founder of Guest Blog Genius, a guest blogging service for SEO professionals, and Content BLVD, a content writing service for busy bloggers and site owners. Building businesses since 1999, Mike's passion is to create effective new services to fulfill unmet needs in a variety of niche markets, which include internet marketing, content creation and SEO.


Guest Blog Genius



One of the greatest things about working in the Search Industry is the copious amount of data that we get to play with. I have a quantitative background with an undergrad in electrical engineering and in my former life was a forecast analyst. I love numbers and the ability to pull insights from data.


A difficulty that most Canadian search professionals have is that, given our proximity to the United States, we rarely get access to data or insights on the Canadian market. One local search professional told me that his eyes glaze over when he reads articles from people trying to infer Canadian insights from US data.


With the Bing Ads Intelligence Tool you can get data on the 14 million unique searchers on the Yahoo! Bing Network in Canada[1] with a click of a button.


The Bing Ads Intelligence tool is a research tool that allows you to build and expand on your keyword list, gather research on keywords, and estimate potential traffic for those keywords, all through an Excel interface. It's an extremely powerful tool that allows you to peer into how keywords perform in Canada, the United States, France, the UK, India and Germany.


1. The first step is to download the Excel add-in here
2. After you install Bing Ads Intelligence, a tab will appear called Bing Ads Intelligence



3. Next, sign in with your Bing Ads account, select the Country / Region and the language that you would like to do research in. If you don't have a Bing Ads account you can easily create one at: https://bingads.microsoft.com



4. After that you are ready to go!


There are many cases where you can use Bing Ads Intelligence in PPC (pay per click) Campaign management but one of the best uses is the Keyword Research Template tool. Imagine you were asked to create a PPC campaign on Winter Tires. You may wish to find out the demographic information, gender breakout, volume of searches and even other search trends. By downloading Research Template, a dashboard will be created giving you insights into this keyword.


1. First select the Keyword Research Template and the Search Insights Dashboard



2. You'll be prompted to download a template; don't forget to Enable Editing and Enable Content
3. The fields, which are shaded in yellow, allow you to set your parameters to research: keyword, device and date range
4. After your parameters are set, select Refresh All on the Bing Ads Intelligence Tab



5. You now have Canadian keyword research all done in the span of 5 minutes. Note that a current limitation of the tool is that there are no provincial insights



Feel free to leave a comment and tell me about your Bing Ads Intelligence experience.


Andrew Yang is a Search Evangelist for Microsoft Canada. He is responsible for educating, promoting and evangelizing the Bing Ads platforms. This includes support for Bing Ads Web, Bing Ads Editor, Bing Ads Intelligence, API and the various social media platforms that Bing Ads is on.


http://community.bingads.microsoft.com


 

How To Make Ecommerce Content

Posted by suerte.. On 8:21 AM No comments


So much talk abounds about content being king, especially since Google let loose its stampede of penguins and pandas across the flower gardens of the Internet (Can penguins really stampede?).


That's all very fine if you run a news site or a blog site, and (predictably enough) so much of the how-to information being pushed on us comes from how-to-blog bloggers. But what if you run an ecommerce website, an online store? What on earth can you do for content to keep the stampeding penguins and pandas from stomping all over the proverbial flower garden of your website?


Let's first look at what good content means. First, it must be unique. By unique, we don't mean just that it can pass CopyScape (any fool with a dictionary can string together a "unique" series of words). We mean that it is truly unique. Let's not forget that the big, powerful machines powered by fish and bamboo shoots have heard of synonyms, too.


Second, it must be original. Perhaps this is a repetition of being unique, but it does take the concept one step further, especially because the more original something is, the more it will interest people. It's not just the words that should be original, but the idea, the meaning, what the content is actually saying.


Who cares if it interests people? Well, aside from making a purchase more likely, the search engines actually know what interests people, and that is what they are now trying to promote.


So the third rule of good content is that it be "viral" – that it be excruciatingly interesting. This takes the concept of original and knocks it up a few dozen notches. Being original is just one aspect of being interesting. Useful is interesting. New is interesting. Cool is interesting.


If you don't know what is interesting, hire somebody who does know to create your content. The penguins and the pandas are watching, and if they see that people are interested in your content, they will promote it.


How do we add unique, original and interesting content to an ecommerce website?


The most important content is on product pages. Why? Because this content can not only lure prospective customers through the search engines, but these are also the pages you want the search engines to love the most. So let's look at a few things we can do to create original content on these pages.


This is probably the most obvious way to ensure that your money pages have original content. Yet how many times have we seen virtually identical content across dozens of pages, each selling a different size of ball bearings or a different grade and type of screwdriver? Too many similar flowers make it boring for the animals to trample on.


OK, OK, so the products are very similar. That's no reason to get lazy and just copy-and-paste the product descriptions and change one or two words. Write each one from scratch – get multiple writers involved! – and your words will at least be original, even if the products aren't so much.


Yes, put them on your product pages. That's what I have done. You can see five testimonials on this product page, for example: http://www.seo-writer.com/freelance/ghost-writer.html.


What, were you planning to shuffle all your testimonials off to some "testimonials" page that nobody except the research-crazed, caveat-emptor fringe will ever seek out? Put them right on your product page. Somebody bought gasket number 36C? Get his testimonial right up there on the page for that particular gasket – 100 percent unique to that page, and of strong interest to any potential buyer who lands there.


This is an ideal strategy for when you don't have a testimonial for the page, or simply when you get a great review on a review site. Why waste a great review when you can harness its power right in your store?


No, don't copy the whole thing. That would kind of violate the whole originality thing, right? But do take the most impressive section and reprint it on the website, then link to the offsite review (using the target="_blank" attribute so that people don't lose your page) so people can read the full text of the review. A great little piece of content that can really help boost sales, too.
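As a sketch, an excerpted review on a product page might look something like this (the review URL, quote and wording here are invented for illustration, not taken from a real review site):

```html
<!-- Hypothetical example: excerpt the best line of an offsite review and
     link to the full text in a new tab so visitors don't lose your page. -->
<blockquote>
  "...the best 36C gasket we've ever fitted; shipping was fast too."
</blockquote>
<p>
  Read the
  <a href="http://example-review-site.com/gasket-36c-review" target="_blank">
    full review</a>.
</p>
```

The excerpt stays unique to that product page, while the outbound link gives curious buyers the full context.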


Not all content is ideal for specific product pages. An ecommerce website can have a blog. In fact, why on earth would you put up a store without a blog? I could write a dozen articles on the value of having a blog on your store, but it's already been said a thousand times, and quite eloquently here and here and here.


When a potential client approaches me about SEO, I invariably recommend setting up a blog. This is 2013, and that's pretty much the default base – the minimum – for effective SEO these days.


But a blog might not be the direction you want to go, and it is only "pretty much" the minimum. If you don't want a blog, you can create an articles section, and load it up with articles that will serve the needs of information-seekers in your niche. Or an infographics section (but without text to accompany the images, the pandas and penguins will not recognize this as "content").


What can you write about in your blog, or even in an articles section? That does depend to some degree on the nature of your product. Recipes might be great for selling canola oil, but not quite as useful for baby oil or motor oil. But here are some general ideas.

Your product in the news.
Anything related to your product or similar products in the news.
How-to tips related to your product.
Styling tips related to your product.
Interviews with expert users of your product.
Lyrics to songs related to your product.
Top-ten lists related to your product.
Tips to save time/money/frustration, related to your product.
Tips and news that have nothing to do with your product… but would be of interest to users of your product.

Let's take just a moment to stop scratching our heads over that last bullet point. Suppose you sell natural jewelry or makeup or perfume or handbags. Your audience will primarily be environmentally conscious females. They might not be buying jewelry or makeup or perfume or handbags every day, but when they are in the mood to buy, you want them on your website.


So keep feeding them news that will interest them, whenever possible tying it to your jewelry or the places your products come from. This is a great content strategy that will appeal to all three of your audiences: human, penguin and panda-ish (I had some trouble with the syntax on that last word – any zoological specialists should feel free to jump in).


Yes, we live in a unique time when we need to call on zoologists to help us properly word an article about search engines and website content. But if you arrive at a dance and find the room filled with penguins, best to start practicing your waddle. Just because you run an ecommerce site is no reason to skimp on the content. Got any other ideas? Please feel free to add them in the comments below.


If you liked this post, you might also enjoy On-Page Optimization for Ecommerce Websites


David Leonhardt is an Ottawa-based SEO consultant. When not guest blogging he occasionally finds the time to update his own blog.


SEO Marketing Blog


View the original article here


I love using Firefox. I have experimented with many different web browsers in the past, but I always switch back to Firefox in the end. Using Google Chrome or Internet Explorer feels like trying on a shoe that is a few sizes too small. It just doesn't feel right for me.


One of the main reasons why I love using Firefox is its extensive Add-Ons Collection. These nifty browser extensions greatly improve the efficiency of my web browsing experience. I enjoy the ability to replace a mundane web command with the ease of a simple mouse click. I also find that my productivity increases when my bookmarks and my navigation panel are positioned exactly where I can find them.


I compiled a list of 10 Firefox Add-Ons that will improve your web browsing experience through several key features. My focus is on improving efficiency and productivity. All of these extensions have been personally tested with the latest update of Firefox 19.


1. Tab Mix Plus




This is one of the most essential extensions for managing the tabs in your Firefox browser. Whether you want to duplicate the tabs, merge the tabs, or lock the tabs, this powerful add-on has hundreds of customizable settings, allowing you to modify your tabs in nearly any way imaginable.


2. Tile Tabs



This super cool add-on divides your screen into two or more sections, either horizontally or vertically, allowing you to view multiple websites within the same Firefox window. This is especially convenient for opening two Firefox windows and comparing the websites side by side.


3. Vertical Toolbar



The vertical toolbar does exactly what it says on the tin. This extension adds an unobtrusive navigation panel on either the left or right side of your browser. One helpful trick is to put your bookmarks toolbar inside the vertical toolbar, so you can easily access your Firefox shortcuts in a vertical list.


4. Speed Dial



This bookmarks manager gives you access to your favorite websites with hundreds of convenient keyboard shortcuts. You are also able to group the websites and add screenshot previews to each bookmark.


5. URL Lister



This add-on lets you open multiple links simultaneously by copying and pasting the URLs from the clipboard. It also gives you a list of all the URLs currently open in your Firefox tabs.


6. QuickFox Notes



Use this extension to jot down notes and easily manage them within your Firefox browser. You can choose to open these notes in a separate window, a new tab, or on the bottom of your screen.


7. Store Tab



It is the equivalent of adding a 'Save-As' button onto your Firefox browser. You can save all of your current tabs with one click and reload them again in a later session.


8. gTranslate



This translation tool is useful for navigating a website in an unfamiliar foreign language. You can highlight a specific block of text and view the translation in a context menu without leaving the page.


9. Screengrab!



You can take a screenshot of the website with one simple click. It allows you to save the image of the entire webpage, the visible portion only, or any selection of your choice.


10. Menu Editor



You can customize and rearrange the application menus in your Firefox browser. This is extremely useful when you have installed many Firefox extensions and need to clean up your context menus.


Tony is part of the SEO Team at Search Engine People. He dutifully manages the resources, performs quality assurance, and conducts research analysis for the team.


View the original article here


There have been hundreds of blog posts discussing the importance of content as a ranking factor in the last couple of years following Google's release of the Panda update to their algorithm. However, the importance of content on long-tail keyword focused web pages was really brought home to me the other day.


I was reviewing the click-through rates to our site for various keyword queries. A result that really caught my attention was that queries for the term "Mac Duggal Size Chart" are only producing a 45% click-through rate to MacDuggal.com. This raised the question of "why is a search on this branded term producing a click-through rate of less than 50%?"


The answer to the low click-through rate was pretty easy to uncover. Searching on Google and Bing showed that one of our authorized retailers was outranking our site and had the top search result position for this branded term.




The next question was "how could one of our retailers be outranking us for this branded term?" The answer seems to be that their Mac Duggal Size Chart page features more and better content than ours does. The other site includes a content section on "tips for measurements" in addition to the size chart, while our page only features a size chart.


While there are myriad other factors that are likely influencing this ranking result, and I may be jumping to an incorrect conclusion based on a single anecdotal outcome, this seems to serve as a good demonstration of just how important on-page content has become in ranking well for long-tail search terms. That another site could outrank us for our own branded term is a dramatic illustration of on-page content as a ranking factor. In this case, the other site has obtained the top ranking for a term associated with purchase intent, so this result is probably generating good value for them.


Conclusion


Providing viewers with relevant on-page content on long tail keyword focused pages has almost certainly become a critical search engine ranking factor. While the "Mac Duggal Size Chart" example is admittedly thin evidence to prove the importance of content as a ranking factor, it is the type of correlation that has made a believer out of me. Adding the deepest content about a subject to a keyword targeted webpage almost certainly enhances the likelihood of a high ranking.


View the original article here

Thursday, April 18, 2013

It's coming; another new Google search algorithm to contend with. This time the 'Google Merchant Quality Algorithm', announced by Google's head of webspam Matt Cutts at this year's SXSW festival, aims to penalize low and poor quality merchants in local search results.

With Google's increased focus on local search and its own Google + social network, local search results and reputation management have become vital for small and medium sized businesses alike.  The old motto 'There's no such thing as bad publicity' may no longer be true in the age of social media, but there are ways in which a social and web-savvy business can make this work in their favor.

First, by optimizing your business' Google+, Yelp, Trip Advisor and other such pages for targeted keywords, you help Google recognize what services you provide your target market.

Second, by responding often and professionally to both negative and positive comments on your pages, you're engaging users.

If you even remotely follow the latest SEO news, you know how much Google loves 'providing its users with new and engaging content'. This 'engaging content' helps these various profile pages (which are now optimized and contain a link to your page, of course) to rank higher amongst other websites in Google's organic and local SERPs.

More penalties from Google are inevitable, so diversifying your internet marketing efforts is extremely important.

Taking control and optimizing your business' listings on these social platforms is a must, and it gives your business an easy opportunity to make as many impressions on the first page of the SERPs as possible. The only cost to you is the time you take to optimize these pages and respond to customers.

Imagine this scenario: You're a small local pizzeria and you respond to most of the reviews on your Google + profile. You've offered concise explanations to those who have complained, and maybe even offered a free pizza or two to make things right. Now, someone doing a search for a local pizzeria is more likely to view your profile, complete with its many reviews and ratings, over the one with few reviews and ratings. Reading the comments and your responses, the user is more than likely going to choose the local pizzeria with the caring and involved owner over the competitors with less to say. Clicking through to your website, they read your mouthwatering menu, learn a little more about you and order. There you have it; you've just earned a potential new best customer.

Now, while one person's complaint could turn into an avalanche of negativity, if handled properly, it can not only allow a previously scorned customer to give you a second chance, but potentially draw new business. You've engaged users, been a 'quality merchant', potentially increased your rankings, made a few positive impressions, and maybe even gained a few new customers.

If you liked this post, you might also enjoy Google Updates Rankings to Penalize Negative Reviews

Written by Jason Streatch

Jason is a part time word slinger/internet troll and full time SEO specialist at Search Engine People


View the original article here

10 Email Tips for Prospecting SEOs

Posted by suerte.. On 3:17 PM No comments
From time to time (every day), the SEOs of this world get inundated with kind offers from prospective companies proposing to help them do their jobs by alleviating some of the tasks they must undertake in their daily grind. I’ve compiled some useful tips to help any budding spa-*cough* special ops email marketer put together [...]

View the original article here

Holy sweet mother of God. There’s adjustments, and there’s EVICTIONS. Ladies and gentlemen, allow me to introduce you to Google’s latest algo adjustment dubbed “Titanic.” Why “Titanic?” Because you’ll be searching for survivors, that’s why. Well, the official name has now been officially entitled “Penguin.” Our original “Titanic” moniker was at least related to [...]

View the original article here

10 Things Your SEO Doesn’t Know

Posted by suerte.. On 6:07 AM No comments
Google is an SEO’s god. It’s true and many SEOs don’t know it. Google says add this text, or remove that tag, and an SEO will race to see how fast he can please Google. Just a whisper of a new algo change will have SEOs running to their computer to see if they can discover [...]

View the original article here

This week, Matt Cutts posted a 12 second video answering the question “Do Images pass PageRank?” This 12 second video simply responds, ‘yes.’ Ladies and gentlemen, we’ve got legal, paid juice passing. This means that there is a white-hat way to flow juice while paying for it. So, yeah, it’s not a ‘paid link’ but [...]

View the original article here

Wednesday, April 17, 2013

You Can’t Compete With “Psycho”

Posted by suerte.. On 10:33 PM No comments
I often get asked, “What IS your goal, King?” “Domination to the point of demoralization,” is my answer. That’s when the bug-eyed, blank stare hits me. The concept of my strategy is so under-estimated, so feared, so rare, that hardly anyone ever gets it that THAT is what I’m doing. Go fucking psycho [...]

View the original article here

Grey Hat To White Hat Disaster

Posted by suerte.. On 6:57 PM No comments
I knew it was going to be rough. I knew it was going to be bad. I had no idea how bad. It’s a total and complete SERP disaster. So, last quarter, I wrote about how I declared Google the winner of the “you better go white hat or else” stalemate. I did paid [...]

View the original article here

Penguin Recovery Posts; please just stop

Posted by suerte.. On 3:22 PM No comments
I have seen more than a few posts over the last fucking while with shit stating they had somehow (miraculously) recovered from the recent Google Penguin algorithm update. There are also a whack more of them talking about HOW to do so. Either way, I call BULLSHIT. Why? That’s fairly simple. Allow me to make [...]

View the original article here

Partial knowledge is a dangerous thing. You have heard of Google’s Penguin update(s), maybe you have even gone as far as reading a few blog posts on the topic, not necessarily even the official version on Google’s blog or Matt Cutts’ take on it, enter the real world… You have a site to rank, what [...]

View the original article here

Recently Google announced, as part of their monthly updates, a new method of semantic analysis they believe will help them in dealing with over-aggressive search engine optimization practices. Code named ‘Orca‘ (project name; hide and seek) it will apparently be a companion to the now infamous Penguin and Panda updates many in the SEO [...]

View the original article here

Multiple Meta Descriptions

Posted by suerte.. On 4:46 AM No comments

I recently read a post by Adam Audette that was genuinely excellent – it was about maximising your click through rate in organic SERPs, by having very well presented search snippets. I’m always keen on having well written titles and meta descriptions, and I find it surprising that snippets in search results are, for the most part, pretty terrible. Check out SEOptimise’s excellent post on title tags if you’re looking for ways to improve there.

Patrick Altoft had an interesting tip about leaving the brand name out of the title tag – while this may not work for everyone, the idea is that for a generic keyword search (like “red widgets”), Google may display a title that’s optimised for that term. If the search term is branded, however (“Brand name”), then Google will most likely use the Dmoz title.

In a similar way, you can actually have multiple meta descriptions – potentially one for the keyword, and one for the brand name. This isn’t recommended for everyone, and I wouldn’t recommend it for many pages on your site, but it’s possible. The regular limit for meta descriptions to be displayed in full in Google is 156 characters (although I tend to stick to around 154 characters). I recently experimented with having a double length meta description – with the first snippet being designed to be well written for a generic keyword, and the second snippet written for a brand search. The full meta description for my homepage is this:

“Dave is a freelance SEO consultant, specialising in creative link building and in-depth technical site audits. To find out more, feel free to get in touch. Shark SEO is a search marketing blog with free advice on ranking your site better in Google, Bing & Yahoo. Check out the SEO blog today at SharkSEO.com.”

That’s twice the length of a regular meta description. Now check out the snippet for “freelance SEO consultant”:

And here’s the snippet for the search “Shark SEO blog”, which again returns the homepage:

When you put multiple snippets in the same meta description tag, it looks as if Google will use the snippet that’s most suitable for the query.
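Concretely, the doubled description is still a single tag in the page's head; using the exact text quoted above, it would look like this:

```html
<!-- One meta description tag, roughly twice the usual ~156-character
     display limit: the first half targets the generic keyword search,
     the second half targets the brand search. -->
<meta name="description" content="Dave is a freelance SEO consultant, specialising in creative link building and in-depth technical site audits. To find out more, feel free to get in touch. Shark SEO is a search marketing blog with free advice on ranking your site better in Google, Bing &amp; Yahoo. Check out the SEO blog today at SharkSEO.com.">
```

Each half reads as a complete, well-written snippet on its own, which is what lets Google pick whichever one fits the query.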

Flickr image by Amir K.


Source

How Google could detect paid links

Posted by suerte.. On 4:41 AM No comments

Google’s Penguin update, and the unnatural link warnings they’ve been sending out through Webmaster Tools, shows that they’re now looking to penalise suspicious & paid links instead of just devaluing them.

But the thing that really interests me is how Google determines which links are paid and which aren’t. If you’re an SEO, when you see a paid link, in most cases it’s generally pretty obvious if it’s unnatural or paid for – but it’s not as simple for a machine to detect.

There’s been some speculation as to what kinds of signals Google is looking at – sites that have the link warnings apparently tend to have a lot of sitewide links, most likely in footers and sidebars – and they also often have a very high keyword to brand anchor text ratio.

I’m not convinced that the ratio of anchor text used is enough to flag links as suspicious, at least not on it’s own. If the site or page doesn’t have that many links, then it’s a small sample size that could be easily skewed, and could lead to a lot of false positives. Another issue is that exact match domains would effectively get a free pass (although, that might still be true).

I have a theory – and please note, this hasn’t been proven – that Google is looking at another signal to work out which links are suspicious. One of the big differences between paid links and natural links is when they’re placed. The majority of paid links are added to pages retroactively – i.e. a website has a page that mentions car insurance and a company might then approach them and offer to pay them on a monthly basis to change that text to a link.

I believe that if Google has crawled a page, and then at a later date recrawls that page and discovers a new link – with hardly any extra content added – that link is now flagged as suspicious. They might devalue it, they might send out a webmaster tools message or they might do both – but that link could well be flagged. The exception to this is if the page they’re crawling is the homepage, and potentially category pages, where content might change frequently.

If a reasonable chunk of text is also added at the same time as the link, then it potentially wouldn’t be flagged (so genuine updates to news articles wouldn’t accidentally flag that link).

Other times, a paid link might be added to a sidebar in the form of a banner ad, or in a blogroll link, or as a link in the footer. These are, in 99% of cases, now sitewide links. They’d potentially trip the same filter as above, because those links would appear on pages that Google has already crawled, but there’d also be a higher percentage of false positives here (i.e. good links being flagged as bad) as bloggers often link to sites they genuinely endorse in blogrolls too.

If I were Google, I’d treat those links differently to deal with the increase in false positives. Unless I was confident that the link was classified correctly as either paid or natural, I’d consider silently devaluing that link and not sending out a link warning. After a time limit (maybe 6 months, maybe a year), I’d allow that link to start flowing Page Rank. If you’re buying links, you don’t want to pay for them and not have them work for months – you might be more likely to notice that the links you’re building aren’t working, so you stop renewing them. If it’s a genuine editorial link in a blogroll, then it’s more likely that that site can wait a while before getting the link value – because that link is mainly serving to pass them useful traffic.
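To make the theory concrete, here is a toy Python sketch of that recrawl heuristic. To be clear, this is my speculation, not Google's actual method; the regex "parser" and the 200-character threshold are arbitrary stand-ins for illustration:

```python
import re

def extract_links(html):
    """Collect href targets from anchor tags (toy regex extraction)."""
    return set(re.findall(r'href="([^"]+)"', html))

def visible_text_length(html):
    """Rough length of the page text with markup stripped."""
    return len(re.sub(r'<[^>]+>', '', html))

def flag_suspicious_links(old_html, new_html, min_new_text=200):
    """Flag links that appeared between crawls with hardly any new content.

    A link added retroactively to existing text (the classic paid-link
    pattern) trips the filter; a link arriving alongside a genuine
    content update does not.
    """
    new_links = extract_links(new_html) - extract_links(old_html)
    text_added = visible_text_length(new_html) - visible_text_length(old_html)
    if new_links and text_added < min_new_text:
        return new_links
    return set()
```

Homepages and category pages, where links churn naturally, would need to be exempted, as discussed above.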

“I’d like to get a few paid link reports anyway because I’m excited about trying some ideas here at Google to augment our existing algorithms” – Matt Cutts, 2007

I think this is probably something that Google have been doing for a while, way before the webmaster tools warnings were sent out. Matt Cutts mentioned in the past that they’ve been working on algorithms to automatically detect paid links, and I imagine there are probably other signals they’re looking at too.


Source

At the end of last year Danny Sullivan wrote an article for Search Engine Land titled “What Social Signals do Google & Bing Really Count?” which featured an interview between representatives from both search engines. The article confirmed that Google and Bing use Twitter and (possibly to a lesser extent) Facebook as another signal to determine where a site is able to rank in the regular search results.

While a lot of SEOs had begun to suspect that tweeted links were influencing rankings, it was really good to see it actually confirmed.

What Google & Bing didn’t mention, though, was how strongly they were using these social signals as a ranking factor. Google has claimed for years now that there are over 200 ranking factors, so it’s hard to say whether their use of Twitter is a majorly influential factor (like links) or whether it’s just one of many neglible factors.

Google also failed to mention how long the Twitter effect would last – I think quite a few people may expect it to be a very time-sensitive thing, particularly around breaking news. The assumption is that, when Google uses tweets to boost a page for a search term, the ‘Twitter effect’ will eventually stop being such a strong ranking factor after enough time (or when the tweets stop) and then the regular SEO factors (links, on-page keywords, etc) start to take over. This wasn’t confirmed or suggested, it’s just what I would have expected.

A final point that wasn’t mentioned is whether or not Google differentiates between tweets from specific countries – so whether tweets from UK users to a specific page helps boost that page in Google.co.uk, or whether it also helps in US results in Google.com.

These two points – tweet locations and how long the Twitter effect lasts for – are something that I wanted to look into because of a post I wrote a while ago on Raven Tools. I wrote it very shortly after Sugarrae published hers, and I noticed something interesting about the two posts – my post very quickly started to rank very well for the term “Raven Tools” in Google.co.uk, out-ranking Rae’s even though I linked to her post from mine, and despite the fact that Sugarrae’s post, by all the regular SEO metrics like number of links and domain authority, greatly deserved to outrank my post. My post ranked so well on Google.co.uk that the only domain that outranked it was Raventools.com itself. This wasn’t true in Google.com though; the US results showed the results that you’d normally expect, with Sugarrae outranking me and with my site towards the bottom of page 1. I should also point out, my site isn’t geo-targeted to any location in particular.

At the time I assumed it was some kind of query-deserves-freshness effect, and that eventually my site would drop down the search results. That would fit with my original idea that Google uses Twitter to spot breaking news and promote tweeted articles while the topic is hot, then drops those articles in favour of the most linked-to ones over time, once the topic isn’t being tweeted about as much.

It’s been over 5 months since my Raven post, and it’s still only outranked by Raventools.com in the UK.

This would imply that, in this case at least, the Twitter effect may not be time-based, and tweets from months ago may still help your page to rank well.

I wanted to look into why my post was ranking well in the UK results, but not anywhere else. It’s a .com, hosted in the US, and it isn’t geo-targeted to any country, so Google shouldn’t consider it a UK-specific site.

Using Backtweets I grabbed a load of data around who tweeted my post and compared it with who tweeted Sugarrae’s. An important point to remember is that Google is likely treating some tweets differently to others, depending on how authoritative it thinks a Twitter user is.

While Sugarrae had more tweets to her article than I had to mine (she had 23 to my 13), the majority of my tweets were from people who had their location set to somewhere in the UK (9 of the 13), while Sugarrae had the vast majority of hers from the US (17 of her 23), with only 2 UK tweets.

This would suggest that Google is using the location of tweets to determine which search engine the page gets a boost in. The theory is that if a page becomes incredibly popular amongst UK tweeters, it may only be relevant to people in the UK, and so it only gets a boost in Google.co.uk. This is an observation from just this one specific example – it’s not a cold, hard scientific fact – but if anyone was planning on testing how tweeted links can affect rankings, I’d suggest looking into how long the effect lasts, and whether the location of the Twitter user plays a part.
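The UK/US split above can be sketched in a few lines of Python. The counts match the ones in this post (13 tweets with 9 UK for mine, 23 with 17 US and 2 UK for Sugarrae’s), but the profile location strings themselves are invented for illustration:

```python
# Hypothetical profile locations, matching the counts from the post
my_tweets = ["London, UK"] * 9 + ["New York, US"] * 4
raes_tweets = ["San Francisco, US"] * 17 + ["Manchester, UK"] * 2 + ["Unknown"] * 4

def uk_share(locations):
    """Fraction of tweets whose profile location mentions the UK."""
    return sum(1 for loc in locations if "UK" in loc) / len(locations)

print(f"My post:    {uk_share(my_tweets):.0%} UK tweets")
print(f"Sugarrae's: {uk_share(raes_tweets):.0%} UK tweets")
```

Even on this tiny sample, the difference (roughly 69% UK versus 9% UK) is stark enough to suggest a location signal.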

And you can download the sheet here, if you’re so inclined.

Flickr image from view-askew.

Thanks to SEO Scientist Neyne for the title advice.


Source

Robots.txt & Duplicate Content

Posted by suerte.. On 4:32 AM

As most SEOs know, the robots.txt file sits in the root of the site, and is a list of instructions for search engines (and other bots, if they adhere to it) to follow. You can use it to specify where your XML Sitemap is, as well as prevent Google and the other search engines from accessing pages that you choose to block.

Every time Googlebot arrives at your site, it will first check to see if you have a robots.txt file. If the robots.txt file blocks any pages, Google won’t crawl them.
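As a minimal illustration (the paths and domain are hypothetical, not from any real site), a robots.txt that declares a sitemap and blocks a directory looks like this:

```
# Applies to all crawlers that honour robots.txt
User-agent: *
Disallow: /print/

Sitemap: http://www.example.com/sitemap.xml
```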

For years, website owners and web developers have used the robots.txt file to block Google from accessing duplicate content. From blocking URLs that use tracking parameters, blocking the mobile or print version of sites or just to fix flaws in CMS’s, I’ve seen a lot of duplicate content blocked with robots.txt in my time.

But the robots.txt file is a terrible way to deal with duplicate content. Even if you’re 301 redirecting the duplicate URL to the real one, or using the canonical tag to reference the proper URL, the robots.txt file works against you.

If you have a 301 that redirects to the proper page, but you block the old URL with robots.txt, Google isn’t allowed to crawl that page to see the 301. For example, have a look at Ebookers’ listing for ‘flights’:

The URL that’s ranking (on page 1 of Google for ‘flights’) is blocked in robots.txt. It has no proper snippet because Google can’t see what’s on the page, so it’s had to guess the title based on the anchor text other sites have linked to it with. And here’s the reason why Google can’t crawl that URL:

If Ebookers unblocked that URL, Google would be able to crawl it to discover the 301, and the page would most likely have a better chance of ranking higher (as it wouldn’t just appear to be a blank page to the search engines).

If you block Google from seeing a duplicate page, it’s not able to crawl it and see that it’s a duplicate. If there’s a canonical tag on that page, it may as well not be there, as Google won’t be able to see it. If it redirects elsewhere, Google won’t know.
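You can see the mechanics of this with Python’s standard-library robots.txt parser. The URL and rules below are a made-up stand-in for the Ebookers example, not their actual robots.txt:

```python
from urllib import robotparser

# Hypothetical rules: the old URL 301-redirects to /flights,
# but it's also disallowed in robots.txt
rp = robotparser.RobotFileParser()
rp.parse("""\
User-agent: *
Disallow: /old-flights-page
""".splitlines())

# A polite crawler checks robots.txt before every request. Because the
# old URL is blocked, it never fetches it - so it never sees the 301
# (or any canonical tag sitting on that page).
old_url = "https://example.com/old-flights-page"
if rp.can_fetch("Googlebot", old_url):
    print("crawl", old_url)  # this branch never runs
else:
    print("blocked - the redirect stays invisible to the crawler")
```

The fix is the same as in the Ebookers case: remove the Disallow rule so the crawler can reach the page and follow the redirect.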

If you have duplicate content, don’t block the search engines from seeing it. You’ll just prevent the links to those blocked pages from fully counting.

Flickr image from Solo.


Source

Does Google Have A Secret Tablet?

Posted by suerte.. On 4:27 AM

As an SEO blog, this site tends to get a few visits from Google employees every now and then. I was looking through my Google Analytics stats the other day and noticed that, after writing my startup SEO advice post, I had a visit from Google Ireland that I couldn’t really explain.

There was a visit from Google based in Dublin, with the screen resolution 800 x 1153. Looking further into it, whatever that device is, it runs Android (and Google Analytics reports Safari as the browser, although I’m pretty sure that’s because Android’s default browser uses WebKit, which GA may simply record as Safari).

It also has Flash installed:

From checking around, and from looking through Wikipedia’s list of Android devices, I genuinely can’t find what device this is. Is this a Googler who’s hacked a different device and installed Android on it, or does Google have a secret tablet?

If anyone knows what this device is – please, please put me out of my misery and let me know in the comments.

Flickr photo from Leo Reynolds


Source

Google’s Love Affair with Anchor Text

Posted by suerte.. On 4:22 AM

The majority of SEOs (and possibly most site owners) know that the search engines heavily value links with optimised anchor text. A link with the text “cheap car insurance” will help you rank for “cheap car insurance”. That sounds obvious, although it’s also kind of sad, because that’s not really how normal people link.

What’s less clear is just how much Google weights anchor text in its algorithm compared to other search engines. Ordinarily it would be difficult to test this – you’d need to find a huge range of varied sites, all linked to with a common phrase. Luckily, Hacker News provides a good example with the phrase “Show HN”, as people often link to their new startup/project using that phrase.

Here’s what Duck Duck Go displays for the query “Show HN”:

While it (brilliantly) has a !bang syntax for searching HNSearch.com, its regular results show pretty much what you’d expect – sites that use “Show HN” in the title tag of the page and within the text of the page, along with something like hn-show.com, which features the keywords within its domain name.

And here’s what Blekko shows:

Blekko is relatively similar, in that it promotes sites that use the word “Show” in the title tag and on the page a lot (maybe not as much with “HN” though).

Here’s Bing’s results:

Bing, weirdly, doesn’t have any results from Hacker News in its top 10 – the first result is from FriendFeed. After that, it very heavily focuses on the keyword being in the domain name or in the title tag & on-page text.

And finally, here’s Google’s results:

Other than the first result from Hacker News, not a single listing features the text “Show HN” in the title, domain or on-page text. They’re ranking for the phrase, despite not mentioning it anywhere on the page, because some of the links pointing to them include the phrase “Show HN”.

Keep in mind, this may be an edge case – typical on-page weightings might be dialled up for search terms that are more heavily searched for.

I’m not saying that this is necessarily a flaw in Google’s results at all – I much prefer Google’s results in this edge case than to the other search engines. I just wanted to highlight how Google appears to weight anchor text very heavily – much more so than the others.


Source

Bing’s Google Argument Makes No Sense

Posted by suerte.. On 4:18 AM

Recently Google accused Bing of effectively copying their results by using toolbar data, and data from Internet Explorer if the suggested sites feature is enabled – you can read Google’s side of the story here, and the story of Bing’s response here.

I’m not going to explain it all in too much detail because I think those two articles cover it quite well, but as a quick summary:

1. Google suspected Bing of using some of Google’s data in Bing’s results
2. Google set up a test to prove this – by allowing pages to rank for “synthetic queries” (Googlewhacks), using IE8 with the Bing bar installed to search for and then visit those pages, and then found Bing returning around 9% of those results a few weeks later
3. Bing very strongly denied “copying” Google’s results once accused

Bing’s description of what’s happening appears to be around the use of “clickstream data” – it sounds like the Bing toolbar (and IE with suggested sites) looks at which pages you’re on and which pages you visit afterwards. This isn’t restricted to Google – this is, apparently, for all pages on the Internet.

There’s arguments from people saying that Google is right to find this unacceptable, and others saying that Bing is in the right.

I was actually quite surprised by the number of people siding with Bing over this; there’s something about Bing using its browser to collect user data from competitors that doesn’t sit quite right with me. Regardless, I was surprised by some of the things that Bing said to defend itself.

Google engaged in a “honeypot” attack to trick Bing. In simple terms, Google’s “experiment” was rigged to manipulate Bing search results through a type of attack also known as “click fraud.” That’s right, the same type of attack employed by spammers on the web to trick consumers

– Yusuf Mehdi, Bing.

What Bing is complaining about here, is that Google engineers chose to adjust Google’s results for specific terms, searched in Google for those keywords and then clicked on those listings. In Google. That’s not an “attack”, nor is it a “trick” and it’s definitely not “click-fraud”.

Bing also mentions that the clickstream data that they’re using is one of 1,000 signals used to determine where a site should rank, and that the honeypot keywords that Google used were noticeable because they were outliers – and as such they only really had the clickstream data to go on.

But this is what I don’t fully understand – the clickstream data itself. Bing says that the clickstream data isn’t just for Google – it’s for all sites on the web. But of course, Google – their biggest competitor – is the second most visited site on the Internet from the US, so it’s fair to say that a very hefty chunk of that clickstream data actually contains data from people searching on Google.

The other thing I don’t understand is what happens when you scale that clickstream data. We’ve only seen what happens when it’s used on 100 invented terms from Google’s honeypot test, where around 9% of those queries then appeared to affect Bing’s results. Bing implies that this isn’t a lot, and that the effect is much smaller when it’s scaled – but I’m not so sure. I’d actually be quite surprised if, when this was scaled to something the size of the Bing toolbar’s userbase, there wasn’t a very noticeable impact on Bing’s results. This is one of those things that cannot really be proved – we have to take Bing’s word for it.

During the Farsight video, the Bing rep mentioned that they were only using publicly available clickstream data – but of course, that data isn’t publicly available. The data is coming from a toolbar, and the conditions are, let’s face it, buried away somewhere in a EULA which nobody in their right mind ever reads. These users have legally opted in to sharing that data, but I don’t think they’re aware of it.

Regardless of that, though – Bing is taking data from Google users, who are searching on Google and allowing it to influence Bing’s search results. It may be legal, but it doesn’t mean you have to agree with it.

Flickr image from reway2007.


Source

Clever Keyword Research

Posted by suerte.. On 4:15 AM

Richard Baxter from SEO Gadget has released a keyword research tool that does some particularly clever things. I’ve been using it for a while now, and it’s clear that there’s been a very big focus on usability as well as a focus on it providing you with something that’s instantly actionable.

The set-up process is very straightforward. You first add your URL, and then sync it up with your Google Analytics account – this step is optional; you can skip the GA import and just copy and paste the keywords you want it to focus on (useful if you’re using a different analytics package to Google Analytics). The tool then runs through all of the selected keywords – the ones you’ve manually put in, or the ones that have driven traffic to the site if you’ve used the Google Analytics option – and checks your ranking for each term, along with the estimated search volume (on exact match) from Google’s keyword tool. The ranking data can take a while to be gathered, depending on how many keywords you’ve added, but the search volume appears remarkably quickly. You are then presented with your keywords, graphed by where they rank, what their rough potential is based on the expected search volume, and how much traffic they’ve actually sent you.

You’re then free to start creating categories and filters.

One of the main features is being able to group sections of your keywords into categories. For example, the default category is ‘Brand’, and automatically includes all of your brand terms. You can create your own categories if you want to further separate things out – for example, if you’re a clothing retailer, you can create a category for things like ‘shoes’ and ‘boots’, one for ‘jeans’, and so on. As an example, the graphs used here are from a snowboard site that I own – I have categories for “brand name”, “boots” and “boards”.

The one thing that I’d really like to see is a negative option, so you could create a category that *doesn’t* include your brand terms. That change would mainly benefit sites that have a huge amount of longtail brand traffic that are using the GA import option, although it’s far from essential.

Setting up categories is ridiculously easy, and gives you the ability to quickly drill down into particular product areas (especially useful if you’re dealing with a huge data set). This is particularly useful for retail sites, as you can separate out product types easily and quickly.

Filters allow you to manipulate your data by setting up rules – rules like “show me keywords that have a search volume greater than 1000 according to the AdWords keyword tool, that I rank at the bottom of page 1 for and that sent me at least 50 visits”. This is what that rule would look like:

Easy.
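Under the hood, a filter like that is just a predicate applied to each keyword row. Here’s a rough sketch in Python – the keywords, numbers and the “bottom of page 1 = positions 7–10” cut-off are all my own invented stand-ins, not the tool’s actual data or logic:

```python
# Hypothetical keyword data of the kind the tool pulls in from
# Google Analytics and the AdWords keyword tool
keywords = [
    {"term": "snowboard boots",  "volume": 2400, "rank": 8,  "visits": 120},
    {"term": "cheap snowboards", "volume": 5400, "rank": 9,  "visits": 60},
    {"term": "snowboard wax",    "volume": 880,  "rank": 3,  "visits": 300},
    {"term": "burton boards",    "volume": 1900, "rank": 15, "visits": 10},
]

# "search volume greater than 1000, ranking at the bottom of
#  page 1, and at least 50 visits"
shortlist = [
    kw for kw in keywords
    if kw["volume"] > 1000 and 7 <= kw["rank"] <= 10 and kw["visits"] >= 50
]

# Highest-volume opportunities first
for kw in sorted(shortlist, key=lambda k: k["volume"], reverse=True):
    print(kw["term"], "- rank", kw["rank"], "- volume", kw["volume"])
```

The tool’s value is in letting you build and tweak these rules without touching a line of code, but the underlying idea is that simple.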

You’re then free to slice and dice the data to how you see fit – as an example, I now have a list, in order of priority, of which terms I should go after if I want to sell more snowboards:

The graph has been filtered to only show keywords that mention “boards” or “snowboards”, that have sent traffic to the site already and where I’m not ranking in the top 5.

The real beauty in this is the speed at which you can slice and dice the data. If I wanted to focus on boots, or snowboard bindings, I can be up and running within seconds. That’s especially useful if you have a large site with a lot of potential categories. The best way to see it in action, though, is to actually play with it – Richard has been kind enough to give out a coupon. At the sign up page, if you use the coupon code SHARKSEO, you can get 1 month free.

Flickr image by C.SooHoo.


Source

Site search