Tuesday, October 8, 2013

Graph Search Recruiter Makes It Easy to Find Talent on Facebook


It is now easier than ever to find the perfect job candidate on Facebook, thanks to a new Graph Search add-on.
Graph Search Recruiter, a new tool from social-recruiting firm Work4, leverages the social network’s Graph Search to give recruiters the chance to source talent from Facebook’s 1.1 billion-strong membership.
“Facebook Graph Search is a revolutionary tool in and of itself,” Work4 CEO Stéphane Le Viet says in a press release.
“But we saw the need for recruiters to be able to quickly and easily access the millions of potential candidates and sort through the data efficiently. While LinkedIn gives recruiters access to search about 25 percent of the market, Facebook gives recruiters access to everyone, everywhere.”
The recruiter dashboard, launched Monday, allows companies to sync their career sites to Facebook’s network. The tool — using a Graph Search-like algorithm — finds potential candidates based on occupation and location.
“Graph Search Recruiter gives you a way to source talent for your job postings on Facebook with a click of a button,” Le Viet says in a blog post. “Need software engineers in San Francisco? Simply click a button, and Graph Search Recruiter will search Facebook for you — and instantly show you a list of your potential candidates, who you can message for a fraction of the cost of LinkedIn.”
Users of Work4 can keep tabs on candidates they have contacted and those who have applied to one of their jobs, as well as receive suggestions for potential candidates. Graph Search Recruiter also means companies will have to shell out less money for recruiting.
“Graph Search Recruiter will be a game changer for the social recruiting industry, because it puts the largest passive talent pool in the world–Facebook–directly at recruiters’ fingertips,” said Le Viet. “With Graph Search Recruiter, for the first time everyone in the world becomes a potential candidate, saving recruiters time and money while making sourcing for the right fit easier than ever before.”

Why You Can’t Trick Google, at Least Not For Long


The term keyword stuffing has nothing to do with the edible concoction of vegetables, spices and bread that adds heft to a meat dish. In years past, however, many Web denizens crammed keywords into their sites to game Google and other search engines and get to page one of the search results. They literally stuffed their websites with popular keywords, and the result was a concoction all right, but of a particularly indigestible kind.
Keywords are really nothing more than the most common words or phrases that people searching the Web enter into the search box. Sprinkling (rather than stuffing) popular keywords into website design and relevant content is the only thing that works now, because Google and the rest are on to the stuffing scam and will definitely penalize — if not exclude altogether — any website that tries it.
The classic joke about this topic goes: “An SEO walks into a bar, bars, beer garden, hangout, lounge, night club, mini bar, bar stool, tavern, pub, beer, wine, whiskey.”
What is keyword stuffing, really?
Keyword stuffing isn’t just inserting a block or list of keywords helter-skelter on web pages. Here are three examples:
  • Repeating keywords and phrases unnaturally. Here’s an example using “cheap athletic shoes”: “Are you looking for cheap athletic shoes? If you need cheap athletic shoes, look no further. Our cheap athletic shoes website… etc.”
  • Using alternate keyword phrases that mean the same thing: “Are you looking for cheap athletic shoes? If you need inexpensive athletic footwear, look no further. Our low-cost athletic running shoes…etc.”
  • Creating bad (or awkward) grammar or phrasing with keywords and including locations: “Cheap Los Angeles athletic shoes can be difficult to locate…”
When Google caught on, one local SEO and Google+ consultant, Al Remetch, put it this way:
“Another keyword stuffing tactic was the excessive use of the keyword in the title, in headings, and throughout the content. Keywords were popping up all over the place.  And again Google caught on and began penalizing this keyword stuffing tactic.”
Obviously, the penalty of being banished from Google’s first page was pretty severe for the site owners, but welcome relief for most users. (Most Web users rarely proceed beyond page one for any search.) Still, the fear of being penalized shouldn't paralyze you; according to Remetch, “for many that fear is irrational.” Don’t sprinkle keywords unnaturally and redundantly into your website, but follow this advice:
“If a keyword fits naturally into the content without looking forced it is ok to use [the] keyword as often as needed and you won’t go to Google jail for keyword stuffing.”
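If you want a rough sanity check on your own copy, you can measure keyword density yourself. Here is a minimal Python sketch; the sample copy and the phrase are taken from the stuffing examples above, and there is no official "safe" density figure, so treat the output as a smell test only:

```
import re

def keyword_density(text, phrase):
    """Fraction of the copy's words consumed by repetitions of a phrase."""
    words = re.findall(r"[a-z']+", text.lower())
    hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
    return hits * len(phrase.split()) / max(len(words), 1)

copy = ("Are you looking for cheap athletic shoes? If you need cheap athletic "
        "shoes, look no further. Our cheap athletic shoes website has them.")
print(f"{keyword_density(copy, 'cheap athletic shoes'):.0%}")  # ~39% - stuffed!
```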
What does Google look for?
All the SEO experts agree that, given the most recent changes to Google, the most important thing you can do is just be natural and create a lot of content. One of the key things Google looks for is how current your site is. You can have lots of great pages, but there’s a real sense of “What have you done for me lately?” permeating all search results these days.
Staying out of “Google Jail”
Anyone interested in (1) knowing what Google considers the best practices in SEO and keyword use, and (2) knowing how to apply those practices in optimal website design only needs to read Google’s free Search Engine Optimization Starter Guide. The strictures against keyword stuffing appear in the context of what to avoid. For example:
“Stuffing keywords into alt text or copying and pasting entire sentences.” (Alt text is the descriptive text used by text browsers and screen readers that can’t display images.)
and
“stuffing unneeded keywords into your title tags.” (The title tag is the page title that appears in the browser tab and as the headline of a search result listing; it is not the little tooltip box that pops up when you hover over an element.)
How can you avoid keyword stuffing?
In addition to laying off the spam tactics, here’s how you can give your readers and customers the best Web experience:
  • Read up on and use the Long-Tail Method when writing your page or topic titles. This is a brilliantly simple method of targeting general, medium-sized and tightly focused searches. In short, even though there are billions of Google searches every day, about 15 percent of ALL searches have never happened before. People are wonderfully complex and individual, and the stuff they search for and the way they search for it is, too.
  • Only use keywords that are totally relevant to your topic or product. Think about how your potential customers would search for you.
  • Finally, the very best advice is the most stunningly simple approach: Don’t over-think the problem. Just concentrate on producing naturally useful content that uses the best keywords in the most natural context. Providing authentic valuable content that both informs your customers and boosts your Google ranking is key to success. The Google search engine always rewards the best content with a high ranking on its hit parade.

Social Media Marketing Vs Social Networking Portals

Although most of us use the terms interchangeably, there is a difference between social media and social networking. For me, understanding the distinctions was a big aha moment. To develop a comprehensive and effective digital marketing strategy, it is helpful (even crucial) to understand the differences.
Social Media
Social media is the media (content) that you upload -- whether that's a blog, video, slideshow, podcast, newsletter or an eBook. Consider social media as a one-to-many communication method. Although people can respond and comment, you own the content and have to produce (write/record/create) the media yourself.
Your social media goal and strategy: Decide if you want to connect with your audience in the form of a blog, video, newsletter, podcast or eBook. Blogs are a great way to get started, particularly if you are a non-fiction author, since you can re-purpose your book's content into blog posts or use the book as an idea generator for your blogs. Or you can create blogs from scratch that eventually become the framework for your next book. Blogs help brand authors, increase exposure, and can easily be shared, consistently helping to increase your following and enhance your promotional efforts. Plus, you can always include links in your blog directly to your book, driving traffic to your book.
Social Networking
Once you decide what media you are going to use, begin with social networking sites like Facebook and Twitter to engage with your audience. Having a Facebook business page for you and your brand is essential because, as you know, people on Facebook read books and will tell their friends and colleagues about your book. Facebook and Twitter also provide you with numerous opportunities to connect with your prospective audience through web links, posts, news stories, notes, photo sharing, blog posts, direct messages, questions and comments. Eventually you may want to branch out with other social networking platforms like LinkedIn and Pinterest.
Social networking is all about engagement -- creating relationships, communicating with your readers, building your following and connecting with your online audience. If you treat social networking like social media, you will come off as someone using a bullhorn. It's important to listen as much as talk with social networking.
Your social networking goal and strategy: Your social networking goal is to interact, converse and create conversation: search conversations, begin new conversations, set alerts to monitor your name, find new ways to connect. Realize social networking is a marathon, not a sprint, as it will take time to build relationships and grow your following. Work for buzz and excitement around your book, product or service. Remember that people naturally gravitate towards people who they find relatable and who have similar experiences and interests. Investing in relationships can build loyal fans.
There is some overlap and integration between social media and social networking. Social media experts at SocialMediaExaminer.com say that Facebook, Twitter and Pinterest are whole-package platforms -- considered both social media (tools) and social networking (a way to engage). YouTube, on the other hand, is a tool for video, so it's social media. Chatting with colleagues on LinkedIn? That's social networking. Both work together for your overall social media strategy.
To develop your digital strategy, decide what types of media you want to create and use social networking to build up your following so you can brand yourself as an author.
Once you successfully have your social media and social networking strategies working in harmony, you will be more connected with your audience and be able to more effectively promote not only your books, but also your apps, conferences, videos, webinars, websites and more. You will be actively increasing the value of your personal brand as an author, and reaching the right people with your unique message.
Fauzia Burke is the Founder and President of FSB Associates, a digital publicity and marketing firm specializing in creating awareness for books and authors. For online publicity, book publishing and social media news, follow Fauzia on Twitter: @FauziaBurke. To talk with FSB and ask your book publicity questions, please join us on Facebook.

Using Social Media To Boost SEO

Google’s Penguin algorithm update was designed to allow the Google search engine to tell good websites apart from poor quality ones.
Whether your website has been affected by the Penguin and Panda updates or not, you need to stay alert to the general direction that Google is taking with these updates. Google uses these updates to help promote only good, useful websites on its search results. If you don’t respond to these new efforts by Google, you could well find your search engine ranking yanked out from under you on a future update.
How social media figures in your efforts to rank well
While Google cannot actually understand the content on a website to know if it is valuable, it can turn to the general public for a few clues. When human visitors approve of what they see on a website, they are likely to talk about their experience on social media. Checking a page’s social media activity, then, is a valuable way for Google to tell if a website is really popular.
Going forward, social signals are likely to grow in their importance to Google’s ranking algorithm. Businesses need to try their best to gather a following of people who talk about them on the social networks. It may not even be enough to do this on Facebook. Gathering a following on Google’s own Google+ social network will be important to ranking on Google in the future. Google, after all, has better access to content on Google+ than content on Facebook.
Building a following on the social networks
Many business websites try to game Google by buying fake Facebook Likes and building fake Facebook pages that they can get to follow their websites. While these moves might net you a reasonable search ranking for some time, you never know when the next Google or Facebook algorithm update will find you out.
A far better way is to simply start with a quality content marketing plan and publish good content that draws genuine interest. When you get a genuine following behind your website, you can earn your social visibility on Google the safe and natural way.
Publishing good content isn’t as difficult as it sounds. Often, the social networks themselves offer great clues on what to write about.
  • Look at the comments on your blog posts and social media posts. Often, these comments contain the questions that readers have. Answering these questions can be a great way to find interesting ideas to write about.
  • Check out social media pages to do with your general industry. You are likely to come across a number of instances where people ask questions. If you are in the landscaping business, for instance, you can find plenty of questions by people about what time is best to plant a lawn, to prune trees, and so on. If you are in the interior design business, you will see people asking questions about how to make hardwood floors less slippery and how to make a large room look cozy, among other things.
Using the social networks to build backlinks
You need to market your blog to your social following. You can do this by placing links on your social networking accounts for each new blog post that you publish, and you can use automated tools to do this. These tools announce new blog posts on your social networking accounts to get people's attention. The more people read your new blog posts, the more likely you are to earn backlinks and improve your search rankings.
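As a rough illustration of what such a tool does, here is a minimal Python sketch. It assumes the third-party feedparser library, and post_update() is a hypothetical stand-in for whatever posting API or scheduling service you actually use:

```
import feedparser  # third-party RSS/Atom parser

def post_update(message):
    # Hypothetical stand-in: swap in your social network API or scheduler here.
    print("Posting:", message)

# Read the blog's feed and announce the most recent posts.
feed = feedparser.parse("http://example.com/blog/rss")
for entry in feed.entries[:3]:
    post_update(f"New on the blog: {entry.title} {entry.link}")
```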
It is best not to rely completely on the search engines
Google’s algorithm updates aren’t precise. They cause considerable collateral damage: legitimate websites often see their search engine rankings fall when Google releases an update. Every business should try to reduce its reliance on natural search engine visibility for this reason. A strong social media presence can be a good way to achieve this. When you have a dependable following on social media, you can rely on that visibility even if the search engines don’t support you.
Trying your best to build yourself up as an authority in your niche is a good way to gain visibility on social media. When you publish regular posts that prove you are an authority in your field, you become the primary resource people head for when they need the kind of information you provide. This kind of reputation is what helps websites survive fickle search engine algorithms.

Why Users Delete Your Mobile App

Mobile app stores are growing by the minute. There are over 45,000 apps added to app stores every month.
It isn't enough to just develop and launch your mobile app. Getting users aware of your mobile app can be competitive and expensive.
In 2012, the average user downloaded 80 apps per device, eight times as many as the 10 apps downloaded per device in 2008.
With those numbers it's quite a feat to get your app downloaded, but what can be even harder is getting users to use your app more than once. Marketers and developers need to plan and execute marketing efforts to keep users engaged after your app is downloaded to their device.

Why Users Delete Mobile Apps

The data doesn't lie: most of your users either aren't using your app or are uninstalling it altogether. Research by Mobilewalla revealed that users eventually delete 90 percent of all downloaded apps. Make one wrong move that angers or frustrates users – and chances are your app will be deleted.
Types of Problems Mobile App Users Encounter
A survey by Compuware sheds light on the most common reasons why users delete mobile apps or give them bad reviews. The number one cause: freezes. Sixty-two percent of the people surveyed said they would delete an app if it froze up.
Another survey from uSamp revealed that 71 percent of users said they would delete an app that crashed, 59 percent for slow responsiveness, 55 percent for heavy battery usage, and 53 percent for too many ads. That's a lot of pressure for developers and marketers!
This tells us a couple of things:
  1. Mobile customers are intolerant and fickle. (You know you are). If your app isn't a knockout on first impression, it's probably going to be deleted or will be forgotten on their smartphones.
  2. You're potentially losing thousands or millions of dollars in revenue. That's no joke.
Profit in the app world is a numbers game dictated largely by your app users. Whether your app's business model is pay-to-play, in-app purchases, advertising, or freemium, the amount you can theoretically generate from an app depends heavily on the number of users you have.
According to a Gartner study, in-app purchases will account for 17 percent of app store revenue this year, amounting to more than $4 billion. Advertising, which relies heavily on mobile app usage data, will account for 7 percent of revenue.

Why Users Abandon Your App

Here are top reasons why users abandon your app:
  • Complex or bad registration process: Users won't keep an app if the registration process is complicated. Mobile users want to start using their new apps quickly. If logging in isn't a fluid experience, you may have users leave your app for good. Greet new users with useful welcome messages or an intuitive tour.
  • It's another "me too" app: It's difficult to stand out in a crowd of a million. (Literally a million apps just in the iOS app store). Search "to-do list" on any app store and you'll find pages of apps that offer "the best" solution to managing your to-do list. Finding the best combination of channel, creative, and timing for your marketing campaigns to reach your target audience – and stand out from competitors – is critical to make sure your app gets repeat use.
  • Lots of bugs and errors: Mobile users have a very low tolerance for unstable apps and nothing can turn them away faster than crashes and buggy interfaces. That's why 71 percent of users will delete an app after it crashes. If your app happens to have bugs when it's released, be sure you have the resources to handle support questions or you may receive a lot of poor reviews in app stores.
It's important to remember that the factors above aren't the only reasons customers will delete an app after one use. Every app is different. The problems in your app's first time user experience might not always be apparent from just looking at these three issues.

Don't Leave Money on the Table

To figure out how to prevent users from deleting your app, you need to understand why they're leaving in the first place. If your users are abandoning your app after only one use, whatever is turning them away is probably not very deep into your app experience.
The launch and registration of your mobile app is the first opportunity to impress new users of your app. By seeing how your first time users navigate through your app and where and when they leave, you'll be able to identify the exact feature and/or screen that caused them to drop off – and fix it.
Monitor valuable metrics, such as tutorial completion, time spent on each screen, quitting the app, back tracking between screens and more. For example, if users are closing out of your app after connecting their Facebook account to create a new account, it may mean there's a bug that's causing your app to crash or it could be that users are reluctant to share their social network information on their first visit. You may need to offer email registration as an additional option.
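As a toy illustration of that kind of funnel analysis, here is a Python sketch; the event-log format and the screen names are hypothetical:

```
# Minimal sketch: find where first-time users drop off in an onboarding funnel.
# The event log format is hypothetical: (user_id, screen) pairs in order seen.
from collections import Counter

FUNNEL = ["launch", "signup", "connect_facebook", "tutorial", "home"]

events = [
    ("u1", "launch"), ("u1", "signup"), ("u1", "connect_facebook"),
    ("u2", "launch"), ("u2", "signup"),
    ("u3", "launch"), ("u3", "signup"), ("u3", "connect_facebook"),
    ("u3", "tutorial"), ("u3", "home"),
]

# Count how many unique users reached each funnel step.
reached = Counter()
for step in FUNNEL:
    reached[step] = len({uid for uid, screen in events if screen == step})

for step in FUNNEL:
    print(f"{step}: {reached[step]} users")
# A sharp drop between two steps (here, connect_facebook -> tutorial)
# points at the screen that is losing users.
```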
The mobile app landscape is only going to get bigger and more competitive. As more apps enter the market, you're going to need every advantage you can get to stay ahead of your competitors.

BrightEdge Shifts Marketers from Keyword to Page Performance

While Google was quietly planning its release of fully secure search, BrightEdge had already launched new optimization and tracking features that analyze websites at the page level, filling the soon-to-be gap and preparing for a shift in marketing mindset that BrightEdge says it saw on the horizon.
Page reporting is a tracking feature in BrightEdge that shows the value of pages from organic search traffic. This report complements the already existing "page manager" feature, which helps BrightEdge customers optimize at the page level, not just the keyword level.
"The way page reporting works is we integrate with site analytics, pull in page data and overlay that with keyword and rank data," said Brad Mattick, VP of marketing and products at BrightEdge. "And then we let businesses define page groups to model their business structure."
Mattick said this offers "powerful and flexible" ways of understanding the performance of pages in groups for both B2Bs and B2Cs based on a site's defined conversions.
Mattick said Google's secure search move helps marketers to step away from "intermediate metrics" such as keywords, and start looking at the performance of content as a whole.
"For any search marketer who has been paying attention, secure search should not have been a surprise," Mattick said. "This is a very positive change for the industry."
So what can marketers who want to move from the keyword-focused model of success to a new paradigm actually do?
Mattick pointed to the "triangulation" method in the interim, whereby a person can take into account the keyword's search volume, its rank in the search results and the click rate to estimate how much traffic a keyword is driving (which Mattick said is built into BrightEdge's tools already).
Mattick said you can also factor in revenue if you have a typical conversion rate on your site or on pages of your site, or a typical transaction value that you can apply to the equation.
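As a sketch of that triangulation arithmetic in Python (every number below is hypothetical, including the click-through rates assumed for each position):

```
# Hypothetical click-through rates by organic ranking position.
CTR_BY_RANK = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def estimate_keyword_value(monthly_searches, rank, conversion_rate, order_value):
    """Triangulate estimated visits and revenue for one keyword."""
    ctr = CTR_BY_RANK.get(rank, 0.02)  # assume ~2% CTR below position 5
    visits = monthly_searches * ctr
    revenue = visits * conversion_rate * order_value
    return visits, revenue

visits, revenue = estimate_keyword_value(10_000, rank=2,
                                         conversion_rate=0.02, order_value=80)
print(f"~{visits:.0f} visits/month, ~${revenue:,.0f}")  # ~1500 visits, ~$2,400
```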
"But the really important point in all this is that it's just estimation," Mattick said. And because estimation can only take you so far, Mattick said one thing marketers can know for sure when tracking progress is visits, conversions, and rank by page.
"People visit pages and they convert on pages not keywords," he said. "Start with the value of a page."

Monday, October 7, 2013

Link Building 101: Competitor Analysis

Link building is something anyone can accomplish. There's no great secret, just hard work, creativity, and determination to get links that matter.
When you're looking for some practical link building opportunities that will help you find and acquire quick, yet quality, links, there are five "quick wins" you should explore at the beginning of a link building campaign:
  1. 404 Pages and Link Reclamation
  2. Competitor Analysis
  3. Fresh Web Explorer/Google Alerts
  4. Local Link Building
  5. Past/Current Relationships

Competitor Analysis/Backlink Profile

Competitor analysis is an integral step in any link building campaign. Why? Because running a backlink analysis on a competitor:
  • Teaches you about the industry:
    • Gives you a sense of which sites within the vertical are providing links
  • Helps you understand your competitors, including:
    • Their link profile, and why they're ranking
    • The strategies they used to acquire links
    • The resources that didn't acquire many links
  • Gives you a list of obtainable links (if they can get them, why not you?)
Competitor backlink analysis is great – you get the initial research into the industry done, it helps you understand the competition, and it gives you a tidy list of high opportunity links.
So, let's dive into the how of competitor backlink analysis:
  1. Make a list of competitors
    • Direct
    • Indirect
    • Industry influencers
    • Those ranking for industry money keywords
    • Watch fluctuations – who's winning and who's losing
  2. Take those competitors and run their sites through one of the backlink tools previously mentioned (OSE, Majestic, Ahrefs, CognitiveSEO, etc.)
  3. Backlink Analysis
  4. Download the top 3-4 competitors' backlinks into CSVs. Combine into a single Excel sheet, removing duplicates, and find obtainable quality links already secured by competitors.
Step 2 and 3 were previously covered in "Link Building 101: How to Conduct a Backlink Analysis", and step 1 is pretty self-explanatory.
To recap the advice for these steps:
  • Don't phone in the list of competitors. Spend time doing research and investigation, giving yourself a well-thought-out and well-understood list of potential competitors.
  • Information you should be examining in a backlink analysis:
    • Total number of links
    • Number of unique linking domains
    • Anchor Text usage and variance
    • Fresh/incoming links
    • Recently lost links
    • Page Performance (via top pages)
    • Link quality (via manual examination)
  • Additionally, think creatively while looking through competitors' backlinks. Think about:
    • Which resources/pages performed well
    • Which resources/pages performed poorly
    • Commonalities in competitors' link profiles
    • Differences in competitors' link profiles
    • Strategies likely used to acquire links

How to Find Obtainable Quality Links

So, that takes us to Step 4: downloading competitors' links into CSVs, combining them in Excel, and drilling down into the data to find worthwhile links and insights.
Honestly, SEER has done an amazing job of writing a very easy to follow guide for Competitor Backlink Analysis in Excel.
To summarize their steps, you:
  • Download CSVs of competitors' backlink portfolios ('Inbound Links' will give you a list of all the linking pages; 'Linking Domains' will give you only the domains).
    • Note: if you're unfamiliar with your own (or client's) backlink portfolio, you may wish to include their backlink portfolio in this process for reference.
    • Using OSE, don't forget to set the filter to 'pages on this root domain' before exporting to CSV.
  • Open the CSVs and combine (copy and paste) all the data into a single Excel sheet.
  • Filter down to clean URLs, keeping the originals intact.
    • Move Column J (target URL) to Column P (to be the last column)
    • Delete Column J (the now empty column)
    • Duplicate the URL and Target URL columns on either side
    • Remove http:// and www. from both column A and column P: select the column, press Ctrl+H (the find-and-replace shortcut), type in what you want to find (http:// and www.) and replace them with nothing (by leaving the second line blank).
    • You might want to rename columns A and P at this point - call them bare URL and bare target URL, or whatever you desire (in the SEER article they were called 'clean').
  • Remove duplicates
    • Make sure it's only for column A (bare URL) and P (bare target URL)
Notice the check mark on "My data has headers". This is important to keep your data from being jumbled up. Anytime you're removing duplicates make sure this box is checked.
This will give you a complete list of stripped URLs next to the full URL linking (along with the rest of the important information provided by OSE) and a list of full target URLs next to a complete list of stripped target URLs.
Note: you'll still likely have a lot of duplicate URLs in column A (the linking URLs) at this point. This is because there are multiple links on the same page going to different landing pages – which is potentially important information (it shows a competitor acquired multiple links per page).
If you'd like to delete these multiple-link pages/URLs to reduce data noise, highlight column A and run 'Remove Duplicates' again, making sure the 'My data has headers' box is checked.
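If you would rather script this than click through Excel, here is a minimal pandas sketch of the same combine/strip/dedupe/sort workflow. The file names and column headers are hypothetical; match them to your actual CSV exports:

```
import pandas as pd

# Combine the top competitors' backlink exports into one table.
frames = [pd.read_csv(f) for f in
          ["competitor1.csv", "competitor2.csv", "competitor3.csv"]]
links = pd.concat(frames, ignore_index=True)

# Add "bare" columns with the protocol and www. prefix stripped.
def bare(col):
    return col.str.replace(r"^https?://(www\.)?", "", regex=True)

links["bare_url"] = bare(links["URL"])
links["bare_target_url"] = bare(links["Target URL"])

# Drop duplicates keyed on the bare URL pair, then sort by authority.
links = links.drop_duplicates(subset=["bare_url", "bare_target_url"])
links = links.sort_values("Domain Authority", ascending=False)
print(links.head(20))
```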
Now, you'll be down to unique URLs (pages, not domains if you've used Inbound Links) linking to competitors. If you're looking for only referring domains, you should start back at step 1 and download a CSV of referring domains, as opposed to all links.
At this point, you're still dealing with a lot of data, so you'll want to filter it further. I recommend filtering by domain authority to see the most authoritative links first.
This will order your list from highest domain authority to lowest – pretty useful information. Keep in mind, however, that domain authority is thrown off by any subdomains hosted on a popular site – example.wordpress.com, example.blogspot.com, etc.
So, don't take the domain authority as absolute – you'll need to verify.
There are also a few other filters you can use to find interesting data:
  • Page Authority (PA)
  • Anchor Text
  • Number of domains linking (shows best ranking pages - don't get stuck on home pages)
Take time and play around with the data. Look through the top DAs (manually excluding anything artificially inflated), then PAs, check out top-performing pages via number of domains linking, and even play around with filtering the anchor text.
This should be the fun part: the analysis. You've filtered the data down to a semi-digestible level, and should start digging in to find insights and understand your competitors' links.
Remember, any link your competitor has should be considered fair game for yourself. Once you've identified quality links from domains you haven't secured, look into each link and pursue it appropriately.

More Insights

If you're looking for even deeper (and more advanced) data insights, you can move all this information into pivot tables. Simply select all rows, click over to the Insert tab, and select 'PivotTable'.
Once there, you have the option to choose which fields you'd like to examine further.
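In code, a hypothetical pandas equivalent (continuing from the `links` DataFrame built in the earlier sketch) might count unique linking pages per competitor landing page:

```
# Continuing from the `links` DataFrame built in the earlier sketch.
pivot = links.pivot_table(
    index="bare_target_url",  # one row per competitor landing page
    values="bare_url",        # the pages linking to it
    aggfunc="nunique",        # count unique linking URLs
).sort_values("bare_url", ascending=False)

print(pivot.head(10))  # the competitor pages that attracted the most links
```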
Playing with this data should reveal potential insights, although we're getting a bit beyond Link Building 101.
Furthermore, if you want to really dive into pivot tables (or Excel in general), I can't recommend Annie Cushing enough. Check out her Moz article "How to Carve Out Marketing Strategies by Mining Your Competitors' Backlinks".

Penguin 2.1 Update - 5 Things You Can Do Today About Penguin 2.1

Penguin 2.1:

In case you hadn’t heard, Google rolled out a significant Penguin update on Friday October 4, 2013. To recap, Panda updates mainly concern your on-page SEO (on your website/s) whereas Penguin updates are about off-page issues (your backlinks, basically).
If you missed it, here was the industry chatter about Penguin 2.1:
http://searchengineland.com/penguin-2-1-and-5-live-173632
http://www.seroundtable.com/google-penguin-2-1-17474.html
BTW, one of my fave tactics for getting additional insight on a topic like this is to read through ALL the comments. In fact, I often pick up more gems in the comments area than in the articles themselves.
And you can read some early analyses of Penguin 2.1 (e.g. what it’s targeting) on Traffic Planet here:
http://trafficplanet.com/topic/7282-penguin-21-launched-today/
So if you have been slapped by Penguin 2.1, here are some response options for you:

1. Don’t give up on SEO.

Usually when there’s a high-impact algo update and stories emerge of sites losing their traffic overnight,
the temptation is to totally abandon SEO and shift over to a paid traffic strategy. However, before doing that, keep a few factors in mind (these are from my anecdotal observations over the years and/or research I’ve come across):
[a] AdWords ads get (probably) less than 5% of clickthroughs on Google’s SERPs
[b] Most ad campaigns lose money – banner clickthroughs are shockingly low
[c] Paid traffic is a specialist area that requires proper study, analysis and fine tuning i.e. time commitment
[d] Visitors clicking on ads probably have higher levels of resistance to salesy messages than those coming from a referral or Organic listing
Now, a no-cost but time-intensive alternative is Forum Marketing, but that doesn’t suit every niche (try Forum Marketing for clients like funeral companies, local dentists, locksmiths or real estate agents!) – it’s best for expert/information/software-type operations.
SEO has always been in a war with Google, and constant adaptation has therefore been essential. It’s simply a fact of life with SEO. Paid traffic also requires constant trial and error, plus adaptation to changes in platform rules (plenty of mega-spending Internet Marketers have been booted out of AdWords over the years too).
Before long – it’s too early yet – we’ll have a better understanding of trigger factors on Penguin 2.1. Plus, don’t forget that there is ALWAYS variation in Google results (deliberate on their part I believe) to avoid 100% accurate reverse engineering by SEOs. In SEO, we’re always working with generally consistent trends that MOSTLY work, not 100% foolproof blueprints (there are always exceptions).

2. Get your traffic back in 48-72 hours.

What you really want is the targeted traffic more than specific rankings. To that end, depending on your niche, an Authority Parasite Strategy could work.
What’s that exactly?
It’s where you bring traffic through a page on a site that Google ALREADY regards as high authority. Using my normal SEO methods, I’ve gotten inner pages on Authority Sites to Page 1 in about 48 hours, and often they outrank the relevant category page on that same Authority Site (because the category page wasn’t SEO’d).
Examples of sites suitable for this approach include:
  • YouTube videos (obviously)
  • PRWeb press releases
  • SB Wire press releases
  • LinkedIn profile pages
  • Pinterest pages
  • Amazon product pages (e.g. if you have a Kindle book)
  • Udemy product pages (e.g. free/paid mini-course on Udemy.com, Udemy pages rank fast and high with good SEO)
  • Facebook pages
  • Forum thread pages (e.g. Warrior Forum, Traffic Planet or major forums in your niche – don’t forget to keyword optimize your title when starting a new thread)
  • Twitter profile pages
  • Storify inner pages
  • Yahoo Answers
  • Lynda.com
  • Appsumo.com
  • Groupon deal inner page
  • Tumblr (riskier option)
  • Squidoo (riskier and has been slapped by Google in the past)
  • Rebelmouse.com pages
Basically, just keep an eye on the SERPs and see what the Google darlings are.
If you want to pursue this option, email me at support(at)terrykyle.net and I’ll go over what you need to rank your Authority Parasite page/s.
Plus, take a look at what this past (accidental) traffic win did for one of Shoemoney’s sites.
From that post:
It reminds me of when I started my mobile ringtone website. I did it because it was fun. Then one day it appeared on digg.com and went from 1,000 visitors a day to 150,000 and remained a consistent 75-100k unique visitors a day for years. This is where you see my famous 134,000.00 adsense check from. That site made next to nothing for years.
http://www.shoemoney.com/2013/02/07/how-ray-lewis-made-deer-antler-spray-lucky

3. Create new sites ASAP.

Instead of being too emotionally attached to the site that was just slapped by Penguin 2.1 (it may or may not be coming back), set up several new, smaller websites and get those to Page 1 in 3-4 weeks (email me about your SEO strategy).
That’s my current average for most new sites, and 5-10 pages of content is plenty (my British writer charges $22 per 1000-word article, so the re-entry cost isn’t high – the cost of waiting and hoping for your slapped site is WAY higher). Yes, try to bring your site back (see below on the Link Detox tool) but ALSO get some new sites moving at the same time.
Spread the risk around across several traffic-bringers to minimize (but not eliminate) the risk of future slaps (that’s also where the Authority Parasite Strategy above is very useful).

4. Check out the Link Detox tool from Chris Cemper.

We’ll be testing this software tool out a lot more in the next couple of months, but Chris and his software users have a bunch of interesting case studies up here (nothing on Penguin 2.1 yet, of course, but several on Penguin 2.0):
http://www.linkresearchtools.com/case-studies/

5. Stay up to date on analysis of Penguin 2.1

Basically, by monitoring threads like these, you can track where the SEO community is up to on this latest Penguin update.
http://trafficplanet.com/topic/7282-penguin-21-launched-today/
http://www.webmasterworld.com/google/4614730.htm
Stay tuned on Penguin 2.1 because we’re going to be hearing a lot more about it…
Have a great Sunday.
Terry K.
My SEO Agency | Blogsurge.net (password required) | Traffic Planet | High PR Kingdom | Rankmaxx

Read more: http://sundayseo.net/5-things-you-can-do-today-about-penguin-2-1/#ixzz2h11Vhx00

Thursday, October 3, 2013

Google Penguin 2.0 Algorithm Update May 2013 - Infographic, SEO Update May 2013

New Face of SEO in 2013 & SEO Trend in 2013

How SEO Has Changed in 2013 Infographic

Google Hummingbird Update, Hummingbird Algorithm, Seo Hummingbird Update


Google's Hummingbird Update and the Implications for Video SEO
There were two cataclysmic events in the SEO world last week; one we all saw coming (eventually) but the other arrived unannounced and set the search world alight with speculation and discussion. The first was the total encryption of keyword referral data from organic traffic to Google Analytics, something that many marketers and brands relied on for feedback regarding rankings and visitor traffic for certain key phrases. The second was the introduction of a brand new search algorithm from Google, Hummingbird, which makes the search engine more capable than ever of handling complex queries. Google wants to understand the user's intent behind a particular topic search rather than just literally interpreting the keywords used in that search. It's huge and it's going to change almost everything about Google search as we know it. Let's take a look at the new features and what they mean for video marketing.


Hummingbird: User Intent vs. Actual Keyword Phrase Used

On September 26th, Google announced the release of their new algorithm, Hummingbird. The fact that they chose to do so from the garage where the company started 15 years ago should have been a clue to how important this update is to them. Hummingbird isn't just an algorithm update, it's a completely new algo, built on the feedback gathered from previous updates, and Google estimates that it will affect about 90% of search queries worldwide. So what's changed? Essentially, the emphasis is now on ‘answers’ rather than ‘results’, with Hummingbird paying particular attention to conversational search. Conversational search, or semantic search, takes into account the way users phrase their queries while actually talking to each other (or to Google) as opposed to the way we phrase a question while typing.
Earlier this year Google introduced the 'conversational search' feature to Chrome, allowing users to search Google by asking it a question rather than typing it. It's a safe bet to assume that the feedback gathered from this new feature all went towards shaping the new algorithm.
Google wants to change the way that you interact with it, and it wants to change the way it interacts with you. More and more of us are turning to software like Siri to do our dirty search work for us, and our reliance on mobile devices means that Google as a search engine needs to adapt to us, the end user, if it wants us to continue to use its services. Users on mobiles will often ask a question in a different way from the one they might ask if they were sitting in front of their PC.
Also, Google themselves confirm that 15% of the billion searches a day have never been seen before - I would suggest that this figure is made up entirely of long tail key phrases, something that the search engine has never been that great at returning results for. With a change in user behaviour regarding search queries plus the need to understand the long tail better, the goal of Hummingbird is to not only supply you with the best information you requested, but also the best information you didn't request but which enhances your experience.
There are some key elements to the new algorithm which need to be understood if you want your content to be served to the people who are searching for your products, your services or any other information you provide.

Content Needs to Completely Satisfy The User's Intent

In the past few years, Google's algorithms have adapted to serve the user with the best content they think you are searching for. If they consistently served up bad or irrelevant results, then you would go elsewhere, somewhere like Bing for instance, and Google would suffer (as would their AdWords revenue). The Caffeine, Panda and Penguin updates all had this goal in mind, but Hummingbird drills down even further at the first query by effectively personalising the results based on location, context, Knowledge Graph data, device, local factors and platform. Suddenly, user intent has become far more important than the actual keyword phrase used.
For example, if I search "how do I clean a silver necklace?", Google should return a set of results that understands that I'm looking for information or a way of doing something NOT a sales pitch. Whereas in the past the words "silver necklace" would have been the triggers, Hummingbird is now taking into account "How do I" and "clean". We know that video results have always tended to do well on informational searches but now, more than ever, creators need to think about just how they can answer a user question with video content so that they have a chance of ranking well in the blended results. Hummingbird will anticipate the user's intent and so must you.
Video SEO Takeaway: Think about what question you are answering with your video. If you're not answering a question, then what exactly are you offering the user?

Content Needs to Be Rich and Multi-Faceted

In order to survive the night of the long knives and make it through to the first couple of pages of Google, content (of all types) has to be the most relevant for that user at that given time. Websites that offer true value - as opposed to pages and pages of thin content - will be at a distinct advantage. An easily navigable site that is rich in all types of optimised, fresh, useful content should do well from the new algo (as long as it hasn't been caught up in the Panda/Penguin horror show that has affected so many).
Video SEO Takeaway: Understand how your on-site video content works for the user. Is it easily found? Do you have unique video landing pages? Are your videos in context? Are they surrounded by relevant, informative text? Are they linked to from other pages? Do they link out to other similar pages? No? Then you'll need our handy guide to creating the best video landing page more than ever.

Schema Markup Is Going To Become Increasingly Important

Schema, rich snippets and structured data are all terms for information about your information - metadata that sends a very clear message to the search engines about the focus of your content. The more info the search engines have, the more inclined they will be to return your content to the user.
Video SEO Takeaway: Users are searching for video content, so mark up your individual video landing pages and submit video sitemaps via Webmaster Tools.
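As an illustration, here is a sketch of what schema.org VideoObject markup might look like, generated as JSON-LD from Python. Every field value below is hypothetical; see schema.org/VideoObject for the full vocabulary:

```
import json

# Hypothetical VideoObject properties for one video landing page.
video_markup = {
    "@context": "http://schema.org",
    "@type": "VideoObject",
    "name": "How to Clean a Silver Necklace",
    "description": "A two-minute walkthrough of cleaning silver at home.",
    "thumbnailUrl": "http://example.com/thumbs/silver-necklace.jpg",
    "uploadDate": "2013-10-01",
    "duration": "PT2M10S",  # ISO 8601: 2 minutes 10 seconds
}

# Embed the result in the page inside a <script type="application/ld+json"> tag.
print(json.dumps(video_markup, indent=2))
```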

Keywords (not provided): The End Of SEO?

In the good old days (pre-September 2013), you were able to gather data on the organic visitors to your site and confirm the keywords they used to find you. This was extraordinarily useful for two reasons: #1) it was an easy metric for measuring whether your rankings in Google were driving traffic on certain key terms, and #2) it gave you clues as to the other search terms people were finding you on, which you could choose to optimise for further if it suited your marketing goals.
In October 2011, Google announced that they were "encrypting searches" made by users logged in to the Google network (inc. YouTube, Docs and Gmail) to make those searches more "secure". This meant that this information would never reach Google Analytics and the site owner would be served the (not provided) phrase instead. At first, this averaged anywhere between 5-15% of non-paid searches, depending on the site, but slowly the percentage increased until Google announced last week that it was making 100% of these referral terms private. Yippee!
There's a way around it, of course: pay Google to push you onto the front page via an AdWords campaign and receive as much keyword referral data as you like. This feature is still available in AdWords and looks like it will be for the foreseeable future, but you are going to have to spend some of those hard-earned advertising dollars to access it. Arguably, PPC has always had the edge when it comes to hard ROI, and now it holds absolute power when it comes to giving you the facts and figures about your own keyword data. However, Google is a business, and for all its altruistic mutterings it exists to be financially viable. Pushing budgets towards paid advertising to get the data that was once free makes total business sense, and I'm surprised they gave it away for free for so long.
Video SEO Takeaway: Don't optimise solely for key phrases. Optimise for the whole user experience. Think about your target market and in what context they may search for you, and create content around that.

Conclusion

Is SEO dead? Nope, I'd argue that it's far from it, but it did just get a whole lot more complicated. Keyword rankings as a metric may be devalued, but search engine optimisation, as a discipline, is needed now more than ever. As users and search engines get more sophisticated, marketers need to work harder to understand what it is going to take to get their content visible. Optimising for video means the creator and the publisher need to understand what the user is looking for. It's no longer the case that you can produce a video and people will come and take a look. Post-Hummingbird, creators and brands need to seek out the questions that are being asked and answer them with relevant, informative, quality videos. The landscape has just changed, and we need to change with it.

FAQ about the Google Hummingbird Algorithm

Google has a new search algorithm, the system it uses to sort through all the information it has when you search and come back with answers. It’s called “Hummingbird” and below is what we know about it so far.

What’s a “search algorithm?”
That’s a technical term for what you can think of as a recipe that Google uses to sort through the billions of web pages and other information it has, in order to return what it believes are the best answers.

What’s “Hummingbird?”
It’s the name of the new search algorithm that Google is using, one that Google says should return better results.

So that “PageRank” algorithm is dead?
No. PageRank is one of over 200 major “ingredients” that go into the Hummingbird recipe. Hummingbird looks at PageRank — how important links to a page are deemed to be — along with other factors like whether Google believes a page is of good quality, the words used on it and many other things (see our Periodic Table Of SEO Success Factors for a better sense of some of these).

Why is it called Hummingbird?
Google told us the name comes from being “precise and fast.”

When did Hummingbird start? Today?
Google started using Hummingbird about a month ago, it said. Google only announced the change today.
What does it mean that Hummingbird is now being used?
Think of a car built in the 1950s. It might have a great engine, but it might also be an engine that lacks things like fuel injection or be unable to use unleaded fuel. When Google switched to Hummingbird, it’s as if it dropped the old engine out of a car and put in a new one. It also did this so quickly that no one really noticed the switch.
When’s the last time Google replaced its algorithm this way?
Google struggled to recall when any type of major change like this last happened. In 2010, the “Caffeine Update” was a huge change. But that was also a change mostly meant to help Google better gather information (indexing) rather than sorting through the information. Google search chief Amit Singhal told me that perhaps 2001, when he first joined the company, was the last time the algorithm was so dramatically rewritten.
What about all these Penguin, Panda and other “updates” — haven’t those been changes to the algorithm?
Panda, Penguin and other updates were changes to parts of the old algorithm, but not an entire replacement of the whole. Think of it again like an engine. Those things were as if the engine received a new oil filter or had an improved pump put in. Hummingbird is a brand new engine, though it continues to use some of the same parts as the old one, like Penguin and Panda.
The new engine is using old parts?
Yes. And no. Some of the parts are perfectly good, so there was no reason to toss them out. Other parts are constantly being replaced. In general, Hummingbird — Google says — is a new engine built on both existing and new parts, organized in a way to especially serve the search demands of today, rather than one created for the needs of ten years ago, with the technologies back then.
What type of “new” search activity does Hummingbird help?
“Conversational search” is one of the biggest examples Google gave. People, when speaking searches, may find it more useful to have a conversation.
“What’s the closest place to buy the iPhone 5s to my home?” A traditional search engine might focus on finding matches for words — finding a page that says “buy” and “iPhone 5s,” for example.
Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.
In particular, Google said that Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words. The goal is that pages matching the meaning do better, rather than pages matching just a few words.
I thought Google did this conversational search stuff already!
It does (see Google’s Impressive “Conversational Search” Goes Live On Chrome), but it had only been doing it really within its Knowledge Graph answers. Hummingbird is designed to apply the meaning technology to billions of pages from across the web, in addition to Knowledge Graph facts, which may bring back better results.
Does it really work? Any before-and-afters?
We don’t know. There’s no way to do a “before-and-after” ourselves, now. Pretty much, we only have Google’s word that Hummingbird is improving things. However, Google did offer some before-and-after examples of its own, that it says shows Hummingbird improvements.
A search for “acid reflux prescription” used to list a lot of drugs (such as this, Google said), which might not necessarily be the best way to treat the disease. Now, Google says, results have information about treatment in general, including whether you even need drugs, such as this as one of the listings.
A search for “pay your bills through citizens bank and trust bank” used to bring up the home page for Citizens Bank but now should return the specific page about paying bills.
A search for “pizza hut calories per slice” used to list an answer like this, Google said, but not one from Pizza Hut. Now, it lists this answer directly from Pizza Hut itself, Google says.
Could it be making Google worse?
Almost certainly not. While we can’t say that Google’s gotten better, we do know that Hummingbird — if it has indeed been used for the past month — hasn’t sparked any wave of consumers complaining that Google’s results suddenly got bad. People complain when things get worse; they generally don’t notice when things improve.
Does this mean SEO is dead?
No, SEO is not yet again dead. In fact, Google’s saying there’s nothing new or different SEOs or publishers need to worry about. Guidance remains the same, it says: have original, high-quality content. Signals that have been important in the past remain important; Hummingbird just allows Google to process them in new and hopefully better ways.
Does this mean I’m going to lose traffic from Google?
If you haven’t in the past month, well, you came through Hummingbird unscathed. After all, it went live about a month ago. If you were going to have problems with it, you would have known by now.
By and large, there’s been no major outcry among publishers that they’ve lost rankings. This seems to support Google saying this is very much a query-by-query effect, one that may improve particular searches — particularly complex ones — rather than something that hits “head” terms that can, in turn, cause major traffic shifts.
But I did lose traffic!
Perhaps it was due to Hummingbird, but Google stressed that it could also be due to some of the other parts of its algorithm, which are always being changed, tweaked or improved. There’s no way to know.
How do you know all this stuff?
Google shared some of it at its press event today, and then I talked with two of Google’s top search execs, Amit Singhal and Ben Gomes, after the event for more details. I also hope to do a more formal look at the changes from those conversations in the near future. But for now, hopefully you’ve found this quick FAQ based on those conversations to be helpful.
By the way, another term for the “meaning” connections that Hummingbird does is “entity search,” and we have an entire panel on that at our SMX East search marketing show in New York City, next week. The Coming “Entity Search” Revolution session is part of an entire “Semantic Search” track that also gets into ways search engines are discovering meanings behind words. Learn more about the track and the entire show on the agenda page.

Monday, August 26, 2013

Google AdWords Adds New Paid & Organic Report

Google AdWords is introducing a new feature to give advertisers more data right within the AdWords interface, even when it isn't specific to paid ads. This is part of Google's campaign to connect data between Google AdWords, Google Analytics, and Webmaster Tools.
The new paid & organic report can help advertisers see their search footprint and determine whether there are keyword areas that can be supplemented with paid advertising. It also lets you view detailed reports showing, for particular keywords, how much organic traffic and how much advertising traffic you are getting or have the potential to get.
Google suggests several ways advertisers can make the inclusion of organic traffic information beneficial to their business. You can:
  • Look for additional keywords where you might have impressions on natural search, but without any related ads.
  • Use it to optimize your presence for your most important high-value keyword phrases, so you can see where you need to improve your presence.
  • Use it to test website improvements and AdWords changes, as you can compare traffic across both AdWords and organic in the same interface, which enables you to adjust accordingly.
In order to take advantage of this new report, you need both a Google AdWords account and a Webmaster Tools account, and you will need to verify your site and link the two accounts.
In unrelated AdWords news:
  • Google has introduced a new option for reporting traffic trends. You can now toggle between daily, weekly, monthly, and quarterly views, so you can quickly and easily spot trends over those time periods.
  • And finally, Google will officially retire the AdWords Keyword Tool on August 26. However, the Keyword Planner has been out for several months, so you can easily get all the same data in that all-in-one tool.

Facebook Advertisers Can Access Professional Photos at No Extra Charge

Facebook puts content front and center again with its latest update: advertisers will now have access to millions of commercially licensed stock photos, available for all Facebook ad formats at no additional cost.
"High-quality, engaging photos often increase the performance of ads, particularly in News Feed. And now, through our collaboration with Shutterstock, it will be easier for businesses to integrate beautiful photography into their Facebook ads," Facebook said in its announcement.
[Image: Facebook ads powered by Shutterstock]
Advertisers can also create multiple ads at once with several images. From Facebook:
When creating a group of Facebook ads, our new image uploader allows people to select a range of Page photos, images from previous ads and Shutterstock images. The ability to simultaneously upload multiple images means advertisers can now create multiple ads at one time (with multiple images) for a single campaign, and test images to increase performance of their campaigns.
[Image: the multiple image uploader]
This new effort appears connected to Facebook's initiative to streamline its advertising, in hopes of making ads look and feel more consistent.
Getting News Feed advertisements to perform can be particularly challenging for advertisers who don't have the right imagery. As users scroll through photos of people and brands they know and like, an ad with imagery that doesn't quite fit can be jarring, or simply get ignored.
So as advertisers scroll through their now nearly endless options at Shutterstock, they should keep in mind not just the "beautiful" aspect of a photo but its "fit" for the type of advertisement, especially in the News Feed. Ask: does it feel authentic?

Content Curation & SEO: A Bad Match?

Many publishers continue to seek cheap and easy ways to publish lots of pages, with the goal of bringing in incremental search traffic. One method many people consider is content curation.
However, content curation can be a risky practice. There may be a role for curated content on your site, but there is a good chance you should be placing noindex tags on those pages.
Two forms of content curation that can potentially be useful to users are:
  • Posting a list of links of the latest news.
  • Building pages listing the best resources you can find on a particular topic area.
But what does Google think about it?

The Content Curation Spectrum

For some background, it is useful to watch the video in which Matt Cutts answers a user-submitted question about whether sites are better off removing sections that create duplicate content, such as press releases or news.
Cutts said the answer is probably "yes." He then goes on to draw a line representing a spectrum beginning with really poor quality sites and ending with the New York Times. I've taken the liberty of greatly enhancing his chart as follows:
[Chart: the Content Curation Spectrum]
The New York Times will be fine with curated content, as will certain other major media properties. Why? Getting into the New York Times is hard. They have a strong editorial process.
Let's break this down a bit and look at how Google may algorithmically determine these things:
  1. Brand: Obviously the New York Times has a highly recognizable brand. One of the reasons they have a strong editorial policy is that publishing poor quality content would damage that brand. A simple algorithmic approach to evaluating brand strength is to count the number of searches on an entity's brand name: lots and lots of searches is a quick indicator of brand strength. (I'm not suggesting Google does this, by the way, but you can use it as a crude measure for yourself.)
  2. Links: The link graph remains alive and well. A rich and robust link graph can be an indicator of an authoritative site. One of the most important patents in search engine history is Jon Kleinberg's patent on hubs and authorities, and regardless of how search engines determine authority today, the link graph is likely still the major way they do it. Link graph shows a lot of authority? Then you move to the right of our Content Curation Spectrum.
  3. Quality of the Linked Resources: Google could use this as well. To take an extreme example, if your curated list includes links to obvious spam sites or really poor quality pages, then the whole list is called into question. Of course, there is a whole spectrum between obvious crap and obviously awesome stuff.
  4. Publisher Authority: Here I'm specifically referring to the potential use of rel=publisher tags on the content you create (a markup sketch of both this tag and rel=author follows this list). While the tag has been known for a while, little was known about how it might be used until Google's announcement of in-depth articles. Consider the possibility that Google will use it as a tool to help track the overall quality of the content published on your site.
  5. Author Authority: Google has for some time been actively using rel=author to show rich snippets, including the author's face, in the search results. I predicted in January that Author Rank would become a ranking signal this year, and wrote about it as a potential signal in March; the in-depth articles announcement may even have been a step in that direction. For our purposes, what matters is that Author Rank is a signal tied not to a given website but to a specific author, and it can travel with you wherever you publish content.
These are all signals that can clue Google into where a given site is on the Content Curation Spectrum.
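As promised above, here is a minimal markup sketch of how a page might declare both tags. The Google+ URLs are hypothetical placeholders, not real profiles:

    <head>
      <!-- rel=publisher: ties the site to a brand's Google+ page
           (hypothetical URL) -->
      <link rel="publisher" href="https://plus.google.com/+ExampleBrand">
      <!-- rel=author: ties this page's content to an individual
           author's Google+ profile (also a hypothetical URL) -->
      <link rel="author" href="https://plus.google.com/112233445566778899000">
    </head>

Google documents both tags as ways to connect content to publishers and authors; how heavily it weights them as ranking signals is exactly the open question discussed here.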

How Will Google Use These Signals?

As explained in "Google Doesn't Care if it Ranks Your Site Properly", Google really isn't targeting your site individually (unless you're penalized). Google's primary focus is on improving the quality of their product.
Perfect ranking of each individual site is fundamentally not an achievable goal for them. They operate at a different level, and they work hard at making their product better; how that impacts organic search traffic to individual sites is, for them, a side effect.
As it relates to content curation, the main point is that Google already has a curated list of content. It's called their search results. Yes, this list is algorithmically curated, but it's a curated list nonetheless.
Google's curated content is backed by an extraordinary amount of scientifically driven testing. If you're using software to curate content for you, the chances that your machine-generated list is better than Google's are basically zero.

Curated Content vs. Search Results

Your hand-curated content isn't that much different from a set of search results. Long ago, Google made it clear that they don't want to show your site's search results within their search results.
Unless there is some clear differentiation, it is my opinion that curated content is in the same boat as a site's search results. Google has little reason to show it, because a second set of search results, hand-picked or not, doesn't really add any value beyond what people already get from Google's own results.
Don't get me wrong, truly expert humans can probably pick a better list than Google, or perhaps even just a comparable list, with a somewhat different focus. But how might Google actually detect that added value?
Google can clearly detect the New York Times. They can clearly detect the people on the far left of the Content Curation Spectrum.
But what if you're in the middle? If I were a Google engineer, I'd place no value on the middle either, and I wouldn't rank it, unless other signals gave a clear indication that something was different about it.
The curated content list may be great stuff, but there's no way to know really, and it isn't worth the risk. Besides, there may be one or two highly authoritative lists of curated content in their search results (more power to you if you're one of them!), so showing another one from someone of unknown authority doesn't make sense for their product.

What Should Publishers Who Want to Curate Content Do?

Now, the curated content you create doesn't necessarily result in a poor quality page for your site, but for the purposes of this discussion that's beside the point. From an SEO perspective, what matters is whether the page has a unique, differentiated quality that is detectable by both users and Google.
In the video, Cutts gives some indications of what this might take. Here are three key phrases he uses:
  • "a lot of effort into curation"
  • "editorial voice"
  • "distinctive point of view"
He also talks about including access to information that isn't otherwise generally available. Here are some ideas on how you can make it quite different:
  1. Recognized Expert Analysis: In addition to producing the list of resources, add commentary and analysis from recognized experts, and cement that with rel=author tagging (see the byline sketch after this list). This is where the "editorial voice" and "distinctive point of view" come in.
  2. Unique Expert Reviews: Include expert reviews and commentary on the curated content. The key here: these reviews aren't published elsewhere. Rel=author tagging is a good idea here, too.
  3. Data Sources: Accessing data sources not available to Google is also useful. Bear in mind, though: it's critical that these data sources be very distinct and differentiated in a way that will be immediately obvious to both users and Google.
  4. Freshness: If you have a method for updating this in real time, and significantly faster than Google does, this may work as well. Note: a regurgitated news feed fails this test.
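For ideas 1 and 2, the byline form of rel=author is an alternative to the head-level link shown earlier: a visible link from the review or commentary to the expert's profile. A minimal sketch, again with a hypothetical profile URL and author name:

    <!-- Byline inside a curated entry; the profile URL and name
         are hypothetical examples -->
    <p>Review by
      <a rel="author" href="https://plus.google.com/109876543210987654321">Jane Expert</a>
    </p>

Either form gives Google a machine-readable way to associate the expert commentary with a known author.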

Conclusion

If you can meet one or more of these tests, great! You may also want to publish the page anyway because you think your users will value it. That's fine, but if the Google-perceivable value isn't there, I would noindex those pages.
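If you do decide to noindex, it takes one line in the page's head. A minimal sketch: the "noindex, follow" directive keeps the page out of the index while still letting search engines follow the curated links it contains:

    <head>
      <!-- Keep this curated page out of the search index, but
           still allow crawlers to follow the links on it -->
      <meta name="robots" content="noindex, follow">
    </head>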