Google+ Hangouts on Air, Videos & Photos Enhancements Revealed

This morning in a San Francisco warehouse Google had rented out for the occasion, the search and social giant unveiled new and upgraded features for Google+, its social networking platform. A blog post released to coincide with the event details the upgrade, which adds 18 new Google+ features.

The announcement brings a slew of photo and video upgrades, most of which will be available this week.

Google+ appears to be taking a run at iMovie with improvements to Auto Awesome, which Google promises “can help bring your story to life in many ways.” Auto Awesome Movies automatically creates movies, complete with effects, transitions and a soundtrack, from the photos in an album.

Other Auto Awesome Movies features include different styles and filters users can apply to a video with a swipe, plus finer controls for video length, soundtrack and more.

Google+ Photo Enhancements


Some of the changes apply to Google+ photo editing and storage on the network, while others are within the Auto Awesome tool. On photos, Google+ upgrades include:

  • Full size backups and background sync for Google+ for iOS, to automatically upload photos as they’re taken.
  • Over 1000 new search terms to improve Google+ photo search, expanding upon the deep learning photo surfacing search unveiled at Google I/O.
  • The ability to dial auto enhancements up or down, or to exempt an album entirely.
  • HDR Scape, a new filter in Snapseed, with settings that let users control filter strength.
  • Analog Efex Pro, a new filter for the Nik Collection ($149 premium suite) of pro photo editing tools.

Will Disavowing a Competitor's Links Hurt Their Search Rankings?

The Missing Link" is a Search Engine Watch exclusive reader-driven Q&A column with veteran content publicist Eric Ward. You can ask questions about all aspects of links and link building and Eric will provide his expert answers. Submit your questions here, and you may be featured in a future installment!

I run a small SEO agency. We have a client for whom we are about to upload a disavow file to Google. The file contains only the URLs we are unhappy with. The client asked me what would happen if we also included URLs containing backlinks to one of his competitors' sites in that disavow file. Would this cause the competitor's site to drop in the rankings, because the URLs pointing to it had been disavowed? And if so, what's to keep any site from doing this as part of a negative SEO campaign against a competitor?

– Nervous About Negative SEO

You bring up a fair point, and there have been cases where negative SEO has been shown to work.

However, there is an assumption in your question we should clarify. The question could be rephrased like this: "Do disavowed URLs lose all negative (or positive) impact for all sites they might be linking to?"

I received the following feedback from Google:

"When a URL is disavowed, it is done so only when a site owner is logged into Google Webmaster Tools, and on behalf of a site that the logged-in user has verified they own/control. So any disavow file that is uploaded is associated only with the site that user controls, and no other sites will be affected."
The Missing Link" is a Search Engine Watch exclusive reader-driven Q&A column with veteran content publicist Eric Ward. You can ask questions about all aspects of links and link building and Eric will provide his expert answers. Submit your questions here, and you may be featured in a future installment!

I run a small SEO agency. We have a client for whom we are about to upload a disavow file to Google. The file contains only the URLs we are unhappy with. The client asked me what would happen if we also included URLs containing backlinks to one of his competitor's sites in that disavow file. Would this have the effect of causing his competitor's site to drop in the rankings, because the URLs pointing to his site had been disavowed? And if this is true, what's to keep any site from doing this as part of a negative SEO campaign against their competitors?

– Nervous About Negative SEO

You bring up a fair point, and there have been cases where negative SEO has been shown to work.

However, there is an assumption in your question we should clarify. The question could be rephrased like this: "Do disavowed URLs lose all negative (or positive) impact for all sites they might be linking to?"

I received the following feedback from Google:

"When a URL is disavowed, it is done so only when a site owner is logged into Google Webmaster Tools, and on behalf of a site that the logged-in user has verified they own/control. So any disavow file that is uploaded is associated only with the site that user controls, and no other sites will be affected."

Buffer’s Response to Hacking: A Study in Social Media Crisis Management


More often than brands would probably like, we’re given opportunities to learn about social media crisis management through the highly visible fallout from the experiences of others. This weekend, social sharing platform Buffer was hacked, resulting in a Saturday afternoon and evening crisis for the start-up.

I wouldn’t say it was a positive experience for Buffer, but I will say this: it turned out okay. Not awesome, but okay. That’s about the best you can hope for when hackers cause an interruption in service for your customers that lasts several hours.

Buffer Responded to Spam Hack Saturday Afternoon

Over several hours, I watched as Buffer communicated with media, customers and their greater social audience. Few were bashing the brand; in fact, the social buzz was largely positive across their channels. Customers praised the company for their transparency and timely communications. I was amazed to see a Buffer rep, Andy, tweeting in response to each and every mention they received at the peak of their crisis. Staff were communicating across their blog, Twitter, Facebook and through the media, to ensure customers were fully informed.

Buffer co-founder and CMO Leo Widrich took the time to discuss his company’s social media crisis management strategy with us just a day after it happened. “It was really incredible to see how everyone on the team just tried to find a way to help our users, whether in comments, with Tweets, on Facebook and via email,” he said. “I'm incredibly grateful for the people on our team and how they've responded here.”


A Plan For Effective Internet Marketing

Internet marketing isn't always the easiest thing to do. However, there is one key to effective internet marketing that far too many people overlook. So, what is it that they don't do? They don't make a plan. To be fair, a lot of internet marketers think they have a plan, but saying "I'm going to make some money" is not a plan.

Perhaps that's one of the primary reasons why franchises are so popular. They present a very detailed plan for just about every aspect of running a business. But we are talking about internet marketing, not a traditional bricks and mortar business. So, what you really need for effective internet marketing is a business plan.

Let's be blunt: creating a business plan is going to take time and effort. Without one, however, you will wander aimlessly through your business and bring in money in a way that's too hit or miss. You may do okay without one, but you won't see your full profit potential unless you have a business plan in place. Think of it as the difference between having a hobby and having a real business that makes you money. If you want it to be a hobby, then that's up to you; maybe you don't really need a plan.

Your plan is a sort of road map to help you define and reach your goals. A good way to start your business plan is to think of where you want to end up, then start working backwards from there. What steps will you have to take? How much will you make? How will you make it? How much will you invest? What marketing methods will you use to promote your online business? Answering questions like these will help you make a good plan.

The Value of Referrer Data in Link Building

Before we get into this article, let me first state: link building is not dead. There are a lot of opinions floating around the web on both sides; this is just mine. Google has shut down link networks, and Matt Cutts continues to make videos on what types of guest blogging are OK. If links were dead, would Google really put in this effort? Would anyone get an "unnatural links" warning?

The fact is, links matter. What has died is the links that are easy to manipulate. Some may say link building is dead, but what they mean is, "The easy links that I know how to build are dead."

What does this mean for those of us who still want high rankings and know we need links to get them?  Simply, buckle up, because you have to take off your gaming hat and put on your marketing cap.  You have to understand people and you have to know how to work with them, either directly or indirectly.

I could write a book on what this means for link building as a whole, but this isn't a book, so I'll try to keep focused.  In this article, we're going to focus on one kind of link building and one source of high quality link information that typically goes unnoticed: referrer data.

I should make one note before we launch in: I'm going to use the term "referrer data" loosely, to provide additional value. We'll get into that shortly, but first, let's see how referrer data helps and how to use it.

The Value Of Referrer Data

Those of you who have ignored your analytics can stop reading now and start over with “A Guide To Getting Started With Analytics.”  Bookmark this article and maybe come back to it in a few weeks.  Those of you who do use your analytics on at least a semi-regular basis and are interested in links can come along while we dig in.

The question is, why is referrer data useful?  Let's think about what Google's been telling us about valuable links: they are those that you would build if there were no engines.  So where are we going to find the links we'd be happy about if there were no engines?  Why, in our traffic, of course.
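
As a concrete starting point, here is a minimal sketch of mining referrer data with Python. It assumes a CSV export from your analytics package with a "referrer" column; the file name and column name are assumptions, so adjust them to whatever your tool actually produces.

    # Sketch: rank referring domains by the visits they actually send.
    # Assumes an analytics CSV export with a "referrer" column (the
    # file name and column name are hypothetical).
    import csv
    from collections import Counter
    from urllib.parse import urlparse

    counts = Counter()
    with open("referrers.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            host = urlparse(row.get("referrer", "")).netloc
            if host:                      # skip direct/empty referrers
                counts[host] += 1

    # The domains at the top already send you visitors -- exactly the
    # links you would want even if there were no search engines.
    for host, visits in counts.most_common(20):
        print(f"{visits:6d}  {host}")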

10 Costly Search Engine Mistakes to Avoid

If you have a website then you already know the importance of traffic. Traffic is to Internet marketing as location is to real estate. It's the only thing that really matters. If you cannot generate targeted visitors to your site, you will not make any sales.

Usually the owner or designer of the website is the person designated to drive traffic to the site. The chief ingredient in generating traffic is the search engine. Of course, you can use advertising, but it's going to cost you. Using the search engines to generate targeted (interested in your product) traffic is the least expensive method known.

Unfortunately, many website owners do not understand the importance of search engine visibility, which is what leads to traffic. They place more importance on producing a "pretty" website. Not that this is bad, but it is secondary to search engine placement. Hopefully, the following list of common mistakes made by many website owners will help you generate more targeted traffic to your site...after all, isn't that what you want?

1. Not using keywords effectively.
This is probably one of the most critical areas of site design. Choose the right keywords and potential customers will find your site. Use the wrong ones and your site will see little, if any, traffic.

2. Repeating the same keywords.

When you use the same keywords over and over again (called keyword stacking) the search engines may downgrade (or skip) the page or site.

3. Robbing pages from other websites.

How many times have you heard or read that "this is the Internet and it's OK" to steal icons and text from websites to use on your site? Don't do it. It's one thing to learn from others who have been there, and another to outright copy their work. The search engines are very smart and usually detect page duplication. They may even prevent you from ever being listed.

Google's Misogynistic Autocomplete Suggestions: Who's Responsible?

Google’s autocomplete function is a complicated matter. Scratch that. The web’s content as a whole is a complicated matter. Or is it the world we live in that’s complicated? Actually, it’s a little of all three. And nowhere is this more apparent than in a recent campaign by UN Women, which demonstrates perceptions held across the globe about what women should and should not do, using Google autocomplete as the catalyst.

But what’s Google to do? Censor free speech? Well, that’s tricky. People will always hold strong beliefs offline, which will be reflected in the content that lives online. And Google isn’t going to change that without first censoring free speech.

What about suggesting something before a person has even finished typing his or her query – is that something Google should be policing?

Should Autocomplete Exist?

Many people use Google’s autocomplete function on a regular basis and find it helpful. In fact, a 2011 eye-tracking study by Rosetta showed autocomplete was one of the most-used search features.

Autocomplete suggestions are served up by an algorithm that takes into account the popularity of certain search phrases, the location of the person searching, the freshness of the search query and a person’s previous searches.
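
To make that concrete, here is a toy sketch of prefix-based suggestion ranking. It illustrates the general idea only; Google's actual signals and weights are not public, and the query counts below are invented.

    # Toy autocomplete: filter stored queries by prefix, then rank by
    # popularity with a light boost for the user's own history.
    # Purely illustrative -- not Google's algorithm.
    query_popularity = {
        "weather today": 9000,
        "weather radar": 7000,
        "web hosting": 5000,
    }
    user_history = {"weather radar"}   # hypothetical prior searches

    def suggest(prefix, k=3):
        candidates = [q for q in query_popularity if q.startswith(prefix)]
        return sorted(
            candidates,
            key=lambda q: query_popularity[q] * (2 if q in user_history else 1),
            reverse=True,
        )[:k]

    print(suggest("we"))   # ['weather radar', 'weather today', 'web hosting']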

Marketers sometimes use the info in autocomplete to get a sense of what the majority of people are searching for when using a keyword or set of keywords.

At face value, Google autocomplete doesn’t sound all that bad, until you start talking about important issues like the misogynistic suggestions that are surfacing and referenced in the ads UN Women put out.

Should Google Censor Autocomplete Then?

Well, Google has and does censor autocomplete and the connected Google Instant feature (serving up results before a person hits enter) at will. There is a rumored blacklist of terms that Google won’t assist searchers in finding results for in autocomplete or in Instant.

Google AdSense and AdWords - Like Yin and Yang

Many websites include a section or two with the "Ads by Google" above it or below it. These are ads displayed via Google's AdSense. When you do a search on Google and see ads in the search results pages, they are generally AdSense ads.

AdSense - The Yin:

Google's AdSense is programming that "senses" the content of a page or search. It finds keywords on a web page or in a search phrase to determine the subject of the content. It does this by either "reading" the page, or taking a look at the search term that was typed into Google Search. It can "Sense" which ads in the system are relevant to the content or search and display them - thus: "AdSense."

Before ads on a page are displayed, AdSense searches its database of advertisers and finds ads that are associated with keywords on the page or in the search. Now AdSense needs to decide which of the thousands of ads vying for position are actually displayed. To make this decision, AdSense looks at the advertisers' bids for the relevant keywords. The advertisers with the best combination of high bid, keyword relevance and clickthrough rate are displayed first.
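
As a rough illustration of that ranking idea, the sketch below scores ads by a blend of bid, relevance and clickthrough rate. The formula and the numbers are invented for demonstration; they are not Google's actual auction mechanics.

    # Toy ad ranking: higher bid, better relevance and better CTR all
    # raise the score. Illustrative only.
    ads = [
        {"advertiser": "A", "bid": 1.50, "relevance": 0.9, "ctr": 0.04},
        {"advertiser": "B", "bid": 2.00, "relevance": 0.5, "ctr": 0.02},
        {"advertiser": "C", "bid": 1.00, "relevance": 0.8, "ctr": 0.06},
    ]

    def ad_rank(ad):
        return ad["bid"] * ad["relevance"] * ad["ctr"]

    for ad in sorted(ads, key=ad_rank, reverse=True):
        print(ad["advertiser"], round(ad_rank(ad), 4))   # A, C, B

Note that under this toy scoring, advertiser B's higher bid doesn't win: relevance and clickthrough pull A and C ahead, which mirrors the point above.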

Matt Cutts on How to Guest Blog Without Looking Like a Spammer to Google

There's a lot of speculation about the value of guest blogging. Many guest blogs don't have the highest of standards, and could be seen as paid links in the eyes of Google, rather than a legitimate way to gain exposure and direct traffic.

In the latest webmaster help video, Google's Distinguished Engineer Matt Cutts addresses the question of how you can guest blog without it looking like you pay for links.

Cutts clearly specified what makes something look like a paid link campaign via guest blogging as opposed to a genuine guest blog campaign. When you really drill down, the differences are quite obvious.

Here is what generally makes a guest blog post appear to be part of a larger paid link scheme, according to Cutts.

"Usually there is a pretty clear distinction between an occasional press blog versus someone who's doing a large scale paid link kind of thing," Cutts said. "If you're paying for links, it's more likely that it's an off-topic or an irrelevant blog post that doesn't really match the subject of the blog itself. It's more likely you will see keyword-rich anchor text and that sort of thing."

So what makes a guest blog post seem legitimate?

"It's more likely to be hopefully someone who's expert, there will usually be a paragraph about who the person is and why you invited them to be on your blog," Cutts said. "Hopefully the guest blogger isn't dropping keywords in their anchors nearly as much as these other sorts of methods of generating links."

He said there's clearly overlap between the two, and sometimes Google's web spam team needs to look at those guest blog posts to determine whether they fall under paid links or are great guest blog posts that are of value to your blog and your blog's audience.

Google Keyword '(Not Provided)': How to Move Forward

Without a doubt, Google's recent changes make performance reporting less accurate. SEO professionals and marketers no longer have the raw data that we once used to measure SEO results. We will need to use different KPIs and trending metrics to approximate the data that is now lost.

However, these changes aren't a surprise. The SEO community has widely assumed for some time that this change was coming (although few expected it to be so soon).

Google isn't the only company making "secure search" a priority. Browsers such as IE10, Firefox 14+, and Mobile Safari have put measures in place to mask keyword referral data.

Fortunately, many SEO professionals and organizations have been preparing for this eventuality. It starts with having a solid plan in place to report on data that we know historically has a high correlation to the success that we were once able to directly measure.

The good news is, unlike Google's Panda and Penguin updates, this change doesn't affect our approach to optimization; for the most part it affects only performance reporting, along with our ability to use analytics data for keyword research.

As Google's Distinguished Engineer Matt Cutts has said: "Succeeding in SEO will be the same as it's always been if you're doing it right – give the users a great experience."

Foursquare Self-Serve Ads Now Open to All Local Businesses

Location-based social networking brand Foursquare has rolled out self-serve ads to all advertisers, in an effort to monetize their audience of 40 million users.

Foursquare Ads, which were introduced in July, use a pay-per-action model; advertisers are charged only when a user clicks through to see their business information or checks in at their physical location.

Ads are displayed to people nearby who are searching for relevant information or have visited similar locations. Foursquare won't display ads to users already checked into the business.

Ads appear when users first open the Foursquare app; they may see an ad based on their usage, location and history. Ads may also appear on a search results page.

Local businesses face a specific problem, Foursquare said in their blog post announcement: they want to get people in the door, but tons of people walk by the storefront without coming in.

They've used data acquired over their four years of consumer interactions – and relationships with over 1.5 million claimed businesses – to inform their new advertising offering, they say.

The ad creation process is relatively straightforward. Advertisers are asked to first search for their business, to ensure it's been claimed.

After logging in, advertisers then upload a picture and add content, in the form of a discount offer, tip or review. They can set a monthly budget and control their ads through the Foursquare advertising dashboard, where they can see data on views, spend and actions taken.


Google Shared Endorsements Ads to Include User Names, Photos, Ratings, Comments


Google is expanding its “Shared endorsements” program. Starting in November, your name, photo, comments, and ratings or +1's you’ve given to a brand or local business can show up in both organic search results and in advertisements for that brand or business.

"This only happens when you take an action (things like +1’ing, commenting or following) – and the only people who see it are the people you’ve chosen to share that content with," Google explained in its announcement.

Google reminded users with its updated terms that users control how their endorsements are displayed in ads through their settings.

On November 11, Google+ will update its terms of service agreement to clarify how your profile name and photo might be used in Google products, including in reviews and advertising.

“This update to our Terms of Service doesn’t change in any way who you’ve shared things with in the past or your ability to control who you want to share things with in the future,” Google said in its announcement.

If you'd like to check your settings, here are a couple of simple steps:

Sign into your Google account. Go to the "Shared Endorsements" settings page.
You can enable or disable the setting that allows people to see your name and photo in shared endorsements on ads by checking or unchecking the box on that settings page.



Bing It On Challenge Hits London: Bing Still Finds Bing Superior to Google

Just days after the validity of the Bing It On Challenge faced serious questions, Microsoft has apparently decided that people have gone too long without marketing pokes aimed at Google, and has kicked off another of its Bing It On advertising campaigns.

On the Bing It On UK site, you can elect to watch a video where Microsoft takes an actor into the streets of London and does a search test that involves taking "jumpers" from people whenever Bing defeats Google in a challenge.

Which search engine gives better quality results, Bing or Google? We know that many of you would likely answer Google - it's time for that to change! We are delighted to announce that in blind tests, using the UK's most popular web searches, more people picked Bing than Google*. See what happened when we took the Bing It On challenge to the streets of London. You may be surprised by the result!

Why the asterisk? Because the tests, which had a sample size of 1,000 people and took place between December 2012 and June 2013, were based on web results only – stripping out things such as Google's Knowledge Graph and ads.

It appears that Microsoft has spent good marketing money on proving that some people would select a Bing search result over a Google one, if it were delivered in a vanilla format.

Dissecting Moz's 2013 Search Engine Ranking Factors

Imagine playing a team sport where your team could be penalized at any time, but you and your teammates have only a rudimentary idea of what may or may not trigger a penalty, or even whether both teams are being held to the same standards. Sound intriguing?

If that appeals to you, you may be an SEO professional. Or an anarchist. Maybe both.

Moz recently published the full results of their 2013 Search Engine Ranking Factors survey. SEW previously reported on the initial findings, but I decided to take a look to see if I'd find anything of substance. I thought if nothing else, the input of more than 120 SEO professionals and a scientific approach should be a good starting point.

The report was apparently compiled in two phases: a survey of 129 SEOs, soliciting their impressions and opinions on the importance of various factors and signals, as well as a scientific correlation analysis of over 14,000 keywords.
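
For readers unfamiliar with the method, here is a minimal sketch of the kind of computation a correlation study like this runs, on made-up data: Spearman's rho between a page-level feature and ranking position.

    # Minimal sketch of a rank-correlation check on invented data.
    # Requires scipy (pip install scipy).
    from scipy.stats import spearmanr

    serp_position   = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    linking_domains = [90, 70, 85, 40, 35, 30, 20, 25, 10, 5]  # hypothetical

    rho, p = spearmanr(serp_position, linking_domains)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
    # A strongly negative rho means more linking domains go with better
    # (lower-numbered) positions -- which is correlation, not causation.

Keep that last comment in mind; it is exactly the distinction the rest of this piece turns on.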

Here's what the respondents to the survey had to say:

Ranking Factors or Correlation Estimates?

Before we get into the details, let me just say that the first problem I have with this report is that it's poorly named. Calling it a search engine ranking factors report might lead some to believe it details search engine ranking factors. Duh! Wrong!

The report undoubtedly skirts a number of actual ranking factors (and signals), but where it does, it's pure coincidence. What this report describes would be more aptly named the Moz Search Correlations Report.

Now to be fair, the report, and every mention of it, is quite clear on the fact that what it calls out is simply correlation. They aren't presenting it as a listing of actual search engine ranking factors. But the name is misleading. Terribly so.

Google Penguin 2.1: Who Got Hit?

Everybody's favorite (or least favorite) aquatic bird is back, and now site owners are once again asking how Google Penguin 2.1 has affected their website.

As he did for Penguin 1.0 and Penguin 2.0, Glenn Gabe at G-Squared Interactive has analyzed 26 websites impacted by the algorithmic change to determine what factors contributed to sites that got hit.

Gabe, who has now analyzed more than 275 sites hit by Penguin, told Search Engine Watch that he believes Google Penguin 2.1 had a much greater impact than its predecessor.

Why Sites Were Hit by Google Penguin 2.1

Not surprisingly, Penguin 2.1 appears to have identified newer link spam – links created after the Penguin 2.0 rollout in May, Gabe said.

Of that link spam, Gabe said the following represent the culprits (a rough first-pass check for the anchor-text pattern is sketched after the list):
 
  • Forum spam: This includes comments in forums with exact match anchor text links.
  • Forum bio spam: Biographies of forum users containing exact match anchor text links.
  • "Do follow" blogs: Blogs that don't add nofollow to the links posted. "Let's face it," Gabe said. "Being listed on do-follow resource sites can absolutely send Google a signal that you are trying to game links."
  • Blogroll spam: Watch for blogroll links gone wrong. "Some may be fine," Gabe said. "If you are unsure which ones are bad versus good, ask for help from a seasoned SEO."
  • Spammy directories: If you've used spammy directories in the past, and still have links out there, Gabe said "nuke them, have them nofollowed, or disavow them."
  • Blog comment signature spam: Google seems to be targeting these links even when they're not followed, Gabe said.
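
For a quick first pass over your own link profile, the sketch below flags exact-match anchors in a backlink export. It assumes a CSV with "anchor" and "url" columns; the file name, column names and target phrases are hypothetical, and, as Gabe notes, anything it flags still needs human review.

    # Rough first-pass check for exact-match anchor text in a backlink
    # export. File name and column names are assumptions.
    import csv

    money_terms = {"cheap shoes", "buy shoes online"}   # your target phrases

    with open("backlinks.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if row.get("anchor", "").strip().lower() in money_terms:
                print("exact-match anchor:", row["anchor"], "->", row["url"])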

The World of Duplicate Content - Use of a Filter

The World Wide Web is like a running race or marathon where websites compete to reach the finish line first. In this case, the finish line is a higher ranking. And in this race for supremacy, it is important to avoid duplicate content and its penalties.

To keep their indexes working efficiently, search engines have been armed with content filters, which remove or demote duplicate content in the pages they index. And the most hurtful penalty is lower rankings.

Unfortunately, these filters catch not only rogues but genuine web pages too. What webmasters need to do is understand how the filters function and know what action to take to avoid being filtered out.

When a search engine sends out its spiders, the filters sieve out the following (a toy near-duplicate check is sketched after this list):

• Websites that feature identical content, and sites where the webmaster includes many copies or versions of pages to cheat the search engines. Filters are also extremely sensitive to "doorway" pages.
• Content masked by different packaging. Known as "scraped content" this duplication of pages with little or no relevant changes falls prey to filters.
• Product descriptions featured by e-commerce sites. Most e-commerce sites publish alongside a product the manufacturer's description of the product and this content then appears on zillions of e-commerce sites falling victim to filters.
• Articles distributed widely over the net. While some engines are programmed to find the origin of an article, others may not be able to trace its source.
• Pages that are not duplicates but contain the same core material written by different people.
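
As promised above, here is a toy near-duplicate check: word shingles compared with Jaccard similarity. It is a simplified stand-in for the content filters described in this article, not any engine's actual algorithm.

    # Toy near-duplicate detector: split pages into 3-word shingles and
    # compare with Jaccard similarity. Illustrative only.
    def shingles(text, k=3):
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def jaccard(a, b):
        return len(a & b) / len(a | b)

    page1 = "the quick brown fox jumps over the lazy dog today"
    page2 = "the quick brown fox jumps over a sleepy dog today"

    print(f"similarity = {jaccard(shingles(page1), shingles(page2)):.2f}")
    # Scores near 1.0 suggest duplication; near 0.0, distinct content.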
 

How to Raise Your Rankings in Search Engines - Basic SEO

I don't know how many times I've been asked: "How do I get traffic to my website?"
To best discuss this topic, I'll break it down into categories.

1. Title Tags

The title should contain keywords and keyword phrases that are important to your site. My recommended maximum number of characters for this tag is 60, and remember that spaces count toward that total.
Titles should also appeal to the reader; otherwise even a top position will lose a lot of clicks.
For example, "shoes,nike shoes,best shoes,review shoes" is unlikely to induce a click. What may induce a click is something like:
Shoes - Find out the latest styles on name brand shoes.

2. Description

The maximum number of characters I recommend for this tag is 150. Any longer and it will be cut off, and may count against your site being listed high in the search engines. Try to repeat the keywords you used in your title in a proper sentence, written in the third person. Avoid "I," "me," "myself," etc.

3. Keywords

My recommended maximum number of characters for this tag is 250. Any more than that may be considered spamming. Keep your keywords focused on what your site is about.
Don't assume that stuffing in an assortment of unrelated keywords will make your site sell; targeted marketing starts with your keywords. A quick way to check all three limits is sketched below.
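
This is a simple sketch against the 60/150/250 character guidelines given above; the sample tag values are placeholders, so substitute your own.

    # Check title/description/keywords lengths against the guidelines
    # above. Spaces count toward the totals, as noted. Sample values
    # are placeholders.
    LIMITS = {"title": 60, "description": 150, "keywords": 250}

    tags = {
        "title": "Shoes - Find out the latest styles on name brand shoes.",
        "description": "Find the latest styles in name brand shoes, "
                       "with reviews of popular brands.",
        "keywords": "shoes, name brand shoes, shoe reviews, shoe styles",
    }

    for name, value in tags.items():
        n = len(value)
        print(f"{name}: {n}/{LIMITS[name]} chars ->",
              "OK" if n <= LIMITS[name] else "TOO LONG")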
 

BrightEdge Shifts Marketers from Keyword to Page Performance

While Google was quietly planning its release of fully secure search, BrightEdge had already launched new optimization and tracking features that analyzed websites from a page level to fill the soon-to-be gap and prepare for the shift in marketing mindset that BrightEdge said it already saw on the horizon.
Page reporting is a tracking feature in BrightEdge that shows the value of pages from organic search traffic. This report complements the existing "page manager" feature, which helps BrightEdge customers optimize at the page level, not just the keyword level.

"The way page reporting works is we integrate with site analytics, pull in page data and overlay that with keyword and rank data," said Brad Mattick, VP of marketing and products at BrightEdge. "And then we let businesses define page groups to model their business structure."

Mattick said this offers "powerful and flexible" ways of understanding the performance of pages in groups, for both B2Bs and B2Cs, based on a site's defined conversions.

Mattick said Google's secure search move helps marketers step away from "intermediate metrics" such as keywords and start looking at the performance of content as a whole.

"For any search marketer who has been paying attention, secure search should not have been a surprise," Mattick said. "This is a very positive change for the industry."

So what can marketers do who want to make the transition from the keyword-focused model of success to a new paradigm?

Mattick pointed to the "triangulation" method in the interim, whereby a person can take into account a keyword's search volume, its rank in the search results and the click rate to estimate how much traffic the keyword is driving (a method Mattick said is already built into BrightEdge's tools).

Mattick said you can also factor in revenue if you have a typical conversion rate on your site or on pages of your site, or a typical transaction value that you can apply to the equation.

"But the really important point in all this is that it's just estimation," Mattick said. And because estimation can only take you so far, Mattick said the things marketers can know for sure when tracking progress are visits, conversions, and rank by page.

Yeah, But...It's So Easy to Run PPC Campaigns

We hear this objection running through business owners' and marketing executives' minds a lot when we present the idea of managing their PPC campaigns for them. Many of them seem to fear allowing someone outside of their organization (or maybe even someone on the inside) to take over something they feel they already have control over. This is an understandable feeling, especially if their campaigns are making a profit.

They most likely feel that since they've built something that's working, why risk having someone else mess it up? Combine this with the fact that PPC platforms have put forth a lot of effort to make it easy for anyone to set up campaigns, and you can see why it might be hard to believe that letting someone else manage your PPC campaigns (while paying them a fee) is necessary. Therefore, they become "yeah, but-ers."


Why People Think It's Easy


Because it is. It can literally take 5 minutes to give your campaign a name, establish your audience with some keywords or interest categories, write your ad and hit enable. With a few clicks, you can be sending loads of traffic to your site just like that.

The problem is that although the interfaces of most platforms are easy to use, the systems that the platforms run on are complex. Given all of the buttons, bells and whistles that comprise the "control room" of the typical interface, when you go below the surface of creating a campaign, it can start to feel like you're sitting in an airplane cockpit.

Google Fixes Webmaster Tools Bug, Missing Search Query Data to Return

When Google turned on secure search last Monday, it meant webmasters were seeing nearly all their keyword referral data as "(not provided)". The best workaround to get organic search data was to access similar keyword data under Search Queries within the Search Traffic section of Google Webmaster Tools.

However, once the secure search switch was flipped, that keyword data stopped being updated or provided in Google Webmaster Tools. Well, webmasters can breathe a sigh of relief, as the missing keyword data in Google Webmaster Tools was simply a bug, Google has confirmed.

"We've recently fixed a small bug related to data reporting in Webmaster Tools. We expect reporting to return to normal in the coming days," a Google spokesperson told Search Engine Watch.

This is great news, as Webmaster Tools is currently the only way to get actual keyword data from Google.