Subscribe: How To Make Money Online Fast
http://maximizingrevenuetips.blogspot.com/feeds/posts/default
Language: English



How To Make Money Online Fast



How to make money online from free paid surveys, earn cashback on purchases made online, make money writing product reviews, forum posting, and much more. Make Money Online With GuruMonetizer.



Last Build Date: Mon, 15 Jan 2018 08:28:30 PST

 



Mark Zuckerberg Sued For Unloading Facebook Stock

Mon, 04 Jun 2012 15:54:44 PDT

Mark Zuckerberg and mega-social-media site Facebook have both been in the news quite a bit recently, between the company's disastrous IPO and the many, many people who are unhappy with it. Now, a class-action lawsuit is being brought against Zuckerberg by some of those unhappy investors, who claim the mogul unloaded a huge amount of stock on inside information that it wasn't worth its estimated value.

Zuckerberg is no stranger to legal battles; last year he was sued over a Facebook page that garnered a wave of backlash from the Jewish community, and just a week after the company went public, it was sued for hiding "unfavorable growth forecasts" before the IPO. Oh, and there was that whole Winklevoss scandal.

This lawsuit, however, could get big very quickly. It claims that JP Morgan, Goldman Sachs, and Morgan Stanley all tipped off only the investors with the largest stake in the company about the serious undervaluation of the stock well before the IPO, leaving everyone else who invested their hard-earned dollars in the dust. As of Monday morning, shares had fallen to around $26.00 apiece, far below the initial $38.00 offering price.

Former Wall Street analyst Henry Blodget spoke up on behalf of investors, calling the practice "absurd and unfair." He goes on to say that the SEC should change its rules about such information being shared, asserting that every investor has the right to know what's going on with an IPO. "This is an absurd and unfair practice," he said. "The estimates themselves are material information -- the consensus of smart, well-trained analysts who have worked with the company's management to develop realistic forecasts. Most investors don't even know that these estimates exist, let alone that they're whispered verbally to only a handful of big investors. All potential investors should have easy access to these estimates, as well as to any logic underlying them. The SEC needs to change the rules here."

While the exact amount of Zuckerberg's sold shares isn't known, rumors put it around a billion dollars, and that adds up to a lot of angry stockholders. There have been accusations that Zuckerberg himself is to blame for the disastrous IPO on the grounds that he is an egomaniac who allowed the company to offer inflated projections in order to justify Facebook's $100 billion valuation. The fact that Facebook's head honcho took off on a honeymoon right after the stock began to tank isn't sitting well with investors, either. It looks like this case could get nasty very quickly, as the very people who are supporting Facebook are demanding answers and accountability from its owner. [...]



Google Announces New Features for DoubleClick Ad Exchange

Sun, 22 Aug 2010 09:00:48 PDT

Google announced that it will roll out some new tools for DoubleClick Ad Exchange buyers. These features, the company says, will help buyers buy quality inventory and check their campaigns.

The DoubleClick Ad Exchange was launched in September as a real-time marketplace where online publishers and ad networks/agencies can buy and sell ad space for prices set in a real-time auction.

One new feature is called "Site Packs" which the company describes as "manually crafted collections of like sites based on DoubleClick Ad Planner and internal classifications, vetted for quality."

Google is also making changes to its Real-time Bidder. "The biggest change here is for Ad Exchange clients who work with DSPs," says DoubleClick Ad Exchange Product Manager Scott Spencer in an interview with AdExchanger.com (reposted to Google's DoubleClick Advertiser Blog). "Historically, Ad Exchange buyers were hidden from publishers behind their DSP. By introducing a way to segment out each individual client's ad calls, inventory can be sent exclusively to an Ad Exchange buyer even when that buyer uses a DSP. It increases transparency for publishers and potentially gives buyers more access to the highest quality inventory, like 'exclusive ad slots' – high quality inventory offered to only a few, select buyers as determined by the publisher."

Google will also roll out a beta of a feature called "Data Transfer", which is a report of all transactions bought/sold by clients on the Ad Exchange.



Like.com Announces Acquisition by Google

Sun, 22 Aug 2010 09:00:22 PDT

Rumors surfaced recently that Google was buying Like.com. Those rumors have now been confirmed, as Like.com has announced the news on their site. Founder and CEO Munjal Shah writes:

Since 2006, Like.com has been moving the frontiers of eCommerce forward one step at a time. We were the first to bring visual search to shopping, the first to build an automated cross-matching system for clothing, and more. We didn't stop there, and don't have plans to stop now. We see joining Google as a way to supersize our vision and supercharge our passion. This is something we are truly excited about.

Along the way we built a team that was not just hard working but obsessed with the mission at hand. We are so very proud of this team, and they deserve all the credit for how far we have come. In addition, there are many folks outside the company who have been pivotal to our success. All the Like.com alumni are incredible folks who left our little company better than they found it. Our investors were patient, insightful, and supportive of our plans to build a bigger platform. Our merchant partners were cutting edge and innovative, and in many cases they were willing to try new approaches and new technologies to better the user experience.

The company says it has developed technology that lets it understand what terms like "red high-heeled pumps" or "floral patterned sleeveless dress" mean, and has created algorithms to understand whether or not items will complement or clash with one another. The company also operates personalized shopping site Covet.com, user-generated fashion site Weardrobe.com, and Couturious.com, which it says "pushes the envelope on Rich Internet Architectures."

Google is obviously impressed with the technology behind Like.com, and it will be interesting to see what they do with it.

Financial details of the acquisition have not been disclosed. TechCrunch says it's heard that the price was over $100 million.



Aardvark Already Part of Google, Answers will Show Up in Search

Sat, 13 Feb 2010 10:40:48 PST

Yesterday, news broke that Google was acquiring social Q&A site Aardvark for about $50 million. Aardvark sent its users an email today saying:
Dear friends,

Aardvark has just been acquired by Google!

Aardvark will remain fully operational and completely free, providing quick, helpful answers to all of your questions. For more information about how the acquisition affects Aardvark users, check out the FAQ that we've put together....


"We want social search to reach hundreds of millions of people around the world, and joining with Google lets us reach that scale — we’re also excited to work with the team at Google: our company has a culture that was inspired by Google in many ways, and we have a lot of respect for the folks who work there," the company says in a blog post.

Aardvark is already available in Google Labs. Users will keep the same Aardvark account. It will continue to work under Google.

The company says it will keep introducing new features, fixing bugs, and improving speed and quality. They say the main thing that is going to change is that they will be able to move faster with the support of Google.

User questions and answers will show up in search results on Google, Bing, Yahoo, and other search engines if you choose to share them publicly.

Ask (formerly Ask Jeeves) thinks Google is coming after its business.



Track Your Family With New iPhone App

Sun, 07 Feb 2010 08:40:38 PST

AT&T has introduced its FamilyMap App for the iPhone, which allows users to track the location of family members.

Users can download FamilyMap from the App Store on iPhone or via iTunes. Users can track two phones on an account for $9.99 a month, or up to five phones for $14.99 per month. The FamilyMap App can also be used on most other AT&T smartphones. Previously, the service was only available via a desktop.


Features of the FamilyMap App include:

  • Interactive Map: View whereabouts within an interactive map, including surrounding landmarks such as schools and parks, and toggle between satellite and interactive street maps.
  • Personalize: Assign a name and photo to each device within an account, and label frequently visited locations such as "Bobby's house" and "School."
  • Schedule Checks: Use the app to see if a family member is on schedule. Parents can schedule and receive text and email alerts.
  • My Places: Set up and view a list of landmarks within the app. Users can display the landmark on the map, edit the landmark's details, and remove or add landmarks.



Google Doesn't Care if You USED to Get Links

Sun, 07 Feb 2010 08:36:59 PST

You may have gotten some good links in the past, but don't count on them helping you forever. Old links go stale in the eyes of Google.

Google's Matt Cutts responded to a user-submitted question asking if Google removes PageRank coming from links on pages that no longer exist. The answer to this question is unsurprisingly yes, but Cutts makes a statement within his response that may not be so obvious to everybody. "In order to prevent things from becoming stale, we tend to use the current link graph, rather than a link graph of all of time," he says. (Emphasis added)

Now, this isn't exactly news, and to the seasoned search professional, probably not much of a revelation. However, to the average business owner looking to improve search engine performance (and not necessarily adapting to the ever-changing ways of SEO), it could be something that really hasn't resonated. Businesses have always been told about the power of links, but even if you got a lot of significant links a year or two ago, that doesn't mean your content will continue to perform well based on that.

WebProNews has discussed the value of "link velocity" and Google's need for freshness in the past:

Link velocity refers to the speed at which new links to a webpage are formed, and by this term we may gain some new and vital insight. Historically, great bursts of new links to a specific page have been considered a red flag, the quickest way to identify a spammer trying to manipulate the results by creating the appearance of user trust. This led to Google's famous assaults on link farms and paid link directories. But the Web has changed, becoming more of a live Web than a static document Web. We have the advent of social bookmarking, embedded videos, links, buttons, and badges, social networks, and real-time networks like Twitter and FriendFeed. Certainly the age of a website is still an indication of success and trustworthiness, but in an environment of live, real-time updating, the age of a link as well as the slowing velocity of incoming links may be indicators of stale content in a world that values freshness.

So how do you keep getting "fresh" links? There are a number of things you can do:

  • Keep putting out content. Write content that has staying power, and link to your old content when appropriate.
  • Always promote the sharing of your content. Include buttons to make it easy for people to share your content on their social network of choice. You may want to make sure your old content is presented in the same template as your new content so it has the same sharing features. People may still find their way to that old content, and they may want to share it if encouraged.
  • Go back over old content and look for stuff that is still relevant. You can update stories with new posts adding a fresher take, linking to the original. Encourage readers to follow the link and read the original article, which they may then link to themselves.
  • Leave commenting on for ongoing discussion. This can keep an old post relevant. Just because you wrote an article a year ago does not mean that people will not still add to it, and sometimes people will link to articles based on comments that are left.
  • Share old posts through social networks if they are still about relevant topics. You don't want to just start flooding your Twitter account with tweets to all of your old content, but if you have an older article that is relevant to a current discussion, you may share it as your take on the subject. A follower who has not seen it before, or perhaps has forgotten about it, may find it worth linking to themselves.

Can you think of other ways to get more link value out of old content? [...]
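Cutts' remark about the "current link graph" can be illustrated with a toy PageRank computation. To be clear, this is a simplified sketch of the published PageRank idea, not Google's actual system; the three-page graph, the damping factor, and the node names are all invented for illustration. The point it demonstrates: once a link drops out of the graph, the rank it used to pass along drops out with it.

```python
# Toy power-iteration PageRank. Only edges present in the graph at
# computation time contribute rank -- a removed ("stale") link passes nothing.
def pagerank(graph, damping=0.85, iterations=50):
    """graph: dict mapping each page to the list of pages it links to."""
    n = len(graph)
    ranks = {page: 1.0 / n for page in graph}
    for _ in range(iterations):
        # Every page starts each round with the base (1 - d) / n share.
        new_ranks = {page: (1 - damping) / n for page in graph}
        for page, outlinks in graph.items():
            if not outlinks:
                continue  # dangling page: nothing to distribute
            share = damping * ranks[page] / len(outlinks)
            for target in outlinks:
                new_ranks[target] += share
        ranks = new_ranks
    return ranks

# Hypothetical graph: page "c" links to "a"...
graph = {"a": ["b"], "b": ["a"], "c": ["a"]}
ranks_with = pagerank(graph)

# ...and the same graph after c's link goes away.
graph_stale = {"a": ["b"], "b": ["a"], "c": []}
ranks_without = pagerank(graph_stale)

print(ranks_with["a"] > ranks_without["a"])  # True: "a" loses c's vote
```

In other words, the old link isn't "penalized"; it simply no longer exists in the graph being computed over, so its contribution to your page's rank evaporates.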



Google Cranks Up Number of Sitemaps Allowed

Sun, 24 Jan 2010 08:44:30 PST

Google has at some point quietly increased its sitemaps limit from 1,000 to 50,000. In a discussion on a Google Webmasters forum thread back in April of last year, Google employee Jonathan Simon said that each sitemap index file can include 1,000 sitemaps.

Just recently, however, David Harkness posted to that same thread, pointing to official Google documentation for sitemap errors, which says under the "Too many Sitemaps" error:

The list of Sitemaps in your Sitemap index exceeds the maximum allowed. A Sitemap index can contain no more than 50,000 Sitemaps. Split your Sitemap index into multiple Sitemap index files and ensure that each contains no more than 50,000 Sitemaps. Then, resubmit your Sitemap index files individually.

The larger number was confirmed by Simon, who came back to the conversation, saying, "Thanks for resurfacing this thread as we've improved our capacity a bit since then. The limit used to be 1,000. The Help Center article you point to is correct. The current maximum number of Sitemaps that can be referenced in a Sitemap Index file is 50,000."

As Barry Schwartz at Search Engine Roundtable, who stumbled across this post, points out, "This is a huge increase in capacity...Still, each Sitemap file can contain up to 50,000 URLs, so technically 50,000 multiplied by 50,000 is 2,500,000,000 or 2.5 billion URLs can be submitted to Google via Sitemaps."

In other words, you can have a lot of sitemaps in one sitemap index file. That's some good information to know, and it is a little surprising that there wasn't a bigger announcement made about this.
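The 50,000-sitemap cap is easy to respect mechanically. As a rough sketch (not an official Google tool; the helper function and example URLs are made up for illustration), a script can split a large list of sitemap files across however many index files the limit requires:

```python
# Sketch: build sitemap index files that respect the 50,000-sitemap cap
# described in Google's documentation. The sitemaps.org namespace is real;
# the URLs and function name are hypothetical.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
MAX_SITEMAPS_PER_INDEX = 50_000

def build_sitemap_indexes(sitemap_urls):
    """Return one <sitemapindex> XML string per chunk of 50,000 sitemaps."""
    indexes = []
    for start in range(0, len(sitemap_urls), MAX_SITEMAPS_PER_INDEX):
        chunk = sitemap_urls[start:start + MAX_SITEMAPS_PER_INDEX]
        root = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
        for url in chunk:
            loc = ET.SubElement(ET.SubElement(root, "sitemap"), "loc")
            loc.text = url
        indexes.append(ET.tostring(root, encoding="unicode"))
    return indexes

# Example: 60,001 sitemaps won't fit in one index, so two are produced.
urls = [f"https://example.com/sitemap-{i}.xml" for i in range(60_001)]
indexes = build_sitemap_indexes(urls)
print(len(indexes))  # 2
```

Note that each individual sitemap file is itself capped at 50,000 URLs, which is where Schwartz's 2.5 billion figure comes from.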



A Markup That Could Have Big Implications for SEO

Sun, 24 Jan 2010 08:43:52 PST

RDFa, which stands for Resource Description Framework in attributes, is a W3C recommendation that adds a set of attribute-level extensions to XHTML for embedding rich metadata within web documents. While not everyone believes W3C standards are strictly necessary to operate a successful site, some see a great deal of potential for search engine optimization in RDFa.

In fact, this is the topic of a current WebProWorld thread, started by Dave Lauretti of MoreStar, who asks, "Are you working the RDFa Framework into your SEO campaigns?" He writes, "Now under certain conditions and with certain search strings on both Google and Yahoo we can find instances where the RDFa framework integrated within a website can enhance their listing in the search results."

Lauretti refers to an article from last summer at A List Apart, by Mark Birbeck, who said that Google was beginning to process RDFa and Microformats as it indexes sites, using the parsed data to enhance the display of search results with "rich snippets". "It's a simple change to the display of search results, yet our experiments have shown that users find the new data valuable -- if they see useful and relevant information from the page, they are more likely to click through," Google said upon the launch of rich snippets.

Google says it is experimenting with markup for business and location data, but that it doesn't currently display this information unless the business or organization is part of a review. But when review information is marked up in the body of a web page, Google can identify it and may make it available in search results. When review information is shown in search results, it can of course entice users to click through to the page (one of the many reasons to treat customers right and monitor your reputation).

Currently Google uses RDFa for reviews, but such a search also displays the date of the review, the star rating, the author, and the price range of an iPod, as Lauretti points out. Best Buy's lead web development engineer reported that by adding RDFa, the company saw improved rankings for the respective pages. They saw a 30% increase in traffic, and Yahoo evidently observed a 15% increase in click-through rates. (via Steven Pemberton)

Implications for SEO

I'm not going to get into the technical side of RDFa here (see resources listed later in the article), but I would like to get into some of the implications that Google's use of RDFa could have on SEO practices. For one, rich snippets can show specific information related to products that are searched for. For example, a result for a movie search could bring up information like:

  • Run time
  • Release date
  • Rating
  • Theaters that are showing it

"The implementation of RDFa not only gives more information about products or services but also increases the visibility of these in the latest generations of search engines, recommender systems and other applications," Lauretti tells WebProNews. "If accuracy is an issue when it comes to search and search results then pages with RDFa will get better rankings as there would be little to question regarding the page theme." (Source)

He provides a chart containing examples of the types of data that could potentially be displayed with RDFa. "It is obvious that search marketers and SEOs will be utilizing this ability for themselves and their clients," says Lauretti. Take contact information specifically: "Using RDFa in your contact information clarifies to the search engine that the text within your contact block of code is indeed contact information." He says that in the same light, "people information" can be displayed in the search results (usually social networking info). You could potentially show manufacturer information or author information.

RDFa actually has implicati[...]
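To make the review case concrete, here is a hypothetical snippet using the data-vocabulary.org review properties that Google's rich snippets documentation described at the time, wrapped in a small Python script that pulls out the machine-readable fields. The product, reviewer, and parser are all invented for illustration; this is emphatically not Google's parser, just a demonstration that the `property` attributes make the review data trivially extractable.

```python
# Sketch: a review marked up with RDFa-style attributes, plus a minimal
# stdlib parse that collects any element carrying a `property` attribute.
from html.parser import HTMLParser

SNIPPET = """
<div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Review">
  <span property="v:itemreviewed">iPod nano</span>
  <span property="v:rating">4.5</span>
  <span property="v:reviewer">Jane Doe</span>
  <span property="v:dtreviewed" content="2010-01-20">January 20, 2010</span>
</div>
"""

class RDFaFieldParser(HTMLParser):
    """Collects text (or the content attribute) of `property` elements."""
    def __init__(self):
        super().__init__()
        self.fields = {}
        self._current = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if "property" in attrs:
            self._current = attrs["property"]
            # Some properties carry the machine value in a content attribute.
            if "content" in attrs:
                self.fields[self._current] = attrs["content"]
                self._current = None

    def handle_data(self, data):
        if self._current and data.strip():
            self.fields[self._current] = data.strip()
            self._current = None

parser = RDFaFieldParser()
parser.feed(SNIPPET)
print(parser.fields["v:rating"])  # 4.5
```

The same human-readable page, with a handful of extra attributes, becomes unambiguous to a machine: the rating is a rating, the date is a date, and the contact block is contact information.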



How Google Rates Links from Facebook and Twitter

Sat, 16 Jan 2010 07:21:53 PST

The first Matt Cutts Answers Questions About Google video of the year has been posted, and in it Matt addresses links from Twitter and Facebook, after talking about his shaved head again. Specifically, the submitted question he answers is:

Links from relevant and important sites have always been a great way to get traffic & acceptance for a website. How do you rate links from new platforms like Twitter, FB to a website?

Essentially, Matt says Google treats links from Facebook or Twitter the same as it would links from any other site. It's just an extension of the PageRank formula, where it's not the number of links that matters, but how reputable those links are (the company uses a similar strategy for ranking tweets themselves in real-time search).
While Facebook and Twitter links may be treated like any other links, they do still come with things to keep in mind. For one, with Facebook, you have to keep in mind that a lot of profiles are not public. When a profile is not public, Google can't crawl it, and it can't assign PageRank to the outgoing links if it can't fetch the page to see what those outgoing links are. If the page is public, it might be able to flow PageRank, Matt says. With Twitter, most links are nofollowed anyway.

"At least in our web search (our organic rankings), we treat links the same from Twitter or Facebook or, you know, pick your favorite platform or website, just like we'd treat links from WordPress or .edus or .govs or anything like that," says Cutts. "It's not like a link from an .edu automatically carries more weight or a link from a .gov automatically carries more weight. But the specific platforms might have issues, whether it's not being crawled or it might be nofollow. That would keep those particular links from flowing PageRank."

There you have it. Matt's response probably doesn't come as much of a surprise to most of you, but it's always nice to hear information like this straight from Google.



U.S. Video Game Sales Reach $19 Billion In 2009

Sat, 16 Jan 2010 07:20:27 PST

U.S. sales of video games, which include portable and console hardware, software, and accessories, generated revenues of close to $19.6 billion, an 8 percent decrease from the $21.4 billion generated in 2008, according to The NPD Group.

Retail sales in the PC game software industry also saw declines, with revenues down 23 percent, reaching $538 million in 2009. The total console, portable and PC game software industry hit $10.5 billion, an 11 percent decrease compared to the $11.7 billion generated in 2008.

"December sales broke all industry records and underscores the incredible value consumers find in computer and video games even in a down economy," said Michael D. Gallagher, president and CEO of the Entertainment Software Association, the trade group which represents U.S. computer and video game publishers.

"This is a very strong way to transition into 2010. I anticipate these solid sales numbers to continue upward through 2010 with a pipeline full of highly-anticipated titles."


Portable hardware was a bright spot with a 6 percent increase in revenue in 2009, while the remaining video game categories all saw declines, with the largest decrease coming from console hardware (-13%). Console software and portable software both saw declines of 10 percent, while video game accessories saw a 1 percent dip.

"When we started the last decade, video game industry sales, including PC games, totaled $7.98B in 2000," said Anita Frazier, industry analyst, The NPD Group.

"In ten years, the industry has changed dramatically in many ways, but most importantly it has grown over those years by more than 250 percent at retail alone. Considering there are many new sources of revenue including subscriptions and digital distribution, industry growth is even more impressive."




Google PPC Click Fraud Now Harder To Detect

Sat, 16 Jan 2010 07:19:35 PST

Perpetrators of click fraud are getting sneakier and sneakier. Harvard Business School professor Ben Edelman has uncovered one of the more diabolical click fraud schemes known to be hatched. As he summarizes it:

Here, spyware on a user's PC monitors the user's browsing to determine the user's likely purchase intent. Then the spyware fakes a click on a Google PPC ad promoting the exact merchant the user was already visiting. If the user proceeds to make a purchase -- reasonably likely for a user already intentionally requesting the merchant's site -- the merchant will naturally credit Google for the sale. Furthermore, a standard ad optimization strategy will lead the merchant to increase its Google PPC bid for this keyword on the reasonable (albeit mistaken) view that Google is successfully finding new customers. But in fact Google and its partners are merely taking credit for customers the merchant had already reached by other methods.

Edelman details all of the specifics of his discovery, pointing to an example perpetrator, TrafficSolar, which he blames InfoSpace for connecting to Google. He also suggests Google discontinue its relationship with InfoSpace and other partners who have their own chains of partners, making everything harder to monitor. In his example, he finds an astounding seven intermediaries in the chain between the click and the Google ad itself.

"Furthermore, Google styles its advertising as 'pay per click', promising advertisers that 'You're charged only if someone clicks your ad,'" says Edelman. "But here, the video and packet log clearly confirm that the Google click link was invoked without a user even seeing a Google ad link, not to mention clicking it. Advertisers paying high Google prices deserve high-quality ad placements, not spyware popups and click fraud."

As Andy Greenberg with Forbes points out in an article which brought Edelman's findings to the forefront of mainstream exposure (and likely to Google's attention), Edelman has a history of criticizing Google, is actually involved in a lawsuit over misplacement of Google ads, and has served as a consultant to Microsoft, but he maintains that this research was not funded by Microsoft or any company involved in that lawsuit. Greenberg reports:

As for its ability to detect the new form of click fraud, Google has long argued that it credits advertisers for as much as 10% of their ad spending based on click fraud that the company detects. While the company wouldn't comment on Edelman's TrafficSolar example, a spokesperson wrote that the company uses "hundreds of data points" to detect fraud, not just clicks.

In a report last October, click fraud research firm Click Forensics measured click fraud at around 14%, significantly higher than Google's estimates. But even Click Forensics may not be counting the sort of click fraud Edelman accuses TrafficSolar of committing. Because Click Forensics' data is pulled from advertisers, the company can't necessarily detect click fraud that is disguised as real customers and real sales, according to the company's chief executive, Paul Pellman. Pellman believes, however, that the kind of click fraud Edelman discovered is likely mixed with traditional click fraud to increase the scheme's traffic volume while keeping it hidden.

Click Forensics' own Steve O'Brien says "it was probably a fairly low-volume scheme to begin with. It's limited to machines of users that are infected with spyware who also visit select Google advertisers...It's a problem, but probably not a huge one. What would make it more serious is if there were another version of the spyware that simply clicks on paid links in the background without the user's knowledge..."

As for Edelman's suggestion that Google sever ties with InfoSpace and the like, O'Brien doesn't think it i[...]



Google Could Sell Energy

Sun, 10 Jan 2010 10:49:46 PST

Remember when Google was just a search engine? We often still think about it that way, yet we are frequently reminded of the breadth of product offerings and ultimately the power the company possesses. Power, or energy rather, is actually something Google could end up selling in the future.

Google recently applied for approval from the Federal Energy Regulatory Commission for the right to purchase and monetize energy, just like the utility companies you are already familiar with do. Google has said that its actions had more to do with the enormous amount of energy it consumes itself (consider all of the machinery and equipment it takes to keep a company like Google running at its current pace on a daily basis).

Questions have been raised, however, about whether Google could actually end up functioning as a utility. From the sounds of it, the company isn't exactly ruling it out.

Jeffrey Marlow with the New York Times asked Google's "Green Energy Czar" Bill Weihl if the company views its work on alternative energy as a money-making component. In his response, Weihl noted that some of Google's initiatives come from Google.org (the company's philanthropic arm), and said:

The reason Google.org is not just a foundation is that lots of people believe that if you want to have a big impact at scale on the world, then you need to go beyond what a 501(c)3 can do, which is to make charitable grants, so you need the ability to invest in companies, to do engineering projects, to do things that might at some point actually make money.

We’d be delighted if some of this stuff actually made money, obviously; it is not our goal to not make money. All else being equal, we’d like to make as much money as we can, but the principal goal is to have a big impact for good.

Google says its goal is to make renewable energy cheaper than coal. Coal is said to be the source of about half of the electricity consumed in the US. Google is looking at concentrated solar thermal, enhanced geothermal, and wind energy.



Technology Holiday Sales Hit $10.8 Billion

Sun, 10 Jan 2010 10:48:46 PST

U.S. consumer technology retail sales fell less than one percent for the 2009 holiday season, according to a new report from The NPD Group.

The report found sales for the five-week holiday period reached $10.8 billion, a big improvement from the 6 percent decline during the 2008 holiday season.

The holiday season had its share of fluctuations. Overall revenue declined in three of the five weeks. While the second week of the season posted the largest revenue growth, it represented only 15 percent of overall holiday sales. The final week of the season also saw revenue growth and represented 22 percent of all sales, so the success of the final week was more relevant to the overall success of the season.


"The dynamics of the holiday season changed this year; the holiday season started before Black Friday as retailers ran Black Friday-like sales throughout November," said Stephen Baker, vice president of industry analysis at NPD.

"That move may have lessened the Black Friday hype for consumers, but the increase during the final week of the season is a sign that consumers either went back out or waited it out to get the best deal."

PCs and flash-based camcorders were popular this holiday, leading the way in unit growth among large categories, but total sales numbers were dependent on the success of PCs and flat-panel TVs. Combined, they accounted for 41 percent of the revenue over the five-week holiday period, up from 39 percent in 2008 and 34 percent in 2007.

Despite the high revenue, flat-panel TVs registered a decline in dollars of 13 percent, in line with their performance during most of 2009. That decline pulled down the industry overall. MP3 players, for the third year in a row, were the largest unit volume category despite increased ASPs and declining unit volumes.

"Just cutting prices this year was not enough to guarantee successful sales results," said Baker. "Flat-panel TVs had a disappointing holiday because there wasn't enough price-cutting on the right items, while notebook PCs and camcorders offered new form factors and price points that drove enormous increases in units and revenue despite falling prices."



Content Syndication Is Your Friend

Wed, 30 Dec 2009 08:34:49 PST

Content duplication has been a buzz topic in SEO for a while now. It's one of the modern webmaster's favorite things to fret over and has been for at least two years. Google doesn't like duplicate content. We all get that now. There is still the lingering perception that there is some sort of duplicate content penalty, despite repeated assurances from multiple Googlers to the contrary. Maybe there is no penalty; maybe there is some sort of mechanism at work that webmasters perceive as a penalty... it really matters very little. At the end of the day, if you aren't showing up for your own content but somebody else is, you probably aren't the happiest little webmaster.

As a result, syndication has been quite unfairly vilified. Traditionally speaking, having a site link to your content has always been perceived as a compliment of sorts (Google certainly thought it was a fair indicator of quality). That said, syndicating content... having your great content actually picked up by a larger, more influential site... was even better in a lot of ways. The syndicated content was put right in front of a whole new user base without them having to click a thing. Generally you also got a nice link back to your site to boot. If you produced a great piece of content, why not have it show up everywhere you possibly could?

Penalty or not, it is clearly the case that the site where content originates may not always rank best for that content. Google wants to keep the content of its results pages as distinct from one another as it can. In short, Google doesn't want a results page where 4 of the 10 results are all essentially the exact same article.

Here's the thing, though: syndication is good. It can drive traffic to your site. It can establish your reputation and credibility within a niche, and it can generate high-quality inbound links.
If you are upset because the larger, more recognized and more popular site's syndication of your content outranks your own, then I'd have to say you might need to rethink that one a little bit. So what if it does? You are there because you want to be exposed to the larger site's community. You want the links, attention, reputation and all the good things that go along with that, don't you? Of course you do. So if you do a search and find that the big site is number one on a good search query with your content, you don't get upset - you say 'yay'.

Why do you say yay? Because your super great content would never have that top position if not for the fact that Google found it on the larger, more authoritative site. Sure, if it's that good you can probably get a decent ranking on your own, but it won't be as good. Beyond the ranking, even if your site is #2 and the big site is #3 for the same article, guess which one is likely to get clicked through more: the link to your site, which is not all that well known, or the link to a site that somebody has heard of?

If you aren't a household name or a recognized authority in whatever areas you are covering, the fastest way to build that reputation and credibility is to become associated with the brand that is. What's the best way to do that? Get your name, your company and your link on their domain. At the end of the day, the likelihood of you just outranking them on your own for similar subject matter is probably going to be a tough order.

Abby Johnson talked to Eric Enge from Stone Temple Consulting at SES recently about the syndication vs. duplicate content problem. Eric has some great tips in the video for minimizing the negative aspects of duplication on a syndication model. Three specific items he talks about are syndicating excerpts, including a no-index tag, and writing 'a[...]
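The excerpt approach Enge mentions can be sketched in a few lines. The following Python snippet is a hypothetical illustration (the function and parameter names are mine, not from any particular CMS): it trims an article down to a short teaser and appends a link back to the original, so the syndication partner carries an excerpt plus attribution rather than a full duplicate.

```python
# Hypothetical sketch: build a syndication-friendly excerpt of an article,
# following the excerpt-plus-link-back approach described above.

def make_syndication_excerpt(title, body, canonical_url, max_words=75):
    """Return a short excerpt plus an attribution link for syndication partners."""
    words = body.split()
    excerpt = " ".join(words[:max_words])
    if len(words) > max_words:
        excerpt += "..."  # signal that the piece continues at the source
    attribution = f"Read the full article at: {canonical_url}"
    return f"{title}\n\n{excerpt}\n\n{attribution}"

snippet = make_syndication_excerpt(
    "Content Syndication Is Your Friend",
    "Content duplication has been a buzz topic in SEO for a while now. " * 20,
    "http://example.com/content-syndication",
    max_words=40,
)
print(snippet)
```

The attribution link is doing double duty here: it sends readers to the source, and it is the inbound link the article argues syndication should earn you.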



Google Provides an Update on the AdMob Acquisition

Mon, 28 Dec 2009 08:08:36 PST

Google has issued a statement regarding the company's pending acquisition of AdMob. Google's intent to acquire the company was announced back in early November. The deal was for $750 million in stock.

Since then, the Federal Trade Commission has vowed to closely scrutinize the deal. Google had this to say today:

As we said when we announced the deal, we don't see any regulatory issues with this deal, because the rapidly growing mobile advertising space is highly competitive with more than a dozen mobile ad networks.

That said, we know that closer scrutiny has been one consequence of Google's success, and we've been talking to the U.S. Federal Trade Commission over the past few weeks. This week we received what's called a "second request," which means that the FTC is asking for more information so that they can continue to review the deal.

While this means we won't be closing right away, we're confident that the FTC will conclude that the rapidly growing mobile advertising space will remain highly competitive after this deal closes. And we'll be working closely and cooperatively with them as they continue their review.

Upon announcement, Google highlighted these things about the deal:

- The deal will bring new innovation and competition to mobile advertising, and will lead to more effective tools for creating, serving, and analyzing emerging mobile ad formats.
- The deal will benefit developers, publishers, and advertisers by improving the performance of mobile advertising, and will provide users with more free or low-cost mobile apps.
- The mobile advertising space will remain highly competitive, with more than a dozen mobile ad networks.
The deal is similar to mobile advertising acquisitions that AOL, Microsoft, and Yahoo have made in the past two years.

"Mobile advertising has enormous potential as a marketing medium and while this industry is still in the early stages of development, AdMob has already made exceptional progress in a very short time," said Susan Wojcicki, Vice President of Product Management at Google, upon the announcement.

Google says that since the announcement, it has seen a quite positive reaction from advertisers and publishers, who are "enthusiastic" about the possibilities the deal might bring. It's hard to say how long the regulatory process will take, but we'll keep you posted as we learn more. [...]



Advertising Trends In 2010

Sun, 20 Dec 2009 09:24:51 PST

It's the time of year when everybody not only reflects upon trends and happenings from the year past, but also looks forward and makes predictions for the coming year. Nielsen has shared its projections for the top advertising trends for 2010. These are:

1. Optimizing media convergence is a top priority.
2. New models emerge to take advantage of smartphones.
3. More cross-media ad campaigns surface.
4. Commercialization of social networking hubs increases.
5. More interesting and interactive online ads appear.

"A better understanding of media convergence will manifest in order to deliver a better return on investment," the firm says. "The ability to accurately measure activity and link online ads to offline purchasing behavior will be critical."

Nielsen says accurate mobile measurement will be required for advertisers to stay ahead of the "snowballing growth" of that media platform, and that the massive growth of online games will lead the way for more successful interactive and cross-media advertising campaigns. The firm expects growth in innovation and adoption in this area.

Of course, social media will continue to provide new opportunities, and Nielsen thinks there will be increased use of more creative advertising and content models online. John Burbank, CEO of Nielsen Online, says the next phase of the Internet will be the "audience-centric web" and will be characterized by the audience being the center of everything, "online" no longer being an island, and richer business opportunities due to richer data being consumed.

"Whether it's reaching men aged 18 to 24, women with incomes of over $150,000, heavy users of Tide or Hispanic teens, the match of consumer need to marketing message starts with the audience," he says.
"In the audience-centric Web, that richness of insight will now be available to online marketers, just as it has been offline."Nielsen also shared its top five cross-media trends for 2010, which include: convergence in demand, second and third screen initiative growth, continued audience fragmentation, new and varied approaches to content, and the formation of multiple distribution opportunities. [...]



YouTube Partner Program Turns Two

Sun, 20 Dec 2009 09:24:05 PST

It's no secret that people work harder when given a financial incentive, and that companies also like to make a little money. Now, the program that YouTube created to capitalize on these facts has turned two years old, and YouTube's having a little celebration.

As Shenaz Zack, a product manager, explained on the YouTube Blog, the Partner Program is pretty much a huge success. He pointed out, "[M]any partners are earning real dollars from their videos. Some users have quit their jobs to concentrate on YouTube full-time, including several who make six figures a year from the site."


There have been some other interesting real-world effects as a result of the Partner Program, too. Fred's set to become a movie star. Brownbagfilms already has, with a video entering the Oscar runoffs. And Zack wrote, "Others have earned such awards as a Tech Award in Education, Digista Award in Japan, and the distinction of being one of TIME's 50 best inventions of 2009."

This string of successes is almost sure to continue. Indeed, since YouTube controls access to the program, it could open the figurative floodgates at any time, allowing all sorts of content creators the opportunity to step up their game and share revenue from ads.

Of course, this would mean having ads on all sorts of content, and would in turn generate a lot of revenue for YouTube.




Google Now Supporting rel="canonical" Across Domains

Sun, 20 Dec 2009 09:23:03 PST

Google announced that it is now offering cross-domain support for the rel="canonical" link element. If you are unfamiliar with this link element, Google's Matt Cutts has discussed it. Basically, it's a way to avoid duplicate content issues, but until now, you couldn't use it across domains.

"For some sites, there are legitimate reasons to [have] duplicate content across different websites — for instance, to migrate to a new domain name using a web server that cannot create server-side redirects," says John Mueller, Webmaster Trends Analyst with Google Zürich.

"There are situations where it's not easily possible to set up redirects," he says. "This could be the case when you need to move your website from a server that does not feature server-side redirects. In a situation like this, you can use the rel='canonical' link element across domains to specify the exact URL of whichever domain is preferred for indexing. While the rel='canonical' link element is seen as a hint and not an absolute directive, we do try to follow it where possible."

Mueller gives the following ways of handling cross-domain content duplication:

- Choose your preferred domain
- Reduce in-site duplication
- Enable crawling and use 301 (permanent) redirects where possible
- Use the cross-domain rel="canonical" link element

Barry Schwartz at Search Engine Roundtable gives three reasons why the addition of cross-domain support for the rel="canonical" link element is really important:

1. Some hosts don't allow webmasters to deploy 301 redirects
2. Some site owners aren't technical enough to implement a 301 redirect
3. In some cases, webmasters do not want to redirect users but rather only search engines (i.e. pagination, weird filtering, tracking parameters added to URLs, etc.)

To use the link element, pages don't have to be identical, but they should be similar. According to Google, slight differences are fine. You should not point rel="canonical" to the home page of the preferred site.
Google says this can result in problems, and that a mapping from an old URL to a new URL for each URL on the old site is the best way to go. You should not use a noindex robots meta tag on pages with a rel="canonical" link element, because those pages would not be equivalent with regard to indexing, Google says: one would be allowed while the other would be blocked. Google also says it's important that these pages aren't disallowed from crawling through a robots.txt file, because search engine crawlers won't be able to discover the rel="canonical" link element. [...]
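The per-URL mapping Google recommends can be sketched simply. The Python snippet below is an illustrative sketch (the domain names and paths are placeholders of mine, not from the article): for each page on the old domain, it emits the cross-domain rel="canonical" link element that page's head section should carry, pointing at its exact counterpart on the preferred domain rather than at the home page.

```python
# Hypothetical sketch: generate a cross-domain rel="canonical" element for
# each page of an old domain, using a per-URL mapping to the preferred domain.

NEW_DOMAIN = "http://preferred-example.com"  # placeholder preferred domain

def canonical_tag(old_path):
    """Build the <link> element the old page's <head> should carry."""
    preferred_url = NEW_DOMAIN + old_path  # map each old URL to its new twin
    return f'<link rel="canonical" href="{preferred_url}" />'

for path in ["/articles/seo-tips", "/articles/link-building"]:
    print(canonical_tag(path))
```

The key design point, per the article, is the one-to-one mapping: every old URL gets its own canonical target, and none of them should simply point at the preferred site's home page.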



Google's Matt Cutts Talks Future of Search

Sun, 06 Dec 2009 10:38:43 PST

This time of year, everybody likes to start making predictions about where industries are heading. This is especially true in the search industry. My guess is that we will see quite a few pieces this month regarding where search is going in 2010. These can make for entertaining reads and get the mind going with regard to how we are going to have to plan for an ever-changing future of search engine marketing.

When Google itself comes out with predictions for where search is headed, things get even more interesting. This is obviously because Google is such a huge and critical part of the search landscape. Google's Matt Cutts discussed some of his own predictions for search in a recent upload to Google's Webmaster Central YouTube channel.

One thing Matt stressed is that Google is always looking for new types of data to search. He gave examples of searching email with Gmail, books with Google Book Search, and patents with Google Patent Search. He predicts Google will continue this trend and find more data sources to provide search functionality for.

Another prediction he gave was that Google will continue to improve search over harder problems. Specifically, he noted things like determining what is really going on with the words in documents and in queries - semantic search, if you will.

"A lot of people think that if you type in 'A B C,' all Google does is crawl the web and return pages that match 'A,' 'B,' and 'C'. And that's not it," says Cutts. "We do a lot of sophisticated stuff. Think about synonyms, morphology... all sorts of ways where we can kind of find out, 'oh, this is really related to them conceptually.' Whether you want to call it semantic stuff or statistical processing, we do a lot of stuff to try and return relevant documents."

As part of this prediction, Cutts says Google will continue trying to find new ways to extract "good data" from the web. He mentions Google Squared (which is still in an experimental stage) as an example of doing so.
Google Squared, in Google's words, takes a category and creates a starter 'square' of information, automatically fetching and organizing facts from across the web.

Cutts also predicts that people will get more comfortable with storing their data in the cloud. He expects more people will migrate their data from their hard drives to different cloud services, and that this will make it easier and better for search, and contribute to the delivery of more relevant results.

He also mentions real-time and mobile as playing significant roles in the future of search. No surprise there.

"It's going to be a lot of fun. Search is nowhere near done, and every time we make search better, people ask us harder and harder questions," he says. "So the nice thing is knowing that we'll pretty much always have more to do to make search better."

Cutts recently discussed the possibility that page speed could play a role in search engine rankings. He made no mention of this in this set of predictions, but that is another thing to consider as we get ready to move into 2010. [...]



New Google Home Page: Does it Remove or Add Distraction?

Sun, 06 Dec 2009 10:32:26 PST

Google has launched its new homepage, which looks generally the same, but removes everything but the logo, search box, and two buttons until the user moves the mouse. Google says most people go to the Google home page to search, and it wanted to remove the other distractions unless users specifically want to see them.

This is an interesting philosophy, because it certainly grabs your attention when you move the mouse and a bunch of new stuff appears. Perhaps the move is really designed to draw attention to Google's other services.

"Since most users who are interested in clicking over to a different application generally do move the mouse when they arrive, the 'fade in' is an elegant solution that provides options to those who want them, but removes distractions for the user intent on searching," Google says.

Google has been testing similar designs for several months. The company explains:

All in all, we ran approximately 10 variants of the fade-in. Some of the experiments hindered the user experience: for example, the variants of the homepage that hid the search buttons until after the fade performed the worst in terms of user happiness metrics. Other variants of the experiment produced humorous outcomes when combined with our doodles — the barcode doodle combined with the fade was particularly ironic in its overstated minimalism. However, in the end, the variant of the homepage we are launching today was positive or neutral on all key metrics, except one: time to first action. At first, this worried us a bit: Google is all about getting you where you are going faster — how could we launch something that potentially slowed users down? Then, we realized: we want users to notice this change... and it does take time to notice something (though in this case, only milliseconds!).
Our goal then became to understand whether or not, over time, the users began to use the homepage even more efficiently than the control group and, sure enough, that was the trend we observed.

Judging from conversation on Twitter, opinions of the new page are pretty evenly mixed. Some think it's "snazzy," while others feel it's distracting. Some don't like that you have to move the mouse to know where specific links are.

As with any design change to a popular site, there are going to be critics and supporters. The Google home page affects a great deal of web users. What is your opinion of the new change? [...]



Link Building For New Bing Rankings: Dos and Don'ts

Fri, 27 Nov 2009 09:23:34 PST

It's easy for businesses to get caught up in Google's expectations for their sites when trying to market through search. That's certainly a wise thing to do, considering Google dominates the search market by a huge margin. Still, there are other search engines that people are using, and it is also wise to make sure your site is performing to the best of its ability in those too.

I'm obviously talking about Yahoo and Bing, but Yahoo's share is declining, while Bing's is gaining. Furthermore, if the deal between Microsoft and Yahoo goes through, Bing will be taking over Yahoo search anyway.

We don't hear as much about what Bing wants out of a site for rankings, but Rick DeJarnette of Bing Webmaster Center has shared some dos and don'ts of link building for Bing. Not surprisingly, a lot of his advice for honoring Bing's policy does not differ too much from advice that Google would give you. It is, however, still always nice to see how they feel, just to clear up any possible confusion.

Like Google, Bing places great emphasis on quality links to determine its rankings. "Just don't make the mistake of believing it will result in instant gratification. Successful link building efforts require a long-term commitment, not an overnight or turnkey solution," says DeJarnette. "You need to continually invest in link building efforts with creativity and time."

What Not To Do

DeJarnette shared a list of things that you should avoid in your link building efforts, if it is a good Bing ranking that you are after. Here is what Bing says will get your site reviewed more closely by staff:

1. The number of inbound links suddenly increases by orders of magnitude in a short period of time
2. Many inbound links coming from irrelevant blog comments and/or from unrelated sites
3. Using hidden links in your pages
4. Receiving inbound links from paid link farms, link exchanges, or known "bad neighborhoods" on the Web
5. Linking out to known web spam sites

"When probable manipulation is detected, a spam rank factor is applied to a site, depending upon the type and severity of the infraction," says DeJarnette. "If the spam rating is high, a site can be penalized with a lowered rank. If the violations are egregious, a site can be temporarily or even permanently purged from the index."

What To Do

DeJarnette also shared some tips for getting more quality links. Following are Bing's tips for effective link building (paraphrased):

1. Develop your site as a business brand and brand it consistently
2. Find relevant industry experts, product reviewers, bloggers, and media folk, and make sure they're aware of your site/content
3. Publish concise, informative press releases online
4. Publish expert articles to online article directories
5. Participate in relevant conversations on blogs/forums, referring back to your site's content when applicable
6. Use social networks to connect to industry influencers (make sure you have links to your site in your profiles)
7. Create an email newsletter with notifications of new content
8. Launch a blog/forum on your site
9. Participate in relevant industry associations and especially in their online forums
10. Strive to become a trusted expert voice for your industry, while promoting your site

Most of the stuff DeJarnette shared is nothing any savvy search marketer is not already aware of. That said, there are clearly plenty of online (and offline, for that matter) businesses out there that don't have savvy search marketers on the payroll. It can be quite helpful when a search engine itself lays out wha[...]



Iraq Comes to YouTube

Fri, 27 Nov 2009 09:21:40 PST

Google has announced that the Iraqi government has launched a dedicated YouTube channel. It can be found at youtube.com/iraqigov.

The Iraqi government joins the Pope, the Royal Family, Queen Rania, and the presidents of the United States, France, South Korea, and Estonia in having YouTube channels to communicate with the public. Here's a YouTube message from Iraq's Prime Minister Nuri al-Maliki:

"Earlier this year, I visited Baghdad as a guest of the U.S. State Department to engage in conversations about the role of technology in Iraq," says Hunter Walk, Director of Product Management for YouTube. "In discussions with elected officials, private companies and NGOs, I routinely heard the desire to connect with fellow citizens, Iraqis outside the country's borders, and cultures across the world."

"But it wasn't just the Iraqi Government who expressed an interest in YouTube — I was pleasantly surprised by the high level of awareness from a wide variety of Iraqis," he continues. "One young student told us she uses YouTube to understand what is really happening in her country based on the variety of opinions, citizen journalism and news reports uploaded to the site. There was little difference between her examples and those we often hear in other countries, which speaks to both the global community on YouTube and the universality of the video experience."

On a related note, Google CEO Eric Schmidt himself visited Iraq this past week, where he met with government officials. He offered the following video via YouTube's Citizentube site:

It's quite interesting to see how online video, social media, and YouTube in particular are changing the way governments connect with the people. It is likely that even more countries' governments will follow suit in the future. [...]



Microsoft Takes Users Behind Bing

Fri, 27 Nov 2009 09:20:58 PST

Microsoft has launched a new site for Bing, where users can go to find out the latest features that have been implemented into the search engine (excuse me, "decision engine"). The site's called Behind Bing.

"You can see each feature in action through a screencast, see me talk about why we did it the way we did (for those who like to geek out), and get some drill-down details," says Bing's Stefan Weitz. "For those of you pressed for time, check out 'Features for You' at the bottom of the site, which highlights some features that I thought were especially cool depending on what and where you are."


Highlighted on the site currently are sections looking at:

- Real-time search
- Bing Local
- Weather/Event results
- Enhanced Results
- Enhanced hover
- Bing for mobile
- Videos
- Bing Travel
- Bing Health
- Visual Search
- Bing Shopping
- Wolfram Alpha
- Search Sharing
- Reference
There are videos and other sections for "explore," "overview," and "insight guide." If you don't regularly keep up with Bing's announcements or search news in general, this should serve as a good place to check out from time to time just to see what the search engine has been up to, and to stay informed about any functionalities that you may have otherwise missed. That will of course require that Microsoft keeps it updated.

On a related note, all of the features that Bing announced last week are supposed to be "100%" live now for all users, but that doesn't seem to be accurate, as I am not able to access some of the new stuff yet.



Retailers Funneling More Money To Bing

Wed, 25 Nov 2009 08:08:51 PST

Bing has managed to turn retailers' heads in a big way. Comparing against the same period in 2008, SearchIgnite reported that retailers spent almost 50 percent more with Microsoft's search engine this time around, which puts Google and Yahoo partly to shame.

Or, to be more precise, "Retailers have spent 47% more on search ads on Bing in Q4 this year than during this same time period in 2008," according to SearchIgnite. "Compared with Google and Yahoo!, Bing also saw better YoY click volume growth."


Additionally, "[a]verage order values on Bing are 21% higher than across all engines, which could account for the spend growth."

Impressive, right? It's only when you sort of step back for a moment that Bing's achievements look less stunning. That's because, despite the progress Microsoft has made, exactly 75 percent of advertisers' dollars went to Google during the first part of this quarter, and 16 percent headed to Yahoo. Bing grabbed just 8 percent.

Still, some headway is better than none, and retailers are demonstrating a lot of confidence in Bing by giving it a try during the all-important holiday season.




Is it Really Crazy to Block Google?

Wed, 25 Nov 2009 08:08:07 PST

After all is said and done, Rupert Murdoch may still be seen as the sly old fox who really knew best. Many bloggers and journalists have pounded the insanity of Murdoch's suggestion that News Corp publications might strike an exclusive indexing deal with Bing and delist themselves from Google's search engine. However, what if Murdoch was really only talking about the Wall Street Journal and not all News Corp publications? Then the idea might actually make a lot of sense.

According to Compete.com, WSJ.com already receives the largest percentage of its traffic from Microsoft (18.74%). This is contrary to many sites, which typically receive the majority of their referrals from Google, often many times more than what Microsoft delivers. Yahoo provides another 6.3%, and since Bing will likely be owning Yahoo's search business, that means Microsoft is actually delivering 25% of the Wall Street Journal's current traffic. If Rupert Murdoch can get Microsoft to pay possibly as much as $50 million or more a year to lose just 11.5% of his Google traffic sent to WSJ.com, the deal makes a lot of sense.

According to Hitwise, Google and Google News combined deliver approximately 26% of WSJ.com visitors. However, even with this larger percentage (vs. Compete's), Hitwise notes in a blog post why this might not be as much of a traffic loss as it appears:

Analyzing Google search terms driving traffic to the Journal, the top 100 terms accounted for over 21.6% of all Google search traffic to WSJ.com. Of that 21.6%, 13.4% were navigational or brand searches (e.g. "Wall Street Journal," "WSJ," "WSJ.com" etc.). Even if Murdoch decides to block Google, these navigational search queries will most likely remain intact. Of the remaining 8.2%, the majority of searches were for stock quotes and general business-related searches. Most specific news-related searches fill out the long tail of search queries.
While the Journal may lose traffic if it ceases to cooperate with Google, the loss may be less than anticipated. From Bing's perspective, Wall Street Journal exclusivity not only differentiates Bing from Google, but it could also help change its image as a more consumer-focused search engine. The Wall Street Journal is the most-read business publication in the world, and this deal could go a long way toward modifying Bing's consumer image in the minds of business executives. After all, a click resulting from a B2B-oriented search term usually demands a premium price, which could help offset Bing's cost of paying Murdoch for exclusive inclusion. [...]
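Hitwise's figures lend themselves to a quick back-of-the-envelope check. The sketch below just restates the article's numbers; the "at risk" label and the site-wide extrapolation at the end are my own rough shorthand, assuming navigational/brand searchers would reach WSJ.com anyway.

```python
# Figures from the Hitwise analysis quoted above (all percentages).
google_share_of_site = 26.0  # Google + Google News share of WSJ.com visits
top100_share = 21.6          # top 100 terms' share of Google search traffic
navigational_share = 13.4    # brand searches ("WSJ" etc.) within that 21.6

# Navigational queries would likely still reach WSJ.com after a block, so
# only the non-navigational remainder of the top 100 terms is at risk.
at_risk_top100 = round(top100_share - navigational_share, 1)
print(at_risk_top100)  # 8.2, matching the article's "remaining 8.2%"

# Rough extrapolation to total site traffic (ignores the long tail of
# news-related queries, which would also be lost).
at_risk_of_site = round(google_share_of_site * at_risk_top100 / 100, 2)
print(at_risk_of_site)  # 2.13
```

Even on these crude assumptions, the top-100 exposure works out to only a couple of percent of total site traffic, which is why Hitwise argues the loss may be smaller than it first appears.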