
I, Cringely . The Pulpit | PBS



I, Cringely is the blog of Robert X. Cringely. Copyright 2006 PBS Online.






End Game

Tue, 16 Dec 2008 10:01:01 -0500

This is my 603rd and last column for pbs.org. If you want to continue reading my work, please visit http://www.cringely.com, which is also in this week's links. Thanks for your support. Everybody in my line of work writes prediction columns for the coming year, but I wonder how many we will see this time around? The world is unsettled. It's not just this damned financial nightmare we have to deal with but also a sense of between-ness, like something has just ended yet still lingers slightly though it is obvious that something new is about to arrive. But will it be a good something new? That's hard to tell. So for this reason I think the prognosticators will mainly keep their heads down this year. Except, of course, for me. I'm too stupid to shut up. So let's get on with this experiment in humiliation. You know the drill. We begin with a look at last year's predictions to see how I did, then jump into my predictions for 2009. If you care to follow along you'll find last year's predictions column in this week's links. For a real laugh you can find my predictions from many previous years in the archive. I wrote a year ago that we'd see the beginning of a shift away from PC-centrism with other platforms beginning to supersede the venerable PC. This is a slow process, as I said it would be, but generally I think I was correct. Sales growth for PCs slowed in general while growth for smartphones and netbooks increased. I never said PC sales were going in the toilet but it seems clear that the action these days is elsewhere, so I'm going to claim this one. I said the Digital TV conversion would be a nightmare, though the greatest pain would be felt in 2009 when the analog transmitters are actually turned off. I think this is correct. Poll your friends and you'll find most are in denial. While everyone has seen a DTV commercial, there are millions of people who still don't know what's happening. Free converter boxes are sold out, which ought to be good, but expected DTV sales have not met forecasts, so I say there are 10-15 million people who are going to wake up mad as hell in February. So I got this one right and claim it for 2009, too. While it may seem quiet now, February and March are going to be ugly. I wrote that Cisco would acquire Macrovision, which didn't happen. Two right and one wrong. I still think Macrovision has to find a landing place somewhere or the company is doomed. I predicted that venture capitalists would sour on start-ups with revenue models based solely on advertising, citing Facebook as an example. This one is hard to call because the general tightening in the economy has led VCs to push all their companies toward multiple revenue models and much tighter books. Still, I probably got this one wrong, though I'd say it is still coming. I predicted that Google would bid and win the 700 MHz spectrum auction. They bid, true, and made a good effort at shaping the deals that resulted, but Google didn't win, so this was wrong. I am not worthy. I predicted that IBM would have bad earnings, would try to sell Global Services, and failing that might fund the sale itself. Wrong, wrong and wrong. IBM's earnings were saved by the weak dollar, or I would have been right. They couldn't sell Global Services because no company was stupid enough to buy. But they didn't have to finance anything because the credit crunch came and it was clearly not going to happen. I'm the loser here. If you are keeping score it is pretty dismal, down to two right and four wrong. 
I said Microsoft would indefinitely extend the life of Windows XP. I might well claim this one but -- like Wall Street -- I may as well take all my losses while I can. Yes, you can still get XP, but if you are an individual it requires downgrading from Vista so you have to buy Vista anyway. In the long run this strategy really hurts Microsoft because the made-for-Vista computers that are being downgraded to XP don't work as well and Microsoft's reputation suffers even further, if that's possible. Redmond sees this as a clever success on their part, too, which say[...]



Insanely Great

Sun, 07 Dec 2008 09:05:13 -0500

Looking for improved business models for the personal computer business, Apple CEO Steve Jobs often used to cite automobile makers, though never American car companies. The examples were invariably German. Whether it was the design aesthetic of his Mercedes sedan or Porsche's success at selling high-margin cars as entertainment devices, Jobs could always point to farfegnugen as a way to sell a good car for a great price. So since he thinks about these things anyway, and because the U.S. automobile industry is on the skids and begging for help this week, I find myself wondering what would happen if Steve Jobs were put in charge of any of the Big Three car companies. It wouldn't be boring, that's for sure, and I'm fairly certain Steve could do a better job than the Detroit executives currently in charge. When Steve Jobs returned to Apple in 1997, the computer company was in worse shape than some of these car companies. Apple's share price was in the toilet, it had poorly conceived products it couldn't sell, the company was losing money, market share was dismal, and CEOs from John Sculley on had tried without success to find ANY company that would buy Apple. Steve himself had such low expectations for Apple under Gil Amelio that he sold all his new Apple shares shortly after Apple bought his NeXT Computer. What a difference a decade makes. Today Apple and Jobs are at the top of their game, taking market share from other computer companies while at the same time establishing game-changing new product concepts like the iPod and iPhone. Apple is America's largest music seller (who could have seen that one coming back in '97? Nobody), has no debt, and has $22+ billion in the bank. Even at its currently depressed stock price, Apple is worth more than any of the car companies and for good reason: Apple has a future. What did Jobs do to make Apple such a business success and how would he translate these techniques to a car company? It's not really that hard to imagine. Back in 1997 Apple had a huge list of products it made or sold, many of them not for a profit. Here is a partial list of Apple products from 1997, courtesy of my friend Orrin, who brought this idea to my attention: PowerBook, Quadra, Performa, Power Macintosh, workgroup and network servers, LaserWriter laser printers, StyleWriter inkjet printers, Newton PDAs, displays, external disk drives, modems, scanners, and lots of software. And don't forget the Mac clones. Jobs killed the clones, dropped the Newton, and streamlined the Mac product line into what today are four ranges of computers -- personal and professional, desktop and portable. Yes, there are the Mac Mini and the xServe, I know, but nearly all Apple computer sales lie with the MacBooks, MacBook Pros, iMacs and Mac Pros. Apple quit the printer business entirely and, over time, got out of the business of manufacturing its own computers at all. The decisions Steve Jobs made in 1997 were that Apple's core competence was in making computers and its future then lay with graphics and desktop publishing professionals who loved the products. While these conclusions may seem obvious, they weren't reflected in the Apple product line at the time. Steve knew the value he had in his product development team, too, which was a clear difference between him and Sculley, Spindler, and Amelio, all of whom had come in varying degrees under the sway of the diabolical product development chief Jean-Louis Gassee. 
One advantage of my having written about this industry since dinosaurs roamed the earth is that there are columns about Apple in my archive dating from 1997 that give a sense of what the company, its products, and its lack of leadership were like at the time. Read them: they are in this week's links. They give a sobering look at how bad things were and show an eerie resemblance to the positions of the automakers today. Look at the American car companies with their many brands that often compete with each other within a single company. It's bad enough competing with Chrysler and GM, but why should [...]



Saving Detroit

Wed, 26 Nov 2008 21:37:06 -0500

My first car was an Oldsmobile, a red 1966 convertible I wish I still owned today. It was big and heavy yet somehow managed to average 18 miles per gallon in an era when gasoline cost 35 cents. Detroit and the U.S. automakers ruled the world when that car was built, yet now the companies say they are on the skids, bleeding money and headed for bankruptcy. What happened? And what can we do -- if anything -- to save an industry that for a century defined our nation as well as our youth? I have some ideas. Whatever the mechanism of their demise, the car companies did it to themselves. They love to blame labor agreements, pension plans, and health plans for their precarious financial situation, yet didn't the companies negotiate and sign those deals in good faith? Surely the down-the-road financial burdens were calculable at the time. Is it that we're living longer than expected, rather than expiring early like Pinto gas tanks? Maybe that's part of it, but to blame the unions for good negotiating is worse than forgiving the companies for bad. And what does it matter? The real issue at hand -- and the only one that really matters -- isn't who to blame or even whether or not to save these specific companies, but how to get me a really sweet ride. That's because the only way the U.S. auto industry is going to survive in any form is by making cars so cool that we'll stand in line to buy them even in a global financial crisis. It's the cars, stupid. My hobby is building small airplanes and one of my favorites is a Davis DA-2A, winner of the Outstanding New Design contest in 1966, the same year my Oldsmobile (and my current Thunderbird convertible) was built. That little Davis can teach us a lot about cars. I didn't build my DA-2A, but I am rebuilding it right now and know it intimately. My Davis is an all-aluminum two-seater with an 85-horsepower engine. The engine was built in 1946, the plane in 1982, and the whole thing cost under $4,000 at the time, though today I have more than that invested in the instrument panel alone. The plane weighs 625 lbs. empty, 1125 lbs. loaded, has a top speed of 140 miles per hour and can travel about 600 miles on its 24-gallon fuel tank. Why can't I buy a car like that? Imagine if we took the basic design parameters of my DA-2A and applied them to a modern automobile. The new design would have to carry two people and luggage, have an empty weight of no more than 625 lbs. and use an 85-horsepower engine. With a loaded weight of 1125 lbs., the car would have a power-to-weight ratio comparable to a Chevy Corvette and be just as quick -- probably even faster than the airplane's 140 mph. Driven only 20 percent over posted speed limits as God intended, the car would easily get 50+ miles per gallon. Who wouldn't want to buy one? At the heart of manufacturing is the simple concept of buying raw materials in volume at a low price per pound and selling manufactured products at retail for a high price per pound. The eventual retail price per pound is determined by the marketplace and ideally it ought to be high enough for the manufacturer to make a profit. The very light weight of our DA-2A car analog suggests that it ought to be inexpensive to buy, but maybe all that means is we have to look beyond the car industry to bicycles. Car buyers and bicycle buyers approach retail pricing from completely different directions. Car buyers, whether they think about it this way or not, traditionally try to buy cars that cost the least on a per-pound basis. 
Do some research on the Internet and you'll see that luxury cars, whether we are talking about a Cadillac SUV or a big Mercedes sedan, tend to cost about $10 per pound; mid-range cars cost about $6 per pound; and economy cars cost about $4 per pound. Manufacturers prefer luxury cars because, given the same profit margins, they make vastly more gross profit on a fancy car than they do on an entry-level car. This pricing bias is part of what is working against Detroit right now. Bi[...]
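
To make the gross-profit point concrete, here is a rough back-of-the-envelope sketch in Python using the per-pound prices quoted above; the curb weights and the flat 10 percent margin are illustrative assumptions of mine, not figures from the column.

    # Gross profit per car at the column's price-per-pound tiers.
    # Curb weights and the flat 10% margin are illustrative assumptions.
    cars = {
        "luxury SUV or big sedan": {"price_per_lb": 10, "weight_lb": 5500},
        "mid-range car":           {"price_per_lb": 6,  "weight_lb": 3500},
        "economy car":             {"price_per_lb": 4,  "weight_lb": 2500},
    }
    margin = 0.10  # assume the same profit margin across the whole line

    for name, c in cars.items():
        sticker = c["price_per_lb"] * c["weight_lb"]
        print(f"{name:24s} sticker ${sticker:>7,}  gross profit ${sticker * margin:>6,.0f}")

On those assumptions the luxury vehicle returns roughly five times the gross profit of the economy car, which is the pricing bias the column says is working against Detroit.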



Not Enough Indians

Tue, 18 Nov 2008 18:18:48 -0500

There is no joy at Yahoo, for mighty Jerry has struck out. This week Yahoo cofounder Jerry Yang announced he was stepping down after 17 turbulent months as CEO of the big Internet portal -- a time in which the company rebuffed a buyout offer from Microsoft, flubbed an ad sales agreement with Google, and ended up being worth a third of its former self when the rest of the market is down only 40 percent. Jerry blew it. And rare in the annals of public companies, JERRY blew it, nobody else. There is no blame to be shared because the Chief Yahoo took his anti-Microsoft stand pretty much single-handed, having bounced Terry Semel from the job in June 2007. Semel, who was more Hollywood than Silicon Valley and never well suited for the job anyway, had backed the Microsoft deal. Freed from his duties at Yahoo, Semel also voted with his brokerage account, selling a large number of company shares while the selling was good. If there is a lesson to be learned here it is not so much that Jerry was wrong, but that Jerry was Jerry and that wasn't the right thing for Yahoo shareholders. There are three seminal ideas that guided Jerry Yang, who is, after all, a diverted graduate student who got on-the-job training in business. To understand these three ideas is to understand Yahoo under Yang: 1) Microsoft is evil. Yang came of age in the Netscape era and saw Microsoft break the law to destroy that company and try to control the Internet. Whatever its motivation, Microsoft did all the bad things they were accused of and more and Yang could never forget or forgive that, even at the cost of his own company. He took it personally. 2) The power of "no." There was a time in the 1990s when venture capitalists Kleiner Perkins and Sequoia Capital were trying to get Excite and Yahoo -- their respective portals -- to merge in a forced marriage designed to benefit only the VCs. It didn't feel right to Jerry, who put his foot down and scotched the deal. It worked that time, so saying "no" became for Yang the default position, especially after Broadcast.com. 3) Don't get screwed. When Yahoo bought Broadcast.com for $4.7 billion and it became clear that Yang & Co. got almost nothing of value for their money, they resolved never to get screwed on another deal again. That was the moment Yahoo embraced bureaucracy. They never made a quick decision again and in many cases hardly made any decisions at all. Mix these three concepts together, add independent wealth and a personal golf course, and you get the Jerry Yang of today. He was inclined to say "no," couldn't embrace Microsoft's evil, and sure as heck wasn't going to be screwed by Redmond, which he knew could never be trusted. As long as Jerry was in command the deal would never happen -- and didn't. Given all this it's a wonder Yang can remain with the company as he says he will. I couldn't do it. He must feel like Ralph Nader. Or maybe that's exactly it; Jerry Yang, like Nader, still doesn't get it. The best thing Yang could have done for Yahoo shareholders was to sell the company to Microsoft. He chose, instead, to do what he thought was best for the Yahoo COMPANY, which is weird given that it no longer feels anything like it did back in those glory days. He threw away $20+ billion just to preserve a memory. Comcast's Cap Hey, I have been thinking about Comcast's new 250-gigabyte monthly download cap and what to do about it. Comcast, of course, is just trying to keep its top 2-3 percent of P2P gonzos from ruining things for the rest of us. 
If a tiny minority of users are taking half the available bandwidth, well they have to be somehow crushed (that's the theory). Comcast first tried slowing down the miscreants, you'll remember, denied the company was doing that, then got busted by the AP of all outfits. So now they'll try this new cap. As a guy who sees a GRAND PLAN nearly everywhere, of course I see one here. Comcast says the new cap will affect less than 5 percent of its users, but that's now. What hap[...]



Now For Something Completely Different

Fri, 14 Nov 2008 15:57:16 -0500

President-Elect Barack Obama has announced that when he's in office he'll appoint a Chief Technology Officer (CTO) for the whole darned USA. Though Google CEO Eric Schmidt already said he isn't interested in the job, I am. I accept, Mr. President. And while the idea of Cringely for CTO may seem lame to most everybody I know (including my Mom), I think I can make a strong case for why I am EXACTLY the right guy for the job. For one thing, unlike Eric Schmidt I don't have a lot of money. Schmidt can't afford to take the job because Google stock is down and he'd lose a fortune. Not so for me. I come encumbered only with debts, which is to say I am a true American. I'd be perfectly willing to put those debts in a blind trust ASAP. The U.S. CTO would have to be a dynamic leader capable of speaking his or her mind and holding his or her own against a tide of critics and special interests. Hey, that's what I do every week (sometimes twice)! Maintaining and defending technology opinions is my only business and some people think I do it too well, which I take as a compliment. Now we need to consider why President-Elect Obama thinks the country needs a CTO in the first place. The President has long had a Science Adviser, so why appoint a CTO? It's the distinction between adviser and officer that I'd say is the whole point; one simply advises while the other implements and leads directly. And I think there is plenty of room for new leadership in this area. America has always been tops in science, tops in research and development, tops in medicine, tops in industrial development, tops in technical infrastructure -- tops, tops, tops. But are we tops today? I don't think so, and I'd say we've been slipping steadily for the last eight years and probably for many years before that. The rest of the world has caught up and some other countries now lead the U.S. in many respects. Yes, we have technical traditions and deep institutions, but those traditions are weaker than they were and the institutions are, too. I think something can be done about that. My belief that something CAN be done is critical, because most of the usual suspects for this job probably think it can't. The reason I am so optimistic is because of the very financial disaster that is the current U.S. economy. Things are so bad right now that I am greatly encouraged. Huh? Sometime in February the new Obama Administration is likely to propose the mother of all economic stimulus packages. It won't be a $650 check that comes in the mail. It won't be a $700 billion equity injection in various financial institutions. It WILL be a public spending plan modeled after the New Deal of the 1930s, injecting $600+ billion primarily into infrastructure construction and reconstruction. The difference between this New New Deal and the first one is that while plenty of roads and bridges will be rebuilt, a lot of the money this time will probably go into information infrastructure. Well that's my bag. The U.S. CTO - at least this FIRST U.S. CTO - will be the buyer-of-cool-stuff-in-chief for the entire nation. 
I would make a better buyer-in-chief than almost anyone else because of two important characteristics in my warped personality: 1) I would be immune to special interest groups so this wouldn't turn into another National Information Infrastructure boondoggle, and 2) yet as a true enthusiast I would buy with such reckless abandon that I'd easily fulfill the economic stimulus needs while spewing money widely enough to guarantee at least a few good technical investments for the nation. This latter point probably requires some explanation. As we can see from the current $700 billion bank bailout, the ranks of those actually benefitting are pretty small. We're $325 billion into the thing and consumers -- the people paying for it -- have yet to benefit at all as far as I can tell. Most banks haven't even benefitted. And those that hav[...]



Love-Hate

Fri, 07 Nov 2008 14:48:12 -0500

Steve Jobs is not like you and me. He has millions of customers, 32,000 employees, and a board of directors who think he can do no wrong. Running a company that is immensely profitable, gaining in market share, has no debt and $20 billion in cash, he can afford to make bold moves, the most recent of which is his decision to replace Tony Fadell, until moments ago head of the division that produces Apple’s iPod. Like everything Jobsian, Fadell’s departure is part of an Apple GRAND PLAN. The variables at work here are (in no particular order) ego, competitive advantage, ego, management technique, ego, strategic thinking, and ego. To say that Steve Jobs’ ego can expand to fill any known space might be an understatement but I’ll stand by it anyway. Fadell’s failing in this regard is his being hailed as the “father of the iPod.” What does that make Jobs? Who made THE BIG DECISION? Who committed the company? Who – most importantly of all – seduced all the record companies? That last guy would be James Higa, but since I don’t want to get HIM fired, too, let’s just attribute it all to Steve Jobs – for all intents and purposes the REAL father of the iPod. All hail Steve. Apple exists solely as an extension of Steve Jobs. Remember that. Anything attributable to Apple is really attributable to Jobs. Other people work at Apple, of course, and excel at their positions, but that is primarily because they were chosen, anointed, or inspired by Jobs. Not that Jobs doesn’t make the occasional mistake. Look at the Mac Cube, for example. But that was our mistake as consumers, not realizing that it really ought to have been worth an extra $500 to us to have a computer with no cooling fan. Steve Jobs makes very few such mistakes, in fact. That, and his total domination of Apple at every level, allow the company to be literally the only PC vendor to have anything like a strategic plan. Dell and HP have the odd strategic initiative, like getting into or out of media players or TVs, but the idea of a comprehensive corporate strategy, well that’s too much to expect from companies that are managed, not led. Steve Jobs is a leader 100 percent in the mold of General George S. Patton. Rent the movie and it will start to make sense. Heck, rent it on iTunes. So here’s what’s going on with Tony Fadell. First, he was vulnerable as a charismatic leader in his own right who has been talked about in the press as a possible heir to Jobs. That alone meant he had to die, but it wasn’t enough to mean that he had to die just now. That decision required an external variable in the form of former IBM executive Mark Papermaster. Steve Jobs wants to give Tony Fadell’s job to Papermaster. It’s not that Papermaster would be any better at the job than Fadell, but there are two overriding factors here: 1) Jobs can only have so many direct reports, and 2) he thinks putting Papermaster in Fadell’s job is the best way to get past any legal objections from Papermaster’s former employer, IBM. Papermaster most recently ran IBM’s blade server division and in the mind of Steve Jobs blade servers and iPods couldn’t be farther apart. One is an enterprise sale while the other is consumer. One is a clear IT sale and the other has nothing to do with IT, really, since iPods and iPhones aren’t computers or computer peripherals. Jobs thinks Apple can make this point stick with a judge and he might well be correct. Papermaster has to be gone from IBM for a year before he can take a job that clearly competes with his last position at IBM. 
But Jobs doesn’t want Papermaster for blade servers, nor does he even want him for iPods. Jobs wants Papermaster for the expertise he showed two jobs ago at IBM running Big Blue’s PowerPC operation. Jobs wants Papermaster to lead Apple’s PA Semi acquisition and create a new family of scalable processors optimized for Snow Leopa[...]



Azure Blues

Thu, 30 Oct 2008 14:21:28 -0500

It isn't very often I get to apply Moore's Law to a non-Information Technology business and rarer still that I can then relate the whole thing back to Microsoft, so I'm going for it. Here's what the solar power industry can teach us about Microsoft: The wonderful thing about Moore's Law is what the lady at the bank called the "miracle of compound interest." That halving of manufacturing cost every 18 months (the OTHER way of looking at Moore's Law that we generally don't use) has little apparent impact in the first few years, but eventually the halving and re-halving takes a real bite out of the cost side until substantial performance is very, very cheap. That explains why there is more computing power -- a LOT more -- in your iPod than was required for the Apollo Moon missions. Well this applies to ALL silicon-substrate photolithography applications, not just computer chips. It applies equally well, for example, to silicon solar cells. There are many types of solar cells. Some solar cells involve crystalline silicon just like computer chips and others use amorphous silicon, but all types benefit from Moore's Law. In fact one especially good aspect of solar cells is that they can make use of older process technologies that are obsolete for computer work. So every time Intel or AMD builds a new fab there is a market in the solar industry for their old machines. Look at those round solar cells used in many arrays today and you'll notice the smaller wafer sizes favored in Silicon Valley 15-20 years ago. That's no coincidence. The result of this relentless application of Moore's Law to the solar industry is that we can see a time in the near future when the cost of producing a watt of electricity from a solar cell on your roof will be approximately the same as the cost of delivering that same watt over a power line from an electric utility. And of course that means that 18 months after that point the solar watt will cost HALF of what the same power would cost from the electric company, which will completely change the game. The time when that electricity cost parity will be reached, I'm told, is seven years from now. Just think of the impact that will have on electric utilities! Why would any of us continue to buy our power from them? We might use them as a giant storage battery and possibly for backup on cloudy days, but why would we use them at all for power if we can generate it cheaper at home? You can bet that's a question the electric power generating industry is asking itself. The whammy for the power companies is two-fold, because not only will power be cheaper but, by definition, the cost of building and installing solar panels will be substantially cheaper, too, than it is today. If it costs $40,000 on average to refit your house today, a lot of homeowners can't afford that, but what if it becomes $10,000? That's what worries electric companies that are used to having easier access to capital than do their customers. But once installing solar power costs relative chump change (the cost of a nice Ski-Doo or remodeling a bathroom), we'll see massive conversion and the power companies know that. So what can they do? They can find ways to get us to use more power than can possibly be generated from the roof of a typical American home. And that's why this week the Electric Power Research Institute proposed that we all get plug-in hybrid cars. 
It would save billions of barrels of oil, they say, lower greenhouse gas emissions, clean the air, oh and by the way require more electricity than your solar cells can produce, thanks. And it will work -- for a while. But Moore's Law is relentless, you know, and the role of electric utilities will change dramatically over the next decade as a result. As far as I can see, this is all for the better. But what does it have to do with Microsoft? Well that brings us to Windows Azure, which was still called Windows [...]
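
As a rough sketch of the compound-halving arithmetic behind that argument: the 18-month halving period and the $40,000 installed cost come from the column itself, while the assumption that installed cost simply tracks cell manufacturing cost is mine, purely for illustration.

    # Moore's-Law-style halving applied to a rooftop solar installation.
    # $40,000 starting cost and the 18-month halving period are from the column;
    # assuming installed cost tracks cell cost is a simplification.
    start_cost = 40_000.0       # dollars today
    halving_months = 18

    for years in range(0, 8):
        cost = start_cost * 0.5 ** (years * 12 / halving_months)
        print(f"year {years}: ~${cost:,.0f}")

On that curve the column's $10,000 figure is two halvings, or about three years, away, and by the seven-year parity point the same installation would nominally cost under $2,000 -- which is why the column treats the shift as inevitable rather than marginal.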



Collateral Damage

Thu, 23 Oct 2008 13:23:10 -0500

I am not a very sophisticated mobile phone user. I don't use most of the bells and whistles on my phone, probably because I don't know what they even are. But just because I'm an idiot about USING mobile phones doesn't mean I don't understand the emerging mobile market, to which I have been paying a lot of attention of late. And why not? As personal computers fade from what Al Mandel called "ubiquity to invisibility," something has to take over. And everyone I respect thinks the new dominant platform will be mobile. So it's my job to tell you, then, that Windows Mobile is probably doomed. Interestingly, this conclusion isn't based on any personal preference or subjective analysis. I'm not saying that Windows Mobile is bad, just that it is probably doomed. It's a simple matter of market economics. There is generally room in any technology marketplace for three competing standards. Notice I say "standards," not "brands." There can be many brands of road vehicles, but they generally come down to cars, trucks, and motorcycles -- each a standard. In personal computers we have Windows, Macintosh, and Linux (or similar Unix workstation variant). In HVAC systems, just to stretch the point, there are radiant, forced air, or evaporative systems -- again three standards. And among those three standards there tends to be a market-share distribution that is more or less 85-10-5. These numbers can jump around a bit and one can argue that the Mac is now more than 10 percent of recent sales, though not of the installed PC base, so I hope you get my point. This magic 85-10-5 distribution also happens to mirror what happens at the racetrack or in the casino, where 85 percent of gamblers lose, 10 percent break-even, and 5 percent are winners, which explains all those big buildings in Las Vegas. The mobile phone marketplace shows a similar distribution, though that now appears to be in some transition. One could argue that the old 85-10-5 came down to basic or dumb phones (85), smartphones (10), and specialized or vertical phones like the old Nextel (5). Moore's Law now seems to be inexorably turning all phones into smartphones, so we're probably moving toward an 85-10-5 based on programming platform. Let's consider this smartphone migration for a moment, first with Samsung in mind. Last week Samsung announced that it would no longer be making high-end phones and would stick to basic phones in the future -- going for higher volumes at lower cost. This makes a lot of sense given that sophisticated phones must cost more to develop yet tend to be more expensive as a result and therefore have lower sales numbers. So why bother? This was the message Samsung sent out and everyone bought, but I really think that if you look at it in the context of a dynamic market the announcement means something else altogether. Deconstructing the Samsung announcement we'd have to wonder how the company sees itself and its competitors. That answer is pretty simple: Samsung sees itself as a global electronics company competing with outfits like Sony. Samsung has been for 40 years all about copying and eventually crushing Sony. Now given that's how Samsung sees itself (nobody I know contests this vision, by the way), how can the company possibly afford to let Nokia, Motorola, Sony, and Apple make high-end phones, which is to say smartphones, without Samsung competing in that space? That would be giving up a lifelong dream and Samsung just won't do it. So were they lying? 
No, Samsung wasn't lying, they were just doing what my old friend, PR man Martin Quigley, called "dissembling." Samsung probably has no intention of abandoning the smartphone market because ALL phones are becoming smartphones. What they truly intend to do, however, is make smartphones that are generally inexpensive, hoping to gain market share as a result. We'll see this tren[...]



Ctrl-Alt-Del

Mon, 20 Oct 2008 15:21:49 -0500

Apple last week introduced a pair of very nice notebook computers that, not at all surprisingly, looked like riffs on the MacBook Air. The company in a separate announcement released 600 high-definition television episodes through the iTunes Store. This week Apple will reportedly release new 20-inch and 24-inch iMacs, also for the Christmas season. Two weeks, three announcements, but what strikes me (and apparently only me at this point) is what won't be announced -- the big surprises that are missing. What happened? A MacBook and a MacBook Pro are nice, but not overwhelming. I like the dual GPU in the Pro and I hate the lack of a FireWire port in the MacBook, but beyond that there is little to say about these products except that the glass screens (on the iMacs, too) are better for houses like mine filled with LCD screen-destroying pre-school boys. These new products don't appear to break any price or performance barriers and sure as heck don't allow time travel or make me more handsome. We were led to expect more -- a lot more. And I am not talking about rumors. Back on July 21st in his regular conference call with industry analysts, Apple Chief Financial Officer Peter Oppenheimer said that Apple's profit margin would likely shrink from 34.8 percent in the just-concluded quarter to 31.5 percent in the quarter ending in September. "We've got a future product transition that I can't discuss with you today," Oppenheimer said as he spelled out the reasons for the anticipated profit reduction. "One of the reasons that we see gross margin being down sequentially is because of a product transition." What kind of Apple product could be expected to come along, taking a $244 million profit hit for the company? It certainly isn't any of the products we've discussed so far, nor is it the iPhone 3G or the new iPod Touch, which have both been publicly dissected and found to have gross margins in the 56 percent range. It's something else that was probably intended to be announced this week but wasn't. The change of plan could have come for many reasons. Maybe the revolutionary product wasn't ready in time. Maybe introducing an aggressive, low-margin product in the middle of a global financial crisis was considered a bad risk. Maybe some strategic alliance had to be in place and wasn't ready. Whatever it was, the same analysts will ask about it this Tuesday when Apple has another such conference call scheduled. But of course none of this keeps me from speculating about what's missing from Apple's announcements and the reasons it might be missing. I think the delayed product has everything to do with Apple's desire for Blu-ray DVDs to die as a standard. Apple CEO Steve Jobs took a swipe at Blu-ray in last week's announcement -- a swipe that felt out of sync with the rest of the program. Steve has no difficulty at all NOT talking about subjects he wants to avoid, so leaning into Blu-ray was not at all offhand or without strategic importance. Don't expect Blu-ray drives on Apple computers, Steve said, yet he didn't offer a clear alternative. The alternative Jobs would like to offer, of course, is full 1080p HD video distribution on iTunes, but that's not currently possible. It will happen in time, of course, but certain prerequisites have to be in place. Apple hardware has to support it in a practical sense, for one. 
Interestingly, users of the new Apple notebooks began reporting that CPU utilization for H.264 decoding on their new machines dropped from 100 percent on an earlier model with the same processor to sub-20 percent on the new aluminum MacBooks. Though it wasn't announced, Apple seems to have (finally) enabled H.264 decoding on the Nvidia GPUs in these new machines. Equally significant is the fact that ONLY H.264 appears to be accelerated. HD content using the MPEG-2 or VC-1 codecs se[...]
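
For what it's worth, the size of that mystery product can be sanity-checked from the two numbers Oppenheimer gave; the implied revenue below is simply back-calculated from the column's own figures, not anything Apple disclosed.

    # Back-calculating the quarter size implied by a 3.3-point margin drop
    # that equals roughly $244 million. Both inputs come from the column.
    margin_before = 0.348
    margin_after = 0.315
    profit_hit = 244e6                     # dollars

    implied_revenue = profit_hit / (margin_before - margin_after)
    print(f"implied quarterly revenue: ${implied_revenue / 1e9:.1f} billion")
    # -> about $7.4 billion, i.e. the margin squeeze applies to essentially
    #    the whole expected quarter, which is what makes the unannounced
    #    product look like a big, lower-margin launch rather than a niche one.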



Cool Threads

Mon, 13 Oct 2008 19:50:04 -0500

A couple of columns ago we touched on the practical rebirth of parallel computing. In case you missed that column (it's in this week's links), the short version is that Moore's Law is letting us down a bit when it comes to the traditional way of increasing the power of microprocessors, which is by raising clock speeds. We've hiked them to the point where processors are so small and running so hot that they are in danger of literally melting. Forget about higher clock speeds then; instead we'll just pack two or four or 1000 processor cores into the same can, running them in parallel at slower speeds. Instantly we can jump back onto the Moore's Law performance curve, except our software generally doesn't take advantage of this because most programs were written for single cores. So we looked back at the lessons of parallel supercomputers, circa 1985, and how some of today's software applies those lessons, such as avoiding dependencies and race conditions. But we didn't really talk much in that column about the use of threads, which are individual streams of execution spun off within a running program. Each time a program takes on a new task, it can create a thread for that task. If the threads are running on the same processor they are multiplexed using time-slicing and only appear to run in parallel. But if the threads are assigned to different processors or different cores they can run truly in parallel, which can potentially get a lot of work done in a short amount of time. Most traditional PC applications are single-threaded, meaning the only way to make them go faster without a completely new architecture is to run the CPU at a faster clock rate. Single-threaded apps are simpler in that they are immune to the dependencies and race conditions that can plague true parallel code. But they are also totally dependent on tasks being completed in a quite specific order, so in that sense they can be dramatically slower or dramatically more resource-intensive than multi-threaded apps. For an example of where single-threaded applications are just plain slower, consider Eudora, which is still my favorite e-mail client (I'm old, you don't have to tell me). Until not very long ago Eudora still couldn't send mail in background, so everything (and everyone, including the user -- me) had to wait until the mail was sent before completing anything else, like writing a new message or replying to an old one. I KNOW THIS IS NO LONGER THE CASE, SMARTY-PANTS -- THIS IS JUST AN EXAMPLE. The program was single-threaded and, since sending mail is a very slow activity, users were generally aware that they were waiting. Today Eudora sends mail in background, which is the same as saying "in another thread." Multithreading has been great for user interactivity because nothing should ever stop the input of data from typing, mouse movements, etc. There are many ways to use threads and before we consider some let's think about scale -- literally how many threads are we talking about? To run at true clock speed we'd have only one thread per CPU core, but a fast processor can multiplex hundreds or even thousands of threads and multi-core processors can do even more. So the EFFICIENT shift to multi-threaded programming requires a significant change in thinking on the part of developers. Here's an example: A hard problem with programming games is when you want something to happen every so often. 
That's not very efficient to code because it traditionally requires a program loop that spins as fast as the CPU will let it (making the CPU go to 100 percent) and keeps checking the time to see if it is time to do that thing. But threads are different: With threads you can very easily put them to sleep for any period of time, or even put them to sleep indefinitely until some event occurs. It's not only easier for th[...]
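
Here is a minimal sketch of that sleep-instead-of-spin pattern, in Python rather than anything a game engine would actually use; the function and variable names are mine, purely for illustration.

    # A worker thread that does something "every so often" by sleeping between
    # ticks, instead of a loop that pegs the CPU while polling the clock.
    import threading
    import time

    def periodic(interval_seconds, task, stop_event):
        # Event.wait() sleeps up to interval_seconds and returns early if
        # stop_event is set, so shutdown is immediate rather than polled.
        while not stop_event.wait(interval_seconds):
            task()

    stop = threading.Event()
    ticker = threading.Thread(
        target=periodic,
        args=(1.0, lambda: print("tick", time.strftime("%H:%M:%S")), stop),
        daemon=True,
    )
    ticker.start()      # the main thread stays free for input, drawing, etc.

    time.sleep(3.5)     # stand-in for the rest of the program doing real work
    stop.set()          # wake the worker immediately and let it exit
    ticker.join()

The worker costs essentially nothing while it sleeps, which is the efficiency win described above, and because it runs in its own thread the rest of the program never has to wait on it.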



Off With Their Heads!

Wed, 08 Oct 2008 11:56:08 -0500

My promised column on threads will appear in this space on Friday. It would have appeared here today but the crumbling global financial system suddenly seemed a more appropriate topic. We're in trouble and by "we" I mean the whole darned planet. What started as a mortgage problem in the U.S. has blown into global financial paralysis that threatens us all with recession and maybe even with depression. I know I'm feeling depressed, how about you? The crisis seems immune to any and all efforts to fix or end it. NOT passing a $700 billion mortgage bailout can send Wall Street into a tailspin, for example, but then finally passing the bailout didn't seem to improve things, either. The Federal Reserve and Treasury Department are running out of tools and time yet still the system flirts with suicide. So I say it is time to take a completely different view of the problem and to look to a new leader to solve it, in this case Jack Welch. Jack Welch is the retired chairman and CEO of General Electric who took the company during his 23-year tenure from being worth about $14 billion to about $410 billion by really MANAGING the business and concentrating on creative use of capital. I've written columns and columns deriding managers as a profession but none of that applies to Jack Welch and GE, where managers really manage -- they manage the heck out of the place and to generally good effect. Jack Welch built that system, he has time on his hands, I say let's give him a new job. There is very little difference, in fact, between the global financial system and General Electric. Welch saw GE entirely in terms of cash flows and the application of capital to those parts of the business where it would do the most good. Welch also thought in terms of continuous quality improvement, which is virtually unknown on Wall Street OR in Washington, where such things aren't even talked about, much less measured. I've been thinking about this crisis and a lot of it comes down, I believe, to a fear of failure especially on the part of the banks. There is no credit available to anyone, anywhere, no matter what the credit rating or score. This is because the banks are frozen by fear to the point where they won't even lend to each other much less to customers. This fear of failure seems to be pretty much guaranteeing failure. And the regulators are now throwing what will soon be trillions of dollars at trying to break these bankers out of their paralysis. But I think there is a better way: use this very fear of failure as a motivator. Before we get to Jack let's deconstruct this current psychological crisis on the part of the banks. They aren't lending money because they are afraid it won't be repaid. They won't lend even to each other because banks seem to be failing all over yet there hasn't been an instance yet when an overnight loan has resulted in default. So what's the problem? More properly, what is the outcome the banks fear? They fear going out of business either through honest failure or through being forced to merge or having their deposits taken away by the Federal Deposit Insurance Corporation (FDIC). In short it comes down to fear of losing their licenses, because as a highly regulated industry the banks can only do business at all with the permission of government. Right now they are totally fixated on the idea that if they lend money and it isn't repaid the government will pull their licenses. Yet the government has made it clear that the most important thing is to LEND MONEY, breaking this credit paralysis. 
All the banks know this but none of them want to be the first to take the big risk of lending money. You do it! No, you!! Enough of this crap. What if Jack Welch were the U.S. banking czar? We know the result of all such crises these days in the U.S. is the ap[...]



Data Debasement

Fri, 03 Oct 2008 15:20:13 -0500

Last week I was in Boston to moderate a panel at the MIT Technology Review’s Emerging Technologies Conference — one of those tech shindigs so expensive I can only attend as hired help. My panel was on parallel computing and it produced this column and another I’ll file early next week. This week is about databases and next week is about threads. Isn’t this a grand time to be a nerd? Thanks in part to Larry Ellison’s hard work and rapacious libido, databases are to be found everywhere. They lie at the bottom of most web applications and in nearly every bit of business software. If your web site uses dynamic content, you need a database. If you run SAP or any ERP or CRM application, you need a database. We’re all using databases all the time, whether we actually have one installed on our personal computers or not. But that’s about to change. We’re entering the age of cloud computing, remember? And clouds, it turns out, don’t like databases, at least not as they have traditionally been used. This fact came out in my EmTech panel and all the experts onstage with me nodded sagely as my mind reeled. No database? No database. Parallel computing used to mean scientific computing, where hundreds or thousands of processors were thrown at technical problems in order to solve them faster than Moore’s Law might otherwise have allowed. The rest of us were relying on rising clock rates for our performance fix, but scientists — scientists with money — couldn’t wait so they came up with the idea of using multiple CPUs to solve problems that were divided into tasks done in parallel, then glued back together into a final result. Parallel computing wasn’t easy, but sometimes that was the whole point — to do it simply because it was so difficult. Which is probably why parallel computing remained a small industry until quite recently. What changed was that Moore’s Law put an end to the clock rate war because chips were simply getting too hot. While faster and faster chips had for the most part linear performance increases along with cost and power consumption decreases, the core temperature inside each microprocessor chip was going up at a cubic rate. Back in 2004 Intel released a chart showing that any clock speed over 5 GHz was likely to melt silicon and Moore’s Law would, by 2010, make internal processor temperatures similar to those on the surface of the Sun! For those, including me, who think that’s pretty darned hot, I’ll point out that one of my astronomer readers immediately had to mention that the Sun’s chromosphere is actually much hotter than the surface. Forgive him, he means well. Faced with this absolute thermal performance barrier, Intel and AMD and all the other processor companies had to give up incessant clock speed increases and get us to buy new stuff by putting more than one CPU core in each processor chip can. Now chips with two and four processor cores are common and Intel hints darkly that we’ll eventually see hundreds of cores per chip, which brings us right back into the 1970s and ’80s and the world of parallel computing, where all those principles that seemed to have no real application are becoming very applicable, indeed. And that’s exactly where databases start to screw up. Bob Lozano, chief visionary, evangelist, father-of-eight (same woman) at Appistry, came up with the first database example I’d heard and it was eye-opening. 
Appistry (I’ve written about them before — it’s in the links) specializes in distributing what would normally be mainframe applications across tens, hundreds, or even thousands of commodity computers that act as one. If [...]
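
As a toy illustration of that divide-the-work, run-it-in-parallel, glue-it-back-together pattern (the general idea only -- this is not Appistry's technology), here is a sketch using Python's standard process pool:

    # Split a problem into independent chunks, run them on several cores at
    # once, then combine the partial results. A generic sketch, not Appistry's.
    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(chunk):
        # Per-task work: depends on nothing outside its own chunk,
        # so there are no dependencies or race conditions to manage.
        return sum(x * x for x in chunk)

    def parallel_sum_of_squares(numbers, workers=4):
        chunk_size = max(1, len(numbers) // workers)
        chunks = [numbers[i:i + chunk_size] for i in range(0, len(numbers), chunk_size)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            partials = pool.map(partial_sum, chunks)   # chunks run in parallel
        return sum(partials)                           # glue the results together

    if __name__ == "__main__":
        print(parallel_sum_of_squares(list(range(1_000_000))))

The interesting part is what is missing: there is no single shared store that every worker must coordinate through, which is the sort of friction the panel was pointing at when it said clouds don't like databases as traditionally used.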



The Cringely Plan

Fri, 26 Sep 2008 23:15:59 -0500

In the early 1980s I was a volunteer firefighter for a tiny community in the Santa Cruz Mountains of Northern California. We all lived in a beautiful redwood forest and our task was to keep that forest from burning down in a huge conflagration, taking us all with it. The job was made all the harder because our little part of paradise hadn't burned since the 1920s, so there was 60+ years of flammable undergrowth just waiting to light off. The current financial crisis facing the United States and the world really isn't much different from that. An unmanaged forest, one without the sort of fire control we attempted to provide, would naturally burn every few years. The undergrowth would build up, reach a critical mass, some source of ignition would come along -- usually lightning -- and all that undergrowth would burn. The redwoods themselves would be scarred but not really threatened, as we could see from the charring that marked them from countless such fires over centuries. Of course burning undergrowth threatened homes and property, too, so there was a natural desire on the part of that community to want the next burn to not come this year, please not this year. So there came a policy of aggressively fighting fires with the result that we eventually faced 60 (now 90!) years of flammable material growth rather than six or eight years. And the probable fire fueled by 60 years of undergrowth would have been so bad that our job changed to one of trying to prevent fires from happening, well, ever. This was an impossible task, of course. Eventually the stars would align the wrong way and the whole place would burn down, we all knew it. Just let it not happen on our watch. Does this sound familiar? Now America and much of the world face the possibility of recession and we handle that by first arguing about the definition of the term. Are we or are we not in recession? This distinction appears to be very important to some who view it like a forest fire: is it burning or not? Implicit in this distinction, I suppose, is the idea that if we're not burning -- if we are not in recession -- that maybe through some miracle we'll never face that problem. Whether this is a practical attitude or not depends entirely on your event horizon -- how soon you expect things to change. The people in power in this country have a relatively short event horizon. Politicians tend to think of two, four, or eight years as the longest periods of time that matter. Corporate honchos might look out further, you could guess, but they don't since the average duration for a U.S. Fortune 500 CEO is under four years. So while the intent of the fire chief and the mayor and the governor and the Congressman and the President and the CEO is that there be no unpleasant surprises during their watch, all of them know such surprises are coming. This short-term focus in the face of long-term difficulties leads to odd behavior at times. In the forest it is usually better to let smaller fires burn or even to deliberately set them, yet few fire chiefs are willing to take that risk, even though NOT taking the risk is so much worse. In financial markets, as companies crumble and governments prepare bailouts, short sellers pile on in scrums of doom that make the shorts rich yet hurt society. Traders with a trading mentality, they can't help themselves any more than Ralph Nader can keep from running for President. "But George Soros did it to the Bank of England," they say, as if that makes everything okay. 
The American economy is at the end of its longest-ever period without a recession. Through sleight of hand and a fair amount of financial fraud we've managed to keep the "R" word out of our communal vocabulary for 14+ years. [...]



Door Number Three

Mon, 22 Sep 2008 14:29:57 -0500

I’ll begin this third and (I promise) last column on IT management with a confession: I have been fired from every job I have ever held. This is certainly not something I set out to do, nor did I even realize it until one day my young and lovely wife mentioned that I had never told her about voluntarily leaving any position. It’s not that I’ve had so many jobs, either. This one and the one before it have kept me going for more than 20 years. But they always seem to end the same way. This one might, too. You can never tell. Most of the times I have been fired it’s because I’ve been judged to be unmanageable, which is to say I won’t shut up. The ultimate reason given is usually something minor. The last time around, for example, I was fired because I didn’t transfer the cringely.com domain to my employer. They asked me to do it and I said “no.” Had they said, “Transfer the domain or you will be fired,” I might have decided differently. But they never said that — never gave a hint of the consequences — so I assume the real goal was less to get the domain and more to get rid of me. The guy who had me fired, Stewart Alsop (maybe you’ve heard of him), ultimately lost his own job for firing me, at least according to International Data Group Chairman Pat McGovern, who told the story to 300 people once at a DEMO conference. Back when I was a kid and working at WWST Radio in Wooster, Ohio, I was fired for writing those seven unspeakable words in the middle of a livestock auction report. Another kid had been playing similar tricks on me for weeks, but when I finally retaliated he turned me in. I guess I was a threat to him and didn’t know it. One company hired and fired me three times and another company hired and fired me twice. And somehow in all this I’ve never received severance or unemployment compensation. I just found another job or it found me. There’s a point to all these firing stories and they actually do relate to IT. I’m typical of a lot of IT types. You know us. We are useful but sometimes a pain in the ass. We have opinions and speak our minds and don’t suffer fools at all. We stand up to authority from time to time. Sometimes we’re wrong. We get fired a lot and hired a lot, too, because we are generally useful, though dangerous. What do you do with folks like me if you are a manager? At a traditional newspaper you’d either fire me or make me a columnist. And, sure enough, look where I am. In an IT shop you give me a task to do and let me do it, generally on my own, and never EVER put anyone under me, because I am hopeless as a manager. But it turns out I’m not so bad as a leader. Weird, eh? The last two columns have shown that IT is the Cousin It of American industry. We serve the company but often don’t feel part of it. Certainly the value structures and lines of authority that function perfectly well for most of the rest of the company don’t work at all well for IT. We’re vital but at the same time, well, so different that it’s hard to imagine a CEO emerging from the IT ranks. It happens from time to time. Everyone points to John Reed, who rose from IT to CEO of Citicorp, but Reed was an exceptional case. He succeeded because his predecessor, Walter Wriston, had an unusual interest in IT and mentored Reed. Reed succeeded, too, because he didn’t really come from IT but from Data Processing, which was more hierarchical. And ultimately he didn’t succeed at all, by some measures, because John Reed was fired. So right now let’s just accept that it is very unl[...]



Leadership

Wed, 17 Sep 2008 18:29:44 -0500

Last week’s column on bad IT management and the strong response from readers that followed show this to be a huge issue. There are WAY too many IT managers who either can’t or shouldn’t manage technical teams. Last week I maintained that having a firm technology base, or at least the ability and willingness to acquire one, was essential for good managers. While readers got carried away with which technical test is the best, I don’t think there is much dispute that there are certain aspects of technical management that are helped by the manager being a code god. But that’s far from all there is to the job. So this week I want to go deeper and look at what’s really missing in nearly every instance of such bad management, which is leadership. The distinction between management and leadership is a critical one. Management is — at its very best — an exercise in coping while leadership is so much more. Last week’s simple idea that the manager ought to at least be able to tell good work from bad is exemplified by Bill Gates, who liked to claim that he could tell good code from across the room and that whatever task a team was facing was something he could code in Visual Basic over a weekend. Both statements are nonsense, of course, but Bill knew he had to talk the talk, making him at least an adequate manager. Does it make him a leader? I don’t think so. But let’s not blame Bill for that. Let’s blame Charles Simonyi. Charles is the guy who came up with Microsoft’s development process — an outgrowth of his research at Xerox PARC. I covered this extensively in my book, Accidental Empires, but the short version is that Charles came to advocate a strong program manager as the central controller of any development group. One person made all the decisions and as long as that one person was correct 85 percent of the time, it was better to have a dictatorship than a democracy or even a meritocracy. This was an effective way to extend Bill’s will to Microsoft programmers Bill would never even meet. And to Charles’ credit the system worked well enough if the dictator was really, really smart and the task at hand wasn’t too complex. It was perfect for the 1980s. But it is far from perfect today and represents one of the fundamental reasons why Windows Vista was so late to market and such a mess when it finally shipped. Vista had plenty of management, but not very much leadership. When I think of leaders what comes to mind first are political and military leaders. We use the term “leadership” to describe those roles far more than we do for what ought to be similar roles in business or technology. This week former Hewlett-Packard CEO Carly Fiorina said that John McCain, Barack Obama, Sarah Palin, and Joe Biden were all ill-suited to be CEOs of major corporations. However badly the statement went over (Carly supports McCain, by the way), her real point was that there are different skill sets for leaders than managers. And she’s right to an extent, but it also says a lot about her own tenure at H-P, which was long on management and short on leadership. Management is telling people what to do, which is a vital part of any industrial economy. Leadership is figuring out what ought to be done then getting people to do it, which is very different. It is a vital part of any successful post-industrial economy, too, but most managers don’t know that. Let’s use the U.S. military involvement in Iraq as an example of the difference between leadership and management. As more books are written and stories come out we[...]