Sat, 27 Apr 2013 16:46:48 +0100
After 3 amazing years, yesterday was my last day working at Forward.
I've had a great time, worked with some awesome people and made some brilliant friends; now it's time for a new challenge: I'm going freelance.
I've been getting more and more involved with the community, running meetups, hack days and open source projects. I got to the point where I was running on empty, with no free time to myself, and had to find a better way to balance my life.
Once I've got my freelance business stabilised I'm hoping to do even more community and open source work, and to help make the UK a great place for developers to work, socialise and share knowledge.
Get in touch if you'd like to work with me on a project: firstname.lastname@example.org
Sun, 07 Apr 2013 21:25:15 +0100
I've put a kind of living CV up on GitHub Pages that sums up the kind of work I've been doing over the last couple of years.
Wed, 20 Mar 2013 16:32:50 +0000
One of the sponsors that I had arranged to help pay for drones for the upcoming Nodecopter Bath event was BAE Systems, who are active in encouraging interest in science, technology, engineering and robotics.
After revealing the sponsorship, there was an active discussion within the NodeCopter community about having a large defence company as a sponsor.
Therefore, on reflection, BAE Systems and I have jointly agreed to withdraw their sponsorship from this Saturday's event.
I'm still looking for sponsors for the event; if you would like to sponsor a drone please get in touch: email@example.com
Sat, 16 Feb 2013 21:35:07 +0000
There are so many great conferences coming up this year, so I thought I'd highlight some of the best ones.

Waza - 28th February
Heroku's annual developer conference; the line-up this year is brilliant, including Matz, the creator of Ruby.

#inspect - 25th March
The first official RubyMotion conference, in Brussels. If you are getting into RubyMotion, you can't afford to miss this.

Ruby Manor - 6th April
The community conference organised by members of the London Ruby Users Group. I spoke there last year and had a blast.

Write the Docs - 8th April
A new conference for people who write and maintain documentation; this is really interesting to me as I've been focusing a lot on documentation recently.

BACON - 12th April
I attended BACON last year; unlike most technical conferences, the talks were incredibly varied, including talks about beer, Lego and coffee.

Railsberry - 22nd April
Yet another amazing line-up, and good to see this traditionally Rails-focused conf expanding into other topics as well.

Git Merge - 9th May
A conference focused purely on Git. It's pretty niche, but run by GitHub, so you know it's going to be good.

Webshaped - 23rd May
Finland's only frontend web design and development conference. I'm going to be speaking there about supercharging your frontend development with Node.

Fluent - 28th May
The massive three-day O'Reilly conference in San Francisco, with a huge range of speakers and technologies being covered.

NodeConf - 27th June
The official Node conference in California with a twist: it's at the beautiful Walker Creek Ranch, with a series of hands-on workshops mixed with presentations and talks.

HybridConf - 15th August
The most exciting UK conference this year, organised by Zach Inglis. It's a good mix of design and development talks from some world-class speakers.

dConstruct - 6th September
dConstruct is one of those conferences that I just can't miss; I've been three times in a row now and it gets better every year.

Barcelona Ruby Conference - 14th September
I missed going to this last year, and after watching the videos from the conference I've been kicking myself ever since; I definitely won't miss it this year.

ArrrrCamp 7 - 3rd October
The rum-fuelled Belgian Ruby conference, in its 7th year; small but perfectly formed.
I also keep track of all of these in a Lanyrd guide.
Thu, 07 Feb 2013 23:08:25 +0000
A short video of my dancing nodecopter presentation at Bathcamp yesterday.
Sun, 03 Feb 2013 18:28:39 +0000
In the past few weeks there has been a fair amount of activity on the Split repo: it has passed 500 watchers, and I've just released version 0.5.0, a major update to the gem. It adds the ability to swap out the persistence adapter and the sampling algorithms, as well as to configure your A/B tests from a YAML file rather than in code.
I've not been using Split in production myself for a while now, and it's encouraging to see that, whilst other people are still using it, members of the community are stepping up to help drive development forwards rather than leaving it to languish until a new, shinier library comes along.
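For illustration, a YAML experiments file for the new configuration style might look something like this. The experiment names, alternatives and exact schema here are my own guesses for the sake of the example, not taken from Split's docs, so check the README for the real format:

```yaml
# Hypothetical experiments.yml -- names and structure are illustrative.
link_color:
  alternatives:
    - blue
    - red
signup_button:
  alternatives:
    - "Sign up"
    - "Join now"
```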
I put together a little script (https://gist.github.com/4702837) to see how much work the community has contributed to the project, and the results are surprising:
Owner contributions: 3278 lines added, 1360 removed
Community contributions: 4274 lines added, 2141 removed
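The heart of a script like that is just summing `git log --numstat` output per author. Here's a rough Ruby sketch of that counting step (the function name is mine, and it assumes the standard `--numstat` output of `added<TAB>removed<TAB>path` per changed file):

```ruby
# Sketch of the counting step behind contribution stats like the ones above.
# Feed it the output of: git log --author=<name> --numstat --pretty=""
# Each line is assumed to be "added<TAB>removed<TAB>path".
def count_changes(numstat_output)
  added = removed = 0
  numstat_output.each_line do |line|
    cols = line.split("\t")
    next unless cols.size == 3
    # Binary files report "-" instead of a number; skip them.
    next if cols[0] == "-" || cols[1] == "-"
    added   += cols[0].to_i
    removed += cols[1].to_i
  end
  [added, removed]
end
```

Running it per contributor and summing everyone except the owner gives the community totals.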
On Split, developers other than myself have added and removed more lines of code than I have, which is quite exciting. I've become more of a manager of the project, ensuring that any pull requests are in line with its goals.
This feels like one of the aims of an open source project: first and foremost it should solve the problem it was designed for, but after that it should strive to be supported by the community at large, making it fault tolerant.
If I were to step down now, I feel that the other developers working on the project could pick up where I left off, rather than it becoming just another abandoned project.
Sat, 02 Feb 2013 11:55:31 +0000
The quadcopter got caught in a strong wind about 20 metres up, hit a tree and fell, landing on one of the rotors.
Thu, 31 Jan 2013 23:34:29 +0000
The first test flight trying to get the Nodecopter to dance (go up and down) in time with a song; the code is on GitHub: https://github.com/andrew/ar-drone-dancer
Mon, 31 Dec 2012 20:41:53 +0000
Every year I like to set myself a number of goals or challenges to improve myself; last year's included reading a new book every week and improving my level of personal fitness with a better diet and more exercise.
This year, as well as continuing with the good habits I've gained in 2012, I'm going to work on some new things:
Level up my terminal skills with vim and tmux
Get back into writing on a regular basis
Ship a RubyMotion app on the App Store
Get over my irrational fear of using the telephone
Dabble in hardware hacking and electronics, including Nodecopter and Raspberry Pi.
Here's to a great year!
Tue, 27 Nov 2012 21:45:27 +0000
A great collection of computer science books made available for free under a Creative Commons license. There is also a great collection of freely available books in O'Reilly's Open Books Project.
Sun, 28 Oct 2012 11:25:15 +0000
It's basically two parts:
The use case for this can be seen on LNUG.org, I wanted to embed some links to Node books but not have to manage the links myself.
Some things I'd have liked to add if I had time:
Hopefully once the API is available to all there will be a revenue-share program so anyone can profit from embedding the widget in their site.
Fri, 08 Jun 2012 10:08:14 +0100
I'm thinking of running an introductory course in Ruby on Rails in London; if you are interested in attending please register your interest.
It will be a one-day introductory course for developers who want to get started using Ruby on Rails to build websites and web apps.
Tue, 05 Jun 2012 10:54:23 +0100
Bicycle Merry-Go-Round at Maker Faire in San Francisco
Fri, 01 Jun 2012 03:34:37 +0100
Split, the Rack-based A/B testing framework, has just been updated to 0.4.2, and now supports v3.0 of the redis gem.
Sun, 13 May 2012 08:36:22 +0100
A lightning talk timer app that I made for iOS using RubyMotion.
Sun, 06 May 2012 13:01:27 +0100
Sun, 06 May 2012 12:57:05 +0100
I've been playing around with Pinterest recently, follow me!
Sun, 06 May 2012 12:55:45 +0100
The slides for a lightning talk I presented at Bacon Conf a few weekends ago.
Sun, 06 May 2012 12:55:09 +0100
A talk about a site I recently launched using Ruby on Rails and Node which needed to scale to 3000 concurrent users.
Sun, 06 May 2012 12:54:21 +0100
A presentation I made last year about the benefits of using Rails over Sinatra.
Fri, 06 Apr 2012 17:19:01 +0100
This month at the London Node User Group we're doing lightning talks; please get in touch if you'd like to speak.
Fri, 06 Apr 2012 17:16:05 +0100
Split, the Rack-based A/B testing framework, has just been updated to 0.4.1, with a few important bug fixes and the ability to disable it via a configuration option.
Sun, 01 Apr 2012 18:17:12 +0100
Forward is now hosting a weekly coding school for kids called CoderDojo.
CoderDojo is a movement orientated around running free not-for-profit coding clubs and regular sessions for young people.
If you know any kids who would like to get involved then send them and their parents the link.
Sun, 25 Mar 2012 16:50:13 +0100
Answer some questions and we’ll calculate your Broficiency Quotient (BQ).
My BQ is -95, making me a standard nerd!
Sun, 25 Mar 2012 16:48:48 +0100
Since then it's been deployed to the live site and is working great!
Sat, 10 Mar 2012 08:56:15 +0000
I've released a new version of Split with quite a few improvements, including experiment start times, visitors only participating in one experiment by default, cleaning up of old sessions, and some bug fixes.
Sat, 10 Mar 2012 08:49:02 +0000
This is pretty awesome: Split has been featured on Railscasts, with a full ten-minute tutorial on how to do A/B testing with Split and some of its more advanced features.
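Under the hood, the essential trick in any A/B testing library is assigning each visitor to an alternative and keeping that assignment stable across requests. This is not Split's actual implementation (Split stores assignments in Redis), just a minimal sketch of the idea using a hash of the visitor id:

```ruby
require "digest"

# Minimal sketch of stable A/B bucketing: hash the visitor id together
# with the experiment name, so a given visitor always lands in the same
# bucket without storing any state. Not Split's real implementation.
def alternative_for(visitor_id, experiment, alternatives)
  digest = Digest::MD5.hexdigest("#{experiment}:#{visitor_id}")
  alternatives[digest.to_i(16) % alternatives.size]
end
```

Because the digest is deterministic, calling it twice with the same visitor and experiment always returns the same alternative, which is the property that makes conversion tracking meaningful.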
Sat, 03 Mar 2012 10:04:51 +0000
Tickets are now available for the Go meetup that I have been organising in the Forward offices on 20th March.
Sat, 03 Mar 2012 10:04:03 +0000
I've recently been taking a stab at the Project Euler challenges; these are the solutions I've done so far, in Ruby.
Sat, 25 Feb 2012 08:29:01 +0000
I've been dabbling with Go recently, and this guide to deploying Go web apps is really handy.
Sat, 25 Feb 2012 08:11:46 +0000
A small but helpful library I released yesterday for encoding and decoding base62 strings, ideal for URL shorteners.
It's also my first npm release, you can install it with the following command:
npm install base62
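The library itself is JavaScript, but the underlying idea is simple enough to sketch in a few lines of Ruby (illustrative only, not the npm package's API; the alphabet ordering here is an arbitrary choice):

```ruby
# Illustrative base62 encode/decode -- the same idea the npm package
# implements in JavaScript. Digits 0-9, then a-z, then A-Z.
ALPHABET = ("0".."9").to_a + ("a".."z").to_a + ("A".."Z").to_a

def base62_encode(n)
  return ALPHABET[0] if n.zero?
  out = ""
  while n > 0
    out.prepend(ALPHABET[n % 62])
    n /= 62
  end
  out
end

def base62_decode(str)
  str.chars.reduce(0) { |acc, c| acc * 62 + ALPHABET.index(c) }
end
```

Base62 is handy for shorteners because it packs a large integer id into a short, URL-safe token with no padding characters.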
Fri, 17 Feb 2012 18:55:00 +0000
Last week I spoke at London Webstandards about the LNUG website; these are the slides from my talk.
Fri, 17 Feb 2012 18:54:07 +0000
A fun little HTML5 3D game.
Fri, 17 Feb 2012 18:52:46 +0000
A surprisingly useful little page that gives you the exact instructions for switching your number between each UK mobile provider.
I just moved from O2 to 3 mobile and couldn't be happier.
Fri, 17 Feb 2012 18:51:27 +0000
This awesome visualisation of Google Street View will blow your mind!
Fri, 17 Feb 2012 18:50:42 +0000
A really interesting two-day London conference in April, put on by the guys at Mint Digital, with a great mix of speakers.
Fri, 17 Feb 2012 18:48:57 +0000
I'm now syndicating http://cherry-pick.tumblr.com to Twitter so you can easily follow along with the best repos each day.
Fri, 17 Feb 2012 18:47:51 +0000
We've just finished our live streaming setup for meetups that are hosted in our office; you can now watch the meetups live from the comfort of your own home (without the free beer and pizza that we usually provide).
Sun, 12 Feb 2012 10:42:28 +0000
Split, the Rack-based A/B testing framework, has just been updated to 0.3.2; it now handles Redis errors and downtime gracefully.
Fri, 10 Feb 2012 22:41:41 +0000
An event that I've been organising to introduce the Go programming language to more developers across London.
It's on 20th March and it's completely free!
Fri, 10 Feb 2012 22:37:32 +0000
The slides from a presentation I did recently about hosting meetups and events.
Fri, 10 Feb 2012 22:33:46 +0000
I recently sent out the second Forward Technology newsletter, getting my feet wet with email marketing.
If you've not signed up already, here's the link: http://forwardtechnology.co.uk/newsletter
Tue, 16 Dec 2008 15:01:01 +0000
This is my 603rd and last column for pbs.org. If you want to continue reading my work, please visit http://www.cringely.com, which is also in this week's links. Thanks for your support.

Everybody in my line of work writes prediction columns for the coming year, but I wonder how many we will see this time around? The world is unsettled. It's not just this damned financial nightmare we have to deal with but also a sense of between-ness, like something has just ended yet still lingers slightly though it is obvious that something new is about to arrive. But will it be a good something new? That's hard to tell. So for this reason I think the prognosticators will mainly keep their heads down this year. Except, of course, for me. I'm too stupid to shut up. So let's get on with this experiment in humiliation.

You know the drill. We begin with a look at last year's predictions to see how I did then jump into my predictions for 2009. If you care to follow along you'll find last year's predictions column in this week's links. For a real laugh you can find my predictions from many previous years in the archive.

I wrote a year ago that we'd see the beginning of a shift away from PC-centrism with other platforms beginning to supersede the venerable PC. This is a slow process as I said it would be but generally I think I was correct. Sales growth for PCs slowed in general while growth for smartphones and netbooks increased. I never said PC sales were going in the toilet but it seems clear that the action these days is elsewhere, so I'm going to claim this one.

I said the Digital TV conversion would be a nightmare, though the greatest pain would be felt in 2009 when the analog transmitters are actually turned off. I think this is correct. Poll your friends and you'll find most are in denial. While everyone has seen a DTV commercial, there are millions of people who still don't know what's happening. Free converter boxes are sold out, which ought to be good, but expected DTV sales have not met forecasts, so I say there are 10-15 million people who are going to wake up mad as hell in February. So I got this one right and claim it for 2009, too. While it may seem quiet now, February and March are going to be ugly.

I wrote that Cisco would acquire Macrovision, which didn't happen. Two right and one wrong. I still think Macrovision has to find a landing place somewhere or the company is doomed.

I predicted that venture capitalists would sour on start-ups with revenue models based solely on advertising, citing Facebook as an example. This one is hard to call because the general tightening in the economy has led VCs to push all their companies toward multiple revenue models and much tighter books.[...]
Sun, 07 Dec 2008 14:05:13 +0000
Looking for improved business models for the personal computer business, Apple CEO Steve Jobs often used to cite automobile makers, though never American car companies. The examples were invariably German. Whether it was the design aesthetic of his Mercedes sedan or Porsche's success at selling high-margin cars as entertainment devices, Jobs could always point to farfegnugen as a way to sell a good car for a great price. So since he thinks about these things anyway, and because the U.S. automobile industry is on the skids and begging for help this week, I find myself wondering what would happen if Steve Jobs were put in charge of any of the Big Three car companies? It wouldn't be boring, that's for sure, and I'm fairly certain Steve could do a better job than the Detroit executives currently in charge.

When Steve Jobs returned to Apple in 1997, the computer company was in worse shape than some of these car companies. Apple's share price was in the toilet, it had poorly conceived products it couldn't sell, the company was losing money, market share was dismal, and CEOs from John Sculley on had tried without success to find ANY company that would buy Apple. Steve himself had such low expectations for Apple under Gil Amelio that he sold all his new Apple shares shortly after Apple bought his NeXT Computer.

What a difference a decade makes. Today Apple and Jobs are at the top of their game, taking market share from other computer companies while at the same time establishing game-changing new product concepts like the iPod and iPhone. Apple is America's largest music seller (who could have seen that one coming back in '97? Nobody), has no debt, and $22+ billion in the bank. Even at its currently depressed stock price, Apple is worth more than any of the car companies and for good reason: Apple has a future.

What did Jobs do to make Apple such a business success and how would he translate these techniques to a car company? It's not really that hard to imagine. Back in 1997 Apple had a huge list of products it made or sold, many of them not for a profit. Here is a partial list of Apple products from 1997 courtesy of my friend Orrin, who brought this idea to my attention:

PowerBook
Quadra
Performa
Power Macintosh
Workgroup and network servers
LaserWriter laser printers
StyleWriter inkjet printers
Newton PDAs
Displays
External disk drives
Modems
Scanners
Lots of software

And don't forget the Mac clones. Jobs killed the clones, dropped the Newton, and streamlined the Mac product line into what today are four ranges of computers -- personal and professional, desktop and portable. Yes, there are the Mac Mini and the xServe, I know, but nearly all Ap[...]
Thu, 27 Nov 2008 02:37:06 +0000
My first car was an Oldsmobile, a red 1966 convertible I wish I still owned today. It was big and heavy yet somehow managed to average 18 miles per gallon in an era when gasoline cost 35 cents. Detroit and the U.S. automakers ruled the world when that car was built, yet now the companies say they are on the skids, bleeding money and headed for bankruptcy. What happened? And what can we do -- if anything -- to save an industry that for a century defined our nation as well as our youth? I have some ideas.

Whatever the mechanism of their demise, the car companies did it to themselves. They love to blame labor agreements, pension plans, and health plans for their precarious financial situation, yet didn't the companies negotiate and sign those deals in good faith? Surely the down-the-road financial burdens were calculable at the time. Is it that we're living longer than expected, rather than expiring early like Pinto gas tanks? Maybe that's part of it, but to blame the unions for good negotiating is worse than forgiving the companies for bad. And what does it matter? The real issue at hand -- and the only one that really matters -- isn't who to blame or even whether or not to save these specific companies, but how to get me a really sweet ride. That's because the only way the U.S. auto industry is going to survive in any form is by making cars so cool that we'll stand in line to buy them even in a global financial crisis. It's the cars, stupid.

My hobby is building small airplanes and one of my favorites is a Davis DA-2A, winner of the Outstanding New Design contest in 1966, the same year my Oldsmobile (and my current Thunderbird convertible) was built. That little Davis can teach us a lot about cars. I didn't build my DA-2A, but I am rebuilding it right now and know it intimately. My Davis is an all-aluminum two-seater with an 85-horsepower engine. The engine was built in 1946, the plane in 1982, and the whole thing cost under $4,000 at the time, though today I have more than that invested in the instrument panel alone. The plane weighs 625 lbs. empty, 1125 lbs. loaded, has a top speed of 140 miles per hour and can travel about 600 miles on its 24-gallon fuel tank.

Why can't I buy a car like that? Imagine if we took the basic design parameters of my DA-2A and applied them to a modern automobile. The new design would have to carry two people and luggage, have an empty weight of no more than 625 lbs. and use an 85-horsepower engine. With a loaded weight of 1125 lbs., the car would have a power-to-weight ratio comparable to a Chevy Corvette and be just as quick -- probably even faster than the airplane's 140 mph.[...]
Tue, 18 Nov 2008 23:18:48 +0000
There is no joy at Yahoo, for mighty Jerry has struck out. This week Yahoo cofounder Jerry Yang announced he was stepping down after 17 turbulent months as CEO of the big Internet portal -- a time in which the company rebuffed a buyout offer from Microsoft, flubbed an ad sales agreement with Google, and ended up being worth a third of its former self when the rest of the market is down only 40 percent. Jerry blew it. And rare in the annals of public companies, JERRY blew it, nobody else. There is no blame to be shared because the Chief Yahoo took his anti-Microsoft stand pretty much single-handed, having bounced Terry Semel from the job in June 2007. Semel, who was more Hollywood than Silicon Valley and never well suited for the job anyway, had backed the Microsoft deal. Freed from his duties at Yahoo, Semel also voted with his brokerage account, selling a large number of company shares while the selling was good.

If there is a lesson to be learned here it is not so much that Jerry was wrong, but that Jerry was Jerry and that wasn't the right thing for Yahoo shareholders. There are three seminal ideas that guided Jerry Yang, who is, after all, a diverted graduate student who got on-the-job training in business. To understand these three ideas is to understand Yahoo under Yang:

1) Microsoft is evil. Yang came of age in the Netscape era and saw Microsoft break the law to destroy that company and try to control the Internet. Whatever its motivation, Microsoft did all the bad things they were accused of and more and Yang could never forget or forgive that, even at the cost of his own company. He took it personally.

2) The power of "no." There was a time in the 1990s when venture capitalists Kleiner Perkins and Sequoia Capital were trying to get Excite and Yahoo -- their respective portals -- to merge in a forced marriage designed to benefit only the VCs. It didn't feel right to Jerry, who put his foot down and scotched the deal. It worked that time, so saying "no" became for Yang the default position, especially after Broadcast.com.

3) Don't get screwed. When Yahoo bought Broadcast.com for $4.7 billion and it became clear that Yang & Co. got almost nothing of value for their money, they resolved never to get screwed on another deal again. That was the moment Yahoo embraced bureaucracy. They never made a quick decision again and in many cases hardly made any decisions at all.

Mix these three concepts together, add independent wealth and a personal golf course, and you get the Jerry Yang of today. He was inclined to say "no," couldn't embrace Microsoft's evil, and sure as heck wasn't going to b[...]
Fri, 14 Nov 2008 20:57:16 +0000
President-Elect Barack Obama has announced that when he's in office he'll appoint a Chief Technology Officer (CTO) for the whole darned USA. Though Google CEO Eric Schmidt already said he isn't interested in the job, I am. I accept, Mr. President.

And while the idea of Cringely for CTO may seem lame to most everybody I know (including my Mom), I think I can make a strong case for why I am EXACTLY the right guy for the job. For one thing, unlike Eric Schmidt I don't have a lot of money. Schmidt can't afford to take the job because Google stock is down and he'd lose a fortune. Not so for me. I come encumbered only with debts, which is to say I am a true American. I'd be perfectly willing to put those debts in a blind trust ASAP. The U.S. CTO would have to be a dynamic leader capable of speaking his or her mind and holding his or her own against a tide of critics and special interests. Hey, that's what I do every week (sometimes twice)! Maintaining and defending technology opinions is my only business and some people think I do it too well, which I take as a compliment.

Now we need to consider why President-Elect Obama thinks the country needs a CTO in the first place. The President has long had a Science Adviser, so why appoint a CTO? It's the distinction between adviser and officer that I'd say is the whole point; one simply advises while the other implements and leads directly. And I think there is plenty of room for new leadership in this area. America has always been tops in science, tops in research and development, tops in medicine, tops in industrial development, tops in technical infrastructure -- tops, tops, tops. But are we tops today? I don't think so, and I'd say we've been slipping steadily for the last eight years and probably for many years before that. The rest of the world has caught up and some other countries now lead the U.S. in many respects. Yes, we have technical traditions and deep institutions, but those traditions are weaker than they were and the institutions are, too. I think something can be done about that.

My belief that something CAN be done is critical, because most of the usual suspects for this job probably think it can't. The reason I am so optimistic is because of the very financial disaster that is the current U.S. economy. Things are so bad right now that I am greatly encouraged. Huh? Sometime in February the new Obama Administration is likely to propose the mother of all economic stimulus packages. It won't be a $650 check that comes in the mail. It won't be a $700 billion equity injection in various [...]
Fri, 07 Nov 2008 19:48:12 +0000
Steve Jobs is not like you and me. He has millions of customers, 32,000 employees, and a board of directors who think he can do no wrong. Running a company that is immensely profitable, gaining in market share, has no debt and $20 billion in cash, he can afford to make bold moves, the most recent of which is his decision to replace Tony Fadell, until moments ago head of the division that produces Apple’s iPod. Like everything Jobsian, Fadell’s departure is part of an Apple GRAND PLAN.

The variables at work here are (in no particular order) ego, competitive advantage, ego, management technique, ego, strategic thinking, and ego. To say that Steve Jobs’ ego can expand to fill any known space might be an understatement but I’ll stand by it anyway. Fadell’s failing in this regard is his being hailed as the “father of the iPod.” What does that make Jobs? Who made THE BIG DECISION? Who committed the company? Who – most importantly of all – seduced all the record companies? That last guy would be James Higa, but since I don’t want to get HIM fired, too, let’s just attribute it all to Steve Jobs – for all intents and purposes the REAL father of the iPod. All hail Steve.

Apple exists solely as an extension of Steve Jobs. Remember that. Anything attributable to Apple is really attributable to Jobs. Other people work at Apple, of course, and excel at their positions, but that is primarily because they were chosen, anointed, or inspired by Jobs. Not that Jobs doesn’t make the occasional mistake. Look at the Mac Cube, for example. But that was our mistake as consumers, not realizing that it really ought to have been worth an extra $500 to us to have a computer with no cooling fan. Steve Jobs makes very few such mistakes, in fact. That, and his total domination of Apple at every level allow the company to be literally the only PC vendor to have anything like a strategic plan. Dell and HP have the odd strategic initiative, like getting into or out of media players or TVs, but the idea of a comprehensive corporate strategy, well that’s too much to expect from companies that are managed, not led. Steve Jobs is a leader 100 percent in the mold of General George S. Patton. Rent the movie and it will start to make sense. Heck, rent it on iTunes.

So here’s what’s going on with Tony Fadell. First, he was vulnerable as a charismatic leader in his own right who has been talked about in the press as a possible heir to Jobs. That alone meant he had to die, but it wasn’t enough to mean that he had to die just now. That de[...]
Thu, 30 Oct 2008 19:21:28 +0000
It isn't very often I get to apply Moore's Law to a non-Information Technology business and rarer still that I can then relate the whole thing back to Microsoft, so I'm going for it. Here's what the solar power industry can teach us about Microsoft:

The wonderful thing about Moore's Law is what the lady at the bank called the "miracle of compound interest." That halving of manufacturing cost every 18 months (the OTHER way of looking at Moore's Law that we generally don't use) has little apparent impact in the first few years, but eventually the halving and re-halving takes a real bite out of the cost side until substantial performance is very, very cheap. That explains why there is more computing power -- a LOT more -- in your iPod than was required for the Apollo Moon missions.

Well this applies to ALL silicon-substrate photolithography applications, not just computer chips. It applies equally well, for example, to silicon solar cells. There are many types of solar cells. Some solar cells involve crystalline silicon just like computer chips and others use amorphous silicon, but all types benefit from Moore's Law. In fact one especially good aspect of solar cells is that they can make use of older process technologies that are obsolete for computer work. So every time Intel or AMD builds a new fab there is a market in the solar industry for their old machines. Look at those round solar cells used in many arrays today and you'll notice the smaller wafer sizes favored in Silicon Valley 15-20 years ago. That's no coincidence.

The result of this relentless application of Moore's Law to the solar industry is that we can see a time in that near future when the cost of producing a watt of electricity from a solar cell on your roof will be approximately the same as the cost of delivering that same watt over a power line from an electric utility. And of course that means that 18 months after that point the solar watt will cost HALF of what the same power would cost from the electric company, which will completely change the game. The time when that electricity cost parity will be reached, I'm told, is seven years from now.

Just think of the impact that will have on electric utilities! Why would any of us continue to buy our power from them? We might use them as a giant storage battery and possibly for backup on cloudy days, but why would we use them at all for power if we can generate it cheaper at home? You can bet that's a question the electric power generating industry is asking itself. The whammy for the power companies[...]
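The compounding Cringely describes can be written down directly. As a sketch, taking the 18-month halving figure at face value:

```latex
% Cost per watt C(t) after t years, if manufacturing cost halves
% every 1.5 years:
C(t) = C_0 \cdot 2^{-t/1.5}
% At the claimed parity point seven years out:
C(7) = C_0 \cdot 2^{-7/1.5} \approx \frac{C_0}{25}
% and one more halving period after parity puts the solar watt at
% half the grid price.
```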
Thu, 23 Oct 2008 19:23:10 +0100I am not a very sophisticated mobile phone user. I don't use most of the bells and whistles on my phone, probably because I don't know what they even are. But just because I'm an idiot about USING mobile phones doesn't mean I don't understand the emerging mobile market, to which I have been paying a lot of attention of late. And why not? As personal computers fade from what Al Mandel called "ubiquity to invisibility," something has to take over. And everyone I respect thinks the new dominant platform will be mobile. So it's my job to tell you, then, that Windows Mobile is probably doomed. Interestingly, this conclusion isn't based on any personal preference or subjective analysis. I'm not saying that Windows Mobile is bad, just that it is probably doomed. It's a simple matter of market economics. There is generally room in any technology marketplace for three competing standards. Notice I say "standards," not "brands." There can be many brands of road vehicles, but they generally come down to cars, trucks, and motorcycles -- each a standard. In personal computers we have Windows, Macintosh, and Linux (or similar Unix workstation variant). In HVAC systems, just to stretch the point, there are radiant, forced air, or evaporative systems -- again three standards. And among those three standards there tends to be a market-share distribution that is more or less 85-10-5. These numbers can jump around a bit and one can argue that the Mac is now more than 10 percent of recent sales, though not of the installed PC base, so I hope you get my point. This magic 85-10-5 distribution also happens to mirror what happens at the racetrack or in the casino, where 85 percent of gamblers lose, 10 percent break-even, and 5 percent are winners, which explains all those big buildings in Las Vegas. The mobile phone marketplace shows a similar distribution, though that now appears to be in some transition. 
One could argue that the old 85-10-5 came down to basic or dumb phones (85), smartphones (10), and specialized or vertical phones like the old Nextel (5). Moore's Law now seems to be inexorably turning all phones into smartphones, so we're probably moving toward an 85-10-5 based on programming platform.

Let's consider this smartphone migration for a moment, first with Samsung in mind. Last week Samsung announced that it would no longer be making high-end phones and would stick to basic phones in the future -- going for higher volumes at lower cost. This makes a lot of sense given that sophisti[...]
Mon, 20 Oct 2008 21:21:49 +0100
Apple last week introduced a pair of very nice notebook computers that, not at all surprisingly, looked like riffs on the MacBook Air. The company in a separate announcement released 600 high-definition television episodes through the iTunes Store. This week Apple will reportedly release new 20-inch and 24-inch iMacs, also for the Christmas season. Two weeks, three announcements, but what strikes me (and apparently only me at this point) is what won't be announced -- the big surprises that are missing. What happened?

A MacBook and a MacBook Pro are nice, but not overwhelming. I like the dual GPU in the Pro and I hate the lack of a FireWire port in the MacBook, but beyond that there is little to say about these products except that the glass screens (on the iMacs, too) are better for houses like mine filled with LCD screen-destroying pre-school boys. These new products don't appear to break any price or performance barriers and sure as heck don't allow time travel or make me more handsome. We were led to expect more -- a lot more. And I am not talking about rumors.

Back on July 21st in his regular conference call with industry analysts, Apple Chief Financial Officer Peter Oppenheimer said that Apple's profit margin would likely shrink from 34.8 percent in the just-concluded quarter to 31.5 percent in the quarter ending in September. "We've got a future product transition that I can't discuss with you today," Oppenheimer said as he spelled out the reasons for the anticipated profit reduction. "One of the reasons that we see gross margin being down sequentially is because of a product transition." What kind of Apple product could be expected to come along, taking a $244 million profit hit for the company? It certainly isn't any of the products we've discussed so far, nor is it the iPhone 3G or the new iPod Touch, which have both been publicly dissected and found to have gross margins in the 56 percent range.
It's something else that was probably intended to be announced this week but wasn't. The change of plan could have come for many reasons. Maybe the revolutionary product wasn't ready in time. Maybe introducing an aggressive, low-margin product in the middle of a global financial crisis was considered a bad risk. Maybe some strategic alliance had to be in place and wasn't ready. Whatever it was, the same analysts will ask about it this Tuesday when Apple has another such conference call scheduled. But of course none of this keeps me from speculating [...]
Tue, 14 Oct 2008 01:50:04 +0100
A couple of columns ago we touched on the practical rebirth of parallel computing. In case you missed that column (it's in this week's links), the short version is that Moore's Law is letting us down a bit when it comes to the traditional way of increasing the power of microprocessors, which is by raising clock speeds. We've hiked them to the point where processors are so small and running so hot that they are in danger of literally melting. Forget about higher clock speeds then; instead we'll just pack two or four or 1000 processor cores into the same can, running them in parallel at slower speeds. Instantly we can jump back onto the Moore's Law performance curve, except our software generally doesn't take advantage of this because most programs were written for single cores. So we looked back at the lessons of parallel supercomputers, circa 1985, and how some of today's software applies those lessons, such as avoiding dependencies and race conditions.

But we didn't really talk much in that column about the use of threads, which are independent streams of execution spun off within a program. Each time a program takes on a new concurrent task, it can create a thread for that task. If the threads are running on the same processor core they are multiplexed using time-slicing and only appear to run in parallel. But if the threads are assigned to different processors or different cores they can run truly in parallel, which can potentially get a lot of work done in a short amount of time.

Most traditional PC applications are single-threaded, meaning the only way to make them go faster without a completely new architecture is to run the CPU at a faster clock rate. Single-threaded apps are simpler in that they are immune to the dependencies and race conditions that can plague true parallel code.
But they are also totally dependent on tasks being completed in a quite specific order, so in that sense they can be dramatically slower or dramatically more resource-intensive than multi-threaded apps. For an example of where single-threaded applications are just plain slower, consider Eudora, which is still my favorite e-mail client (I'm old, you don't have to tell me). Until not very long ago Eudora still couldn't send mail in background, so everything (and everyone, including the user -- me) had to wait until the mail was sent before completing anything else, like writing a new message or replying to an old one. I KNOW THIS IS NO LONGER THE CASE, SMARTY-PANTS -- T[...]
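The Eudora complaint above is the classic case for a worker thread: a slow, blocking operation (sending mail) that freezes everything else until it finishes. A minimal sketch of the fix, in Python -- the `send_mail` function and its delay are hypothetical stand-ins for a real network send, not Eudora's actual code:

```python
import threading
import time

def send_mail(message):
    # Hypothetical stand-in for a slow, blocking network operation.
    time.sleep(0.2)
    return f"sent: {message}"

results = []

def send_in_background(message):
    # Spin the slow send off onto a worker thread so the caller
    # (the "user interface") is free to keep working immediately.
    t = threading.Thread(target=lambda: results.append(send_mail(message)))
    t.start()
    return t

t = send_in_background("hello")
# Main thread keeps going while the send completes in background --
# the single-threaded version would be stuck here for the full delay.
busy_work = sum(range(1000))  # the "user" keeps composing mail
t.join()
print(results)  # ['sent: hello']
```

Because the background send is pure I/O waiting, even time-sliced threads on a single core recover nearly all the lost responsiveness here; true multi-core parallelism only matters once the threads are doing real computation.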
Wed, 08 Oct 2008 17:56:08 +0100
My promised column on threads will appear in this space on Friday. It would have appeared here today but the crumbling global financial system suddenly seemed a more appropriate topic. We're in trouble and by "we" I mean the whole darned planet. What started as a mortgage problem in the U.S. has blown into global financial paralysis that threatens us all with recession and maybe even with depression. I know I'm feeling depressed, how about you? The crisis seems immune to any and all efforts to fix or end it. NOT passing a $700 billion mortgage bailout can send Wall Street into a tailspin, for example, but then finally passing the bailout didn't seem to improve things, either. The Federal Reserve and Treasury Department are running out of tools and time yet still the system flirts with suicide.

So I say it is time to take a completely different view of the problem and to look to a new leader to solve it, in this case Jack Welch. Jack Welch is the retired chairman and CEO of General Electric who took the company during his 23-year tenure from being worth about $14 billion to about $410 billion by really MANAGING the business and concentrating on creative use of capital. I've written columns and columns deriding managers as a profession but none of that applies to Jack Welch and GE, where managers really manage -- they manage the heck out of the place and to generally good effect. Jack Welch built that system, he has time on his hands, I say let's give him a new job.

There is very little difference, in fact, between the global financial system and General Electric. Welch saw GE entirely in terms of cash flows and the application of capital to those parts of the business where it would do the most good. Welch also thought in terms of continuous quality improvement, which is virtually unknown on Wall Street OR in Washington, where such things aren't even talked about, much less measured.
I've been thinking about this crisis and a lot of it comes down, I believe, to a fear of failure especially on the part of the banks. There is no credit available to anyone, anywhere, no matter what the credit rating or score. This is because the banks are frozen by fear to the point where they won't even lend to each other much less to customers. This fear of failure seems to be pretty much guaranteeing failure. And the regulators are now throwing what will soon be trillions of dollars at trying to break these bankers out of their[...]
Fri, 03 Oct 2008 21:20:13 +0100
Last week I was in Boston to moderate a panel at the MIT Technology Review’s Emerging Technologies Conference — one of those tech shindigs so expensive I can only attend as hired help. My panel was on parallel computing and it produced this column and another I’ll file early next week. This week is about databases and next week is about threads. Isn’t this a grand time to be a nerd?

Thanks in part to Larry Ellison’s hard work and rapacious libido, databases are to be found everywhere. They lie at the bottom of most web applications and in nearly every bit of business software. If your web site uses dynamic content, you need a database. If you run SAP or any ERP or CRM application, you need a database. We’re all using databases all the time, whether we actually have one installed on our personal computers or not. But that’s about to change. We’re entering the age of cloud computing, remember? And clouds, it turns out, don’t like databases, at least not as they have traditionally been used. This fact came out in my EmTech panel and all the experts onstage with me nodded sagely as my mind reeled. No database? No database.

Parallel computing used to mean scientific computing, where hundreds or thousands of processors were thrown at technical problems in order to solve them faster than Moore’s Law might otherwise have allowed. The rest of us were relying on rising clock rates for our performance fix, but scientists — scientists with money — couldn’t wait so they came up with the idea of using multiple CPUs to solve problems that were divided into tasks done in parallel then glued back together into a final result. Parallel computing wasn’t easy, but sometimes that was the whole point — to do it simply because it was so difficult. Which is probably why parallel computing remained a small industry until quite recently. What changed was Moore’s Law put an end to the clock rate war because chips were simply getting too hot.
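That divide-run-glue pattern can be sketched in a few lines of Python. This is only an illustration of the general technique, not any specific supercomputer's code -- the function names and the toy workload (summing a list) are my own:

```python
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker handles one slice of the problem independently --
    # no shared state between slices, so no race conditions.
    return sum(chunk)

def parallel_sum(data, workers=4):
    # 1. Divide the problem into independent tasks.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # 2. Run the tasks in parallel on separate processes.
    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)
    # 3. Glue the partial results back into a final answer.
    return sum(partials)

if __name__ == "__main__":
    print(parallel_sum(list(range(1000))))  # 499500
```

The pattern only pays off when the tasks really are independent; the moment one chunk needs another chunk's result, you are back to the dependencies and race conditions mentioned above.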
While faster and faster chips had for the most part linear performance increases along with cost and power consumption decreases, the core temperature inside each microprocessor chip was going up at a cubic rate. Back in 2004 Intel released a chart showing that any clock speed over 5 GHz was likely to melt silicon and Moore’s Law would, by 2010, make internal processor temperatures similar to those on the surface of the Su[...]
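That cubic heat growth follows from the standard dynamic-power rule of thumb: switching power scales roughly as C·V²·f, and since supply voltage historically had to rise roughly in step with frequency, power (and the heat to be dissipated) grows roughly with the cube of clock speed. A back-of-the-envelope sketch with illustrative numbers, not real chip data:

```python
def relative_power(freq_ghz, base_ghz=3.0):
    # Dynamic power ~ C * V^2 * f; assuming voltage scales roughly
    # linearly with frequency, power grows with the cube of clock speed.
    return (freq_ghz / base_ghz) ** 3

# Pushing a 3 GHz part to 5 GHz means roughly 4.6x the heat
# from the same die area -- which is why the clock race stalled.
print(round(relative_power(5.0), 1))  # 4.6
```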
Sat, 27 Sep 2008 05:15:59 +0100
In the early 1980s I was a volunteer firefighter for a tiny community in the Santa Cruz Mountains of Northern California. We all lived in a beautiful redwood forest and our task was to keep that forest from burning down in a huge conflagration, taking us all with it. The job was made all the harder because our little part of paradise hadn't burned since the 1920s, so there was 60+ years of flammable undergrowth just waiting to light off.

The current financial crisis facing the United States and the world really isn't much different from that. An unmanaged forest, one without the sort of fire control we attempted to provide, would naturally burn every few years. The undergrowth would build up, reach a critical mass, some source of ignition would come along -- usually lightning -- and all that undergrowth would burn. The redwoods themselves would be scarred but not really threatened, as we could see from the charring that marked them from countless such fires over centuries.

Of course burning undergrowth threatened homes and property, too, so there was a natural desire on the part of that community to want the next burn to not come this year, please not this year. So there came a policy of aggressively fighting fires with the result that we eventually faced 60 (now 90!) years of flammable material growth rather than six or eight years. And the probable fire fueled by 60 years of undergrowth would have been so bad that our job changed to one of trying to prevent fires from happening, well, ever. This was an impossible task, of course. Eventually the stars would align the wrong way and the whole place would burn down, we all knew it. Just let it not happen on our watch.

Does this sound familiar? Now America and much of the world face the possibility of recession and we handle that by first arguing about the definition of the term. Are we or are we not in recession?
This distinction appears to be very important to some who view it like a forest fire: is it burning or not? Implicit in this distinction, I suppose, is the idea that if we're not burning -- if we are not in recession -- that maybe through some miracle we'll never face that problem. Whether this is a practical attitude or not depends entirely on your event horizon -- how soon you expect things to change. The people in power in this country have a relatively short event horizon. Politic[...]
Mon, 22 Sep 2008 20:29:57 +0100
I’ll begin this third and (I promise) last column on IT management with a confession: I have been fired from every job I have ever held. This is certainly not something I set out to do, nor did I even realize it until one day my young and lovely wife mentioned that I had never told her about voluntarily leaving any position. It’s not that I’ve had so many jobs, either. This one and the one before it have kept me going for more than 20 years. But they always seem to end the same way. This one might, too. You can never tell.

Most of the times I have been fired it’s because I’ve been judged to be unmanageable, which is to say I won’t shut up. The ultimate reason given is usually something minor. The last time around, for example, I was fired because I didn’t transfer the cringely.com domain to my employer. They asked me to do it and I said “no.” Had they said, “Transfer the domain or you will be fired,” I might have decided differently. But they never said that — never gave a hint of the consequences — so I assume the real goal was less to get the domain and more to get rid of me. The guy who had me fired, Stewart Alsop (maybe you’ve heard of him), ultimately lost his own job for firing me, at least according to International Data Group Chairman Pat McGovern, who told the story to 300 people once at a DEMO conference.

Back when I was a kid and working at WWST Radio in Wooster, Ohio, I was fired for writing those seven unspeakable words in the middle of a livestock auction report. Another kid had been playing similar tricks on me for weeks, but when I finally retaliated he turned me in. I guess I was a threat to him and didn’t know it. One company hired and fired me three times and another company hired and fired me twice. And somehow in all this I’ve never received severance or unemployment compensation. I just found another job or it found me. There’s a point to all these firing stories and they actually do relate to IT.
I’m typical of a lot of IT types. You know us. We are useful but sometimes a pain in the ass. We have opinions and speak our minds and don’t suffer fools at all. We stand up to authority from time to time. Sometimes we’re wrong. We get fired a lot and hired a lot, too, because we are generally useful, though dangerous. What do you do with folks like me [...]
Thu, 18 Sep 2008 00:29:44 +0100
Last week’s column on bad IT management and the strong response from readers that followed show this to be a huge issue. There are WAY too many IT managers who either can’t or shouldn’t manage technical teams. Last week I maintained that having a firm technology base, or at least the ability and willingness to acquire one, was essential for good managers. While readers got carried away with which technical test is the best, I don’t think there is much dispute that there are certain aspects of technical management that are helped by the manager being a code god. But that’s far from all there is to the job. So this week I want to go deeper and look at what’s really missing in nearly every instance of such bad management, which is leadership.

The distinction between management and leadership is a critical one. Management is — at its very best — an exercise in coping while leadership is so much more. Last week’s simple idea that the manager ought to at least be able to tell good work from bad is exemplified by Bill Gates, who liked to claim that he could tell good code from across the room and that whatever task a team was facing was something he could code in Visual Basic over a weekend. Both statements are nonsense, of course, but Bill knew he had to talk the talk, making him at least an adequate manager. Does it make him a leader? I don’t think so. But let’s not blame Bill for that. Let’s blame Charles Simonyi.

Charles is the guy who came up with Microsoft’s development process — an outgrowth of his research at Xerox PARC. I covered this extensively in my book, Accidental Empires, but the short version is that Charles came to advocate a strong program manager as the central controller of any development group. One person made all the decisions and as long as that one person was correct 85 percent of the time, it was better to have a dictatorship than a democracy or even a meritocracy.
This was an effective way to extend Bill’s will to Microsoft programmers Bill would never even meet. And to Charles’ credit the system worked well enough if the dictator was really, really smart and the task at hand wasn’t too complex. It was perfect for the 1980s. But it is far from perfect today and represents one of the fundamental reasons why Windows Vista was so late to[...]