Another Day In The Code Mines

Yet another infrequently-updated blog, this one about the daily excitement of working in the software industry.

Updated: 2016-10-19T10:13:39.254-07:00


Apple vs the FBI


What's up with Apple and the FBI?

Several of my friends and family have asked me about this case, which has been in the news a lot recently. A whole lot of news stories have been written trying more-or-less successfully to explain what's going on here, often with ill-fitting analogies to locks and keys, and it seems like a lot of people (including some of our presidential candidates) are just as confused about what's going on now as they were when the whole thing started. The Wired article above is really very good, but it's long, fairly technical, and doesn't cover the non-technical side of things particularly well.

So, since y'all asked, here are some of my thoughts on the case. I'm going to be kind of all over the map here, because I've gotten questions about the moral side of things as well as the technical. I'm going to mostly skip over the legal side of things (because I'm unqualified to comment), except for a couple of specific points.

On the off-chance that someone stumbles across this who doesn't already know who I am, I'm a computer programmer, and I have worked on encryption and digital security software for a number of different companies, including 3 of the 5 largest PC manufacturers.

I'm going to try to stay away from using any analogies, and just explain the actual technology involved as simply as I can, since I know you can handle a bit of jargon, and the analogy-slinging I see on Facebook isn't making things any clearer for people, as far as I can see. There will be links to Wikipedia articles in here. You don't need to read them, but they are there in case you want to read more about those subjects.

First, a very quick run-down of what this is all about:

  • The FBI has an iPhone that was used by Syed Rizwan Farook, one of the shooters in the San Bernardino shootings last December.
  • The phone is locked (of course), and the FBI wants Apple to help them unlock it, and in fact has a court order requiring Apple to do so.
  • Apple is refusing to do what the FBI wants, for some fairly complicated reasons.
  • A whole lot of people, including information security experts, law experts, and politicians, have weighed in on how they think this should go.

So, what's my take on all this?

Encryption does not work the way you might think it does, from watching movies or TV.

In the movies, you always see "hackers" running some piece of software that puts up a progress bar, and the software makes gradual progress over the course of seconds or minutes, until the encryption is "broken", and the spy gets access to the data they need. In the real world, unless the encryption implementation is fundamentally broken by design, the only way to break in is by trying every possible key (we call this a "brute force attack"), and there are an enormous number of possible keys. You could get in with the very first key you try, or you might end up checking every possible key before you find the right one. Nothing about this process gives you any information about whether you're "close" to getting the right key, or whether you've still got billions of keys to try.

The data on the iPhone is encrypted with a key long enough that trying to decrypt it through brute force is essentially impossible.

The data on the iPhone is encrypted using AES, the Advanced Encryption Standard, which was standardized by the US government for companies like Apple to use to secure data for their customers. As far as anybody knows, brute force is the only way to attack AES, and with a 256-bit key (as is used on the iPhone), it'd take literally billions of years to try every possible key, even if you used all of the computing power in the world.

Apple doesn't have that key to hand over to the FBI.

The key used to encrypt data on the iPhone is derived from a combination of a device-specific key and the pass-code which the user has set on the phone. There's no way to extract the device-specific key from the phone, and there's no record of which phone uses which device-specific key. This is done on purpose, because if you could get that data, it'd make it much easier for anyone to extract your pe[...]
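To put some rough numbers on that "billions of years" claim, here's a little back-of-the-envelope calculation. The guessing rate below is a made-up assumption that's absurdly generous to the attacker - a billion machines, each trying a billion keys every second:

// How long would it take to try every possible 256-bit key?
// The keys-per-second figure is an invented, wildly optimistic assumption.
var totalKeys = Math.pow(2, 256);           // about 1.16e77 possible keys
var keysPerSecond = 1e9 * 1e9;              // a billion machines * a billion keys/sec
var secondsPerYear = 60 * 60 * 24 * 365;
var yearsToTryThemAll = totalKeys / (keysPerSecond * secondsPerYear);
console.log(yearsToTryThemAll.toExponential(2) + " years"); // roughly 3.67e+51 years

For comparison, the universe is a bit under 1.4e+10 years old. "Essentially impossible" is, if anything, an understatement.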

Predictions for Apple's big announcement event tomorrow


So, Apple has scheduled some new product announcements tomorrow, which will certainly include a new iPhone (it's the right time of year for that). There's a lot of buzz on the internet about the event, based on oblique references from various Apple employees that this event is about much more than just a new iPhone.

Despite the fact that I haven't worked there in a decade, some people have asked me what I think Apple's going to announce. For everybody's amusement, here are my predictions, so we can all have a good laugh about them tomorrow. But first, some background:

I'm really bad at this

As many of my friends and family already well know, I have a history of really, really bad predictions of what Apple will and won't do. A couple of notable failures in the past include:

"Apple wouldn't buy NeXT. That would make no sense. They might license some of the technology."
When I said this, Apple was actually in negotiations to purchase NeXT, which ended up being their largest acquisition value-wise, until they acquired Beats Electronics this year.

"Mac OS X will never ship. It's a doomed project."
This was while I was working on the OS X team, and more than a little depressed at the level of infighting and backstabbing going on between various teams. It took almost another year, but OS X 1.0 did actually ship.

"Clearly, the Mac will be transitioning to a new architecture again. It won't be X86, though."
I had assumed X86-64 on AMD processors was the new target. I take some satisfaction from the fact that Apple relatively quickly obsoleted the X86 processors in Macs in favor of 64-bit capable ones. I *almost* got this one right, but I underestimated how much influence non-technical factors would have on the decision.

That's a common theme amongst many of the times that I mis-predict what Apple is going to do - because I'm this hyper-logical engineer-type person, it always surprises me when they do something that's not the "right" decision technically, but makes sense economically or in some other way.

Predictions

Okay, so here are my logical predictions, almost none of which will likely come to pass.

What I think of the popular rumors

iPhone 6
No doubt that this is going to be announced. It'll be lighter, with better battery life, and faster. Rumors are that there will be a physically much-larger model, with a 5.5 inch screen. That's totally ridiculous. We've all seen someone using one of those massive Android phones, and I think we can all agree that they look like total dorks. No way that Apple is going to make an iPhone that you have to use both hands to use.

iWatch
Not a chance in hell that Apple will produce a smart watch like the Galaxy Gear or Moto 360. Again with the "dork" factor - who even wears a watch these days? I haven't worn a watch since I got my first Palm Pilot, back in the day. My iPhone goes with me nearly everywhere I go, already. I look at higher-end wristwatches, and I can appreciate the craftsmanship, but I have no more interest in wearing them than any other piece of jewelry. If Apple does introduce a piece of "wearable technology", then it won't be a conventional watch. I could see something playing up the health-monitor angle, but a wristwatch? No way. A $300 accessory for my iPhone that saves me the effort of pulling my phone out of my pocket to read the calendar notifications? Ridiculous.

"Obvious" things, which I haven't seen rumors about

New Macs
Weirdly, there's not much buzz about this in the rumor-sphere. There was a little bit of buzz about it early on, given that the event is at the Flint Center, where the introduction of the original Macintosh was held, as well as the iMac, the machine that saved the whole Macintosh line. But the rumor mill died out, partly due to lack of information, and I think partly due to people being unable to figure out how a new Mac development would be any kind of big deal. What kind of announcements could they make about the Mac that'd revitalize that line, and the company, again[...]

One down, 11 to go


January OneGameAMonth post-mortem

January is over, and I'm done working on Rocks! (for now, at least), and it's time to go over what worked, what didn't, and what I'll do differently for February.

First, here's a link to the current version: Rocks!
And here's the Github repository with the source code: Repo!

What I was trying to do:

This was the first month of the One Game A Month challenge, and I really wanted to make sure I finished something, so I'd get started off on the right foot. To that end, I tried to shrink the scale of what I was trying to do for January to something I was sure I'd be able to finish. Rather than design a game from scratch, I started with a well-known design, and implemented it on an unknown (to me) technology stack. So, I decided to do a clone of Asteroids, running in the web browser, using the canvas element for graphics, and the Web Audio API for sound.

I wanted to produce something with a retro feel, true to the spirit of the original, even if it wasn't exactly the same in execution. And I decided to do the whole thing without the use of any frameworks or libraries, both because I thought that the game was simple enough that I could just bang it out without much help, and because I wanted to actually learn the browser APIs, not some third-party library.

What went right:

Got something working very fast, then iterated
By the end of the first week, I had a playable game, if not a very interesting one. That took a lot of the pressure off, knowing that even if I ran out of time, I'd have *something* to show for it.

Scope creep (mostly) avoided
Although lots of really great ideas came to me while working on Rocks!, I managed to avoid the temptation to add in a bunch of extra features. I feel especially good about this given that I didn't quite meet the initial goals - I'd have felt a lot worse if I didn't manage to make a complete game, because I'd gotten distracted by doing something cool, but not part of the core gameplay.

Proper "retro-twitch" feel
I spent a fair amount of time tweaking the controls, to get ship movement that felt "right". I think this is something that really distinguishes my game from the other Asteroids-like games that were submitted to OneGameAMonth last month. My ship is very responsive, it turns and accelerates quickly enough to get out of trouble, which makes the player feel like they're in control of their own fate.

No Art
I didn't want to spend a lot of time drawing terrible art that I then hated. I figured that going with the vector approach would encourage (enforce?) a simple graphical design, and save me from spending hours tweaking art trying to make it look less goofy. My inability to draw well is going to be an ongoing issue for the next 11 games, too.

I "Finished" on time
Actually a bit ahead of time. Which is good, because a bunch of "real world" stuff came up in the last few weeks of January.

What went wrong:

Spent much more time on art & sound than expected
Despite the fact that I went with a totally minimalist look & sound, I still had to do a fair amount of tweaking. But with everything defined in code (see next item), it was pure tedium to make any changes in the graphics or sound.

No creative tools
I ended up doing the entire art design by sketching things out on graph paper and manually copying down the coordinates into my code. This wasn't *terrible*, but it was tedious and error-prone. I didn't produce an editor for shapes and sounds because that sounded like more work than actually finishing the game. For *this* game, that was arguably true - but a couple of features got left out, rather than going through the process of manually designing graphics & sound for them. I'm planning on using the same technologies in future games, so I'll be able to amortize the effort to produce the tools over several projects. Conveniently enough, the optional theme element for OneGameAMonth February is "sound", so I'll have good incentive to build (or find) at least a rudimentary sound editor.

What ended up on the cutting-room [...]

Rocks! Update #2 - it's a game


It's an actual game now!

So, first things first - here's the current version: Rocks!

New features include:

  • updated graphics - random rock shapes, and a progression of sizes
  • on-screen instructions
  • better sounds
  • proper collision detection
  • particle effects when things are destroyed
  • more than one level
  • a "shield" that will prevent rocks from running into you

It's looking a lot more like a real game now.

Sound design is hard

Oddly enough, the hardest thing for me so far has been making those decidedly "retro", simple sound effects. The Web Audio API is very powerful, but it's also very much focused on doing sophisticated manipulation of sampled sound. I certainly could have grabbed appropriate sampled sounds, or built some in Audacity, but I wanted to push the "classic" feel of the thing, and I thought - "I've done this sort of thing before, how hard can it be?" Besides, attaching a couple of huge sample files to a game that's currently under 20kb total in size felt a bit like the tail wagging the dog.

Of course, the last time I tried to create synthesized sounds from scratch was probably 30 years ago, on an 8-bit home computer with a fixed-voice synthesizer chip. There's something to be said for the existence of fewer choices helping to focus your efforts. When you're faced with an API that supports multi-channel surround sound, arbitrary frequency- and time-domain manipulation, 3-D positional audio, dynamics compression, and all the rest, it's a little difficult to figure out how to just make a simple "beep".

Here's what I've learned so far about using the Web Audio API:

Web Audio is based on a connected graph of nodes, leading from one or more sources through the graph to the ultimate audio output
This is enormously flexible, and each of the individual node types is just about as simple as it can be to do the thing it's designed for. There's a "gain" node that just multiplies the input by a constant and feeds it to the output, for instance. The source nodes don't have individual volume controls (because there's the gain node for that). There's one weird quirk to my old-school sensibilities, which is that every note requires making another source node and connecting it to the node graph. When a note stops playing, the source node is automatically removed and garbage collected. If you want to play the same sound over and over, you're continuously creating and destroying nodes and connecting them to the graph.

There's a simple oscillator source node that's very flexible
You can easily create an oscillator that uses an arbitrary waveform (square, triangle, sine, or user-defined), plays at a specific frequency, and starts and stops at a specific time. This is about 80% of what you need to make a "beep", but:

Oddly, there's no built-in ADSR envelope support
Back in the day, we'd set up ADSR (attack, decay, sustain, release) parameters for a sound, which would control how quickly it came up to full volume, how loud it was as the note progressed, and how quickly it faded. There are probably about 10 different ways to do the same thing in Web Audio, but nothing with the same simplicity.

There's no simple white-noise source
This is a bit of a weird omission, in that noise sources are the basic building blocks of a lot of useful effects, including explosions, hissing, and roaring noises. And again, there's probably 10 different ways to solve this with the existing building blocks, each with their own limitations and quirks. I ended up using Javascript to create a buffer of random samples, which I could then feed through filters to get the appropriate noises for thrust and explosions.

The API is very much a work in progress
Despite the fact I wasn't trying to do anything particularly sophisticated, I ran into a few bugs in both Safari and Chrome. I imagine a certain amount of this is to be expected with an in-development API that hasn't been standardized yet.

Next Up: Enemies!

The next big feature for Rocks! is to have some enemies to chase you around and shoot at you.[...]
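To make those last two points a bit more concrete, here's roughly what the two tricks look like in code: a "beep" with a hand-rolled attack/release envelope on a gain node, and a white-noise burst built from a buffer of random samples. This is a simplified sketch of mine, not the actual Rocks! sound code - the names and numbers are invented.

// A "beep" and a noise burst using plain Web Audio nodes (illustrative sketch).
var ctx = new (window.AudioContext || window.webkitAudioContext)();

function beep(freq, duration) {
  var osc = ctx.createOscillator();      // a new source node for every note
  var gain = ctx.createGain();           // volume lives in the gain node, not the source
  osc.type = "square";
  osc.frequency.value = freq;
  osc.connect(gain);
  gain.connect(ctx.destination);

  var now = ctx.currentTime;
  gain.gain.setValueAtTime(0, now);                       // "attack"
  gain.gain.linearRampToValueAtTime(0.5, now + 0.01);
  gain.gain.linearRampToValueAtTime(0, now + duration);   // "release"
  osc.start(now);
  osc.stop(now + duration);              // the node gets garbage-collected afterwards
}

function noiseBurst(duration) {
  var frames = Math.floor(ctx.sampleRate * duration);
  var buffer = ctx.createBuffer(1, frames, ctx.sampleRate);
  var data = buffer.getChannelData(0);
  for (var i = 0; i < frames; i++) {
    data[i] = Math.random() * 2 - 1;     // white noise: random samples in [-1, 1]
  }
  var src = ctx.createBufferSource();
  src.buffer = buffer;
  src.connect(ctx.destination);          // a filter node could go in here for thrust/explosions
  src.start();
}

beep(440, 0.2);      // "pew"
noiseBurst(0.3);     // "boom", more or less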

One Game a Month, One Blog a Month?


A New Year Brings a Fresh Start

I swear, I'm not going to start this post out with how disappointed I am at my lack of writing output over the last year. Oops...

The Problem

No matter how much I promise myself I'm going to update my blog more often, it tends to languish. I have a bunch of half-written articles waiting to be published, but in the absence of any compelling deadline, I can continue to look at them as "not quite ready for public view" forever.

A possible solution

Something I've seen work really well for other people who struggle with producing consistent output is what I think of as "creative challenges". Things like the "take a picture every day for a year" challenge that a lot of people are doing to improve their photography.

I just can't face the idea of a "blog a day" challenge, though - I like the idea of something a little more long-form, and a daily deadline would force me to cut corners to an extent I'm not ready for yet.

So instead, I signed up for the OneGameAMonth challenge. Game design is one of my non-programming passions, so I feel like I'll be able to stay motivated and really try to see this through. A month is a long-enough deadline that I feel like I can produce something worth examining, and the practical problems and "stuff I learned along the way" should provide ample material for *at least* one blog entry a month.

The Plan

I haven't planned the whole 12 months out yet, but here's what I do know about my plans:

  • I will create a variety of games in different formats, including video games, board games, and card games
  • I will explore different genres in each format
  • Everything I do will be open-source on my Github account
  • I will write at least one blog entry every month, about the current game
  • If I don't finish a game in a particular month, I will not give up - I'll just do something less ambitious for the next month

The Proof

And to prove that I'm not completely full of it, here's the in-progress game for January, after two days of after-hours hacking. It's named Rocks! And here's the GitHub repository for it.

This is an HTML5 Canvas & WebAudio version of the old Asteroids arcade game. Because it uses some cutting-edge web features, it only runs properly in recent WebKit-based browsers. That's Google Chrome and Safari. Future games will likely be more cross-platform, but I wanted to learn a bit about the Web Audio API.

What I've learned on this project so far

This first version is very limited, and frankly pretty buggy:

  • There's no proper collision detection - it's hard to die, unless you try to hit a rock with the ship
  • The asteroids don't start larger and break up into smaller ones
  • There's no level progression, and no game-over when you die 3 times
  • No enemy UFOs yet
  • There are missing sound & visual effects

And the code is, frankly, a mess. But on the other side, there's a lot I've learned over the last two days:

  • All of the rendering is done using the Canvas line-drawing primitives
  • The sounds are synthesized on-the-fly using Web Audio units instead of sampled sounds
  • The animation is driven using requestAnimationFrame, so it should throttle back when in the background

The whole thing is less than 11k in size, and there's about 400 lines of Javascript in the main game file. That's smaller than a typical iOS app icon...[...]
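For anyone wondering what "Canvas line-drawing primitives driven by requestAnimationFrame" actually looks like, here's a stripped-down sketch of that kind of render loop. This is an illustration of mine, not code from the actual game - the element id, the ship coordinates, and the names are all invented.

// Minimal Canvas + requestAnimationFrame loop (illustrative sketch only).
var canvas = document.getElementById("game");   // assumes a <canvas id="game"> element
var ctx = canvas.getContext("2d");

// The "ship" is just a hard-coded list of vertex coordinates.
var shipShape = [[0, -10], [7, 10], [0, 5], [-7, 10]];
var ship = { x: 160, y: 120, angle: 0 };

function drawShip() {
  ctx.save();
  ctx.translate(ship.x, ship.y);
  ctx.rotate(ship.angle);
  ctx.beginPath();
  ctx.moveTo(shipShape[0][0], shipShape[0][1]);
  for (var i = 1; i < shipShape.length; i++) {
    ctx.lineTo(shipShape[i][0], shipShape[i][1]);
  }
  ctx.closePath();
  ctx.strokeStyle = "#fff";
  ctx.stroke();
  ctx.restore();
}

function frame() {
  ctx.fillStyle = "#000";
  ctx.fillRect(0, 0, canvas.width, canvas.height);  // clear the screen
  ship.angle += 0.02;                               // stand-in for real game logic
  drawShip();
  requestAnimationFrame(frame);                     // throttles back when the tab is hidden
}

requestAnimationFrame(frame);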

The simplest possible computer


The simplest possible computer

So, if we were going to build a model of the simplest possible computer, where would we start? As it turns out, you probably have such a model in your home already. Many homes have what's known as a "three-way" switch, which is a light switch that you can turn on and off from two different locations. This circuit can be used as a simple digital computer.

By properly labeling the switch positions and the light bulb, we can use them to solve a logic problem. Let's say that you need a system to tell you whether to have dessert with your lunch, but you have some specific rules to follow:

1. If you have a salad for lunch, you'll have dessert.
2. If you have soup for lunch, you'll have dessert.
3. If you have both soup and salad for lunch, you'll skip dessert (since you'll be over-full).
4. If you haven't had anything for lunch, you won't have dessert (because dessert on an empty stomach will make you sick).

Here's how to solve this problem with the three-way switch: If necessary, flip one of the switches so that the light is off. Label the positions that the switches are currently in. Label one "had soup", and the other "had salad". Label the other two positions "no soup" and "no salad", respectively. Hang a sign on the light bulb that reads "have dessert".

Congratulations! You now have a computer that will tell you, based on whether you've had soup and/or salad, whether you should have dessert. Try it out, and you'll find that it follows the rules given above, and the light will only come on if you've had either soup or salad (but not both).

This isn't all that exciting by itself, but this same circuit can be used to solve an entire family of related logic problems, just by changing the labels on the switches and the light bulb. This ability to use the same logic to solve many different problems is the source of the flexibility of computers, and is what enables them to be useful for so many different things. [...]
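For the programmers in the audience: the dessert rule is just an "exclusive or". Here's the same "computer" written as a few lines of JavaScript - a sketch of mine, not part of the book excerpt itself:

// The three-way-switch "computer" as code: the light (dessert) comes on exactly
// when one input is true and the other is false - an exclusive OR.
function haveDessert(hadSoup, hadSalad) {
  return hadSoup !== hadSalad;   // true when exactly one of the two is true
}

console.log(haveDessert(false, false)); // false - no lunch, no dessert
console.log(haveDessert(true,  false)); // true  - soup only
console.log(haveDessert(false, true));  // true  - salad only
console.log(haveDessert(true,  true));  // false - both, you're over-full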

A new project!


I'm working on a "book" in my spare time. I put book in quotes there, because I don't know that it'll actually get to the level of being published on dead trees. Due to the subject matter, it would make more sense to publish it online (or perhaps, via something like iBooks) in any case.

It's intended to be an introduction to Computer Science for non-nerds (and/or younger folk), which I'm sure is well-covered ground, but the unique direction I'm planning on taking is to start "at the bottom" with the most basic principles and work my way up.

This is based on conversations I've had with family and friends over the last few decades, at family gatherings, at parties, and on road trips. I get the impression that a lot of folks think that there's this mysterious "other level" beneath what they understand about their computer that requires a lot of formal training to understand. I want to show that things aren't really that complicated at the lower level, and that all of the complexity is layered on top of a very simple foundation.

And, I find the subject really interesting, so I enjoy writing about it. I'm going to set up a website for the new project soon, but in the meantime, I'll put an excerpt up here to see what people think.

Update: Here it is - The Simplest Possible Computer

Time for a reboot...


Okay, it's now been more than a year and a half since I updated this blog. I need to get back on the horse. Stay tuned for an update soon (really)!

JavaScript by example: functions and function objects


I've been working in JavaScript a lot these last couple of months, and I feel like I've learned a lot. I wanted to show some of the more interesting aspects of JavaScript that I've had the opportunity to bump into. I'll use some simple examples along the way to illustrate my points.

Note: If you want to follow along with the examples in this blog post (and the followup posts), you'll probably want to use an interactive JavaScript environment. I tend to use Firebug with Firefox when I'm trying stuff out, but there shouldn't be anything in these examples that won't work in the WebKit console in Safari or Chrome, or in Rhino, for that matter.


A simple function is defined and used in JavaScript thusly:

function add(x, y) {
    return x + y;
}

console.log(add(3, 5)); // this prints "8" to the console

This does just about what it looks like it does. There's no trickery here (the trickery comes later on). Let's say that we want a version of this function that takes a single argument, and always adds 5 to it. You could do that like this:

function add5(a) {
    return add(a, 5);
}

console.log(add5(3)); // prints "8"

But what if you're going to need a bunch of these one-argument variants on the add function? Well, since functions are first-class objects in JavaScript, you can do this:

function make_adder(v) {
    var f = function(x) {
        return add(x, v);
    };
    return f;
}

var add7 = make_adder(7); //create a function
console.log(add7(3)); // prints "10"

This is only slightly more complicated than the original example. One possibly subtle point here is that the returned function "captures" the value of v that was passed into make_adder. In a more formal discussion, you'd call this a closure.
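To convince yourself that each call to make_adder really does capture its own copy of v, you can build a couple of different adders and use them side by side. (This little example is mine, building on the functions defined above.)

var add2 = make_adder(2);
var add10 = make_adder(10);

console.log(add2(1));  // prints "3"  - this closure captured v = 2
console.log(add10(1)); // prints "11" - this one captured v = 10
console.log(add7(1));  // prints "8"  - the earlier add7 is unaffected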

PuzzleTwist is now available!


My latest creation is now up on the iTunes App Store. It's called PuzzleTwist, and it's a puzzle game where you unscramble a picture by rotating the pieces.  As each piece is rotated into place, others will rotate as well - some in the same direction, some in the opposite direction. The key to solving the puzzle is to figure out what order to move the pieces in.

One unique feature is that the rules for each puzzle are different - some are simple, some are more complex. A few are so difficult that I can't solve them without looking at the solution.

Once you've solved a puzzle, you can save the resulting picture in the Photo Library on your iPhone, and then use it as the wallpaper image for the phone, or assign it to one of your contacts.

PuzzleTwist also keeps track of the best reported scores, so you can compare your scores versus the rest of the world.

If you're a puzzle fan, you should check it out. Here's the iTunes store link.

On a side note, this application was approved much faster than the previous applications I submitted. Perhaps the App Store review team is coming out from under their backlog.

The eyes have it - a tale of 3 vision problems


I'm recovering from a head cold today, so rather than try to do heavy programming work, I decided to write up a personal story that I've been thinking about lately, for a variety of reasons.

As anyone who knows me personally can probably attest, I wear glasses and have pretty bad eyesight. Not many of my friends, and probably not even all of my family, know that I have three distinct vision problems, only one of which is actually addressed by my glasses. I'm going to tell y'all about all three, more or less in the order that I found out about them, and the ways in which they've been treated.

Disclaimer: I'm not an eye doctor, nor an expert in human vision. This is all about what my experience has been. It's entirely likely that I'll make at least one glaring error in my use of some technical term. Feel free to correct me.

Chapter 1: Nearsightedness and Astigmatism

Okay, that probably looks like two different problems, but they're both refractive issues, and they're caused by misshapen eyeballs, and so are corrected easily with eyeglasses. If I remember correctly, I got my first pair of glasses in the 5th grade, when I was 10 years old or so. I was pretty astounded at the difference when I put them on - for the first time, I could see the leaves on trees as individual objects. I asked my eye doctor how bad my vision was compared to the 20/20 that's considered "normal" and got the unsatisfying response that the 20/x scale wasn't really a useful measure for people with strong nearsightedness. Since I can barely find the eye chart on the wall at 20 feet without my glasses, I can now understand where he was coming from.

There was some consternation amongst the various parties involved about how it is that I could have gone without glasses as long as I had without anybody noticing that I was blind as a bat. For whatever reason, there wasn't any mandatory screening for vision problems in my elementary school. I got screened for a number of other potential issues, amusingly including colorblindness, but nobody ever stuck an eye chart up on the wall and had me read it.

The biggest issue was probably a simple (dare I say child-like?) assumption on my part that everybody else saw things more or less the same way that I did. So, since I couldn't see the blackboard if I was sitting in the back row, I assumed that nobody else could, either. And if it was critical to the learning process for us to be able to read what the teacher was writing, the school would have arranged the classroom such that it was possible, right?

It probably didn't help that I was also a bit of a daydreamer and a slacker. I think that when I said I didn't know that we had homework due, my teachers and my Mom assumed that I just wasn't paying attention, when in reality, I might have simply not seen the homework assignment written on the board.

As I get older, and more and more of my friends have children, it's occurring to me that there might actually be something useful for other people to learn from my experiences. I think the lesson here is actually a pretty simple one. Parents, talk to your kids about what their sensory experiences are. If someone had at any point between age 2 and age 10 simply asked me whether or not I could see some distant object, or count the number of birds on a telephone wire, or even tacked up an eye chart on the wall and tested me, I might have gotten into glasses sooner.

Alright, so I got glasses at age 10, which helped a lot with being able to see what was written on the blackboard, probably made it a lot safer for me to ride my bicycle around, and generally greatly improved my quality of life. Problem solved, right? Not so much. It turns out that I had another problem, which went unnoticed for several more years, despite going to the eye doctor regularly.

Chapter 2[...]

Release early, release often...


Six Apps In Six Months - or, Why Mark Can't Schedule
That was the plan, anyway - but I haven't been able to keep myself on track. It's really difficult to release something when you're the only engineer working on it. It's hard to resist the temptation to just keep polishing the thing, or try out some new ideas, until you're well and truly past your milestone. Ah, well.

Beta Test Now Open
In the interest of trying to keep the pipeline flowing, I've just released "The Picture Puzzle Game Without an Interesting Name" to a select group of Beta testers. Since I don't want anyone to miss out on the fun of seeing what you get when I'm forced to release something that I don't think is ready, I'll put a link here to Starchy Tuber's Secret Beta page, where you can sign up to Beta test my latest creation.

If you like puzzle games, or if you're just interested in seeing how the sausage is made, the Starchy Tuber Secret Beta program is the place to be!

I reserve the right to limit Beta signups to the first 100 applicants (ha! as if...).

Grr. I just found a typo on the Beta Sign Up instructions. I'll go fix that...

Easter Pictems - a marketing experiment


I'm trying an experiment. There's a free version of Pictems up on the App Store now, loaded with just the subset of items appropriate for Easter.

This version is called "Easter Pictems", appropriately enough, and you can get it here, if you're curious about Pictems, but didn't feel like ponying up the $2.99 to find out whether you liked it.

I'm hoping that folks will download the free version and like it enough to upgrade to the full version. This seems to be a common tactic among developers on the App Store. Of course, people have to find out about your free app if it's to be of any value as a marketing tool. I'll update this post if anything dramatic happens with sales.

In related news, product #2 is coming along nicely. It's a puzzle game, along the lines of the sliding-squares puzzles you might be familiar with, but with a twist (literally, in this case). For this game, the idea of Free and Pay versions makes a lot of sense, so I'm going to release both at the same time. Here's a preview of the (as yet unnamed) puzzle game:

Obsolescence is a pain in the neck...


I'm trying to clean out some of the unused/unloved technology around the house. An interesting case that I'm currently working on is Yvette's old laptop. She used this thing back in her college days, and it'd be nice to be able to get the data off it (for nostalgic purposes), then send it to the great computer graveyard.

It's an approximately 20-year old NEC DOS-based laptop, with a black-and-white LCD screen, and a massive 20 MB hard drive. It boots and seems to run just fine, a bit of a miracle in itself, but I haven't yet figured out how to get the data off of it.

You'd think that it'd be relatively easy to copy the data off this thing, but:

1. Accessing the floppy drive causes the computer to reboot.

2. Neither the serial port nor the modem are recognized by the communications software installed on the thing, so I can't transfer data that way.

3. This computer is old enough that those (and the printer port) are the only external I/O ports - there's no USB, no network port, and no wireless network ability.

I took the thing apart, and discovered that the hard drive in it is actually an IDE drive. Wow - that's almost a current-generation drive technology. I figured I could just get an adapter, and connect the old hard drive directly to a new system. Piece of cake, right? I've already got a Firewire-to-IDE external drive case, so it ought to be just a matter of hooking things up.

Not so fast. They do make a 44-pin to 40-pin adapter just for connecting laptop drives to a standard IDE connector, and I can connect that adapter to my Firewire-to-IDE external drive enclosure, and the drive spins up on power-up and everything. However, it isn't recognized properly. Apparently the Firewire-IDE adapter doesn't work correctly with this drive. If I had to take a guess, I'd guess that the adapter doesn't support IDE drives which don't do DMA transfers.

It's a bit frustrating to have a drive that I know is readable, and have no way to get the data off of it. I'll probably try another IDE bridge and see if it works with this drive, but if that's a bust, I may be in the market for an OLD PC that I can connect the drive to, copy the data off of it, and then recycle.

There may be a trip to Weird Stuff Warehouse in my near future...

Grr. Blogger hates me.


It won't even scale images correctly if I use the "upload image" tool. Oh, well. Click the image to see the full comic...

A New Kind of Science meets XKCD


400 pages down, 450 to go. Here's my impression so far, with a little help from xkcd:

Conspiracy Theories


String Theory

A New Kind Of Science


I'm currently struggling my way through Stephen Wolfram's book A New Kind Of Science. So far, I've made it to about page 200 or so (of 850, not including almost 350 pages of end-notes). I'm not going to review it until (unless?) I've gotten to the end, but so far, I'm not very impressed. This book is really frustrating to read.

For starters, the title of the book ends up getting repeated over and over in the text. It's fairly common when writing about new phenomena or new ideas to assign names to them, for purposes of shorthand if nothing else. But no - phrases like "a new kind of science", or "the new kind of science I've discovered", or "the new kind of science described in this book" appear over and over in the first few chapters. This is really hard to read, and gives the impression of really trying to "sell" the idea that there's some kind of radical new idea here, which, 1/4 of the way in, there is so far no sign of.

It's also really hard to read a book where the author seems to be taking personal credit for well-known results in computer science, without so much as a reference to the work other people have done in the area. There are some references in the end-notes, but the main text doesn't seem to make any kind of distinction between what's new, and what's well-known or borrowed. For someone who isn't familiar with the field, it'd be easy to get the impression that Wolfram invented everything here.

I expected that this book would be fascinating. I've been interested in Cellular Automata since the 80's, and some of the things people have been able to do with the Game Of Life, or the Wireworld CA are pretty amazing. So far, though, there's been a lot of build up for the "big discovery", and some fairly rough-shod introduction to CA theory, but I feel like I'm not making much progress towards any kind of goal.

Yesterday, in an attempt to see whether it's just me that's having a problem with this book, I did a search for reviews of the book. The results were not encouraging.

I'd really like to hear from anybody who has made it all the way through this book. In particular, I'd like to know if I should just skip ahead to the grand conclusion, or slog through the rest of the text.

Well, I'm getting better...


I updated my Blogger layout to the "new and improved" form of the old layout. I'm not sure how "improved" it is, but I ended up with a hierarchical archive, which makes it really easy to see how many blog posts I've had in any given month or year, over the history of the blog.

As I start my 4th year of blogging, I can see that the trend looks like this:
2005: 3 posts
2006: 12 posts
2007: 14 posts
2008: 25 posts

Last year was the first year that I managed to post at least one blog post a month. That's nowhere near where I thought I wanted to be, but at least I'm getting better at consistently writing. I think the writing has gotten easier for me, as well. I suspect that the quality hasn't gone up much (if at all), but I've effectively trained myself not to edit my posts to death, and I'm no longer taking months to get one paragraph just right for publishing.

So it's a qualified success. Onward and upward!

This week's iPhone SDK sob story


I have ranted about this before, I know, but I'm a little irritated. Every single time I update the iPhone tools, I run into some crazy issue building code that worked just fine on a previous version.

This week, after digging my office out from under all the mess from moving to a new house, I revisited one of my older projects (yes, Pictems is finally getting an update!). And I ran into not one, but two of these issues. That's not counting the usual Code Signing errors, which I don't even pay attention to - I just click randomly on the Code Signing options until they go away.

(For my friends on the XCode team: Yes, I will file bugs on these issues, once I figure out what's going on. This is not a bug report)

Issue #1: During some early experimentation, I had set the "Navigation Bar Hidden" property on one of my Nib files. It didn't seem to do what I wanted, but I didn't bother to change it back. At some point, a change was made such that it now works. Great, but apparently the change was actually made in one of the iPhone tools, so even if I build my old project, with the SDK set to 2.0, I still get the new behavior. Easy to fix, but it's weird to have to change my "archived" version of my source so it builds correctly with the current version of XCode. If I build my old project against the old SDK, I'd expect to get the old behavior.

Issue #2: One of my resource files has a $ character in the name. One of the XCode copy scripts apparently changed such that it's not escaping the filename correctly, so now the resource doesn't get copied. Amusingly, no error message results - the file just ain't there. Yes, it's dubious to name a file with a $ in the name. But, again, it used to work just fine.

Oh, well. In the bigger scheme of things, I still prefer XCode/iPhone to Eclipse/Android...

A math problem


Okay, so here's an example of where dropping out of math classes after Differential Equations is coming back to bite me a bit. I'm working on a kind of puzzle, mostly for fun, but possibly to incorporate into a future software product.

At its most basic, the solution process for the puzzle turns out to be solving a system of simultaneous equations. Which is something I learned how to do way back in High School, and for linear equations, I can even find off-the-shelf algorithms and libraries for doing so.

The catch, of course, is that these aren't linear equations. They use modular arithmetic, which is something I understand at a basic level, like anybody who programs for a living probably does, but I don't know where to even start breaking this down to solve a non-trivial version, and Google isn't helping me.

Let's start with a simple linear example:
5x + 7y + 4 = 0
4x + 3y + 1 = 0

Use whatever method you like, and you get:
x = 0.3846
y = -0.846

Piece of cake. Now, what if the equation looks like this?
5x + 7y + 4 = 0 (modulo 16)
4x + 3y + 1 = 0 (modulo 16)

If we want to find a few integer solutions for x and y, how do we find them? I could write a program to just guess every integer between 1 and 1,000,000 for each of the variables, and that'd find me a solution, but it doesn't scale well if I have a large number of variables. In the example equations given, there are rather a lot of solutions ([9,9],[9,25],[25,25]...), but I suspect that some other (carefully chosen?) sets of coefficients would have a much smaller set of solutions. Actually, that's kind of the point of the whole exercise.
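For what it's worth, the brute force is cheaper than it sounds for this particular example, because everything is taken modulo 16: a solution is determined by its residues mod 16, so you only ever need to try 0 through 15 for each variable. Here's a quick sketch of mine (still exponential in the number of variables, so not the real answer to the question):

// Brute-force search for
//   5x + 7y + 4 = 0 (mod 16)
//   4x + 3y + 1 = 0 (mod 16)
// Only the residues 0..15 matter, so 16 * 16 = 256 guesses cover every case;
// solutions like [9, 25] are just [9, 9] with multiples of 16 added.
var m = 16;
for (var x = 0; x < m; x++) {
  for (var y = 0; y < m; y++) {
    if ((5 * x + 7 * y + 4) % m === 0 && (4 * x + 3 * y + 1) % m === 0) {
      console.log("x = " + x + ", y = " + y); // prints x = 9, y = 9
    }
  }
}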

Anyone out there got some hints for me?
Googling "simultaneous modular equations" got me:
,both of which are interesting, but not quite what I'm looking for.

For the case where the modulus is 2, addition is equivalent to XOR, and so logic minimization techniques from EE can be used, but it's not clear to me how to move those up to work in a higher modulus.



A couple of posts ago, I said:
Hopefully backsliding on the Java thing doesn't mean I'm about to backslide
on the WoW thing - I can't afford the lost time. I've got to learn about how you
do things in Java again.

Today, I got an email from Blizzard, makers of World of Warcraft:
You've been summoned back to Azeroth! Your World of Warcraft®
account has been selected to receive 10 FREE days of game time and a FREE trial of The Burning Crusade® expansion pack.

Weird timing. On the other hand, 10 free days can't hurt, right? right?

Just In Time compilation vs. the desktop and embedded worlds


Okay, rant mode on. As I was waiting for Eclipse to launch again today, it occurred to me that one of the enduring mysteries of Java (and C#/.NET) for me is the continued dominance of just-in-time compilation as a runtime strategy for these languages, wherever they're found. We've all read the articles that claim that Java is "nearly as fast as C++", but we also all know that that's a bunch of hooey, particularly with regard to startup time. Of course, if Eclipse wasn't periodically crashing on me with out-of-memory errors, then I'd care less about the startup time - but that's another rant. Back to startup time and JIT compilation...

If you're creating a server-based application, the overhead of the JIT compiler is probably pretty nominal - the first time through the code, it's a little pokey, but after that, it's plenty fast, and you're likely throttled by network I/O or database performance, anyway. And in theory, the JIT compiler can make code that's optimal for your particular hardware, though in practice, device-specific optimizations are pretty minimal.

On the other hand, if you're writing a desktop application (or worse yet, a piece of embedded firmware), then startup time, and first-time through performance, matters. In many cases, it matters rather a lot.

There are a number of advantages to writing code in a "managed", garbage-collected language, especially for desktop applications - memory leaks are much reduced, you eliminate buffer overflows, and there is the whole Java standard library full of useful code that you don't have to write for yourself. I'm willing to put up with many of the disadvantages of Java to gain the productivity and safety advantages. But waiting for the Java interpreter to recompile the same application over and over offends me on some basic level.

On a recent project, we used every trick in the book to speed up our startup time, including a "faked" startup splash screen, lazy initialization of everything we could get away with, etc, etc. Despite all that effort (and unnecessary complication in the code base), startup time was still one of the most common complaints from users.

Quite a bit of profiling was done, and in our case, much of the startup time was taken up deep inside the JIT, where there was little we could do about it. Why oh why doesn't anybody make a Java (or .NET) implementation that keeps the safe runtime behavior, and implements a simple all-at-once compilation to high-performance native code? Maybe somebody does, but I haven't heard of them.

For that matter, why don't the reference implementations of these language runtimes just save all that carefully-compiled native code so they can skip all that effort the next time? The .NET framework even has a global cache for shared assemblies. Why those, at least, aren't pre-compiled during installation, I can't even imagine.

I was helpfully reminded of NGen, which will pre-compile .NET assemblies to native code. I had forgotten about that, since so much of my most recent C# work was hosted on Mono, which does things a bit differently. Mono has an option for AOT (ahead of time) compilation, which works, sort of, but could easily be the subject of another long article.



It's election day today in America. This is just a quick reminder for all my friends out there to get out and vote.

  • Even if "your guy" isn't going to win.
  • Even if some irresponsible news organization announces "winners" for your state, before the polls are even closed.
  • Even if "the election has already been decided" before your state's polls close.

There's a lot more going on than just elections for the Federal Government. Whether you're pro-growth or pro-environment, whether you want to support gigantic infrastructure programs in a time of depression, or if you just want to reduce the cost of parking at the airport, ensure that your voice is heard.

Returning to Java, after 10 years away


I'm once again writing Java code professionally, something that I haven't done in nearly 10 years (no, really - I had to stop and think it through because I didn't believe it, either). A couple of thoughts did occur to me, after I'd figured out the time frames involved.

I was a little taken aback by the very idea that Java is more than 10 years old. It just seems weird that a new programming language could go from introduction to being a major part of the world's IT infrastructure and college curriculums, in less time than I've been living here in California.

Java sure has evolved a lot in the last 10 years. There have been major changes to the language, the libraries, and the tools. I'd bet that some of my 10-year old Java code would throw deprecation warnings for nearly every line of code...

On the other hand, my final thought is along the lines of "Oh, my god. So much has changed, but Java is still irritating in nearly all the ways that made me crazy ten years ago! What have these people been up to for the last decade?"

Oh, and I was the first person I knew to "quit" Java, much like I was the first person to "quit" World of Warcraft. Hopefully backsliding on the Java thing doesn't mean I'm about to backslide on the WoW thing - I can't afford the lost time. I've got to learn about how you do things in Java again.

One good thing for my loyal readers (if any exist) is that I have a bunch of stored-up vitriol about Java that I can just uncork and pour out, so I should be updating more frequently.

So, what's it good for? (XO Laptop, part 3)


(I wrote this quite a while back, but was never really happy with how it turned out. Here it is, nevertheless.) See also Part 1 and Part 2.

Okay, so I've had a chance to play with the XO some more, and I've been thinking about how it might be useful for school kids in the developing world. You can read more about the project and their official justifications for it at the OLPC web site. But as a computer geek, and an early adopter of the personal computer in my own country (the USA), I thought it might be interesting to look at it from the perspective of my own experience.

History

I first encountered Personal Computers some time in the mid-to-late 1970's, when they started appearing, in small numbers, in schools, at my friends' houses, and in stores. The first time I sat down at a computer and typed in a BASIC program, I was totally hooked. I experimented with computers at other kids' houses, played with the systems on display at local stores, and even stayed after school and took summer classes at the local community college to get access to computers. For the next few Christmases and birthdays, when my parents asked me what I wanted, I only had one answer: "I want a computer!". Unfortunately, it wasn't until about 1982/3 that my parents could scrape together the nearly unimaginable sum of $500 or so to buy me my first computer - a Texas Instruments 99/4a. I loved that thing to death, and it was a major part of my life for several years. I would have been ecstatic if someone had come to me at age 9 or so and said "here is a computer of your very own, to use at school, and to take home with you at night".

I got my first computer-related job my Junior year in High School. In theory, I was hired to do simple assembly and software loading on PCs, but I very rapidly got into more and more programming on a regular basis. You could fairly say that I wouldn't be where I am today without that early access to computer technology.

I went to my 20-year High School reunion recently, and one of the things that struck me was the number of folks who were working in more-or-less high-tech fields, particularly computer software. For a bunch of middle-class midwestern folks, we did really well riding the tech wave. I think that having computers in our schools (and a mandatory computer literacy class in high school) was a major factor there.

Okay, back on track...

So, having access to computers at an early age changed my life, and led me to a highly-paid job in the technology industry. So what? That's probably not a reasonable goal for a poor kid in South America or Africa, there being no reasonable local high-tech industry for them to move into when they grow up (yet). But, as the OLPC folks put it, "this is an education project, not a laptop project". So it's not (just) about providing computer literacy, but improving the educational process overall.

One example of this is textbooks - textbooks are surprisingly expensive, and as a result, aren't readily available, or frequently updated, in developing countries. If every child has their laptop, then textbooks can be stored on them, greatly reducing year-to-year costs, allowing for more frequent updates, and freeing the students from carting a heavy load of books to and from school. Or take language instruction - if you live in a developing country, one of the best things you can do to improve your chances at a better career is to learn a ma[...]