
Geek Like Me, Too



A middle-aged software curmudgeon's rants, raves, gripes, and prophecies.



Updated: 2017-10-17T07:18:01.254-04:00

 



A Workaround for the MP3 Tagging Problem

2017-08-12T13:58:02.585-04:00

I have found a workaround for the problem I was having. To quickly summarize: I use Logic Pro 9 to export projects to WAVE files, and then I want to convert them to fully tagged MP3 files. I've been doing the conversion by hand using iTunes, and tagging in iTunes too. My projects tend to be in higher sample rates and bit depths -- for example, 24-bit, 96KHz. I would use Logic Pro to export the files to MP3 format, and tag them, but I have the following issues:

In Logic Pro 9, if I ask Logic to bounce this project to an MP3 file, it does a long series of conversions and the end result is that I get an MP3 file at 48KHz, which is not what I want. There does not seem to be an option to force it to use 44.1KHz.

In Logic Pro 9, you can also ask Logic to tag your file, but if you supply long fields, it will truncate them. I think this is because it supports a somewhat out-of-date version of the ID3 standard, while iTunes supports a later version which supports longer fields.

So why not just upgrade to Logic Pro X? Well, first, I'm not sure this would actually fix the problem. At the moment, money is a bit tight. And I'm recording on a 2009 "3,1" Mac Mini. It works great, although it is slow. I don't think installing Logic Pro X on this machine is likely to make my projects easier. Most likely, Logic Pro X eats considerably more memory and CPU and disc space than Logic Pro 9. So the plan is to make do, as much as possible, with what I have, until I can do some major upgrades, including replacing the computer.

Anyhow, to work around these issues with Logic Pro 9, I've been bouncing to a WAVE file, then bringing the 16-bit, 44.1KHz WAVE file into iTunes, creating an MP3 using iTunes, then tagging it by hand.

I wanted to see if I could do some of these steps on the command line, so I could do it in a script, a BBEdit worksheet, or even a Makefile. I'd like to automate that somewhat, not so much because I'm spending that much time producing podcasts, but because all these steps are error-prone. I'd also like to automate, at least partially, the generation of the entries in the podcast feed file. I'm always screwing up the time zone offset in dates, or forgetting to update the size of the file in bytes. It would be nice to have a script to do the grunt work, especially since I am often making versions to test, before I am happy enough with them to add them to the live podcast feed.

So I was trying to use LAME to do the encoding and tagging, but I discovered that iTunes would not import the "comment" field in the MP3 files created and tagged by LAME.

I asked Dan Benjamin on Twitter, and was happy that he replied, but he just wrote "Why not just bounce correctly from Logic?"

Maybe that works for him, but as I explained above, and in the link I sent him, it doesn't work for me, because I wind up with MP3 files encoded at 48KHz. I don't want to get too far into the weeds here, but I believe that making 48KHz MP3 files for podcasts is fairly pointless for most users, since they will need to be resampled on playback and resampling is lossy. For most listeners playing the files back on typical devices, 48KHz is a waste of storage space and will not provide a quality boost over a 44.1KHz file.

I also want to be able to use comment fields like this:

This work by Paul R. Potts is released under the Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License (http://creativecommons.org/licenses/by-nc-sa/3.0/). See http://generalpurposepodcast.blogspot.com for more information.

And not have them truncated.

In the Hydrogen Audio forum I got a couple of useful replies, but no solutions as such. The root of the problem seems to be that iTunes silently fails to import certain types of comment fields in MP3 tags. I don't want to go too far down the ID3 rabbit hole, but it seems like if the language specified for the comment field is "XXX," iTunes will not import it. When LAME writes an ID3v2 comment tag, it seems to set the language to either some unicode string, or, if the command-line switch --id3v2-latin1 is use[...]
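For reference, the shape of the LAME invocation I was experimenting with looks roughly like this -- a sketch, not a final answer, since the comment-field problem described above is exactly what this runs into. The filenames and field text are placeholders; the switches are standard LAME options:

lame -b 192 --resample 44.1 --id3v2-only --tt "Episode Title" --ta "Paul R. Potts" --tl "General Purpose Podcast" --ty 2017 --tc "This work by Paul R. Potts is released under the Creative Commons BY-NC-SA 3.0 license." bounce.wav episode.mp3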



Two Quick Technical Tidbits

2017-08-08T19:09:52.383-04:00

Trump is blathering about "fire, fury, and - frankly - power," and I'm trying to think about something less distressing, so I'm going to write briefly about two technical issues I've come across, in the hopes that Google places this within reach of someone else looking for a solution.

Creating a Shared Memory Block with Keil µVision and the Atmel SAM4E16E

In my work, I have created firmware for the Atmel SAM4E16E, a nifty chip with an ARM core. My toolchain is Keil µVision version 5.23.

What I'm trying to do is conceptually very simple. We have a bootloader image and an application image. When the bootloader executes, I want it to put some specialized values into a block of memory, at a fixed, known address. When the bootloader hands off to run the main application, I want the main application to read some information from that block of memory.

The challenge is not so much in writing the C code to handle this -- that's quite easy. The challenge is configuring your tools to get out of your way. In particular, you want the linker to set aside the memory in the right place, and the startup code to leave it alone (not zero it out). Documentation on this is a bit sparse and confusing. It took me quite a bit of trial and error to get it working. Along the way I discovered that the tools are both more poorly-documented and less robust than I hoped. But it did work, and here's how.

First, I created a common header file for describing the memory structure. It looks something like this:

typedef struct Bootloader_Shared_Memory_s
{
    uint32_t prefix;
    uint32_t version;
    uint32_t unused0;
    uint32_t unused1;
    uint32_t unused2;
    uint32_t unused3;
    uint32_t unused4;
    uint32_t unused5;
    uint32_t unused6;
    uint32_t unused7;
    uint32_t suffix;
} Bootloader_Shared_Memory_t;

extern Bootloader_Shared_Memory_t bootloader_shared_memory;

That just gives us a series of 32-bit words in memory. I want to set a prefix and suffix value to some special values that I will look for, to see if the shared memory block looks like it was configured as I expect. It is very unlikely that the prefix and suffix would have these values unless they were deliberately put there.

#define BOOTLOADER_SHARED_DATA_PREFIX ( 0x00ABACAB )
#define BOOTLOADER_SHARED_DATA_SUFFIX ( 0xDEFEEDED )

Then I just declare a function that will configure the shared memory, instead of specifying initial values in the definition. That's because I don't want the compiler to treat this block as part of its initialized memory (the ".data" section of memory). See: https://en.wikipedia.org/wiki/Data_segment

void Configure_Bootloader_Shared_Memory( void );

The bootloader code calls this function, which looks like this:

void Configure_Bootloader_Shared_Memory( void )
{
    bootloader_shared_memory.prefix = BOOTLOADER_SHARED_DATA_PREFIX;
    bootloader_shared_memory.version = ( ( BOOTLOADER_REVISION_HIGH << 16 ) |
                                         ( BOOTLOADER_REVISION_MIDDLE << 8 ) |
                                         ( BOOTLOADER_REVISION_LOW ) );
    bootloader_shared_memory.suffix = BOOTLOADER_SHARED_DATA_SUFFIX;
}

where BOOTLOADER_REVISION_HIGH etc. are #defines which specify the current version number.

Now, we've declared the data structure and made a function that operates on it. We need to define the data structure. I do this in a separate C file:

/* Highest possible base address (last RAM address is 0x2001FFFF), masked
   to zero off two low bits of address for 4-byte alignment */
#define BOOTLOADER_SHARED_MEMORY_BASE_ADDRESS \
    ( ( 0x20020000 - sizeof( Bootloader_Shared_Memory_t ) ) & 0xFFFFFFFC )

/* Note: comes out to 0x2001FFD4; the data structure is 0x2C bytes long.
   We will have to change the scatter file if our data structure changes. */
__attribute__((at(BOOTLOADER_SHARED_MEMORY_BASE_ADDRESS),zero_init))
Bootloader_Shared_Memory_t bootloader_shared_memory;

Note that non-standard attribute. It creates a separate memory area to be passed to the linker. It specifies the base address. The zero_init[...]
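For completeness, here is the kind of check the application side can do after the bootloader hands off -- a minimal sketch building on the declarations above; the function name is mine, not from the original project:

#include <stdbool.h>

/* Return true if the bootloader appears to have configured the shared
   block: both sentinel words must hold their expected values. */
bool Bootloader_Shared_Memory_Looks_Valid( void )
{
    return ( bootloader_shared_memory.prefix == BOOTLOADER_SHARED_DATA_PREFIX )
        && ( bootloader_shared_memory.suffix == BOOTLOADER_SHARED_DATA_SUFFIX );
}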



Fixed Point Math with AVR-GCC

2016-10-07T11:02:48.041-04:00

Wow, I see that it has been a long time since my last post. Sorry about that. I've been very busy. I have lots to talk about. I'd like to write about reading encoders, and I'd like to write about communicating with EEPROM chips that use two-wire protocols (I2C-like) as opposed to SPI-like protocols. But in the meantime I hope this short post will be useful to someone.

Embedded C

I recently had reason to do some non-integer math on a small microcontroller, a member of the Atmel ATtiny series. Floating point math on this chip is pretty much out of the question; there is no floating-point hardware. I think some of the chips in this family are big enough to hold floating-point library functions, but those functions will certainly eat up an enormous amount of the available program space, and given that these are eight-bit microcontrollers in most ways -- the registers are 8 bits wide -- it is probably best to just avoid floating point.

So I began looking into fixed-point math. It is always possible to roll your own code for this kind of thing, but I thought I would see if I could take advantage of existing, debugged library code first. I found some free software libraries online, but because I develop code that runs in commercial products, I was not really happy with their license terms. It also was not very clear how to use them or whether they would fit on the ATtiny chips.

I discovered that there is, in fact, a standard for fixed-point types in C. It has not been widely adopted, and like the C standard itself it is a little loose in parts, in that it doesn't dictate the numeric limits of types, but rather specifies a range of acceptable sizes. And it turns out that my toolchain supports this standard, at least in part.

I won't try to describe everything covered in the Embedded C document. I'll spare you my struggle trying to find adequate documentation for it, or to determine how to do certain things in an implementation that doesn't implement everything in the Embedded C document. Instead I will try to do something more modest, and just explain how I managed to use a couple of fixed-point types to solve my specific problems.

You can find more information on the Embedded C standard here: https://en.wikipedia.org/wiki/Embedded_C

The actual Embedded C standards document in PDF form can be found here (note: this is a link to a PDF file): http://www.open-std.org/JTC1/SC22/WG14/www/docs/n1169.pdf. At the time of this writing, this seems to be the latest version available, dated April 4, 2006. The document indicates a copyright, but unlike the C and C++ standards, it looks like you can download it for no cost, at least at present.

avr-gcc

The compiler I'm using is avr-gcc. My actual toolchain for this project is Atmel Studio version 7.0.1006. Atmel Studio is available for download at no cost. The avr-gcc toolchain that Atmel Studio uses under the hood is available in other toolchains and as source code. I'm not going to try to document all the ways you can get it, but you can find out more here: https://gcc.gnu.org/wiki/avr-gcc. As I understand it, these Embedded C extensions are not generally available across other versions of GCC.

The Basics of Fixed Point Types in Embedded C

I'm assuming I don't have to go into too much detail about what fixed-point math is. To put it briefly, fixed-point types are like signed or unsigned integral types except there is an implicit binary point (not a decimal point, a binary point). To the left of that binary point, the bits indicate ascending powers of two as usual: 1, 2, 4, 8, etc. To the right of that binary point, the bits indicate fractional powers of two: 1/2, 1/4, 1/8.

The Embedded C extensions for fixed-point math came about, I believe, at least originally because many microcontrollers and digital signal processors have hardware support for fixed-point math. I've used DSPs from Motorola and Texas Instruments that offered accumulators for fixed-point math in special wide sizes, such as 56 bits, and also offered saturation arithmetic. Using these registers from C require[...]
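To make the basics concrete, here is a tiny sketch of the sort of thing the stdfix.h types let you write. This is my own illustrative example, not code from my project, and it assumes your avr-gcc build supports the unsigned accum type (a 32-bit value with 16 fractional bits, as I understand the avr-gcc implementation):

#include <stdint.h>
#include <stdfix.h>

/* Scale an 8-bit ADC reading by a fractional gain with no floating point.
   The "uk" suffix marks an unsigned accum literal; 0.71875 is exactly
   representable, since it is 23/32. */
static uint8_t scale_adc_reading( uint8_t raw )
{
    unsigned accum gain   = 0.71875uk;   /* gain, held with 16 fractional bits */
    unsigned accum scaled = raw * gain;  /* integer * accum yields accum */
    return (uint8_t)scaled;              /* the cast truncates the fraction */
}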



SPI Communications with the Arduino Uno and M93C46 EEPROM: Easy, Fun, Relaxing

2016-02-26T14:21:43.826-05:00

When I write code for an embedded microprocessor, I frequently need to use communications protocols that allow the micro to communicate with other chips. Often there are peripherals built in to the micro that will handle the bulk of the work for me, freeing up micro clock cycles and allowing me to write fewer lines of code. Indeed, the bulk of modern microcontroller datasheets is usually devoted to explaining these peripherals. So, if you aren't trying to do anything unusual, your micro may have a peripheral that will do most of the work for you. There might be a pre-existing driver library you can use to drive the peripheral. But sometimes you don't have a peripheral, or it won't do just what you need it to do, for one reason or another. In that case, or if you just want to learn how the protocols work, you can probably seize control of the GPIO pins and implement the protocol yourself. That's what I will do in the example below. I will show you how to implement the SPI (Serial Peripheral Interface) protocol, for communicating with an EEPROM.

I've used SPI communication in a number of projects on a number of microcontrollers now. The basics are the same, but there are always issues to resolve. The SPI standard is entertaining, and keeps you on your toes, precisely because it is so non-standard; just about every vendor extends or varies the standard a bit.

The basics of SPI are pretty simple. There are four signals: chip select, clock, incoming data, and outgoing data. The protocol is asymmetrical; the microcontroller is usually the master, and other chips on the board are slaves -- although it would be possible for the micro to act as a slave, too. The asymmetry is because the master drives the chip select and clock. In a basic SPI setup, the slaves don't drive these signals; the slave only drives one data line. I'll be showing you how to implement the master's side of the conversation.

Chip select, sometimes known as slave select from the perspective of the slave chip, is a signal from the master to the slave chip. This signal cues the slave chip, informing the chip that it is now "on stage," ready for its close-up, and it should get ready to communicate. Whether the chip select is active high, or active low, varies. Chip select can sometimes be used for some extra signalling, but in the basic use case the micro sets the chip select to the logically active state, then after a short delay, starts the clock, runs the clock for a while as it sets and reads the data signals, stops the clock, waits a bit, and turns off the chip select. Here's a picture showing the relationship between clock and chip select, as generated by my code. Note that I have offset the two signals slightly in the vertical direction, so that it is easier to see them:

The clock signal is usually simple. The only common question is whether the clock is high when idle, or low when idle. Clock speeds can vary widely. Speeds of 2 to 10 MHz are common. Often you can clock a part much slower, though. CMOS parts can be clocked at an arbitrarily slow speed; you can even stop the clock in the middle of a transfer, and it will wait patiently. What is less simple is the number of clocks used in a transaction. That can become very complex. Some parts use consistent transfer lengths, where for each transaction, they expect the same number of clock cycles. Other parts might use different numbers of clock cycles for different types of commands.

From the perspective of the slave, the incoming data arrives on a pin that is often known, from the perspective of the microcontroller, as MOSI (master out, slave in). This is again a simple digital signal, but the exact way it is interpreted can vary. Essentially, one of the possible clock transitions tells the slave to read the data. For example, if the clock normally idles low, a rising clock edge might signal the slave to read the data. For reliability, it is very important that the master and slave are in agreement about which edge triggers the[...]
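To give a feel for the bit-banging approach before getting into the details, here is a minimal master-side transfer routine. This is a sketch assuming mode-0-style timing (clock idles low, the slave samples on the rising edge) and MSB-first ordering; the pin numbers and delays are placeholders I chose, not the wiring or timing from this project:

/* Clock one byte out on MOSI while clocking one byte in from MISO.
   The pins must already be configured with pinMode() in setup(), and
   the caller is responsible for asserting chip select first (active
   high or active low, depending on the part). */
const int PIN_SCK  = 13;
const int PIN_MISO = 12;
const int PIN_MOSI = 11;

uint8_t spiTransferByte( uint8_t out )
{
    uint8_t in = 0;
    for ( int bit = 7; bit >= 0; bit-- )
    {
        digitalWrite( PIN_MOSI, ( out >> bit ) & 1 ); /* present next data bit, MSB first */
        delayMicroseconds( 5 );                       /* settle time before the clock edge */
        digitalWrite( PIN_SCK, HIGH );                /* rising edge: slave samples MOSI */
        in = ( in << 1 ) | digitalRead( PIN_MISO );   /* sample the slave's data bit */
        delayMicroseconds( 5 );
        digitalWrite( PIN_SCK, LOW );                 /* falling edge: slave shifts next bit */
    }
    return in;
}

Because every delay is explicit, you can slow this down arbitrarily while debugging with a logic analyzer, which is one of the nice properties of bit-banging CMOS parts mentioned above.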



Star Wars: The Force Awakens

2016-01-03T12:38:44.729-05:00

This review contains many spoilers.

I want to start out by saying that I really was expecting, even hoping, to dislike The Force Awakens. Entering the theater a cynical, somewhat bitter middle-aged man, I fully expected to be able to take my distaste for the other work of J. J. Abrams (particularly, his atrocious 2009 Star Trek reboot), and Disney, and recycled nostalgia in general, and throw it directly at the screen. I am an original fan of Star Wars -- I saw the first one perhaps a dozen times in the theater -- and I pretty much agree with the critical consensus about the prequels. Their utter failure led me to believe that the things I loved most about Episode IV had, for the most part, little to do with big-budget filmmaking, but were the result of giving a bunch of really brilliant costume and set designers and cinematographers and editors and sound designers a lot of creative control and a relatively low budget -- a situation unlikely to be replicated in a truly big film, an important investment where none of the investing parties would want to take any significant risks.

I was wrong, and I'm still somewhat troubled by that. Is The Force Awakens a good movie, or was I just primed by my age and the bad experience with the prequels to suck up something relatively bad and call it good, simply because it lacks the awfulness of the prequels, and smells a lot like the 1977 original? I don't think I can actually answer that question definitively, at least not easily, because really taking that up requires me to think critically about the original 1977 Star Wars, something I find hard to do, given the way the film imprinted itself upon my nine-year-old brain. Is it really all that and a bag of chips? Or did it just land at the right time to be the formative movie of my childhood?

One of my sons is nine, by the way. He enjoyed the new movie, but I don't think it blew his mind the way the original Star Wars blew mine, simply because we have, ever since 1977, lived in the era that had Star Wars in it. To be clear -- it's not the case that there weren't big action movies back then, and big science fiction movies back then. We had movies like 2001: A Space Odyssey, which also formed my tastes. We had Silent Running. We had Logan's Run. But it would be impossible to overstate the shock wave of Star Wars -- the innovative effects, editing, and yes, even marketing. We just can't go back to that world. He's seen a lot of things that are Star Wars-ish, while in 1977, I never had.

And make no mistake, the new Star Wars is, most definitely, Star Wars-ish, in the way that the prequels were not. The world of the prequels was too clean, too sterile, too political, and too comic. Star Wars may have been the single most successful blending of genres ever attempted; a recent article called it "postmodern," and I think that is correct. The prequels might have been attempts at postmodern, too, but they seem to have a different set of influences, and just seem, in every respect, to have been assembled lazily, and without artfulness. For just one example, see how one of the prequel lightsaber battle scenes was actually filmed.

The Force Awakens follows the 1977 formula so closely that it is perilously close to coming across as a kind of remake or pastiche of the original. But it is not that. It is actually an homage to the original. There are a lot of parallel details and actual "easter eggs," where props make cameos, and audio clips from the original movies are sprinkled into the new one. In one of my favorite moments, on Starkiller Base we hear a clip from the first movie: "we think they may be splitting up." Some reviewers have made their reviews catalogs of these moments, and consider this excessive, complaining about the "nostalgia overload." But although it is noticeable, I think the producers knew just how much nostalgia would be appreciated, and how much would become annoying, and walked that line very well. The film re-creates the world [...]



Working with a Thermistor: a C Programming Example

2016-06-15T15:16:22.295-04:00

Recently I worked on a project that needed to monitor temperature using a thermistor. A thermistor is a resistor that measures temperature: the resistance changes depending on how hot it is. They are used in all kinds of electronic devices to monitor temperature and keep components from overheating. I searched for a good, simple worked-out example for how to get an accurate reading from the thermistor, but had trouble finding something readable. I am not actually an electrical engineer and have never studied the math behind thermistors formally, but I was able to adapt the formulas from some existing sources, with a little help. I am sharing this example in the hope that it might be useful to someone trying to solve a similar problem. Please note the way I have adapted the general thermistor math to our particular part and circuit; unless you have an identical part and measurement circuit, you will probably not be able to use this example exactly "as-is."

As I understand it, most modern thermistor components are "NTC," which means that they have a "negative temperature coefficient," meaning that their resistance has an inverse relationship to temperature: higher temperature, lower resistance. Thermistors have a highly non-linear response, and are usually characterized by the Steinhart-Hart equation. This is a general equation that can be parameterized to model the response curve associated with a specific thermistor device. The original form of the equation takes three coefficients, A, B, and C, and describes the relationship between thermistor resistance and temperature in degrees Kelvin (K). It turns out that the three-coefficient form is overkill for a lot of parts, and their response curve can be characterized accurately with a single parameter, using a simplified version of the equation. This single parameter is called "beta," and so the equation can be called the Beta Parameter Equation.

Reading a thermistor is complicated by the fact that in a typical application we are first using resistance as a measurement of, or proxy for, temperature; that's the basic thing a thermistor does. But in a circuit we don't read resistance directly; instead, we would typically read voltage as a measure of, or proxy for, resistance. To read the resistance from a thermistor we treat it like we would treat a variable resistor, aka potentiometer. We use a voltage divider. This consists of two resistors in series. In our case we place the thermistor after a fixed resistor, and tap the voltage in between. This goes to an ADC -- an analog-to-digital converter. I'm going to assume that you already have a reasonably accurate ADC and working code to take a reading from it.

So now I'm going to describe how I took the general thermistor math and adapted it for a specific part and circuit. Our specific thermistor is a Murata NCP18XH103F03RB, so you can Google the vendor and part number and find a datasheet. You need to find out a few things from the datasheet, specifically the nominal resistance at the reference temperature, which is usually 25 degrees Celsius, or 298.15K (or, if it is not, note the actual reference temperature). Also, the datasheet should specify the beta value for your part; in our case, it is 3380.

The beta parameter equation, solved for resistance, reads:

Rt = R0 * e^( -B * ( 1 / T0 - 1 / T ) )

where Rt is the resistance at temperature T, e is the mathematical constant e, B is beta, T0 is the reference temperature in K, and T is the measured temperature in degrees Kelvin. We want temperature given resistance, so we can solve it for temperature, like so:

T = B / ln( R / ( R0 * e^( -B / T0 ) ) )

Plugging in our R0 = 10,000 ohms, B = 3380, and T0 = 298.15 K, we get:

Rt = 10000 * e^( -3380 * ( 1 / 298.15 - 1 / T ) )

or

T = 3380 / ln( R / ( 10000 * e^( -3380 / 298.15 ) ) )

Now, we need to have something to plug in for R, given the fact that we're reading a voltage from a voltage divider. In our c[...]
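Here is the whole computation gathered into one place -- a minimal sketch in C. The beta, R0, and T0 constants come from the discussion above; the 10K fixed resistor on the top of the divider, the 10-bit ADC width, and the function name are my own assumptions, so adjust them for your circuit:

#include <math.h>
#include <stdint.h>

#define THERM_BETA 3380.0
#define THERM_R0   10000.0   /* ohms at the reference temperature */
#define THERM_T0   298.15    /* reference temperature, in K (25 C) */
#define FIXED_R    10000.0   /* ohms, top of the divider -- an assumption */
#define ADC_MAX    1023.0    /* 10-bit ADC -- an assumption */

/* Convert a raw ADC reading to degrees Celsius. The reading must be
   strictly between 0 and ADC_MAX, or the math below divides by zero. */
double thermistor_celsius( uint16_t adc )
{
    /* Vout/Vcc = Rt / (FIXED_R + Rt), so Rt = FIXED_R * adc / (ADC_MAX - adc) */
    double rt = FIXED_R * (double)adc / ( ADC_MAX - (double)adc );

    /* Beta equation solved for temperature, as derived above:
       T = B / ln( R / ( R0 * e^( -B / T0 ) ) ) = B / ( ln(R/R0) + B/T0 ) */
    double t_kelvin = THERM_BETA / ( log( rt / THERM_R0 ) + THERM_BETA / THERM_T0 );

    return t_kelvin - 273.15;
}

On a part with no floating-point hardware you would recast this in fixed point or use a lookup table, but the float version is the clearest way to check the math against the datasheet's resistance table.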



A Deep Dive: the Velocity Manufacturing Simulation

2015-02-05T05:18:24.143-05:00

In 1989 I graduated from the College of Wooster and then spent a year as an intern with Academic Computing Services there, writing newsletters and little software tools. In the summer of 1990 I moved to Ann Arbor, without a clear idea what I was going to do next. I worked for a short while with the Department of Anthropology, but by the end of 1990, I had found a job with the Office of Instructional Technology.

OIT was sort of the University's answer to the MIT Media Lab. It was an organization where instructional designers, programmers, and faculty members could work together on projects to bring technology into classrooms. It was a pretty remarkable workplace, and although it is long gone, I am truly grateful for the varied experiences I had there. It was the early days of computer multimedia, a sort of wild west of platforms and tools, and I learned a lot.

In January of 1993 my girlfriend and her parents visited my two workplaces, OIT headquarters and the Instructional Technology Lab, a site in the Chemistry building. I handed my girlfriend a video camera and proceeded to give a very boring little talk to her, and her extremely patient parents. Wow, I was a geek. I'd like to think my social skills and ability to make eye contact are a lot better now, but I probably haven't changed as much as I imagine that I have. I'm an extraverted geek now: when I am having a conversation with you, I can stare at your shoes.

I have carried the original analog Hi-8 videocassette around through many moves, and life changes, and only today figured out a good way to get it into my computer -- after giving the camcorder heads a very thorough cleaning. I thought the tape was pretty much a lost cause, and was going to try working with my last-ditch backup, a dub to VHS tape, but I'm pleased to learn that the video is still playable, and pleased that I could finally get this made, such as it is. (The video is on YouTube here: http://www.youtube.com/embed/1MxN364YayU)

This project, the Velocity Manufacturing Simulation, was written in Visual BASIC, long before it became VB.NET. I remember that it involved a fair amount of code, although I don't have the source to look at. I remember painstakingly writing code for GUI elements like the animated disclosure triangles. There was some kind of custom controls library we bought separately; the details escape me. There was some kind of ODBC (maybe?) database plug-in that I can barely recall; I think Pete did most of the work on that part. Pete wrote parts of it, and I wrote parts of it. Now it seems almost laughably primitive, but you'll just have to take my word for it that back in the day it seemed pretty cool. It won an award. As far as I know, this is the only video footage of the project.

The code is 147 years old in Internet years. It was almost half my lifetime ago. But at the same time it seems like I just left that office, and somehow, if I could figure out where it was, I could still go back and find everyone there in the conference room having lunch, and after lunch settle back into my old office with the vintage, antique computers.

This was only one of several projects I worked on while I worked at OIT. I have some other bits of video for a few of them, but not all. I will get clips up for at least one more. I wish there was more tape, and better tape, even if the only one nostalgic about these projects is me.

Perhaps "enjoy" is the wrong word, but take a moment to remember what instructional multimedia was like, a few months before a group called NCSA released a program called Mosaic and the world started to hear about this exciting new thing called the World Wide Web... but grandpa's tired, kids, and that's a story for a different day.[...]



Apple Breaks Apache Configurations for Gitit (Again)

2013-11-14T09:08:19.002-05:00

I'm not quite sure why I put myself through this, but I upgraded my Mac Pro to Mavericks. This broke my local Gitit wiki. The symptom was that Apache was unable to start, although nothing would be written in the error logs. To determine what was wrong I used sudo apachectl -t. The installer did preserve my httpd.conf, but wiped out the library mod_proxy_html.so that I had installed in /usr/libexec/apache2. See the old entry that I wrote back when I fixed this for Mountain Lion.

I installed XCode 5 and I thought I was set, but there is more breakage. You might need to run xcode-select --install to get headers in /usr/include. The makefile /usr/share/httpd/build/config_vars.mk is still broken in Mavericks, so commands like sudo apxs -ci -I /usr/include/libxml2 mod_xml2enc.c won't work.

To make a long story short, I got the latest (development) version of the mod_proxy_html source, and these commands worked for me:

sudo /usr/share/apr-1/build-1/libtool --silent --mode=compile --tag=CC /usr/bin/cc -DDARWIN -DSIGPROCMASK_SETS_THREAD_MASK -I/usr/local/include -I/usr/include/apache2 -I/usr/include/apr-1 -I/usr/include/libxml2 -I. -c -o mod_xml2enc.lo mod_xml2enc.c && sudo touch mod_xml2enc.slo

and

sudo /usr/share/apr-1/build-1/libtool --silent --mode=compile --tag=CC /usr/bin/cc -DDARWIN -DSIGPROCMASK_SETS_THREAD_MASK -I/usr/local/include -I/usr/include/apache2 -I/usr/include/apr-1 -I/usr/include/libxml2 -I. -c -o mod_proxy_html.lo mod_proxy_html.c && sudo touch mod_proxy_html.slo

Previously, this gave me .so files in the generated .libs directory, but now I just have .o files and I'm not sure that's what I want.
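If I remember my libtool correctly, compile mode only produces the objects; it is a separate link step that populates the .libs directory with the shared object. Something along these lines might finish the job -- a sketch I have not verified on Mavericks, and the -rpath value is an assumption:

sudo /usr/share/apr-1/build-1/libtool --silent --mode=link --tag=CC /usr/bin/cc -o mod_proxy_html.la -rpath /usr/libexec/apache2 -module -avoid-version mod_proxy_html.lo mod_xml2enc.lo

followed by manually copying the resulting .libs/mod_proxy_html.so into /usr/libexec/apache2, since apxs -i is broken as noted above.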




More Crappy Print-on-Demand Books -- for Shame, Addison-Wesley "Professional"

2013-08-11T15:18:12.079-04:00

So, a while back I wrote about some print-on-demand editions that didn't live up to my expectations, particularly in the area of print quality -- these Tor print-on-demand editions.

Now, I've come across one that is even worse. A few days ago I ordered a book from Amazon called Imperfect C++ by Matthew Wilson -- it's useful, thought-provoking material. Like the famous UNIX-Haters Handbook, it's written for people with a love-hate relationship with the language -- that is, those who have to use it, and who desperately want to get the best possible outcomes from using it, writing code that is as solid and portable as possible, and working around the language's many weaknesses. (People who haven't used other languages may not even be aware that something better is possible, and may assume that complaints about the language are just sour grapes; I'm not really talking to those people.)

The universe sometimes insists on irony. My first copy of Imperfect C++ arrived very poorly glued; the pages began falling out as soon as I opened the cover and began to read. And I am not hard on books -- I take excellent care of them.

So I got online and arranged to return this copy to Amazon. They cross-shipped me a replacement. The replacement is even worse:

Not only are the pages falling out, because they were not properly glued, but the back of the book had a big crease:

So I guess I'll have to return both.

I'll look into finding an older used copy that wasn't print-on-demand. But then of course the author won't get any money.

Amazon, and Addison-Wesley, this is shameful. This book costs $50, even with an Amazon discount. I will be sending a note to the author. I'm not sure there is much he can do, but readers should not tolerate garbage like this. Amazon, and Addison-Wesley, fix this! As Amazon approaches total market dominance, I'm reminded of the old Saturday Night Live parody of Bell Telephone: "We don't care. We don't have to. We're the Book Company."




Arduino, Day 1

2013-08-11T22:28:48.976-04:00

A friend of mine sent me a RedBoard and asked me to collaborate with him on a development idea. So I'm playing with an Arduino-compatible device for the first time. I've been aware of them, but just never got one, in part because after writing embedded code all day, what I've wanted to do with my time off is not necessarily write more embedded code.

I downloaded the Arduino IDE and checked that out a bit. There are some things about the way it's presented that drive me a little batty. The language is C++, but Arduino calls it the "Arduino Programming Language" -- it even has its own language reference page. Down at the bottom, the fine print says "The Arduino language is based on C/C++." That repels me. First, it seems to give the Arduino team credit for creating something that they really haven't. They deserve plenty of credit -- not least for building a very useful library -- but not for inventing a programming language. Second, it fails to give credit (and blame) for the language to the large number of people who actually designed and implemented C, C++, and the GCC cross-compiler running behind the scenes, with its reduced standard libraries and all. And third, it obfuscates what programmers are learning -- especially the distinction between a language and a library. That might keep things simpler for beginners, but this is supposed to be a teaching tool, isn't it? I don't think it's a good idea to obfuscate the difference between the core language (for example, bitwise and arithmetic operators), macros (like min), and functions in the standard Arduino library. For one thing, errors in using each of these will result in profoundly different kinds of diagnostic messages or other failure modes. It also obfuscates something important -- which C++ is this? Because C++ has many variations now. Can I use enum classes or other C++11 features? I don't know, and because of the facade that Arduino is a distinct language, it is harder to find out.

They even have the gall to list true and false as constants. If there's one thing C and C++ programmers know, and beginners need to learn quickly, it's that logical truth in C and C++ is messy. I would hate to have to explain to a beginner why testing a masked bit that is not equal to one against true does not give the expected result. (There's a small example of exactly that trap after the makefile below.)

Anyway, all that aside, this is C++ where the IDE does a few hidden things for you when you compile your code. It inserts a standard header, Arduino.h. It links you to a standard main(). I guess that's all helpful. But finally, it generates prototypes for your functions. That implies a parsing stage, via a separate tool that is not a C++ compiler.

On my Mac Pro running Mountain Lion, the board was not recognized as a serial device at all, so I had to give up using my Mac, at least until I can resolve that. I switched over to Ubuntu 12.04 on a ThinkPad laptop. The IDE works flawlessly. I tried to follow some directions to see where the code was actually built by engaging a verbose mode for compilation and uploading, but I couldn't get that working. So I ditched the IDE. This was fairly easy, with the caveat that there are a bunch of outdated tools out there. I went down some dead ends and rabbit holes, but the procedure is really not hard. I used sudo apt-get install to install arduino-core and arduino-mk. There is now a common Arduino.mk makefile in my /usr/share/arduino directory and I can make project folders with makefiles that refer to it. To make this work I had to add a new export to my .bashrc file, export ARDUINO_DIR=/usr/share/arduino (your mileage may vary depending on how your Linux version works, but that's where I define additional environment variables). The Makefile in my project directory has the following in it:

BOARD_TAG = uno
ARDUINO_PORT = /dev/serial/by-id/usb-*
include /usr/share/arduino/Ard[...]
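And here is that true/false trap, as a minimal sketch of my own:

#include <stdint.h>
#include <stdbool.h>

/* With flags = 0x04, this returns false: (flags & 0x04) evaluates to 4,
   while true converts to 1, and 4 == 1 is false. */
static bool bit_check_wrong( uint8_t flags )
{
    return ( flags & 0x04 ) == true;
}

/* Test the masked value for non-zero instead; this returns true. */
static bool bit_check_right( uint8_t flags )
{
    return ( flags & 0x04 ) != 0;
}

This is the sort of thing a beginner will hit almost immediately if a reference page presents true as just another constant.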



Lexx is Wretched

2013-08-20T14:44:35.667-04:00

I have a fondness for science fiction series that are imaginative but not, as a whole, successful. Farscape, I'm talking about you. Even, occasionally, those that start out promising, but which turn into complete failures -- failure can occasionally be interesting. At least, it serves as an object lesson for how a story line can go so very far wrong. Andromeda, I've got your number. I can deal with very dated CGI -- Babylon Five is still generally good and often great.

So I happened to come across discounted boxed sets of Lexx, the whole series, at my local Target store. They were dirt cheap. "How bad could it be?" I thought. Well, now I know. At least, I know part of the story.

First off, Lexx is not something I can show my kids -- pretty much at all. Season 1 has a surprising amount of very fake gore in it -- brains and guts flying everywhere. That didn't really bother them -- I think they got that the brains were made of gelatin -- but it was getting to me. Watching characters carved up by rotating blades, repeatedly; watching characters getting their brains removed -- that got old. Body horror, body transformation -- pretty standard stuff for B-grade science fiction, or anything that partakes of the tropes of such, but not actually kid-friendly. So we didn't continue showing the kids. Still, I thought it might make more sense to watch them in order, so I watched the second two-hour movie (1:38 without commercials). The second one has full frontal nudity, which startled me a bit. I'm not really opposed to looking at a nubile young woman, per se.

There is some imaginative world-building and character creation here, but ultimately it's just incredibly boring. It's like the producers shot the material not having any idea how long the finished product would be; they shot enough scenes to actually power an hour show (forty-plus minutes without commercials), but also shot a bunch of extended padding sequences, "just in case." And so after a repeated intro that lasts just under four minutes, we get a two-hour show with endless cuts to spinning blades slowly approaching female groins, huge needles slowly approaching male groins, countdown timers counting down, getting stopped, getting started, getting stopped... endless fight scenes, endless scenes of the robot head blathering his love poetry, a ridiculous new character eating fistfuls of brains... et cetera, et cetera, et cetera. Every time something happens, I'd get my hopes up, thinking that maybe the writing has actually improved, but then it's time to slow down the show again, because we've still got an extra hour and twenty minutes to pad.

And it's all distressingly sexist and grotesquely homophobic. Again, I'd be lying if I said that I didn't like to look at Eva Habermann in a miniskirt, but given that the actress is actually young enough to be my daughter, and especially given that she has so little interesting to do, and there's just not much character in her character -- it's -- well, "gratuitous" doesn't even begin to cover it. She's young, but Brian Downey was old enough to know better. And let's just say I'm a little disgusted with the choices the show's producers made.

The guest stars in Season 1 are like a who-used-to-be-who of B actors -- Tim Curry, Rutger Hauer, Malcolm McDowell. There's material here for a great cult show -- but these episodes are mostly just tedious. They're actually not good enough to be cult classics.

The season consists of four two-hour movies. After watching the first movie, I didn't quite realize all four season one movies were on one disc, so when I tried to watch some more, I put in the first disc of season two by mistake. I watched the first few episodes of season two -- these are shorter. I didn't notice any actual continuity issues. In other words, nothing sign[...]



The Situation (Day 135)

2013-07-24T14:46:33.360-04:00

So, it's day 135. This is either the last covered week (week 20) of unemployment benefits, or I have three more; I'm not quite sure. Without a new source of income, we will run out of money to cover mortgage payments either at the end of September or the end of October. We have burned through the money I withdrew from my 401K in March when I was laid off. I've been selling some possessions, guitars and music gear, but this is demoralizing, and not sustainable. We don't have much more that is worth selling. I was fortunate to have a 401K to cash out, and to get the food and unemployment benefits I've gotten -- so far I have been able to pay every bill on time and my credit rating is, so far, completely unscathed. But winter is coming. And another son is coming -- Benjamin Merry Potts, most likely around the middle of October.

Emotionally, the situation is very confusing. On the one hand, I have several very promising job prospects, and I'm getting second phone interviews. But these are primarily for jobs where I'd have to relocate, and a small number of possible jobs that might allow me to work from home. This includes positions in Manhattan and Maine. We're coming to grips with the fact that we will most likely have to leave Saginaw. It's a well-worn path out of Saginaw. We were hoping to stick with the road less traveled, but we can't fight economic reality single-handed. And we don't really have any interest in relocating within Michigan, again. If we're going to have to move, let's move somewhere where we won't have to move again -- someplace where, if I lose one job, there's a good chance I can quickly find another.

So, we are willing to relocate, for the right job in the right place. The right place would be the New England area -- Grace is fed up here, and I am too. Maine, Vermont, New Hampshire, Massachusetts, Connecticut, New York, or eastern Pennsylvania are all appealing. But it would not be a quick and easy process. It would probably involve a long separation from my family. I don't relish that idea, especially if my wife has a new baby. That might be what it takes, though. I'll do it for the right job and the right salary and the right place. In any case, we can't move with either a very pregnant woman or a newborn. It would not be a quick and easy process to sell, or even rent out, a house.

A benefit to a permanent job in Manhattan is that it would pay a wage that is scaled for the cost of living there. It might be perfectly doable for me to find as cheap a living arrangement there as I can, work there, and send money home. A Manhattan salary would go a long way towards maintaining a household in Michigan, and helping us figure out how to relocate, and I'd probably be able to fly home fairly frequently.

I would consider a short-term remote contract job where I wasn't an employee, and didn't get benefits, and earned just an hourly wage. Let's say it was a four-hour drive away. I'd consider living away from home during the work week, staying in an extended-stay motel, and driving home on weekends. But it would have to pay well enough to be able to do that commute, pay for that hotel, and be able to send money home -- enough to pay the mortgage and bills. A per diem would help, but the contract work like this I've seen won't cover a per diem. We'd need to maintain two cars instead of one. Grace would need to hire some people for housekeeping and child care help. I wouldn't be there to spend the time I normally spend doing basic household chores and helping to take care of the kids.

Would I consider a contract job like that farther away -- for example, an hourly job in California? That's tougher. I think I could tolerate seeing my wife and kids only on weekends, if I knew that situation would not continue indefinite[...]



Building a Podcast Feed File, for Beginners

2013-07-24T13:18:08.383-04:00

I had a question about how to set up a podcast. I wrote this answer and thought while I was at it, I might as well polish up the answer just a bit and post it, in case it would be helpful to anyone else.

"I'm starting a podcast and I need help creating an RSS feed. You're the only person I could think of that might know how to create such a thing. Is there any way you could help me?"

OK, I am not an expert on podcasts in general because I've only ever created mine. I set mine up by hand. I'll tell you how I do that and then you can try it that way if you want. You might prefer to use a web site that does the technical parts for you.

A podcast just consists of audio files that can be downloaded, and the feed file. I write my feed files by hand. I just have a hosting site at DreamHost that gives me FTP access, and I upload audio files to a directory that is under the root of one of my hosted web site directories. For example: http://thepottshouse.org/pottscasts/gpp/

The feed file I use, I write with a text editor. I use BBEdit, which is a fantastic text editor for the Macintosh that I've used for over 20 years, but any text editor will do. For the General Purpose Podcast, this is the feed file: http://thepottshouse.org/pottscasts/gpp/index.xml

The feed file contains information about the podcast feed as a whole, and then a series of entries, one for each episode (in my case, each audio file, although they don't strictly have to be audio files; you can use video files). When I add an audio file, I just add a new entry that describes the new audio file.

This is a slight simplification. I actually use a separate "staging" file for testing before I add entries to the main podcast feed. The staging file contains the last few episodes, and I have a separate subscription in iTunes to the "staging" podcast for testing purposes. When I upload a new episode MP3 file, I test it by adding an entry to the staging index file here: http://thepottshouse.org/pottscasts/gpp/index_staging.xml

So I add an entry to test, and then tell iTunes to update the staging podcast. If it works OK and finds a new episode, downloads it, and it comes out to the right length, and the tags look OK, then I add the same entry to the main index file.

I have a blog for the podcast too. That's a separate thing on Blogger, here: http://generalpurposepodcast.blogspot.com That just provides a jumping-off point to get to the episodes, and something I can post on Facebook or Twitter. For each episode I just make a new blog post and write a description and then include a link to the particular MP3 file. The blog sidebar also has links to the feeds and to the iTunes store page for the podcast. I'll get to the iTunes store in a minute.

Oh, writing the entry in the feed file is kind of a pain. You have to specify a date, and it has to be formatted correctly, and it has to have the right GMT offset, which changes with daylight saving time. You have to specify the exact number of bytes in the file and the length in hours, minutes, and seconds. If you get these wrong the file will not be downloaded correctly -- it will be cut off. The URL needs to be URL-escaped, for example spaces become %20, etc. If I upload the file to my hosting site first, so that I can see the file in my web browser, and copy the link, it comes out URL-escaped for me, so that part is easy. I paste that link to the file into the feed file entry for the episode. The entry gets a link to the file, and then there is also a UID (a unique ID for the episode). Personally, I use the same thing for both the UID and the link, but they can be different. The UID is how iTunes (or some other podcast reader) decides, when it reads your feed file, whether it has downloaded that file al[...]
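To make that concrete, here is roughly what one episode entry looks like -- a hand-written sketch following the standard RSS 2.0 enclosure conventions, where the URL, byte count, date, and duration are made-up placeholders (the itunes:duration element also requires declaring the iTunes namespace at the top of the feed):

<item>
  <title>Episode 42: Placeholder Title</title>
  <link>http://thepottshouse.org/pottscasts/gpp/Episode%2042.mp3</link>
  <guid>http://thepottshouse.org/pottscasts/gpp/Episode%2042.mp3</guid>
  <pubDate>Wed, 24 Jul 2013 13:00:00 -0400</pubDate>
  <description>A short description of the episode.</description>
  <enclosure url="http://thepottshouse.org/pottscasts/gpp/Episode%2042.mp3" length="12345678" type="audio/mpeg"/>
  <itunes:duration>25:17</itunes:duration>
</item>

Note the three things podcast readers are pickiest about, all mentioned above: the date format with its GMT offset, the length attribute in bytes, and the unique guid.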



Building Repast HPC on Mountain Lion

2013-07-08T01:03:48.533-04:00

For a possible small consulting project, I've built Repast HPC on Mountain Lion and I'm making notes available here, since the build was not simple.

First, I needed the hdf5 library. I used hdf5-1.8.11 from the .tar.gz. This has to be built using ./configure --prefix=/usr/local/ (or somewhere else if you are doing something different to manage user-built programs). I was then able to run sudo make, sudo make check, sudo make install, and sudo make check-install and that all seemed to work fine (although the tests take quite a while, even on my 8-core Mac Pro).

Next, I needed to install netcdf. I went down a versioning rabbit hole for a number of hours with 4.3.0... I was _not_ able to get it to work! Use 4.2.1.1: ./configure --prefix=/usr/local, make, make check, sudo make install.

Next, the netcdf-cxx, the C++ version. I used netcdf-cxx-4.2 -- NOT netcdf-cxx4-4.2 -- with ./configure --prefix=/usr/local/

Similarly, boost 1.54 had all kinds of problems. I had to use boost 1.48: ./bootstrap.sh --prefix=/usr/local and sudo ./b2 ... the build process is extremely time-consuming, and I had to manually install both the boost headers and the compiled libraries.

Next, openmpi 1.6.0 -- NOT 1.6.5. ./configure --prefix=/usr/local/ seemed to go OK, although it seems to run recursively on sub-projects, so it takes a long time, and creates hundreds of makefiles. Wow. Then sudo make install... so much stuff. My 8 cores are not really that much help, and don't seem to be busy enough. Maybe an SSD would help keep them stuffed. Where's my 2013 Mac Pro "space heater" edition, with a terabyte SSD? (Maybe when I get some income again...)

Finally, ./configure --prefix=/usr/local/ in repasthpc-1.0.1, and make succeeded, after about 4 hours of messing around with broken builds. I had a lot of build issues with individual components, and final problems with Repast HPC itself despite everything else building successfully, before I finally found this e-mail message chain that had some details about the API changes between different versions, and laid out a workable set of libraries: http://repast.10935.n7.nabble.com/Installing-RepastHPC-on-Mac-Can-I-Install-Prerequisite-Libraries-with-MacPort-td8293.html

They suggest that these versions work:

drwxr-xr-x@ 27 markehlen staff  918 Aug 21 19:14 boost_1_48_0
drwxr-xr-x@ 54 markehlen staff 1836 Aug 21 19:19 netcdf-4.2.1.1
drwxr-xr-x@ 26 markehlen staff  884 Aug 21 19:20 netcdf-cxx-4.2
drwxr-xr-x@ 30 markehlen staff 1020 Aug 21 19:04 openmpi-1.6
drwxr-xr-x@ 31 markehlen staff 1054 Aug 21 19:28 repasthpc-1.0.1

And that combination did seem to work for me. I was able to run the samples (after changing some directory permissions) with:

mpirun -np 4 ./zombie_model config.props model.props
mpirun -np 6 ./rumor_model config.props model.props

---

Notes on building Boost 1.54: doing a full build yielded some failures, with those megabyte-long C++ template error messages. I had to build individual libraries. The build process doesn't seem to honor the prefix and won't install libraries anywhere but a stage directory in the source tree. I had to manually copy files from stage/lib into /usr/local/lib and manually copy the boost headers. There is an issue with building mpi, too:

./bootstrap.sh --prefix=/usr/local/ --with-libraries=mpi --show-libraries
sudo ./b2

only works properly if I first put a user-config.jam file in my home directory containing "using mpi ;" Then I have to manually copy the boost mpi library.

Notes on building netcdf-cxx4-4.2: I had to use sudo make and sudo make install since it seems to write build products into /usr/local/ even before doing make install (!)[...]



Are You Experienced?

2013-07-07T19:35:36.902-04:00

A recruiter recently asked me to answer some questions for a client, so I did. I thought it might be worthwhile to save the questions and my answers and make them public so that I can refer people to them.

How many years of C++ experience do you have, and have you worked with C++ during the last 2 years?

I've been using both C and C++ since before the C89/C90 standard and well before the C++98 standard. I taught myself C programming in college -- I did not learn it in a class. I initially used C++ when there were various sets of object-oriented extensions to C like THINK C and Microsoft's "structured exception handling" for Windows NT 3.5. It's hard to provide an exact "number of years." At some positions I worked more in plain old C, or Java, or NewtonScript, or other languages, but even in those jobs there were often times where I was working with small C and/or C++ projects on the side.

I own copies of the ISO standards for C and C++ (C90, C99, and C++03) and used to study them for my own edification, so that I could write more portable code. I used to subscribe to the C++ Report magazine. I used to write C and C++ interview test questions for screening people at the University of Michigan. I own dozens of books on C++ and have studied them extensively. I was definitely a C++ expert, although I was much more of an expert on C++03 than C++11. I am not so interested in the "cutting edge" of C++ these days (see below for notes about STL and C++11/C++0x). For example, here's a blog post I wrote about the C++ feature "pointers to member functions," in 2006: http://praisecurseandrecurse.blogspot.com/2006/08/generic-functions-and-pointers-to.html

I have used the following compilers and frameworks for paid work (off the top of my head, these are the major tools I've used, and I am probably forgetting some):

THINK C / Think Class Library
MPW C/C++
Borland C++ with the Object Windows Library and TurboVision for MS-DOS
Microsoft Visual C++ starting with 1.0 / MFC
CodeWarrior / PowerPlant class library and Qt
XCode (aka ProjectBuilder) / CoreAudio
GCC ("g++") / Lectronix AFrame library
TI Code Composer Studio

In addition, I have some experience with static checkers (Lint, Understand for C/C++, QAC, etc. -- more are mentioned on my resume) and I would say they are a must for large commercial code bases. Also, I have worked with profilers, various run-time debuggers, and tools such as valgrind -- these are incredibly useful and helpful in finding bugs, especially in the use of uninitialized memory.

So, how do you put that in an exact number? I'd say I've used C++ daily for perhaps 12 years, but even when I was not using C++ as my primary development language for a given job or set of projects, I used it at least a little bit every year for the last 24 years. So somewhere in between those numbers.

In the Last Two Years

Yes, the most recent project was a server for Lectronix, something called the PTT Server, that sits on top of the AFrame framework, receives requests via JSON-RPC, and manages the state of all the discrete IO in the system. It is a multi-threaded application using message queues and hierarchical state machines. The server is not very big, maybe 7,500 lines of code, and the top layer of it is actually generated by some internal Python scripts. During this period, I was also maintaining and adding new features to several other servers and drivers as needed.

If the client wants to know whether I am familiar with C++11/C++0x, the answer is "not very much." I have not studied the C++11 changes very much yet, so I am only slightly familiar with features like enum classes and lambdas. At Lectronix, we chose not to try to adopt new features for an ex[...]



The Situation (Day 118)

2013-07-07T17:15:02.484-04:00

So. Day 118 of unemployment. Almost four months. It's getting hard to stay positive and keep anxiety at bay. Here's what's going on.

It might sound hopelessly naive, but I didn't think it would be this hard to find another job. I know I've been quite fortunate in some ways with respect to my career -- being very into, and good at, computer programming through the '90s and 2000s was a good place to be. I've been out of work, briefly, a few times before, when the small businesses or startups I worked for shrunk or imploded, but I've never had much difficulty finding my next job, and the job changes have generally been "upgrades" to higher pay, or at least bigger projects and more responsibility.

The job market is certainly bad right now, and especially bad locally. I am trying to be both realistic and optimistic at the same time -- realistically, it seems to be absolutely useless, for the most part, to apply for publicly-posted jobs. I've applied for dozens -- it would have been more, if there were more posted, but while there are a lot of job listings, it doesn't make any sense for me to apply for jobs that will not pay enough to cover our mortgage; if I got one, we'd still have to move. And we are still trying to figure out how to avoid that, so that we don't lose everything we've put into our house.

Working with recruiters has been an overwhelmingly negative experience as well, although there have been a few bright spots that have led to good leads and interviews. I'm really fed up with applying for a job listing for Saginaw or Flint only to find out that I'm actually contacting a recruiter about a position in Alabama or Mississippi or Florida. I've talked to recruiters at length who, it turned out, didn't even know the company they were recruiting for, because they were actually working for another recruiter. Is there even an actual job posting at the end of that chain of recruiters, or am I putting effort into some kind of scam? I don't know.

I've also put a considerable amount of time into interviewing for contract positions, making the case that I am a strong candidate, only to be completely low-balled on hourly rate, to the point where it would make no economic sense whatsoever for me to take that job (for example, a six-month C++ programming contract out of state, in Manhattan, for a major bank, where I'm expected to be pleased to accept $50 an hour and no per diem or travel expenses).

My wife suggests that in the market right now, it will basically be impossible to find a job without having a job, except through personal contacts. That's discouraging, but she is probably right. And one difficulty is that I just don't have a lot of personal contacts in the area, since we've only been here three years. I have a few, and they've been trying to help me, but in general the leads (even with referrals from people who already work in the companies) have not yielded much that is promising -- usually a series of web forms where I upload a resume, then describe my work experience again in detail, write and upload a cover letter, fill out an elaborate series of questions -- this can and often does take two or three hours -- and then hear nothing whatsoever about the job again. For most of these, there is no person to contact -- no name, no phone number, no e-mail address. I'm faceless to the company, and they are faceless to me. That's just not a good prospect.

Still, I have a generalized feeling that the right thing will come along, at least for the short term. Essentially, I have to keep believing that. I keep feeling optimistic about particular jobs. But hearing nothing back over and over again for months is starting to wear m[...]



The Situation (On Investing in a Revitalized Career)

2013-06-18T14:09:15.482-04:00

When I found myself unemployed, one of my first thoughts was that it would be a good opportunity to invest some R&D in my career. I had a plan to put in some serious time learning some new skills. I ordered some books on Scala, Objective C, iOS programming, digital filters, and a few other topics I wanted to study. I considered taking an iOS "boot camp" with Big Nerd Ranch -- it looked like a good class, but it just plain cost too much. I planned to work through a couple of books. I got in a couple of days of work and made some progress, but I have come to realize that this was just a bit unrealistic.

In part, it's unrealistic because of the time required to manage benefits, as well as the job-search reporting requirements, in which I have to log specific jobs applied for each week (a requirement only recently added, apparently). There's no option to say "I'm teaching myself some new skills so I can apply for better jobs." It hasn't helped that we've had a couple of other difficulties piled on too -- we're still waiting on the lead testing, now scheduled for this coming week. There was a heap of work to help my teenager finish some college application essays. There was some other family drama. In fact, I had arranged to go stay with some friends in Ann Arbor for a week specifically to get away from the distractions here and work towards a demo-able iOS app. When things blew up, I had to cancel that idea (although I did wind up doing it later).

I came across something else that I'd really like to do (although I missed the most recent one). There's an organization that teaches two- or four-day intensive courses in Haskell programming. The last one was in the San Francisco Bay area. There is no guarantee at all that if I took the class, and met the folks there, doing the classic networking thing, it would necessarily help me get a better job. I'd really, really like to take the class anyway. I'm not asking for donations to go to a training class like that right now, as such -- I'm not sure it is quite the right time. I'm mostly writing this down by way of putting my intention out there in some kind of concrete form.

I've been diddling around with Haskell for a number of years now. I've written about Haskell a few times on this blog. I've used it "in anger" -- to solve a real work-related problem -- a few times, for creating small utility programs, usually to chew through some data files, to prototype an algorithm that I later wrote in C++, or to generate some audio data for testing. It is, hands-down, my favorite programming language, a language that expands my mind every time I use it, and it has taught me some entirely new ways to think about writing code, applicable to any language. I won't claim that Haskell is, per se, the great savior of programming. GHC can be awkward, and produces truly obscure error messages. Haskell can be hard to debug and optimize. However, it seems to have some staying power, and perhaps more importantly, it is a huge influence on recent programming language designs. Haskell didn't appear in a vacuum -- it certainly has absorbed strong influences from the Lisp family of languages, and from ML, and maybe other languages like Clean, and others even more obscure. I love learning new programming languages, and I've learned new ideas from just about every language I've learned, but Haskell seems unique in the sheer density of its ability to blow your mind.
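To give a concrete flavor of the kind of "in anger" use I mean -- little throwaway utilities that generate test data -- here is a minimal sketch of the sort of thing I'm describing. The frequency, sample rate, and output file name are hypothetical choices for illustration, not taken from any of the projects mentioned above:

    -- A minimal sketch of a throwaway Haskell utility: generate one
    -- second of 16-bit sine-wave samples for testing, written out as
    -- plain text, one sample per line.
    import Data.Int (Int16)

    sampleRate :: Double
    sampleRate = 44100

    -- One second of a 440 Hz sine wave, scaled to the 16-bit range.
    sineSamples :: [Int16]
    sineSamples =
      [ round (32767 * sin (2 * pi * 440 * t / sampleRate))
      | t <- [0 .. sampleRate - 1] ]

    main :: IO ()
    main = writeFile "test-tone.txt" (unlines (map show sineSamples))

The point is less the program itself than how little ceremony it takes: a list comprehension and a couple of type signatures, and you have your test data.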
Despite the fact that it is perhaps not practical for every application, I've become convinced that many of the paradigms and concepts behind Haskell really are the future of programming -- specifically, the competitive advantage, even something close to [...]



The Situation (Post-Father's Day)

2013-06-18T11:12:37.956-04:00

I had a great weekend. These posts have been largely kind of gloomy -- maybe understandably, given my ongoing unemployment. But I had a great weekend. On Friday afternoon I had a phone interview that went, I thought, pretty well. Grace had taken the kids away with her to Ann Arbor, where she had an obstetric appointment, and then stayed overnight with them with extended family, even taking them to a kind of barbecue/fishing party that sounded like a blast. On Saturday morning she picked up a CSA share that belonged to a friend, who was out of town and donated it to us. She got back Saturday afternoon. Our fridge is packed with fantastic produce. More on that in a bit.

I spent most of that time working on a Dylan program, an implementation of the little Macintosh Polar puzzle game from 20-plus years ago. When I took breaks from the screen I worked on a Gene Wolfe novel that has eluded me for a long time -- the second part of the Short Sun trilogy, In Green's Jungles. Wolfe is one of my very favorite writers, and I still think that the Book of the New Sun series is pretty much the masterpiece of late-twentieth-century fantasy and science fiction. I think The Shadow of the Torturer is the only book I've literally worn to the point of disintegration just by reading it over and over. But he's a puzzling writer, and in the later series he gets more puzzling. Reading In Green's Jungles is like looking through a kaleidoscope held by someone else. As soon as you start to figure out what you're looking at, and say "Ah! Yes, I think I see what is going on," he twists the kaleidoscope and says "how about now?" And it's all a jumble of pretty fragments again.

And so these are books that are unsatisfying on a first reading, and even a second reading. I've gotten further this time; maybe I'll even finish the second book. Maybe by the third reading I will be able to plow through the third and final book and feel like I have a sense of what is really going on. They differ from The Book of the New Sun in that the earlier series can be read as a straightforward adventure story, and it is satisfying in that way -- to a certain extent. Until you realize that Severian's story doesn't entirely hold up, and that he is an unreliable narrator, and then you fall naturally into the mystery, and start to form your own theories. I have a monograph I'm working on, about The Book of the New Sun, but I don't feel it is quite ready for publication, even on my blog. I feel almost ready to write about the second series, the Long Sun books. The Short Sun books are still largely a blur of glittering fragments to me.

I'm digressing again... back to my weekend. The time with my wife and family out of town was a great chance to dive back in, just a little bit, to one of my favorite programming languages, and one that was hugely formative to my thinking about programming. In 1994 or thereabouts I was an alpha-tester for Apple's Dylan development environment, a tool that ultimately ended up more a technology demo than a viable product. At the same time I was developing real solutions in NewtonScript, the language that Apple actually deployed in the Newton product line. Trying to understand Dylan led me to Scheme, and eventually to Common Lisp and Haskell. Dylan still exists in the form of community-supported implementations -- see also the Dylan Foundry.
Dylan is a fascinating language, but as I study the original documents in 2013 -- Apple's book The Dylan Reference Manual and the original Dylan book describing the language with Lisp-like syntax -- I see an over-designed language, in t[...]



The Situation (Day 92)

2013-06-11T19:26:41.155-04:00

So this is one of those days where everything is just "hovering." For the last few weeks I've had three or four recruiter phone calls and e-mails a day, but today I've had none. It's spooky, like the rest of the world was destroyed and I haven't gotten the news yet. Meanwhile, I've done some follow-up e-mails and messages, and gotten nothing back. Several different applications are in the post-interview stage, "hovering." I need to apply for some more local jobs, but I'm not seeing very many that are even remotely within the realm of possibility.

Just for fun I did a calculation on what it would take to do a daily commute from Saginaw to Bloomfield Hills. That's about 80 miles one way, taking approximately an hour and 20 minutes. I know people have commutes like this, and longer, but let's do the math as an exercise. Our current main car is a late-model SUV that gets an average of 15.4 mpg. It probably will do a little better on an all-highway commute, but considering the possibility of heavy traffic and road construction, let's call it 16 mpg. For a 160-mile round-trip commute, that's a convenient round number: ten gallons of gas a day. Gas today is about $4.20 a gallon. It will probably be lower off-season, but that's what it is today. That gives us $42.00 a day in gas, or $210 a week. Not accounting for vacation time, that's $10,920 a year just in gas. And that doesn't cover wear and tear at all. The IRS standard mileage allowance, including wear and tear and repairs, for 2013 is 56.5 cents a mile; that works out in this case to $90.40 a day or (again, not taking vacation time into account) $23,504 a year -- in other words, that's what they consider the actual cost of owning and maintaining a vehicle and using it for that much travel. Something like a Honda Fit would obviously be a better choice, at somewhere in the ballpark of 30 mpg, but note that this would add a car payment, when we don't have one now, and so the overall cost would not be dramatically lower.

Note that this takes into account no "externalities" at all. Here's one externality: if I was going to be gone with the car all day, every work day, my wife would need a second car in order to run any kind of local errand at all with the family. So we'd then be a two-car family instead of a one-car family. It wouldn't be a matter of swapping out one car for a better-mileage car, where selling the first could help pay for the second. Of course the at-home car wouldn't incur nearly as much in the way of gas expense and wear-and-tear, but it isn't trivial just to maintain a car, even one you don't drive very much. It also doesn't account at all for the emissions, and what that is doing to the climate, or the fact that I'd be driving for almost 3 hours a day, turning a 9-hour day (with lunch) into a 12-hour day, and what that would do to me and my relationship with the family, and whether we'd be able to afford to hire someone to help replace some of my labor in and around our home (ranging from cooking and cleaning and mowing the lawn to child care).

So, alternatives. It would probably be cheaper to stay someplace much closer to a work situation in the metro Detroit area during the work week, and we're exploring that option. Relocation would be neither quick nor easy. So what's the cost of an extended-stay hotel close to the area? The cheapest one I could find online in a brief search was about $55 a night.
Assuming I stayed Monday through Thursday nights and left from work on Friday, that's $220 a week (and note that these are still a commute from the workplaces, just a much shorter one[...]
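Since I was doing all of this with a calculator and scribbled notes, here's the whole back-of-the-envelope calculation in one place, as a minimal sketch. All the inputs are the rough estimates from this post (16 mpg, $4.20 a gallon, 56.5 cents a mile, $55 a night), not authoritative figures:

    -- The back-of-the-envelope commute arithmetic from this post.
    -- All inputs are rough estimates, not authoritative figures.
    roundTripMiles, mpg, gasPrice, irsRate :: Double
    roundTripMiles = 160     -- Saginaw to Bloomfield Hills and back
    mpg            = 16      -- pessimistic highway mileage for our SUV
    gasPrice       = 4.20    -- dollars per gallon, today's price
    irsRate        = 0.565   -- IRS standard mileage rate, dollars per mile

    gasPerDay, gasPerYear, irsPerDay, irsPerYear, hotelPerWeek :: Double
    gasPerDay    = roundTripMiles / mpg * gasPrice   -- $42.00
    gasPerYear   = gasPerDay * 5 * 52                -- $10,920
    irsPerDay    = roundTripMiles * irsRate          -- $90.40
    irsPerYear   = irsPerDay * 5 * 52                -- $23,504
    hotelPerWeek = 55 * 4                            -- $220: Mon-Thu nights

    main :: IO ()
    main = mapM_ putStrLn
      [ "Gas per day:    $" ++ show gasPerDay
      , "Gas per year:   $" ++ show gasPerYear
      , "IRS per day:    $" ++ show irsPerDay
      , "IRS per year:   $" ++ show irsPerYear
      , "Hotel per week: $" ++ show hotelPerWeek
      ]

By the IRS reckoning, the weekly hotel bill ($220) is less than half the weekly cost of driving ($452), which is why the extended-stay option looks attractive despite the sticker price.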



The Situation (Day 88)

2013-06-06T19:40:53.817-04:00

I received word back (via paper mail) from the State of Michigan saying that my claim for 3 weeks of unemployment compensation, for the weeks ending April 27, May 4, and May 11 (see my earlier posts), is denied. The form I got back said it is found that "you did not report (certify) as directed and had no good cause for failing to report (certify)."

The cause I reported was that I missed certifying online by one business day because I was distracted by recruiters and interviews -- in other words, because I was concentrating so much on searching for a suitable job. What would have been good cause, I wonder?

So, er, let this be a lesson to all you slackers!

It says I have the right to appeal in writing. Would there be any point to that, I wonder?




A Counterfeit Motorola Razr V3 Cell Phone

2013-06-11T14:48:03.764-04:00

I have an old Motorola Razr V3. It's from (roughly) 2005 or 2006. I use it without a contract, with a T-Mobile SIM card, buying minutes when I need to. I like this phone design, and I don't really want a smart phone or even a dumb phone with a touch screen, but mine is falling apart. I bought two allegedly new-old-stock Motorola Razr V3 phones from an eBay seller. Unfortunately, they are counterfeits.

I have opened a case with eBay to return them, but I thought it might be useful to share pictures. Honestly, I wouldn't have minded much if (1) they worked well (they don't -- the speaker for speakerphone mode doesn't work, they don't vibrate, and the audio is poor), and (2) they were really cheap (they weren't that cheap -- I paid $59.99 each).

Take a look at the pictures. The gray phone is the original. The gold one is the counterfeit. It's very obvious when you just pick them up, open them, and try to work the buttons or open the battery compartment. The old phone opens smoothly and still feels solid. The new one grinds slightly and feels loose and flimsy.

- Original: fit and finish is very clean; "M" logo button top center matches the phone. Fake: front cover edge misaligned, "M" logo is blue and looks strange, buttons are loose.
- Original: you can read all the serial numbers (even though the picture is blurry, sorry). Fake: numbers are cut off; some numbers missing.
- Original: logo is laser-etched right into the aluminum surface. Fake: logo is painted.
- Original: darker, glossy. Fake: type is different, lighter gray, matte.
- Original: inside of the battery compartment cover. Note the recycling warning and 3 clips to stabilize the cover; the release mechanism still works after many years. Fake: mechanism is extremely stiff and barely works, and nothing is molded on the inside.
- Original battery hologram versus fake battery hologram.
- Original: still has the little rubber plug in its access hole after years of handling. Fake: rubber plug stuck way out, fell out immediately with the gentlest handling, now it's around here somewhere...
- Back covers: note the raised logo and carrier on the original (right). Ignore the missing dark glass over the display on the old phone; I broke that many years ago...
- The cover of the manual, and the printing inside the manual.
- Under the right lighting you can see that the battery compartment cover on the counterfeit phone is completely mismatched to the rest of the case. Wow! Crap-tastic![...]



The Situation (Reporting from an Undisclosed Location)

2013-06-02T00:51:42.192-04:00

I've sequestered myself here for a few days to try to concentrate on some Objective C and iOS programming.

I had an interview last Thursday. You can read about some of the technical aspects of the interview in this post on my programming blog, Praise, Curse, and Recurse (warning: extreme programming geek content!).

I'm very grateful to my undisclosed friends for letting me work here in this undisclosed location, as well as feeding me some delicious undisclosed meals!




WIC and Nutritional Advice

2013-05-28T23:27:40.410-04:00

So, some follow-up. Grace finally found that Kroger had, in fact, ordered low-fat goat's milk, but no one got back to her to report that it had come in. She had gone to the store to check on it several times, but it had not been put on the shelf. Finally, one morning, she found someone actually stocking shelves. She asked this person, who said "Oh! You're the one who ordered that goat's milk? It's been sitting in the back." So they get credit for actually ordering what we wanted, but pretty much a zero for customer service.

Grace took home 12 quarts, out of the 22 she was covered for that month (all that would fit in our refrigerator). And we started doing our best to figure out how to feed the kids a dozen quarts of low-fat goat's milk. We decided that making a batch of Cream of Wheat would use up a whole quart, so we did this several times. The kids ate that up. We had hoped that goat's milk wouldn't set off their milk allergies, but it did -- in fact, they seemed to react more than they did to cow's milk -- and so we had several sick kids, including a baby with an up-all-night ear infection. So we had to give up on the goat's milk. WIC won't cover the fortified almond or coconut milk that we normally drink.

Joshua is very small, and so we took him to see a pediatric endocrinologist for evaluation. She does not seem to be concerned with the results of his blood tests and bone scan, although we are concerned, and we'll be looking for another opinion. He has a follow-up appointment set to find out why our two-year-old is almost as tall as, and weighs more than, our four-year-old. Meanwhile, the nutritionist in her office got back to us with some advice on how to deal with a child who does not seem to be eating an adequate number of calories. I'm looking at a handout entitled "How to Increase Calories." It starts out: "If your child is having eating problems, it's important to make every bite count. Getting in adequate calories can help your child maintain weight and continue to grow well."

It then goes on to list a number of different types of food to consider adding to your child's diet, including butter and margarine, whipped cream, and whole milk and cream. We already do some of this -- for example, we've been making him mashed sweet potatoes with butter and sour cream. He doesn't seem to have the milk allergies that his siblings have, but having a lot of milk in the house is problematic because they will demand what he is drinking.

We'll skip the margarine, to avoid soybean oil and hydrogenated fats. Sweetened whipped cream in the house leads to a feeding frenzy, but we might be able to use whipping cream, unsweetened, in Joshua's food. WIC won't cover whipped cream, butter, or whole milk of any kind. Cheese is covered, but in pretty limited quantities. We're definitely using all they will provide, and then some.

The nutritionist also recommends cream cheese, sour cream (we already use it; not covered), salad dressing and mayonnaise (not covered -- but Grace makes deviled eggs for the kids pretty frequently, and tuna salad with mayo).

Next up are sweets: honey, jam, sugar, granola, and dried fruits. These are a little problematic for us, since Joshua already has four stainless-steel crowns due to extensive tooth decay. He eats carbohydrates, and especially simple carbs, preferentially to anything els[...]




The Situation (Day 78)

2013-05-27T12:30:37.637-04:00

Happy Memorial Day!

I think my numbering got off by one a while back, so I've gone through and adjusted the day numbers on some of these posts, assuming that day 1 was the Monday of my first full week of unemployment -- in other words, the first missed work day.
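For the curious, the convention is just a days-elapsed count. Here's a trivial sketch of it; the start date below is my own inference from the day numbers and dates of these posts, not something stated anywhere:

    -- Day numbering: day 1 is the first missed work day, and each
    -- post's day number is days elapsed since then, plus one.
    -- The start date is inferred, not stated in any post.
    import Data.Time.Calendar (Day, fromGregorian, diffDays)

    dayOne :: Day
    dayOne = fromGregorian 2013 3 11   -- a Monday (inferred)

    dayNumber :: Day -> Integer
    dayNumber d = diffDays d dayOne + 1

    main :: IO ()
    main = print (dayNumber (fromGregorian 2013 5 27))   -- 78, today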

Yesterday my friend Bill visited us at our home here in Saginaw with his wife and their young child. I haven't seen Bill since... hmmm, I think it's been 4 years, since the last college reunion I attended? And before that, it was much longer. He brought us some pies from Fuzzy's Diner, and we fed everyone baked beans we made from some of the pinto bean stash in our root cellar, flavored with maple and bacon and cooked overnight in a cast-iron dutch oven at 200 degrees. They were delicious. While our kids and his kid and some neighborhood kids ran around and got filthy, the best way to spend a sunny spring day, the grownups also got filthy -- we got through a bunch of garden work, which was very satisfying. Bill and I got to spend a little while playing music together -- he's a very talented guy. That was great. It was a little reminder that even in the midst of the stress and angst of The Situation -- trying to find work, keep our mortgage paid, collect benefits, and figure out what to do next -- it is still late spring, and the world is alive and beautiful, and we are meant to be alive and enjoy it.

The Michigan unemployment office waits for no man (or woman), and MARVIN does not count holidays, so I certified online today. There is good news there. The online system says I will be paid for the previous two weeks. That will mean I've collected eight weeks of unemployment compensation out of the 20 I'm eligible for. No word, yet, on whether I will be paid anytime soon for the 3 calendar weeks that were withheld. If I'm never paid for those weeks, I think that it means I still have them in reserve, and could still collect them as calendar weeks 21-23 -- that would be the last week of July and the first couple of weeks of August.

Let's hope it doesn't come to that. If I'm not employed by then, though, we will be very low on cash, for things like water and electric bills and gasoline. Getting paid now for those missed weeks would really help with that.

Not getting paid for them now could also mean that I could collect them later in the benefit year, if, G-d forbid, I wind up hired and laid off again. Let's hope that is really, really not necessary!

Please keep our family in your thoughts this week, as I have an interview tentatively scheduled for Thursday, and several employers I'm waiting to hear back from. I'll be off-line, devoting the rest of today to celebrating and remembering our lost loved ones.




The Situation (Crisis, Opportunity, and Why I Am Not a Disabled Electrical Engineer)

2013-05-25T16:10:48.930-04:00

Nothing will test you, baffle you, and frustrate you like a period of unemployment. It's definitely one of those crises-slash-opportunities. I've had this tendency to see (and write about) only the "crisis" part. But it's honestly just now really starting to dawn on me that there is an opportunity aspect to this. I mean, I've known this intellectually, but it's been hard to feel it. I think I'm finally starting to feel it. And that presents, oddly, another crisis.

Today is a gorgeous spring day. Our neighborhood is this quiet, walkable boulevard with friendly neighbors, filled with flowering trees. I'm in a gorgeous old house. But at the same time this place is an economic shithole, in the grip of a terrible recession. A mile from here is one of the highest-crime neighborhoods in the country. There are hundreds of houses standing empty with foreclosure notices on the doors. It's kind of a mind-fuck, really, the disjunction between the apparent wealth and beauty here and the poverty behind the facade -- which, right now, is starting to include my own. This place looks like an opportunity, but it's actually mostly a crisis.

We're thinking very hard about whether we will have to leave Saginaw. After 3 years here, the kids have settled in, and made friends, and we've started to establish a rhythm and pattern to life here. We've made improvements to our house. We've put money and love into our gardens. I don't want to go. I want this to work out. We chose this place, to be close to family, to find affordable real estate and space for our children, and to do what things we could to try to reverse the decline, the flight from what was a major American city. If we do have to leave, it will be under a dark cloud; we'll have lost everything we put into our home, most likely, and more. We'll be demoralized; the kids will lose the ties they've made here, the small roots they've put down. It will feel like a failure. I've never been one to be excited about the road more taken. And we'll be treading the path taken by fifty thousand people before us, over the last fifty years or so.

There is good news on the job search front. I've been able to have good discussions with some hiring managers who are willing to consider a work arrangement that would help us stay in our home. The details are yet to be determined, but it's encouraging to have a manager suggest that they might be willing to have me in the office for part of the work week, working from home for the rest of the week. One guy even asked me what kind of support I would want from the company to set up such an arrangement. That actually baffled me. I just sort of went "ummm, ummm, what?" It took me a while to even understand that he was suggesting his company might chip in to help cover accommodations for staying overnight, or a mileage allowance to pay for gas. In other words, he seemed to want to make the job attractive to me -- by offering perks! Isn't that weird?

Michigan has been beaten down. I've been beaten down too. When I set up my arrangement to work at home for my last employer, I resolved that I would not ask for any special concessions, since I had asked for, and gotten, working from home as a special privilege. I bought the computers I would need for my home office myself, and paid for the network gear and high-speed internet and separate phone line and desk and oscilloscope and logic analyzer. When I did have to travel to the o[...]