
More News



Technical stuff sometimes about programming.



Updated: 2017-08-18T11:57:34.875+10:00

 



Slavery is the New Bacon

2016-04-17T19:49:46.547+10:00

“We can barely decide whether or not bacon will cause health problems year over year, let alone the more complicated issues like politics and race.” Matthew T De Goes (screen capture).

“Some people just don’t want a bad person invited to a tech conference, even if their talk was picked by a blind committee, they are peaceful, they reject any type of violence, and they don’t pose a safety threat.” Personal Thoughts on the LambdaConf Controversy.

The committee was blind to any favour or discrimination. How can anyone object to our objectivity? The blindfold came off for a bit though and the LambdaConf committee found out some stuff: “Are these views racist and sexist? Absolutely, since they don’t admit the possibility that, for example, an asian female with no background in computer science might do a better job at “governance” than any white male software engineer. Are these views endorsed by LambdaConf or held by any staff members? Hell, no!” LambdaConf-Yarvin Controversy: Call for Feedback.

Having a blind submission process, getting people to sign up to a Code of Conduct and conducting a conference based purely on those beliefs is a good ideal to aim at. It’s possible that this could’ve worked for LambdaConf.

In contrast though, the LambdaConf organisers went looking into the background of the speaker, emailed other speakers, held a vote and wrote a few blog posts. It shows a lack of confidence in that process while undermining it at the same time. Maybe a more fully featured open review process would’ve been better.

Blind reviews do nothing for inclusion or diversity and reinforce existing discrimination: “Does double-blind review benefit female authors?” and “Understanding current causes of women’s underrepresentation in science”. It's like waving the checkered flag at the end of a Formula 1 race and wondering why only rich people are finishing.

The contradiction of LambdaConf is having a conference that touts its diversity while at the same time inviting someone who is against including certain groups of people. Is Yarvin really the best guy for the job — is he even trying? No. He just doubled down and justified his views.

In that post, he makes it clear that Yarvin and Moldbug are the same person while saying the exact opposite. He’s saying, if you can’t tell the difference between the two, especially after thousands of words, it is you that has the problem, not him. Don’t be confused, he’s blaming you — he’s not coming peacefully.

He says he’s not racist but Moldbug might be (and another). He talks about Carlyle, fascism (“no such thing as too much truth, too much justice, or too much order”), people as property (“we agree that he can sell himself into slavery”), and race as intelligence (“current results in human biodiversity”). It’s a regressive set of ideas — even in its own time:

“The alternative to markets was not socialism. There were socialist experiments, but there were no socialist economies. The alternative to market organization was slavery.” 150 Years and Still Dismal!

The purpose of a conference is networking and learning. It’s a place where people are going to teach children, single mothers, parents, and anyone else who comes along. This will make a difference.

The situation is that attendees will be able to see right through his poor disguise. It makes him a terrible teacher and the conference a poorer place at which to learn. The existence of such a speaker, publicised in this way, reduces attendees’ ability to perform — hurting the people you’re trying to help.

Bacon is not good for you and there is no slippery slope. You pick who comes to your conference depending on the size of the out-group you want to create. Racism and slavery are socially engineered injustice — you’re denying people’s humanity and in that way it reduces us all.[...]



Using Ruby to Inject you a Monoid

2015-08-25T15:50:23.307+10:00

A monoid has an append operation (like plus) and an identity (like 0), and you get a concat operation for free.

In Ruby it's something like:

[1,2,3].inject(0) {|a, x| a + x }
=> 6

Or just, [1,2,3].inject(:+)

In Haskell, you can even see it in the type signature of Monoid's mconcat:
mconcat :: Monoid a => [a] -> a

You can see the list on the left ([1,2,3]) and the single unwrapped result on the right (just 6).
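The same inject pattern works for any monoid, not just numbers. Here's a quick sketch of my own (not from the original post) with strings and arrays, whose identities are "" and []:

# Append plus an identity gives you concat for free, whatever the monoid.
["foo", "bar", "baz"].inject("") { |a, x| a + x }
# => "foobarbaz"

[[1], [2, 3], []].inject([]) { |a, x| a + x }
# => [1, 2, 3]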

What if you want to take it up one level of abstraction and allow any operations on your list of numbers? You just use a different monoid called Endo.

To take it to this next level you need a more abstract append and identity.  

Append needs to combine two operations by returning a new one:
compose = ->(f, g) { ->(*args) { f.call(g.call(*args)) } }

And identity just returns what you give it:
id = ->(x) { x }

Which lets you then write (lambdas are invoked with .call):
[->(x){ x + 2 }, ->(x){ x * 7 }].inject(id) { |f, g| compose.call(f, g) }.call(8)
=> 58

Or in Haskell:
Prelude> let a = foldr (.) id [(+2), (*7)]
Prelude> a 8
58

See:







jQuery still not a Monad

2015-02-18T04:25:48.284+10:00

I read "jQuery is a Monad" and thought, yeah, this is pretty cool, I finally understand Monads.

jQuery is not a Monad. A Monad can take any type, and it has a join operator that takes a doubly wrapped value and turns it into a singly wrapped one. This means that for jQuery to be a Monad it would have to work on any type: you would have to be able to give it a String, Int or DOM node and have it operate on them consistently. jQuery's .map can only deal with the one type. It does have jQuery.map, but that would make the Array the Monad (or actually just a Functor), not jQuery.
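To make the join point concrete, here's a tiny Ruby sketch (my illustration, not from any of the posts mentioned) using Array as the container. The same operations work no matter what the elements are, which is exactly the genericity jQuery lacks:

# Array behaving monadically: unit wraps a value, join flattens one level,
# and bind (flat_map) maps then joins - uniformly, for any element type.
unit = ->(x) { [x] }

[[1, 2], [3]].flatten(1)                      # join: [[a]] -> [a]
# => [1, 2, 3]

["a", "b"].flat_map { |x| unit.call(x * 2) }  # bind: map then join
# => ["aa", "bb"]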

Many of jQuery's methods are specific to DOM manipulation, parsing and the like, and are not related to Monads in any way - it is more like a combinator library such as HXT.

The idea that it is a Monad still continues with "What jQuery can teach you about monads" and "Does jQuery expose a monadic interface?". One of the points that I think people ignore is that JavaScript has an implicit this and it affects how you apply functions:
As is common with object-oriented language implementations, the this variable can be thought of as an implicitly-passed parameter, so we can then look through the API for a jQuery container looking for a method that takes one of these transformation callbacks and returns a new jQuery container.
This actually prevents you from easily (and certainly not clearly) writing Monads in JavaScript in the generic fashion that is required:
So, is jQuery or the Django ORM a monad? Strictly speaking, no. While the monad laws actually hold, they do so only in theory, but you can not actually use them in those languages as readily as you can in, say, Haskell. Methods get the object as the first (implicit, in JavaScript) argument, not the value(s) stored in the object. Methods are not first class objects independent from their classes. You can circumvent those restrictions by implementing some boiler code or, in Python, metaclasses that do some magic. What you get for doing that is a much easier time writing functions that work on all monadic classes, at the expense of making the whole concept more difficult to understand.
As Ed said: "jQuery is an amazing combinator library, but it isn't a functor, it isn't applicative, and it isn't a monad."



Recovering from ElasticSearch Recoveries

2014-05-06T09:11:10.401+10:00

We recently had a problem with ElasticSearch's snapshots where a shard (a directory) was failing because it was missing the metadata file and data files.

This leads to a couple of criticisms of the snapshot directory format.  Primarily, it takes files with reasonable extensions, generally Lucene files, and creates files like "__1" and then records a mapping from "__1" to "_def.fdt".  For example:

{
  "name" : "es-trk_allindices_2014-01-01_0000est",
  "index-version" : 78683,
  "files" : [ {
    "name" : "__0",
    "physical_name" : "_abc_0.pay",
    "length" : 2012,
    "checksum" : "13m617n",
    "part_size" : 104857600
    }, {
    "name" : "__1",
    "physical_name" : "_def.fdt",
    "length" : 97744833,
    "checksum" : "239wze",
    "part_size" : 104857600
    }
...

The files aren't even located together in the metadata file.  In Lucene, you have a group of files prefixed with say "_def" like fdt, fdx, tip, tim, del, nvm, and nvd in a single directory.  Losing the metadata file means not only losing the helpful filenames but also their groupings used by Lucene.
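For comparison, when the metadata file is intact, pulling the mapping back out is trivial. A rough Ruby sketch (the field names are taken from the example above; the metadata path is whatever your snapshot repository uses):

require 'json'

# Print the snapshot's "__N" name, the original Lucene file name and its size
# for every file recorded in a (non-missing) shard metadata file.
metadata = JSON.parse(File.read(ARGV.fetch(0)))

metadata['files'].each do |f|
  puts "#{f['name']} -> #{f['physical_name']} (#{f['length']} bytes)"
end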

Luckily, ElasticSearch uses FDT files which have just enough information - the unique index identifier and then the payload - to turn them into a CSV or other file and reimport the data into ElasticSearch.  If you have the same problem you will have to force shard allocation or create an empty shard in a new cluster, delete the failed shard and copy that shard into the failed one.

The utility, es_fdr, reads FDT files and outputs them one field per line; it's available on the OtherLevels GitHub page.  I've also updated a related Lucene ticket.




Make "Enter" in Twitter Typeahead Select the First Item

2013-11-17T19:48:33.172+10:00

This is just a quick post, which may not be applicable for long, but it fixed the problem I had where I wanted "Enter" to pick the first suggestion even if it hadn't been selected with the mouse or cursor keys.

$('input.typeahead').keypress(function (e) {
  if (e.which == 13) {
    var selectedValue = $('input.typeahead').data().ttView.dropdownView.getFirstSuggestion().datum.id;
    $("#input_id").val(selectedValue); $('form').submit(); return true;
  }
});

I appended the same information to this GitHub issue.



Grace Hopper on Programmers and Programming

2013-08-23T12:10:53.888+10:00

I've started to read "Show Stopper!" and it has an excellent part in the first chapter about Grace Hopper, who created the first compiler and in doing so basically created the jobs that modern programmers perform:
"Hopper was convinced that overcoming the difficulties posed by proliferating computer languages would rank among the greatest technical challenges of the future. "To me programming is more than an important practical art," she said in a lecture in 1961. "It is also a gigantic undertaking in the foundations of knowledge." Ironically, she fretted that the greatest barrier to progress might come from programmers themselves. Like converts to a new religion, they often displayed a destructive closed-mindedness bordering on zealotry. "Programmers are a very curious group," she observed. 
They arose very quickly, became a profession very rapidly, and were all too soon infected with a certain amount of resistance to change. The very programmers whom I have heard almost castigate a customer because he would not change his system of doing business are the same people who at times walk into my office and say, "But we have always done it this way." It is for this reason that I now have a counterclockwise clock hanging in my office."
I would love to know what the name of the lecture was and if there were any transcripts or copies of it around.



Copying between two uploaders in CarrierWave

2013-11-17T19:46:57.980+10:00

To copy between two cloud providers using CarrierWave and Fog is a bit tricky.  Copying from one provider to a temporary file and then storing it in the other seems to work, but the problem is that the file name is not preserved.  If you wrap the temporary file in a SanitizedFile then CarrierWave will update the content without changing the name of the file.

The following code preserves the file name between providers (where "obj" is the model, "src" is one uploader and "dest" is the other):

require 'open-uri'

def copy_between_clouds(obj, src, dest)
  tmp = File.new("/tmp/tmp", "wb")
  begin
    # Download the file from the source uploader into the temporary file.
    filename = src.file.url
    File.open(tmp, "wb") do |file|
      file << open(filename).read
    end
    # Wrapping it in a SanitizedFile lets CarrierWave store the new content
    # without changing the file's name.
    t = File.new(tmp.path)
    sf = CarrierWave::SanitizedFile.new(t)
    dest.file.store(sf)
  ensure
    File.delete(tmp)
  end
end
To use it:
copy_between_clouds(o, o.old_jpg, o.new_jpg)
You might need to change "src.file.url" to "src.file.public_url" for some cloud providers.



Elliott, Dina and Steve

2013-07-15T20:01:46.557+10:00

I was reading "'Memo' Functions and Machine Learning" again.  It's an interesting article, appearing in Nature, before an article about mammalian reproduction, and uses balancing a pole on a trolley as an example of artificial intelligence. In the paper, the trolley is controlled in real-time by two computers: a PDP-7 and an Elliott 4100.  I hadn't heard of the 4100 before but the Elliott and others come from the start of the British computing industry - including others you may never have heard of like LEO and English Electric Computers. You can read more about them in "Early Computer Production at Elliotts" and "Moving Targets - Elliott Automation and the Dawn of the Computer Age in Britain 1947-1976" (review of the book).One of the pictures in the Elliott computer archives has the caption, "Switching on the Elliott 405 at Norwich City Council in 1957. The woman to the right is Dina Vaughan (later Dina St Johnston), who did the initial programming for the Norwich system." In 1959, she was the first person to start a UK software house.  This being a company that only wrote software - not software that came bundled with hardware.  The first, according to Wikipedia, was Computer Usage Company in 1955.The best resource on her I could find was in "The Computer Journal" called "An Appreciation of Dina St Johnston (1930–2007)".  It describes how she was writing software in the mid-50s making her a contemporary of people like Michie, Turing, von Neumann and Godel.  It describes what programming was like then:"She wrote with a Parker 51 fountain pen with permanent black ink and if there ever was a mistake it had to be corrected with a razor blade. Whereas the rest of us tested programs to find the faults, she tested them to demonstrate that they worked."One of the first commercial jobs for the company was a control system for the first industrial nuclear power plant.  Her company, Vaughan Programming Services, was visited on the 10th anniversary of the British software industry by "Electronic Weekly":"The staff in a company run by a woman might be expected to contain a high proportion of women, and this expectation is fulfilled", runs the EW report, "but, unexpectedly, a low proportion of the professionals employed have degrees, and there is no great emphasis on strong mathematical background in the mix of skills used."The industry norms don't seem to have changed very much.  More details can be found on Google Books by searching, "Dina Vaughan" (or St Johnston).In "Recoding Gender", Dina St Johnston is mentioned along with another female programming pioneer, Dame Stephanie Shirley.  A refugee of World War II, she entered the software industry as a "late pioneer".  She became interested in programming and got into the computer room by sweeping up chads, "I could not believe that I could be payed so much for something I enjoyed so much...early software was fascinating...it was so engrossing."  In 1962 she started "Freelance Programmers", the second software company founded by a woman in the UK.   Her view of the computing industry seems to be one that offered a way to address social and economic problems, "a crusade", a flexible workplace with policies designed to support women with dependents.  Originally designed to help women with children to continue to work, its charter gradually became more broad to include supporting women's careers, then for women with any dependent and in 1975 was expanded, by law, to include men.  
The final mission became "people with dependents who couldn't work in the conventional environment".  She says in her biography, the company had always employed some men and at the time of the passing of the equal opportunities law three of the 300 programmers and a third of the 40 systems analysts were male.A Guardian a[...]



Removing Large Files from Git

2013-01-31T13:26:33.698+10:00

When I've used git I've used it pretty much like CVS, SVN and any other version control system I've used before - I've checked in binary files.  Whether that's dlls or jars or gems, I've checked them in.  Pretty much everywhere I've worked people have said this is a problem, and I've tended to argue that it solves a lot of problems - one of the main ones being that when the repositories, and the management software around them, fail I still have a working system from a checkout/clone/etc.

The price of this is that sometimes you need to clean up old binary files.  Git makes this complicated, but once you've found a couple of tools it's relatively straightforward.

Stack Overflow has Perl and Ruby scripts that wrap around a few git commands to list all files in a repository above a certain size: "Find files in git repo over x megabytes, that don't exist in HEAD".  The main gist of it is (in Ruby):

IO.popen("git rev-list #{head}", 'r') do |rev_list|
  rev_list.each_line do |commit|
    commit.chomp!
for object in `git ls-tree -zrl #{commit}`.split("\0")
      bits, type, sha, size, path = object.split(/\s+/, 5)
size = size.to_i
      big_files[sha] = [path, size, commit] if size >= treshold
end
end
end

big_files.each do |sha, (path, size, commit)|
where = `git show -s #{commit} --format='%h: %cr'`.chomp
puts "%4.1fM\t%s\t(%s)" % [size.to_f / Megabyte, path, where]
end

Then to remove the old files from the repository:

git filter-branch --force --index-filter 'git rm --cached --ignore-unmatch [full file path]' -- --all
git push --force

Then to clean up the now-unused space in the git repository:

rm -rf .git/refs/original/
rm -rf .git/logs/
git reflog expire --expire=now --all
git gc --aggressive --prune=now




Transparent Salaries

2013-01-31T16:22:36.940+10:00

The stereotype is that developers are notoriously bad at human interactions.  I'd suggest that developers are notoriously bad at interactions that they see as fake.  Things like small talk and negotiations.  In a developer's mind, or to be honest mine at least, the ability to get paid well or to pay less than retail for a product shouldn't be based on your ability to pretend you're friendly with someone you're not, it should be based on some sort of system.  Why not create a self-consistent system over relying on interacting with people?

With this in mind I decided to try to create a transparent system at work to handle salaries.  The problems I see with the way traditional salary is handled, especially the lack of transparency, include:

- It combines performance with remuneration,
- Programmers are notoriously bad at valuing themselves, communicating it with others and ensuring that they are adequately paid during a job interview or while employed,
- It prevents an objective assessment of what your roles and responsibilities are in the organisation,
- It lacks an acknowledgement of what your skills are worth in the job market,
- It creates two groups: management and developers.  This allows a combative attitude to be created and is used to justify why developers shouldn't trust business people and management,
- People tend to find out anyway.

Some of these points I'll admit are difficult to solve whether it's a transparent system or not.  However, the last two points, which I think are especially toxic, can be solved with a transparent system.  In a closed salary system, people are encouraged to secretly find out what other people are worth and to provoke comparisons between each other.  The time periods are often long and the information often incorrect.  If a system is transparent you can solve that problem by making the information accurate and positive.

People tend to ask, "Why does Mary earn more than me?"  I think I'm a better programmer/analyst/whatever than she is.  Was it just because Mary started when the company had more money?

Joel Spolsky is probably one of the key influences that I've seen on having transparent salaries.  For example, "Why I Never Let Employees Negotiate a Raise":

"...we knew that we wanted to create a pay scale that was objective and transparent. As I researched different systems, I found that a lot of employers tried to strike a balance between having a formulaic salary scale and one that was looser by setting a series of salary "ranges" for employees at every level of the organization. But this felt unfair to me. I wanted Fog Creek to have a salary scale that was as objective as possible. A manager would have absolutely no leeway when it came to setting a salary. And there would be only one salary per level."

The Fog Creek Ladder is based on the idea of achieving a certain level of capability.  The problem I had with the Fog Creek solution was that it seemed to suggest, especially in the skills level, that a programmer starts off needing help and working together and then slowly achieves the abilities to work by themselves.  Whereas, where I work we wanted to do the opposite - as you get better at programming you get better at being able to explain, to listen and to work with others.  I think this is especially important if you want to work in an environment with living architecture.

So the inputs are simply what you do - this should be objective and easy to see (again we're assuming a more transparent work environment where work is checked in or on a Wiki - if it's not shared you haven't done it).  It's assumed that you perform well - if you're not, you're not doing your job.  You can argue your role and performance separately to salary as these are assumed correct coming in.

The othe[...]



Imagination Amplifier

2013-01-31T13:23:16.714+10:00

This is a reproduction of an article that appears in COMPUTE!'s Gazette, Issue 59.  I'm reproducing it here because I think it's particularly good and on archive.org it appears in formats where it's unlikely to be found again. Alternative link to his November COMPUTE! article.

Worlds Of Wonder - WOW!

In this month's mailbag I received a letter from Art Oswald of Goshen, Indiana. Art was responding to my article in the November COMPUTE! magazine about computers of the future. He wrote: "In the future, the phrase 'I wonder' will become obsolete. I won't have to wonder what would happen if, or wonder what something was like, or wonder how something might be. I would just ask my computer, and it would simulate by means of holographic projection anything my imagination could come up with."

Now, I ask you, Art, is this something to look forward to or something to dread?

I have a new science-fiction book coming out which deals with this subject — the effect of computers (and electronic media, in general) on the human imagination. The book is Robot Odyssey I: Escape from Robotropolis (Tor Books, April 1988). Listen to two teenage boys carrying on a conversation in the year 2014:

We think plenty using computers, but we don't imagine. We don't have to imagine what the fourth dimension is, or what will happen if we combine two chemicals, or what the dark side of the moon looks like. The computer is there a step ahead of our imagination with its fantastic graphics, cartoons, and music. We no longer imagine because the computer can do our imagining for us.

"So why imagine?" Les said. "My pop says most people's imaginations are vague and fuzzy anyway. If the computer imagines stuff for them, it'll probably be a big improvement."

Les is right. If the computer "imagines" something, it is usually based on a database of facts, the vision of an artist, or a scientific model created by experts. How could our puny imaginations compete with images that are this inspired, detailed, and exact?

Frontiers Of Knowledge

Science-fiction writers think a lot about new worlds of wonder. It is the human desire to "go boldly where no man has gone before" that is among our more noble impulses. It may even be the "engine" that drives us to innovate, invent, and take risks. Without this engine, we might sink into a kind of emotional and intellectual swamp. Life could become extremely boring. Every time we contemplated a decision, we would first ask our computer, "What if?" and see what the consequences might be. Knowing too much might even paralyze us and cool our risk-taking ardor.

Imagination Amplifiers

Art writes that the phrase I wonder may be rendered obsolete by computers, but I'm not certain that he's right. Instead, I think that we could use computers to stimulate our imagination and make us wonder about things even more.

Where does our imagination come from? I picture the imagination as a Lego™ set of memory blocks stuffed into the toy chest of our mind. When we imagine something, we are quickly and intuitively building a tiny picture inside our heads out of those blocks. The blocks are made up of images, tastes, smells, touches, emotions, and so on — all sorts of things that we've experienced and then tucked away in a corner of our minds.

The quality of what we imagine depends on three things: how often we imagine, the quantity and diversity of blocks that we have to choose from, and our ability to combine the blocks in original — and piercingly true — ways.

Most of us have "pop" imaginations created from images supplied to us by pop culture. We read popular books, see popular movies, watch the same sitcoms and commercials, and read the same news stories in our newspapers. It's no wonder that much of what we imagine is made up of pr[...]



Not Much Better

2012-09-20T14:40:27.113+10:00

I've been reading "A Question of Truth", which is primarily about homosexuality in the Catholic church and references to it in the Bible. It has a lengthy, careful but very easily read introduction, explaining many things to do with currently held views, the difference between acts and intents, and the damage it does to people. It carefully describes the different aspects of sexuality, separates all the issues well and does a reasonably good job of describing the difference between intensional and extensional usage.

A lot of this is Bible study 101 - the modern ideas like love, homosexuality, marriage, property, slavery, and so on have moved or did not exist when the Bible was written, so what people often read into it is not the original intent - not that I would say that the original intent is much better - and that's the real problem.

The book effectively reasons around all the major passages that people use to treat gay people badly. However, in the course of the reasoning, it just seems to move from treating homosexuality as sinful to reinforcing women's historical position in society.

For example, the infamous Leviticus passage about men not lying with men is reasoned to mean not that the act is wrong but that a man shouldn't treat a man like a woman. Another is the story of Lot and our friends the Sodomites, which again involves offering up your daughters for hospitality reasons, and the suggestion is that Sodom was destroyed because they humiliated the visitors, not because of any need for gay love.

There's a sentence or two along the lines that no modern Christian would treat women in this way (or have slaves?), which I thought rather undermines the whole point of the exercise.



Constructivism - Why You Should Code

2013-02-01T09:25:46.146+10:00

I think this article on why you shouldn't code is wrong. It's wrong in the way I was wrong in high school when I thought I would never need to know German, art or biology. It's wrong in the way I was wrong about never needing to know set theory or relational theory or category theory. But it's also wrong in ways I will never really know, "Computer As Condom":
Debbie hated math and resisted everything to do with it. Tested at the bottom of the scale. She learned to program the computer because this let her play with words and poetry, which she loved. Once she could write programs she found a way to tie fractions into words and poetry. Writing witty programs about fractions led her to allow herself to think about these previously horrible things. And to her surprise as much as anyone's her score on a fractions test jumped into the upper part of the scale.
What you do as a job programming in C#, Java, JavaScript or whatever has very little to do with the way people use coding to learn about learning. That's the most disappointing thing about the article. It is the terrible idea that learning how to code lessens the world if you do it wrong. Learn to code backwards in time in Latin in Perl but don't listen to anyone who says you shouldn't code.



Lectorial

2012-05-14T08:40:09.078+10:00

I just finished a study group on Learn You a Haskell for Great Good.  It was a great experience for many reasons, but I think the way each session was structured into a combination of lecture and tutorial deserves particular attention.

The weekly structure was fairly straightforward: a chapter leader covers a chapter the week before the rest of the group, writes a summary and sets some programming questions.  The weekly sessions took about an hour and a half.  This consisted of the chapter leader going through their summary, allowing the group to interject with questions and answers (if the chapter leader didn't know), or there might be some furious Googling to find a good reference or answer that someone half remembered.  The programming questions and answers would usually go around the table: each person would answer a question and the others would then comment on it or show their answer if it was particularly different (or shorter or whatever).  The time was split roughly 60/40 between lecture and programming/tutorial.

Compared to university courses, where you often had two hours of lectures and then one or two hours of tutorials often spread out over a week, this arrangement seemed to be very time efficient.  The other advantage was getting the students to run the study group.   The chapter leader has to spend a lot more time making sure they understood the chapter in order to answer any questions that would come up during the review and to set the programming questions.  For me, setting the questions and making sure you had answers (and by the end of it tests to help people along) was probably the best part of the learning experience.  There was no real hiding if you hadn't done the answers either - partially because it was such a small group but also because of the high level of participation.

It'd be interesting if there were university courses where you were graded not just on an examination and assignments but also on the questions you set and whether you were able to run a small group of people through a class.  It would also make tutorials, which are often dropped by students, more relevant.

It seems "lectorial" also means, "large tutorial in a lecture hall to give context around information given in lectures".  They also mention small group activities and class lead presentations so there is some overlap.



One Platform

2012-01-29T08:27:01.940+10:00

A couple of things have struck me about the iPad and iBooks Author.  If you want to read some background John Gruber has a good summary.  It may well come down to whether being focused on one thing is wrong.

Firstly, Steve Jobs is quoted in his biography saying he'd give away textbooks.  This is a pretty big bargaining chip when Apple was talking to the textbook publishers: go with us or we'll give your product away for free. How does this differ from Bill Gates saying they'd cut off Netscape's oxygen supply?

The other thing it reminds me of is Bill Gates' developer virtuous cycle.  This is where developers write applications for a particular platform, users pick the platform with the most applications, which then feeds back to developers supporting that platform.  In the past, developers have had to make a single choice as to which platform they wanted to support in order to succeed.  It continues to happen with Android and iPhone.  Jeff Raikes has given a good example from the early days of Microsoft in "The Principle of Agility", he says:

"I suspect many of you or in the audience might if I ask you the question of, "What was Microsoft's first spreadsheet?" You might think Excel was our spreadsheet. But in fact, we had a product called Multiplan...Our strategy was to be able to have our application products run on all of those computing platforms because at that time there were literally hundreds of different personal computers.

And on January 20th, 1983 I realized, I think Bill Gates also realized we had the wrong strategy. Any guesses to what happened on January 20th, 1983? Lotus, it was the shipment of Lotus 1-2-3. How many personal computers did Lotus run on in January of 1983? One, and exactly one. And it was a big winner. So what we learned was when it came to customer behavior. It wasn't whether you had a product that run on all of those computing platforms. What really mattered to the customer was, did you have the best application product on the computer that they own. And Lotus 1-2-3 was the best spreadsheet. In fact, it was the beginning of a "formula for success in applications". That I defined around that time called, "To win big, you have to make the right bet on the winning platform."

So what's the principle? The principle is agility. If you're going to be successful as an entrepreneur what you have to do is you have to learn. You have to respond. You have to learn some more. You have to respond some more. And that kind of agility is very important. If we had stayed on our old strategy, we would not be in the applications business today. In fact, one of the great ironies of that whole episode is that in the late '80s or early '90s our competitors, WordPerfect, Lotus. What they really should have been doing was betting on Windows. But instead they were betting on and WordPerfect was the best example. Betting on, putting WordPerfect on the mainframe, on minicomputers. In fact, they went to the lowest common denominator software strategy which we switched out of in the 1983 timeframe. So, for my key principle is, make sure that you learn and respond. Show that kind of agility."

This is echoed in one of the recent exchanges (about an hour into MacBreak Weekly 283) where Alex Lindsay talks about how important it is to him to make education interesting and how he's not going to wait for standards, he just wants to produce the best.  Leo Laporte responds by saying how important it is for an open standard to prevail in order to prevent every child in America having to own an iPad in order to be educated or informed.

You have to wonder if developers have reached a level of sophi[...]



Pretending We All Don't Know

2012-03-18T20:06:28.224+10:00

Some amazing writing and performance by Mike Daisey (mp3):
He just walked up to the Foxconn plant and wanted to see if anyone wanted to talk to him:

I wouldn't talk to me...she runs right over to the very first worker...and in short order we cannot keep up...the line just gets longer and longer...everyone wants to talk...it's like they were coming to work everyday thinking, "You know it'd be great?  It'd be so great if somebody who uses all this crap we make, everyday all day long, it'd be so great, if one of those people came and asked us what was going on because we would have stories for them...

I haven't gotten all the way through but he has a bit about talking to a girl that cleaned the glass on the assembly line:

You'd think someone would notice this, you know?  I'm telling you that I don't know Mandarin, I don't speak Cantonese...I don't know fuck all about Chinese culture but I do know that in my first two hours on my first day at that gate I met workers who were 14 years old, 13 years old, 12.  Do you really think that Apple doesn't know?  In a company obsessed with the details.  With the aluminium being milled just so, with the glass being fitted perfectly into the case.  Do you really think it's credible that they don't know?  Or are they just doing what we're all just doing, do they just see what they want to see?


It seems absolutely credible that they do know.


Update, 17th of March: In "Retracting Mr Daisey" (mp3) it appears his story was more fiction than not.  I took it more as performance than journalistic reporting, but many claims aren't just errors, they were simply made up.  He did out and out lie when asked about child labour: "Well I don't know if it's a big problem. I just know that I saw it."  Which is a shame, because verified reports of these conditions contain similar claims.




Jesus Says Share Files

2012-01-04T08:02:23.083+10:00

A famous story: Jesus takes some fish and loaves (accounts differ - although maybe he did it more than once, setting up the "Jesus's Food Multiplier" stall every second and fourth Saturday of the month) and feeds some people (again accounts differ and they don't count women and children as people - let's just skirt around that entire issue, shall we).

Everyone was impressed - even the disciples who were fishermen and had deep ties with the community.  They didn't say, "Hey, Jesus you've just destroyed our business model, you can't go around feeding thousands of people per fish.  One person, one fish - that's the way it has always been and that's the way it should always be."



Partitioning Graphs in Hadoop

2012-09-20T14:47:35.024+10:00

A recent article at LinkedIn called "Recap: Improving Hadoop Performance by (up to) 1000x" had a section called "Drill Bit #2: graph processing" mentioning the problem of partitioning the triples of RDF graphs amongst different nodes.

According to "Scalable SPARQL Querying of Large RDF Graphs" they use MapReduce jobs to create indexes where triples such as s,p,o and o,p',o' are on the same compute node.  The idea of using MapReduce to create better indexing is not a new one - but it's good to see the same approach being used to process RDF rather than actually using MapReduce jobs to do the querying.  It's similar to what I did with RDF molecules and creating a level of granularity between graphs and nodes as well as things like Nutch.



Why the Cloud isn't the Internet

2012-01-05T10:20:09.180+10:00

I think people are just starting to realize what some of the cloud vendors are providing and their drawbacks.  Steve Jobs is quoted in his biography describing the intent behind iCloud:

"We need to be the company that manages your relationship with the cloud - streams your videos and music from the cloud, stores your pictures and information, and maybe even your medical data...over the next few years, the hub is going to move from the computer into the cloud...So we wrote all these apps - iPhoto, iMovie, iTunes - and tied in our devices, like the iPod and iPhone and iPad...We can provide all the syncing you need, and that way we can lock in the customer."

The Mac has always been different to Windows.  One of those differences Windows users notice is that you switch between applications in OS X compared to documents (or windows) in Windows. The Apple cloud maintains that pattern by syncing between applications rather than documents (or individual files).  This approach confuses a lot of people.

This is different to how most Mac users currently sync their files with Dropbox.  iCloud has ended up following its Mac heritage whereas Dropbox sticks to file syncing.  Matthew writes:

"The difference between Dropbox and iCloud synchronization is that Dropbox is theoretically just a file system...If you have a document that you edit on your iPad and sync with Dropbox you can edit that same file, using a different application, on your PC...The iCloud experience is completely different. The only way to edit a document across platforms or devices is to use a version of the application for each device. Not a compatible application...it may actually make me change the desktop application that I use purely based on iCloud support."

If you want to read more about Dropbox and Apple there's a really good article in Forbes which details how Steve Jobs personally made an offer to buy Dropbox.

The edges of iCloud - the integration points to applications and the operating system - are incomplete even if you buy into the idea of applications over documents. For example, on iOS devices there is a Notes application but on OS X these notes are in a tab in the Mail application.  This seems like a weird and non-standard place to put them - if you are going to sync by application you'd think it should be the same application across platforms. In iCloud for Windows, Windows users get more choice than OS X users: Mail, Contacts and Calendar integrate with Outlook, but you can choose your application for Bookmarks (IE or Safari) and Photo Stream.

Even within applications, Apple haven't quite gotten syncing right with iCloud yet either, including the new rules around where files are stored and what is automatically removed or backed up.

The cloud is about vendor lock-in as much as any other platform, like application servers or databases, but with the extra problem that your data is tied to the vendor's application, cloud and user base.  A stickier solution.

Some, like Google and Facebook, offer export services, but these almost don't matter, because you get an almost useless hunk of data, lose the ability to run the applications and you can't access users on their network (who may well have been collaborators).

With the Internet, the Web and open source you still have the possibility to use your data with applications shared by many people across different networks.[...]



Global Code Retreat 2012

2011-12-31T22:30:02.092+10:00

I went to the local Global Code Retreat held on the 3rd of December.  Overall, it was an amazing event - very well hosted and attended.  The basic structure of the day was 6 or so 45 minute sessions trying to implement something with a different person each time.  At the end of the 45 minutes, no matter how far you had got, you deleted your solution.

The problem was "The Game of Life".  I'm pretty familiar with this problem having come across "Conway's Game of Life" early on in a magazine like Compute! or Byte. However, if you walked away with a really awesome solution to "The Game of Life" you probably missed the point - most of the things that were being taught were hidden.

The solution was really beside the point.  One of the main reasons is to repeat solving the problem from scratch based on an idea called kata (movements practiced by yourself or in pairs).  This is something that I had come across in "The Pragmatic Programmer" which at the time reminded me of the time I had spent with projects at home - reimplementing the same thing over and over again.

Steve Yegge mentions the same thing in his article "Practicing Programming".  He mentions that even as you program in your day job you may not actually be practicing programming.  Repetition in solving the same problems seems to be about keeping the problem fixed and then changing how you approach it and freeing you from any time constraints.  Most programming jobs involve solving the solution once (or if you're lucky doing a proof of concept and then implementing it again).

The first time around it was awful.  I didn't know what I was doing, my environment was a little bit shaky, we couldn't agree on a language and I spent a lot of the time just setting it up. It made me become aware that for the first time, practically ever, my personal computer had diverged from my work computer.  Not in the "normal" Windows at work, Linux and OS X at home - but what I do at home and at work have diverged to the point where I'm learning stuff in many directions and there's almost no overlap between the two.

The second time was much better.  There was less discussion on languages to use, how to approach the problem, how do you test drive it, whose computer to use and so on.  There was still discussion but we both shared a bit more context this time which made the discussion flow.  A big difference to the first time.

The third time around changed the format a little to where you couldn't talk to the person but you could only express requirements through tests.  So this sorted out the people who were testing from those who weren't.  But it also seemed to reduce the clutter around what needed to be done.  Tests are much less ambiguous compared to talking through requirements and so once you set up a rhythm of tests it became much easier.  Also, the whole room was very quiet.  You could imagine that a team doing silent TDD and pair programming wouldn't be the noisiest group in the room (for once).

Each round thereafter changed the programming requirements: no loops, methods no more than 3 lines, and no if statements.

What did I learn?  Heaps.  I ended up doing Ruby quite a bit and mostly the solution came out at about 30 lines of production code and 30 lines of tests and you could pretty much do it in the time allocated.  I also did solutions in C# and Haskell.  The Haskell solution came out at about 30 lines total - both tests and production code - and met every constrain[...]
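For reference, the rules everyone was reimplementing are small. A minimal Ruby sketch (one possible shape, written here for illustration - not one of the solutions from the day, and ignoring the extra constraints of the later rounds):

require 'set'

# One generation of Conway's Game of Life over a Set of live [x, y] cells.
def next_generation(live)
  counts = Hash.new(0)
  live.each do |x, y|
    (-1..1).each do |dx|
      (-1..1).each do |dy|
        next if dx == 0 && dy == 0
        counts[[x + dx, y + dy]] += 1
      end
    end
  end
  counts.select { |cell, n| n == 3 || (n == 2 && live.include?(cell)) }
        .keys.to_set
end

blinker = Set[[0, 1], [1, 1], [2, 1]]
p next_generation(blinker)  # the blinker flips vertical: {[1, 0], [1, 1], [1, 2]}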



A Review of QI Live

2011-12-31T22:30:10.737+10:00

When Douglas Adams visited Brisbane in 2000 (possibly 1999) I had a friend sign a copy of "Starship Titanic" - I was too busy at work to see him myself.  I have always been a little disappointed that I didn't take the hour to see him.  When Stephen Fry announced "QI Live" on Twitter I made sure I wasn't going to miss out.  It was only in Melbourne and Perth (at the time) and bugger it if I was going to Perth, so even though I didn't live in Melbourne, I got some.

As I walked into the theatre the place names of France that sound funny appeared on the screen.  I remembered them from a previous episode of the TV series.  It didn't matter, they'd just played the theme for "Pinky and the Brain" and were playing Tom Lehrer's "The Elements".  I was in a good mood and I was happy to be there.

My wife thought she saw the producer, "What John Lloyd?".  Surely not, but soon after, an usher told us to stop taking photos.  It was a little annoying because I was mainly taking photos of the theatre - I'd never been to the Queen Victoria Theatre before.  Then a prissy voice said that Mr Fry's man servant would like you all to turn off your phones - so I did.

The only bit I now remember of Stephen's opening was him telling a joke about a chicken going to a library.  It was told well, I guess, but too long for me because I'd heard it before and spent most of the joke remembering my grandmother telling me it about 20 years before.  Maybe that was what he was going for.  What was worse, it led to the fact that a frog in California is the only species in the world to go ribbit.  Another recycled fact and I was getting a little bit annoyed.

Colin Lane was the first panelist and pretty forgettable.  He and Andrew played tennis with the "Nobody Knows" paddles which was okay the first time.  Denise Scott didn't really seem to get the format but she had some good anecdotes - such as being recognised as looking like that person on TV but that she couldn't possibly be that person.  Andrew Denton was by far and away the star - he, seemingly, confused Stephen by saying, "If nobody knows, why isn't he on the panel?"  It made the whole thing almost worth it.  Except that Stephen then went on to spend most of the evening calling him stupid.

The first question was about koalas having fingerprints that are indistinguishable from human fingerprints and that maybe they were doing all the robbing in Australia (which they said has the highest burglary rate in the world).  This fact doesn't seem to be true now although it was true in 2001 which is probably closer to when it was written. Some of the other content that was from previous episodes included: kangaroos not farting and having 3 vaginas, when does the sun set (a video), the Beatles' HELP album cover, the most popular song being the default Nokia ring tone and slavery not being illegal in the UK until recently.  This is just me guessing but Andrew Denton knew every answer.  The members of the audience shouting out certainly did.  But then so did Alan and so did I.  Well, I might not have known every question.  Alan answered over the top of the question for 100 points so I didn't hear what it was.

I might be wrong, but one of the ways the show works is that Alan doesn't know the answer to every question.  Sometimes some of the guests do (John Sessions and Rory McGrath) but the point is: Alan is the kind of guy the show is supposed to be educating - [...]



Building a Network Over Transactions

2011-12-31T22:38:50.770+10:00

"The new meaning of customer value: a systemic perspective" analyses providing value to customers from a systems perspective.

I had a thought, a little while ago, that Google is probably one of the first companies where the users and content providers are basically the same people, and that they make money, through adverts, by connecting these two together using search.

It's probably not a new thought but at the time I started to draw a diagram of how it all works.  I happened across the same sort of diagram in this paper (page 4).

A perverse example is when you search for something and the first hit is your own blog.  You're now both the producer and consumer of the same content - with adverts sandwiched in the middle - hmm value sandwich.

In the paper they use Google and Apple as examples:
"Google has indeed realized the usability of systemic value-creation principles in building its offering. In contrast to Apple, it uses the value network to generate the revenues. Google provides free, easy-to-use tools for customers to use on the internet, the aim being to generate “eyeballs” for the ads of the advertising customers. In collecting these “eye balls” it has or it creates a product for every internet activity that attracts lots of traffic. From the firm's perspective, the offering elements are integrated to provide the audience for the ads, information being gathered in order to better scope the ads or just to make the customers happy and to promote other products."
There's an old idea, for the Web anyway, of building a network of customers above extracting value out of each transaction. Over-valuing the creation of the network led to the whole dot com bubble, and I have been thinking about how business models have progressed since then.



One horizontal and one vertical monitor for Ubuntu 11.04

2011-09-30T05:42:42.157+10:00

I recently had problems configuring Ubuntu with dual screens using an NVidia card - the first screen is horizontal and the second screen is vertical - they are both Dell U2711 monitors. The idea is to be able to have Firebug running on one screen without obscuring the web page on the other.

This should be a simple thing. But from what I can tell NVidia's TwinView driver doesn't support different monitor rotations (under Linux). But with X Windows it should still be easy using dual X Screens. It should just be a matter of going to NVidia X Server Settings, selecting "Separate X Screen" and then selecting "Enable Xinerama". Unfortunately for me, this caused general weirdness where the first screen was mostly black and the second screen was displayed horizontally.

The way I fixed the problem was to disable Compiz. The easiest way I found to disable Compiz was to log in using the "Ubuntu Classic (No effects)" session.

Then it was just a matter of enabling separate X Screens and Xinerama and enabling rotation (RandRRotation). Here are the bits in my xorg.conf to rotate my second monitor to the left:

Section "Monitor" 
# HorizSync source: edid, VertRefresh source: edid
Identifier "Monitor1"
VendorName "Unknown"
ModelName "DELL U2711"
HorizSync 29.0 - 113.0
VertRefresh 49.0 - 86.0
Option "RandRRotation" "on"
Option "DPMS"
EndSection
Section "Screen" 
Identifier "Screen1"
Device "Device1"
Monitor "Monitor1"
DefaultDepth 24
Option "TwinView" "0"
Option "metamodes" "DFP-2: nvidia-auto-select +0+0"
Option "Rotate" "left"
SubSection "Display"
Depth 24
EndSubSection
EndSection



When Kworkers Don't

2011-09-26T20:10:27.732+10:00

I recently had a problem where kworker threads were taking up 100% CPU on my Ubuntu box.

Various threads seem to suggest the problem lies with interrupts around PCI or power saving features. The thread with the answer that worked for me was "HELP !!! Zombie attack ... (kworker)".

To turn both of them off use:
noapic acpi=off


I found I only needed to turn off acpi though:
acpi=off


Put that on the GRUB_CMDLINE_LINUX_DEFAULT line in your grub configuration (sudo vi /etc/default/grub), run sudo update-grub and restart.

I hate not knowing why though.



End of JRDF

2011-09-20T09:48:58.105+10:00

Many things have changed since I started JRDF in 2003. It feels like JRDF has come to a natural conclusion.

Some of them are things I've failed to do very well: get contributors, implement different file format parsers, find enough time to refactor existing bits, etc. I'm also not that interested in Java anymore (as was becoming increasingly obvious - it did have Scala in there at one point and still has some Groovy DSL code).

The most recent change I've seen is that JSON has achieved some of what RDF was trying to do and I see it more and more in the way people use it to expose their data in a RESTful way. The tooling is less onerous and the ease of use is higher even if what you get is much less.

There are also external factors, like the W3C's official RDF API (for Java and JavaScript), which is largely the same thing but with official backing.

I've enjoyed developing it and meeting and talking to other people in other groups (especially Jena and Sesame). And of course, none of this would've happened if it wasn't for a lot of other people: Paul Gearon, Simon Raboczi, David Wood, David Makepeace, Tom Adams, Yuan Fang-Li, Robert Turner, Brad Clow, Guido Governatori, Jane Hunter, Imran Khan and Abdul Alabri, and the other guys and girls at Tucana/Plugged In Software/UQ.