Algorithmic efficiency in the face of inelegance

March 1st, 2008

Seed magazine published a wonderful crossover article between the worlds of biology and engineering, "Algorithmic Inelegance":

The complexity of developmental regulation isn’t a product of design at all, and it’s the antithesis of what human designers would consider good planning or an elegant algorithm. It is, however, exactly what you’d expect as the result of cobbling together fortuitous accidents, stringing together helpful scraps into an outcome that may not be pretty, but it works. That’s all evolution needs from developmental processes: something that works well enough, no matter how awkward or needlessly complex it may seem.

Biological solutions can be remarkably efficient (well adapted thanks to evolution by natural selection) while being absurdly backwards in design. Nerves in the mammalian eye pass on top of the rods and cones that collect light, yet I can't complain about my eyesight (other than the need for contacts, but that's another story). Cephalopods apparently have more accurate vision thanks to happier circumstances:

The vertebrate retina is wired “backwards”. That is the photoreceptors point to back of the retina, away from incoming light, and the nerves and blood vessels are on the side of the incoming light, this means that any image formed on the vertebrate retina has to pass though layers of blood vessels and ganglion cells, absorbing and distorting the image….Now consider the eye of squids, cuttlefish and octopi. Their retinas are “rightway round”, that is the photoreceptors face the light, and the wiring and the blood vessels facing the back (1). Squid and octopi have no blind spot; they can also have high visual acuity. The octopus also has a fovea-equivalent structure, which it makes by packing more (or longer) photoreceptors into a given area (1). Because it doesn’t have to create a hole in the supporting tissue it can have arbitrarily large “fovea”, and greater visual acuity. Cuttlefish have better visual acuity than cats (2) and because of their “rightway round” retinas; this level of acuity covers nearly the entire retina (1,2) unlike vertebrates where it is confined to the small spot of the fovea.

More on this fascinating story here.

The author seems to indicate that human engineers would shy away from the needless complexity that evolution by historical accident tends to create. However, today's leading engineers, faced with shifting requirements, overwhelming problem complexity, or an incomplete understanding of the science behind the solution, may well adopt an evolutionary strategy toward solving a problem, as many circuit and drug designers have already done successfully. (I'll try and dig up some references.) They start with a modular design built from lots of simple parts that can be tweaked and rearranged quickly, then set up replication with random modification and a process for selecting the fittest solutions; a minimal sketch of the idea follows below. Like artificial selection, or breeding, this can achieve results quickly. It is similar to what successful organisms do, molded by the evolutionary process of natural selection working hand in hand with mutation and reproduction. And it explains why we burn brightly on this earth, have sex, and above all why we are programmed to die.
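To make the strategy concrete, here is a minimal sketch of such an evolutionary loop in Python. It is illustrative only: the toy problem (maximize the number of 1 bits in a string) and every parameter are my own assumptions, not details from the article or from the circuit and drug design work mentioned above.

```python
import random

GENOME_LENGTH = 32     # number of simple, independently tweakable parts
POPULATION_SIZE = 50   # hypothetical population of candidate designs
MUTATION_RATE = 0.02   # chance each part is randomly modified when copied
GENERATIONS = 100

def fitness(genome):
    """Score a candidate; here we just count the 1 bits (a toy objective)."""
    return sum(genome)

def mutate(genome):
    """Replication with random modification."""
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

def evolve():
    # Start with a population of random candidates built from simple parts.
    population = [[random.randint(0, 1) for _ in range(GENOME_LENGTH)]
                  for _ in range(POPULATION_SIZE)]
    for _ in range(GENERATIONS):
        # Selection: keep the best-fitted half of the population.
        population.sort(key=fitness, reverse=True)
        survivors = population[:POPULATION_SIZE // 2]
        # Refill the next generation with mutated copies of the survivors.
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(POPULATION_SIZE - len(survivors))]
    return max(population, key=fitness)

best = evolve()
print(f"best fitness: {fitness(best)} / {GENOME_LENGTH}")
```

The result, much like the article's developmental programs, is whatever works well enough under selection, not necessarily what a designer would call elegant.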

The legal limits of creativity explored: Recut, Reframe, Recycle

January 12th, 2008

Danah Boyd points out that the Center for Social Media has released an informative paper entitled "Recut, Reframe, Recycle" concerning the boundaries of fair use (fair dealing in Canada) in digital media.

Types of use they point out:
• Parody and satire
• Negative or critical commentary
• Positive commentary
• Quoting to trigger discussion
• Illustration or example
• Incidental use
• Personal reportage or diaries
• Archiving of vulnerable or revealing materials
• Pastiche or collage

Some of these can push the limits of fair use; take, for example, DJ Danger Mouse's Grey Album (a mix of the Beatles' White Album and Jay-Z's The Black Album), which I discussed in Mashups: from hobby to art form to controversy.

The report concludes:

Some of these practices also fit comfortably into the evolving pattern of fair use jurisprudence. By contrast, other video makers appropriate material wholesale and without context or comment, in ways that clearly are not fair use. In all these cases, informed judgment on fair use, following established precedent, should be relatively straightforward. Many times, however, for instance within the category that our researchers called “pastiche or collage,” creators are developing practices that are at or near the boundaries of contemporary fair use analysis. Traditional fair use analysis would neither definitively exclude nor include them—at least until there is a better understanding of motive, context, circulation, and use of the new works. Since fair use doctrine evolves with creative practice, these borderline cases provide important areas for future research and analysis.

They conclude that "a code of best practices around fair use in online video needs to be articulated, both to educate new makers and to provide guidance for regulators private and public." It will be a contentious process to push for more definition in the massive legal grey areas that exist in copyright law regarding fair dealing, but it has such big implications for the (legal) limits of creativity that creators everywhere should be pushing hard for clarity. Unfortunately, the effect of the law thus far has been largely the reverse. Danah gives a good background on why this is such a critical issue in the USA currently:

fair use is quite tricky because courts address it on a case by case basis after someone is sued. There is no list of what constitutes fair use. Thus, remixers engaging in practices that would collectively be viewed as fair use never have certainty that what they’re doing is legal. Because court cases are extremely costly (especially for the lone defendant in the face of Big Mega Corp), corporations can wield a lot of power through the egregious use of “Cease and Desist” letters. Most creators bow down in the face of them even if what they’re doing is totally legit because they are terrified of being sued. In legal terms, a “chilling effect” is when practices are squelched by fear of persecution. Right now, when it comes to remix, we’re in the middle of an ice age. The Chilling Effects Clearinghouse website attempts to counteract some of this effect by collecting and publishing Cease and Desists and other nefarious attempts by corporations to silence fans and critics.

This problem is increasingly relevant to Canadians, as a Canadian government sympathetic to US corporations enables and encourages such business practices.

Android – one step closer to freedom of communication

November 5th, 2007

Android: the open phone platform

Like most heavy cellphone users, I have a love-hate relationship with both the device and the wireless service. Love, because it gives me freedom from a physical location and is thus an essential business enabler for entrepreneurs. Hate, because it restricts my freedom in many ways that are designed solely to make more money for the handset maker and communications provider:

• handsets are locked to a specific service provider's network (consumers sign away the right to a competitive market for a small subsidy on the handset upfront)
• restricted ability to run applications (e.g. no Skype for BlackBerry or iPhone)
• plans that are designed to extract the most money possible from entrepreneurs (especially in Canada, where collusion between the "competitors" ensures high margins)

So thanks to Google and others for supporting the development of an "open" platform for phones, one which threatens to transform the market.

Next step: we need openly accessible wireless networks for such phones. Currently the only really open network is the internet, and only if you can afford an unfiltered and reasonably symmetrical last mile, which rules out many consumer internet products, including wireless data (cost-prohibitive in Canada). Otherwise, widespread internet access is within reach only if you are a hotspot hacker extraordinaire.

Maybe, just maybe, Google's plan to bid for the former UHF 700 MHz spectrum and make it accessible to third parties is connected with the Android project. Just maybe ;)

To be fair, Android is getting a lot of hype almost a year before we can expect to see anything, while other mobile phone operating systems are vying for a place as relatively open mobile internet platforms. Om Malik just published a review of the market for mobile platforms pointing out alternative mobile Linux platforms, and recently Alistair introduced us to Nokia's N810 tablet, which shows potential as an open mobile communications platform even if it doesn't have an integrated cellphone. It will be interesting to see the next platform moves of Microsoft, currently with a relatively open platform as mobile platforms go, and RIM, whose tightly restricted BlackBerry platform is a favorite with enterprises and e-mail junkies.

Can the web help fight corruption?

October 21st, 2007

Lessig turns his keen insight and trademark presentation style towards the analysis of corruption as he steps away from 10 years of focusing on intellectual property issues and their impacts on culture, creativity and innovation. I wish him luck, as he will have the full force of the political and business establishments fighting him to their eventual unlamented demise.

For a quick take on the hour-long lecture, skip to 8:23, a story about how the sugar institute managed to influence the food nutrition board (FDA) into setting the recommended maximum intake of sugar at 25% of your daily calories, instead of the WHO's advised maximum of 10%. Hang on through 10:30, where he talks about how lobbyists have skewed the global warming "debate." Other issues he covers are the pharmaceutical industry targeting doctors with "bribes that are not considered bribes" and the influence of private-interest funding on "scientific" research. His bottom line is that everywhere we look, money determines decisions more than ever, often trumping the interests of the citizens.

It all seems depressing, but what gives Lessig and the rest of us hope for democracy is the potential of the internet, particularly the web, to render the forces behind policy decisions more transparent and available to a wider population of stakeholders, namely the citizens (I know this is a shocking concept!). Perhaps the availability of hitherto inaccessible information will inspire more of the population to become politically involved. I'm not yet holding my breath on this one, but let's dare to dream.

Towards the end of the lecture, Lessig highlights MapLight, which exists to expose the relationship between legislators' voting patterns and their sources of funding in the US Congress. Also featured is the Sunlight Foundation:

The mission of the Sunlight Foundation is to use the transformative power of the Internet and new information technology to enable citizens to learn more about what Congress and their elected representatives are doing, and thus help reduce corruption, ensure greater transparency and accountability by government, and foster public trust in the vital institutions of democracy.

Aside: Canadians should check out the Data Libre project.

Corruption is probably a human condition, arising from social interaction and economics and unlikely to disappear, but perhaps, like many of our weaknesses, it can be minimized. We are in the earliest days of the development of new tools that combine openly accessible data, visualization technology, and the reach of the web. It remains to be seen whether they will help improve our democracy or become yet another system to be influenced and gamed by special interests.

The Web as operating system

October 19th, 2007

Alistair, who has been writing recently at GigaOM, puts together the most lucid synthesis I have seen on the topic of the "Web OS."

Big Internet companies are making themselves the OS of the web 2.0 world. In addition to the fundamentals — operating a web application, storing data, handling logins — each company has a core expertise. In Google's case, it's page ranking and relevance; Facebook maps social relationships; Salesforce knows about customer relationships; and eBay has an auction and reputation engine.

Each of these web OS service platforms can in theory interoperate, and competitors can in theory be mixed and matched. Alistair points out that developers who use these sites as platforms still have tenuous relationships with the terms of use (such as the vibrant Facebook developer community). This will be pretty contentious, as many of the big web site "platforms" have incentives to build their own apps. But all software platform companies realize that developer communities are the key to success, and if Facebook doesn't find the right balance of developer rights and incentives, someone else will. In an ideal world we assume a free market where the user can basically choose their web OS options from a range of competitors. The question is how much competition the market can sustain: can there be multiple platforms for each service (who competes with eBay?), and can there be multiple winners within the ecosystems that run "atop" those platforms? I think the answer to the latter is yes, but my concern is with the former (competition for platforms) and with how much collusion there will be between the established "components" banding together monopolistically (eBay and Facebook sitting in a tree), which has been a bit of an issue in the software world.

In a slight digression I leave you with the following totally awesome video on the nature of information and how the web is setting it free:

Can code be bad for the planet?

October 14th, 2007

Alistair has an interesting post on Earth2Tech, the thesis of which is that inefficient coding practices can lead to environmental harm.

I almost misled myself into thinking AC is blaming virtualization and SaaS/IaaS (infrastructure as a service) for creating such inefficiencies. Rather, he skips past the obvious environmental benefits of server consolidation (improved resource utilization) and service centralization (via economies of scale) and instead builds on the consequences: code inefficiencies become more obvious when you're no longer massively overprovisioning hardware. This is an opportunity as much as a challenge. Bad code did matter previously, but not to the extent that is foreseen: we're scaling web applications to a much larger degree than ever before. There are more users, and inefficiencies are multiplied.

So it is great news that virtualization and on-demand infrastructure will allow us to focus more on code efficiency since, as Alistair (incidentally a veteran of application monitoring) points out, they expose the more granular economics of computing. These technologies are paving the way to greater infrastructure efficiencies and, by forcing better utilization of hardware, putting more focus on the efficiency of the code that cohabits the infrastructure.

Increasing code efficiency has generally been unimportant except in edge cases. Stability and function have been more of a concern, while Moore's law and incomplete costing of infrastructure have more than compensated for poor performance. Rapid application development platforms proliferate based upon the ability of modern hardware to crunch "affordably" through multiple layers of abstraction.

What, me worry?

There's definite potential for code to have an environmental impact. We have an existing ecological disaster on our hands with the castoff personal computing hardware of both enterprises and consumers. Almost all of that computing power was wasted idling, never used, there just for the ability to load Microsoft Office applications quickly. How do we achieve more efficient code? As more people rely more on computing, the costing of which is becoming more accurate and granular, and as the barriers to entry for developers drop, we should witness an evolutionary process at work battling inefficiency (a toy simulation of this dynamic follows the list), assuming:

• large population of users
• competing applications
• rapid generation spans with modification
• market exerts selective pressure
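Here is a toy simulation of that market dynamic in Python. Everything in it is an invented assumption (the number of apps, their costs, improvement rates, migration behaviour); it only illustrates how the four conditions above, taken together, drive resource costs down over generations.

```python
import random

random.seed(1)  # reproducible toy run

# Competing applications, each with a (hypothetical) resource cost per user.
apps = {f"app{i}": random.uniform(1.0, 10.0) for i in range(8)}
# A large population of users, split evenly at first.
users = {name: 1000 for name in apps}

for generation in range(20):
    # Rapid generations with modification: each app ships a new version
    # that is, at random, somewhat more or less efficient than the last.
    for name in apps:
        apps[name] *= random.uniform(0.85, 1.05)
    # Selective pressure: a fraction of users migrate toward the app that
    # is currently cheapest to run (the market reacting to cost).
    cheapest = min(apps, key=apps.get)
    for name in users:
        if name != cheapest:
            migrating = users[name] // 10
            users[name] -= migrating
            users[cheapest] += migrating

ranked = sorted(users, key=users.get, reverse=True)
print("cheapest app:", min(apps, key=apps.get))
print("top apps by users:", [(n, users[n]) for n in ranked[:3]])
```

Shrink the migration fraction (weak selective pressure) and the population stops converging on the efficient apps, which is roughly where I think we are today.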

While I think these evolutionary forces are already at work, the selective pressures have been weak: the environment has been so abundant that we have had a Cambrian explosion of inefficiencies. Eventually those inefficiencies will be represented in costs that the market will react to, assuming the market has the freedom to do so. This is where intellectual property issues and the "one platform to rule them all" attitude may present a bit of a speedbump, but only that.

Alistair's most important point is that proper costing of computing is essential: if we want to minimize environmental impact, we need to measure the efficiency of the work performed by applications and the true cost of the resources they consume. My conjecture is that an evolutionary process of anthropogenic artificial selection, automated or not, should optimize resource utilization. This rests upon the premise of a competitive market, which I believe we are just starting to see in the world of software.
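As a back-of-the-envelope sketch of what such costing might look like, here is a small Python example. The rates and CPU times are invented for illustration; the point is only that once work is priced per unit, an inefficient implementation becomes a visible line item.

```python
# Hypothetical prices; substitute real ones from your provider and utility.
CPU_DOLLARS_PER_HOUR = 0.10   # on-demand compute rental
WATTS_PER_BUSY_CORE = 20.0    # marginal power draw of a busy core
DOLLARS_PER_KWH = 0.12        # electricity price

def cost_per_request(cpu_seconds):
    """True cost of one request: compute rental plus the energy it burns."""
    hours = cpu_seconds / 3600.0
    rental = hours * CPU_DOLLARS_PER_HOUR
    energy = (WATTS_PER_BUSY_CORE * hours / 1000.0) * DOLLARS_PER_KWH
    return rental + energy

# Two made-up implementations of the same feature, compared at scale.
for name, cpu_s in [("lean implementation", 0.02), ("layered framework", 0.4)]:
    print(f"{name}: ${cost_per_request(cpu_s) * 1_000_000:,.2f} per million requests")
```

At a million requests a day, a twenty-fold difference in CPU time compounds into a cost (and an energy bill) that a competitive market can actually select against.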

For now I'm much more concerned with how poorly conceived code can compromise privacy, and with code designed to restrict our freedom of communication and innovation. But those are stories for another post.

The world's on fire

August 8th, 2007

The funds that Sarah McLachlan would have spent to produce a music video are donated for more practical uses, and the video is instead used to help us visualize just how much of a difference $150,000 can make to those less fortunate than us:

I'm a long-time fan and proud she is a fellow Canadian.

Infreemation: just the facts

July 2nd, 2007

Apologies to RSS subscribers who will no doubt be affected by my blog consolidation. I'm pulling in all of the posts from www.infreemation.net and rebranding this blog Infreemation. Why the consolidation? Most of what I write in both blogs is about information: technology, policy, analysis, visualization. I haven't been making much time for my blogs lately, as I was making time for growing my businesses (more on this later), attending lots of interesting conferences (Mesh, Interop, Web2Expo) and, most importantly, getting married and disappearing on a "lune de miel" (honeymoon)! I'll be blogging professionally soon, which will move some of the more internet-geek type posts out of the personal blog and allow me to talk about my tangential, hobbyist interests here.

For the one or two of you who are still with me, let me reward you with one of my favorite exchanges caught on film, as Inspector Clouseau schools his sidekick in some of the finer points of deductive reasoning:

Clouseau: Facts, Hercule, facts! Behind them lies the whole fabric of deductive truth. Now, Hercule, let us examine these facts: 1 [holding up 1 finger]
Clouseau: she was found with the murder weapon in her hand, 2 [holding up 3 fingers]
Clouseau: the murder weapon was fresh with blood, 3 [holding up 4 fingers]
Clouseau: there were no fingerprints on the murder weapon other than hers and 4 [holding up all 5 fingers]
Clouseau: all the members of the Ballon household staff have perfect alibis. Now then, Hercule, what do these facts add up to?
Hercule LaJoy: Maria Gambrelli killed Georges the gardener.
Clouseau: You are an idiot. Only a fresh-faced novice would come up with a conclusion like that.
Hercule LaJoy: But the facts…
Clouseau: Listen, whoever killed Miguel killed Georges the gardener, and he did it to cover up the first crime. Now what he is trying to do is lay the blame at the foot of this, this poor servant girl.
Hercule LaJoy: Well, who do you suspect?
Clouseau: I suspect everyone.
Hercule LaJoy: Well, I suppose that is possible.
Clouseau: Possible? What do you mean, possible? I deal in certainties.

I guess you kinda have to see it for yourself… If you haven't already, I recommend checking out "A Shot in the Dark."

Owning the rainbow: why spectrum can and should be freed

February 28th, 2007

Lessig's presentation on the control of US airwaves is brilliantly clear, as usual:

An inconvenient graphic – visual analytics

February 12th, 2007

Some graphics from An Inconvenient Truth have been posted to Flickr. Copyright has most certainly been violated here, but I suspect Gore would prefer that the message spread, and perhaps this will inspire more folks to check out the movie.

An Inconvenient Graph

This graph showing a sharp rise in CO2 has generated a lot of controversy, especially because of its extrapolation of future CO2 levels. For an excellent analysis of the controversy, and a discussion of how visualization approaches such as Gore's slideshow can be used to turn complex data into information, look no further than David Womack's article: Seeing is believing: Information visualization and the debate over global warming.

Excerpted from the above:

Information visualization is able to communicate the intricacies of global warming in a way no other discipline can. Its messages can be immediate and powerful, without sacrificing the level of detail necessary to represent the complex subject accurately. Not only is information visualization helping scientists and politicians communicate with the public, it is a primary tool for scientific study, and for the study of science itself. It is particularly telling that even the medium of film could not compete with the power of these visualizations. According to "An Inconvenient Truth" director Davis Guggenheim, "I thought, a film about a slide show? A filmed lecture? I don't get it. And then I saw his slide show. The information in it is so powerful, and we all just felt like, what if we could give people a front-row seat to this."

Wider adoption of visual analytics, while useful for scientific analysis and "misuseful" or easily manipulated for special interests, will at least stimulate wider debate and analysis by increasing the accessibility of data to society (see Gapminder). And this, I dare to believe, will lead to good things.