Brain dump archived articles


Brain dump and France and Politics · 17 Oct 2014 at 0:07 by Jean-Marc Liotier

A nation of destitute street boys and peasants, gone barefoot with old rifles to war against every one of their neighbors, led by visionaries with statements such as “terror is nothing else than justice, prompt, severe, inflexible; it is thus an emanation of virtue”, stained with the blood of all the innocents they beheaded but strong in their belief in ideas that scare all the world’s governments. The French (circa 1793).

Nothing to do with the IS of course, though I would love members of the IS to read my posts – they would be utterly outraged at being confused with people who fought in the name of the Declaration of the Rights of Man and of the Citizen !

Brain dump and Politics · 15 Oct 2014 at 11:42 by Jean-Marc Liotier

Citizen of a country whose current regime was founded on the corpses of 16,594 beheaded people, during a period known as The Terror, I feel uniquely qualified to comment on terrorist beheadings.

Not that I usually need an excuse to open my big mouth, but let’s not pass on this excellent one to indulge in some punditry !

Guess why Eugen Weidmann’s guillotine execution on the 17th June 1939 was the last one the French performed in public ? Unbeknownst to Parisian prison officials, a film camera had been set up in one of the apartments overlooking the scene.

The public was scandalized by their own violence; the government embarrassed. In response France banned public executions. Weidmann went down in history as the last man in France to be guillotined for the entertainment of the awaiting crowd (a dubious distinction).

The government did not find fault in the grisly execution itself—of course it couldn’t have, that would have been an admission of justice’s guilt—rather it blamed the so-called unruly behavior of the savage crowd. The spectacle of bloodlust was, apparently, too powerful for film. Public guillotining was hidden behind the confines of the prison wall—privatized to conceal the spectacle.

Today, we still sentence to death, but we make sure the gore stays out of sight. As one HN commenter put it:

In some ways, the U.S. has done to executions and automated foreign assassinations what the supermarket has done to eating meat. We are distanced from the act so that we aren’t overly burdened thinking about what is done in our names, both as citizens and voters. Hence, we do not oppose something that we normally would, were we only more aware of it.

Not having to wipe bloody bone shards and bits of blasted flesh from their tablet’s screen certainly is among the reasons why people outraged at the beheading of innocents on video still tolerate remote airborne executions of no less innocent people.

Michael Leunig sums it up best:
Michael Leunig's Beheadings
So ? What do the French, the Saudis and the Queen of Hearts have in common ? They know better than to behead people in public – it is just a basic matter of marketing communications management.

Brain dump and Technology and The media · 26 Jun 2012 at 13:42 by Jean-Marc Liotier

/set rant_mode on

A digit is a numeral from 0 to 9 – so the French translation is “un chiffre”. Surprisingly, I find myself having to add that the French translation of “a digit” is not “un doigt” – you may use your fingers for counting, but in the end it is all about numbers not body parts.

Therefore the proper translation of “digital” in French is “numérique” – the French word “digital” describes something related to fingers. A digital device may be finger operated, but its digital nature is related to binary processing… The presence of a keyboard is accessory.

Increasingly, I find my compatriots using “digital” to qualify anything run by computing devices without having to mention them by name – because computers, data processing, electronics and such drab technicalities are uncool compared to the glittering glitz of mass-marketable trinkets. I resent this lamentable technophobic trend but, if you want to indulge in such decadence, please at least use the proper French word.

From now on you’ll know that any French person caught saying “digital” instead of “numérique” spectacularly exposes his ignorance – you know who they are and you are welcome to anonymously report them in this article’s comments (with links to incriminating tweets for bonus ignominy).

I obviously don’t mind people using English. I don’t even mind loan words – they are part of how a language evolves. But I do object to mindless namespace pollution: using loan words does not exempt one from semantic coherence.

Call me a pedant if you want, but if you attempt to degrade our essential communication tools you’ll find me on your path and I’ll be angry !

Brain dump and Knowledge management and The Web and Writing · 13 May 2011 at 0:23 by Jean-Marc Liotier

Using this blog for occasional casual experience capitalization means that an article captures and shares a fragment of knowledge I have managed to grasp at a given moment. While this frozen frame remains forever still, it may become stale as knowledge moves on. Comments contributed by the readers may help in keeping the article fresh, but that only lasts as long as the discussion. After a while, part of the article is obsolete – so it is with some unease that I see some old articles of dubious wisdom keep attracting traffic on my blog.

Maybe this unease is the guilt that comes with publishing in a blog – a form of writing whose subjective qualities can easily slide into asocial self-centered drivel. Maybe I should sometimes let those articles become wiki pages – a useful option given to contributors on some question & answer sites. But letting an article slide into the bland utilitarian style of a wiki would spoil some of my narcissistic writing fun. That shows that between the wiki’s utility and the blog’s subjectivity no choice must be made : they both have their role to play in the community media mix.

So what about the expiration date ? I won’t use one : let obsolete knowledge, false trails, failed attempts and disproved theories live forever with us for they are as useful to our research as the current knowledge, bright successes and established theories that are merely the end result of a process more haphazard than most recipients of scientific and technical glory will readily admit. To the scientific and technical world, what did not work and why it did not work is even more important than what did – awareness of failures is an essential raw material of the research process.

So I am left with the guilt of letting innocent bystanders hurt themselves with my stale drivel, which I won’t even point to for fear of increasing its indecently high page rank. But there is not much I can do for them besides serving the articles with their publication date and hoping that the intelligent reader will seek contemporary confirmation of a fact draped in the suspicious fog of a less informed past, with an author even less competent than he is nowadays…

Brain dump and Knowledge management and Networking & telecommunications and Technology · 16 Dec 2010 at 13:19 by Jean-Marc Liotier

Piled Higher & Deeper and Savage Chickens nailed it (thanks redditors for digging them up) : we spend most of our waking hours in front of a computer display – and they are not even mentioning all the screens of devices other than a desktop computer.

According to a disturbing number of my parents’ generation, sitting in front of a computer makes me a computer scientist and what I’m doing there is “computing”. They couldn’t be further from the truth : as Edsger Dijkstra stated, “computer science is no more about computers than astronomy is about telescopes”.

The optical metaphor doesn’t stop there – the computer is indeed transparent: it is only a window to the world. I wear my glasses all day, and that is barely worth mentioning – why would using a computer all day be more newsworthy ?

I’m myopic – without my glasses I feel lost. Out of my bed, am I really myself if my glasses are not connected to my face ?

Nowadays, my interaction with the noosphere is essentially computer-mediated. Am I really myself without a network-attached computer display handy ? Mind uploading still belongs to fantasy realms, but we are already on the way toward it. We are already partly uploaded creatures, not quite whole when out of touch with the technosphere, like Manfred Macx without his augmented reality gear. I’m far from the only one to have been struck by that illustration – as this Accelerando writeup attests :

“At one point, Manfred Macx loses his glasses, which function as external computer support, and he can barely function. Doubtless this would happen if we became dependent on implants – but does anyone else, right now, find their mind functioning differently, perhaps even failing at certain tasks, because these cool things called “computers” can access so readily the answers to most factual questions ? How much of our brain function is affected by a palm pilot ? Or, for that matter, by the ability to write things down on a piece of paper ?”

This is not a new line of thought – this paper by Andy Clark and David Chalmers is a good example of reflections in that field. Here is the introduction :

“Where does the mind stop and the rest of the world begin? The question invites two standard replies. Some accept the demarcations of skin and skull, and say that what is outside the body is outside the mind. Others are impressed by arguments suggesting that the meaning of our words “just ain’t in the head”, and hold that this externalism about meaning carries over into an externalism about mind. We propose to pursue a third position. We advocate a very different sort of externalism: an active externalism, based on the active role of the environment in driving cognitive processes”.

There is certainly a “the medium is the message” angle on that – but it goes further with the author and the medium no longer being discrete entities but part of a continuum.

We are already uploading – but most of us have not noticed yet. As William Gibson puts it: the future is already here – it’s just not very evenly distributed.

Brain dump and Debian and Free software and Systems administration and Unix · 17 Nov 2010 at 19:54 by Jean-Marc Liotier

I stumbled upon this dent by @fabsh quoting @nybill : “Linux was always by us, for us. Ubuntu is turning it into by THEM, for us”.

It definitely relates to my current feelings.

When I set up an Ubuntu host, I can’t help feeling like I’m installing some piece of proprietary software. Of course that is not the case : Ubuntu is (mostly) free software and, as controversial as Canonical‘s ambitions, inclusion of non-free software or commercial services may be, no one can deny its significant contributions to the advancement of free software – making it palatable to the desktop mass market not being the least… I’m thankful for all the free software converts that saw the light thanks to Ubuntu. But nevertheless, in spite of all the Ubuntu community outreach propaganda and the involvement of many volunteers, I’m not feeling the love.

It may just be that I have not myself taken the steps to contribute to Ubuntu – my own fault in a way. But as I have not contributed anything to Debian either, aside from supporting my fellow users, religiously reporting bugs and spreading the gospel, I still feel like I’m part of it. When I install Debian, I have a sense of using a system that I really own and control. It is not a matter of tools – Ubuntu is still essentially Debian and it features most of the tools I’m familiar with… So what is it ? Is it an entirely subjective feeling with no basis in consensual reality ?

It may have something to do with the democratic culture that infuses Debian whereas in spite of Mark Shuttleworth‘s denials and actual collaborative moves, he sometimes echoes the Steve Jobs ukase style – the “this is not a democracy” comment certainly split the audience. But maybe it is an unavoidable feature of his organization: as Linus Torvalds unapologetically declares, being a mean bastard is an important part of the benevolent dictator job description.

Again, I’m pretty sure that Mark Shuttleworth means well and there is no denying his personal commitment, but the way the whole Canonical/Ubuntu apparatus communicates is arguably top-down enough to make some of us feel uneasy and prefer going elsewhere. This may be a side effect of trying hard to show the polished face of a heavily marketed product – and thus alienating a market segment from whose point of view the feel of a reassuringly corporate packaging is a turn-off rather than a selling point.

Surely there is more to it than the few feelings I’m attempting to express… But anyway – when I use Debian I feel like I’m going home.

And before you mention I’m overly critical of Ubuntu, just wait until you hear my feelings about Android… Community – what community ?

Brain dump and Politics and Technology · 08 Nov 2010 at 1:42 by Jean-Marc Liotier

Evil implies that corporations can be judged as humans, but they are not : corporations are just soulless. They know neither right nor wrong. By definition, a corporation exists merely as a maximization function toward the goals of its shareholders. That is why, in spite of having legal personality, corporations cannot exist in the political sphere that holds control and oversight in the name of the public good – though the extent to which the financial resources of corporations are employed to influence political campaigns shows how poorly that separation of power is applied.

Charles Stross’ Accelerando is heavily loaded with buzzwords – though it is a fun read and a great reflection on post-humanity. Among the interesting concepts that pepper the story, I found the “Turing-complete company constitution” – if you have legal personality, then why not Turing completeness ? And then why not go all the way to human-equivalent sentience and cognitive abilities or better ? You may, but it won’t matter because whatever their sophistication, corporations have a mandate inscribed in their lowest level code that merely makes them paperclip maximizers.

Whether you consider them anthropomorphic artificial intelligences or just really powerful optimization processes, corporations don’t care about you anyway. To paraphrase Eliezer Yudkowsky : they don’t hate you, nor do they love you – you just happen to be resources that they can use for something else.

Brain dump and Economy and Free software and Marketing · 11 Apr 2010 at 10:45 by Jean-Marc Liotier

In the wake of the Ordnance Survey’s liberation of the UK’s geographical information, I just had an interesting conversation with Glyn Moody about the relationship between free digital publishing and the sale of same data on physical substrate.

If computer reading is cheaper and more convenient, can free digital publishing lead to sales of the same data on a physical substrate ? Free data on a physical substrate has market value if the substrate has value on its own or if the data has sentimental value. That is a potential axis of development for the traditional publishing industry : when nostalgia and habits are involved, the perceived value of the scarce physical substrate of digitally abundant data may actually increase. Of course, free data has value on its own – but, as the reader of this blog certainly knows, it involves a business model entirely different from that of physical items.

Identification of content producers, quality control, aggregation, packaging… This is what a traditional editor does – and it is also what a Linux distribution does. Isn’t it ironic that the Free software world and the world of traditional publishing have had such a hard time understanding each other ?

Some actors did catch the wave early on. In the mid-nineties, I remember that my first exposure to Free software took the form of a Walnut Creek CD-ROM – at the time there was a small publishing industry based on producing and distributing physical media filled with freely available packages for those of us stuck behind tens-of-kilobytes-thin links in the Internet’s backwaters. And there were others before : since time immemorial, the Free software industry has understood that the market role of producing data on a physical substrate is distinct and independent from managing the data. As Glyn Moody remarked, it is only a matter of time before the media industry as a whole gets it.

Strangely, the media industry lags at least fifteen years – and probably twenty : even in mainstream publications, the writing has been on the wall for that long. To prove that, here is an excerpt from a 1994 New York Times article by Laurie Flynn, “In the On-Line Market, the Name of the Game Is Internet” :

“I think Compuserve as a business is going to change very radically,” said David Strom, a communications and networking consultant in Port Washington, N.Y. “It could be they’re going to become a pipe, an access provider to the Internet, rather than a content provider.”

But Compuserve, like other on-line services, says it will continue to find ways to differentiate its offerings from databases of similar information on the Internet, by providing better search tools, a more organized approach and better customer service.

Compuserve has just released a CD-ROM, to be updated bimonthly, that works with its consumer on-line service to add video clips and music to the service in a magazine-like format. In the first edition, for example, users can view a video clip from a Jimmy Buffett concert and then with a click of the mouse connect to the Compuserve on-line service where they can order the audio CD. All the on-line services are working to add multimedia.

“Compuserve has 15 years experience in organizing that data and making it easy for them to find it and grab it,” Mr. Hogan said. “It’s not just a user interface issue but how content is packaged.”

The history of Compuserve since then shows that they were never able to fully execute that vision. But it shows how long it took for the idea of free data as lifeblood of a multi-industry symbiotic organism to get from visionaries to a mainstream business model.

In the nineties, we had to endure the tired rear-guard debate of “content vs. pipes”. The coming of age of Free data confirms that the whole thing was moot from the very start. In 1984, Stewart Brand said “Information Wants To Be Free. Information also wants to be expensive… That tension will not go away”. I believe that said tension is most definitely in the process of going away, as free data will dominate and feed a system of economic actors who will add value to it and feed each other in the process.

Brain dump and Consumption and Mobile computing and Unix · 17 Jun 2009 at 22:23 by Jean-Marc Liotier

I acquired an HTC “G2” Magic less than two days ago. It runs the Android operating system. The feeling of being confronted with something very alien pushed me to record my first impressions with it to give an account of how a foreigner perceives the Android world with his naïve eyes, in his own words. For other systems where I’m a power user, I find the experience of newcomers interesting when they candidly point out problems we ignore because we have simply grown used to them.

This entry relates my feelings from Friday night to Sunday night. It may seem ridiculous in the future, but it is an instantaneous snapshot – for what it is worth.

First contact with Android is a severe case of culture shock. More than ten years of Palm OS have shaped my expectations, and the disappointing past year with S60 on the Nokia E71 has not changed them much. But plunging into Android is unlike anything I have used so far – many of the UI conventions feel utterly strange. The home screen has a familiar status bar – but beyond that, Android is in a class of its own. For example, instead of grabbing the scroll bar and sliding it, one has to slide the list itself – not illogical, but the contrary of any familiar widget kit I have come across anywhere else. And many other things are just as alien.

After poking around a little, my first reaction is disorientation from the lack of keyboard. I could write tolerably fast with Palm’s Graffiti, but I was in love with the Treo-style keyboards – on the Treos as well as on the E71 they let me write considerable volumes fluidly and without excessive strain. But my first attempts at text entry on the G2 are stumbling hit-and-miss torture with each word containing at least two typing errors. And why does the automatic correction insist on changing my “” address to “” every single time I enter it in a web form ? Text entry on the G2 is tedious enough without this sort of annoyance…

To be fair, I knew I had to expect text entry woes – I had anticipated that risk when choosing the G2 over the G1. Learning a new tool takes time, especially when low level reflexes are involved, so I have budgeted a few weeks for climbing up the learning curve. Then I’ll decide if I like onscreen keyboards or not. But whatever the learning, it seems that text entry on a virtual keyboard requires keeping one’s eyes on the keyboard – whereas with a physical keyboard, after a while muscle memory sets in and you can forget the keyboard to concentrate on what you are writing. So I’m not optimistic so far – but I’ll keep my mind open. Meanwhile, tactile feedback screens are on the way – I’ll keep an eye on them.

I miss the four-way arrows button, but the obscene pointing device works rather well although, like the rest, it will take time to get used to. A “page down” button would be even better than having to swipe the whole screen every time I want to scroll down one step – one page at a time would be more precise than scrolling a random number of lines according to how much inertia the widget takes from the swipe. Screen swipe and inertia are sexy gimmicks, but I don’t understand how heavy users tolerate them for more than five minutes. I’m the sort of user who disables smooth scrolling and any on-screen animation that introduces the slightest lag in my interaction with the system – and I know I’m not the only one who wants responsiveness above everything else.

A combination of importing in Evolution a CSV file generated with Outlook, synchronization from the Nokia E71 to Google and copying native Evolution contacts to Google did not manage to capture at once all of the information I wanted transferred – so I had to munge some of the data and re-enter quite a few of the notes and addresses manually… Hopefully that is the last time I do that. Those contacts had often been through various synchronizations between Palm OS devices, Outlook and Evolution – but getting them to Google seemed lossier than usual. I have read about many other contorted data migration paths, and this one looked straightforward enough – but if I have to do it again I’ll spend time setting up better automation. I’ll concede that it does not have much to do with Android – anywhere you look the synchronization ecosystem seems quite wet behind the ears.

I won’t complain too much about how tightly tied the system is with Google’s applications – after all that is a major feature of Google’s Android business model. With the default applications, Android is a seamless extension of the Google universe. Synchronization of calendar and contacts is excellent – although it only happens eventually and you have no way to know when or to trigger it (this is the first time I ever see this implemented with no control by the user). In addition, I am very uncomfortable with the idea of using a third party as my synchronization hub and I’ll look for another way.

But every functionality seems available through an API – so with the user in control and free to act there is no reason to complain. I’ll ignore the Gmail and Google Talk clients, and replace them with a decent XMPP client and an IMAP client better than the default one – and maybe I’ll even find a decent contacts manager. Meanwhile the native Google Maps client is such a pleasure to use that I could forget everything else (though I wonder why the relief layer has been omitted – I find it very useful for planning human powered movement).

The scarcity of configuration options, output logs and exposed information in general leaves me wanting. For example the Jabiru Jabber client tells me “connexion error” but won’t explain anything, resulting in frustration. Of course, Jabiru is not part of the basic system, but this rarefied atmosphere seems to be the norm in the Android world. And why is there no option to sort the contacts by “family name, first name” instead of the default “first name, family name” ? Would that clutter the interface too much ? Even the simplicity-worshipping Palm OS gave that choice…

I guess that a compromise has been struck in favor of simplicity by default over configurability, and that developer tools are available to provide advanced access to deeply buried parameters – but for example not being able to set the language to English with the French keyboard upsets me a lot. I’m used to English as a device language, and I’m used to the French “AZERTY” keyboard – reading French or typing on a QWERTY keyboard with no accents feels awkward. On any other system I know, keyboard and language are two separate options – but not on Android. I hope I find an application that provides finer grained options. I was also frustrated not to find any configuration option that would solve my above-mentioned problem with the scrolling style.

The Android Market feels sluggish. I have been spoilt by APT caching all package descriptions locally – and now I have to suffer the Android Market loading package descriptions and icons slower than I scroll across a list that only shows six items per page, with no way of getting the device to display smaller characters in order to cram more lines per page. I can understand that the icons must be stored online for storage space’s sake, and maybe the user comments in order to keep them current – but why not load the whole package list at once ? And why does every list on this device use a standard widget that seems sized large enough for legibility by half blind users ? Where is the configuration option ?

Many of my gripes are probably related to the default applications, and after exploring the Android Market for a while I’m sure I’ll feel better. I’m commenting on an operating system in its default form, and this is obviously not how I’m going to use it – in a few weeks, after the normal process of appropriation, my Android will hopefully not look and feel like its current state at all.

So see you in a couple of months for a look back at these first impressions – we’ll see which were real problems and which were merely artefacts of the clash of cultures ! For now I have the eerie feeling of having stumbled into a sort of Apple-esque Disneyland with my hands tied…

Brain dump and Debian and Identity management and Security and The Web · 18 Mar 2009 at 18:19 by Jean-Marc Liotier

The PGP web of trust is a social network, even if many of the people who published their keys would never admit joining one. But there are fewer than sixty thousand users, so the low density of users in most social environments causes weak connectivity in the web of trust : the strong set (the largest set of keys such that for any two keys in the set, there is a path from one to the other) ties together fewer than fifty thousand users. This has been a problem for a long time : in 1997 the strong set was only 3100 keys out of sixty thousand published. And in a fast expanding online social sphere, a stagnating network of sixty thousand users is marginal. Of course, many of those users participate in core institutions of the developer community, but that does not make that population any less marginal. Many don’t mind that marginality, but our taste for elitist cave-dwelling among like-minded peers will not change the fact that effective software development is a social sport. Societies need trust, and restricting our communications to people whose idea of a party is a key signing party is not going to help us very much, so a solution is needed.
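In graph terms, the strong set is the largest strongly connected component of the signature graph. As a sketch of the idea (the toy graph, names and edge list below are made up for illustration), Kosaraju’s algorithm finds it:

```python
from collections import defaultdict

def strongly_connected_components(edges):
    """Kosaraju's algorithm: return the SCCs of a directed graph.

    In the web of trust, an edge u -> v means u has signed v's key;
    the 'strong set' is the largest SCC.
    """
    graph, rev = defaultdict(list), defaultdict(list)
    nodes = set()
    for u, v in edges:
        graph[u].append(v)
        rev[v].append(u)
        nodes.update((u, v))

    seen = set()

    def dfs(u, g, out):
        # Iterative depth-first search appending nodes in post-order.
        stack = [(u, iter(g[u]))]
        seen.add(u)
        while stack:
            node, it = stack[-1]
            for nxt in it:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append((nxt, iter(g[nxt])))
                    break
            else:
                stack.pop()
                out.append(node)

    # First pass: record finish order on the original graph.
    order = []
    for u in nodes:
        if u not in seen:
            dfs(u, graph, order)

    # Second pass: sweep the reversed graph in decreasing finish order.
    seen.clear()
    components = []
    for u in reversed(order):
        if u not in seen:
            comp = []
            dfs(u, rev, comp)
            components.append(comp)
    return components

# Toy web of trust: alice, bob and carol sign in a cycle; dave only signs alice.
edges = [("alice", "bob"), ("bob", "carol"), ("carol", "alice"), ("dave", "alice")]
strong = max(strongly_connected_components(edges), key=len)
print(sorted(strong))  # prints ['alice', 'bob', 'carol'] - dave is outside the strong set
```

The 1997 keyanalyze reports computed exactly this on the published keyring, which is why the strong set is so much smaller than the total key count: a single unreciprocated signature is not enough to join it.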

The PGP web of trust is no longer the only application that supports a social graph. With the recent mainstream explosion of social networking and digital identity applications, there is an embarrassing wealth of choices, such as Google’s OpenSocial specification that proposes a common set of APIs for social applications across multiple sites. Social networking in a web environment, including all forms of publication such as blogging, microblogging, forums and anything else that supports links, is a way to build digital identity. Each person who follows your updates or links to your articles is in effect vouching for the authenticity of your personae, and each one who adds you as a “friend” on a social network is an even stronger vote toward the authenticity of your profile, even if some people add any comer as their “friend”.

The vetting process in social networking applications is in effect just as good as the average key signing outside of a proper key signing process : some will actually check who they are vetting, others will happily sign anything – and it does not matter too much because the whole point of the web of trust is to handle a continuous fabric whose nodes have different reputations and no guarantee of reliability. The result is a weak form of pseudonymous web of trust – just like the PGP web of trust. But even with an untrusted technological infrastructure, it is just about strong enough for common social use.

An anaemic GPG web of trust and thriving social networking applications are obvious matches. So what about a social networking application that handles the PGP web of trust ? As usual, similar inputs through similar individuals generate similar outputs – the same problems with the same environment and the same tools handled by people who share backgrounds produce the same conclusions. So now that I am trawling search engines about that concept, I find that I am not the only one to have thought about it. Who will be the first to develop a social networking application plug-in that links a profile to a GPG key, to facilitate and encourage key signing between members of the same platform who know each other ?

Arts and Brain dump and Knowledge management and Methodology and Social networking and The Web · 23 Jan 2009 at 14:43 by Jean-Marc Liotier

Amanda Mooney remarks that :

It’s hard to maintain the illusion that you’re particularly special, talented and original when, with a quick Google of whatever genius idea you’ve come up with, you see that 3 billion people have already thought that, done that, analyzed that, criticized that, indexed the history of that in Wikipedia and made a fortune on that… In 1995.

So now, to really live up to our parents’ and teachers’ praise, we have to work a lot harder, be a lot smarter and know that we’re competing with all of those other 3 billion people who think like us and have already started to act on the kind of ideas and “talent” we have.

Actually it was always like that, but slower and invisible. Original ideas are few because similar inputs through similar individuals generate similar outputs – the same problems with the same environment and the same tools handled by people who share backgrounds produce the same conclusions. So it is not surprising that concepts are invented simultaneously and reinvented all the time. I don’t feel belittled by finding out that I’m not unique – on the contrary : I feel empowered by finding that I’m not isolated anymore. I remember lounging in libraries in my youth, reading esoteric technical books chosen at random. I often resented not being able to share that with people who have similar interests. Now we can find each other easily and all be surfing together at the wavefront. Childhood dreams came true – life is good !

But if you anguish about being a unique snowflake just like all the other unique snowflakes, there is still hope for you. Our mental agility and cultural malleability suffer from rather heavy inertia, so the processing stage is not readily manipulable. That leaves only the input to be tinkered with in the short term – and you can play with inputs a lot ! This is why it is important to cultivate diversity in your social network, and it is also why adding some noise to your web feeds is good for you. Who is not addicted to new stimuli ?

Brain dump04 Jan 2009 at 3:29 by Jean-Marc Liotier

From Wikipedia’s paper size entry :

The international paper size standard, ISO 216, is based on the German DIN 476 standard for paper sizes. Its unique quality is its scalability: the height divided by the width of all formats is the square root of two (1.4142), so when any sheet is folded in half, the two halves have the same proportions, and any image can be reproduced on the half-size paper by reducing it by about 70% (0.707 is the reciprocal of √2). To double an image area, the multiplication factor is about 140%. These options commonly appear on photocopiers and image projectors.

Within the ISO metric system, the base format is a sheet of paper measuring 1 m² in area (A0 paper size). Successive paper sizes in the series – A1, A2, A3, and so forth – are defined by halving the preceding paper size parallel to its shorter side. The most frequently used paper size is A4 (210 × 297 mm). An advantage is that a standard A4 sheet made from 80 g/m² paper weighs 5 grams, allowing one to know the weight – and the associated postage rate – of a letter by counting the sheets used.
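The halving rule quoted above can be checked in a few lines – a quick sketch, starting from the published A0 dimensions and rounding down to the millimetre as the standard does:

```python
from math import isclose, sqrt

def a_size(n):
    """Dimensions (width, height) in mm of ISO 216 size A{n}.

    A0 is 841 x 1189 mm (area ~1 m², aspect ratio ~√2); each
    following size halves the longer side, rounded down to the mm.
    """
    width, height = 841, 1189  # A0
    for _ in range(n):
        width, height = height // 2, width
    return width, height

def sheet_weight_grams(n, grammage=80):
    """Weight of one A{n} sheet: A0 is 1 m² and each size halves the area."""
    return grammage / 2 ** n

print(a_size(4))              # (210, 297) -- the familiar A4
print(sheet_weight_grams(4))  # 5.0 grams, hence the easy postage arithmetic
assert isclose(1189 / 841, sqrt(2), rel_tol=1e-3)  # the √2 aspect ratio holds
```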

All those years of using A series paper and I did not know that… It all makes sense now !

I also realize my unabashed taste for engineering trivia…

Brain dump and Email31 May 2008 at 10:40 by Jean-Marc Liotier

I work as a project manager for a very large ISP that Dilbert readers would sometimes find strangely familiar. But nothing prepared me for the shock and disbelief I experienced when some of my co-workers in the information systems division asked me why I kept sending them mail written in 10-point Courier New, when I was in fact sending them plain text.

Following their remark, I soon found that many people likewise found my messages difficult to read because of that supposedly poor choice of font. Apparently, no one realized that plain text is rendered in whatever font you want it rendered in, including Fette Fraktur or even Zapf Dingbats if you fancy hieroglyphic form.

Sometimes I wonder if I am really working for an ISP. If you think that such a company is an oasis of Internet culture, you would be sadly disappointed to join one nowadays.

Anyway here is a tip for them, straight from the horse’s mouth :

  1. From the main Microsoft Outlook window, on the “Tools” menu, click “Options” and then click the “Mail Format” tab.
  2. Click “Fonts“.
  3. Next to the “When composing and reading plain text” box, click “Choose Font“.
  4. Select the fonts you want, and then click “OK“.
  5. Enjoy my plain text messages in your favorite font and size.

Plain text grants the recipient the freedom to render it as he sees fit, including in Braille or as audio speech – plain text is that flexible.
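The point is easy to see on the wire : a plain text message declares a content type and a character set, never a font. A small illustration using Python's standard email library (the addresses are of course made up):

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "sender@example.com"
msg["To"] = "reader@example.com"
msg["Subject"] = "Plain text travels well"
msg.set_content("This body is text/plain: your mail client renders it\n"
                "in whatever font you configured, not one I picked.")

# The headers carry no typography whatsoever:
print(msg.get_content_type())  # text/plain
```

Whatever 10-point Courier New my co-workers saw was their own client's rendering choice, not anything present in the message itself.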

I do not have a fetish for the spartan aesthetics of plain text in green monospace font on a black text console. What I appreciate is universal portability : I can read plain text on any device in any situation and process it with any tool including old or underpowered ones – and I actually do. That is the power of plain text.

Of course, as an interpreted markup language, HTML can also be rendered in a variety of ways and I could probably use it, but plain text provides even more freedom. HTML being a standard, I nevertheless welcome it in my mailbox even if I seldom send any HTML mail myself. A good maxim to live by when writing anything that has to interoperate with other systems is : “Be liberal in what you accept, and conservative in what you send”. It is as old as the Internet, and it remains a great way to make things compatible and interoperable.

Writing this rant probably dates me as a unixian dinosaur… But it could be worse : just wait until someone gets me started about proper mail quoting !

Brain dump and Jabber and The Web09 May 2008 at 9:57 by Jean-Marc Liotier

Openness is everything – the rest is details. The technology is there and people have been talking about it for more than a year. Let’s bow to the inevitable : just as Compuserve, AOL, The Source, Prodigy and their ilk have all dissolved in the Internet, Twitter will find a decentralized replacement. And let’s make the inevitable happen by pushing XMPP !

TechCrunch reported that “over the last few days a number of popular bloggers have complained, loudly, that it’s time to ditch Twitter and move to a decentralized version of the service that won’t go down every time usage spikes“. I could not care less about the outages themselves : I am not even a Twitter user. But I do think there are good uses for micro-blogging and social instant messaging, so I want a free and open solution. That means decentralization in the classical Internetworking style.

Brain dump and Systems04 May 2008 at 15:23 by Jean-Marc Liotier

The openMosix Project has officially closed as of March 1st 2008. This brings back nostalgia for the toy OpenMosix cluster I once had running for a few years, assembled from an ailing collection of dusty hardware heating my apartment and infrequently put to productive use for large batch jobs. Soon I found that a single less ancient machine could perform about as fast, if not faster, for less electricity – and batch jobs being what they are, I could just as well let them run while I slept. But in an age when I had more time than money (I now have neither…) and when compression jobs were measured in hours, OpenMosix was a fun and useful patch for which I foresaw a bright future.

A few years later, the efficient scheduler in recent Linux releases lets me load my workstation to high values with barely any consequence for interactive tasks, so I don’t really feel starved for processing power. But I still spend too much time staring at progress bars when editing photos, so more available CPU could definitely speed up my workflow. This is why I look longingly at the servers in the corridor that spend most of their lives at fractional loads while the workstation is struggling. Manual load balancing by executing heavy tasks on remote hosts is a bit of a chore, so I go browsing for single-system image clustering news, wondering why easily pooling local system resources is not yet a standard feature of modern operating systems.

One of the major obstacles to the generalization of SSI clustering outside of dedicated systems is that software such as OpenMosix or Kerrighed requires a homogeneous environment : you can’t just mix whatever hosts happen to live on your LAN. For most users, homogenizing their systems around one Linux kernel version, let alone one type of operating system, is not an option.

But nowadays, virtualization systems such as Xen are common enough that they may represent a viable path to homogenization. So I envision using it to extend my workstation to the neighboring hosts. I would run the workstation as a normal “on the metal” host, but on each of the hosts I want to assist the workstation I would run a Xen guest domain with a bare-bones operating system capable of taking part in a single system image with the workstation. Adding capacity to the cluster would be as simple as copying the Xen guest domain image to an additional host and running it at whatever nice level is desired, with no impact on the host apart from the CPU load and the allocated memory.
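To make the idea concrete, here is roughly what each assisting host might run – a hypothetical Xen guest domain configuration. The file name, kernel path and values are purely illustrative, not taken from a working setup:

```
# /etc/xen/ssi-node.cfg -- hypothetical guest domain joining the SSI cluster
name    = "ssi-node"
memory  = 512                      # MB lent to the cluster by this host
kernel  = "/boot/vmlinuz-2.6-ssi"  # a kernel patched for SSI clustering
disk    = ['file:/var/lib/xen/ssi-node.img,xvda,w']
vif     = ['bridge=xenbr0']        # same LAN segment as the workstation
```

Copying the image and the config file to a new host, then starting the domain with `xm create ssi-node.cfg`, would be the whole enrollment procedure.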

This approach looks sane to me on paper, but strangely I can’t find much about it on the web. Is there a hidden caveat ? Did I make an obviously wrong assumption ? Tell me if you have heard of other users extending their workstation with SSI using Xen guest domains on random local hosts. Meanwhile, since OpenMosix is now unsupported, I guess I’ll have to dive into Kerrighed…

Brain dump and Military and Security and Technology20 Jan 2008 at 17:33 by Jean-Marc Liotier

In spite of the hype surrounding micro and nano UAVs and how important they are becoming to winning the struggle for tactical information, I can’t find any reference about how to defend against them. As their current use is mostly on the strong side of asymmetrical warfare, it seems that the industry and the users have simply set the problem aside for now.

But it won’t be long before two high-technology forces equipped with swarms of nano UAVs find themselves fighting each other, and both will certainly clamor for a better fly swatter. Since I can’t foresee very large fly swatters becoming standard issue anytime soon, there is a clear need for some new form of air defense against air vehicles as small as a maple seed.

Will we see micro air defense units in action, complete with toy-sized automatically guided artillery, dust-like shrapnel and tiny missiles ? This heralds the appearance of new dimensions in the tactical environment, and those familiar with nanotechnology forecasting will have recognized the first step of a downscaling war.

Meanwhile I think about the potential for pest control – selectively killing flying intruders seems definitely better than spraying nerve agents in my home…
