Systems and Travels – 29 Nov 2005 at 14:23 by Jean-Marc Liotier

My real time earth view used to feature only a view centered on Europe, the Middle East, Africa and the Atlantic. I now also provide an Asia-centered view and another centered on the Americas.

As before, the views are recalculated every few minutes, cloud cover is updated eight times a day, and the daylight background map is NASA’s Blue Marble monthly map, automatically rotated in place on the first day of each month.

The available views and resolutions :

Systems – 05 Nov 2005 at 17:24 by Jean-Marc Liotier

While looking for a way to remove the <meta name="ROBOTS" content="NONE"%/> meta tag from some of the pages produced by Geneweb I stumbled upon a relatively new tool with interesting potential – mod_publisher :

Mod_publisher turns the URL mapping of mod_proxy_html into a general-purpose text search and replace. Whereas mod_proxy_html applies rewrites to HTML URLs, and in version 2 extends that to other contexts where a link might occur, mod_publisher extends it further to allow parsing of text wherever it can occur.

Unlike mod_proxy_html there is no presumption of the rewrites serving any particular purpose – this is entirely up to the user. This means we are potentially parsing all text in a document, which is a significantly higher overhead than mod_proxy_html. To deal with this, we provide fine-grained control over what is or isn’t parsed, replacing the simple ProxyHTMLExtended with a more general MLRewriteOptions directive.

My feeling is that the authors are considerably understating how much CPU this thing is going to cost. Production-minded people were certainly cringing at that thought while reading the description, but I foresee immense power for hacks of last resort.
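
To make that concrete, here is a toy sketch in Python of what a general-purpose search-and-replace output filter amounts to – this is not mod_publisher’s configuration syntax or API, just an illustration of the principle : every outgoing page is scanned and rewritten in full, which is exactly where the CPU cost comes from. The rule below happens to be the very tag I wanted to strip from Geneweb’s output.

# Toy illustration of a search-and-replace output filter, in the spirit of
# mod_publisher (this is NOT its configuration syntax or API).
import re

# Rules applied to every outgoing HTML page ; the pattern below is the
# robots exclusion tag I wanted to remove from Geneweb's pages.
RULES = [
    (re.compile(r'<meta\s+name="ROBOTS"\s+content="NONE"[^>]*>', re.I), ""),
]

def filter_response(body):
    """Scan and rewrite the whole response body - simple, but it touches every byte of every page."""
    for pattern, replacement in RULES:
        body = pattern.sub(replacement, body)
    return body

if __name__ == "__main__":
    page = '<html><head><meta name="ROBOTS" content="NONE"/></head><body>...</body></html>'
    print(filter_response(page))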

Systems – 05 Nov 2005 at 17:05 by Jean-Marc Liotier

By default Geneweb asks robots to abstain from indexing the pages it generates. I wanted to :

  • Make the content of my genealogy database indexable by search engines.
  • Avoid putting the host under too much CPU load resulting from visits by spiders.
  • Keep the spiders from getting lost in the infinite navigation that Geneweb produces.

It is the special “non-person” pages (such as an ascendant tree) that are the most computationally intensive. It is also these pages that make the navigation infinite. So the functional constraints can be condensed into the following technical ones :

  • Serve my own robots.txt instead of the one Geneweb generates, so that spiders are no longer turned away at the door.
  • Remove the robots exclusion meta tag from the person pages only, leaving it on the expensive tree and list pages.

The first step was therefore to bypass the robots.txt generated by Geneweb. I use gwd in ‘server mode’ behind an Apache vhost acting as a reverse proxy, so all I had to do was add a ProxyPass directive to hide Geneweb’s robots.txt behind mine :

RewriteEngine On
# ProxyPass rules are matched in configuration order, so the more specific
# /robots.txt rule must come first : serve my own robots.txt instead of gwd's
ProxyPass /robots.txt http://www.bensaude.org/robots.txt
# Everything else is proxied to gwd running in server mode
ProxyPass / http://kivu.grabeuh.com:2317/
ProxyPassReverse / http://kivu.grabeuh.com:2317/

But that was not enough because Geneweb embeds a <meta name="ROBOTS" content="NONE"%/> tag into each page it generates. Geneweb provides a separate template for each page class. I guessed that etc/perso.txt is the template for what I call the “person page” and removed the <meta name="ROBOTS" content="NONE"%/> line from it.
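
For the record, the edit itself is trivial ; here is a small Python sketch that performs the same removal – the template path is only a guess at where a default Geneweb install keeps etc/perso.txt, so adjust it (and keep a backup) before running anything like this :

# Strip the robots exclusion line from Geneweb's person page template.
# The path below is an assumption about the install location, not gospel.
TEMPLATE = "/usr/share/geneweb/etc/perso.txt"

with open(TEMPLATE) as f:
    lines = f.readlines()

with open(TEMPLATE, "w") as f:
    f.writelines(line for line in lines if 'name="ROBOTS"' not in line)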

And that was it : the person pages no longer repel friendly spiders while the other pages remain off limits.

I love Geneweb !

Systems – 02 Nov 2005 at 2:24 by Jean-Marc Liotier

The latest addition to the collection of lame scripts I have written and put online completely automates the trivial yet tedious task of producing Awstats reports from Apache logs in batch, with full history, even where multiple vhosts coexist.

And the icing on the cake is that it does it quite efficiently : it always updates the reports for the current month and the current year, but only produces other reports if they do not exist. To force the regeneration of a report, you simply erase it.
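
For those who prefer reading logic to reading shell, here is a minimal Python sketch of that regeneration policy for the monthly reports – the paths, the vhost list and the generate() helper are hypothetical stand-ins, not the published script :

#!/usr/bin/env python
# Sketch of the regeneration policy described above : the report for the
# current month is always rebuilt, past reports only when their output file
# is missing. Erasing a report file is therefore enough to force a rebuild.
import os
from datetime import date

OUTPUT_DIR = "/var/www/awstats"   # assumed location of the generated reports
FIRST_YEAR = 2003                 # assumed first year of Apache log history

def generate(vhost, year, month):
    """Placeholder for the actual Awstats static report generation."""
    print("generating report for %s, %04d-%02d" % (vhost, year, month))

def report_path(vhost, year, month):
    return os.path.join(OUTPUT_DIR, vhost, "%04d-%02d.html" % (year, month))

def build_history(vhost):
    today = date.today()
    for year in range(FIRST_YEAR, today.year + 1):
        last_month = 12 if year < today.year else today.month
        for month in range(1, last_month + 1):
            is_current = (year, month) == (today.year, today.month)
            if is_current or not os.path.exists(report_path(vhost, year, month)):
                generate(vhost, year, month)

if __name__ == "__main__":
    for vhost in ("www.example.org", "blog.example.org"):   # hypothetical vhosts
        build_history(vhost)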

If a user wishes to control access to the reports for a vhost, he must create a .htaccess file named /etc/awstats/.htaccess.vhost.name.tld. This file will be automatically detected and used. This is dead simple, and it just works.

Grab the code ! It is in production on this very server as the sample output testifies.

Systems – 31 Oct 2005 at 2:57 by Jean-Marc Liotier

In case you wondered what the display of a heat-damaged video card looks like, I have a nice specimen available in the photo gallery. This Radeon 9800 Pro is not quite fried, but overheating definitely rendered it unusable for all practical purposes. In-game artefacts and cursor artefacts are common symptoms of heat damage, but I had never seen text-mode corruption with garbled text at boot time. As far as I understand, such damage is irremediable.

BIOS text output from an overheated Radeon 9800 Pro

The Radeon 9800 Pro is an excellent deal for those who wish to extend the gaming life of an AGP-based system on the cheap, but much care must be taken in dealing with the ungodly amounts of heat this thing radiates. Even more than usual, stay away from models fitted with cheap fans and tiny heatsinks, or it will only be a matter of time before your Radeon 9800 Pro croaks horribly.

Photography – 30 Oct 2005 at 18:16 by Jean-Marc Liotier

Although I was quite happy with the nice pictures I brought back from South Africa last December, I knew very well how wide the gap remains between my little fiddlings and what seasoned virtuosos achieve. But today I took a look at Todd Gustafson’s pictures from Kenya in late September and they made me feel even smaller. These pictures are absolutely stunning : not just technically outstanding but also original and well chosen.

According to Bill on rec.travel.africa, Todd Gustafson mostly uses a Canon EF 600/4 L IS on a Canon digital body and a Nikon 200-400/4 also on a digital body. There is also some fill flash with a Better Beamer flash extender. For the sake of both my wallet and my back, I will not even dream about using the Canon EF 600/4 L IS.

But fill flash for pictures of animals is something I had not thought about and that I should have begun using long ago. Now that I have seen this picture of a backlit white faced monkey in a Costa Rica forest by Philip Greenspun I wonder if my own negatives of backlit chimpanzees from the Kibale forest are even worth scanning…

Brain dump – 28 Oct 2005 at 14:49 by Jean-Marc Liotier

I wonder why private military companies have not thought about using Google Maps to reach new customers. Leveraging the user-friendly interface of Google Maps and Google Earth, the purchaser selects an arbitrary point on the surface of the planet, selects the ordnance he wishes to see delivered and then securely enters his credit card details. Service fulfilment then rests on the shoulders of the contractor ; Google only takes a commission as an intermediary – which incidentally avoids lots of legal hassles. Secrecy costs extra, because by keeping the incoming strike secret Google forfeits potential revenue from other interested parties such as the target.

There is even room for intermediation : since the difficulty, the risks and therefore ultimately the costs of delivery may vary extremely widely, there must be a way to help adjust supply to demand. An online marketplace featuring reverse auctions would fit the need perfectly. Why should Ebay stay away from the current boom in the war business, now that it has gained respectability comparable to any other legal activity ?

The US Air Force in Iraq can deliver a 250 pound smart bomb at a total cost of under USD 30k (including the cost of operating the aircraft). Considering the infrastructure and economies of scale that the US Air Force enjoys in Iraq, it would not be unreasonable to imagine that an airstrike on a target in North Kivu for example could go for around USD 100k. If you are only an occasional user, that would be a very attractive price point compared to the costs of maintaining your own air force. Basing rights remain a logistical problem since no private operator has acquired global strike capability just yet…

Of course, even if mass market pricing some day puts it within reach of the consumer market, allowing individuals to use this service would be a bit unethical. So we’ll leave that for later development and aim for the government and corporate markets first…

Systems and Travels – 17 Oct 2005 at 17:21 by Jean-Marc Liotier

The same basic Earth view as before (recalculated every few minutes, with cloud cover updated several times a day), but the daylight background map is now NASA’s Blue Marble monthly image. The current one is automatically rotated in place on the first day of each month… The images were captured during 2004, but the result is nevertheless a more realistic and lively rendering of the seasonal changes of the land surface : the green-up and dying-back of vegetation in temperate regions such as North America and Europe, dry and wet seasons in the tropics, and advancing and retreating Northern Hemisphere snow cover… Enjoy !

Two resolutions are available :

  • 800×600
  • 1280×1024
Meta – 23 Sep 2005 at 14:43 by Jean-Marc Liotier

Even with some CSS and some hand-crafted PHP, maintaining heaps of static HTML was becoming a chore, especially with my increasingly busy schedule. Posting stuff to my favorite IRC channel was painless, but the audience was a bit narrow : I really wanted to share with the wider world, whatever I may have to share… So I finally took the plunge and set up a blog. As the tagline says : “highly random experience capitalization, just in case”. Yes – that is a decidedly fuzzy editorial line, but that is the whole point of the exercise : I intend this place as my personal random brain dump, a place to publish whatever I have at hand in the hope that some day some poor soul using a search engine will stumble upon it and find it somewhat useful… So there : serendipitous altruism !
