Monday, December 04, 2006


What is it? Well, Pointillism is a style of painting I quite like, but Pointillist is simply the (working) name of a small graph library I'm working on. The previous post discussed the simulation software I wrote, which uses a simple graph view. I decided to clean up that view a bit, put it in a framework and commit it somewhere (likely on the étoilé repository). Not quite done yet, but it's shaping up well, and it's now quite a bit more generic:

One of my "I will do it one day" projects is a simple graph calculator... So in order to properly test the graph framework, I created a class using the StepTalk scripting framework to express functions (written in Smalltalk), which actually puts me rather close to having such a program :-P

For the curious, the code of the functions is:

Red function:

y := -0.5.
((x < 0.5) and: (x > -0.5)) ifTrue: [y := 1].
(x < -1) ifTrue: [ y := 0.25].
( x < -1.5) ifTrue: [ y := -2 ].
(x > 1.5) ifTrue: [ y := 2].
y := y + 1.5.

Green function:
x sin - 1 / x

Blue function:

|y mod|
y := -1.
mod := x mod: 2.
(mod = 0) ifTrue: [ y := 1 ].
y := y - 2.
^ y

StepTalk is really cool btw -- easy to integrate, and it works both on OSX and GNUstep. Here is the code I use here:

id environment = [STEnvironment environmentWithDefaultDescription];
id conversation = [STConversation conversationWithEnvironment: environment
                                                     language: @"Smalltalk"];


id number = [NSNumber numberWithFloat: x];
[environment setObject: number forName: @"x"];
[conversation interpretScript: @"^ (x tan) / (x atan)"];
id result = [conversation result];

[values addObject: result];

values being an NSMutableArray containing the numbers returned by the StepTalk function (the ^ in Smalltalk is equivalent to a "return" statement in C). As you can see, it's difficult to find a scripting framework simpler to work with than that!

You can see that I put an object that is "external" to the script, number, into the environment, and this object can be accessed from the script using the given name (here, x). Instead of getting the script result, you could also get the value of specific objects you put in the environment.
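Put together, the evaluation loop is roughly the following -- a sketch only: the x range, step and function name are made up for the example, and I'm assuming the usual StepTalk umbrella header; the StepTalk calls themselves are the ones shown above.

```objc
#import <Foundation/Foundation.h>
#import <StepTalk/StepTalk.h>  // assumed header name

// Sketch: evaluate a Smalltalk function over a range of x values,
// collecting the NSNumbers the script returns. Range/step are arbitrary.
NSMutableArray *plotFunction(id environment, id conversation)
{
    NSMutableArray *values = [NSMutableArray array];
    float x;

    for (x = -5.0; x <= 5.0; x += 0.1)
    {
        // expose the current x to the script under the name "x"
        [environment setObject: [NSNumber numberWithFloat: x]
                       forName: @"x"];
        [conversation interpretScript: @"^ (x tan) / (x atan)"];
        [values addObject: [conversation result]];
    }
    return values;
}
```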

Anyway, it's very simple to add StepTalk support to your program, and recent versions even add nifty UI widgets to play with scripts. Also, StepTalk is not "just" a Smalltalk scripting framework: it can actually use other languages. On GNUstep for instance there's an Io bundle, so you could use Io instead of Smalltalk, etc. (it's actually fairly easy to add languages to StepTalk).

Note also that StepTalk automatically bridges the numbers in the script to actual NSNumber instances... which is how I decided to implement the math functions, and this actually demonstrates a very cool Objective-C feature. NSNumber doesn't have those sin, cos, mod... math functions, so in another programming language we'd be stuck. A "normal" solution could be to modify StepTalk so that it would use something other than NSNumber (say, a custom class of ours), or conversely, to add those functions to NSNumber and recompile -- wait! You could do that with GNUstep (because we have the source), but certainly not with Cocoa on OSX. So how come it works?

Well, Objective-C has this great feature: categories. A category lets you add new code to an existing class, at runtime. So that's simply what I did here -- I created a category on NSNumber with my specific math functions, without needing access to the NSNumber source or recompiling Foundation. Isn't that great?
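A minimal sketch of the idea (the category name and the exact method set are mine, not necessarily what my graph framework uses):

```objc
#import <Foundation/Foundation.h>
#include <math.h>

// A category adding a "sin" method to NSNumber, without touching
// the Foundation sources or recompiling anything but this file.
@interface NSNumber (GraphMath)
- (NSNumber *) sin;
@end

@implementation NSNumber (GraphMath)
- (NSNumber *) sin
{
    return [NSNumber numberWithDouble: sin([self doubleValue])];
}
@end
```

Once this category is loaded, every NSNumber in the program responds to sin -- including the ones StepTalk bridges for the scripts, which is exactly why the Smalltalk functions above can write "x sin".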

Sunday, November 26, 2006

System simulation

I'm (rather predictably) still working on my distributed rendering system.. but as it's a bit tiresome to make some modifications, recompile, restart.. -- and more importantly, as it takes time to do so (1), I wrote a simple and nice simulation program that lets me try different rendering/clustering strategies easily, swap them, evaluate or compare them, etc. That way I can concentrate on the strategies rather than waiting 5-10 minutes to start visualizing a one gigabyte dataset on the cluster. Even better, the final idea is to integrate the simulation in the real system (it actually already gets timings from the real one and extrapolates the results) in order to have a nice feedback loop: run things, keep simulations in parallel, switch to other strategies if the simulation says it's better, update timings if needed, etc. Here is a screenshot of the current program:

The screenshot shows three simulated clustering strategies; basically we want to render an image using multiple computers, each computer having one or more rendering agents. We divide the image in tiles, and the rendering agents' job is to render them and send them back to the clustering agent, which recomposes the final image to send back to a visualisation client. It's a fairly straightforward distributed technique. What's more interesting here is that we have three different strategies defining which tiles are sent to which agent. The simplest one is to divide the total number of tiles by the number of agents, and send that many contiguous tiles to each agent (second bottom figure -- each color represents one agent). Another is to alternate tiles (i.e., no two consecutive tiles go to the same agent). Another one is to break up the tiles according to the image complexity (bottom left figure), the image complexity being the time each tile took to be rendered (bottom right figure).
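For concreteness, the first two strategies boil down to a couple of lines of arithmetic. This is a sketch with names of my own, not the simulator's actual code:

```c
/* Which agent renders tile i, given n tiles and k agents?     */

/* "contiguous" strategy: each agent gets a consecutive run of
   roughly n/k tiles                                           */
unsigned contiguous_agent(unsigned tile, unsigned n, unsigned k)
{
    return tile * k / n;
}

/* "alternate" strategy: tiles are dealt round-robin, so no two
   consecutive tiles go to the same agent                      */
unsigned alternate_agent(unsigned tile, unsigned k)
{
    return tile % k;
}
```

The complexity-based strategy is the odd one out: it needs the per-tile timings from a previous frame before it can split the work.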

The graphic shows the evolution of these three strategies depending on the number of agents. We can see that the alternate tiles strategy usually works fairly well (because the complexity is more evenly distributed), unless we end on a multiple of the width (16), meaning the tiles are aligned and thus the complexity is not properly distributed. Having a simulator lets us choose the best strategy before having to really use it.

(1) which breaks one of my Rules of Programming, i.e., keep the REPL (Read-Eval-Print-Loop) cycle as short as possible: obviously, the shorter it is, the more ideas you can try.

Sunday, October 15, 2006


I just read that article about Sony's new e-book reader (from the osnews story). That made me think a bit about e-books.

I would love to have a "true" ebook. For what? Well, in my case, for reading technical documentation and research papers, possibly some websites. Another excellent usage would be to read newspapers and magazines, or research journals. Or if you are a student, another really obvious target is textbooks. And think about the manga/comics market.

Why would it be interesting to have an ebook for all that? What's the pattern? Simply, it's the volume: in all these use cases, you end up with lots of paper very easily (and in a short time frame), and having one device to gather everything would be nice.

But books? Books do not really have a volume problem -- most people do not read or carry dozens of books at the same time, and the minor inconvenience of carrying 2-3 books for a very long journey is not enough to warrant paying a premium for an ebook. Sure, if you have a device able to read pdfs and newspapers, it can handle a book, so you'll have books too of course. Having your whole library in one device could be appealing to some too, I guess. But if the main advantage of an ebook reader is answering the "volume" problem, then it means books almost certainly shouldn't be the main target.

So if books aren't (or should not be) the real target, and if things like documentation, newspapers and textbooks are, what does it mean? Well, it means your device should provide the same features you'd expect from paper when dealing with these. Specifically, you absolutely need annotation. But more than annotation, the device shouldn't be content to only be "as good as paper" -- if you want people to buy it, you need to be better. An obvious candidate feature (come on, we are talking about electronic documents) is to provide a much better way to manage your document collection: things like searching, grouping, sorting, adding whatever annotation/metadata you want, and doing specific things that use these categories (like marking some documents, pages or text to be sent to somebody else via email, etc)...

Then, you will have something that people (and companies, and schools) might be interested in.

The screen technology is imho nearly irrelevant, as long as the battery life is good enough (~7-8 hours of continuous usage) and the resolution is good enough. While e-ink is extremely interesting and perfectly answers the battery/resolution problem, it's too slow for the moment (it's also black and white -- not absolutely dramatic, but something that reduces its impact a bit). Between an e-ink display and a high resolution lcd screen, even if the battery life were much shorter on the device with the lcd screen, if it provided the kind of characteristics I described above, I would definitely choose it over the e-ink display, and I bet that many people would do just the same.

Right now, the Sony ebook looks more like a solution looking for a problem than anything else, and they focus their efforts on the wrong side (books).

They fail at everything I described above: it's slow to navigate, management of your documents seems nonexistent, pdfs apparently aren't even supported (seriously!), no annotations... The only good thing (while not perfect) is the e-ink display, but they seem to think it's enough. It's not.

If I had money (and time) and more electronics skills I'd definitely want to create a good e-book device. Why not something based on a gumstix + a high resolution lcd screen? There's a market waiting to be tapped here.

edit: to be fair, checking the Sony page, they indicate that the device actually CAN read pdf and doc... but only after converting them to the proprietary format Sony uses. Crap! :D (oh, and it plays AAC and MP3 too)

Friday, September 01, 2006


I finally finished writing the last remaining things for my system, integrated everything, and could generate the data for the paper just in time... That was a hell of a week, but at least this deadline is over now! I'll sleep a bit :-P

I will also try to resume a minimum of my GNUstep activity next week (it's been on hold for more than a couple of months) -- I have a couple of things to do that shouldn't take much of my time and that would be rather welcome :-P

Thursday, July 06, 2006

UDP ...

Well, I'm not working much on anything related to GNUstep/étoilé lately -- I have my thesis to write for December, which also means I need to properly finish my visualisation system soon...

I completely changed the communication stack -- exit XMLRPC/TCP, welcome lossy UDP communication. Good things: it enables broadcast and low latency. Bad things... well, to use broadcast as an asset, you need to change quite a lot about the way you think about your program. But it's worth it: not only can the end result (the visualization) be broadcast, which is nice, but moreover, the rendering process itself can take advantage of broadcasting to be more efficient. Anyway. Here is a screenshot showing a broadcast MIP visualization of the CThead dataset:
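At the socket level, the one extra step broadcasting needs compared to plain UDP is enabling SO_BROADCAST; this is a generic BSD-sockets sketch, not my actual networking code:

```c
#include <sys/socket.h>
#include <netinet/in.h>

/* Create a UDP socket allowed to send to broadcast addresses.
   Without SO_BROADCAST, a sendto() to a broadcast address such as
   255.255.255.255 fails with EACCES. Returns -1 on error. */
int make_broadcast_socket(void)
{
    int s = socket(AF_INET, SOCK_DGRAM, 0);
    int yes = 1;

    if (s < 0)
        return -1;
    if (setsockopt(s, SOL_SOCKET, SO_BROADCAST, &yes, sizeof(yes)) < 0)
        return -1;
    return s;
}
```

The rest (bind, sendto to the broadcast address, handling lost datagrams) is standard UDP code; the "lossy" part is what forces rethinking the protocol.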

The rendering itself is done on a cluster connected to these 3 machines through a simple udp tunnel. Works quite well..

Friday, May 19, 2006

Flow, State and Persistence

I gave a talk today discussing state and persistence in your work environment, for the No Grownups Seminar Series.. Here is a flash version of the talk, and here is the pdf one. Have a look !

Thursday, May 11, 2006


Well, I was idly thinking that I could move the HTML parsing code of HelpViewer into a GNUstep TextConverter bundle (so that *any* application using text would automagically be able to load html... or at least the particular HelpViewer brand ;-), so I looked into the RTF TextConverter to see how that worked. It's fairly simple in fact :-)

But reading the code, I also realized that adding RTFD support (RTFD is basically RTF documents + images) would be trivial. I am actually fairly surprised nobody did that before, particularly as I remember numerous complaints about its non-existence :-/

So, I modified the RTF parser to deal with the \NeXTGraphic tag, and it seems to work well -- at least for reading RTFD documents that come from Mac OS X. The actual grammar I used for dealing with it is pretty crude, though, so I wouldn't be surprised if some documents do not work (likely old NeXT documents... OS X documents produced by TextEdit seem fairly consistent with what I did). Anyway, now that the basic support is done, tweaking the grammar won't be difficult if needed.

Here is a screenshot showing two documents opened in (and obviously no modifications were needed in

It's committed on the gnustep repository, so you can try it by updating and reinstalling the RTF TextConverter bundle.

I'm thinking I'll perhaps commit my actual HelpViewer version (as it can use rtf files) without waiting for the html parser to be finished... plus, "commit early, commit often" is a good idea ;-) (which I don't follow enough...)

We could write a simple Text editor dealing with links, etc. as well. Oh well we'll see.. :-)

Wednesday, May 10, 2006

It doesn't hurt...

Things move quickly with etoile, so here's another screenshot... note that the shadows are ok now with xcompmgr and Azalea+EtoileMenu, that the menu items aren't displaced anymore, and that the info panel was updated (yeah, yeah, it doesn't improve the experience, but it's nice to have ^_^). Saso even added a small animation when you click on the etoile logo in the middle... pfff! ;-)

Anyway, time to work on helpviewer.

Sunday, May 07, 2006

Etoile screenshot

Just for fun.. Here is a current screenshot of etoile, showing the EtoileMenuServer, Azalea (the windowmanager), Nesedah (the theme) and HelpViewer... Things start to work together ;-) [note that the shadow should not/will not appear under the helpviewer menu item...]

Wednesday, May 03, 2006

Helpviewer and Help System in GNUstep

I -- finally!! -- started working on helpviewer again lately -- as the last official version (0.3) doesn't work properly with gnustep (due to changes in GSHTMLParser...). The first goal was to have something working again -- that was fairly easy, achieved by replacing GSHTMLParser with GSXMLParser. Though it meant that you couldn't use single tags like "br", as you (obviously) need to close them in xml.

I then decided to more or less deprecate the old "help" xmlish document format and its custom tags, and implement an xhtml parser instead -- and while I was at it, to use NSXMLParser rather than GSXMLParser.

Exit "intelligent" tags that deal with figures or references; but on the other hand we can reuse html documents -- i.e., it's probably easier to write documents now than it was with the previous custom xml tags, as you should be able to use a standard html editor... or if you want to write your documents by hand, well, html is a slightly better-known format anyway!

As we remove "dynamic" behavior, we need to have documents completely generated beforehand; i.e., it will be the document creator application's (or the writer's) duty to generate proper references, etc. And yes, I somehow plan to write such an editor, but in the meantime you'll be able to easily write/convert your existing html documentation anyway ;-)

Though I'm not entirely decided -- some of those dynamic behaviours could be coded as specific div tags after all...

Anyway, the important thing is for the help to be displayed in the same, standard way. So basically, at the moment it's a bit as if helpviewer provided its own css style to display documentation, one that you can't change.

Another feature that needs to be added is a proper search function using indexation (LuceneKit, here we are!). I didn't do anything about that yet, but I will probably write a command line tool that takes a help document, generates an index and puts the index in the document..

So.. apart from ditching the old xml format to use more common html tag names, what changed? Well, the structure of the help document changed quite a bit too :-)

Before, a help document was basically a bundle (== a folder) containing .xlp files (xml files using the previous syntax) and other resources (images). One file was mandatory, "main.xlp", which was loaded by default; other files were loaded following links written in main.xlp.

Now, it's different :-)

A help document is still a bundle, obviously -- what better way to distribute a help file: you can have various resources in various languages, all in the same place!

But the organisation inside is different. The document can still be segmented in multiple files (now html files), but the structure of the document simply reflects the actual content of the bundle -- i.e., helpviewer will simply show you the files and directories contained in the bundle. Directories act as a way of having different parts in a document, as they contain other html files. You can use an index.plist (actually it's mandatory for the moment, although I will support /not/ having an index.plist) to choose which file/directory is displayed in helpviewer, with which name, and in which order. The index.plist file is a simple ascii plist containing key-value pairs, where keys are the filenames and values are the displayed names.
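As a hypothetical example (the file and directory names here are made up), an index.plist for a document with one page and one sub-part could look like this:

```
{
    "introduction.html" = "Introduction";
    "tutorial" = "Tutorial";
}
```

Here "introduction.html" is a plain file at the top of the bundle, while "tutorial" is a directory containing further html files; the entries appear in helpviewer in this order, under the given display names.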

Of course, you can still use links in the html documents too, but the actual structure of the document doesn't depend on that.

One nice thing with it is that you are not at all tied to html documents -- in fact, the first thing I implemented was a "txt" document and later on an "rtf" document (a shame rtfd doesn't work). And anyway, I find this new structure much cleaner and simpler...

So... what's the plan now? Well, I ported nearly all the previously existing functionalities to the new parser (links, etc), so once it's done and usable, I will commit a 0.4 version on the étoilé repository; then it will be nice to add proper support for search/indexing using LuceneKit. After that... what would be good to do is to modify gnustep to have NSHelpManager directly use something like helpviewer -- i.e., instead of using helpviewer as a separate application, you'd find the same functionalities of helpviewer directly in an NSHelpPanel, so that help acts as "private" to an application, like it did on openstep. Probably the good approach would be to modify gnustep to easily support bundles implementing this kind of additional behavior (actually very similar to GSAppKitUserBundles, but... I don't know, something perhaps slightly more formalized..), and then code such a bundle :-)

After all that, I'd like to rework on my "semantic" editor, and extend it to a nice document (and help) writer. We'll see...

Tuesday, March 14, 2006

Camaelon Themes

I am quite impressed that even without any specific documentation explaining how to make a theme, people are willing to create them:

two new themes for Camaelon recently appeared... Check here for a list.

Very cool.

Sunday, March 05, 2006

GNUstep / Fosdem 2006

This year's Fosdem was, as usual, really cool and quite busy :-)

Here is a page with some pictures, slides and videos of the gnustep talks; I'll probably upload the rest of the videos during the next week...

Thursday, January 26, 2006


After some effort, I had everything compiled -- ffcall, foundation, appkit, etc. for the nokia. But then my test program crashed :-)

Removing extra libs (hm.. ffcall.. :-/ ) I have something working though: Foundation !

[ note that I got the NXConstantString error, so it looks like library loading order matters (lobjc / lgnustep-base)... changing the ordering in the makefiles makes it work though ]

Now.. I compiled gnustep-gui and gnustep-back, both with xlib and art. But the xlib backend crashes (it can't find any available font on the nokia apparently..), and the art backend (after some effort -- #ifdef XSHM to get rid of any XShm calls...) "nearly" works: with my test application (a simple info panel) it displays a window, but without any content (plain white), and complains about a bad window... Yet it seems to receive events (I can click on the invisible buttons ;-), so it looks like some art display problem, perhaps just because the nokia uses an unconventional color ordering / depth.

Well, I need to come back to ffcall and check whether it's the culprit for the crash or not -- having ffcall won't be bad for the gui :-) and try to understand what's wrong with backart... but all in all, things progress ;-)

Ah, I also made a very stupid mistake the other day, compiling gcc for arm-softfloat. Naively I thought you needed a soft fp implementation, as the arm doesn't have hardware fp. But what I didn't know is that there are two kinds of soft fp -- kernel and softfloat. Worse, you can't link binaries compiled with different fp emulations. And of course, the nokia is compiled with kernel fp emulation, not arm-softfloat.. So I was in for recompiling gcc ...

Though, one good thing is that with arm-softfloat ffcall didn't compile at all (it seems that the asm bits of ffcall use the fp registers on the arm platform), while now it compiles (but as I said, it seemed to be the cause of the program's crash, so it's perhaps not as encouraging as it seems ^_^ -- although I need to double-check and recompile things properly). Considering libffi closures don't work on the arm, if ffcall works that's a good thing :)

Anyway, as soon as I have something working decently I'll post a full guide to cross-compiling GNUstep for the nokia (and probably some binary packages too, if you just want to install it on the nokia).

Wednesday, January 25, 2006

Nokia 770 + Objective-C

I finally had the time to try to do something with the Nokia 770.. and ended up with gcc + objective-c support that can cross-compile for the arm processor. I simply used crosstool with the following script:

set -ex

# Really, you should do the mkdir before running this,
# and chown /opt/crosstool to yourself so you don't need to run as root.
mkdir -p $RESULT_TOP

# Build the toolchain. Takes a couple hours and a couple gigabytes.

#eval `cat arm-softfloat.dat gcc-3.3.3-glibc-2.3.2.dat` sh --notest
#eval `cat arm-softfloat.dat gcc-3.4.0-glibc-2.3.2.dat` sh --notest
eval `cat arm-softfloat.dat gcc-3.4.1-glibc-2.3.3.dat` sh --notest
#eval `cat arm-softfloat.dat gcc-3.4.1-glibc-20040827.dat` sh --notest

echo Done.

Compiling a simple C program and copying it onto the nokia worked fine, so I wrote a simple ObjC test program.. and after copying the libobjc.* files and changing LD_LIBRARY_PATH to point to a local lib directory, it worked too.

So far, so good !

Now I just need to compile Foundation, then AppKit, then -back ... :-)