As an ArgoUML contributor I'm going to blog my activities here, so that they may draw the interest of other developers or help others doing tasks similar to what I've done. AND(!) the grand vision that makes an Argonaut what he is: TO THRIVE IN THE BIG DANGEROUS WORLD, TAKING THE Argo TO A GOOD SHORE ;-))

Thursday, December 16, 2010

Improve Emacs HTML mode by adding an automatic translator between accented chars and their HTML / Unicode entities

I would like to improve Emacs HTML mode so that when I enter an accented character it is replaced with the corresponding HTML / Unicode entity. I don't know if this would have to be some kind of hook which captures the entry of specific characters or if I could implement it as a gigantic search and replace operation. The latter option isn't so attractive because it gets into the flow of writing. But the former would also...

Currently Emacs HTML mode provides the sgml-name-char (C-c C-n) command, which almost enables the same thing, but it gets a bit in the way of my flow of thought. (Actually, writing directly in HTML also gets in the way of my thought process anyway... Maybe I should write first and then edit. But I digress...)

So, how could this be implemented? One way would be a minor mode for HTML that does the automatic translation, mainly by defining a minor mode keymap that overrides the global and major mode keymaps for accented characters and invalid HTML characters. Is this an interesting idea?

How to implement an Emacs minor mode? A good example is flyspell-mode, whose source is available by just typing C-h f flyspell-mode and following the link to the source.
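As a first sketch of the minor-mode idea (untested; all the names here, html-entity-mode and html-entity-translations, are made up by me, and the table is deliberately tiny):

```elisp
;; Sketch: a minor mode whose keymap overrides the self-inserting
;; accented characters with commands that insert the HTML entity.
;; `html-entity-mode' and `html-entity-translations' are hypothetical
;; names; the table only covers a few Portuguese characters.
(defvar html-entity-translations
  '(("á" . "&aacute;") ("à" . "&agrave;") ("ã" . "&atilde;")
    ("é" . "&eacute;") ("ê" . "&ecirc;") ("í" . "&iacute;")
    ("ó" . "&oacute;") ("õ" . "&otilde;") ("ú" . "&uacute;")
    ("ç" . "&ccedil;"))
  "Accented characters and the HTML entities to insert in their place.")

(defvar html-entity-mode-map
  (let ((map (make-sparse-keymap)))
    (dolist (pair html-entity-translations map)
      (define-key map (car pair)
        ;; Splice the entity into the command so this works even
        ;; without lexical binding.
        `(lambda () (interactive) (insert ,(cdr pair))))))
  "Keymap that replaces accented characters with HTML entities.")

(define-minor-mode html-entity-mode
  "Replace typed accented characters with their HTML entities."
  :lighter " &ent;"
  :keymap html-entity-mode-map)
```

The minor mode keymap takes precedence over the major mode's, so enabling it in html-mode buffers should suffice; a fuller version would probably derive the table from sgml-mode's sgml-char-names instead of hand-writing it.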

Tuesday, December 14, 2010

Idea for Git demo: show its power for sweeping changes in SISCOG-UTIL and CREWS

At SISCOG we are using Git as a development tool, but not as the official version control system. I was challenged to make a demo / presentation on what the changes to the processes and working methods would be if SISCOG were to adopt Git.

One idea I had recently was to show what Git could do if I needed to change CREWS in a more automated and integrated way. If I wanted an easily mergeable branch of CREWS and SISCOG-UTIL, how would I proceed? Would Git solve the problems easily? How could I demonstrate this?

Showing off the power with a defsystem-to-ASDF migration? How would that give them the thrill of power? Try, don't be shy to fail, and earn time, quality and satisfaction.

To convince someone I have to show what is possible
and not how to get there!!!
I have to be more with the people
and only then will I be able to motivate them.

Idea: make the binary data library of PCL book parallel friendly

I'm still reading and learning from the Practical Common Lisp book by Peter Seibel [it is hard to believe that I started in 2007!]. While reading chapter 24, Practical: Parsing Binary Files, which describes and provides a library for reading binary files, upon reaching the variable:

(defvar *in-progress-objects* nil
  "TODO or FIXME, how to make this part of the code more parallel friendly?")

I stopped reading and started wondering about the TODO or FIXME I just wrote. The following is the resulting ideaware...

Is it worth it, or will the disk access eat the advantages?

Yes, if there is parallel disk access, meaning disk array architectures, and for really big files. For small files the parallel access isn't worth it because the disk array will already know how to do it in a parallel way.


An additional option to a certain binary-class would mark that binary-class as one that would spawn a new process or thread for writing (note to self: write and read operations aren't symmetric from a performance point of view) or reading an instance of that binary-class.

Additional bindings of the dynamic variables would have to be created for the symbols that are to be called in parallel, i.e., a new thread-based call would mean a new dynamic environment is created in which the new binding exists.

It would be fun to port this library to Clojure / JVM languages. I think that this would be allowed because it isn't the original work that would be made available.
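Incidentally, the per-call rebinding idea maps directly onto Clojure, which would make such a port natural: binding installs thread-local bindings of a dynamic var, so each thread gets its own stack of in-progress objects. A rough sketch (read-instance and parallel-read are hypothetical placeholders, not the PCL library):

```clojure
;; Sketch of per-thread rebinding of the special variable from PCL's
;; binary-data chapter.  read-instance stands in for the real
;; binary-reading code.
(def ^:dynamic *in-progress-objects* nil)

(defn read-instance [id]
  ;; binding gives this thread its own *in-progress-objects*;
  ;; set! is legal here because the var is thread-bound.
  (binding [*in-progress-objects* ()]
    (set! *in-progress-objects* (conj *in-progress-objects* id))
    *in-progress-objects*))

(defn parallel-read [ids]
  ;; Each future runs on another thread with an independent binding.
  (mapv deref (mapv #(future (read-instance %)) ids)))
```

Here (parallel-read [:a :b]) builds each one-element list in its own thread, with no thread ever seeing another's binding.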

Wednesday, October 27, 2010

Programming Praxis Alien Numbers in Clojure

I solved the full Programming Praxis Alien Numbers exercise in Clojure. This enabled me to learn more about Clojure and to train the algorithmic parts of my brain a bit.

In all this I was a bit disappointed. It took me too much time to reach a full solution, and then, in the core part of the exercise, I had to resort to another person's solution to reach a better way to convert a decimal number to an arbitrary-radix number given its language.

I was also a bit disappointed that I wasn't able to use Clojure's laziness more in the algorithms, resorting too much for my taste to loop. From a positive point of view, the Clojure abstractions work as advertised, and the only difference between the code below and one that used vectors or lists instead of strings would be the conversion in the return value of decimal-to-lang.

The core of the exercise solution follows. Note that the full solution with tests is here. Still considering Clojure's sweet spots, I think I could easily explore its excellent support for concurrency and convert the main algorithm into a parallel one if there were hundreds of lines to be processed.

(defn decimal-to-lang
  "Converts decimal-num to the equivalent number in lang as a string.
  The algorithm was a translation of the algorithm by Rodrigo Menezes
  in C# that he posted in the programming praxis site."
  [decimal-num lang]
  (let [lang-radix (count lang)]
    (loop [decimal-value decimal-num lang-num []]
      (if (>= 0 decimal-value)
        (apply str lang-num)
        (recur (int (/ decimal-value lang-radix))
               (concat [(nth lang (mod decimal-value lang-radix))] lang-num))))))

(defn digit-index
  [digit lang]
  (loop [i 0]
    (if (= digit (nth lang i))
      i
      (recur (inc i)))))

(defn lang-to-decimal
  [alien-num lang]
  (let [radix (count lang)
        ralien-num (reverse alien-num)]
    (loop [i 0 decimal-num 0 product 1]
      (if (= i (count ralien-num))
        decimal-num
        (recur (inc i)
               (+ decimal-num (* (digit-index (nth ralien-num i) lang) product))
               (* radix product))))))

(defn convert-num
  "Convert alien-num which is in source-lang into the same number in target-lang"
  [alien-num source-lang target-lang]
  (decimal-to-lang (lang-to-decimal alien-num source-lang) target-lang))
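Coming back to the laziness disappointment above, decimal-to-lang is the one piece that can be rewritten without an explicit loop. A lazy-sequence alternative I sketched afterwards (my own variant, not part of the submitted solution; like the original, it returns "" for 0):

```clojure
(defn decimal-to-lang-lazy
  "Lazy-sequence variant of decimal-to-lang: the successive quotients
  give the digits least-significant first, so we reverse at the end."
  [decimal-num lang]
  (let [radix (count lang)]
    (->> (iterate #(quot % radix) decimal-num)
         (take-while pos?)
         (map #(nth lang (mod % radix)))
         reverse
         (apply str))))

;; Usage, with "ab" as a binary alien language (a=0, b=1):
;; (decimal-to-lang-lazy 6 "ab") ;=> "bba"
```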

Saturday, July 03, 2010

More ideas about htmled

htmled, my Python hacks to automate converting the contents of my handbooks into posts in my blogs, received some recent attention, but I still have plenty of hitches with the overall process. Two quick ideas for future improvement follow.

The handbook editing part (creating the content in HTML) is too hard; a solution might be a specific mode in Emacs customized for editing HTML, with specific functions to accelerate the creation of structural elements, or using a base format that can be processed afterward to generate the proper HTML.

Editing in Emacs is way superior to editing in the HTML browser. I wonder if it would be possible to embed Emacs as the editing component of multi-line text fields. Extra bonus if you could put a served Emacs frame as the editing component.

ArgoUML tests issues

Update ArgoUML tests to JUnit 4.x – issue created

This idea now has ArgoUML issue 6100.

Issue 6085 is also moving forward.

Saturday, June 26, 2010

ArgoUML C++ – generator and modeling

The C++ code generator in the ArgoUML C++ module has been growing in its usage of TaggedValues, and it hasn't grown more only because it hasn't received recent contributions. In a recent thread in the Using ArgoUML for modeling C++ topic of the ArgoUML users forum, this started to become clearer to me. Here is my answer to Ulrich Thombansen on the 24th of June:

Hi Ulrich,

> (...) on the “constructor”. What about taking this off the documentation at tigris?

I consider that the best solution. Given two ways to do something, I would go for the standard UML way, which, even if it takes a bit more work from the user, is more flexible because you have an actual operation that might be customized, e.g., by adding arguments. By being explicit we may in the future enable specific customization related to C++ constructor specificity.

Note to self: TODO: place this conclusion in issue 21. Consider the possibility of some TagDefinitions in the «cppOperation» Stereotype being applicable only if the Operation to which the Stereotype is applied also has the «create» Stereotype applied.

> cppAttribute
> ==========
> applying «cppAttribute» to a variable “myAttrib” with the tag “pointer” set to different values leads to the following results:
> no) tag value -> result
> ----------------------------
> 1) “false” -> int myAttrib; /*{pointer=false}*/
> 2a) “true” -> int *myAttrib; /*{pointer=true}*/
> 2b) “xyz” -> int *myAttib; /*{pointer=xyz}*/
> 3) “” -> int *myAttib;
> Feature or fault?

IMO feature (I would call it a default behavior due to a present limitation in ArgoUML) and a nicely chosen one, although in the future, when ArgoUML supports typed TaggedValues, I would model the “pointer” TagDefinition as a UML Boolean, and therefore you would only have True or False. The rationale behind treating any “pointer” TaggedValue as true, except the actual String “false”, is that if the user added the “pointer” TaggedValue to an Attribute, it is because he wants the pointer to be present.

> cppAttribute
> ===========
> applying «cppAttribute» to a variable “myAttrib” with the tag “get” set to public leads to the implementation of the int get_myAttrib(); function.
> Is there a way to make this “virtual”? only creates “virtual” for (!leaf && !isConstructor && !static)||abstract. Which rule applies / can be influenced?

This is one of the reasons I dislike TaggedValues being used where default UML suffices. By adding the “get” or “set” TaggedValues to represent a hint for the generator, you put yourself in a position where you step out of the proper language for representing the Operations. If there were a button for generating get and set operations for a given attribute at the model level, you would afterward have the liberty to customize them as for any other operation. Since ArgoUML doesn't have this, we added “get” and “set” TagDefinitions to the «cppAttribute» Stereotype, but, given that these are simply TaggedValues, we will be left with modeling all the particularities of operations as different values of the “get” and “set” TaggedValues. It isn't a fair deal. The overhead isn't only in the implementation, but also for the users...

I would model it the other way around, having “get” and “set” TagDefinitions applicable to Operations to which «cppOperation» is applied, with values that must match the name of the attribute to which the operations are related. A more powerful way would be an association between the Operations and the Attributes, but I don't know if UML allows that, and I'm almost certain that ArgoUML doesn't.

One of the issues with all this is trying to guarantee symmetry between code generation and reverse engineering. Although that is a bit utopian, we might be closer to it if we stick as much as possible to UML proper when possible.

Note to self: TODO: create an issue of type enhancement proposing the deprecation of the get and set TagDefinitions in «cppAttribute» and the addition of these TagDefinitions to «cppOperation». Model it initially as a String that must match the corresponding Attribute name and afterward as an Association (if and when this is supported).

Friday, June 25, 2010

Dispersion problem – revisited after 7 months

About 7 months ago I wrote about my dispersion problem. I didn't honor my decision at that time, which was to dedicate myself fully to ArgoUML for about one year. For instance, I recently finished some hacking on the friendly user interface for htmled that allows me to say:

python --startdate=2010-06-01 > posts-in-programacao-since-june-1st.txt

Why did I do it? It isn't only dispersion at work; it is also a desire to be a better hacker and to know more. But this takes me away from my dream, which is to fulfill one or more remarkable ideas. So, back to the initial plan.

But, to see the glass half full instead of half empty, I list all the issues I was involved with in ArgoUML and which have been resolved since my decision:

Also, looking in retrospect, the past months were good for ArgoUML in several areas:

  • Linus Tolke finally handled the switch of the ArgoUML license from BSD like to the Eclipse Public License (EPL) 1.0, which means that ArgoUML and ArgoEclipse will move together and that Tom Morris will be more willing to continue to contribute.
  • Bob Tarling was the main contributor to ArgoUML 0.30.1 and he was able to achieve the replacement of the old model element property panels by the new implementation, which is based on XML definitions. Originally this was developed by Christian López Espínola in a GSoC 2008 project, but it had never before matured enough to be a full and stable replacement of the old implementation. Kudos to Bob :-)
  • Tom Morris returned to active ArgoUML contributions, and although he isn't yet as active as before, his overall understanding and detailed knowledge of UML and of plenty of ArgoUML components helps a lot when others need guidance, as I did during the 0.30.x bug-fixing period. In the last period of 0.30.1 development, he even took issue 6008 into his hands and made some big changes to the reference handling of profile model elements, which I would have struggled for months to get right. He achieved it in about one week.
  • Linus Tolke managed to stabilize the Hudson continuous integration server where ArgoUML nightly builds are executed and continues to maintain the automated builds, project infrastructure and release tasks.
  • Thomas Neustupny and Andreas Rueckert are actively evolving the argouml-core-model-euml implementation of the model subsystem (the result of a GSoC 2007 project by Bogdan Ciprian Pistol) and therefore moving ArgoUML towards UML 2.x.
  • I'm happy with the progressive stabilization and polishing of the profile subsystem (yet another result of a GSoC 2007 project, this one by Marcos Aurélio).
  • ArgoUML's users forum is growing in activity, with Thomas Neustupny, Tom Morris, me, daybyter and KiwiCoder being active helpers. Kudos to Lukasz Gromanowski for setting up the forum and the users' wiki for the project and for the continued support. Compared with the argouml-users mailing list, the forum is rocking!
  • Michiel Van der Wulp continued to support the notation subsystem and the state and activity diagrams, and although he isn't contributing as actively as he used to, he continues to help with his expertise in these areas.
  • There was also the return of previously inactive contributors such as Alexander Lepekhine.

I may be forgetting some relevant contribution; if so, drop me a line and I will update this post. I think that attributing kudos to contributors is important for the health of the project. Thank you all for continuing to move the project forward, and to our users for continuing to nag us about the bugs and annoyances :-)

Tuesday, June 15, 2010

Learning Clojure – development environment configuration

I started learning Clojure some time ago by following the tutorial Clojure - Functional Programming for the JVM by R. Mark Volkmann, a Partner at Object Computing, Inc. (OCI). I also did the casting Clojure tutorial, read a lot of Clojure blog posts and watched some videos about Clojure.

What I was missing was a sane Clojure development environment configuration which could live in peace with my Common Lisp Emacs SLIME development environment. Well, by following Swank_clojure_setup_for_hardcore_Emacs_users.el I achieved it. But first I had to ditch my Debian-based SLIME installation and go to the SLIME CVS version. That is the base recommendation that hardcore Emacs and Common Lisp hackers give, so nothing too wild here.

Besides Clojure, I also wanted to have working Common Lisp implementations connected to Emacs. My choice was previously the SBCL version available from Debian repositories and ABCL from the Subversion checkout.

So, enough introduction; here follow the objectives and how I did it on a Debian-based GNU/Linux distribution with bash.

  • Emacs snapshot from Debian's repositories.
    ~$ sudo apt-get install emacs-snapshot-gtk emacs-snapshot-el
  • SLIME from CVS.
    ~$ sudo apt-get install cvs # only if you don't have CVS yet
    ~$ cd programacao/lisp # I have all Lisp related programming sources here
    ~/programacao/lisp$ cvs -d co slime
    ~/programacao/lisp$ cd slime/doc
    ~/programacao/lisp/slime/doc$ sudo apt-get install texlive # because I want to build an up-to-date slime-doc
    ~/programacao/lisp/slime/doc$ make
    ~/programacao/lisp/slime/doc$ gzip # TODO: this is available as make install, but, my distribution paths are different...
    ~/programacao/lisp/slime/doc$ sudo install-info # make it available to info
    ~/programacao/lisp/slime/doc$ sudo cp /usr/share/info/
    ~/programacao/lisp/slime/doc$ cd ~
  • SBCL from source tarball download, which I placed in ~/programacao/lisp/sbcl.
    ~$ cd programacao/lisp/sbcl
    ~/programacao/lisp/sbcl$ tar -xvjf sbcl-1.0.39-source.tar.bz2
    ~/programacao/lisp/sbcl$ cd sbcl-1.0.39/
    Now, check the INSTALL...
    Now, check the Getting started...
    I already had the SBCL from Debian, so, this would be my Common Lisp host which would be automatically used by
    ~/programacao/lisp/sbcl/sbcl-1.0.39$ sh # and detach with C-a d, using screen -r to check its state from time to time
    An alternative I will try next time is to use a non-GUI terminal by going C-M-F1.
    TODOs: install binaries, build and install doc.
    ~/programacao/lisp/slime/doc$ cd ~
  • ABCL from Subversion. I already had Subversion and Ant installed via apt-get install.
    ~$ mkdir -p programacao/lisp/abcl/svnco && cd programacao/lisp/abcl/svnco
    ~/programacao/lisp/abcl/svnco$ svn co svn:// && cd abcl
    ~/programacao/lisp/abcl/svnco/abcl$ ant # this creates the abcl executable shell script to launch ABCL
    ~/programacao/lisp/abcl/svnco/abcl$ cd ~
  • Ant and Maven binaries downloaded and installed.
    ~$ sudo apt-get install maven2
  • Clojure and Clojure-contrib from their Git repositories. Git was already installed.
    ~$ cd programacao/clojure
    ~/programacao/clojure$ git clone
    ~/programacao/clojure$ git clone
    ~/programacao/clojure$ cd clojure
    ~/programacao/clojure/clojure$ ant jar
    ~/programacao/clojure/clojure$ cd ../clojure-contrib
    ~/programacao/clojure/clojure-contrib$ mvn package
    ~/programacao/clojure/clojure-contrib$ cp target/clojure-contrib-*.jar clojure-contrib.jar
    ~/programacao/clojure/clojure-contrib$ cd ~
  • Leiningen binary downloaded and installed.
    ~$ mkdir -p progs/lein && cd progs/lein/
    ~/progs/lein$ wget && chmod u+x lein
    ~/progs/lein$ cat >> ~/.profile
    # finish editing with C-d

    ~/progs/lein$ lein self-install
    ~/progs/lein$ cd ~
  • Clojure-mode and Swank-clojure also from Git repositories.
    ~/programacao/clojure$ git clone
    ~/programacao/clojure$ git clone
    ~/programacao/clojure$ cd swank-clojure && lein jar
    ~/programacao/clojure/swank-clojure$ cd ..
  • Create a shell script to launch Clojure with all the required goodies in the classpath. Note that this script will be used within Emacs and therefore you don't want fancy wrapping and history with rlwrap. You may create one of those to use in the command line, though.
    ~/programacao/clojure$ cat >
    JAVA=$(which java)
    exec $CLJ "$@"
    # not part of the file, finish with C-d


So, after writing all this in shell language, would it be possible to have it in Clojure, Common Lisp or Emacs Lisp? With Clojure we could get fancy and execute some long lasting tasks in parallel.

Finally, the relevant part of my ~/.emacs file:

;;; SLIME and Lisp and Clojure support
(require 'cl)
(add-to-list 'load-path "~/programacao/lisp/slime/")
(setq slime-backend "swank-loader.lisp")
(setq inferior-lisp-program "~/programacao/lisp/sbcl/sbcl-1.0.39/")
(add-to-list 'load-path "~/programacao/clojure/clojure-mode/")
(add-to-list 'load-path "~/programacao/clojure/swank-clojure/")
(pushnew '("\\.clj$" . clojure-mode) auto-mode-alist :test 'equal)
(require 'clojure-mode)
(setq slime-lisp-implementations
      '((sbcl ("~/programacao/lisp/sbcl/sbcl-1.0.39/"))
 (abcl ("~/programacao/lisp/abcl/svnco/abcl/abcl") :coding-system iso-latin-1-unix)
 (clojure ("~/programacao/clojure/") :init swank-clojure-init)))
(require 'slime-autoloads)
(setf slime-use-autodoc-mode nil) ; swank-clojure doesn't support autodoc-mode
(slime-setup '(slime-banner slime-repl slime-fancy slime-scratch
               slime-editing-commands slime-asdf))
(setf slime-net-coding-system 'utf-8-unix)
(setf swank-clojure-binary "~/programacao/clojure/")
(require 'swank-clojure)
(defun run-clojure ()
  "Runs the Clojure REPL."
  (interactive)
  (slime 'clojure))
(defun run-abcl ()
  "Runs the ABCL REPL."
  (interactive)
  (slime 'abcl))

I still have Paredit configured for the Lisp dialects I use, which is another recommended way to customize Emacs. There are definitely pieces missing, but I reached a target relevant enough to be worthy of careful documentation :-) I could have earned extra hacking points if it were automated, though, so thou shalt not halt here!

Wednesday, May 12, 2010

3rd European Lisp Symposium – a light review

The symposium was nice, small enough that you were able to get acquainted with a large percentage of the attendees and to chat individually with the presenters after their talks. The majority of attendees were related to academia, but due to the large participation of SISCOG, I think that industry was almost half the number of people. It is funny that academia is clearly larger from the point of view of presentations and involvement in the organization. We need more industry involvement not only for paying the checks, but also for presenting and participating in the organization.

I would like to have seen some specific content related to Clojure. Alas, I should have proposed a lightning talk, but I don't have anything to show yet, so the solution would have been to spend one night preparing things, and I preferred to enjoy the dinners. Something related to cloud computing and web applications should also have been added. Note that RavenPack, although not being that, is a real-time data service for financial companies, so it isn't a mainstream application, but it is very interesting on its own and it is novel.

The overall feeling I get from the Symposium is that of a group of people who still suffer from the fact that, Lisp being such a special thing, the Symposium should have more attendees, the companies that use Lisp should be larger and more profitable, the open source libraries more polished and greater in number, etc. I think that we like the people who are part of the community and the accessibility of our stars, such as Kent Pitman. We don't like the fact that Lisp is still foreign in large swathes of the industry. Nevertheless, we are seeing a resurgence of Lisp and its use both in academia and in industry. Now we need to gain scale and solutions, so, like other languages, we need some killer applications / frameworks. These might be a multitude, since there are several Lisp dialects and each one may pursue something different.

Friday, May 07, 2010

3rd European Lisp Symposium, day 2

Tutorial: Parallel Programming in Common Lisp by Pascal Costanza

09:00 - 10:30

Parallel programming is the wave of the future: It becomes harder and harder to increase the speed of single-core processors, therefore chip vendors have turned to multi-core processors to provide more computing power. However, parallel programming is in principle very hard since it introduces the potential for a combinatorial explosion of the program state space. Therefore, we need different programming models to reduce the complexity induced by concurrency.

Common Lisp implementations have started to provide low-level symmetric multi-processing (SMP) facilities for current multi-core processors. In this tutorial, we will learn about important parallel programming concepts, what impact concurrency has on our intuitions about program efficiency, what low-level features are provided by current Common Lisp implementations, how they can be used to build high-level concepts, and what concepts Lispers should watch out for in the near future. The tutorial will cover basic concepts such as task parallelism, data parallelism and pipeline models; synchronization primitives ranging from compare-and-swap, over locks and software transactional memory, to mailboxes and barriers; integration with Lisp-specific concepts, such as special variables; and last but not least some rules of thumb for writing parallel programs.


I questioned Pascal about the portability of parallel code, and there is none up to now. This smells like an opportunity.

On a question of mine about the need to synchronize the functional data structures internally - for instance, FSet is portable - I got a book reference: Purely Functional Data Structures. It was suggested that I read it to understand why there is no need for synchronization. THOUGHT: my lack of an academic background in computer science is to blame for my lack of knowledge about things like this, even after having read about it with respect to the Clojure data structures.
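Later I tried to make the point concrete for myself with Clojure's persistent maps (my own toy example, nothing from the talk):

```clojure
;; Persistent ("purely functional") structures never mutate in place:
;; an update returns a new structure sharing most of the old one, so
;; concurrent readers need no synchronization at all.
(def base {:a 1 :b 2})

(def extended (assoc base :c 3))
;; base is untouched and may be read from any thread while another
;; thread builds extended; neither can observe the other changing.
```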

Session III (The Language)

11:00 - 12:45

CLoX: Common Lisp Objects for XEmacs by Didier Verna

Debugging CLoX is a living hell.

Didier made an excellent and very dynamic presentation. There should be no problem in moving what he got up-to-now into GNU Emacs. He intends to optimize some parts to C, though, and then it would be a problem, because XEmacs C level is incompatible with GNU Emacs C.

CLWEB: A literate programming system for Common Lisp by Alexander Plotnick

CLWEB is language specific, with advantages for Common Lispers. No tracking back to the source file position yet. There is complexity due to the different objectives of the CLWEB reader and the Common Lisp reader. The Common Lisp STREAM libraries and pretty printer are great! It has a code walker for Common Lisp - based on the one from the macro-expand-all paper(?). It would be nice if there were more standard ways to interact with the Lisp operators, so as to have an easier time creating a walker. Works in SBCL, Allegro and another (Clozure?).

Keynote: Lots of Languages, Tons of Types by Matthias Felleisen, Northeastern University

Since 1995 my research team (PLT) and I have been working on a language for creating programming languages - small and large. Our code base includes a range of languages, and others contribute additional languages on a regular basis. PLT programmers don't hesitate to pick our lazy dialect to implement one module and to link it to a strict language for another module in the same system. Later they may even migrate one of the modules to the typed variant during some maintenance task.

An expressive macro system is one key to these riches of languages. Starting with the 1986 introduction of hygienic macros, the SCHEME world has worked on turning macros into tools for creating proper abstractions. The first part of my talk will briefly describe this world of modern macros and its key attributes: hygiene, referential transparency, modularity of macros, phase separation, and macro specification.

The second part of my talk will focus on how to equip LISP-like languages with sound type systems, and that will illustrate the second key idea, namely, monitoring the interactions between different languages. Our approach to type systems allows programmers to stick to their favorite LISP idioms. It mostly suffices to annotate functions and structures with type declarations during maintenance work. To ensure the soundness of this information even when higher-order values flow back and forth between typed and untyped modules, module boundaries are automatically equipped with software contracts that enforce type-invariants at all levels.

PLT Scheme is now Racket. Types are very important for maintenance work.

Type theory is stupid. Design driven to make typed Lisp programming practical. Combining the macro system and the type system is still an unresolved challenge.

Panel – does Lisp matter?

I'm part of the panel, and therefore, instead of notes, the following is what I prepared to say. In the end the specifics weren't followed...

My small intervention notes

  1. Name
  2. disclaimer: less Lisp experience than the majority of the people in the room. Although I'm a SISCOG employee, what I will say represents my opinions and I don't know if they are aligned with those of the SISCOG administration or with those of the majority of SISCOG employees.
  3. background:
    1. Physics Engineering, some experience in software development both for data processing and analysis and for scientific instruments
    2. then I started working as a software engineer in telecommunications domain, specifically, in the management of telecommunications networks
    3. about 3 years ago I joined SISCOG and started working in the railway planning and management domain and yes, programming in Common Lisp.

So, SISCOG was founded by 2 professors of IST - an engineering academic institution. They wanted to use Artificial Intelligence, and in the 80s that meant Lisp.

Question: Does Lisp matter?

Yes it does, because of the following three things:

  • Inspiration – it has been an inspiration to lots of people and languages. This is nothing new, but many of the new languages still steal from Lisps. And normally they are still crippled copies. So Lisp would matter even if it didn't exist anymore, although with open source, as Kent Pitman said yesterday, it isn't possible to kill it.
  • Edge – it provides an edge for its users. Be it due to its dynamic nature, macros, tools, history, smart people magnet, etc.
  • Research – Lisp is very important due to its flexibility and the ease of experimenting with new ideas. I will just refer, as an example, to the previous keynote by Matthias Felleisen.

Quote by Matthias Felleisen:

we are the cockroaches of the world, nobody can kill us!

Announcements, Symposium wrap-up

Didier Verna announced that the 4th European Lisp Symposium will be held next year in Hamburg, Germany, from March 30 to April 1. The deadline for submissions will be around the end of the year. This time around I will try to have something to present. Theme: parallelism and ...

Thursday, May 06, 2010

3rd European Lisp Symposium, day 1

I'm participating in the 3rd European Lisp Symposium. My employer, SISCOG, is sponsoring my participation and that of a bunch of others, like the other Luís Oliveira, of CFFI fame.

What follows are my notes of day one of the 3rd ELS, with an organization which follows the programme.


Wifi not working...

The TI Explorer Lisp Machine. It is possible to submit ideas for lightning talks.

Keynote: Going Meta: Reflections on Lisp, Past and Future by Kent Pitman, HyperMeta Inc.

Over a period of several decades, I have had the good fortune to witness and influence the design, evolution, standardization and use of quite a number of dialects of Lisp, including MACLISP, T, Scheme, Zetalisp, Common Lisp, and ISLISP. I will offer reflections, from a personal point of view, about what enduring lessons I have learned through this long involvement.

Both the programming world and the real world it serves have changed a lot in that time. Some issues that faced Lisp in the past no longer matter, while others matter more than ever. I'll assess the state of Lisp today, what challenges it faces, what pitfalls it needs to avoid, and what Lisp's role might and should be in the future of languages, of programming, and of humanity.

Maybe Kent would rather collaborate on data standards than on language standards – the objective being to have more choice when selecting which programming language and OS you use to do your work...

Interesting: during the AI Winter, due to the risk of Lisp failing altogether (vendors were near bankruptcy), there was some need to control the news so that it was more positive about Lisp. Then Gabriel presented Worse is Better and plenty of people got mad at him.

Quote of a saying someone had at Symbolics:

At Symbolics, we make hard problems easy and easy problems harder.

Lisp has become the nerd of the programming languages. Lisp became detached. At that time Lisp was different, and that was very relevant.

Book references: You Are Not a Gadget by Jaron Lanier; Crossing the Chasm

Some people like to be guided; actually, they would pay to be guided. Smart people often don't get this.

The effects of free software: languages can no longer be killed.

The effects of the internet: Lisp being used on the server, hidden.

The promise of Clojure: multi-core, it is connected (being in the JVM), vision (Rich Hickey).


Why is Lisp both the language and a runtime? Why isn't it just the language? One potential problem is CLOS; maybe we need to drop it and actually try to move in a way that Lisp becomes more attached.

Session I (Mathematical Applications)

Verifying monadic second order graph properties with tree automata by Bruno Courcelle and Irène Anne Durand

Hard to understand, and I didn't follow it fully...

A DSEL for Computational Category Theory by Aleksandar Bakic

A software engineer who worked on this as a hobby. He used CLOS and functions, trying to mix the two to improve the state of the art of DSL design for this application, based on what was achieved in other DSLs built on more functional languages. Book reference; the book uses Standard ML.

Keynote: Reading the News with Common Lisp by Jason Cornez, RavenPack

The financial industry thrives on data: oceans of historical archives and rivers of low-latency, real-time feeds. If you can know more, know sooner, or know differently, then there is the opportunity to exploit this knowledge and make money. Today's automated trading systems consume this data and make unassisted decisions to do just that. But even though almost every trader will tell you that news is an important input into their trading decisions, most automated systems today are completely unaware of the news - some data is missing. What technology is being used to change all this and make news available as analytic data to meet the aggressive demands of the financial industry?

For around seven years now, RavenPack has been using Common Lisp as the core technology to solve problems and create opportunities for the financial industry. We have a revenue-generating business model where we sell News Analytics - factual and sentiment data extracted from unstructured, textual news. In this talk, I'll describe the RavenPack software architecture with special focus on how Lisp plays a critical role in our technology platform, and hopefully in our success. I hope to touch upon why we at RavenPack love Lisp, some challenges we face when using Lisp, and perhaps even some principles of successful software engineering.

He is the CTO of RavenPack. No Lisp code in the presentation, but he is passionate about Lisp and software engineering and still writes plenty of code.

Quant traders. The opportunity: news is important, but the algorithms didn't use it. News metadata is unreliable. 27,000 companies tracked. Managing the historical news database: entity detection and time-dependent data.

They use Lisp because they like Lisp, not because it was a problem that only Lisp could solve. Their customers don't care about RavenPack using Lisp. They also don't sell software; they sell data.

They worked with Franz to improve the interface to the Oracle DB. Although they were acquainted with some solutions like CL-SQL, it didn't support some things they wanted, like stored procedures. Other tools being used are: Linux, Git, Windows, the Jakarta HTTP client, Java for integration, Apache Server, etc.

Common Lisp technologies: CL-XMPP, Jlinker (Franz), Allegro Server (Franz), Lisp Server pages, Salza compression, CL-store, etc.


  • Bayesian filters – human experts classify news as Ham or Spam, which serves as training data. Accuracy of around 68%.
  • Restricted Boltzmann machine – compresses some complex full representation into its minimal representation. Accuracy of around 87%. It isn't enough for quant trading.
  • Processing headlines, based on templates or patterns. A kind of rule processing.
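Out of curiosity, the Bayesian-filter bullet can be sketched as a toy naive Bayes classifier in Java. This is my own illustration of the idea, not RavenPack's implementation: labelled headlines train per-class word counts, and new headlines are scored with smoothed log-likelihoods.

```java
import java.util.HashMap;
import java.util.Map;

// Toy naive Bayes ham/spam scorer for news headlines (illustrative only).
public class ToyNewsFilter {
    private final Map<String, Integer> hamCounts = new HashMap<>();
    private final Map<String, Integer> spamCounts = new HashMap<>();
    private int hamDocs = 0, spamDocs = 0;

    // Human experts provide the labels, as in the talk.
    public void train(String headline, boolean ham) {
        Map<String, Integer> counts = ham ? hamCounts : spamCounts;
        for (String w : headline.toLowerCase().split("\\s+"))
            counts.merge(w, 1, Integer::sum);
        if (ham) hamDocs++; else spamDocs++;
    }

    // True when the headline scores at least as ham-like as spam-like.
    public boolean isHam(String headline) {
        double ham = Math.log((hamDocs + 1.0) / (hamDocs + spamDocs + 2.0));
        double spam = Math.log((spamDocs + 1.0) / (hamDocs + spamDocs + 2.0));
        for (String w : headline.toLowerCase().split("\\s+")) {
            // Laplace smoothing so unseen words don't zero the score.
            ham += Math.log((hamCounts.getOrDefault(w, 0) + 1.0) / (hamDocs + 2.0));
            spam += Math.log((spamCounts.getOrDefault(w, 0) + 1.0) / (spamDocs + 2.0));
        }
        return ham >= spam;
    }
}
```

A real system would of course use richer features and far more training data than headline word counts.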

Software architecture: boxes for receiving data, boxes for emitting data, boxes for analysis of historical data. Oracle at the center.

Collectors (read news) -> Real-time Analysis -> Event Server -> 
 Lisp                       Lisp + CL-Store        CL-XMPP (chat rooms)

Lisp frustrations: GC (Lisp is behind the curve compared with the JVM and CLR), handling of out-of-memory errors, weak library support for HTTP, Lisp as a scripting language being chatty on *standard-output*, and, last but not least, a lot of unfinished libraries.

Lisp Joy: patch live systems, optional keyword parameters, tools (debugger, REPL, tracer, inspector), FFI and macros.

Book reference: Coders at Work by Peter Siebel.


15:30 - 16:00

I was indirectly invited via Tiago Maduro Dias to participate in the final panel of the 3rd ELS. After about a minute considering whether I was the appropriate person to represent SISCOG, I said to myself: yeah, I'll do it, it will be an honor, let me see if I'm capable of it!

Session II (The Outside World)

16:00 - 18:00

Marrying Common Lisp to Java, and Their Offspring by Jerry Boetje and Steven Melcher

How Lisp looks in Java. A very detailed explanation of how Common Lisp is implemented in Java in CLforJava.

THOUGHTS: I wish there were a similar design overview for ABCL!...

Tutorial: Computer Vision with Allegro Common Lisp and the VIGRA Library using VIGRACL, by Benjamin Seppke and Leonie Dreschler-Fischer

Offers filters, morphological operations, feature detectors, segmentation, transformations... All at the comfort of an Allegro REPL near you :-)

VIGRA Library - no, Google, it's not Viagra! C++, template based, unit tested, multi-platform, and fairly complete in terms of available algorithms.

VIGRACL uses defsystem. It has a multilayer design with 4 layers: at the bottom, the VIGRA library in C++; then a C wrapper; then the Allegro Common Lisp FFI; and finally an abstraction layer implemented in Common Lisp on top.

Design: multi-layer, slide.

Saturday, May 01, 2010

SISCOG is hiring developers

SISCOG is hiring for our projects department. If you want to program in Common Lisp professionally, are interested in the railway domain (planning and management) and want to be my colleague ;-), please apply.

Saturday, March 20, 2010

The Functional Object System idea

Functional languages and functional programming techniques are getting into the mainstream of software development these days. Clojure is an example of a recently created language (a Lisp dialect) that, although not purely functional, applies the techniques and principles of functional programming very seriously and, IMHO, with success. Now, how do we combine this in a seamless way with object-oriented programming? OOP has up to now been very dependent on objects having mutable state that is changed destructively. This isn't compatible with functional data types, which, although they have state, keep it immutable: the objects' data is shared and construction-only, and therefore reusable.

My idea is that we need a new object paradigm which enables easy definition of immutable objects that have the same nature as functional data types. I've seen this to some degree in a specific SISCOG library, defined as an extension to CLOS, that enables an object state tree – much more advanced than the typical command-pattern, stack-based undo/redo functionality you commonly see. It could be implemented in Java with aspect-oriented programming (i.e., with AspectJ). Alas, this library suffers from its objective not being specifically to provide a Functional Object System (FOS), and from using the mutable Common Lisp standard types – once you get hold of one of its fields, you can wreck the state indirectly. It also relies on the common with-style wrapper macros to change from state to state, instead of assuming that all state changes shall be construction-only and therefore dropping the wrapper macros.

So, what specific concepts must a FOS take to heart in order to work?

  • Differentiate between query operations and construction operations.
  • Construction operations always return new objects that have one or more fields with values different from the objects they clone.
  • There shall be ways to change the state of the application, pointing some reference to a new root node tree, but these shall follow concurrency- (and maybe distribution-) aware protocols, such as the ones defined in Clojure. These aren't necessarily part of the FOS, but the FOS must play well with them.
  • The FOS shall provide means for its objects to take advantage of non-FOS objects. E.g., a FOS for Clojure must provide a largely inclusive way for impure, mutable, Java-defined objects to be used with it, even if that means taking advantage of introspection on those objects' operations and expensive quasi-duplication of them without the benefit of reuse.
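To make the query-versus-construction distinction concrete, here is a minimal sketch in plain Java. Point and its method names are just an illustration of the principle, not a proposed FOS API: queries read state, and "construction operations" return fresh objects instead of ever mutating.

```java
// An immutable, construction-only object: all fields are final, queries
// read them, and the construction operations return new instances.
public final class Point {
    private final double x, y;
    public Point(double x, double y) { this.x = x; this.y = y; }

    // Query operations: read state, never change it.
    public double x() { return x; }
    public double y() { return y; }

    // Construction operations: return a modified copy; the receiver is
    // untouched, so every previously held reference stays valid.
    public Point withX(double nx) { return new Point(nx, y); }
    public Point translated(double dx, double dy) { return new Point(x + dx, y + dy); }
}
```

With larger aggregates, structural sharing (reusing unchanged sub-objects in the copy) is what keeps this style affordable, which is exactly what the state-tree library mentioned above does not guarantee.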

So, now that I have one more crazy idea thrown onto the web, let's check prior art...

OCaml has objects and includes a basic operation called functional update. This provides the basis for maintaining that an object is data plus behavior while keeping it side-effect free. The convenience it provides is very interesting; it is almost more terse than actually mutating the objects! Even so, you have the possibility of using an imperative style, but it isn't the default, and you must declare the fields that actually change as mutable.

In Clojure it is all about concurrency, so you would have to guard against concurrent access to objects that are being mutated. There might be additional semantics involved which weren't obvious in the small part of the OCaml documentation that I read.

Gentoo and automated tests

With Gentoo Linux you already have the sources for all the software you install. It would be nice if you could also execute the automated tests that the software packages have. If the practice of also shipping deployment tests in software packages were common enough, you could in the end check whether the whole stack was working as expected by the authors of the software, and provide very valuable diagnostic information to open source software projects.

Deployment automated tests

A nice thing I read in an agile book was being able to execute the automated tests after deploying the software, as a means of diagnosing whether the installation is alright. I think such a practice should be commonplace for diagnosing whether a certain deployment is correct.
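A minimal sketch of what such a post-deployment smoke test could look like in Java. The specific checks here (JVM version present, writable temp directory) are placeholders; a real package would verify its own config files, data directories, and service endpoints the same way.

```java
import java.io.File;

// Cheap post-deployment sanity checks: run once after installing and
// report a diagnosis, instead of waiting for users to find the breakage.
public class DeploymentSmokeTest {
    public static boolean run() {
        boolean ok = true;
        // Check the runtime we were deployed on.
        ok &= System.getProperty("java.version") != null;
        // Check that a writable scratch area exists; config and log
        // directories would be checked the same way in a real package.
        ok &= new File(System.getProperty("java.io.tmpdir")).canWrite();
        return ok;
    }
    public static void main(String[] args) {
        System.out.println(run() ? "deployment OK" : "deployment BROKEN");
    }
}
```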

UML representation in Clojure

UML representation in Clojure, i.e., create a Clojure domain language for representing UML models and diagrams, which should be serializable to XMI and graphically displayable in several formats, like for instance the ArgoUML format and SVG. Another idea for ELS 2010...

Distributing Clojure

Make a framework for distributed computing with Clojure. Another idea for ELS 2010...

Exploring the clouds

A parallel system to explore the infrastructure of cloud providers. With Amazon, use its web services to deploy clients and listen for the results. With Google AppEngine, ping hosted applications. Use Google Web search to find hosted applications. Do this with Clojure. How about Microsoft's cloud? ClojureCLR?!?

Common Lisp design patterns

Actually, a catalog of common Common Lisp idioms. Some idioms generalize to all major Lisp dialects. Another nice idea for ELS 2010.

  • reduce map
  • defx or def-x
  • dox or do-x
  • <conditional-operator>-let or b<conditional-operator> [b is for binding]

Provide examples from the SISCOG code base?

Each pattern should have an associated implementation strategy for being defined with macros.
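For instance, the "reduce map" idiom above fuses a mapping step with a reduction over the mapped results. A Java-streams rendering, for illustration only since the catalog targets Common Lisp (where the spelling would be REDUCE over MAPCAR), might look like:

```java
import java.util.List;

// The "reduce map" idiom: transform each element, then fold the results.
public class ReduceMapIdiom {
    // Sum of squares as map (x -> x*x) followed by reduce (+).
    public static int sumOfSquares(List<Integer> xs) {
        return xs.stream().map(x -> x * x).reduce(0, Integer::sum);
    }
}
```

In Lisp the macro-based implementation strategy would generate such fused map/reduce pipelines from a declarative pattern description.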

Reverse engineering of XML specifications

Make a module to handle reverse engineering of XML specifications. One obvious use would be to model XMI's representation of UML in UML. A specific usage interesting for SISCOG's domain would be to reverse engineer the RailML specs.

Make this work with NSN BicNet Tnml and bring Tnml into the UML world, as I was proposing internally some years ago.

Change the ArgoUML plugins technology to be based on OSGi

ArgoUML has modules (ArgoUML's terminology for its plugins) as an extension mechanism of the application. The most common use cases are the programming language modules, but you can also have core modules, like for instance the sequence diagram module or the UML property panels module. The problem with the ArgoUML modules, as I see it now, is that ArgoUML is reinventing the wheel, and that this wheel has its share of limitations.

  • One of the biggest limitations is not being able to specify a dependency of one module on another.
  • Another problem is that it is a bit of work to define a module whose only purpose is to integrate a third-party library into ArgoUML.
  • The final thing I dislike is that, for one reason or another, the installation of ArgoUML currently includes all standard modules in the classpath by generating a hard-coded list of the jars (see META-INF/MANIFEST.MF in argouml.jar in an installation of ArgoUML).

My idea is that we should base our modules or plugins on OSGi. I think this isn't very hard to achieve, because the current scheme doesn't differ very much from what plain OSGi requires. In part this was done out of necessity for the Argoeclipse project – the definition of the Eclipse build system based on the MANIFEST.MF files. The other part is simply due to the use of good object-oriented design with the Java language, which is related to the interface involved in the definition of the base module (see ) and the specific interfaces for functionality-oriented modules, like for instance .
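For illustration, a module's MANIFEST.MF under plain OSGi might declare its identity and its dependency on the core explicitly. The bundle names below are hypothetical, not ArgoUML's actual ones:

```
Bundle-ManifestVersion: 2
Bundle-SymbolicName: org.argouml.language.cpp
Bundle-Version: 0.30.0
Require-Bundle: org.argouml.core;bundle-version="0.30.0"
Import-Package: org.osgi.framework
```

Require-Bundle (or, preferably, finer-grained Import-Package declarations) would give us exactly the inter-module dependency specification that the current scheme lacks, and the OSGi framework would resolve the classpath instead of a generated hard-coded jar list.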

Updating ArgoUML tests to JUnit 4.x

We should update the ArgoUML tests to JUnit 4.x, in order to take advantage of some of its new features. Since ArgoUML now must be compatible with JSE 5, this should be no problem.

Updating ArgoUML libraries

There are several libraries in ArgoUML that need to be updated: easymock, JUnit, etc.

Organizing ArgoUML's automated tests

ArgoUML has all its automated tests mixed in a single directory tree for each of the sub-projects of which it is composed. I think that as a project we should differentiate the levels of testing according to the nature of those tests: unit, module, integration, acceptance [non-existent as of now] and deployment. This would enable the TDD cycle, by having unit tests execute in seconds instead of minutes. It would also be very educational for the testing practices in ArgoUML.

UML profile for Lisp

Creating a UML profile for Lisp should be very interesting. It would be an excellent exercise for a lightning talk at ELS 2010, an excellent opportunity to test the ArgoUML profile support in depth, and a learning opportunity for me into the depths of the extensibility aspects of UML.

My idea is that there are some common patterns amongst the several dialects, and that those patterns should in principle be modeled in a reusable way in a common profile, maybe a PIM.

Then, each Lisp has its specific namespace system, macro system and library. These must be derived with a PIM-to-PSM transformation and then developed on their own.

A REPL for each would also be required.

A connector to live Lisps to enable reverse engineering is also interesting. This is something that could share a common base project, but would naturally diverge for each Lisp dialect.

A multiple languages REPL in ArgoUML

A sub-project of ArgoUML that provides a base REPL infrastructure to enable easy development of multiple language REPLs for ArgoUML. I would make one for each of the dynamic language modules that exist. Even for the C++ module, or other non-dynamic language modules, a special-purpose and convenient REPL could be interesting if it contained the relevant imports for interacting with the model. Modeling at the REPL with Common Lisp (ABCL), Clojure or Jython, what a dream!

I'm trying to embed ABCL in ArgoUML, opening a REPL in Emacs, but because my SLIME setup for ABCL is broken, I'm having difficulties. The idea is to interactively develop an ETL from a UML 1.4 model into a UML 2.2 model, to make it easy to update the profiles from UML 1.4 XMI into UML 2.2 XMI.

One possible solution would be not to use Emacs and SLIME. For instance, I could reuse the NetBeans REPL. Since NetBeans already has Jython and JRuby support, I could try to port those easily into ArgoUML.
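As a sketch of such a base REPL infrastructure, the standard javax.script (JSR 223) API could serve as the integration point: ArgoUML would look an engine up by name and evaluate model-manipulation snippets in it. The engine name below is an assumption; ABCL or Jython would each need a registered ScriptEngine adapter.

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;

// Hypothetical core of a multi-language REPL: resolve an engine by name
// and evaluate an expression, reporting unavailability rather than
// failing, since not every JVM ships every engine.
public class ModelRepl {
    public static String tryEval(String engineName, String expr) {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName(engineName);
        if (engine == null) return "engine " + engineName + " unavailable";
        try {
            return String.valueOf(engine.eval(expr));
        } catch (Exception e) {
            return "error: " + e.getMessage();
        }
    }
}
```

A per-language module would then only need to register its engine and supply the convenient model-oriented bindings; the REPL loop itself stays shared.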

Auto coordinated traffic

A system or platform that signals drivers' intent from car to car, so that the cars could be smart enough to warn drivers about what other people intend to do, or even take appropriate measures to avoid accidents.

TODO: elaborate on this.

TODO: develop the idea of building computational models/simulations where the platform modules could be tested and developed. Do it in Clojure, with visualization in Google Maps and a site on Google AppEngine. Do this in the programacao handbook.

Tuesday, March 09, 2010

