migrating from python

I'm working on semantics and triples (RDF & co.). My Python code for inference is based entirely on dictionaries (associative arrays), nested three or four levels deep. The result astonishes me: compact, beautiful, modular, and extremely readable.

I imagine the Haskell way should be different, but I'm in the dark.

Since the stream of triples coming from outside determines the actions to be taken for inference, searching through keys (the words that form each triple) is the main job: every time I pick up a triple, I have to extend the knowledge more or less "near" the resources represented by those words. And after I have (possibly) produced some new inferred triples, I certainly have to go back over some already-read triples, matching a pattern.

All of this is very natural with dictionaries, so I'd like to figure out the Haskell view.

Thanks for your patience and answers,

Paolino
--
....lotta dura per la verdura
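For concreteness, here is a minimal sketch, assuming today's Data.Map and Data.Set, of how those nested dictionaries might translate to Haskell; the type and function names are only illustrative, not from any actual code in this thread:

  import qualified Data.Map as Map
  import qualified Data.Set as Set

  type Resource = String

  -- subject -> predicate -> set of objects, mirroring the nested dicts
  type Graph = Map.Map Resource (Map.Map Resource (Set.Set Resource))

  -- Add a triple, creating the inner map and set if they are missing.
  insertTriple :: Resource -> Resource -> Resource -> Graph -> Graph
  insertTriple s p o =
      Map.insertWith (Map.unionWith Set.union) s
                     (Map.singleton p (Set.singleton o))

  -- All objects recorded for a given subject and predicate.
  objectsOf :: Resource -> Resource -> Graph -> Set.Set Resource
  objectsOf s p g = maybe Set.empty id (Map.lookup p =<< Map.lookup s g)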

At 16:34 13/07/04 +0200, paolo veronelli wrote:
I'm working on semantics and triples (RDF & co)
I've been working on something very similar, in Haskell. I also did some work in Python before moving to Haskell. My project is Swish [1]. (I've also just completed coding and testing an RDF/XML parser, which I've yet to integrate into Swish.)

It is my experience that at the heart of almost any inference process I have tried is a query of the RDF graph. (I've played with conventional rules and some class-based inference approaches, and a key operation seems to be query.)

I don't know how different you expect the "Haskell way" to be -- it's maybe less so than one might expect. My own experience is that Haskell works more like a specification language than conventional programming approaches, and that it's relatively easy to maintain a close correspondence between executable code and a logical description of the domain information. But actually directing (planning) an inference process remains a tricky problem.

Python's dictionaries are neat, and very easy to use. With Haskell you have to choose a mechanism, but there are many there, waiting to be used. So far, all my work has used a very primitive linear search (horribly inefficient, I know, but efficiency hasn't been my primary concern, and it's easy enough to swap out one mechanism for another).

I'm not sure what kind of information you're after, so there's not much more I can say at this stage. But have fun with the functional way! It's taken me a while to get a feel for it; sometimes things just seem to be unreasonably easy, and at other times it seems little different from any other language.

#g
--
[1] http://www.ninebynine.org/RDFNotes/Swish/Intro.html
------------
Graham Klyne
For email: http://www.ninebynine.org/#Contact
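As an illustration of the linear query Graham mentions (this is not Swish's actual API, just a sketch in which Nothing stands for a wildcard in the query pattern):

  data Triple = Triple { subj, pred, obj :: String } deriving (Eq, Show)

  -- A query pattern: Nothing matches anything in that position.
  type Pattern = (Maybe String, Maybe String, Maybe String)

  matches :: Pattern -> Triple -> Bool
  matches (ms, mp, mo) t = ms `fits` subj t && mp `fits` pred t && mo `fits` obj t
    where fits Nothing  _ = True
          fits (Just x) y = x == y

  -- A primitive linear scan: O(n) per query, but easy to replace later
  -- with an indexed structure without changing the callers.
  query :: Pattern -> [Triple] -> [Triple]
  query p = filter (matches p)

For example, query (Just "paolo", Nothing, Nothing) graph would return every triple whose subject is "paolo".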

paolo veronelli wrote:
All these are very natural with dictionaries, so I'd like to figure out the haskell view.
I've no clue what you're doing (or, for that matter, what RDFs are), but is there any reason Data.FiniteMap doesn't do the job?

-kzm
--
If I haven't seen further, it is by standing in the footprints of giants
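Data.FiniteMap was the finite-map library of the day; its descendant is Data.Map, used in this hypothetical sketch of an index from each word of a triple to the triples mentioning it, roughly what the nested Python dictionaries provide:

  import qualified Data.Map as Map

  type Triple = (String, String, String)

  -- Index every resource (word) occurring in a triple to the triples
  -- that mention it.
  type Index = Map.Map String [Triple]

  indexTriples :: [Triple] -> Index
  indexTriples ts =
      Map.fromListWith (++) [ (w, [t]) | t@(s, p, o) <- ts, w <- [s, p, o] ]

  -- All triples mentioning a given resource.
  mentioning :: String -> Index -> [Triple]
  mentioning = Map.findWithDefault []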