Non-technical Haskell question

Hi, all, I'm new to the Cafe, but not to Haskell (experimented with it on and off on a small scale over the last 5 or 6 years and enjoy the language quite a lot) and had more of a political question and wanted to see what people thought: Has anyone tried presenting the language to the average rank and file programming community? If so, was it successful? If not, is there interest in doing so? By "rank and file" I mean outside of the academic world, where a large number of the programmers I see have very little math background. This would be the typical commercial Visual Basic crowd and the like. Thanks, Will Collum

On Tue, 30 Nov 2004, GoldPython wrote:
Hi, all,
I'm new to the Cafe, but not to Haskell (experimented with it on and off on a small scale over the last 5 or 6 years and enjoy the language quite a lot) and had more of a political question and wanted to see what people thought:
Has anyone tried presenting the language to the average rank and file programming community? If so, was it successful? If not, is there interest in doing so?
By "rank and file" I mean, outside of the acedemic world where a large number of the programmers I see have very little math background. This would be the typical commercial Visual Basic crowd and the like.
Even inside the mathematical academic world, Haskell is not widely known or even used. So, I'm advocating Haskell wherever I feel it is advantageous compared to the languages in use. Though the success I've had makes me doubt whether advocacy is always good. Programmers of traditional languages convinced me that it is possible to write bad code in Haskell. :-) It's also hard or impossible to convince people that it is good that some things, such as global variables, are missing from Haskell for good reasons. I've observed that people don't like advocacy but want to make the common errors themselves. It is better if they make them in C, Perl, MatLab, whatever. Once they have made enough mistakes and wonder if there are ways around them, they are at the point where advertising Haskell makes sense.

GoldPython
Has anyone tried presenting the language to the average rank and file programming community? If so, was it successful? If not, is there interest in doing so?
The #haskell IRC channel on irc.freenode.net is composed of many different flavors of programmer, from self-educated 16-year-olds on up to postdoctoral students studying functional programming. I'm a self-educated, self-employed programmer. I use Python in most of my paying work but would very much prefer to use Haskell. It seems obvious to me (but not to most of my clients :-) that the various powerful and expressive patterns in Haskell allow programmers to deliver more business value in less time than almost any other programming language.
By "rank and file" I mean, outside of the acedemic world where a large number of the programmers I see have very little math background. This would be the typical commercial Visual Basic crowd and the like.
I have no math background. I started with BASIC on a Sinclair, and my first real programming job was with Visual Basic 4, 5, and 6 for trust management. It seems that Haskell is about finding essential patterns and making those available for easy use. Most of my code starts out fuzzy and complicated, but as I understand the problem better, my code gets smaller. In the process, I find and refactor more and more places where my code is special or general cases of Prelude functions. Implicit For Each looping in Visual Basic was easier than manual looping in C. The map function in Python was another improvement. Now I have monads as the next step up in power, and I'm reading up on arrows. I'd think every programmer who preferred Visual Basic over C would end up loving Haskell. -- Shae Matijs Erisson - http://www.ScannedInAvian.com/ - Sockmonster once said: You could switch out the unicycles for badgers, and the game would be the same.
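A minimal sketch of the kind of collapse-into-the-Prelude refactoring described above; the function and data are invented for illustration, not taken from anyone's real code:

    -- Hand-rolled recursion: total length of the non-empty lines.
    totalNonEmpty :: [String] -> Int
    totalNonEmpty []     = 0
    totalNonEmpty (l:ls)
      | null l    = totalNonEmpty ls
      | otherwise = length l + totalNonEmpty ls

    -- The same function once you notice it is just special cases of
    -- Prelude functions glued together.
    totalNonEmpty' :: [String] -> Int
    totalNonEmpty' = sum . map length . filter (not . null)

The second version is typically what falls out after a few rounds of the "my code gets smaller" process.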

On Tue, Nov 30, 2004 at 04:57:27PM +0100, Shae Matijs Erisson wrote:
The #haskell IRC channel on irc.freenode.net is composed of many different flavors of programmer, from self-educated 16-year-olds on up to postdoctoral students studying functional programming. I'm a self-educated, self-employed programmer. I use Python in most of my paying work but would very much prefer to use Haskell. It seems obvious to me (but not to most of my clients :-) that the various powerful and expressive patterns in Haskell allow programmers to deliver more business value in less time than almost any other programming language.
I've been sitting around #haskell on EfNet for something like 5 years and wondering why no one ever came by. -- wli

On 2004-11-30, GoldPython
Has anyone tried presenting the language to the average rank and file programming community? If so, was it successful? If not, is there interest in doing so?
I am very interested in doing that. I'm a relatively recent Haskeller. I come from a background with a lot of I/O type problems: databases, networking, etc. Haskell works nicely in this domain, too, but it is severely under-documented and, in some cases, under-supported. My main project right now is my MissingH library. Loosely speaking, its goal is to provide pure Haskell implementations of all the stuff I miss from the Python standard library. It's been a great way for me to learn Haskell, and also goes a long way to making Haskell useful in the everyday problems I have to solve. Right now, it has a bunch of string utilities, a bunch of I/O utilities, a Printf implementation, an FTP client module, some MIME stuff, etc. I almost have my ConfigParser module done, which is a clone of the Python module. (This is now the second time I've written ConfigParser from scratch; my MissingLib project for OCaml also has it.) I also have a very small start on a "haskell for hackers" ("hackers" in the non-evil sense) sort of document. One that doesn't ignore I/O as "hard" or "unimportant". I/O in Haskell doesn't suck. It's just that a lot of people in the community don't have it as a high priority, I think. I also want MissingH to be complete enough that I can use it to port my OfflineIMAP program to Haskell. That would mean adding an IMAP client library and Maildir tools. I'm also a firm believer that code is one of the best forms of tutorial. I hope people will be able to refer to MissingH and other similar projects to learn how to do things. I, for instance, referred to the Haskell xml-rpc implementation to learn how to craft a function taking a variable number of arguments, and used that knowledge to write my Printf implementation. Maybe somebody can refer to my FTP module and use that knowledge to write a module for the world's most underused protocol, Gopher :-) -- John
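For the curious, a hedged sketch of the variable-number-of-arguments trick mentioned above; the class and names here are invented for illustration and are not MissingH's or Text.Printf's actual API:

    {-# LANGUAGE FlexibleInstances, TypeSynonymInstances #-}

    -- A return-type class lets one function accept any number of String
    -- arguments, in the style printf implementations use.
    class JoinResult r where
      joinWith :: [String] -> r

    -- Base case: no more arguments, return the accumulated words.
    instance JoinResult String where
      joinWith = unwords

    -- Inductive case: accept one more String argument, then keep going.
    instance JoinResult r => JoinResult (String -> r) where
      joinWith acc s = joinWith (acc ++ [s])

    joinAll :: JoinResult r => r
    joinAll = joinWith []

    -- joinAll "craft" "a" "variadic" "function" :: String
    --   ==> "craft a variadic function"

The type expected at the call site picks the instance, which is how one name can behave as a function of any arity.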

On Wednesday 01 December 2004 21:18, John Goerzen wrote:
I also have a very small start on a "haskell for hackers" ("hackers" in the non-evil sense) sort of document. One that doesn't ignore I/O as "hard" or "unimportant". I/O in Haskell doesn't suck.
I come from a similar area (large control systems) and I agree wholeheartedly.
It's just that a lot of people in the community don't have it as a high priority, I think.
OTOH, I have the feeling that they are no longer the overwhelming majority.
I'm also a firm believer that code is one of the best forms of tutorial.
Yes, absolutely. I always learn the fastest when I try to change existing code, for whatever reason (if not just to learn). Cheers, Ben

I think you may be asking the wrong question. As one of the "rank and file" and fairly new to Haskell (less than a month) I can tell you that there is a growing awareness of functional programming and that it offers different paradigms to work with. I think the more important question is: "Is Haskell ready?" So far, from the perspective of a newbie, I would say no. The documentation is sparse and confusing, the "standard" libraries seem incomplete, and how compilation and linking are handled feels antiquated. I mean, I think it's a really cool idea, and I'm having fun learning it. But I would be hard pressed to come up with a justification to introduce this into our business environment. Jason GoldPython wrote:
Hi, all,
I'm new to the Cafe, but not to Haskell (experimented with it on and off on a small scale over the last 5 or 6 years and enjoy the language quite a lot) and had more of a political question and wanted to see what people thought:
Has anyone tried presenting the language to the average rank and file programming community? If so, was it successful? If not, is there interest in doing so?
By "rank and file" I mean, outside of the acedemic world where a large number of the programmers I see have very little math background. This would be the typical commercial Visual Basic crowd and the like.
Thanks, Will Collum

On 3 Dec 2004, at 03:48, Jason Bailey wrote:
As one of the "rank and file" and fairly new to Haskell (less then a month) I can tell you that there is a growing awareness of functional programming and that it offers different paradigms to work with.
That's good to hear.
The documentation is sparse and confusing,
Agreed. The hierarchical library documentation is poor in many places.
the "standard" libraries seem incomplete
Compared to Perl or Java? Yes, absolutely. Perl and Java both have enormous libraries of software available, and both have taken years and years to reach the current state. Compared to C/C++ (which are both popular 'real world' languages, of course) I think Haskell isn't doing so badly.
and how compilation and linking are handled feels antiquated.
Can you be more specific here? Jules

Jules Bean wrote:
The documentation is sparse and confusing,
Agreed. The hierarchical library documentation is poor in many places.
As well as a lack of decent online tutorials, examples, etc. If it wasn't for the book /Haskell: The Craft of Functional Programming/ I would be much farther back in my comprehension than I am now.
the "standard" libraries seem incomplete
Compared to Perl or Java? Yes, absolutely. Perl and Java both have enormous libraries of software available, and both have taken years and years to reach the current state. Compared to C/C++ (which are both popular 'real world' languages, of course) I think Haskell isn't doing so badly.
I don't think you can really compare Haskell with the C's. C/C++, for the time being, is the basis of most low-level APIs. They don't really need a large standard library because their packages are available everywhere and are easily installed and updated. Other languages, such as Perl, Java, and Python, need to supply their own extensive libraries just to compete, to say, in effect: look, you can do whatever you want in our language just as you could in C/C++. Haskell has been around for quite a while, longer than Java or Python, and almost as long as Perl. Yet it doesn't have half the inherent library functionality that these other languages have. I find it curious. I like Haskell and I think it has a lot of promise. I just don't see why this problem exists.
and how compilation and linking are handled feels antiquated.
Can you be more specific here?
First off let me say that I come from a world of Java, with the occasional foray into scripting languages, and I only do C when forced. So yes, I am spoiled :) When I compile a program I expect to run some simple command and have a single end result that is dynamically linked. One of the first things I wrote to get a feel of Haskell was a small program that popped open a gtk window with a button, and every time I clicked the button it incremented a counter. When I compile it I get three files: an actual runnable binary (at only 5M in size), a .o file and a .hi file. I'm sure these additional files are useful in some way, and as soon as I come across the right piece of documentation everything should make sense. But as a person new to the language I'm just left wondering why. Static linking: for a single specific program I can see where it wouldn't make a difference. But let's say I want to replace my entire desktop with Haskell-based applications. The sheer amount of additional space that would be required is breathtaking, and if there is a bug discovered in one of the libraries would I then be forced to recompile everything? There might be solutions/answers to my concerns already in place. I am new to this and I realize I am going to miss things. But the fact that I am missing the answers to these is just further emphasis on the problem of documentation. Jason

Jason Bailey wrote:
As well as a lack of decent online tutorials, examples, etc. If it wasn't for the book /Haskell: The Craft of Functional Programming/ I would be much farther back in my comprehension than I am now.
Speaking of books, are there any intermediate/advanced Haskell books? Ones that aren't introduction-to-programming texts? If I own the Hudak book, is it worthwhile to also acquire the Thompson book? Here's an analogy from the Perl universe. If the _School_of_Expression_ is to _Learning_Perl_ (the llama), what are the Haskell equivalents of _Programming_Perl_ (camel) and _Perl_Cookbook_ (ram)? I suppose I should stop being lazy and contribute to the Haskell version of the PLEAC project... http://pleac.sourceforge.net/ Greg Buchholz

Hi All,
On Fri, 03 Dec 2004 10:05:21 -0500, Jason Bailey
Jules Bean wrote: <snipped> I don't think you can really compare Haskell with the C's. C/C++, for the time being, is the basis of most low-level APIs. They don't really need a large standard library because their packages are available everywhere and are easily installed and updated. Other languages, such as Perl, Java, and Python, need to supply their own extensive libraries just to compete, to say, in effect: look, you can do whatever you want in our language just as you could in C/C++.
Haskell has been around for quite a while, longer than Java or Python, and almost as long as Perl. Yet it doesn't have half the inherent library functionality that these other languages have. I find it curious. I like Haskell and I think it has a lot of promise. I just don't see why this problem exists.
The standard library is not as bad as you state here. Okay, programming languages like Python ship everything including the kitchen sink with their "standard libraries", but Perl, for example, lives from CPAN and not from the standard libraries. What we need is an easy-to-handle package system and an online resource like CPAN. Cabal and Hackage are on the way to supporting that. The fact that Java, Perl and so on have more powerful libraries is just because there is a much larger community, and also companies, involved. I believe that Haskell has a very large potential in real-world programming too, since one can produce more stable code in much less time. The reason why it is not so popular is that it is different from what people know, and people tend to use known things. OO programming didn't conquer the world overnight either: Smalltalk and Eiffel had been around for a very long time before C++ and Java came up and conquered the world. Functional programming is certainly not new, and Lisp, for example, is known and used quite a lot. I mean, it needs time to get Haskell into the minds of programmers around the world, and since it is taught at some universities there is a good chance of increasing its popularity over time. If there is a demand, the consulting and training companies will be there; the other way around doesn't work, IMHO.
and how compilation and linking are handled feels antiquated.
Can you be more specific here?
First off let me say that I come from a world of Java, with the occasional foray into scripting languages, and I only do C when forced. So yes, I am spoiled :)
When I compile a program I expect to run some simple command and have a single end result that is dynamically linked.
One of the first things I wrote to get a feel of Haskell was a small program that popped open a gtk window with a button and every time I clicked the button it incremented a counter.
When I compile it I get three files: an actual runnable binary (at only 5M in size), a .o file and a .hi file. I'm sure these additional files are useful in some way, and as soon as I come across the right piece of documentation everything should make sense. But as a person new to the language I'm just left wondering why.
Why the hell do you care about intermediate files generated by the compiler? I am sure they make perfect sense, and the main point is: you don't have to care about them at all. The --make option and the automatic module chasing of ghc are perfect, and I can't see what else you need or what javac can do better. Another thing I'd like to state here is that I have never seen a compiler that produces error messages as nice as ghc does. Considering the fact that type and class errors can be quite difficult, it does an extremely good job in my eyes. Georg -- ---- Georg Martius, Tel: (+49 34297) 89434 ---- ------- http://www.flexman.homeip.net ---------
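For reference, a minimal sketch of that workflow; the module and file names are invented for illustration. With --make, ghc chases the imports itself, and the .o and .hi files it leaves behind are just its object code and interface files:

    -- Counter.hs: a tiny helper module.
    module Counter (bump) where

    bump :: Int -> Int
    bump = (+ 1)

    -- Main.hs: the entry point. Build the whole program with
    --   ghc --make Main.hs -o counter
    -- and rebuild the same way after editing either file.
    module Main where

    import Counter (bump)

    main :: IO ()
    main = print (bump 41)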

When I compile it I get three files: an actual runnable binary (at only 5M in size), a .o file and a .hi file. I'm sure these additional files are useful in some way, and as soon as I come across the right piece of documentation everything should make sense. But as a person new to the language I'm just left wondering why.
This is better than C... ghc makes the 'include' files for you... the .hi files are really module interface files (like .h files in C or C++). Keean.

When I compile it I get three files: an actual runnable binary (at only 5M in size), a .o file and a .hi file. I'm sure these additional files are useful in some way, and as soon as I come across the right piece of documentation everything should make sense. But as a person new to the language I'm just left wondering why.
gcc of course leaves .o files lying around, so this is no different than C. javac often creates tens or hundreds of .class files, some with strange names like MyClass$1.class, MyClass$2.class, and so on (yes, I know what they are for), and doesn't even generate a native binary. I don't think the output of ghc is any more surprising than these; I'm surprised it bothers you. The static vs dynamic linking question has been discussed many times. The summary is: GHC is a highly-optimising compiler, and the binary interface _necessarily_ changes with every minor revision (even patchlevel revision) of the compiler and each library. So you can't sensibly share libraries between apps. Anyway, disc is cheap. HTH. --KW 8-)

On 2004-12-06, Keith Wansbrough
The static vs dynamic linking question has been discussed many times. The summary is: GHC is a highly-optimising compiler, and the binary interface _necessarily_ changes with every minor revision (even patchlevel revision) of the compiler and each library. So you can't
We already have a way to deal with that using sonames. It's not that hard, and is routinely used. BTW, is this a problem even if no -O options are given when building a library?
sensibly share libraries between apps. Anyway, disc is cheap.
Memory not so much, though. One advantage of having something in .so form is that every instance of every application that uses it shares the same in-memory image of the code. -- John

John Goerzen
On 2004-12-06, Keith Wansbrough
wrote: The static vs dynamic linking question has been discussed many times. The summary is: GHC is a highly-optimising compiler, and the binary interface _necessarily_ changes with every minor revision (even patchlevel revision) of the compiler and each library. So you can't
We already have a way to deal with that using sonames. It's not that hard, and is routinely used.
I don't think sonames are enough. For a library A, we need to record - version of A - version of GHC used - version of each package on which A depends - version of each package on which those packages depend (recursively) and, if package management isn't being used throughout, the precise content of each .hs file used during compilation that was not package-controlled. sonames only encode a single version, the version of A. Even if we allow that in a dynlinked scenario the dependent packages would have their versions encoded internally (in the names of the requested libraries), we still need to encode _two_ versions into the name. I'm not aware of a routinely-used way of doing this, although I am happy to be corrected. I guess it wouldn't be hard to come up with a scheme, though. The point is that this wouldn't be anything like as useful as it is in the C world. If you upgrade package A, your old apps can't use the new .so - they must still use the old one, until you recompile them. And instead of storing (number of packages * number of versions of lib) dynamic libraries, you have to store (number of packages * number of versions of lib * number of versions of GHC) dynamic libraries. I'll let others (Simon Marlow?) comment further... --KW 8-)

John Goerzen
sensibly share libraries between apps. Anyway, disc is cheap.
Memory not so much, though. One advantage of having something in .so form is that every instance of every application that uses it shares the same in-memory image of the code.
Well, a 5 Mbyte [1] overhead isn't really that much, IMHO. You'd need to run a lot of (different; if they're the same, the text will be shared) applications to get any measurable benefit. Eventually, it would be nice to have dynamic linkage, but I can see why it isn't a priority. -kzm [1] On my Linux system, the overhead seems to be less than 2 Mbyte. 5 Mb is the figure used by the OP. -- If I haven't seen further, it is by standing in the footprints of giants

Ketil Malde wrote:
John Goerzen
writes: sensibly share libraries between apps. Anyway, disc is cheap.
Memory not so much, though. One advantage of having something in .so form is that every instance of every application that uses it shares the same in-memory image of the code.
Well, a 5 Mbyte [1] overhead isn't really that much, IMHO. You'd need to run a lot of (different; if they're the same, the text will be shared) applications to get any measurable benefit. Eventually, it would be nice to have dynamic linkage, but I can see why it isn't a priority.
-kzm
[1] On my Linux system, the overhead seems to be less than 2 Mbyte. 5 Mb is the figure used by the OP.
If we assume that Haskell programs are a) uncommon and b) the top of a solution stack (i.e., not the OS, not the GUI toolkit, not the network stack), then this reasoning is sound, and we don't really need dynamic linking. IF, on the other hand, you imagine Haskell widening its borders and moving into other niches, the value of dynamic linking becomes apparent. The problem, of course, is that Haskell likes to tightly bind with the libraries it uses (inlining across modules and other optimizations). So imagine if the "package" unit was a barrier to those kinds of optimizations. Then no knowledge of the internals of the package would be needed by importing modules, and "sufficiently" compatible packages could be drop-in replacements, .so or .dll style. I suppose I am suggesting that we consider the "package" as a unit which has a stable ABI. Is this possible/easy to do? Could we then implement dynamic linking w/ packages?

On 2004-12-06, Robert Dockins
[1] On my Linux system, the overhead seems to be less than 2 Mbyte. 5 Mb is the figure used by the OP.
One other problem with that is that 5MB is a LOT on a device such as my Zaurus. I have an OCaml development environment on this PDA, as well as a Python one, but I think that ghc is going to be out of the question.
If we assume that Haskell programs are a) uncommon and b) the top of a solution stack (i.e., not the OS, not the GUI toolkit, not the network stack), then this reasoning is sound, and we don't really need dynamic linking. IF, on the other hand, you imagine Haskell widening its borders and moving into other niches, the value of dynamic linking becomes apparent.
That is an excellent point. Who would use an ls or cp that requires 10MB of RAM, especially on embedded devices?
The problem, of course, is that Haskell likes to tightly bind with the libraries it uses (inlining across modules and other optimizations). So imagine if the "package" unit was a barrier to those kinds of optimizations. Then no knowledge of the internals of the package would be needed by importing modules, and "sufficiently" compatible packages could be drop-in replacements, .so or .dll style.
I suppose I am suggesting that we consider the "package" as a unit which has a stable ABI. Is this possible/easy to do? Could we then implement dynamic linking w/ packages?
It seems that what we need is a way to control this cross-module optimization. I for one think that the performance benefit we see from that is more than offset by the inconvenience. If it were at least made an option, then a lot of other options would become available to us, too. -- John

On 6 Dec 2004, at 17:29, John Goerzen wrote:
That is an excellent point. Who would use an ls or cp that requires 10MB of RAM, especially on embedded devices?
This is presumably just because we don't have 'smart' linking, so the whole library is bundled in. I imagine in principle smart linking would be possible...
optimization. I for one think that the performance benefit we see from that is more than offset by the inconvenience. If it were at least made an option, then a lot of other options would become available to us, too.
I imagine that turning off cross-module inlining could be catastrophic for performance in certain cases. C++ solves this problem by defining all its inlinable functions in the interface rather than the implementation. Conceivably a clever compiler could still do inlining in one direction, and produce a dynamically linked executable, I suppose. Jules

On 6 Dec 2004, at 17:29, John Goerzen wrote:
That is an excellent point. Who would use an ls or cp that requires 10MB of RAM, especially on embedded devices?
This is presumably just because we don't have 'smart' linking, so the whole library is bundled in. I imagine in principle smart linking would be possible...
Indeed, and it would probably help with the wxHaskell bloat from a different post.
optimization. I for one think that the performance benefit we see from that is more than offset by the inconvenience. If it were at least made an option, then a lot of other options would become available to us, too.
I imagine that turning off cross-module inlining could be catastrophic for performance in certain cases.
But, (and here is the point) not in ALL cases. In particular, if we could segment closely related code with many interdependencies into discrete units with well defined external interfaces (sound like packages to anyone else?), then my intuition tells me that the cost of setting up an inlining barrier should be fairly low. Module inlining _within_ a package would still occur, just not _between_ packages.

On 2004-12-06, Robert Dockins
On 6 Dec 2004, at 17:29, John Goerzen wrote:
This is presumably just because we don't have 'smart' linking, so the whole library is bundled in. I imagine in principle smart linking would be possible...
Indeed, and it would probably help with the wxHaskell bloat from a different post.
Could you clarify what exactly "smart linking" is? The assumption I have gleaned from talking to a few experts on IRC is this: presently, ghc will only link in a .o component of a .a file if symbols from that .o file are actually used in the final executable. This is the motivation for the split-objs feature: to reduce the size of executables that use quite large .o files. In my experience, most .o files are not that large. Does that agree with your experience? -- John

On 6 Dec 2004, at 18:48, John Goerzen wrote:
Presently, ghc will only link in a .o component of a .a file if symbols from that .o file are actually used in the final executable. This is the motivation for the split-objs feature: to reduce the size of executables that use quite large .o files.
In my experience, most .o files are not that large. Does that agree with your experience?
If that is the case, then it already is 'smart linking' and I stand corrected. Unless the granularity of the .o files is too large, of course... Jules

On Mon, 6 Dec 2004, Jules Bean wrote:
On 6 Dec 2004, at 18:48, John Goerzen wrote:
Presently, ghc will only link in a .o component of a .a file if symbols from that .o file are actually used in the final executable. This is the motivation for the split-objs feature: to reduce the size of executables that use quite large .o files.
In my experience, most .o files are not that large. Does that agree with your experience?
If that is the case, then it already is 'smart linking' and I stand corrected. Unless the granularity of the .o files is too large, of course...
It is - you get one .o per module. -- flippa@flippac.org

Philippa Cowderoy wrote:
On Mon, 6 Dec 2004, Jules Bean wrote:
[...] If that is the case, then it already is 'smart linking' and I stand corrected. Unless the granularity of the .o files is too large, of course... It is - you get one .o per module.
That's not true with -split-objs, e.g. the base package consists of almost 12000 *.o files and one module of my OpenGL package is compiled into 3000 *.o files alone. One can't really get much smaller... Cheers, S.

On Mon, 6 Dec 2004, Robert Dockins wrote:
On 6 Dec 2004, at 17:29, John Goerzen wrote:
That is an excellent point. Who would use an ls or cp that requires 10MB of RAM, especially on embedded devices?
This is presumably just because we don't have 'smart' linking, so the whole library is bundled in. I imagine in principle smart linking would be possible...
Indeed, and it would probably help with the wxHaskell bloat from a different post.
The strip utility helps somewhat, I just dropped a wxHaskell app from a 10 meg .exe to about 3.6 megs under windows (grab a mingw distro for a strip.exe). It's still not wonderful, but I suspect the growth in binary size with new Haskell code will be reasonably tame at least (the app I tried is fairly low on functionality). A stripped "sig generation" utility I just hacked up came down to 344K under win32, which is still big for what it does but at least it's distributable. -- flippa@flippac.org

Philippa Cowderoy wrote:
The strip utility helps somewhat, I just dropped a wxHaskell app from a 10 meg .exe to about 3.6 megs under windows.
You can also compress the stripped executable with UPX. GHC-generated executables seem to compress very well (about 4:1 in my experience), and even a very large executable, like GHC itself, decompresses so quickly that I can hardly tell the difference in startup time. Caveat: this actually increases virtual memory requirements, since the executable image in memory can't be backed by the executable file on disk. -- Ben

Philippa Cowderoy
The strip utility helps somewhat
You're right, of course. My executable (incidentally on Sparc) seems to have an overhead of approximately one megabyte when just considering the text segment (that is, subtracting the text sizes of my own .o files). -kzm -- If I haven't seen further, it is by standing in the footprints of giants

Robert Dockins wrote:
[...] In particular, if we could segment closely related code with many interdependencies into discrete units with well defined external interfaces (sound like packages to anyone else?), then my intuition tells me that the cost of setting up an inlining barrier should be fairly low. Module inlining _within_ a package would still occur, just not _between_ packages.
This reasoning might be valid for traditional languages, but not for languages like Haskell that promote the use of higher-order, typically small, functions. Not inlining most monads would probably be catastrophic, as would be not doing so for our beloved map, foldr, etc. And IMHO C++ gave up this kind of binary compatibility completely long ago when code in headers was introduced, so Haskell is in good company with one of "the" languages in use. Not really a good excuse, but a fact... And just a remark: we don't need a new technique for a "no inline barrier": just compile the library optimized and use a facade which re-exports your public API, compiled without optimizations. Cheers, S.
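A minimal sketch of the facade arrangement suggested above; the module names are invented for illustration, and whether it fully blocks cross-package inlining depends on how the facade is compiled:

    -- MyLib/Internal.hs: the real implementation, compiled optimized:
    --   ghc -O2 -c MyLib/Internal.hs
    module MyLib.Internal (frobnicate) where

    frobnicate :: [Int] -> Int
    frobnicate = sum . map (* 2)

    -- MyLib.hs: the facade. It only re-exports the public API and is
    -- compiled without optimizations:
    --   ghc -O0 -c MyLib.hs
    module MyLib (frobnicate) where

    import MyLib.Internal (frobnicate)

Clients import MyLib rather than MyLib.Internal, so the optimized internals stay an implementation detail.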

On Mon, 6 Dec 2004, Robert Dockins wrote:
The problem, of course, is that Haskell likes to tightly bind with the libraries it uses (inlining across modules and other optimizations). So imagine if the "package" unit was a barrier to those kinds of optimizations. Then no knowledge of the internals of the package would be needed by importing modules, and "sufficiently" compatible packages could be drop-in replacements, .so or .dll style.
This would mean that functions like 'map' and 'foldr' couldn't be unrolled because they are in the package of the standard functions?

On 6 Dec 2004, at 21:16, Henning Thielemann wrote:
On Mon, 6 Dec 2004, Robert Dockins wrote:
The problem, of course, is that Haskell likes to tightly bind with the libraries it uses (inlining across modules and other optimizations). So imagine if the "package" unit was a barrier to those kinds of optimizations. Then no knowledge of the internals of the package would be needed by importing modules, and "sufficiently" compatible packages could be drop-in replacements, .so or .dll style.
This would mean that functions like 'map' and 'foldr' couldn't be unrolled because they are in the package of the standard functions?
I don't think it does, actually. You can imagine a compiler which has access to not *only* the .so files, but also the haskell source. Therefore it can still unroll (from the source), but it can choose to link to an exported symbol if unrolling isn't worth it. Jules

Jules Bean wrote:
I don't think it does, actually. You can imagine a compiler which has access to not *only* the .so files, but also the haskell source. Therefore it can still unroll (from the source), but it can choose to link to an exported symbol if unrolling isn't worth it.
But that's not dynamic linking... Imagine a bug in version X of your lib, simply using version X+1 with your already compiled program won't fix that bug. Again, this is just like C++. Cheers, S.

On 6 Dec 2004, at 21:56, Sven Panne wrote:
Jules Bean wrote:
I don't think it does, actually. You can imagine a compiler which has access to not *only* the .so files, but also the haskell source. Therefore it can still unroll (from the source), but it can choose to link to an exported symbol if unrolling isn't worth it.
But that's not dynamic linking... Imagine a bug in version X of your lib, simply using version X+1 with your already compiled program won't fix that bug. Again, this is just like C++.
Absolutely. And, as such, it's a possible solution. You don't get all the advantages of dynamic linking: you can't get in-place upgrades on bugfix libraries. You do get other advantages though: reduced disk space and shared in-memory images. To get a more sophisticated solution you need to have a more instrumented object format, and move various optimisations either just-in-time or launch-time. (And then you lose the shared in-memory advantage, again...) Jules

Henning Thielemann wrote:
On Mon, 6 Dec 2004, Robert Dockins wrote:
The problem, of course, is that Haskell likes to tightly bind with the libraries it uses (inlining across modules and other optimizations). So imagine if the "package" unit was a barrier to those kinds of optimizations. Then no knowledge of the internals of the package would be needed by importing modules, and "sufficiently" compatible packages could be drop-in replacements, .so or .dll style.
This would mean that functions like 'map' and 'foldr' couldn't be unrolled because they are in the package of the standard functions?
Probably it would make sense to have a basic set of modules which are not in any package, but which are available to import into all modules. I imagine this would include things like the Prelude, the Haskell98 basic modules, the new Data.* modules, Control.*, Foreign.*, and probably a few other things as well (in short, most of the current Hierarchical Libraries). Obviously, barring map and foldr from inlining would be a Bad Idea. Things which exist now that I would think would work well behind a "no inline" barrier: Parsec, wxHaskell, HaXml, HToolkit, WASH, and dozens of others I don't know about. I would guess that most of the entry points for these libraries are already too complicated for GHC to think that it is "worth" inlining them. Mandating a "no inline across packages" rule would likely not change their performance much.

Ketil Malde wrote:
John Goerzen
writes: sensibly share libraries between apps. Anyway, disc is cheap.
Memory not so much, though. One advantage of having something in .so form is that every instance of every application that uses it shares the same in-memory image of the code.
Well, a 5 Mbyte [1] overhead isn't really that much, IMHO. You'd need to run a lot of (different; if they're the same, the text will be shared) applications to get any measurable benefit. Eventually, it would be nice to have dynamic linkage, but I can see why it isn't a priority.
I find the size of the binaries generated by ghc when I use wxHaskell totally stupefying. A considerable time is spent just linking the final binary. (The big culprit is that wxHaskell is far too monolithic, so you get all kinds of stuff linked in even if you don't use it.) I think the lack of dynamic linking of Haskell libraries is a real shame, and I don't buy the versioning argument. You can have enough safety checks to at least detect versioning problems. -- Lennart

On Fri, Dec 03, 2004 at 10:05:21AM -0500, Jason Bailey wrote:
I don't think you can really compare Haskell with the C's. C/C++, for the time being, is the basis of most low-level APIs. They don't really need a large standard library because their packages are available everywhere and are easily installed and updated.
Those languages have their own problems, bigger problems if you ask me. That C++ doesn't need a large standard library is not really true - what's Boost if not a large C++ quasi-standard library?
Haskell has been around for quite a while, longer than Java or Python, and almost as long as Perl. Yet it doesn't have half the inherent library functionality that these other languages have. I find it curious. I like Haskell and I think it has a lot of promise. I just don't see why this problem exists.
What's funny (and curious) is that it's still easier for me to write any given program in Haskell than in those "mature" languages with tons of libraries. I don't want to find, learn and rely on hundreds of libraries. I want a powerful, expressive language, which makes half of these libraries unnecessary and allows me to easily create and use my own libraries. I want to learn and use powerful programming techniques, learn a couple of solid, well designed libraries, which I can understand from both sides, interface and implementation, to be able to fix them, when it is necessary. That's why I choose Haskell, not Java or Python. Best regards, Tomasz

Jason Bailey wrote:
I mean, I think it's a really cool idea, and I'm having fun learning it. But I would be hard pressed to come up with a justification to introduce this into our business environment.
How about increased productivity, and more stuff right first time... Keean.

A question with regard to making Haskell easier to manage (like, say, Perl or Python): does Haskell have an equivalent of CPAN? If not, would it be a good idea to write one? If Haskell had a central code repository (like CPAN) then it would make installing a library as simple as running a single command (and you could have dependency resolution too)... What are people's thoughts, good idea? Keean.

Keean Schupke wrote:
A question with regard to making Haskell easier to manage (like, say, Perl or Python): does Haskell have an equivalent of CPAN? If not, would it be a good idea to write one?
If Haskell had a central code repository (like CPAN) then it would make installing a library as simple as running a single command (and you could have dependency resolution too)...
What are people's thoughts, good idea?
Keean.
IMHO I think this is a great idea. This is something that would really enhance the Haskell experience. Jason

I've been somewhat frustrated by most of the things mentioned here and so don't need to mention them again. (Most recently, just to see, I compiled and linked a Haskell "hello, world", came up with something like an 8 MB binary under Windows, and couldn't see from the GHC docs how to link dynamically.) I recently wrote a text editor (something new, eh?) in C# and found the docs on the .NET class library to be superb. That quality of documentation on the Haskell libraries, with examples, would go a long way. Manpower, I imagine, would be the issue. I manage a group of DBAs at a bank and was toying with the idea of offering a sort of lunchtime-learning thing to the geeks in the office and seeing who would be interested in learning to think about familiar things in a new way. I don't get to program too much anymore, but I very much enjoy fiddling around with Haskell, usually in some integration setting, and was thinking some of the others may enjoy a new outlook as well. An aside: Philip Wadler makes good points in a paper I found online about why people don't use functional languages.
On Fri, 03 Dec 2004 10:52:43 -0500, Jason Bailey
Keean Schupke wrote:
A question with regard to making Haskell easier to manage (like, say, Perl or Python): does Haskell have an equivalent of CPAN? If not, would it be a good idea to write one?
If Haskell had a central code repository (like CPAN) then it would make installing a library as simple as running a single command (and you could have dependency resolution too)...
What are people's thoughts, good idea?
Keean.
IMHO I think this is a great idea. This is something that would really enhance the Haskell experience.
Jason

At 10:54 03/12/04 +0000, Keean Schupke wrote:
Jason Bailey wrote:
I mean, I think it's a really cool idea, and I'm having fun learning it. But I would be hard pressed to come up with a justification to introduce this into our business environment. How about increased productivity, and more stuff right first time...
I agree, but... to carry weight in this snake-oiled world, such a claim needs to be backed by clear evidence. #g ------------ Graham Klyne For email: http://www.ninebynine.org/#Contact

Keean Schupke wrote:
Jason Bailey wrote:
I mean, I think it's a really cool idea, and I'm having fun learning it. But I would be hard pressed to come up with a justification to introduce this into our business environment.
How about increased productivity, and more stuff right first time...
Keean.
No offense, but those are just catch phrases. They can support a justification but won't work as a justification in their own right. Here are some questions that I would expect to get from business. Q: "What have I heard about this technology?" A: Probably nothing. Haskell isn't very well known in the programming community (out of 6 co-workers asked, one had used Haskell in a single college class), let alone the business community. Business has become very wary about accepting technologies that are obscure. Q: "What can I do with this language that I can't do now?" A: Well, nothing. It can certainly do some things better than the current languages out there, but it's just another general-purpose language. Q: "Will it require training?" A: Oh yes, we're talking about a different way of looking at programs. On the surface level it should be fairly easy to pick up, but it will take some time before the engineers are able to produce decent work. Oh, and there are no training classes we can send people to. They will have to learn on their own. Q: "What's the market like for Haskell programmers?" A: Well, there isn't one. Which means that if a business was going to advertise for someone with Haskell programming knowledge they are going to end up spending a premium on them. Q: "Why should we support yet another programming language?" A: Because this is a better language. (Wouldn't work as an answer, but I would give it a try.) And this is just the business side. I kinda shudder at the thought of telling a room full of engineers that they need to abandon their current status as object-level gurus and learn some language that the majority of them have never heard of. :) I think the most important aspect of getting Haskell acceptance is mind share, both from a programming perspective and a business perspective. People need to have heard of Haskell and be familiar with the concepts behind it before they will be willing to give it a try. Also, the larger the corporation the less likely this is going to happen. But with mind share I can see smaller corps and smaller IT departments moving over to it. Jason

Jason Bailey wrote:
No offense, but those are just catch phrases. They can support a justification but won't work as a justification in their own right.
Here are some questions that I would expect to get from business.
Q:"What have I heard about this technology?" A: Probably nothing. Haskell isn't very well known in the programming community (out of 6 co-workers asked, one had used Haskell in a single college class), let alone the business community. Business has become very wary about accepting technologies that are obscure.
At Imperial College (a top European science and technology university) all DOC undergraduates are taught Haskell as the main teaching language - so no shortage of top-quality trained graduates...
Q:"What can I do with this language that I can't do now?" A:Well nothing. It can certainly do some things better then the current languages out there, but its just another general purpose language.
Get static guarantees that a program won't crash... programs can be buffer-overflow proof (list based strings) and more reliable
Q:"Will it require training?" A: Oh yes, we're talking about a different way of looking at programs. On the surface level it should be fairly easy to pick up but it will take some time before the engineers are able to produce decent work. Oh and there are no training classes we can send people to. They will have to learn on their own.
See answer to 1
Q:"Whats the market like for Haskell programmers?" A: Well there isn't one. Which means that if business was going to advertise for someone with haskell programming knowledge they are going to end some spending a premium on them.
See answer to 1
Q:"Why should we support yet another programming language?" A: Because this is a better language. (Wouldn't work as an answer but I would give it a try. )
It's not yet another programming language - it's the future and you don't want to be left behind... Keean.

I find myself agreeing with the implied likely response to all of the points you raise below. I'd say that any attempt to proselytize Haskell (or any new technology) needs to start from a clear view of one kind of application that it is particularly good for. Then, focus on building a "bridgehead" for that narrow application area. For example, I've been using Haskell to experiment with reasoning tasks on Semantic Web data -- it's an application for which Haskell seems to be eminently well-suited, for the following reasons, among others: * functional expression (as opposed to imperative) means that the reasoning programs are more closely related to the reasoning tasks being performed. * its type system ensures that necessary formalities of mapping concepts to representations are fully expressed. * lazy evaluation makes search-and-backtrack patterns very easy to program. * Higher order functions facilitate separation of concerns. In short, for this kind of application, I find that I spend most of my time thinking about the problem space, relatively little time programming supporting "scaffolding", and very little time debugging (once I've got the types to match up). This is my story. I don't know if there's anything here you can use. #g -- At 10:45 03/12/04 -0500, Jason Bailey wrote:
Here are some questions that I would expect to get from business.
Q:"What have I heard about this technology?" A: Probably nothing. Haskell isn't very well known in the programming community (out of 6 co-workers asked, one had used Haskell in a single college class), let alone the business community. Business has become very wary about accepting technologies that are obscure.
Q:"What can I do with this language that I can't do now?" A:Well nothing. It can certainly do some things better then the current languages out there, but its just another general purpose language.
Q:"Will it require training?" A: Oh yes, we're talking about a different way of looking at programs. On the surface level it should be fairly easy to pick up but it will take some time before the engineers are able to produce decent work. Oh and there are no training classes we can send people to. They will have to learn on their own.
Q:"Whats the market like for Haskell programmers?" A: Well there isn't one. Which means that if business was going to advertise for someone with haskell programming knowledge they are going to end some spending a premium on them.
Q:"Why should we support yet another programming language?" A: Because this is a better language. (Wouldn't work as an answer but I would give it a try. )
And this is just the business side. I kinda shudder at the thought of telling a room full of engineers that they need to abandon their current status as object level gurus and learn some language that the majority of them have never heard of. :)
I think the most important aspect of getting haskell acceptance is mind share. Both from a programming perspective and a business perspective. People need to have heard of haskell and be familiar with the concepts behind it before they will be willing to give it a try. Also the larger the corporation the less likely this is going to happen. But with mind share I can see smaller corps and smaller IT departments moving over to it.
Jason
------------ Graham Klyne For email: http://www.ninebynine.org/#Contact
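To make the lazy search-and-backtrack point in the message above concrete, a minimal sketch; the facts and names are invented for illustration and are not from any Semantic Web library:

    -- A toy fact base of (subject, predicate, object) triples.
    type Triple = (String, String, String)

    facts :: [Triple]
    facts = [ ("socrates", "isA", "human")
            , ("human",    "isA", "mortal") ]

    -- Direct answers come out as a lazy list; each element is an
    -- alternative, so the list itself plays the role of backtracking.
    isA :: String -> [String]
    isA s = [ o | (s', p, o) <- facts, s' == s, p == "isA" ]

    -- Transitive closure: explored only as far as the caller demands.
    ancestors :: String -> [String]
    ancestors s = let direct = isA s
                  in direct ++ concatMap ancestors direct

    main :: IO ()
    main = print (ancestors "socrates")   -- ["human","mortal"]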

On 2004-12-07, Graham Klyne
I'd say that any attempt to proselytize Haskell (or any new technology) needs to start from a clear view of one kind of application that it is particularly good for. Then, focus on building a "bridgehead" for that narrow application area.
I'm not so sure. When I go to learn a new language, I want a pretty clear idea that I'll be able to use it to solve many different types of problems. That's even more true with Haskell. It'll probably take most people more time to get up to speed with Haskell than it would take them with, say, Python or Java.

Java had a relatively slow uptake in enterprise and a meteoric rise in universities - that is really starting to pay off now as graduates look to Java as a solution first (the first graduates brought up on Java are just getting into decision-making roles). Universities will accept Haskell for "ideological" reasons whereas enterprise needs practical benefits. At the moment, Haskell offers more of the former and so the focus should be on the Unis. Matt On 01/12/2004, at 1:00 AM, GoldPython wrote:
Hi, all,
I'm new to the Cafe, but not to Haskell (experimented with it on and off on a small scale over the last 5 or 6 years and enjoy the language quite a lot) and had more of a political question and wanted to see what people thought:
Has anyone tried presenting the language to the average rank and file programming community? If so, was it successful? If not, is there interest in doing so?
By "rank and file" I mean, outside of the acedemic world where a large number of the programmers I see have very little math background. This would be the typical commercial Visual Basic crowd and the like.
Thanks, Will Collum

At Tue, 30 Nov 2004 09:00:18 -0500, GoldPython wrote:
Hi, all,
Has anyone tried presenting the language to the average rank and file programming community? If so, was it successful? If not, is there interest in doing so?
I think this article is right-on when it comes to explaining why haskell has not yet succeeded (it even mentions haskell): http://khason.biz/blog/2004/12/why-microsoft-can-blow-off-with-c.html Jeremy Shaw.

At Tue, 30 Nov 2004 09:00:18 -0500, GoldPython wrote:
Has anyone tried presenting the language to the average rank and file programming community? If so, was it successful? If not, is there interest in doing so?
On Thu, Dec 09, 2004 at 04:01:32PM -0800, Jeremy Shaw wrote:
I think this article is right-on when it comes to explaining why haskell has not yet succeeded (it even mentions haskell): http://khason.biz/blog/2004/12/why-microsoft-can-blow-off-with-c.html
So send the inventors of your favorite programming languages fake beards. -- wli

G'day all.
Quoting Jeremy Shaw
I think this article is right-on when it comes to explaining why haskell has not yet succeeded (it even mentions haskell):
http://khason.biz/blog/2004/12/why-microsoft-can-blow-off-with-c.html
I don't think so at all. Sure, Simon P-J doesn't have a beard, but clearly this guy has never seen Phil Wadler. Cheers, Andrew Bromage

clearly this guy has never seen Phil Wadler.
Some people may find this tasteless - I thought it was funny, so I guess those people will find me tasteless also. In that case, I'm probably already in their kill files, so this won't offend anybody. http://www.malevole.com/mv/misc/killerquiz/ -kzm -- If I haven't seen further, it is by standing in the footprints of giants

Well, THERE's two good entries! :^)
On Fri, 10 Dec 2004 09:21:21 +0100, Ketil Malde
clearly this guy has never seen Phil Wadler.
Some people may find this tasteless - I thought it was funny, so I guess those people will find me tasteless also. In that case, I'm probably already in their kill files, so this won't offend anybody.
http://www.malevole.com/mv/misc/killerquiz/
-kzm -- If I haven't seen further, it is by standing in the footprints of giants
participants (23)
- ajb@spamcop.net
- Ben Rudiak-Gould
- Benjamin Franksen
- Georg Martius
- GoldPython
- Graham Klyne
- Greg Buchholz
- Henning Thielemann
- Jason Bailey
- Jeremy Shaw
- John Goerzen
- Jules Bean
- Keean Schupke
- Keith Wansbrough
- Ketil Malde
- Lennart Augustsson
- Matthew Roberts
- Philippa Cowderoy
- Robert Dockins
- Shae Matijs Erisson
- Sven Panne
- Tomasz Zielonka
- William Lee Irwin III