Hi guys,

Actually, it involved HaskellDB and a table with 183 columns. To specify the table with all of its 183 columns I had to increase GHC's context stack slightly, from 20 to a mere 190.
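In case anyone runs into the same limit, this is roughly how I raised it; just a sketch (module name invented), using GHC's -fcontext-stack flag:

{-# OPTIONS_GHC -fcontext-stack=190 #-}
module BigTable where

-- the 183-column table definition goes here

or, equivalently, on the command line:

ghc -fcontext-stack=190 BigTable.hs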

Günther

BTW, there is one thing I find a bit unfortunate in the otherwise flawless HaskellDB: each column has to be made an instance of the class FieldTag, which makes it impossible to do this in a higher-orderish sort of way.
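To illustrate, the per-column code looks roughly like this; a sketch written from memory against Database.HaskellDB.DBLayout, with an invented column name:

-- one tag type and one FieldTag instance per column
data Col001 = Col001

instance FieldTag Col001 where
    fieldName _ = "col001"

-- the attribute used in queries
col001 :: Attr Col001 Int
col001 = mkAttr Col001

And the same again, with a fresh tag type, for each of the other 182 columns; since every column needs its own type made an instance of FieldTag, there is no way to generate them from a list or otherwise abstract over the pattern.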



Yitzchak Gale wrote:
Günther Schmidt wrote:
...*I*
have managed to write code that ghc is not even able to compile due to
exhausting virtual memory!
Top that!

Good work Günther!

Joe Fredette wrote:
Code or it didn't happen. :)

Yes, how did you do it?

Did it involve very large literals? GHC is known to have some
limitations with that. For example, on my machine, a String
literal that is larger than about 1 GB will cause GHC to
overflow the RTS stack. And a list literal with only a few
thousand elements can cause GHC to suck up all memory and
bring my computer to its knees.

Examples:

choke1 = "\
\0123456789abcdef\
\0123456789abcdef\
.
.
.
\"

choke2 = [
 0,1,2,3,4,5,6,7,
 0,1,2,3,4,5,6,7,
.
.
.
 0,1,2,3,4,5,6,7,
 99]

Regards,
Yitz