
On Thu, 2008-09-25 at 00:11 -0400, Matthew Eastman wrote:
Hey guys,
This is probably more of a question about functional programming than about Haskell, but hopefully you can help me out. I'm new to thinking about things in a functional way, so I'm not sure what the best way to do some things is.
I'm in a data structures course right now, and the assignments for the course are done in Java. I figured it'd be fun to try and implement them in Haskell as well.
The part of the assignment I'm working on is to implement a RedBlueStack, a stack where you can push items in as either Red or Blue. You can pop Red and Blue items individually, but the stack has to keep track of the overall order of items.
i.e. popping Blue in [Red, Red, Blue, Red, Blue] would give [Red, Red, Blue]
All push and pop operations on the stack have to be done in O(1) time.
It was easy to do in Java since you can just keep track of everything with a few pointers, but it took a while to get the Haskell version working. Maybe I'm just not used to the functional way of doing things.
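For reference, here is a hypothetical naive sketch of the type in Haskell (the names are my own, not from the assignment): a single list preserves the overall order, so push is O(1), but popping a given colour has to scan past items of the other colour and is O(n) worst case — which is exactly the part the O(1) requirement makes hard.

```haskell
data Color = Red | Blue deriving (Eq, Show)

newtype RedBlueStack a = RBS [(Color, a)] deriving Show

empty :: RedBlueStack a
empty = RBS []

-- Push on top of the stack (head of the list): O(1).
push :: Color -> a -> RedBlueStack a -> RedBlueStack a
push c x (RBS xs) = RBS ((c, x) : xs)

-- Pop the topmost item of the given colour, keeping everything else
-- in its original order. The scan past other-coloured items makes
-- this O(n) worst case, so this sketch does NOT meet the O(1) bound.
pop :: Color -> RedBlueStack a -> Maybe (a, RedBlueStack a)
pop c (RBS xs) = case break ((== c) . fst) xs of
  (_,   [])              -> Nothing
  (pre, (_, x) : post)   -> Just (x, RBS (pre ++ post))
```

For example, `pop Blue (push Red 1 (push Blue 2 empty))` yields the blue item 2 and leaves the red item in place.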
Note that purely functional data structures are inherently persistent (i.e. you can access old copies as well as new copies). This is a significant extra constraint. Your Java type is almost certainly ephemeral, the opposite of persistent. Rewriting your Java code to be persistent while still maintaining the asymptotic complexities of the relevant operations is non-trivial. You can always achieve persistence by (deep) copying, but copying is an O(n) operation at best.

Consider a simple example. The requirement is a sequence of elements with O(1) add and remove at both the beginning and the end. This is easy: one solution is a doubly-linked list with head and tail pointers. Now add the requirement that it be persistent, i.e. if I have list1 = [a,b,c,d] and I make list2 = list1.RemoveLast(), then list1 should still be [a,b,c,d] and list2 should be [a,b,c]. Now the problem is quite a bit more difficult. Try it. Also try to prove (informally or formally) that the asymptotic complexities hold and that the persistence guarantee holds.

Okasaki's thesis and/or book, "Purely Functional Data Structures", goes into the differences and into how to produce data structures with good complexity characteristics in a purely functional language. The implementations described are rather different from the usual implementations used for ephemeral data structures. The real-time deques described in the thesis are one solution to the above problem, in this case a purely functional one.

In a nutshell: persistent data structures are inherently more difficult to build than ephemeral ones*, which are what are usually described, and in a purely functional language all data structures are persistent.

* Proof: If I have a persistent data structure, I can make an ephemeral one with the same asymptotic complexity behaviour by simply having a mutable reference holding the persistent data structure.
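The list1/list2 requirement and the footnote's proof can both be sketched in a few lines of Haskell. Immutable lists give persistence for free (though note init on a singly-linked list is O(n), which is why meeting the O(1) bounds persistently takes something like a real-time deque), and wrapping any persistent structure in a mutable reference yields an ephemeral one with the same per-operation costs:

```haskell
import Data.IORef

-- Persistence for free: "removing" the last element builds a new
-- list and leaves the old one untouched.
list1 :: String
list1 = "abcd"

list2 :: String
list2 = init list1  -- list1 is still "abcd" afterwards

-- Sketch of the footnote's proof: a mutable reference holding a
-- persistent structure behaves ephemerally, at the same asymptotic
-- cost per operation as the underlying persistent operation.
removeLastInPlace :: IORef [a] -> IO ()
removeLastInPlace ref = modifyIORef ref init
```

Here list2 is "abc" while list1 remains "abcd"; removeLastInPlace discards the old version, which is exactly what makes it ephemeral rather than persistent.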