
Adrian Hey wrote:
For the Data.Map clone I wrote something like this ..
-- An "open" map (this is abstract) data OMap k a = OMap k (Maybe a) (Map k a) Int#
-- This is just a lookup that encodes the path taken as an unboxed Int open :: Ord k => k -> Map k a -> OMap k a
-- Get the current associated value (if any) read :: OMap k a -> Maybe a
-- Change the current associated value and close the new map -- This is v.fast. No comparisons, and no balance checking or -- rebalancing either if this is a substitution rather than an -- insertion. write :: a -> OMap k a -> Map k a
-- Delete the current association (if any) and close the new map -- This is nop if there is no current association delete :: OMap k a -> Map k a
-- Not really needed if original map is still in scope close :: OMap k a -> Map k a
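(To pin down the intended semantics, here is a minimal reference sketch of this interface on top of Data.Map. It is not the path-encoding implementation described above: it just keeps the key and the original map, so write and delete re-traverse the map, need an Ord constraint, and get none of the speed benefit.)

  import Prelude hiding (read)
  import qualified Data.Map as M

  -- Key, looked-up value, and the original map; no encoded path.
  data OMap k a = OMap k (Maybe a) (M.Map k a)

  open :: Ord k => k -> M.Map k a -> OMap k a
  open k m = OMap k (M.lookup k m) m

  read :: OMap k a -> Maybe a
  read (OMap _ v _) = v

  -- Re-traverses the map, unlike the path-encoding version.
  write :: Ord k => a -> OMap k a -> M.Map k a
  write x (OMap k _ m) = M.insert k x m

  delete :: Ord k => OMap k a -> M.Map k a
  delete (OMap k _ m) = M.delete k m

  close :: OMap k a -> M.Map k a
  close (OMap _ _ m) = m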
I think it depends on whether this can be implemented without burning significant extra heap in either focus or the resulting g function. Generally, zippers do require quite a bit of heap (proportional to trie/tree depth).
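To make the heap cost concrete, here is a rough sketch of a zipper-style open/write over a plain unbalanced search tree (rebalancing and deletion omitted). The recorded path is a list with one frame per level, which is exactly the heap proportional to tree depth, and rebuilding on write needs no comparisons.

  data Tree k a = Tip | Bin k a (Tree k a) (Tree k a)

  -- One frame per level between the root and the focus; this list is the
  -- heap that grows with the depth of the tree.
  data Frame k a
    = WentLeft  k a (Tree k a)   -- went left: remember key, value, right subtree
    | WentRight k a (Tree k a)   -- went right: remember key, value, left subtree

  -- The focused key, the recorded path, and the subtree at the focus.
  data Zipper k a = Zipper k [Frame k a] (Tree k a)

  -- A lookup that records the path taken (comparisons happen only here).
  openZ :: Ord k => k -> Tree k a -> Zipper k a
  openZ k = go []
    where
      go path Tip = Zipper k path Tip
      go path t@(Bin k' v l r) = case compare k k' of
        LT -> go (WentLeft  k' v r : path) l
        GT -> go (WentRight k' v l : path) r
        EQ -> Zipper k path t

  -- Value at the focus, if any.
  readZ :: Zipper k a -> Maybe a
  readZ (Zipper _ _ (Bin _ v _ _)) = Just v
  readZ _                          = Nothing

  -- Put a subtree back at the focus and rebuild along the recorded path:
  -- no comparisons and, in this simplified sketch, no rebalancing.
  closeWith :: Tree k a -> Zipper k a -> Tree k a
  closeWith sub (Zipper _ path _) = foldl step sub path
    where
      step t (WentLeft  k v r) = Bin k v t r
      step t (WentRight k v l) = Bin k v l t

  -- write in the sense above: substitution if the key was present,
  -- insertion at the reached Tip otherwise.
  writeZ :: a -> Zipper k a -> Tree k a
  writeZ x z@(Zipper k _ focusT) = case focusT of
    Tip         -> closeWith (Bin k x Tip Tip) z
    Bin _ _ l r -> closeWith (Bin k x l r)     z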
The OMap type is like a zipper; the Int# encodes the path. I don't know whether the Int# (which should be a !Int with an UNPACK pragma) really gains anything compared to a list; only benchmarks can tell. Fretting about it seems like an irrelevant micro-optimization to me. In any case, focus can easily be implemented in terms of OMap:

  focus :: Ord k => k -> Map k a -> (Maybe a, Maybe a -> Map k a)
  focus k m = (read z, maybe (delete z) (`write` z))
    where
    z = open k m

So any efficient implementation of OMap gives an efficient implementation of focus. And vice versa:

  type OMap k a = (Maybe a, Maybe a -> Map k a)

  open     = focus
  read     = fst
  write x  = ($ Just x) . snd
  delete   = ($ Nothing) . snd

Regards,
apfelmus
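As a small usage sketch, here is a naive focus of the type above written directly over Data.Map (it traverses twice, so it only illustrates the interface, not the efficiency), together with a hypothetical bump function that updates or inserts a counter in one place; the returned function plays the role of the g function mentioned above.

  import qualified Data.Map as M

  -- Naive stand-in for focus: lookup plus an update/delete continuation.
  focus :: Ord k => k -> M.Map k a -> (Maybe a, Maybe a -> M.Map k a)
  focus k m = (M.lookup k m, set)
    where
      set Nothing  = M.delete k m
      set (Just x) = M.insert k x m

  -- Hypothetical example: bump a counter, inserting 1 if the key is absent.
  bump :: Ord k => k -> M.Map k Int -> M.Map k Int
  bump k m = case focus k m of
    (Nothing, g) -> g (Just 1)
    (Just n,  g) -> g (Just (n + 1))

  -- bump "x" (M.fromList [("x",2)])  ==  M.fromList [("x",3)]
  -- bump "y" (M.fromList [("x",2)])  ==  M.fromList [("x",2),("y",1)]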