Yet another question about subtyping and heterogeneous collections

First of all, MigMit has probably already suggested parameterizing Like by the constraint, something like the following:

    data Like ctx = forall a. (ctx a, Typeable a) => Like a

    instance ALike (Like ALike) where toA (Like x) = toA x
    instance CLike (Like CLike) where toC (Like x) = toC x

    get_mono :: Typeable b => [Like ALike] -> [b]
    get_mono = catMaybes . map (\(Like x) -> cast x)

    lst_a :: [Like ALike]
    lst_a = [Like a1, Like b1, Like c1, Like d1]

    lst_c :: [Like CLike]
    lst_c = [Like c1, Like d1]

    t1 = map print_a lst_a
    t2 = map print_a lst_c

(The rest of the code is the same as in your first message.) You need the ConstraintKinds extension; the existential Like and the two instances above also need ExistentialQuantification and FlexibleInstances.

Second, all your examples so far used structural subtyping (objects with the same fields have the same type) rather than the nominal subtyping of C++ (distinct classes have distinct types even if they have the same fields; the subtyping relation must be declared in the class declaration). For structural subtyping, upcasts and downcasts can be done mostly automatically. See the OOHaskell paper or the code at http://code.haskell.org/OOHaskell (in particular the files in the samples directory).
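For reference, a self-contained sketch of the same idea is given below. Since the "first message" is not part of this excerpt, the concrete types A..D, their ALike/CLike instances, print_a and the sample values are reconstructed assumptions, not the original code.

    {-# LANGUAGE ConstraintKinds, ExistentialQuantification, FlexibleInstances, DeriveDataTypeable #-}
    import Data.Typeable
    import Data.Maybe (catMaybes)

    -- Stand-in struct types; in the real code these come from the first message.
    data A = A deriving (Show, Typeable)
    data B = B deriving (Show, Typeable)
    data C = C deriving (Show, Typeable)
    data D = D deriving (Show, Typeable)

    class ALike x where toA :: x -> A
    class CLike x where toC :: x -> C

    instance ALike A where toA = id
    instance ALike B where toA _ = A
    instance ALike C where toA _ = A
    instance ALike D where toA _ = A
    instance CLike C where toC = id
    instance CLike D where toC _ = C

    -- The constraint-parameterized existential wrapper.
    data Like ctx = forall a. (ctx a, Typeable a) => Like a

    instance ALike (Like ALike) where toA (Like x) = toA x
    instance CLike (Like CLike) where toC (Like x) = toC x

    -- Downcast: recover all elements of one concrete type from the mixed list.
    get_mono :: Typeable b => [Like ALike] -> [b]
    get_mono = catMaybes . map (\(Like x) -> cast x)

    print_a :: ALike x => x -> IO ()
    print_a = print . toA

    main :: IO ()
    main = do
      let lst_a = [Like A, Like B, Like C, Like D] :: [Like ALike]
      mapM_ print_a lst_a                      -- upcast view: everything as an A
      print (length (get_mono lst_a :: [B]))   -- downcast: just the B's (prints 1)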

Hello Oleg, I've glanced over both the HList and OOHaskell papers when I was considering different approaches. Albeit elegant, OOHaskell looked too heavyweight for my purposes; I don't need mutability, for example. The HList paper left me with two questions. The first is how much such an encoding costs, both in terms of speed and space. The second is whether I can conveniently define a Storable instance for HLists. As I said before, I need all this machinery to parse a great number of serialized, nested C structs from a file.
Best regards,
Dmitry
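On the Storable question above: for a simple HList encoding with statically known element types, an instance can be sketched as follows. This is only an illustrative sketch, not from the thread: it uses the usual HNil/HCons encoding and packs fields back to back, ignoring the padding a C compiler may insert, so offsets would need adjusting for real C layouts.

    {-# LANGUAGE ScopedTypeVariables #-}
    import Foreign.Storable
    import Foreign.Ptr (castPtr, plusPtr)

    -- Minimal heterogeneous list encoding.
    data HNil      = HNil
    data HCons e l = HCons e l

    instance Storable HNil where
      sizeOf _    = 0
      alignment _ = 1
      peek _      = return HNil
      poke _ _    = return ()

    -- Fields are laid out back to back; no C-style padding is inserted.
    instance (Storable e, Storable l) => Storable (HCons e l) where
      sizeOf _    = sizeOf (undefined :: e) + sizeOf (undefined :: l)
      alignment _ = max (alignment (undefined :: e)) (alignment (undefined :: l))
      peek p      = do e <- peek (castPtr p)
                       l <- peek (castPtr p `plusPtr` sizeOf (undefined :: e))
                       return (HCons e l)
      poke p (HCons e l) = do poke (castPtr p) e
                              poke (castPtr p `plusPtr` sizeOf e) l

For example, a C struct { int32_t a; int32_t b; } could then be peeked as HCons Int32 (HCons Int32 HNil), since no padding arises in that particular case.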

I'm afraid I had overlooked the part about the great number of serialized C structs. Serializing an HList is easy -- it's de-serialization that is difficult. Essentially, we need to write a mini type-checker. Sometimes Template Haskell can help, and we can use GHC's own type-checker. Since the approach you outlined relies on Haskell type classes to express the hierarchies, you'll have the same type-checking problem: you'll have to somehow deduce those type-class constraints during de-serialization and convince GHC of them. If you assume a fixed number of classes (C struct types), things become simpler. The HList-based solution becomes just as simple if you assume a fixed number of record types.
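To make the last remark concrete, here is a hedged sketch of what a fixed number of struct types buys: de-serialization collapses to a case analysis on a tag. The Like type is the constraint-parameterized wrapper from the first message; the struct types A and C, the tag values, and the byte-list input format are assumptions invented for the illustration.

    {-# LANGUAGE ConstraintKinds, ExistentialQuantification, DeriveDataTypeable #-}
    import Data.Typeable
    import Data.Word (Word8)

    -- Two fixed, known struct types (invented for the example).
    data A = A { a_field :: Word8 }           deriving (Show, Typeable)
    data C = C { c_a :: A, c_field :: Word8 } deriving (Show, Typeable)

    class ALike x where toA :: x -> A
    instance ALike A where toA = id
    instance ALike C where toA = c_a

    data Like ctx = forall a. (ctx a, Typeable a) => Like a

    -- With a closed set of types, the "mini type-checker" is just this dispatch
    -- on a leading tag byte; no constraints have to be deduced at run time.
    decodeLike :: [Word8] -> Maybe (Like ALike, [Word8])
    decodeLike (0 : x : rest)     = Just (Like (A x), rest)
    decodeLike (1 : x : y : rest) = Just (Like (C (A x) y), rest)
    decodeLike _                  = Nothing

    main :: IO ()
    main = case decodeLike [1, 42, 7] of
             Just (Like x, _) -> print (toA x)   -- prints A {a_field = 42}
             Nothing          -> putStrLn "parse error"

As soon as the set of struct types is open-ended, such a dispatch no longer suffices and one is back to deducing the constraints at de-serialization time, i.e. the mini type-checker mentioned above.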
participants (2):
- Dmitry Vyal
- oleg@okmij.org