If someone wants to play around with the caching idea, this is (at least approximately) how I'd write it:

{-# LANGUAGE DeriveFunctor #-}

import Control.Monad.State.Strict (evalState, get, put)
import qualified Data.Map.Strict as Strict

-- Result of probing the cache: a hit returns the cached result,
-- a miss carries the freshly computed result plus the updated entry.
data Res c a
  = Hit !(Maybe c)
  | Miss !(Maybe c) a
  deriving Functor

compose :: Ord b => Strict.Map b c -> Strict.Map a b -> Strict.Map a c
compose bc ab = flip evalState Strict.empty $ Strict.traverseMaybeWithKey go ab
  where
    go _ b = do
      cache <- get
      case Strict.alterF (update b) b cache of
        Hit mc -> pure mc
        Miss mc cache' -> do
          put cache'
          pure mc
    -- Not cached yet: look b up in bc once and remember the answer,
    -- including negative answers.
    update b Nothing =
      let mc = Strict.lookup b bc
      in Miss mc (Just mc)
    -- Already cached, so no lookup into bc is needed.
    update _ (Just mc) = Hit mc
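
As a quick sanity check (a hypothetical example, assuming the definitions above): the duplicated value 1 below only needs one lookup into bc, and the key whose value is missing from bc is dropped.

example :: Strict.Map String Char
example = compose (Strict.fromList [(1, 'x'), (2, 'y')])
                  (Strict.fromList [("a", 1), ("b", 1), ("c", 3)])
-- => fromList [("a",'x'),("b",'x')]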

On Sun, Jul 7, 2019, 1:49 PM David Feuer <david.feuer@gmail.com> wrote:
For what it's worth, I'm fairly confident that implementation is optimal, though I don't know how to even try to prove it. The only situation I see where we could obviously do better is when ab maps many keys to the same value and bc is very large compared to ab. In that case, it would make sense to store a "cache" of lookups into bc. But the constant factors for such an arrangement would be terrible, and there would be no benefit whatsoever in the general case. So I think the simple thing is the right thing here. As for names, I like compose. I support the proposal for the reason Alexandre gives: it feels like a fundamental operation.

On Sun, Jul 7, 2019, 1:22 PM Alexandre Esteves <alexandre.fmp.esteves@gmail.com> wrote:
In https://github.com/haskell/containers/issues/647 I proposed the following util:

composition :: Ord b => Map b c -> Map a b -> Map a c
composition bc ab = flip mapMaybe ab $ flip lookup bc

This runs in O(|ab| * log |bc|) time. It's not particularly hard to write, but it does feel like a primitive-ish operation, and it's not obvious to me whether there's a faster implementation.
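
Spelled out pointfully (an equivalent sketch; composition' is just an illustrative name), it's mapMaybe with a lookup into bc, so keys of ab whose value has no entry in bc are dropped:

import Prelude hiding (lookup)
import Data.Map (Map, lookup, mapMaybe)

composition' :: Ord b => Map b c -> Map a b -> Map a c
composition' bc ab = mapMaybe (\b -> lookup b bc) ab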

Other name suggestions:
- (.)
- chain
- compose

