Hello Café,

TL;DR: should GHC deprecation warnings be reported transitively for re-exported definitions?

I have a library that is exposing too much. As a minimal example, say the library contains:
- Module A, which defines several functions and types.
- Module B, which exports specific definitions from module A and has none of its own.

It so happens that, to keep things as clean and abstract as possible, only module B should be exposed.
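A minimal sketch of that layout (the module and definition names here are mine, just for illustration; in reality A and B would live in separate files):

```haskell
-- A.hs: the internal module, defining functions and types.
module A (AType (..), functionInA) where

data AType = AType Int

functionInA :: AType -> Int
functionInA (AType n) = n

-- B.hs: the public facade, re-exporting selected names from A
-- without defining anything of its own.
module B (AType (..), functionInA) where

import A (AType (..), functionInA)
```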

As per library policy, we give users time to adapt. One way to do that would be to deprecate module A, but compile B with deprecations ignored (-Wno-deprecations) so that GHC does not complain while building the library itself.
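That setup would look roughly like this (the deprecation message matches the one in the warning below; -Wno-deprecations could equally be set per-module, as here, or in the package description):

```haskell
-- A.hs: deprecate the whole module; the pragma goes right after
-- the module name, before the export list.
module A {-# DEPRECATED "This module will be hidden in future versions." #-}
  (functionInA) where

functionInA :: Int -> Int
functionInA = (+ 1)

-- B.hs: re-export from A, silencing the deprecation warning that
-- B's own import of A would otherwise trigger.
{-# OPTIONS_GHC -Wno-deprecations #-}
module B (functionInA) where

import A (functionInA)
```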

My expectation was that library users who imported A directly would get a warning, but that importing A's definitions via B would not produce any warnings.

That, however, is not what is happening:

In the use of ‘functionInA’
    (imported from B, but defined in A):
    Deprecated: "This module will be hidden in future versions."

There are "workarounds": I could move all definitions from A to a new module C, deprecate A, and re-export C from B; or I could redefine the exported definitions in B as identities of those in A (easy for functions, probably more cumbersome for data constructors or classes).
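The first workaround might look like this (C is a hypothetical new module name; again, each module would be its own file):

```haskell
-- C.hs: the real definitions move here, undeprecated.
module C (functionInA) where

functionInA :: Int -> Int
functionInA = (+ 1)

-- A.hs: becomes a deprecated shim that merely re-exports C.
module A {-# DEPRECATED "This module will be hidden in future versions." #-}
  (module C) where

import C

-- B.hs: re-exports C directly, so no deprecated name is ever
-- mentioned on the path through B.
module B (module C) where

import C
```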

However, more generally, if you use a function from A in a NEW function definition in B and then export that second definition instead, the compiler won't tell the library user that B internally relies on a deprecated function. Re-exporting a function unchanged could conceptually be seen as an "extreme" case of that, where the name and the implementation in B coincide with those in A.

So I ask: should deprecation warnings work the way they currently do in the first place?

All the best,

Ivan