
I’ll try to find time to experiment with the API and make a PoC at the beginning of next week. Currently I have no idea what the API could be. It would be nice if it turned out that you could do `CNF a -> NNF a` and `DNF a -> NNF a` transformations without any restrictions on `a`. Yet you would still need `Negable a` to do the reverse transformations, as well as `CNF a <-> DNF a`. Let’s see how it turns out. I’ll try to come back with something before Epiphany (i.e. on Monday).
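Sketched with toy types (these are my own stand-ins for illustration, not the library’s actual representations), the constraint-free direction could look like this — going into NNF only rearranges structure, so `a` stays unconstrained:

```haskell
-- Toy types for illustration only; not the library's representations.
data Lit a = Pos a | Neg' a            -- a literal: a variable or its negation

newtype CNF a = CNF [[Lit a]]          -- conjunction of disjunctions of literals

data NNF a                             -- negations only at the leaves
  = NnfTrue | NnfFalse
  | Var a | NotVar a
  | NNF a :&: NNF a
  | NNF a :|: NNF a

-- CNF -> NNF needs no constraint on `a`: we never negate a value,
-- we only re-associate the structure around existing literals.
cnfToNnf :: CNF a -> NNF a
cnfToNnf (CNF clauses) = foldr ((:&:) . clause) NnfTrue clauses
  where
    clause = foldr ((:|:) . lit) NnfFalse
    lit (Pos a)  = Var a
    lit (Neg' a) = NotVar a

-- Evaluators, to check that the conversion preserves meaning:
evalCnf :: (a -> Bool) -> CNF a -> Bool
evalCnf env (CNF cs) = all (any evalLit) cs
  where
    evalLit (Pos a)  = env a
    evalLit (Neg' a) = not (env a)

evalNnf :: (a -> Bool) -> NNF a -> Bool
evalNnf _   NnfTrue    = True
evalNnf _   NnfFalse   = False
evalNnf env (Var a)    = env a
evalNnf env (NotVar a) = not (env a)
evalNnf env (p :&: q)  = evalNnf env p && evalNnf env q
evalNnf env (p :|: q)  = evalNnf env p || evalNnf env q
```

The reverse directions, and `CNF a <-> DNF a`, have to push negations through the leaves, which is exactly where `Negable a` would come in.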
On 01 Jan 2015, at 18:04, Yitzchak Gale wrote:
Yes, sounds great!
In practice, what I really need right now is to be able to convert between normal forms without constraints on the value type. If that will do it, I'm happy.
But beyond that, this is a nice library that is well thought out. Grounded in good theory, but still easily understandable and usable. Any further development in that direction would of course be appreciated!
In terms of specifics - would we need replacements for CoBoolean and CoBoolean1 if we move to a richer lattice-based organization of the types? Or would we then be able to move to a simpler API like toLattice, fromLattice, toNormalForm, etc. for the conversions, which would just work?
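To make the question concrete, here's the kind of thing I have in mind - a single generic elimination into any lattice, replacing the per-form CoBoolean machinery. All the names here (`Lattice`, `toLattice`, `CNF'`) are invented for this sketch, not the library's current API:

```haskell
-- Hypothetical sketch; `Lattice` and `toLattice` are invented names,
-- not anything in the current boolean-normal-forms API.
class Lattice l where
  (/\), (\/)  :: l -> l -> l
  top, bottom :: l

instance Lattice Bool where
  (/\) = (&&); (\/) = (||)
  top = True;  bottom = False

-- A negation-free conjunction-of-disjunctions, standing in for CNF:
newtype CNF' a = CNF' [[a]]

-- One generic elimination, with no Negable anywhere: interpret each
-- clause as a join, and the whole formula as a meet of clauses.
toLattice :: Lattice l => (a -> l) -> CNF' a -> l
toLattice f (CNF' cs) = foldr ((/\) . foldr ((\/) . f) bottom) top cs
```

If the normal forms were organized as lattices, every negation-free conversion could presumably be routed through something like this.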
Thanks, Yitz
On Thu, Jan 1, 2015 at 5:45 PM, Oleg Grenrus wrote:
First of all: nice to hear that someone is using boolean-normal-forms!
I totally feel your pain. Indeed there is a “Monoid is used where Semigroup would be enough” issue.
`Boolean` is too strong a class there. I’d like to rework the classes to have a hierarchy: `Lattice` => `Boolean`/`Complementable`, and maybe also `Heyting` and `Negable`.
Then NNF would be the free Lattice, and DNF and CNF would also be just lattices.
With more granular classes, the requirements for transformations could be more precise.
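Roughly like this - all names and method choices are provisional guesses at this point, not a settled design:

```haskell
-- Provisional sketch of the hierarchy; names and exact methods are
-- guesses, not a settled design.
class Lattice a where
  (/\), (\/) :: a -> a -> a

-- A bounded lattice with complement, i.e. a full Boolean algebra:
class Lattice a => Boolean a where
  true, false :: a
  complement  :: a -> a

instance Lattice Bool where
  (/\) = (&&)
  (\/) = (||)

instance Boolean Bool where
  true       = True
  false      = False
  complement = not
```

A transformation that only rearranges structure would then require just `Lattice` on the target; only the negation-pushing ones would need the full `Boolean` (or a separate `Negable` on the value type).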
How does this sound?
Cheers, Oleg
On 01 Jan 2015, at 17:06, Yitzchak Gale wrote:
I've been using Oleg Grenrus's wonderful boolean-normal-forms library lately, and enjoying it. However, I'm running into some trouble with overly constrained conversions.
In theory, it's possible to convert between any of the normal forms without requiring a Negable instance on the value type. But the conversion mechanism in the library routes everything via the Boolean type class, and that ends up slapping a Negable constraint on everything.
You can work around that by embedding your value type in the free Negable, conveniently provided by the library. But in practice that causes a lot of trouble. You end up being forced to deal with Neg cases in your result type that can never actually occur and make no sense in context. You have to throw bottoms into provably total functions, or write code that produces meaningless results that you promise in human-readable comments will never happen in practice. You also make the code less efficient by requiring extra passes over the data to "verify" that Neg isn't there. It's similar to what happens when you try to use a Monoid where a Semigroup is really what's needed, by bolting on an artificial "empty" constructor.
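For example - approximating the library's free Negable with a toy type here, so the constructor names may not match exactly:

```haskell
-- Toy approximation of a free Negable, for illustration only;
-- constructor names may not match the library's exactly.
data Neg a = Pos a | Not a

-- After a conversion routed through the free Negable, every consumer
-- must still pattern-match on `Not`, even when the input formula was
-- negation-free and that case provably cannot occur:
fromPos :: Neg a -> a
fromPos (Pos a) = a
fromPos (Not _) = error "impossible: the input was negation-free"
```

That `error` branch is exactly the bottom-in-a-total-function problem: the type system no longer reflects what the code guarantees.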
One thing you can do to begin with is to remove the Negable constraints on the NormalForm instances; I don't see that constraint ever being used. But that still doesn't solve the main issue.
Perhaps one approach would be to define a Boolean "bi-monoid" class, where a Boolean bi-monoid is like a Boolean algebra, except without negation. All of the normal forms are instances of Boolean bi-monoid even without a Negable constraint on the value type.
If you then relax the constraints on the methods of CoBoolean and CoBoolean1 to Boolean bi-monoid instead of Boolean, all of the existing conversion implementations work without modification and without requiring the Negable constraint.
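Concretely, something like this - the class name and methods are invented here, and `CNF'` is a list-of-clauses stand-in rather than the library's type:

```haskell
-- Sketch of the proposed class; the name and methods are invented.
-- Two interacting monoids: (/\ with true) and (\/ with false), no negation.
class BooleanBimonoid b where
  true, false :: b
  (/\), (\/)  :: b -> b -> b

instance BooleanBimonoid Bool where
  true = True;  false = False
  (/\) = (&&);  (\/)  = (||)

-- A list-of-clauses CNF stand-in is an instance with no Negable on `a`:
newtype CNF' a = CNF' [[a]] deriving (Eq, Show)

instance BooleanBimonoid (CNF' a) where
  true  = CNF' []               -- empty conjunction: identity for /\
  false = CNF' [[]]             -- the empty clause: identity for \/
  CNF' xs /\ CNF' ys = CNF' (xs ++ ys)
  CNF' xs \/ CNF' ys = CNF' [x ++ y | x <- xs, y <- ys]  -- distribute \/ over /\
```

With the CoBoolean/CoBoolean1 method constraints relaxed to this class, the conversion code shouldn't need to change at all.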
I'm not sure if that creates other problems for some other use cases of this library though.
Any thoughts?
Thanks, Yitz