I agree that the strictness there was surprising, but I think this may be a case where the superficially expected behaviour turns out, in the end, to be inconsistent.
What about:
let ~(!x, !y) = undefined in ()
If nested bang patterns implied strictness in their parent patterns, this valid expression would stop making sense: the outer ~ asks for an irrefutable, lazy match, while the bangs inside would propagate strictness outward and force that very match. I can see a few ways to deal with that, but none of them seem intuitive to me.
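For concreteness, here is a small sketch of the semantics as I understand them today (assuming GHC's usual translation of lazy pattern bindings, where demanding any bound variable performs the full match, bangs included):

    {-# LANGUAGE BangPatterns #-}

    main :: IO ()
    main = do
      -- The outer ~ keeps the match lazy; since neither x nor y is ever
      -- demanded, the bangs never fire and the undefined is never touched.
      let ~(!x, !y) = undefined :: (Int, Bool)
      -- On the same reading, demanding either component performs the whole
      -- match, at which point the bangs force both components, so
      -- uncommenting the print below should make the program diverge:
      let ~(!a, !b) = (undefined, 2) :: (Int, Int)
      -- print b
      putStrLn "still alive"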
One could disallow it, permitting strictness annotations only on variables rather than on arbitrary patterns, but that sacrifices a lot of functionality to avoid the surprise. Alternatively, one could say that upward propagation of strictness is only a default, which an explicit ~ overrides, but that definitely feels like a hack: it might make the original example behave as expected, but no longer for the expected reasons, and suddenly there is something even more complex going on.
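To spell out what that default-propagation reading would mean, suppose the original example looked something like the following (a hypothetical stand-in on my part; I'm just using (!x, !y) without the twiddle for illustration):

    {-# LANGUAGE BangPatterns #-}

    -- Under "upward propagation as a default", the bangs would make the
    -- tuple match itself strict, so this would diverge on the undefined:
    strictReading :: ()
    strictReading = let (!x, !y) = undefined :: (Int, Int) in ()

    -- ...while an explicit ~ would opt back out of the propagation, and
    -- the expression would evaluate to () after all:
    lazyReading :: ()
    lazyReading = let ~(!x, !y) = undefined :: (Int, Int) in ()

Note that under the current semantics both of these evaluate to (); the comments describe the behaviour the propagation-as-default reading would assign them, which is exactly the extra layer of complexity I mean.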
I don't have a strong opinion here, but I think it's important to consider more complex cases when making the decision.