I don't think it quite makes sense to say that Haskell shares its evaluation model with the lambda calculus, because I don't think it's fair to say that the lambda calculus has any specific evaluation model at all. Do you mean substitution, for example? But that's only one way to implement the lambda calculus, and not one shared by any widely used Haskell implementation.
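To make that concrete, here's a tiny sketch of what evaluation by substitution could look like. The names (Term, subst, eval) and the naive, capture-unsafe substitution are my own, purely for illustration; it's one legitimate way to run lambda terms, but nothing like what GHC actually does.

```haskell
-- Untyped lambda-calculus terms.
data Term
  = Var String
  | Lam String Term
  | App Term Term
  deriving Show

-- Substitute arg for free occurrences of x.
-- (Naively ignores variable capture, which is fine for this tiny demo.)
subst :: String -> Term -> Term -> Term
subst x arg t = case t of
  Var y   | y == x    -> arg
          | otherwise -> Var y
  Lam y b | y == x    -> Lam y b                  -- x is shadowed; stop here
          | otherwise -> Lam y (subst x arg b)
  App f a             -> App (subst x arg f) (subst x arg a)

-- Call-by-name evaluation: reduce the function position, then push the
-- unevaluated argument into its body by substitution.
eval :: Term -> Term
eval (App f a) = case eval f of
  Lam x b -> eval (subst x a b)
  f'      -> App f' a
eval t = t

main :: IO ()
main = print (eval (App (Lam "x" (Var "x")) (Lam "y" (Var "y"))))
-- prints: Lam "y" (Var "y")
```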
But I do agree there's a point here. There's a simplicity that the purely functional fragment of Haskell shares with the lambda calculus, which I wish were easier to get across to new Haskell programmers. That simplicity is precisely what allows the lambda calculus, as well as the purely functional fragment of Haskell, to have a meaning without answering the question of how it is evaluated. (Even in more complex programming languages, the notion of evaluation used to define the language is often not the same one used by implementations, of course. But those languages must nevertheless be defined in terms of some model of evaluation, whereas the purely functional fragment of Haskell needn't be.)
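A small, made-up example of what I mean: in the pure fragment you can reason by substituting equals for equals, so the meaning of an expression doesn't depend on when, or whether, its parts get evaluated.

```haskell
double :: Int -> Int
double x = x + x

-- These two definitions denote the same value. You can see that by
-- substituting the argument into double's body, without ever deciding
-- whether (3 * 7) is computed once, twice, or lazily on demand.
a, b :: Int
a = double (3 * 7)
b = (3 * 7) + (3 * 7)

main :: IO ()
main = print (a == b)   -- True; both are 42
```

Of course GHC will in fact share a single thunk for (3 * 7) in the first definition, but that's an implementation detail, not something you need in order to say what the program means.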
I struggle with this. In some very real ways, I consider it the most important point of learning Haskell for many programmers. But it's also not a prerequisite to using Haskell for practical purposes. For me, since I learned Haskell just to experience the cool ideas it contains (and only got my first job programming in Haskell 15 years later), that's reason enough. But, within reason at least, it's the learner's goals that matter most when learning. Someone who isn't looking to understand the fundamental simplicity of a portion of Haskell isn't likely to be motivated to put in the effort it takes to understand it. So there must be some ways, at least, to learn Haskell without focusing on that particular aspect of the language. Without understanding it, a Haskell programmer won't be as good as they could be, and will struggle to write idiomatic pure code. But one must start somewhere. New Python programmers don't write idiomatic or perfect Python, either!