hGetContents and modules architecture idea

Hello everyone! I'm trying to understand the basic idea behind Haskell's module architecture. In other languages, "reading from a file" lives in something like an "io", "istream", or "file" module. Most languages also have a concept of reading from an abstract byte stream: you read bytes from something and translate them into your high-level object/type/etc. In Haskell I see that, for example, the function *hGetContents* exists in (this is my local installation):

* GHC.IO.Handle
* System.IO
* Data.ByteString
* Data.ByteString.Char8
* Data.ByteString.Lazy
* Data.ByteString.Lazy.Char8
* Data.Text.IO
* Data.Text.Lazy.IO
* System.IO.Strict
* Text.Pandoc.UTF8
* Data.ListLike
* Data.ListLike.IO
* ClassyPrelude
* Hledger.Utils.UTF8IOCompat
* Data.IOData
* Darcs.Util.Ratified
* Sound.File.Sndfile
* Sound.File.Sndfile.Buffer
* Data.String.Class
* Network.BufferType

If I create a module SuperMegaShapes with some Triangle, Rectangle, Square and other things, should I (to be consistent with the Haskell way) add... hGetContents there??!

So, I have two questions here:

First: let's imagine we have a Haskell compiler for embedded systems, and I want to read Text, ByteString, Sndfile and SuperMegaShapes from... SPI. There are many devices and protocols, right? And I have no FILE HANDLE for most of them. Does this mean that Haskell (like a simple scripting language) supports only the concept of reading from a FILE HANDLE, and no other ABSTRACTIONS?

Second question: must any new type that we plan to read/write have hGetContents? What if it is packed in some tricky container? A matryoshka? Something else, more tricky? :) And more: what other I/O functions must be injected into our model definition modules?

===
Best regards

In Haskell you have data types like String, Text, lazy Text, ByteString, etc. All of those have functions like readFile, writeFile, hPutStr, hGetLine (if applicable to that type). If you have your own type, say a Triangle, you would usually build it from one of the intermediate types, such as ByteString -> Triangle.
It is also possible to make a class which allows you to create a Triangle from a variety of types, ToShape a => a -> Triangle, with an instance like instance ToShape ByteString.
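A rough sketch of that class-based idea; the names (Triangle, ToShape) and the whitespace-separated number format are invented for illustration, not a standard API:

import qualified Data.ByteString.Char8 as BC
import           Text.Read (readMaybe)

data Triangle = Triangle Double Double Double
  deriving Show

-- One class, many possible source representations.
class ToShape a where
  toTriangle :: a -> Maybe Triangle

-- Decode a triangle from three whitespace-separated numbers.
instance ToShape BC.ByteString where
  toTriangle bs =
    case traverse (readMaybe . BC.unpack) (BC.words bs) of
      Just [a, b, c] -> Just (Triangle a b c)
      _              -> Nothing

With this in place, toTriangle (BC.pack "3 4 5") yields Just (Triangle 3.0 4.0 5.0), and further instances (String, Text, ...) can be added without touching the Triangle definition itself.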
For your second question: to build a complex type from, say, a ByteString, most people would use a parser combinator library, perhaps something like attoparsec, although there are many other options. That particular library lets you parse from a ByteString or from a file as needed. When using it on a file you might use withFile around parseWith and pass a handle-reading action such as hGetContents as its first argument.
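A minimal sketch of that approach, under the same hypothetical Triangle type and text format as above; the refill action here reads 4096-byte chunks with hGet rather than hGetContents, but the shape is the same:

import qualified Data.Attoparsec.ByteString.Char8 as A
import qualified Data.ByteString as B
import           System.IO (IOMode (ReadMode), withFile)

data Triangle = Triangle Double Double Double
  deriving Show

-- Three whitespace-separated doubles.
triangleP :: A.Parser Triangle
triangleP = Triangle <$> side <*> side <*> side
  where side = A.skipSpace *> A.double

-- Parse a Triangle straight from a file, feeding the parser chunks
-- read from the Handle; an empty chunk tells attoparsec the input ended.
readTriangle :: FilePath -> IO (A.Result Triangle)
readTriangle path =
  withFile path ReadMode $ \h ->
    A.parseWith (B.hGet h 4096) triangleP B.empty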

This certainly makes sense, and all other languages follow this practice. But nevertheless the Sndfile module has this `hGetContents`. And so does a Darcs module. What is stranger to me is that this function (hGetContents) is apparently considered universal enough to appear so often, yet it reads from a file handle, which is not abstract/generic/universal. So:
- there are types which are in no way related to I/O, but their modules implement I/O functions, which is very strange;
- even more, these I/O-related functions are tied to one concrete kind of I/O - file-handle-based I/O - which means there is no way to read these types from SPI, I2C or any other non-file-handle-based source. Isn't that a serious problem with the abstraction?
It would be more natural to have an abstract stream of bytes, to read only bytes, and then to convert them into Text, Sndfiles, etc. Such I/O functions would then not live in the "model"-related modules (where the data types are defined). And if we read a new type from a NEW INTERFACE (one that has no file handle), nothing breaks: we still read bytes from a byte stream with an abstract interface (a type class), and that stream might be bound to a register, an I/O port, and so on - not a file handle. Or, if we need that kind of I/O, will we add something like `portGetContents` to all of these modules: Text, ByteString, Sndfile, etc.? :)
This is what I can't understand.
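To make the question concrete, here is a rough sketch of the kind of abstraction described above; the class name and the SPI example are purely illustrative, not an existing library:

import qualified Data.ByteString as B
import           System.IO (Handle)

-- "Something you can read bytes from": a file Handle is just one instance.
class ByteSource s where
  readChunk :: s -> Int -> IO B.ByteString   -- an empty chunk means end of input

instance ByteSource Handle where
  readChunk = B.hGet

-- A hypothetical SPI port could be another instance, with no Handle involved:
-- instance ByteSource SpiPort where readChunk = spiReadBytes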

Sorry, I'm having trouble understanding your English and am unfamiliar with some of the terms you are using.
-- It would be more natural to have an abstract stream of bytes, to read only bytes, and then to convert them into
There are a lot of abstractions over data in Haskell. Are you looking for something like pipes, conduit, or io-streams?
io-streams, for example, exports different ways to get a stream from some source:
-- from / to a network
socketToStreams :: Socket -> IO (InputStream ByteString, OutputStream ByteString)

-- various to and from files, with or without automatic resource management
withFileAsInput
handleToInputStream :: Handle -> IO (InputStream ByteString)

-- to / from an interactive command
runInteractiveCommand :: String -> IO (OutputStream ByteString, InputStream ByteString, InputStream ByteString, ProcessHandle)

Once you have an OutputStream or an InputStream, you can do whatever you want with them.

-- fold an input stream into some type s, via the supplied functions.
fold :: (s -> a -> s) -> s -> InputStream a -> IO s

-- ensure that every byte in an input stream conforms to a supplied function.
all :: (a -> Bool) -> InputStream a -> IO Bool

-- zip two input streams into a single input stream with characters from each.
zip :: InputStream a -> InputStream b -> IO (InputStream (a, b))

-- And if you have access to such a stream, you can manipulate it at a very low level if you need to:
read :: InputStream a -> IO (Maybe a)
peek :: InputStream a -> IO (Maybe a)
unRead :: a -> InputStream a -> IO ()
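For instance, a small sketch (assuming the io-streams package) that combines two of the functions above to count the bytes in a file:

import qualified Data.ByteString as B
import qualified System.IO.Streams as Streams

-- Count the bytes in a file by folding over its InputStream of chunks.
countBytes :: FilePath -> IO Int
countBytes path =
  Streams.withFileAsInput path $ \input ->
    Streams.fold (\n chunk -> n + B.length chunk) 0 input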
I don't think I've used hGetContents for many years. While io-streams is the most straightforward, I personally use pipes quite a bit in my everyday code.
Beyond that, for writing a complex data type to a ByteString there are numerous libraries like binary and cereal, which let you lay out bytes in a very exact fashion, to be put into a file or sent over the network if you wish.
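A small sketch using cereal with the hypothetical Triangle type from earlier (binary works much the same way); the generically derived byte layout here is cereal's own, not some standard wire format:

{-# LANGUAGE DeriveGeneric #-}

import           Data.ByteString (ByteString)
import           Data.Serialize (Serialize, decode, encode)
import           GHC.Generics (Generic)

data Triangle = Triangle Double Double Double
  deriving (Show, Generic)

-- cereal derives the byte format generically from the data type's shape.
instance Serialize Triangle

toBytes :: Triangle -> ByteString
toBytes = encode

fromBytes :: ByteString -> Either String Triangle
fromBytes = decode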
I'm not sure if I've gotten to the heart of what you are asking, but Haskell provides a huge wealth of ways to access and manipulate data at every possible level, and they pretty much all fit together very well, far better than similar abstractions in other languages ever could, as far as I'm aware.

Oh, David, excuse my English! I'm so sorry.
I know about the pipes and conduit libraries, but they are not part of the language itself, and there is no "batteries included" story: you can avoid them, or use both, or even use libraries that depend on pipes and conduit at the same time, ending up with both in one project. It's not the same as "yield" in Python, which is part of the language, or "async", for example.
So, is hGetContents legacy? My points were:
1) hGetContents IMHO should live in I/O-related modules, not in "data definition" modules. When somebody creates a new module defining new data types, he should not have to think about I/O for those types. Even more, he doesn't know what kind of I/O and which I/O functions he would have to "inject" there. Isn't that a design error in Haskell?
2) hGetContents assumes that I/O is based on file handles. This limits Haskell like a simple little scripting language. There are other kinds of I/O where you have no files and no handles, for example in embedded programming, where you read/write bytes from ports through different protocols: I2C, SPI, etc.; ports may be mapped to memory addresses or be available via fixed register names, and so on. Isn't that a design limitation of Haskell, tied to this ubiquitous hGetContents?
3) The idea that you must read Text or Sndfile from an input file leads (if placing hGetContents in all of those modules really is the intent) to a practice where other library authors implement hGetContents for their types as well. They and we :) Isn't that a loss of abstraction? To combine all of these things you should read/write bytes and encode/decode types from abstract byte streams, shouldn't you? But in that case, what is hGetContents doing in all of these modules today?! :)
I've been a programmer since '96 and have used many different languages. Today, when I'm using Haskell, it's important for me, as for any programmer starting out with a new language (Haskell in this case), to accept and understand, to get a feel for, the basic ideas behind the language's infrastructure design: its modules and ecosystem. Languages themselves are easy these days, but their ecosystems and infrastructure are often complex, different, strange, and sometimes surprising. For me, all of these thoughts are about "accepting" the language's ecosystem/core/infrastructure, about understanding why it is the way it is. I like Haskell; it looks like a functional C, not even C++: lambdas and types, nothing else. But thoughts like the ones about hGetContents confuse me. I'm not sure this is the right place for such questions; I tried StackOverflow, but that wasn't the place - maybe Haskell-cafe? Other programmers with a background in other languages surely ask themselves similar questions and have tried to come to terms with this design, so I would be very glad to hear about the experience of others and what they think about this design and the library/module ecosystem.
Earlier I asked on StackOverflow about functions like "fold", "elem", "intercalate" and many others, which live in the List module and in other modules too. To me they all look like candidates to be made fully abstract (via type classes) rather than living only in the List or Map/HashMap/etc. modules - something like C++ interfaces or SML signatures. As I understood from the answer, this is indeed so for legacy reasons, and today there are attempts to solve this problem in Haskell: new packages (alternative preludes, etc.) define such type classes (sure, there are the famous Alternative, Applicative, Foldable, Traversable, but that's not quite the same). If hGetContents is such legacy - OK, that is understandable and makes sense :)
And again, David, please excuse my terrible English!

Please allow me to provide a little more context to David's fine answer:
baa dg writes:
So, is hGetContents legacy? My points were: 1) hGetContents IMHO should live in I/O-related modules, not in "data definition" modules. When somebody creates a new module defining new data types, he should not have to think about I/O for those types. Even more, he doesn't know what kind of I/O and which I/O functions he would have to "inject" there. Isn't that a design error in Haskell?
This is an instance of the expression problem[1]. We have a lot of IO actions that read a Handle and return its contents as represented by some Haskell data type. The question, then, is where do we put these things? The three most promising options for organizing them are:

1. Put them all together in a module for doing I/O.
2. Put them together with the data type they return.
3. Factor out common functionality — such as reading a handle — and only implement what's different, e.g., coercing a stream of bytes (possibly interpreted via some encoding) into a given data type.

The first option is impossible — or at least impractical — because the collection of data types is large and fully extensible. This module would need to depend on every package that provides data types, both those currently existing and any developed in the future. The maintenance burden on this module would be immense, as would be the amount of friction introduced into the development of new data types.

The second option is the one we currently use for hGetContents, and it's the one you dislike (for entirely legitimate reasons). But it's not the only option provided by the language, or even the only option available to you today! So I wouldn't say it's a design error of the language per se (as the expression problem applies to all languages), just a less-than-ideal choice made early on in the growth of the Haskell ecosystem.

The third option solves the expression problem with type classes[2] and other methods of abstraction, of which David has already shared a number of examples. Factoring out the common I/O behavior reduces duplication and increases modularity, and while we're still left with an expression problem — where do the implementations of the type class instances live? — it's a more tractable problem and solving it provides more value.

— Rein Henrichs

[1] https://en.wikipedia.org/wiki/Expression_problem
[2] https://userpages.uni-koblenz.de/~laemmel/TheEagle/resources/pdf/xproblem1.p...
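A rough illustration of option 3 (the class and function names are invented for this sketch, not an existing API): the per-type code shrinks to a decoding instance, and the Handle-reading action is written once against the abstraction.

import qualified Data.ByteString as B
import           System.IO (Handle)

-- What varies per type: how to interpret bytes.
class FromBytes a where
  fromBytes :: B.ByteString -> Either String a

-- What is common to every type: reading the Handle. Written once.
hGetAs :: FromBytes a => Handle -> IO (Either String a)
hGetAs h = fromBytes <$> B.hGetContents h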

Yes, that makes sense. So this is the evolution of the ecosystem, and in the future it may be redesigned, right? Something more abstract, like "stream" objects, might then replace file handles in functions like hGetContents, to support reading from string I/O, different devices, and so on. But here I see another question: how is that possible when today there are many libraries/approaches and no "main" one, not even a "batteries included" set, which may simply be the nature of the Haskell community? That is not bad, but it is how things are. Which abstraction should be used: machines/streams/pipes/conduits/etc.? There is no "benevolent dictator" for the language, so there is no way to create one solid library... For example, after "stack install something" you get both "mtl" and "transformers". Why both? They look like solutions to the same task. The same goes for "pipes" and "conduit": sure enough, I have both installed. And dependencies are often very big, and IMHO there is no way to make one standard mechanism in the language like in all the others... Thoughts out loud, nothing more :)

/Paul
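On the mtl/transformers point: they are not really duplicates. transformers provides the concrete monad transformers, and mtl layers type classes over them (and depends on transformers), which is why installing one usually pulls in the other. A tiny illustrative sketch:

import Control.Monad.State (MonadState, StateT, evalStateT, get, put)

-- Written against mtl's MonadState class, so it runs in any stack
-- that carries Int state...
tick :: MonadState Int m => m Int
tick = do
  n <- get
  put (n + 1)
  pure n

-- ...while StateT, the concrete transformer it runs in here, comes from
-- transformers (mtl re-exports it).
runTick :: IO Int
runTick = evalStateT (tick :: StateT Int IO Int) 0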

Hmm. David, I'm reading about io-streams now (I had never come across it before), and it looks very interesting. Is it the same idea as pipes/conduit (to "pipe" data with fixed memory usage in each processing phase)?
By the way, which do you prefer: pipes or conduit? Snoyman said that pipes has a limited design because of its exception-handling possibilities; is that still true today? Which is more stable, better designed, more "clean"?

I'd say that Haskell depends more on its libraries than most other languages, because they tend to get better over time, and often the better libraries are not at a beginner level of skill.
As for hGetContents not being generic over its types, people have actually written type classes for that purpose. See
http://hackage.haskell.org/package/ListLike-4.5.1/docs/Data-ListLike.html#v:...
It is just not in the standard library. The problem with having everything type-classed is that you then have to pin down all the types before your code will compile. There are a lot of people who believe that some of the list functions, like length, should not be generic in the Prelude because that is difficult to get the hang of when you first start learning Haskell. I'm not one of them, but I understand what they are getting at.
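A short sketch against that class, assuming the ListLike package: one polymorphic slurping action, usable at whichever result type the caller picks.

import qualified Data.ByteString as B
import qualified Data.ListLike as LL
import           System.IO (Handle)

-- One polymorphic action; the caller's choice of result type picks the instance.
slurp :: LL.ListLikeIO full item => Handle -> IO full
slurp = LL.hGetContents

-- Used at a concrete type:
readBytes :: Handle -> IO B.ByteString
readBytes = slurp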
I personally use pipes, but over the years they've more or less reached feature parity. pipes has a few minor things over conduit, and conduit is slightly easier to use. You can convert between the two fairly easily in code. Which you use is up to you.

David, thank you for the comprehensive answers!
Have a nice day.
participants (4)
- baa dg
- David McBride
- PY
- Rein Henrichs