
I'm interested in webassembly's recent momentum:
https://hacks.mozilla.org/2016/03/a-webassembly-milestone/

Is this a way we could get a foot in the web? I'm curious, just putting the thought out there.

Cheers,
Dimitri

On 23.04.2016 at 06:28, Dimitri DeFigueiredo wrote:
I'm interested in webassembly's recent momentum
https://hacks.mozilla.org/2016/03/a-webassembly-milestone/
Is this a way we could get a foot in the web? I'm curious, just putting the thought out there.
Points to consider:

ITfind commented that he'd expect browser support to happen in around 3-5 years. My own question would be whether it's going to happen at all, or peter out.

Then there's the question whether WebAssembly will be Haskell-friendly. It might be a good idea for some Haskell compiler expertise to participate in the interest group to make it so.

Also, the purpose of WebAssembly is to integrate with the browser's DOM and with JavaScript libraries. These interfaces are inherently stateful, so whether that can succeed will depend massively on good Haskell library support. In particular, it will need to appeal to web developers, most of whom will come from a JavaScript background; that's quite a cultural gap that the Haskell-side libraries will have to bridge. I think the interface libraries will have to be designed by somebody who is at home in, and in love with, both environments.

Finally, any interface libraries that have to be downloaded from web pages should be small. Less than 100k. Page bloat is a real and serious problem. Haskell could actually take off if it turns out that Haskell-generated WebAssembly is smaller than WebAssembly generated from other languages. Given that Haskell code tends to have a higher abstraction level than code written in other languages, that's quite possible; however, the entire compiler would need to be geared towards a small footprint to achieve that goal, so expect work towards that end.

Just tinkering with WebAssembly won't get Haskell a foot in the web, but it may provide valuable insights into how Haskell might interface with the Next Big Thing of That Kind, whatever that will be, so tinkering would be a project in its own right. If the goal is to get a foot in the web, that's going to require a lot of manpower, and the Haskell compiler crowd will have to contribute at least advice and problem analysis.
Like you, I'd like to see that happen, but I can't make much happen myself; the best I can do is map out how the journey might go, and hope that somebody who can do actual work on it will pick up what's useful from my thoughts.

On Sat, Apr 23, 2016 at 10:52 AM, Joachim Durchholz wrote:
Finally, any interface libraries that will have to be downloaded from web pages should be small. Less than 100k. Page bloat is a real and serious problem.
IMO, this particular thing is not a serious problem; browsers can easily handle pages two orders of magnitude larger - 10 MB of JavaScript - without problems (example: Google+).

If part of the download is a standard library, then we'll see more use of cache-forever semantics from standardized locations (CDNs). The Stackage ecosystem with pre-compiled packages can be mapped directly onto a cache-forever system, and with asynchronous download of new versions (like what Chrome and other browsers do today), the problem is mostly converted from a latency issue into a bandwidth issue, and we have plenty of bandwidth.

Alexander
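A minimal sketch of what cache-forever delivery could look like, assuming a hypothetical CDN host and content hash in the URL (none of these names are real):

```http
GET /stackage/text-1.2.2.1-d41d8cd9.min.js HTTP/1.1
Host: cdn.example.com

HTTP/1.1 200 OK
Content-Type: application/javascript
Cache-Control: public, max-age=31536000, immutable
```

Because the hash in the URL changes whenever the package contents change, the response can be cached for a year (and marked `immutable`, in browsers that support that directive) without any risk of serving stale code; upgrades simply point at a new URL.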

On 23.04.2016 at 11:34, Alexander Kjeldaas wrote:
On Sat, Apr 23, 2016 at 10:52 AM, Joachim Durchholz wrote:
Finally, any interface libraries that will have to be downloaded from web pages should be small. Less than 100k. Page bloat is a real and serious problem.
IMO, this particular thing is not a serious problem; browsers can easily handle pages two orders of magnitude larger - 10 MB of JavaScript - without problems (example: Google+).
Well, that's far too much. You have people on UMTS, and people in countries with low wired network bandwidth; they are all locked out if every single page needs a 10 MB download.

BTW, the average web page is now about the size of the Doom install image:
https://mobiforge.com/research-analysis/the-web-is-doom
That's just ridiculous.
If part of the download is a standard library, then we'll see more use of cache forever semantics from standardized locations (CDNs).
Cache-forever does not happen in practice. Stuff gets upgraded, stuff gets evicted from caches. Particularly if you're entering at the bottom of the Alexa rankings, you can't rely on that to fix the problems for you.
The Stackage ecosystem with pre-compiled packages can be mapped directly onto a cache-forever system, and with asynchronous download of new versions (like what Chrome and other browsers do today), the problem is mostly converted from a latency issue into a bandwidth issue, and we have plenty of bandwidth.
You have. I have. Our users do not.

Also, large libraries tend to be slow - it's not a 1:1 correlation, but it is statistically significant. So if you keep code size under control, that's usually a net win.

BTW there's also a latency issue: larger libraries tend to load more slowly, particularly if the user's persistent storage isn't an SSD but spinning-rust technology. 10 MB means latency even if it's not being downloaded.

The prevalence of reactions like yours is part of what has been making the web slow. 10 years ago, I could open dozens of web pages in my browser and things would be snappy; today, with a machine that's ten times as powerful, the browser gets bogged down with a mere dozen web pages open in tabs. This isn't going to get better, WebAssembly or not.

Regards,
Jo

the browser gets bogged down at a mere dozen webpages open in tabs.
I just opened 20 random wiki pages as tabs in one browser window. There was no delay opening these pages or paging/scrolling through any of them. Yes, there are websites which consume many resources. I suppose visitors to these websites are OK with this, or the maintainers of these websites are OK with visitor activity as it is.

Joachim Durchholz wrote:
IMO, this particular thing is not a serious problem, and the web pages can easily handle two orders of magnitude larger web pages, 10MB javascript without problems (example: Google+).
Well, that's far too much. You have people on UMTS, you have people in countries with low wired network bandwidth, and these are all locked out if every single page needs a download of 10 MB.
BTW the average web page size is about the same as the Doom install image: https://mobiforge.com/research-analysis/the-web-is-doom That's just ridiculous.
This almost sounds like an attempt to assign morality to program size. It doesn't seem evident that there is any notion of morality in how we, collectively, as humanity, reach for tools of expression - indeed, expressiveness trumps morality. Personal preferences aside, every time someone finds a way to be perceivably more expressive, you can count on an X% increase in size being gladly, indeed gleefully, accepted. This is outside of our control. All we can do is offer a tool, among many others.
The prevalence of reactions like yours is what has been making web slowness part of the problem. 10 years ago, I could open dozens of web pages in my browser, and things would be snappy; today, with a machine that's ten times as powerful, the browser gets bogged down at a mere dozen webpages open in tabs. This isn't going to become better, Webassembly or not.
...

-- respectfully,
Kosyrev Serge

On 2016-04-23 at 00:28, Dimitri DeFigueiredo wrote:
I'm interested in webassembly's recent momentum
https://hacks.mozilla.org/2016/03/a-webassembly-milestone/
Is this a way we could get a foot in the web? I'm curious, just putting the thought out there.
GHCJS can compile Haskell to JS already - anything that GHC compiles, except libraries that link against non-Haskell code. There are also various Haskell-like languages, with which I'm less familiar.

Periodically someone asks about generating WebAssembly or asm.js from GHCJS. For example:
https://github.com/ghcjs/ghcjs/issues/53
https://github.com/ghcjs/ghcjs/issues/359

My understanding is that GHCJS depends on first-class functions and garbage collection in the JS runtime, which aren't provided by the assembly subset.

bergey
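For context, talking to the stateful browser APIs from GHCJS goes through its JavaScript FFI. A minimal sketch, assuming the `ghcjs-base` package is available; this compiles with GHCJS, not plain GHC:

```haskell
{-# LANGUAGE JavaScriptFFI #-}
module Main where

import Data.JSString (JSString, pack)  -- from ghcjs-base

-- Bind a stateful browser API as an IO action; $1 is the first argument.
foreign import javascript unsafe "document.title = $1"
  js_setTitle :: JSString -> IO ()

main :: IO ()
main = js_setTitle (pack "Hello from Haskell")
```

Anything beyond one-off bindings like this - event handlers, building DOM trees - is exactly where the interface-library design work discussed earlier in the thread comes in.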

Dimitri DeFigueiredo wrote:
I'm interested in webassembly's recent momentum
https://hacks.mozilla.org/2016/03/a-webassembly-milestone/
Is this a way we could get a foot in the web? I'm curious, just putting the thought out there.
Cheers,
Dimitri
http://venturebeat.com/2016/03/16/mozilla-will-release-the-first-tech-demo-of-servo-its-next-generation-browser-engine-in-june/
participants (7)
- Alexander Kjeldaas
- Art Scott
- Daniel Bergey
- Dimitri DeFigueiredo
- Imants Cekusins
- Joachim Durchholz
- Kosyrev Serge