00:00:00  * ircretary quit (Remote host closed the connection)
00:00:09  * ircretary joined
00:12:24  <dominictarr>mbalho: https://gist.github.com/maxogden/5761236#comment-843999
00:13:05  * kevino80 quit (Remote host closed the connection)
00:14:07  * owen1 joined
00:14:27  <mbalho>dominictarr: ah so if it was actually async it should be okay
00:14:52  <mbalho>dominictarr: im trying to make a user auth lib that is transport agnostic and works in node + browser
00:14:52  <dominictarr>or if it was the last event listener, not the first
00:15:14  <dominictarr>transport agnostic?
00:19:58  * dominictarr quit (Quit: dominictarr)
00:20:47  <substack>user auth?
00:21:15  <jesusabdullah>okay going outside, my apartment is hot as balls and it's a beautiful day
00:22:07  <substack>http://github.com/substack/tuple-stream
00:22:31  <substack>mbalho: what kind of auth?
00:22:34  <substack>there's https://github.com/substack/persona-id
00:22:53  <mbalho>substack: abstract
00:23:07  <mbalho>substack: but im basing it on the API from hoodie.account http://hood.ie/
00:25:03  * no9 quit (Quit: Leaving)
00:32:47  * dguttman quit (Quit: dguttman)
00:34:31  <mikolalysenko>anyone know if there are any libraries that can parse css color declarations into rgba values?
00:35:16  * mmckegg quit (Quit: mmckegg)
00:38:55  <mikolalysenko>ah found one: https://github.com/harthur/color
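For reference, the simplest case of what such a color library handles can be sketched in a few lines (`hexToRgba` is a hypothetical helper for illustration, not the harthur/color API):

```javascript
// Hypothetical helper (not the harthur/color API): parse a "#rrggbb"
// CSS hex declaration into an [r, g, b, a] array.
function hexToRgba(hex) {
  const m = /^#([0-9a-f]{6})$/i.exec(hex.trim());
  if (!m) throw new Error('unsupported color: ' + hex);
  const n = parseInt(m[1], 16);
  return [(n >> 16) & 0xff, (n >> 8) & 0xff, n & 0xff, 1];
}
```

The real library additionally handles named colors, `rgb()`/`hsl()` functional notation, and alpha channels.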
00:39:02  * nicholasf joined
00:39:36  * rannmann quit (Ping timeout: 260 seconds)
00:40:35  <mbalho>ive used that lib before, good stuff
00:42:51  * mmckegg joined
00:43:35  * nicholasf quit (Ping timeout: 255 seconds)
00:47:36  * tmcw joined
00:48:27  <jesusabdullah>holy christ it is so hot :(
00:49:03  * rannmann joined
00:49:04  * rannmann quit (Changing host)
00:49:04  * rannmann joined
00:49:31  <substack>19C here
00:49:47  <substack>I've got 2 long-sleeve shirts on still
00:49:53  <jesusabdullah>thefuckingweather.com/?where=99503
00:50:16  <jesusabdullah>4 degrees hotter n' you
00:50:28  <jesusabdullah>and that's outside!
00:52:51  <AvianFlu>lolol
00:54:26  <owen1>http://thefuckingweather.com/?where=91367
00:54:47  <owen1>13 above u
00:57:22  <jesusabdullah>All I know is
00:57:26  <jesusabdullah>My igloo's melting. :C
01:04:01  * thl0 joined
01:11:44  * AvianFlu quit (Remote host closed the connection)
01:19:32  * tmcw quit (Remote host closed the connection)
01:38:46  <mmckegg>any suggestions for a good place to stay in SF for a few days when I arrive next week?
01:43:50  * dguttman joined
01:51:41  * dguttman quit (Quit: dguttman)
01:53:51  * ITpro quit (Ping timeout: 252 seconds)
01:55:07  * owen1 quit (Ping timeout: 264 seconds)
01:58:13  * kevino80 joined
02:06:42  * tilgovi joined
02:08:41  * owen1 joined
02:12:03  * fallsemo joined
02:20:30  * st_luke joined
02:26:36  * defunctzombie_zz changed nick to defunctzombie
02:26:39  * thl0 quit (Remote host closed the connection)
02:27:15  * thl0 joined
02:31:49  * thl0 quit (Ping timeout: 276 seconds)
02:31:56  * kevino80 quit (Remote host closed the connection)
02:37:04  <jesusabdullah>mmckegg: I usually try to crash with a friend, ymmv
02:49:44  <mbalho>mmckegg: me and substacks house has room in oakland
02:50:32  <substack>house is new zealand friendly
02:50:45  <substack>references available upon request
02:50:52  <mmckegg>mbalho: that would be very nice thanks :)
02:51:45  <mmckegg>not sure if you can trust someone's house by their github profiles, but that's all the references I need
02:53:12  * nicholasf joined
02:56:40  <calvinfo>mmckegg, we also have a couch in SF if you want it
02:56:47  * mmckegg_ joined
02:57:42  <calvinfo>mmckegg_: we also have a couch in SF if you want it
02:57:47  <calvinfo>it's pretty comfy
02:58:51  * jcrugzz quit (Ping timeout: 240 seconds)
02:59:11  <mmckegg>calvinfo: I'll keep it in mind. what part of SF?
02:59:37  <calvinfo>north beach area
03:00:29  <calvinfo>it's kinda out of the way of transportation, but the place itself is pretty nice
03:00:52  <calvinfo>we use it as our office too atm, so it's less of a hassle for us
03:01:20  <calvinfo>maybe a 35-40min walk downtown, but quicker by bus
03:03:22  <mmckegg>ok, I'm used to walking although Wellington is quite a bit smaller :)
03:05:03  <jesusabdullah>I'll vouch for sub and mbahlo being friendly people
03:05:15  <jesusabdullah>can't vouch for their 'hood or their house but I'll vouch for them
03:05:17  <jesusabdullah>;)
03:07:14  <mmckegg>the house is only 10% the experience after all
03:07:16  <mbalho>mmckegg: <-maxogden
03:07:25  <mmckegg>yes I know :)
03:07:29  <mbalho>:D
03:08:46  <mmckegg>sounds good, i arrive in town monday around noon so I'll be in touch
03:13:44  * tilgovi quit (Remote host closed the connection)
03:49:15  * chapel has a place to call home finally here in the bay
03:49:21  * nicholasf quit (Ping timeout: 252 seconds)
03:49:54  <chapel>though, no furniture, an air bed, foldout/camping chair and a small folding table
03:50:14  <mbalho>lol
03:50:37  <chapel>mbalho: whats so funny?
03:50:40  <chapel>:P
03:53:56  * mikolalysenko quit (Ping timeout: 255 seconds)
04:00:50  * st_luke quit (Remote host closed the connection)
04:16:27  * mmckegg_ quit (Quit: ZNC - http://znc.sourceforge.net)
04:17:09  * nicholasf joined
04:17:31  * jcrugzz joined
04:22:27  <jesusabdullah>man client-side js is gnarly
04:22:35  <jesusabdullah>at least THIS client-side js is gnarly
04:27:15  * timoxley quit (Quit: Computer has gone to sleep.)
04:27:17  * fallsemo quit (Quit: Leaving.)
04:29:14  <jesusabdullah>Fixing this is gonna be a matter of, "what if I just delete this?" I can just tell >_<
04:31:55  * mikolalysenko joined
04:36:13  * timoxley joined
04:36:33  * cianomaidin joined
04:46:10  <mikolalysenko>I love these things: https://github.com/eugeneware/debowerify
04:46:43  <mikolalysenko>what is especially delicious is that I don't think that bower has the capability to implement a de-commonjsify
04:47:38  * mmckegg_ joined
04:49:16  <jesusabdullah>haha
04:51:03  <substack>bower doesn't specify what module system to use
04:51:12  <substack>so it's going to be a complete hodge-podge of what people put on there
04:51:35  * mmckegg_ quit (Quit: ZNC - http://znc.in)
04:52:15  * mmckegg_ joined
04:52:41  <mikolalysenko>or maybe not decommonjs-ify but rather a de-npmify...
04:52:57  * defunctzombie changed nick to defunctzombie_zz
04:53:07  <mikolalysenko>anyway I think that beer hit me a bit hard since I haven't eaten anything since noon...
04:53:20  <mikolalysenko>I should probably stop doing the irc thing for a bit now
04:55:36  <mikolalysenko>though the weirdest thing about bower is that you have to install npm first before you can use it
04:59:37  <jesusabdullah>pbbbbt
05:01:45  <jesusabdullah>eddie bower
05:03:59  * mmckegg quit (Quit: mmckegg)
05:03:59  * mmckegg_ changed nick to mmckegg
05:06:01  * creationix quit (Ping timeout: 240 seconds)
05:06:28  * tilgovi joined
05:11:49  * shama quit (Remote host closed the connection)
05:19:33  * ralphtheninja quit (Ping timeout: 252 seconds)
05:22:49  * cianomaidin quit (Quit: cianomaidin)
05:24:30  * jcrugzz quit (Ping timeout: 252 seconds)
05:27:38  * cianomaidin joined
05:33:35  * timoxley quit (Quit: Computer has gone to sleep.)
05:38:26  * jcrugzz joined
05:39:00  * timoxley joined
05:42:55  * jcrugzz quit (Ping timeout: 260 seconds)
05:44:23  <jesusabdullah>mikolalysenko: no drunk irc is best irc
06:01:29  <mikolalysenko>bah, I am saying stupid things so I think I will stick to drunk coding
06:02:22  <jesusabdullah>stupid things are best things
06:02:27  <jesusabdullah>you shoulda seen me last night!
06:02:36  <jesusabdullah>'cept I was smart this time and kept it in PM
06:03:03  <mikolalysenko>ah well, at least the drunk coding was semi-productive. I just made this: https://github.com/mikolalysenko/glsl-exports
06:03:55  <mikolalysenko>part of the longer term plan to make a modular replacement for three.js
06:04:04  <mikolalysenko>here is another part of it that I hashed out today: https://github.com/mikolalysenko/gl-now
06:11:12  <jesusabdullah>does glsl run in webgl?
06:11:34  <mikolalysenko>yeah
06:11:44  <jesusabdullah>crazy
06:11:44  <mikolalysenko>it is the programming language you use to write shaders
06:11:52  <jesusabdullah>what language is that?
06:11:56  <mikolalysenko>glsl
06:12:01  <jesusabdullah>aha
06:12:04  <mikolalysenko>glsl = opengl shading language
06:12:14  <mikolalysenko>there is also hlsl which is the direct x thing
06:12:21  <mikolalysenko>it is basically the same only different syntax
06:12:25  <jesusabdullah>half-life shader language >:3
06:12:33  <jesusabdullah>("oh dude I loved that game!")
06:12:35  <mikolalysenko>high level shading language
06:12:50  <mikolalysenko>though it isn't any higher level than glsl, just different
06:13:11  <jesusabdullah>shsl: sooo high shading language
06:13:33  <jesusabdullah>a shader language that was a subset of js would be handy
06:13:37  <jesusabdullah>imo
06:13:53  <mikolalysenko>jesusabdullah: I've thought about it, but it would be tough to do
06:14:01  <jesusabdullah>oh, bummer
06:14:07  <mikolalysenko>js has pass by ref semantics for almost everything, while the shaders are necessarily all pass-by-value
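mikolalysenko's point about reference semantics can be shown in a few lines (a toy `scale` function, purely illustrative):

```javascript
// JS objects are passed by reference: a callee can mutate the caller's
// data, which GLSL's pass-by-value semantics for shader inputs forbid.
function scale(v) {
  v.x *= 2; // mutates the caller's object
}

const vec = { x: 3 };
scale(vec);
// vec.x is now 6 -- the caller's value changed underneath it
```

In GLSL the callee would receive a copy, so a JS-subset shader language would need to forbid or rewrite exactly this kind of aliasing.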
06:14:32  <mikolalysenko>you could probably compile glsl to js though and interpret it there
06:14:39  <jesusabdullah>aha
06:14:48  <mikolalysenko>that might be neat to emulate and debug webgl stuff
06:15:06  <mikolalysenko>would be a lot of work though and would also be really slow
06:15:18  <mikolalysenko>so not clear that it is worth the trouble (and it would be a lot of trouble)
06:26:20  <jesusabdullah>yeeah
06:26:31  <jesusabdullah>doesn't sound justifiable
06:26:35  <jesusabdullah>at least at this point
06:30:01  * ralphtheninja joined
06:45:31  * missinglink quit (Ping timeout: 264 seconds)
06:49:28  <jesusabdullah>yo dawgs browserling.com isn't responding for me
06:49:34  <jesusabdullah>substack: ^^
06:50:03  <jesusabdullah>substack: http://downforeveryoneorjustme.com/browserling.com wah wah wahhh
06:53:32  <substack>ruh roh
06:54:22  <substack>I'm logged into that server so the server is up
06:55:16  <jesusabdullah>WELL ITS NOT RESPONDENEC
06:55:16  <LOUDBOT>DOUCHE BAG SAYS GONNA REMOVE THAT
06:59:08  * dguttman joined
06:59:32  * dguttman quit (Client Quit)
07:04:27  * ins0mnia joined
07:12:22  <mbalho>DADGUMMIT
07:12:33  <mbalho>HEY LOUDBOT SAY SOMETHING
07:12:33  <LOUDBOT>BUT GOATS DON'T HAVE ANY GOLD!
07:12:41  <mbalho>DAD GUMMIT
07:12:42  <LOUDBOT>THATS IT, I'M GONNA DRINK THE WATER FROM THE ANKH!
07:18:50  * vitorpacheco quit (Remote host closed the connection)
07:24:37  * tilgovi quit (Read error: Connection reset by peer)
07:26:43  * cianomaidin quit (Ping timeout: 246 seconds)
07:31:28  * mikolalysenko quit (Ping timeout: 276 seconds)
07:43:13  * djcoin joined
07:45:56  * mikolalysenko joined
07:50:25  * mikolalysenko quit (Ping timeout: 248 seconds)
07:53:21  * djcoin quit (Read error: Operation timed out)
08:05:43  * nicholasf quit (Ping timeout: 260 seconds)
08:07:48  * djcoin joined
08:09:51  <timoxley>ok, so if I pause a tcp stream that happens to be trying to establish a web socket connection
08:10:04  <timoxley>the browser side freaks out
08:10:35  <timoxley>… I think that's what's happening. any ideas?
08:11:15  * mcollina joined
08:24:21  * calvinfo quit (Ping timeout: 240 seconds)
08:25:21  * Altreus quit (Ping timeout: 240 seconds)
08:27:43  * Altreus joined
08:27:47  * Altreus quit (Changing host)
08:27:47  * Altreus joined
08:36:39  * calvinfo joined
08:57:29  * cianomaidin joined
09:02:43  * no9 joined
09:15:58  * dfgg quit (Ping timeout: 245 seconds)
09:17:14  * dfgg joined
09:25:32  * missinglink joined
09:28:23  <timoxley>substack any idea why pausing a stream while using shoe breaks the websocket connection?
09:28:28  <timoxley>"WebSocket is closed before the connection is established. "
09:28:42  <timoxley>example: https://gist.github.com/timoxley/5763982
09:28:49  <timoxley>possible I'm doing something stupid
09:28:54  <timoxley>s/possible/likely
09:31:12  * dominictarr joined
09:31:48  <jesusabdullah>hello mr. tarr
09:33:11  <substack>timoxley: you could pipe the shoe stream to through
09:33:26  <timoxley>as a buffer?
09:33:28  <timoxley>ok
09:33:57  <substack>yep
09:34:44  <substack>var input = through(), output = through(); input.pipe(shoe('/sock')).pipe(output); var sock = duplexer(input, output)
09:38:32  * jcrugzz joined
09:43:33  * jcrugzz quit (Ping timeout: 248 seconds)
09:45:27  <timoxley>substack hrm, I don't think I'm doing it right.
09:45:29  <timoxley>substack https://gist.github.com/timoxley/5763982
09:54:13  * kevino80 joined
10:01:31  * kevino80 quit (Remote host closed the connection)
10:02:25  * kevino80 joined
10:08:11  * creationix joined
10:10:45  * kevino80 quit (Remote host closed the connection)
10:12:02  * nicholasf joined
10:12:26  <timoxley>dominictarr so… I got more of them streams troubles, if you're in the mood
10:12:33  <dominictarr>sure
10:12:58  <timoxley>dominictarr check this https://gist.github.com/timoxley/5763982
10:13:11  <timoxley>lines 11 - 17
10:13:27  <dominictarr>what are you trying to do?
10:13:52  <timoxley>at a high level
10:14:14  <timoxley>I'm trying to "boot services on demand" to keep resource usage low
10:14:47  <timoxley>idea was start a net server on a port… which corresponds to some other service
10:15:16  <timoxley>when a connection comes in, hold it while we spin up the other service, when it's ready, pipe the data through
10:15:41  <timoxley>keep service up so long as connections exist, otherwise shut it down
10:15:45  <dominictarr>okay, cool
10:16:00  <timoxley>now I have a thing, works fine for regular http
10:16:10  <dominictarr>oh right - what node version?
10:16:20  <timoxley>been testing on 0.8.22 and 0.10.5
10:16:36  <timoxley>but trying web sockets and I'm getting "WebSocket is closed before the connection is established. " in chrome
10:16:37  <dominictarr>does it work on 0.10?
10:16:59  <timoxley>I don't believe so, hang on I'll double check
10:17:06  <dominictarr>in 0.8 your pause is just advisory
10:17:14  <dominictarr>but you can pipe to a pause stream
10:17:31  <dominictarr>stream.pipe(pause = through().pause());
10:17:38  <dominictarr>/later...
10:17:55  <dominictarr>pause.pipe(stream2); pause.resume()
10:18:19  <dominictarr>but, in 10, this is meant to 'just work' without even using pause/resume
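The pause-stream trick dominictarr describes can be sketched without the real `through` module; `pausable` below is a hypothetical minimal stand-in that just buffers writes while paused:

```javascript
// Minimal stand-in for a pausable through stream (not the real `through`
// module): hold writes in a buffer while paused, flush them on resume.
function pausable() {
  const buffered = [];
  let paused = false;
  let dest = null;
  const send = (data) => { if (dest) dest.write(data); };
  return {
    write(data) { paused ? buffered.push(data) : send(data); },
    pause() { paused = true; return this; },
    resume() {
      paused = false;
      while (buffered.length) send(buffered.shift());
      return this;
    },
    pipe(stream) { dest = stream; return stream; }
  };
}
```

Usage mirrors the snippet above: `source.pipe(pause = pausable().pause())`, later `pause.pipe(dest); pause.resume()` so nothing written in between is lost.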
10:19:01  <timoxley>I'll try that, hang on
10:20:55  <timoxley>dominictarr same issue
10:21:22  * missinglink quit (Ping timeout: 246 seconds)
10:22:05  <timoxley>seems as soon as there's a pause and a resume in different ticks, shoe's websocket goes boom
10:22:23  <timoxley>even with buffer inbetween
10:22:30  <dominictarr>oh, really?
10:22:42  <dominictarr>okay, that is a bug you should post on shoe
10:23:15  <timoxley>dang
10:23:22  <timoxley>ok
10:23:45  * mcollina quit (Remote host closed the connection)
10:27:37  <timoxley>dominictarr thanks :D
10:31:41  * tim_smart quit (Ping timeout: 240 seconds)
10:33:52  * jibay joined
10:35:43  * mcollina joined
10:37:16  * tim_smart joined
10:37:38  * ins0mnia quit (Ping timeout: 256 seconds)
10:53:10  * dominictarr quit (Quit: dominictarr)
11:03:58  * thl0 joined
11:07:37  * timoxley quit (Quit: Computer has gone to sleep.)
11:19:59  * dominictarr joined
11:26:07  * dominictarr quit (Quit: dominictarr)
11:31:51  <substack>some people just don't get modularity
11:31:59  <substack>http://thanpol.as/javascript/writing-modular-javascript-rewind/
11:32:01  <substack>http://www.reddit.com/r/javascript/comments/1g6qmv/writing_modular_javascript_rewind/
11:42:27  * simcop2387 quit (Excess Flood)
11:45:39  * simcop2387 joined
11:45:41  * dominictarr joined
11:52:11  * cianomaidin quit (Quit: cianomaidin)
11:56:19  * fotoverite quit (Read error: Connection reset by peer)
11:56:41  * fotoverite joined
11:58:14  * timoxley joined
11:58:40  * pikpik quit (Ping timeout: 256 seconds)
11:59:54  * nicholasf quit (Read error: Connection reset by peer)
12:00:06  * nicholasf joined
12:01:13  * nicholasf quit (Read error: Connection reset by peer)
12:01:29  * nicholasf joined
12:17:23  * mikolalysenko joined
12:30:25  * cianomaidin joined
12:34:49  * kevino80 joined
12:39:28  * ednapiranha joined
12:40:37  * ednapiranha quit (Remote host closed the connection)
12:43:39  * ednapiranha joined
12:53:43  * thl0 quit (Remote host closed the connection)
13:01:00  * mikolalysenko quit (Quit: Lost terminal)
13:06:21  * mikolalysenko joined
13:12:27  * fallsemo joined
13:15:39  * dominictarr quit (Quit: dominictarr)
13:15:57  * cianomaidin quit (Quit: cianomaidin)
13:22:48  * cianomaidin joined
13:27:52  * dguttman joined
13:31:15  * no9 quit (Ping timeout: 252 seconds)
13:43:58  * no9 joined
13:44:04  * djcoin quit (Ping timeout: 260 seconds)
13:47:56  * mikolalysenko quit (Ping timeout: 255 seconds)
13:56:03  * pikpik joined
13:56:27  * thl0 joined
14:04:02  * nicholasf quit (Read error: Connection reset by peer)
14:04:27  * nicholasf joined
14:13:59  * mikolalysenko joined
14:14:11  * spion joined
14:18:31  <spion>creationix, https://github.com/spion/genny - uses the `suspend` style of passing resume callbacks but also handles evil functions.
14:18:57  <creationix>oh hah, I guess that would work
14:19:01  * mikolalysenko quit (Ping timeout: 276 seconds)
14:19:09  <creationix>spion: how will I ever decide between the two styles?!?
14:19:41  <spion>hmm.
14:19:51  <spion>less wrapping.
14:19:52  <spion>:D
14:21:00  * nicholasf quit (Read error: Connection reset by peer)
14:21:01  * dominictarr joined
14:21:26  * nicholasf joined
14:25:40  * jcrugzz joined
14:28:06  <creationix>spion: ok, updated run function and added comment https://gist.github.com/creationix/5762837#comment-844278
14:28:20  <creationix>less wrapping yes, but harder to use every time you call it
14:28:31  <creationix>and if the function has optional args before the callback, it gets tricky
14:28:40  <creationix>especially if the optional args are functions
14:30:54  <spion>why does it get tricky? you can use the resume function at any position, just like you would use a named function
14:31:25  * spion doesn't get it
14:35:45  * jibay quit (Remote host closed the connection)
14:39:41  * calvinfo quit (Ping timeout: 255 seconds)
14:40:16  <creationix>spion: the tricky part is when writing the function that accepts optional args
14:40:21  <spion>oh you mean the style as a whole
14:40:24  <creationix>fn(path, options, callback)...
14:40:35  <creationix>I call with fn("mypath", callback)
14:41:07  <creationix>the resume() syntax is extremely flexible, no doubt about that.
14:41:14  <creationix>works with anything that takes a callback somewhere
14:41:28  <spion>yep thats true. but for better or worse node is stuck with that style...
14:41:42  <creationix>node may be stuck with it, but I'm designing a new platform
14:41:46  <creationix>and I can't decide which is better
14:42:00  <creationix>the thunk style seems much nicer *assuming* all your async functions use that style
14:43:02  <spion>what is the language of the platform?
14:43:10  <creationix>javascript
14:43:25  <creationix>perhaps new "platform" is a bit much
14:43:34  <creationix>but just defining common interfaces for use in browser and node
14:43:48  <creationix>I don't want to cargo-cult all node apis and styles to the browser
14:43:55  * mikolalysenko joined
14:43:58  <creationix>there is some legacy stuff in there I don't want to keep
14:44:35  * defunctzombie_zz changed nick to defunctzombie
14:44:53  <spion>I think the browser is kinda heading in the promises direction, at least thats my gut feeling
14:46:20  <creationix>yeah, not too happy about that
14:46:27  <creationix>continuables/thunks are powerful enough imho
14:46:32  <creationix>and *much* simpler
14:46:39  <spion>but I think thunk style is better than promises, because you can create the "task" without starting it
14:48:53  <creationix>spion: do you have an opinion on thunks vs callback-last for my new APIs?
14:49:07  <creationix>in particular, functions that return streams are where I'm having trouble
14:49:31  <creationix>I want the function to return either a stream directly or optionally return a thunk for the stream
14:49:56  <creationix>If I were going the callback route, I would detect for the callback, and if passed in send the stream to that, and if not return the stream directly
14:50:16  <creationix>in thunk mode, I'd need a special thunk: true option for the user to tell me they want it
14:51:20  <spion>why is it necessary to wrap a stream in a thunk?
14:51:49  <creationix>so, for example, I'm creating a read stream out of a file
14:51:56  <creationix>the first step is to open the file and get the fd
14:52:09  <creationix>that is the most likely place to have errors, the file might not exist, or might not be readable
14:52:25  <creationix>if the user of the API isn't sure if the file exists, they will be interested in this result
14:52:54  <creationix>thunk or callback mode would be so I don't give them the stream till I've successfully opened the fd
14:53:09  <creationix>createReadStream(path, function (err, stream) { ... })
14:53:19  * mikolalysenko quit (Ping timeout: 246 seconds)
14:54:16  <creationix>for integrating with yield, I could make stream null in the case the file doesn't exist
14:54:30  <creationix>stream = yield createReadStream(path, {thunk: true})
14:54:34  <creationix>if (stream) ...
14:54:46  <creationix>spion: am I making sense?
14:54:49  <spion>yes.
14:54:54  <spion>and you can't emit the error through the stream because that would be awkward I guess.
14:55:08  <creationix>well, I can and I do in the default case
14:55:18  <creationix>because you usually don't care what the error is
14:55:29  <creationix>maybe I'm trying to read from /dev/input/js0 at a fixed offset
14:55:38  <creationix>that will fs.open just fine, but will fail on the first read
14:56:04  <creationix>(since it's not actually a file, but a special read-only device)
14:56:51  * calvinfo joined
14:56:55  <spion>i meant something like, not wrapping in a thunk at all, and always returning a stream, but emitting an error on the stream if the file can't be opened.
14:57:04  <spion>I don't know how that would play with yield though
14:57:09  * jcrugzz quit (Ping timeout: 248 seconds)
14:57:29  <creationix>the awkward part with returning the stream directly is they won't see the error till all the pieces have been hooked up and there is a sink pulling data through
14:57:43  <creationix>the first event in the stream will be the ENOENT error
14:58:16  <spion>what if there is a 'success' event for the stream?
14:58:25  <creationix>no named events
14:58:31  <creationix>I'm using min-streams
14:58:50  <creationix>besides, that's essentially a callback
14:58:54  <creationix>just weirder
14:58:56  <spion>oh. I need to read that.
14:59:25  <creationix>let me write up a quick gist showing the question I have
14:59:28  <spion>ah right, like multi-call callbacks.
15:01:25  <spion>well why not encode the beginning of the stream too.
15:02:07  <creationix>the stream encoding needs to be raw data, I don't want to add extra stuff on that
15:02:19  * no9 quit (Ping timeout: 264 seconds)
15:02:35  <creationix>you *could* write a filter that read the first event, checked if it was an error and if not, forwarded it on with the rest of the data
15:02:37  <spion>i was referring to https://gist.github.com/creationix/5498108
15:02:42  <creationix>but that's actually quite hard to do properly
15:04:32  <spion>what if calling the callback for the first time is mandatory to indicate that the stream is ready to spew data?
15:05:24  <spion>/* do initialization or error... */ callback() /* i am ready */ callback(null, item); callback(); /* i am done */
15:06:08  <spion>i guess that doesn't make it much easier
15:10:00  * jcrugzz joined
15:11:25  <spion>my idea is for yield to recognize the streams and wait either for the error or for the first callback();
15:11:31  <creationix>spion: here are the two APIs I'm stuck deciding between https://gist.github.com/creationix/5766132
15:14:53  <spion>the first version is easier to implement, but thats because there is a need for an optional options argument either way.
15:15:19  <spion>if most functions don't have an options argument, adding one just for thunk will be awkward.
15:15:23  <creationix>right, I just put the thunk option in the same object as my other options
15:15:35  <creationix>but not all my stream generating functions have options
15:15:40  <creationix>sometimes thunk is the *only* option
15:16:37  <creationix>also I have some functions where I'd like to have optional arguments that happen to be functions
15:16:46  <creationix>that's much trickier with callback-last
15:16:51  <creationix>a function looks like any other function
15:17:08  <creationix>if someone forgets to pass the callback, but does pass in the optional function, I'll get confused
15:17:19  <creationix>in thunk style, it will be very obvious
15:18:14  <spion>as for ease of use, its 'add callback' VS add both option and callback
15:18:55  <Domenic_>creationix: can you explain what using delegating yield gives us? I don't quite see how it accomplishes deep coroutines (at least in the way I usually think about them). Does it improve efficiency?
15:20:12  <creationix>spion: here is the difference when using yield https://gist.github.com/creationix/5766132#file-usage2-js
15:20:58  <creationix>Domenic_: you still need a generator for every frame, but internally V8 can implement a single coroutine and share it down the stack
15:21:55  <creationix>also it means that you don't need to wrap every layer in something like run()
15:22:07  <creationix>only the top layer needs to be wrapped, everything inside that is a bare generator function
15:22:19  <spion>creationix, that looks more or less the same :)
15:22:20  <Domenic_>ah that latter is interesting
15:22:30  <spion>personally I think thunks deserve a chance as a concept.
15:22:48  * sorensen_ joined
15:22:56  * sorensen_ quit (Client Quit)
15:23:24  <spion>the benefits of them being real values seems very significant.
15:23:30  <creationix>spion: if I knew everyone had access to yield, I would just choose thunk and default to always returning one
15:23:50  <creationix>then it would just be
15:23:58  <creationix>stream = yield readFile(path)
15:24:07  <creationix>and if the file didn't exist, stream would be undefined
15:24:13  <creationix>if it was some other error, it would throw an exception
15:24:28  <creationix>but without yield, it's
15:24:39  <creationix>readFile(path)(function (err, stream) { ... })
15:24:53  <creationix>which is nasty for the cases where you don't care about early errors
15:25:05  <spion>or readFile(path, {thunk: false}) ?
15:25:27  <creationix>or that
15:25:58  <creationix>though I think it's more common to not care about early errors any more than late errors
15:26:11  <creationix>most code will be doing a stat check first anyway
15:26:24  <creationix>which will require its own thunk or callback
15:28:36  <spion>if you want to write code to work in browsers, i think you can safely assume that using yield will not be an option for a very long time.
15:28:47  <creationix>right, I can't depend on it
15:28:51  <creationix>but I do want to plan for it
15:28:59  <spion>(on the other hand, there is that ES6 compiler...)
15:29:12  <creationix>that's why I was looking for a compiler that did only generators
15:29:28  <creationix>though if it does all the harmony stuff, I guess that won't hurt
15:29:32  * dguttman quit (Quit: dguttman)
15:29:54  <creationix>I've seen plenty of front-end code written in coffeescript which doesn't run natively
15:29:59  <creationix>why not es6 code
15:30:05  <spion>is traceur good?
15:30:11  <creationix>no idea which is best
15:30:13  <Domenic_>traceur is pretty ok
15:30:24  <Domenic_>traceur is the only option really, all others are kind of bad
15:30:40  <Domenic_>except defs.js is pretty ok too at doing let/const
15:30:57  <creationix>Domenic_: the main feature I want in a transpiler is es6 generators
15:31:04  * tmcw joined
15:31:11  <creationix>though argument destructuring and a for..of would be really nice too
15:31:15  <Domenic_>creationix: traceur has that pretty nicely, although pretty sure it doesn't have yield* yet.
15:31:56  <Domenic_>it has out-of-date (irrefutable) destructuring; they need to get around to implementing the new refutable destructuring semantics
15:32:09  <Domenic_>and for-of is impossible to implement correctly until we get symbols and probably modules
15:32:35  <creationix>well, I'd be fine with just generators
15:32:43  <creationix>that alone changes everything with async code
15:34:38  <spion>creationix, i can't see prevailing arguments for either choice, but I think that the fact that thunks are real values will eliminate much grief that I have right now with node code.
15:35:07  <creationix>right, I've pushed for thunks since back when node had promises
15:35:11  <creationix>years...
15:35:13  <spion>for me its db.query.bind(db, "arguments") vs db.query("arguments")
15:35:55  <spion>and db.query("arguments", function(err, data) { ... }) vs db.query("arguments")(function(err, data) { ... })
15:36:09  <creationix>yep
15:36:36  <creationix>they are like ultra-lightweight promises
15:36:59  <spion>in the second case they're basically the same, the first is better with thunks, so... thunks.
15:37:00  <spion>:)
15:37:20  <creationix>well, creating them is slightly more verbose
15:37:25  <spion>yes, like lightweight promises that are not opinionated about error handling
15:37:37  <creationix>function (path, callback) {} vs function (path) { return function (callback) {}}
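Side by side, the two shapes creationix contrasts look like this (toy `add` functions with synchronous bodies, purely illustrative):

```javascript
// callback-last style: the callback is an extra trailing argument.
function addCb(a, b, callback) {
  callback(null, a + b);
}

// thunk style: the call returns a new function (the thunk) that
// accepts the node-style callback later.
function addThunk(a, b) {
  return function (callback) {
    callback(null, a + b);
  };
}
```

`addThunk(2, 3)` is a plain value you can store, pass around, or `yield` before it runs, which is the property spion and creationix are after.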
15:37:51  <creationix>also it's trivial to turn a thunk into a promise
15:38:03  * tmcw quit (Remote host closed the connection)
15:38:15  <spion>but less verbose than creating promises, I think.
15:39:03  <creationix>Q.promise(thunk) is close
15:39:24  <creationix>except thunks use (callback(err, value)) instead of (resolve, reject, notify)
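The conversion creationix calls trivial really is small; here is a sketch using the standard `Promise` constructor rather than Q's (resolve, reject, notify) resolver signature:

```javascript
// Wrap a thunk -- a function expecting a node-style (err, value)
// callback -- in a standard Promise.
function thunkToPromise(thunk) {
  return new Promise(function (resolve, reject) {
    thunk(function (err, value) {
      if (err) reject(err);
      else resolve(value);
    });
  });
}
```

Going the other way (promise to thunk) is just as short, since a thunk is only a function of one callback.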
15:40:32  <creationix>spion: ok, thanks. I think I'll use thunks over callback-last in my APIs
15:44:35  <creationix>though I'll keep resume() support in my yield runner for easy integration with non-thunk code
15:52:03  * mikolalysenko joined
15:52:41  <creationix>spion: Domenic_, actually traceur does appear to support function*
15:52:49  <creationix>http://traceur-compiler.googlecode.com/git/demo/repl.html#function*%20num(n)%20%7B%0A%20%20yield%20n%3B%0A%20%20yield%20n%20%2B%201%3B%0A%7D%0A%0Afunction*%20fib()%20%7B%0A%20%20yield%201%3B%0A%20%20yield*%20num(2)%3B%0A%7D%0A%0Avar%20gen%20%3D%20fib()%3B%0A%5Bgen.next().value%2C%0A%20gen.next().value%2C%0A%20gen.next().value%5D%0A%0A
15:55:19  <creationix>though the generated code is rather verbose.
15:55:47  <Domenic_>creationix: awesome
15:56:09  <Domenic_>it has source maps too but we found them somewhat problematic
16:03:42  <jez0990>creationix: do you know why it doesn't work for more than the first 3 values? http://traceur-compiler.googlecode.com/git/demo/repl.html#function*%20num(n)%20%7B%0A%20%20yield%20n%3B%0A%20%20yield%20n%20%2B%201%3B%0A%7D%0A%0Afunction*%20fib()%20%7B%0A%20%20yield%201%3B%0A%20%20yield*%20num(2)%3B%0A%7D%0A%0Avar%20gen%20%3D%20fib()%3B%0A%5Bgen.next().value%2C%0A%20gen.next().value%2C%0A%20gen.next().value%2C%0Agen.next().value%2C%0Agen.next()
16:03:42  <jez0990>.value%5D%0A%0A
16:04:45  <creationix>jez0990: because the generator only outputs 3 values
16:04:58  <creationix>1, then 2, then 2 + 1
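The generators from that traceur REPL link, in plain ES6, show why the fourth `next()` comes back done:

```javascript
function* num(n) {
  yield n;
  yield n + 1;
}

function* fib() {
  yield 1;        // first value
  yield* num(2);  // delegates: yields 2, then 3, then num() is exhausted
}

const gen = fib();
const results = [gen.next(), gen.next(), gen.next(), gen.next()];
// values: 1, 2, 3, then { value: undefined, done: true }
```

Once the delegated generator and the outer one are both exhausted, every further `next()` returns `{ value: undefined, done: true }`.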
16:08:27  <jez0990>creationix: right, well I'm pretty confused then... *begins some background reading*
16:09:04  * simcop2387 quit (Excess Flood)
16:09:06  <jez0990>ohhh, no I get it
16:12:53  * jcrugzz quit (Ping timeout: 248 seconds)
16:13:24  <jez0990>I'm wondering if it's possible to do anything sort-of cyclical, this feels a lot like FRP signals
16:17:47  * shama joined
16:20:58  * nicholasf quit (Read error: Connection reset by peer)
16:21:03  * nicholas_ joined
16:21:37  * dguttman joined
16:29:31  * timoxley quit (Quit: Computer has gone to sleep.)
16:30:18  * timoxley joined
16:32:50  * djcoin joined
16:39:28  <creationix>jez0990: not sure, that could be interesting though
16:42:27  * dguttmanquit (Quit: dguttman)
16:45:08  <creationix>alright, published my run function as an npm module https://github.com/creationix/gen-run
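A minimal sketch of the thunk-driving idea behind such a runner (not gen-run's actual source; `run` and `sleep` here are illustrative names): the generator yields thunks — functions taking a node-style callback — and the runner resumes the generator with each result:

```javascript
// run(genFn): drive a generator that yields thunks.
// A thunk is fn(callback) where callback is (err, value).
function run(genFn) {
  var gen = genFn();
  function step(err, value) {
    var next;
    try {
      next = err ? gen.throw(err) : gen.next(value);
    } catch (e) {
      return; // generator exited by throwing; a real runner would surface this
    }
    if (next.done) return;
    next.value(step); // next.value is a thunk; resume when its callback fires
  }
  step();
}

// example thunk: setTimeout wrapped so it can be yielded
function sleep(ms) {
  return function (cb) { setTimeout(function () { cb(null, ms); }, ms); };
}

run(function* () {
  var waited = yield sleep(10);
  console.log('waited', waited, 'ms');
});
```

With this shape, callback-last APIs only need a one-line thunk wrapper to become yieldable.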
16:46:18  * dguttmanjoined
16:53:37  * fallsemoquit (Quit: Leaving.)
16:54:11  * jcrugzzjoined
16:55:11  * fallsemojoined
16:59:09  * brianloveswordsquit (Excess Flood)
16:59:12  <thl0>creationix: people can also use es6ify with browserify to get this running in the browser
16:59:22  <thl0>uses traceur under the hood
16:59:43  <thl0>nice to see all these experiments with generators :)
17:00:05  <thl0>creationix: I'll star it once you add some tests ;)
17:00:58  <creationix>tests. lol
17:01:05  <creationix>well, I did test it all manually
17:01:15  <creationix>but CI isn't going to do much good since no platform has es6
17:01:17  * brianloveswordsjoined
17:01:37  <thl0>granted, but some tests make sense
17:01:46  <thl0>at least I can run them locally if I want
17:01:55  * jxsonjoined
17:02:06  <creationix>sure, I'll add them
17:02:48  <thl0>creationix: cool! -- I even write tests just to make sure nothing breaks while developing, even if no one else can run them: https://github.com/thlorenz/testlingify/blob/master/test/crd.js#L4
17:03:02  <creationix>I'll write some manual node tests then
17:03:16  * jolissjoined
17:03:19  <thl0>nice :)
17:04:47  <chapel>jesusabdullah: ping
17:06:44  <dlmanning>thl0: starting to really appreciate replpad. Nice work!
17:07:15  <thl0>dlmanning: thanks, although there is still room for improvement ;)
17:07:47  <thl0>sometimes the scriptie-talkie functionality (.talk) crashes replpad, but happy to hear it helps you
17:08:03  <dlmanning>yeah. I ran into a few issues trying to use it with ES6
17:09:22  <thl0>dlmanning: not sure if scriptie-talkie works at all with es6
17:09:52  <dlmanning>escodegen doesn't parse generators yet
17:09:57  <dlmanning>So it isn't your fault
17:09:59  <thl0>also since the code gets compacted via esprima and escodegen, those will need to be upgraded in the future
17:10:11  <thl0>I know, for now you can turn off .compact
17:10:17  <thl0>that could solve it (not sure)
17:10:37  <thl0>at least when piping from file to the repl
17:10:42  <dlmanning>yeah
17:10:46  * dguttmanquit (Quit: dguttman)
17:14:35  * AvianFlujoined
17:20:52  * cianomaidinquit (Quit: cianomaidin)
17:22:56  * tmcwjoined
17:24:58  * cianomaidinjoined
17:27:23  * dominictarrquit (Quit: dominictarr)
17:31:12  * no9joined
17:31:19  * shadghostquit (Read error: Operation timed out)
17:31:36  * shadghostjoined
17:32:49  * mikolalysenkoquit (Ping timeout: 248 seconds)
17:33:34  <creationix>thl0: added a test
17:33:42  <creationix>found a typo in the README along the way :)
17:33:48  <creationix>I just made all the samples there execute in order
17:34:27  * mikolalysenkojoined
17:34:45  <thl0>so you basically just inspect the output manually?
17:34:54  <creationix>yep
17:34:56  <thl0>instead of asserting
17:34:57  <thl0>ok
17:35:03  <creationix>though most problems it might have would cause it to throw
17:35:08  <thl0>ah
17:35:09  <creationix>so it's actually a pretty good test
17:35:16  <thl0>cool - got my star ;)
17:35:19  * mikealjoined
17:36:12  <thl0>creationix: I hope people don't start calling me the test nazi ;), but if I can't easily see if a module works, I'm most likely never gonna use it
17:36:36  <creationix>fair enough
17:37:05  <creationix>also it's a great way to test your node install has the right generator stuff
17:37:11  <creationix>even if I know my tiny library is bug-free
17:37:20  <creationix>even latest node 0.11.x will fail on that test
17:37:29  <creationix>you'll get a nasty C stack dump
17:41:48  <thl0>ah, so added benefit there
17:42:26  * timoxleyquit (Quit: Computer has gone to sleep.)
17:42:46  <dlmanning>Is there a version of v8 with yield* yet?
17:42:59  <creationix>dlmanning: yep, get latest node from trunk
17:43:04  <dlmanning>sweet
17:43:06  <creationix>err mster
17:43:07  <thl0>isn't that what node is using --harmony flag?
17:43:09  <creationix>*master
17:43:23  <creationix>and in latest node, you need --harmony_generators
17:43:28  <creationix>--harmony alone doesn't seem to work
17:43:32  <thl0>ok
17:43:53  <creationix>the next 0.11.3 release should have yield*
17:43:56  <creationix>whenever that is
17:44:38  <dlmanning>I'm a little confused about how yield* isn't a coroutine
17:44:45  <creationix>it is
17:44:56  <creationix>except very explicit
17:44:57  * mcollinaquit (Remote host closed the connection)
17:45:05  * jcrugzzquit (Ping timeout: 255 seconds)
17:45:28  <dlmanning>Okay, I'd thought Dave Herman was saying that coros were a no-go for javascript
17:46:10  <dlmanning>Does yield* avoid the problems with state changes?
17:46:38  <creationix>even full hidden coros are no more dangerous than callbacks with regard to state change
17:46:45  <creationix>you're still single threaded
17:47:14  <creationix>but the difference between node-fibers and ES6 yield* is that any point that might suspend you and change state has a visible yield or yield*
17:47:24  <creationix>and any function that might contain them is also tagged as function*
17:47:36  <dlmanning>Gotcha
17:47:40  <creationix>so, in a way, it's deep coroutines, but explicit
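That "deep but explicit" point can be seen in a small sketch: delegation reaches down through `yield*`, yet every place control can leave a function carries a visible `yield` or `yield*`:

```javascript
// inner suspends twice; middle delegates to it, so middle's caller
// sees inner's yields -- but only through the visible yield* marker
function* inner() {
  var a = yield 'first stop';   // suspension point, visible
  var b = yield 'second stop';  // suspension point, visible
  return a + b;                 // return value flows back into yield*
}

function* middle() {
  var sum = yield* inner();     // deep delegation, still explicit
  yield 'sum was ' + sum;
}

var it = middle();
console.log(it.next().value);    // 'first stop'
console.log(it.next(10).value);  // 'second stop'
console.log(it.next(32).value);  // 'sum was 42'
```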
17:47:52  <dlmanning>Thanks for clarifying that
17:48:03  <creationix>I just learned about yield* yesterday
17:48:11  <creationix>I was also under the impression that we wouldn't have coros
17:48:30  <dlmanning>As was I... so I was confused about how yield* was going to be implemented
17:49:20  <creationix>dlmanning: my moment of clarity https://twitter.com/benvie/status/344669253221371906
17:50:42  <dlmanning>hmm...
17:51:00  <dlmanning>Nifty. I'm gonna have to think about this for a little while
17:51:18  <creationix>yeah, it's a neat compromise between shallow generators and deep coroutines
17:51:34  <creationix>overall I'm pretty pleased with the final ES6 generator spec
17:51:59  <dlmanning>I gave a presentation last week in which I told everyone we weren't getting coros
17:52:07  <dlmanning>Looks like I need to send out an addendum
17:52:13  <creationix>me too
17:52:26  <creationix>though technically they are still shallow generators
17:52:32  <creationix>just that they can delegate to each other
17:52:42  <creationix>giving you the benefits of coroutines
17:52:42  <dlmanning>As long as they're safe and can be optimized
17:52:55  <creationix>I sure hope it will be optimized
17:52:58  <creationix>I think it can
17:53:03  <creationix>I haven't formally analyzed it yet though
17:53:35  <dlmanning>mraleph was saying yesterday on twitter that he didn't see any reason why it couldn't be
17:53:53  <creationix>that's encouraging
17:54:34  <creationix>I'm pretty sure using delegate yield will be much faster than layering run() wrapped generators that return continuables
17:54:45  <creationix>and a lot easier to optimize for the engine
17:55:30  <dlmanning>https://twitter.com/mraleph/status/344456148579143681
17:56:19  <creationix>I almost understand what they are talking about :)
17:56:25  <dlmanning>hah
17:56:52  * tmcwquit (Remote host closed the connection)
17:56:53  * dguttmanjoined
18:01:34  * cianomaidinquit (Quit: cianomaidin)
18:07:14  * whit537joined
18:11:54  * fallsemoquit (Quit: Leaving.)
18:12:28  * spionquit (Ping timeout: 246 seconds)
18:14:51  * jcrugzzjoined
18:19:42  * yorickjoined
18:20:30  * mikealquit (Quit: Leaving.)
18:21:25  * fallsemojoined
18:25:58  * kevino80quit (Remote host closed the connection)
18:27:16  * tmcwjoined
18:29:15  * kevino80joined
18:30:55  * defunctzombiechanged nick to defunctzombie_zz
18:31:59  * tmcwquit (Ping timeout: 260 seconds)
18:34:53  * blingcoderjoined
18:44:38  * djcoinquit (Quit: WeeChat 0.4.0)
18:57:57  * tilgovijoined
19:01:49  * no9quit (Ping timeout: 246 seconds)
19:03:02  * st_lukejoined
19:04:54  * jcrugzzquit (Ping timeout: 264 seconds)
19:09:33  <mikolalysenko>check out this webgl shader wrapper I am working on: https://github.com/mikolalysenko/gl-shader
19:09:48  <mikolalysenko>it uses chrisdickinson's glsl parser stuff to get uniforms/attributes automatically
19:12:13  * tmcwjoined
19:13:35  * tmcwquit (Remote host closed the connection)
19:14:01  <chrisdickinson>awesome :D
19:14:38  * no9joined
19:14:45  <chrisdickinson>mikolalysenko: does it support struct/array uniforms?
19:15:52  <mikolalysenko>chrisdickinson: not yet
19:17:48  <mikolalysenko>chrisdickinson: but it is on the list of things to do. this is basically a first usable version
19:20:43  <chrisdickinson>awesome
19:20:51  <chrisdickinson>btw, cssauron-glsl may be of use
19:21:22  <mikolalysenko>chrisdickinson: how does it work?
19:23:17  <chrisdickinson>https://github.com/chrisdickinson/programify/blob/master/index.js#L13
19:23:25  <chrisdickinson>https://github.com/chrisdickinson/programify/blob/master/index.js#L58-L60
19:23:44  <chrisdickinson>basically lets you write css selectors for glsl asts
19:23:57  <mikolalysenko>I see, that's pretty neat
19:24:21  <mikolalysenko>I was thinking that in the shader wrapper I wanted to expose an interface that looks basically like a regular js object
19:24:34  <mikolalysenko>using Object.defineProperty() to handle wrapping stuff
19:24:59  <chrisdickinson>https://github.com/chrisdickinson/programify/blob/master/program.js
19:26:11  <chrisdickinson>^^ this does exactly that
19:26:43  <mikolalysenko>chrisdickinson: yeah, it is pretty cool. but I also wanted to be able to create the shaders directly so that I could do stuff like generate them at run time
19:26:58  <chrisdickinson>mikolalysenko: well, that module could be separated out
19:27:01  <mikolalysenko>which is pretty much the only problem I have with programify
19:27:02  <chrisdickinson>from the package
19:27:11  <mikolalysenko>yeah, that is what the goal of this thing basically is
19:27:28  <mikolalysenko>sort of a smaller version of just the shader compiler that you can use by itself
19:28:20  <mikolalysenko>but without the preprocessing/multiple file include stuff since that won't work so well in a run time environment
19:29:41  <chrisdickinson>program.js doesn't do any of that, luckily :)
19:29:44  <chrisdickinson>but yeah
19:30:56  <mikolalysenko>I think that ultimately there should be a common module or something like that between them
19:31:22  <mikolalysenko>basically I want something that is like program.js from programify, but exposed so that I can use it directly without having to compile my shaders
19:31:52  <mikolalysenko>(or more precisely so that I can generate shaders that get compiled at run time, or even use it with inlined shaders)
19:32:01  * no9quit (Ping timeout: 240 seconds)
19:36:15  * jolissquit (Quit: joliss)
19:36:23  <chrisdickinson>i think you can actually `require('programify/program')
19:36:26  <chrisdickinson>`
19:36:31  <chrisdickinson>without having to compile
19:37:02  <chrisdickinson>but the stripping of uniforms and attrs needs to be abstracted
19:38:29  <mikolalysenko>chrisdickinson: yeah, but it has problems still, like it doesn't check info logs and other errors and it also does a few magic things like "#define VERTEX" / "#define FRAGMENT"
19:38:42  <mikolalysenko>it is also a bit obscure...
19:39:38  <mikolalysenko>I think the program.js part of programify probably belongs in a separate module and needs to be mostly independent
19:42:08  * nicholas_quit (Read error: Connection reset by peer)
19:42:18  <chrisdickinson>yeah
19:42:19  * nicholasfjoined
19:42:21  <chrisdickinson>agreed
19:44:53  * no9joined
19:50:28  * yorickquit (Ping timeout: 246 seconds)
19:52:23  * yorickjoined
20:01:59  <mbalho>anyone have a working solution for sending a stream over XHR? i'm aware of how non-ideal it is
20:04:23  <creationix>mbalho: don't you just buffer the stream into a single string and send that?
20:04:31  <creationix>or is there a better way
20:05:47  <mbalho>creationix: the limitations i'm aware of: you can only .send once per xhr connection, and xhr.response is buffering so for long running connections you have to kill and reconnect to avoid keeping huge buffers around
20:06:11  <mbalho>creationix: so i would imagine that the logic for keeping the stream stateful across reconnects would get hairy
20:06:50  <creationix>yep, that's what socket.io does to simulate websockets over xhr
20:07:05  <mbalho>creationix: it would be nice to have that logic as a standalone thing
20:07:07  <creationix>long-poll for read and multiple requests for write
20:07:22  <creationix>use batching and send every ready message when the bus comes around
20:07:31  <mbalho>yea exactly
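The batching idea creationix describes can be sketched in a few lines; `send(payload, done)` here is a hypothetical stand-in for the actual XHR POST, since XHR only allows one .send per request:

```javascript
// Write-side batching over a one-shot transport like XHR: while a send
// is in flight, queue chunks; when it completes, flush everything that
// accumulated as a single batched send ("when the bus comes around").
function BatchingWriter(send) {
  this.send = send;     // send(payload, done) -- hypothetical transport
  this.queue = [];
  this.inFlight = false;
}

BatchingWriter.prototype.write = function (chunk) {
  this.queue.push(chunk);
  this.flush();
};

BatchingWriter.prototype.flush = function () {
  if (this.inFlight || this.queue.length === 0) return;
  var batch = this.queue.splice(0);   // take everything queued so far
  var self = this;
  this.inFlight = true;
  this.send(batch.join(''), function () {
    self.inFlight = false;
    self.flush();                     // send whatever queued up meanwhile
  });
};
```

Writes made while a request is in flight coalesce into the next single request, which keeps the number of XHRs bounded regardless of write rate.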
20:07:51  <creationix>engine.io is close to standalone for that
20:08:02  <mbalho>mikeal wrote this but it has no docs and is mikealcode(tm) https://github.com/mikeal/pollstream
20:08:32  <creationix>sorry I don't have any libraries for that
20:08:49  <creationix>that could be useful though
20:10:01  <mbalho>creationix: agreed, in the world of stream-abstraction-land it makes a lot of sense to be able to swap out transports like tcp, websockets, xhr, webrtc etc and have code work in node + browser and agnostic of transport
20:10:20  <mbalho>creationix: but in practice the only thing that makes streaming work in a browser right now is websockets so its kind of limiting
20:10:33  <creationix>what's wrong with engine.io?
20:11:02  <mbalho>i should take another look perhaps, i kept trying to use it and it was still under heavy construction
20:11:11  <creationix>is https://github.com/substack/shoe websocket only?
20:11:41  <mbalho>good question
20:11:49  <Domenic_>OK I've been tasked with making private npm work for Lab49. Yesss.
20:12:11  <creationix>Domenic_: fun
20:12:27  <thl0>awesome Domenic_!
20:12:46  <thl0>is it gonna be an actual couch backed registry?
20:13:01  <Domenic_>thl0: probably! will have to see what i can work out.
20:13:21  <Domenic_>thinking of using jden or dominictarr's proxies
20:13:21  <thl0>I wonder why you get to do all the fun stuff around here ;)
20:13:35  <mbalho>maybe someone has wrapped engine.io's api in a stream....
20:13:55  <mbalho>of course raynos did it 6 months ago https://github.com/Raynos/engine.io-stream
20:16:25  <chrisdickinson>mbalho: would be nice to just expose xhrs as a writable stream (from the browser perspective)
20:16:33  <chrisdickinson>then use sse's as the readable side
20:18:35  <mbalho>chrisdickinson: whats the simplest way to have a server keep track of multiple http requests that are all chunks of a stream? a simple sequence number scheme?
20:19:29  <chrisdickinson>yeah, that would work -- sequence with cookie
20:20:29  <mbalho>chrisdickinson: seems like there would be a module for server request aggregation + state, another for the xhr write client, and another for the xhr sse -> readable
20:20:54  <mbalho>chrisdickinson: then maybe a 4th that abstracts the two client side ones into a through()
20:20:55  <chrisdickinson>would you strictly need to keep track of request order with xhrs?
20:21:05  <mbalho>chrisdickinson: nah i guess http does that for you
20:21:17  <chrisdickinson>yeah, you'd just need to have a cookie identifier of some sort
20:21:33  <chrisdickinson>and be able to tie that back to an sse endpoint
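The pairing chrisdickinson describes could be sketched as a session table on the server: the cookie identifier keys an entry holding the client's open SSE response, and each XHR POST looks that entry up. Pure bookkeeping sketch with illustrative names; `sseStream` is anything with a write() method:

```javascript
// Tie a client's XHR POSTs (write half) to its SSE response (read half)
// via the cookie-derived session id.
var sessions = {};

// browser opened its EventSource: remember the response stream
function registerSSE(sessionId, sseStream) {
  sessions[sessionId] = sseStream;
}

// server->client data goes out framed as SSE: "data: <text>\n\n"
function push(sessionId, text) {
  var sse = sessions[sessionId];
  if (!sse) return false;             // client has no live read side
  sse.write('data: ' + text + '\n\n');
  return true;
}

// client->server data arrived on an XHR POST; echo it back down the
// same client's SSE stream just to demonstrate the pairing
function handlePost(sessionId, body) {
  return push(sessionId, 'echo: ' + body);
}

function unregister(sessionId) {
  delete sessions[sessionId];
}
```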
20:21:43  * jcrugzzjoined
20:23:46  * st_lukequit (Remote host closed the connection)
20:24:01  * tmcwjoined
20:25:56  * simcop2387joined
20:26:14  * simcop2387quit (Changing host)
20:26:14  * simcop2387joined
20:28:53  * tmcwquit (Ping timeout: 255 seconds)
20:30:28  * Guest57305joined
20:30:43  * tilgoviquit (Ping timeout: 240 seconds)
20:31:28  <Raynos>mbalho: oh hi
20:31:29  * jjjohnnyquit (Ping timeout: 248 seconds)
20:31:32  * Guest57305changed nick to jjjohnny
20:31:52  <mbalho>Raynos: does engine.io-stream still work? there are no testlings
20:32:01  <mbalho>i guess you cant test it easily on testling
20:32:18  <Raynos>mbalho: https://github.com/Colingo/relay-server/blob/master/handlers/engine.js
20:32:41  <mbalho>sweet
20:32:45  <Raynos>mbalho: https://gist.github.com/Raynos/8fb2fb228f73e45ac8c5
20:32:50  <Raynos>that is how I use engine.io in production
20:33:34  <mbalho>ooh engine.io-client works with websocket-stream
20:33:38  <Raynos>the `require("./engine.io-client")` is just putting the vendored version of engine.io-client into a js file and having it module.export things
20:34:13  <mbalho>Raynos: you can't require('engine.io-client') from npm?
20:34:24  <Raynos>haven't tried it
20:34:28  <Raynos>I dont think it works with browserify
20:34:33  <mbalho>ah
20:34:34  <Raynos>because they are component guys
20:34:41  <Raynos>component. *shakes fist*
20:34:44  <mbalho>word
20:35:25  <Raynos>btw
20:35:25  <Raynos>i wouldnt recommend engine.io
20:35:27  <Raynos>i noticed it sending the same chunk multiple times
20:35:38  <Raynos>both engine.io & sockjs are buggy
20:35:42  <mbalho>bah
20:35:48  <Raynos>if you want something stable do what 3rd eden says
20:36:02  <mbalho>he doesnt say to use engine.io
20:36:02  <mbalho>?
20:36:17  <jjjohnny>Raynos: what is './engine.io-client'
20:36:29  <mbalho>jjjohnny: read above
20:36:33  * ELLIOTTCABLEquit (Read error: Connection reset by peer)
20:36:38  <jjjohnny>oh
20:36:43  * ELLIOTTCABLEjoined
20:37:03  <Raynos>https://gist.github.com/Raynos/9170bfd8f543085353a6
20:37:07  <Raynos>thats my vendored version
20:37:57  * mcollinajoined
20:38:14  <Raynos>https://twitter.com/3rdEden/status/337831025180831747
20:38:40  <Raynos>I would agree with 3rdeden that these projects aren't good enough
20:39:02  <Raynos>I'm interested in using event sourcing for read and XHR post for write
20:39:10  <Raynos>I have a suspicion that it will actually be stable
20:39:16  <mbalho>event sourcing?
20:39:21  <mbalho>oh event source
20:39:22  <Raynos>but of course if you're building stuff for the lulz just use engine.io or sockjs
20:39:27  <Raynos>they work well enough
20:39:42  <mbalho>im not, is the thing
20:39:44  * dominictarrjoined
20:40:07  <jjjohnny>yeah me neither, but I am only building for chrome and mobile safari
20:40:09  <mbalho>i have been hoping someone would build a bunch of small modules that can be assembled into a xhr through stream
20:40:16  <mbalho>jjjohnny: me too
20:42:15  <jjjohnny>maybe sock or engine.io is good enough then? Or are they shitty on the server?
20:42:26  <mbalho>they are both way too big IMO
20:42:51  <mbalho>what i want is: the xhr modules i was discussing with chrisdickinson earlier and websockets, all as small modules that have nice simple tests
20:42:58  <mbalho>you dont need any of the other crap
20:43:03  * chrisdickinsonnods
20:43:21  * st_lukejoined
20:43:33  <jjjohnny>https://github.com/shtylman/chrome-socket/blob/master/socket.js
20:43:34  <chrisdickinson>the problem with sse is that it's a text -- not binary -- stream
20:43:51  <mbalho>chrisdickinson: oh good point
20:44:27  <mbalho>chrisdickinson: could always swap out for a custom binary read stream one
20:44:44  <chrisdickinson>yeah, it's easy to pretend it's binary by transparently base64'ing
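That transparent base64 workaround is just an encode on the send side and a decode on the receive side; in node terms (a browser client would use atob plus a Uint8Array instead of Buffer):

```javascript
// Send side: binary chunk -> base64 text, safe inside an SSE "data:" line
var chunk = Buffer.from([0x00, 0xff, 0x10, 0x80]); // arbitrary bytes
var wire = chunk.toString('base64');               // "AP8QgA=="

// Receive side: base64 text -> the original bytes
var decoded = Buffer.from(wire, 'base64');
console.log(decoded.equals(chunk)); // true -- the roundtrip is lossless
```

The cost is the usual base64 overhead (about 4/3 the byte count on the wire), which is why native binary websockets are preferable when available.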
20:44:50  <mbalho>chrisdickinson: :(
20:45:00  <chrisdickinson>mbalho: other interesting thing is that eventstream is built to multiplex events
20:45:07  <mbalho>chrisdickinson: i guess if youre in a browser that needs binary you can do websockets
20:45:18  <Raynos>jjjohnny: I am using both sock & engine in production
20:45:25  <Raynos>but real time is a component not the entire app
20:45:32  <chrisdickinson>mbalho: did you see http://chimera.labs.oreilly.com/books/1230000000545/index.html ?
20:45:43  <chrisdickinson>it's from ilya grigorik
20:45:46  <mbalho>ah nice
20:45:57  <mbalho>chrisdickinson: did you mean event source?
20:46:09  <chrisdickinson>yeah
20:46:12  <chrisdickinson>sorry
20:46:19  <chrisdickinson>haha, it's all events to all people.
20:46:48  <mikolalysenko>speaking of socket stuff, what is the current best/fastest/smallest web socket library for node?
20:47:10  <chrisdickinson>i've only used shoe and socket.io
20:47:15  <jjjohnny>mbalho: chrome had chrome.socket.create()
20:47:20  <mbalho>mikolalysenko: i use require('ws') and require('websocket-stream') together
20:47:22  <mikolalysenko>last one I used was ws and it was pretty good
20:47:57  <mbalho>mikolalysenko: see https://github.com/maxogden/poncho/blob/master/test-client.js + https://github.com/maxogden/poncho/blob/master/test-server.js
20:48:02  <jjjohnny>this claims to be binary socket http://binaryjs.com/
20:48:55  <mbalho>jjjohnny: websockets do binary natively now that is just a weird abstraction that does mux/demux style stuff on top
20:48:59  * tilgovijoined
20:49:15  <chrisdickinson>mbalho: is there a webrtc datachannel stream?
20:49:40  <mbalho>chrisdickinson: dunno, raynos probs has one, but webrtc reliable hasn't dropped quite yet
20:49:49  <mbalho>chrisdickinson: so its basically UDP right now, TCP coming soon
20:49:54  <Raynos>i have a module called datachannel
20:49:56  <chrisdickinson>ah
20:50:04  <Raynos>i haven't kept up to date with webrtc
20:50:23  <Raynos>https://github.com/Raynos/signal-channel
20:50:28  <Raynos>that is my most recent webrtc work
20:50:53  <st_luke>anyone who knows berlin: are there any places that serve good late coffee into the evening?
20:51:03  <jesusabdullah>chapel: pong
20:52:11  <mikolalysenko>st_luke: st. oberholz is pretty good, don't remember when it closes
20:53:25  <mikolalysenko>st_luke: also you should check out the flea market at mauerpark on saturday if you get a chance
21:01:44  * ednapiranhaquit (Remote host closed the connection)
21:02:40  <mbalho>if i wanted to do a web login over a websocket connection can i send set-cookie headers somehow to the client?
21:03:10  <mikolalysenko>mbalho: maybe use a separate xhr request to set the cookie
21:03:19  <mikolalysenko>as a side channel
21:03:53  <mikolalysenko>or can you set cookies directly in js? (I don't know)
21:04:01  <mbalho>mikolalysenko: i think on the websocket upgrade response you can set things... maybe the websocket could disconnect/reconnect and receive the set-cookie on re-join
21:04:07  <mbalho>mikolalysenko: i was hoping for httponly cookies
21:05:16  <mbalho>mikolalysenko: but i think youre generally right
21:05:26  <mbalho>mikolalysenko: most people auth over http and then open a websocket
21:07:43  * spionjoined
21:24:28  * dominictarrquit (Quit: dominictarr)
21:24:42  * mikolalysenkoquit (Ping timeout: 264 seconds)
21:28:27  * AvianFluquit (Remote host closed the connection)
21:31:23  * mikolalysenkojoined
21:35:52  * no9quit (Ping timeout: 260 seconds)
21:40:22  * jcrugzzquit (Ping timeout: 276 seconds)
21:43:24  * kevino80quit (Remote host closed the connection)
21:47:25  <chapel>jesusabdullah: was an issue, but not with ecstatic
21:47:28  <chapel>odd issue though
21:47:33  <jesusabdullah>aha
21:55:28  * blingcoderquit (Quit: WeeChat 0.4.1)
22:10:51  * shuaibjoined
22:27:57  * AvianFlujoined
22:31:21  <AvianFlu>LOUDBOT: YO DAWG, SHARD MY BRONGOS
22:31:22  <LOUDBOT>AvianFlu: AND IT STARTS DOING THAT AGAIN
22:33:12  <st_luke>MAPS ON MAPS ON MAPS
22:33:12  <LOUDBOT>HOW IS SIGWINCH SENT? HOW TERMINAL GET RESIZE SIGNAL
22:34:25  <st_luke>LOUDBOT: twitlast
22:34:26  <LOUDBOT>http://twitter.com/LOUDBOT/status/344945814734524416 (substack/#stackvm)
22:39:04  * st_lukequit (Remote host closed the connection)
22:48:36  <mbalho>LOUDBOT: I SAW IT ON THE NEWS THIS MROING, IT WAS A SYSADMIN IN AR WHO KILL THEIR PROCESS
22:48:36  <LOUDBOT>mbalho: WHY YES, YES THEY DO JUST LET ANYONE IN HER NOWADAYS
22:49:37  * nicholasfquit (Remote host closed the connection)
22:51:28  <mbalho>chrisdickinson: https://github.com/substack/xhr-write-stream
22:51:39  <mbalho>chrisdickinson: now we just need xhr-read-stream
22:51:57  <chrisdickinson>mbalho: or could we use `sse-stream`?
22:52:30  * fallsemoquit (Quit: Leaving.)
22:52:51  <mbalho>chrisdickinson: i guess the trick would be making a server that associates sessions or something
22:53:05  <mbalho>chrisdickinson: in my use case i would need binary
22:53:21  <mbalho>chrisdickinson: but sse-stream would be a good place to start
22:53:39  <chrisdickinson>mbalho: problem with xhr-read-stream is that you can't trigger data from the server to the client
22:55:04  <mbalho>chrisdickinson: once you have an active read stream open you can
22:55:17  <chrisdickinson>true
22:57:33  * fallsemojoined
22:57:42  * LOUDBOTquit (Ping timeout: 256 seconds)
23:06:38  * thl0quit (Remote host closed the connection)
23:12:11  * no9joined
23:23:03  * yorickquit (Remote host closed the connection)
23:28:42  * mcollinaquit (Remote host closed the connection)
23:44:25  <Raynos>isaacs: did you ever write a caching gzipping bundler style thing that sends stuff
23:46:31  * AvianFluquit (Remote host closed the connection)
23:49:40  <mbalho>substack: dont you have a thing that gives you auth'd streams between two nodes
23:59:07  <isaacs>Raynos: no, what's that?