01:01:20  * DarkGod quit (Ping timeout: 244 seconds)
05:49:29  * Harageth quit (Read error: Connection reset by peer)
06:35:15  * SkyRocknRoll joined
06:45:12  <konobi>huh... wonder if anyone's ever tried hooking up torch to luajit with some of the ML/AI modules.
07:43:07  * rendar joined
07:43:44  * DarkGod joined
10:53:57  * SkyRocknRoll quit (Ping timeout: 244 seconds)
11:07:44  * SkyRocknRoll joined
14:15:14  * SkyRocknRoll quit (Remote host closed the connection)
14:55:54  <creationix>konobi: I thought torch already used luajit
15:36:19  <creationix>hmm, luvit.io is broken again
15:36:41  <creationix>lsof on the process doesn't show much
15:38:01  <creationix>hmm, strace shows that accept4 is getting EAGAIN
15:38:08  <creationix>I wonder if it's out of file descriptors somehow
15:48:51  <creationix>hmm, not leaking uv handles https://luvit.io/handles
15:49:03  <creationix>I added a new API to show all active libuv handles
16:05:44  * Harageth joined
16:41:47  * SinisterRectus quit (Ping timeout: 276 seconds)
16:42:21  * SinisterRectus joined
16:43:38  <creationix>Harageth, yep, it's a memory leak. Run this a few times in a row https://luvit.io/stats
16:43:51  <creationix>I collect garbage before measuring lua heap, so it should be stable
16:44:22  <creationix>not sure how using lots of memory would cause EAGAIN on accept4 though
16:45:51  <creationix>seems to be leaking about 1kb of lua heap per request
16:49:02  <Harageth>yeah that seems weird
16:50:32  <Harageth>Is there maybe a loose reference that it is losing but the gc job still thinks is referenced?
16:51:13  <Harageth>^ could explain EAGAIN and the memory leak... Probably a bit far-fetched of an idea though.
16:58:56  <konobi>creationix: i mean using the ml/ai functionality of torch within luajit
16:59:39  <creationix>konobi, oh so use torch like a library
16:59:51  <creationix>Harageth, well either way, we're leaking
17:00:01  <creationix>I'm trying to use the snapshot library to get some hints as to what is leaking
17:00:53  * creationix topic: http://luvit.io | https://gitter.im/luvit/luvit | IRC Logs at http://logs.nodejs.org/luvit/latest
17:03:16  <creationix>ok, part of the leak is the etag-cache module
17:03:25  <creationix>that's probably causing more harm than good, I'll just remove it
17:03:31  <creationix>still leaking slowly though
17:06:29  * DarkGod quit (Ping timeout: 276 seconds)
17:08:13  <konobi>creationix: yup
17:30:59  <creationix>Harageth, I think part of the leak is the recursive closures used by weblit-app
17:31:07  <creationix>reimplementing that to see if it cleans up any
17:43:49  * DarkGod joined
17:45:54  <creationix>hmm, removing the recursion/closure doesn't help
18:03:21  <Harageth>creationix do logs show when this issue started?
18:03:34  <creationix>not really
18:03:48  <creationix>it's been going on for a while. I did set a daily cron to restart the server, that helped
18:04:06  <creationix>but the site was still down this morning (I guess extra traffic that triggers the problem)
18:04:06  <Harageth>so we don't really know if it is configuration related or code related then?
18:04:13  <creationix>probably code
18:04:28  <creationix>which hasn't changed in a while
18:04:37  <Harageth>right I was looking at commit history
18:04:53  <Harageth>I also had forgotten that you had created that restart cron job a while ago
18:08:27  * erlbot-- quit (Remote host closed the connection)
18:09:14  * erlbot--_ joined
18:14:14  <Harageth>When did you start that cron job? Wondering if it's related.
18:14:35  <Harageth>and it gives a good date to start looking at commits
18:21:31  <creationix>Harageth, so the code that renders the site hasn't really changed since last summer. It's been content updates since then
18:21:38  <creationix>as far as I know it's always been a little unstable
18:21:54  <Harageth>well that makes it more difficult to track down
18:22:11  <creationix>but I have a tool that gives hints as to what exactly is leaking in the heap
18:22:19  <creationix>if you want to join my room, I can show you
19:01:10  * Harageth quit (Remote host closed the connection)
19:13:16  <creationix>serving static files leaks a lot
19:13:52  <creationix>when the browser gets 304 responses from weblit, we don't leak heap objects
19:15:04  <creationix>but forcing a reload from the browser (disabling conditional get requests) leaks a lot
19:16:39  <creationix>Serving a single static file leaks 16 heap objects
19:21:51  <creationix>ahh, it has nothing to do with static file serving, it's the http keepalive
19:22:02  <creationix>when the browser forces a refresh, it also gets a new tcp connection
19:22:20  <creationix>disabling etags (and thus conditional GET requests) doesn't affect leakage
19:22:32  <creationix>sockets are leaking closures!
19:44:32  * rendar quit (Ping timeout: 260 seconds)
19:49:05  <creationix>I think I found it. In coro-channel socket:shutdown(wait()) is being called, but it's returning nil, ENOTCONN (which we were ignoring)
19:49:27  <creationix>but luv never released the GC handle on the generated callback closure which pulls in 11 other heap objects
19:49:42  <creationix>so I think the leak is in luv
19:49:57  <creationix>but only manifests if you try to call shutdown on an unconnected socket
19:50:20  <creationix>rphillips, have you ever seen a case with the agent where shutdown callbacks leak?
19:50:28  <creationix>luvit and agent don't use coro-channel
19:50:44  * rendar joined
19:50:49  <rphillips>hmm. no... but we have a timeout on our socket shutdown
19:56:46  <creationix>well, that won't help
19:56:57  <creationix>it's not the socket that leaks, it's callbacks passed to functions like shutdown
19:57:26  <creationix>luv will ref the callback, but then never unref it in case the uv function errors out
19:57:45  <creationix>but if you never call shutdown on a broken socket, you'll never hit this particular path
20:01:41  * Harageth joined
20:04:09  <rphillips>ahhh
20:06:16  * Harageth quit (Ping timeout: 244 seconds)
20:08:44  <creationix>rphillips: this *should* fix it. Luv tests all pass, trying in my app now https://github.com/luvit/luv/commit/36d26e39238ca43a77040a5a91a975156de38c23
20:09:30  <rphillips>nice!
20:09:35  <creationix>oops, forgot to check fs
20:10:11  <creationix>looks like fs already has the fix
20:10:17  <creationix>I seem to remember that PR a while back
20:14:10  * travis-ci joined
20:14:11  <travis-ci>luvit/luv#255 (master - 36d26e3 : Tim Caswell): The build passed.
20:14:11  <travis-ci>Change view : https://github.com/luvit/luv/compare/97984a874ae8...36d26e39238c
20:14:11  <travis-ci>Build details : https://travis-ci.org/luvit/luv/builds/125932810
20:14:11  * travis-ci part
20:14:17  * rendar quit (Ping timeout: 260 seconds)
20:16:55  <creationix>yep, that helped the heap object leak
20:17:15  <creationix>now my server has a stable number of heap objects in each snapshot
20:18:26  <creationix>time for a luvi release. (I also reverted the lua-openssl bump in luvi that broke lit and luvit's ssl tests)
20:21:36  <rphillips>that might help memory usage for sure
20:21:59  <rphillips>nice find
20:26:11  <creationix>alright, new luv published with memory leak fixed
20:26:44  <creationix>the luvit.io servers have been crashing for months and I finally got tired and decided to find the root cause
20:27:33  <rphillips>creationix: did you get the fs module tweaked as well?
20:27:45  <creationix>it already had the fix
20:29:06  <creationix>wow, since Oct. 1 2014
20:29:18  <rphillips>i'll get the agent updated when you get a luvi release
20:29:36  <creationix>it might not affect the agent, but just in case, it's good to have
20:29:45  <creationix>it affects anything coro-channel based for sure
20:30:50  * Harageth joined
20:31:27  * travis-ci joined
20:31:28  <travis-ci>luvit/luv#256 (1.9.0-3 - 65fdb31 : Tim Caswell): The build passed.
20:31:29  <travis-ci>Change view : https://github.com/luvit/luv/commit/65fdb31b6f0c
20:31:29  <travis-ci>Build details : https://travis-ci.org/luvit/luv/builds/125936931
20:31:29  * travis-ci part
20:31:45  <creationix>rphillips, luvit still fails tls peer certificate. I reverted the lua-openssl bump in luvi thinking that was the cause
20:32:28  <rphillips>hmm
20:32:52  <rphillips>perhaps there is a luvit patch that was applied?
20:33:45  <creationix>fails with my custom luvi (v2.7.0-5-gd2b924f) and the last luvi release (v2.7.0)
20:33:49  <creationix>master on luvit
20:34:17  <creationix>if I downgrade luvi to v2.6.1 the luvit tests pass
20:34:48  <creationix>how do I compare between tags on github?
20:35:04  <rphillips>git diff [tag name]..[tag name 2]
20:35:10  <rphillips>without the brackets
20:35:25  <rphillips>oh on github
20:35:34  <rphillips>don't know
20:36:52  <creationix>found it https://github.com/luvit/luvi/compare/v2.6.1...v2.7.0
20:37:09  <creationix>somewhere between these two versions the tls test in luvit broke
20:37:25  <creationix>I thought it was when I bumped lua-openssl, but reverting it doesn't seem to help
20:38:44  <creationix>hmm, also updated openssl itself
20:39:39  <creationix>let's see if reverting that fixes the problem...
20:40:23  <Harageth>creationix: Figure out the memory leak?
20:40:35  <creationix>Harageth, yep it was in the C code in luv
20:40:42  <Harageth>ahhhhh fun stuff
20:41:19  <creationix>coro-channel was calling socket:shutdown on an already disconnected socket. Luv put the passed in callback on the gc refs, but never released it because the call to shutdown errored out
20:41:48  <creationix>Harageth, for the gory details https://github.com/luvit/luv/commit/36d26e39238ca43a77040a5a91a975156de38c23
20:42:06  <creationix>I published a new luv to luarocks and am now trying to get a stable luvi release out so luvit and lit can use it too
20:43:19  <creationix>rphillips, reverting openssl in the submodule fixes it. Do you want to look into this, or can I just release the known working version for now and worry about updating openssl later?
20:45:46  * erlbot--_ changed nick to erlbot--
20:47:15  * travis-ci joined
20:47:15  <travis-ci>luvit/luvi#891 (master - a3c6e1a : Tim Caswell): The build passed.
20:47:16  <travis-ci>Change view : https://github.com/luvit/luvi/compare/d2b924f4cb46...a3c6e1a6b7c1
20:47:16  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125941705
20:47:16  * travis-ci part
20:52:12  <creationix>new luvi with reverted ssl fixed lit's tests too. I'm cutting a luvi release
20:55:31  <creationix>https://github.com/luvit/luvi/commit/05e051b38706b1fec1475bb2f3c18455b3601a75, will build binaries as soon as *all* ci servers pass
20:55:41  * creationix has been burned too many times building binaries before testing everything
20:58:04  * travis-ci joined
20:58:05  <travis-ci>luvit/luvi#892 (master - 05e051b : Tim Caswell): The build passed.
20:58:05  <travis-ci>Change view : https://github.com/luvit/luvi/compare/a3c6e1a6b7c1...05e051b38706
20:58:05  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125944206
20:58:05  * travis-ci part
20:59:52  * erlbot-- quit (Remote host closed the connection)
21:00:44  * erlbot-- joined
21:01:06  * travis-ci joined
21:01:07  <travis-ci>luvit/luvi#893 (v2.7.1 - 05e051b : Tim Caswell): The build passed.
21:01:07  <travis-ci>Change view : https://github.com/luvit/luvi/compare/v2.7.1
21:01:07  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125944903
21:01:07  * travis-ci part
21:04:47  <rphillips>creationix: 1.0.2e probably isn't a very good version to roll to
21:05:01  <creationix>no, but it's what we're on right now
21:05:13  <creationix>or did the agent use the new 2.7.0 luvi?
21:05:18  <creationix>I never could get luvit or lit on it
21:07:57  <rphillips>agent uses it, yes
21:08:38  <creationix>I guess you just never hit whatever API change breaks the tests
21:13:55  <creationix>ok, I would hold off updating the agent then. I'm unsure if this memory leak even affects you and downgrading openssl is probably bad
21:14:11  <creationix>after this is out, I'll see about updating the ssl version in another luvi release
21:20:54  <rphillips>creationix: I think I have a patch
21:21:03  <creationix>awesome
21:21:07  <rphillips>tests are running on osx
21:21:24  <rphillips>'All tests pass'
21:25:13  <rphillips>creationix: https://github.com/luvit/luvi/pull/151
21:25:47  <rphillips>https://github.com/luvit/openssl/pull/23
21:25:58  <rphillips>regenerates the assembler optimizations
21:26:07  <rphillips>that luvi passes all master luvit tests for me
21:26:27  <creationix>great
21:26:31  * travis-ci joined
21:26:32  <travis-ci>luvit/luvi#894 (fixes/bump_openssl_1_0_2g - a88e772 : Ryan Phillips): The build passed.
21:26:32  <travis-ci>Change view : https://github.com/luvit/luvi/commit/a88e772f7779
21:26:32  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125950807
21:26:32  * travis-ci part
21:26:42  <creationix>so no need to update lua-openssl then?
21:26:56  <rphillips>correct
21:28:15  <rphillips>creationix: just want to make sure the window builder works
21:28:19  <rphillips>windows*
21:28:36  <rphillips>regular-asm worked on windows
21:28:44  <rphillips>last one https://ci.appveyor.com/project/racker-buildbot/luvi/build/1.0.708/job/excch5blt134w9wn
21:29:01  <creationix>tested on your macbook too right?
21:30:07  <rphillips>roger
21:30:23  <rphillips>creationix: wait one
21:30:27  <rphillips>it failed on the 32bit compile
21:43:25  * travis-ci joined
21:43:26  <travis-ci>luvit/luvi#896 (fixes/bump_openssl_1_0_2g - fead911 : Ryan Phillips): The build passed.
21:43:26  <travis-ci>Change view : https://github.com/luvit/luvi/compare/a88e772f7779...fead911d62b6
21:43:26  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125955107
21:43:26  * travis-ci part
21:52:11  * travis-ci joined
21:52:12  <travis-ci>luvit/luvi#898 (fixes/bump_openssl_1_0_2g - 1964644 : Ryan Phillips): The build passed.
21:52:12  <travis-ci>Change view : https://github.com/luvit/luvi/compare/fead911d62b6...1964644bf10d
21:52:12  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125956840
21:52:12  * travis-ci part
22:06:38  * travis-ci joined
22:06:39  <travis-ci>luvit/luvi#900 (fixes/bump_openssl_1_0_2g - ef3fae6 : Ryan Phillips): The build passed.
22:06:39  <travis-ci>Change view : https://github.com/luvit/luvi/compare/1964644bf10d...ef3fae64ac2d
22:06:39  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125959589
22:06:39  * travis-ci part
22:14:00  * travis-ci joined
22:14:01  <travis-ci>luvit/luvi#902 (fixes/bump_openssl_1_0_2g - 65e34b7 : Ryan Phillips): The build passed.
22:14:01  <travis-ci>Change view : https://github.com/luvit/luvi/compare/ef3fae64ac2d...65e34b77f61f
22:14:01  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125961537
22:14:01  * travis-ci part
22:22:33  * travis-ci joined
22:22:34  <travis-ci>luvit/luvi#904 (fixes/bump_openssl_1_0_2g - 7cf7bcd : Ryan Phillips): The build has errored.
22:22:34  <travis-ci>Change view : https://github.com/luvit/luvi/compare/65e34b77f61f...7cf7bcd665f8
22:22:34  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125963555
22:22:34  * travis-ci part
22:29:52  * travis-ci joined
22:29:53  <travis-ci>luvit/luvi#904 (fixes/bump_openssl_1_0_2g - 7cf7bcd : Ryan Phillips): The build passed.
22:29:53  <travis-ci>Change view : https://github.com/luvit/luvi/compare/65e34b77f61f...7cf7bcd665f8
22:29:53  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125963555
22:29:53  * travis-ci part
22:45:01  <creationix>rphillips, any idea what changed with asm on windows?
22:45:09  <creationix>appveyor environment or openssl?
22:45:14  <rphillips>openssl
22:45:21  <rphillips>it will require nasm for the assembler
22:45:40  <rphillips>1.0.2 says that is the only supported assembler
22:45:58  <creationix>btw, luvi 2.7.1 is done, I can start the builds again once you get nasm stable
22:45:58  <rphillips>i'm working through the different architectures
22:46:04  <rphillips>thanks
22:46:05  * travis-ci joined
22:46:06  <travis-ci>luvit/luvi#906 (fixes/bump_openssl_1_0_2g - f21a14c : Ryan Phillips): The build passed.
22:46:06  <travis-ci>Change view : https://github.com/luvit/luvi/compare/7cf7bcd665f8...f21a14c2e06b
22:46:06  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125969440
22:46:06  * travis-ci part
22:52:46  * travis-ci joined
22:52:47  <travis-ci>luvit/luvi#909 (fixes/bump_openssl_1_0_2g - 5dd3123 : Ryan Phillips): The build has errored.
22:52:47  <travis-ci>Change view : https://github.com/luvit/luvi/compare/f21a14c2e06b...5dd3123136d8
22:52:47  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125970962
22:52:47  * travis-ci part
22:55:56  * travis-ci joined
22:55:57  <travis-ci>luvit/luvi#910 (fixes/bump_openssl_1_0_2g - 5cafc69 : Ryan Phillips): The build passed.
22:55:57  <travis-ci>Change view : https://github.com/luvit/luvi/compare/5dd3123136d8...5cafc69c81f4
22:55:57  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125971201
22:55:57  * travis-ci part
23:05:45  <creationix>dinner time
23:05:51  <creationix>updated lit and luvit published
23:06:36  * travis-ci joined
23:06:36  <travis-ci>luvit/luvit#2784 (master - 7520d33 : Tim Caswell): The build passed.
23:06:37  <travis-ci>Change view : https://github.com/luvit/luvit/compare/f5f9be0cced9...7520d33fe88d
23:06:37  <travis-ci>Build details : https://travis-ci.org/luvit/luvit/builds/125973071
23:06:37  * travis-ci part
23:07:05  <creationix>woohoo, memory leak fixed in luvit.io! https://luvit.io/stats https://luvit.io/snapshots
23:07:16  <creationix>and I assume in lit as well, we'll see what sam's charts say
23:07:29  * travis-ci joined
23:07:30  <travis-ci>luvit/luvit#2785 (2.10.0 - 7520d33 : Tim Caswell): The build failed.
23:07:30  <travis-ci>Change view : https://github.com/luvit/luvit/compare/2.10.0
23:07:30  <travis-ci>Build details : https://travis-ci.org/luvit/luvit/builds/125973253
23:07:30  * travis-ci part
23:08:56  * travis-ci joined
23:08:57  <travis-ci>luvit/luvi#912 (fixes/bump_openssl_1_0_2g - 5670358 : Ryan Phillips): The build passed.
23:08:58  <travis-ci>Change view : https://github.com/luvit/luvi/compare/5cafc69c81f4...5670358c9d0a
23:08:58  <travis-ci>Build details : https://travis-ci.org/luvit/luvi/builds/125973332
23:08:58  * travis-ci part