00:00:08  * ircretaryjoined
00:08:30  <jesusabdullah>substack: how u CI lol
00:11:03  <substack>http://github.com/substack/cicada
00:11:18  <substack>testling-ci runs on this lib
00:11:40  <substack>with https://github.com/substack/github-post-receive set up to catch github web hooks and `git push` them at cicada
00:15:00  <jesusabdullah>neat
00:28:37  <jesusabdullah>I think I want parts of that but not the whole enchilada so to speak
00:28:53  <jesusabdullah>but I bet I can reuse a lot of the deps here
00:33:29  <substack>rvagg: maybe if I create a ren faire in my backyard steve foxx will show up out of nowhere to knock everything down
00:33:32  <substack>I can only hope.
00:35:24  * mikolalysenkojoined
00:46:00  * timoxleyquit (Quit: Computer has gone to sleep.)
00:48:17  * nicholasfjoined
00:49:00  * dominictarrquit (Quit: dominictarr)
00:50:41  * no9quit (Ping timeout: 256 seconds)
00:51:35  * jxsonquit (Remote host closed the connection)
00:52:21  * nicholasfquit (Ping timeout: 248 seconds)
01:06:02  <rvagg>as long as mimes don't show up too
01:08:38  * no9joined
01:13:13  * timoxleyjoined
01:16:18  * ralphtheninjaquit (Ping timeout: 264 seconds)
01:20:26  * jxsonjoined
01:26:41  * mikealquit (Quit: Leaving.)
01:27:22  * jxson_joined
01:27:58  * jxsonquit (Read error: Connection reset by peer)
01:31:43  * no9quit (Ping timeout: 264 seconds)
01:33:31  * jolissquit (Quit: joliss)
01:34:01  * jolissjoined
01:34:01  * jolissquit (Client Quit)
01:35:32  <jesusabdullah>Yeah I tweeted laughter at substack what of it
01:41:53  * kirbysayshijoined
01:57:00  * mikealjoined
02:19:05  * pikpikquit (Changing host)
02:19:05  * pikpikjoined
02:19:05  * pikpikquit (Changing host)
02:19:05  * pikpikjoined
02:25:25  * jolissjoined
02:27:49  * jolissquit (Client Quit)
02:29:05  * jolissjoined
02:30:35  * jxsonjoined
02:30:41  * jxson_quit (Ping timeout: 240 seconds)
02:31:07  * jxsonquit (Remote host closed the connection)
02:39:00  * vitorpachecoquit (Quit: Saindo)
02:44:56  <substack>mikeal: I have no routine at all and go out of my way not to have a routine
02:48:31  <substack>I wake up some time between 09:00 and 17:00
02:48:47  <substack>I might eat something right away or go several hours
02:56:20  <jcrugzz>substack: routine is dangerous
02:56:28  <jesusabdullah>routine is also healthy
02:56:49  <jesusabdullah>you should change up your routine definitely but having *some* routine is good
02:58:11  <jcrugzz>yea, makes the default things easier but yea too many people get caught in perpetual cycles
02:58:14  <jcrugzz>its terrible
02:58:51  * shamaquit (Remote host closed the connection)
02:58:51  <substack>I just get really sleepy all the time when I have a routine
02:59:30  <jesusabdullah>well it's not defaults necessarily that's good about it
03:01:01  <jcrugzz>yea defaults is ambiguous ha but i mean eating at certain intervals is usually positive
03:01:45  <jesusabdullah>right
03:03:48  * timoxleyquit (Ping timeout: 260 seconds)
03:08:06  * jjjjohnn1yquit (Ping timeout: 256 seconds)
03:16:35  * wolfeida_quit (Remote host closed the connection)
03:17:07  * wolfeidaujoined
03:19:59  * tilgoviquit (Ping timeout: 246 seconds)
03:25:02  * kirbysayshiquit (Quit: Leaving...)
03:26:03  * jxsonjoined
03:36:55  * kirbysayshijoined
03:38:55  * jdenjoined
03:38:55  * jlord_joined
03:38:59  * xyxnejoined
03:39:12  * py1hon_joined
03:39:19  * jdenchanged nick to Guest28371
03:39:41  * thl0quit (Remote host closed the connection)
03:41:39  * jlordquit (Ping timeout: 276 seconds)
03:41:39  * jden_quit (Remote host closed the connection)
03:41:40  * xyxne_quit (Remote host closed the connection)
03:41:40  * py1honquit (Remote host closed the connection)
03:46:23  * defunctzombie_zzchanged nick to defunctzombie
03:48:36  * defunctzombiequit (Changing host)
03:48:36  * defunctzombiejoined
03:52:47  <jesusabdullah>brb restarting serverbox D:
03:52:52  * jesusabdullahquit (Remote host closed the connection)
03:52:53  * _ddgbotquit (Read error: Connection reset by peer)
03:54:39  * jxsonquit (Remote host closed the connection)
03:57:49  * kirbysayshiquit (Quit: Leaving...)
03:58:24  * jesusabdullahjoined
04:00:39  * jxsonjoined
04:07:24  * _ddgbotjoined
04:07:26  * jxsonquit (Read error: Connection reset by peer)
04:07:37  * jxsonjoined
04:08:12  * mikealquit (Read error: Connection reset by peer)
04:09:30  * jxson_joined
04:10:08  * jxsonquit (Read error: Connection reset by peer)
04:10:09  * jxson_quit (Read error: Connection reset by peer)
04:10:20  * jxsonjoined
04:11:20  * jxsonquit (Read error: Connection reset by peer)
04:11:33  <mikolalysenko>I've got this itch to implement partial evaluation in js for js
04:12:09  * kirbysayshijoined
04:12:28  * jxsonjoined
04:13:07  * mikealjoined
04:13:35  * jxson_joined
04:13:49  * jxsonquit (Read error: Connection reset by peer)
04:14:49  * jxsonjoined
04:18:13  * jxson_quit (Ping timeout: 248 seconds)
04:19:11  * jxsonquit (Ping timeout: 240 seconds)
04:20:24  * defunctzombiechanged nick to defunctzombie_zz
04:25:50  * defunctzombie_zzchanged nick to defunctzombie
04:29:07  * defunctzombiechanged nick to defunctzombie_zz
04:48:20  * tilgovijoined
04:49:18  * nicholasfjoined
04:53:25  * nicholasfquit (Ping timeout: 248 seconds)
05:03:40  * tilgoviquit (Read error: Connection reset by peer)
05:29:03  * jxsonjoined
05:31:18  * fallsemoquit (Quit: Leaving.)
05:33:33  * jxsonquit (Ping timeout: 245 seconds)
05:35:46  * AvianFlu_quit (Remote host closed the connection)
05:37:10  * jlord_changed nick to jlord
05:51:43  * douglaslassancejoined
05:55:58  <jesusabdullah>sup jlord
05:58:07  * mikolalysenkoquit (Ping timeout: 264 seconds)
06:03:23  * timoxleyjoined
06:20:17  <substack>new quick start guide: http://browserling.com:9005/guide/quick_start
06:20:34  <substack>7 more sections to go
06:21:06  <emilisto>substack: nice :)
06:21:50  <emilisto>btw, do you know if there's something like hyperstream, but that feeds the stream into an object?
06:22:09  <emilisto>otherwise I'm writing a module like it
06:22:12  * jcrugzzquit (Ping timeout: 256 seconds)
06:30:21  * no9joined
06:30:42  <substack>emilisto: for json?
06:30:53  <substack>https://github.com/substack/node-gutter
06:32:19  <jesusabdullah>substack: I just deployed a wordpress site over ftp what is the world coming to
06:32:49  <emilisto>substack: sorta, but to populate a javascript object, and then run a method with the object as an argument once all its stream-values have gotten their data
06:33:16  <emilisto>basically a generalization of hyperstream, where instead of doing trumpet.select(), one can supply an arbitrary method to call with the stream data
06:33:48  * kirbysayshiquit (Quit: Leaving...)
06:36:34  <substack>emilisto: I'm not sure that maps well to what hyperstream does
06:36:48  <substack>hyperstream emits data as it receives it as text
06:36:49  <emilisto>hmm, okay
06:37:00  <substack>it doesn't buffer up the whole result into an object
06:37:17  <emilisto>https://gist.github.com/emilisto/c43a39cfbfc88645e502
06:37:20  <substack>what problem does your approach solve?
06:37:26  <emilisto>I'm thinking of using it like that
06:37:51  <emilisto>basically being able to feed streams into a template
06:39:11  <substack>ok so that is actually the same idea then
06:40:16  <emilisto>but I see how the buffer-it-all-up breaks the stream utility a bit
06:41:36  <emilisto>using gutter, maxogden's concat() and JSON.parse() would actually do it
06:42:02  * jibayjoined
06:42:08  <substack>you wouldn't need gutter or JSON.parse()
06:42:37  <jesusabdullah>looks good substizzy
06:42:48  <jesusabdullah>haha fuckin' wordpress man
06:43:01  <jesusabdullah>so much time has to be spent on figuring out how not to fuck up the database
06:44:08  <emilisto>substack: ah, cool, didn't know concat worked that way, thanks for the hint :)
06:59:43  * jjjohnnyjoined
06:59:46  <substack>emilisto: https://gist.github.com/substack/5712095
07:00:34  <substack>the trickiest part is getting the parameter indexes
07:00:39  <substack>the streaming stuff is pretty simple
07:03:51  * mikolalysenkojoined
07:06:28  <jesusabdullah>ugh, "runNpmTasks" makes me want to cry
07:06:35  <jesusabdullah>am I crazy? Something isn't right with grunt's api
07:06:49  <substack>haha
07:06:56  <substack>never used grunt myself
07:07:08  <substack>I just stick to package.json "scripts" and `npm run`
07:07:25  <jesusabdullah>yeah
07:07:30  <jesusabdullah>see what I'm thinking is
07:07:36  <substack>or a shell script or something simple like that
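[editor's note] The `npm run` approach substack describes needs nothing beyond a "scripts" field in package.json; the script names and commands below are illustrative, not from any project mentioned here.

```json
{
  "scripts": {
    "test": "node test/all.js",
    "build": "browserify main.js -o bundle.js"
  }
}
```

Then `npm run build` or `npm test` invokes the command with `node_modules/.bin` on the PATH, which is most of what a task runner provides.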
07:07:38  <jesusabdullah>if I do any more of these wordpress sites with Joe
07:07:41  <jesusabdullah>he runs windows
07:07:47  <jesusabdullah>uses dreamweaver
07:07:51  <jesusabdullah>and deploys via ftp
07:07:56  <jesusabdullah>php is new for him
07:08:22  <jesusabdullah>so like, I'm trying to think of the best possible way to introduce github and real tooling without overwhelming him
07:08:28  <substack>why doesn't he just use a hosted service instead?
07:08:36  * mikolalysenkoquit (Ping timeout: 240 seconds)
07:08:36  <jesusabdullah>hosted service?
07:08:43  <jesusabdullah>you mean like a php paas?
07:08:49  <jesusabdullah>I was actually just scoping out openshift
07:08:50  <substack>like blogspot or wordpress.com hosted installs I thought?
07:09:05  <jesusabdullah>wordpress does yeah, but it's a locked down one
07:09:15  <jesusabdullah>this is a CUSTOM PHP SITE that JUST HAPPENS to run WORDPRESS
07:09:18  <jesusabdullah>hehe
07:09:22  <jesusabdullah>anyways
07:09:24  <emilisto>substack: thanks man! it's the pause and resume parts of the streams that I gotta learn better, so that's really helpful
07:09:32  <jesusabdullah>yeah, so
07:09:53  <jesusabdullah>Maybe what I really want for deployment is a php PaaS
07:10:00  <jesusabdullah>I bet someone out there's doing it reasonably well
07:10:06  <jesusabdullah>openshift maybe, I'll have to look
07:10:11  <jesusabdullah>but either way I need a make replacement
07:10:34  <jesusabdullah>you know what? I bet I actually "need" very little
07:11:36  <substack>jesusabdullah: maybe you need to not have the problem in the first place?
07:11:44  <emilisto>jesusabdullah: I was faced with the same dilemma the other week, trying to make life a little easier for someone using php as his tool of trade
07:12:08  <emilisto>I think we discovered that google app engine supports php nowadays
07:12:23  <emilisto>https://developers.google.com/appengine/docs/php/
07:12:25  * kesslerquit (Ping timeout: 250 seconds)
07:12:28  <jesusabdullah>substack: See the cool part is I get to hang out with a friend, teach him shit and get some money on the side
07:12:31  <jesusabdullah>cool emilisto
07:13:11  <jesusabdullah>substack: I gotta get him beefed up quite a bit before he's ready to start thinking in terms of a programmer, he's a web designer just doing wordpress stuff to expand his horizons a little
07:13:25  <jesusabdullah>substack: anyways hacking on mini-build-thing
07:13:44  <jjjohnny>jesusabdullah: https://page.ly/
07:15:38  <jesusabdullah>jjjohnny: looks handy
07:15:46  <jjjohnny>hosting WP sites is a pain
07:15:51  <jesusabdullah>jjjohnny: you can still push your own files, resync the db itself, etc?
07:16:42  <gildean>jesusabdullah: there should be at least a few paas providers that you can use to run more custom php sites, like appfog for example
07:16:43  <jjjohnny>i dont know
07:17:06  <jjjohnny>Nowadays I thought you could do pretty much everything from within the WP CMS
07:17:39  <substack>I blog in markdown.
07:17:50  <jesusabdullah>gildean: yeah that was my thinking
07:18:07  <jesusabdullah>yeah and jjjohnny that seems like a reasonable approach as well, it just weirds me out
07:19:09  <jesusabdullah>I hate the idea of having to physically open something and do clicky-click to update my site XD
07:20:20  <jjjohnny>yeah
07:20:43  <jjjohnny>i made it through freelance web development cutting that BS and I aint going back
07:21:23  <jjjohnny>using wordpress is not usually the right choice for regular freelance gigs in my exp
07:22:16  <jjjohnny>now there are many decent hosted site services more tuned to specific needs, can save you lots of time
07:22:40  <jjjohnny>we'd give people wordpress and they would never touch it
07:22:46  <jjjohnny>people clients
07:23:10  <jjjohnny>eventually I switched to rolling CMSs per client
07:23:36  <jjjohnny>then I bailed on that and moved to the place outside of oz where the shorties dance on yellow brick roads
07:23:46  <jesusabdullah>haha
07:25:41  * nicholasfjoined
07:26:46  <jjjohnny>my last "client" was my first node.js site to go live, a custom CMS with a tiny admin panel that is hidden but not secured behind any auth :\
07:27:02  <jesusabdullah>so what do you do now jjjohnny?
07:27:30  <jjjohnny>magically I landed an ebay contract as soon as I got to Oakland
07:27:55  * douglaslassancequit (Read error: Connection reset by peer)
07:28:12  <jesusabdullah>wait, fuck this, I'll use node-tap
07:28:12  <jjjohnny>that ended
07:28:16  <jesusabdullah>this is gonna be awesome!!
07:28:51  <jesusabdullah>oh, no, that won't work cause, derp
07:29:09  <jjjohnny>CHIRPY DERPIES
07:29:09  <LOUDBOT>THAT DOESN'T EVEN MAKE SENSE
07:30:08  * douglaslassancejoined
07:31:31  <jesusabdullah>I got this
07:46:35  <jesusabdullah>fuck
07:46:44  <jesusabdullah>of COURSE the hardest part by far is that you want to basically run bash commands
07:51:34  <jesusabdullah>okay, new plan
07:51:45  <jesusabdullah>set up my own windows dev environment, work at it from that angle
07:51:57  <jesusabdullah>I'll eventually be able to make it a one-click install XD
08:03:37  * no9quit (Ping timeout: 240 seconds)
08:05:18  * nicholasfquit (Read error: Connection reset by peer)
08:05:47  * nicholasfjoined
08:13:25  <guybrush_>anyone also ran into this error with watchify? -- Error: module "./buffer_ieee754" not found from "/usr/local/lib/node_modules/watchify/node_modules/browserify/node_modules/insert-module-globals/buffer.js"
08:16:42  <substack>woot https://gist.github.com/substack/5712384
08:17:00  <substack>I can totes build an ansi fft oscilloscope
08:17:13  <guybrush_>haha sweet
08:17:33  <substack>next stop, feeding baudio output into this
08:17:38  <substack>then real audio files
08:21:56  <jjjohnny>yay
08:22:41  <substack>mikola's thing is so much nicer than that other noise on npm
08:24:58  * jibayquit (Quit: Leaving)
08:25:20  <jjjohnny>yes i am a fan
08:27:01  <jjjohnny>substack: the data is in the imaginary numbers array?
08:29:57  <substack>yes it seems so
08:30:15  <substack>since I tried a thing and that's where the amplitude data was
08:33:02  * no9joined
08:33:20  <emilisto>I think that's because the signal's real with amplitude 1, i.e. a coefficient 1 before all the sines - so you get a purely imaginary fft
08:37:46  <substack>I just want the frequency decomposition
08:38:51  <emilisto>right
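[editor's note] emilisto's point above can be checked numerically: the DFT of a pure sine has its energy in the imaginary part, spiking at the signal's frequency bin. This is a naive O(N²) DFT written for illustration, not mikolalysenko's fft module.

```javascript
// Naive discrete Fourier transform: returns real and imaginary parts.
function dft(signal) {
  var N = signal.length, re = [], im = [];
  for (var k = 0; k < N; k++) {
    re[k] = 0; im[k] = 0;
    for (var n = 0; n < N; n++) {
      var t = -2 * Math.PI * k * n / N;
      re[k] += signal[n] * Math.cos(t);
      im[k] += signal[n] * Math.sin(t);
    }
  }
  return { re: re, im: im };
}
```

For sin(2πn/N) the real parts come out (numerically) zero and im[1] comes out −N/2, which is why the amplitude data showed up in the imaginary array.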
08:47:41  * douglaslassancequit (Quit: Leaving...)
08:50:40  * jolissquit (Quit: joliss)
08:55:42  <emilisto>https://gist.github.com/emilisto/5712587
08:55:55  <emilisto>fun to see the low frequency disappear when you increase the no. of floats
08:56:57  <substack>fancy
08:57:14  <substack>I'm writing a thing so I can convert s16 data from baudio into float64 typed arrays
08:57:59  <emilisto>what's s16 data? :)
08:58:19  <substack>signed 16 bit
08:58:21  <emilisto>ah
08:58:23  <substack>little endian usually
08:58:28  <substack>man soxformat
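[editor's note] A minimal sketch of the conversion substack mentions — signed 16-bit little-endian PCM into a Float64Array — using only Node's Buffer API. This is not his actual module, just the standard normalization.

```javascript
// Convert a Buffer of s16le samples to Float64Array values in [-1, 1).
function s16leToFloat64(buf) {
  var out = new Float64Array(buf.length / 2);
  for (var i = 0; i < out.length; i++) {
    // Dividing by 32768 maps the full signed 16-bit range onto [-1, 1).
    out[i] = buf.readInt16LE(i * 2) / 32768;
  }
  return out;
}
```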
09:00:58  <emilisto>cool, gotta play around with baudio
09:03:37  <emilisto>have you tried it in the browser with the audio API?
09:04:09  * stagasjoined
09:08:48  <jesusabdullah>ahahaha
09:08:57  <jesusabdullah>I just wrote a gruntfile in coffeescript because I'm an asshole
09:10:03  <jesusabdullah>substack: https://gist.github.com/jesusabdullah/9135dde2b4581989e4a4
09:10:06  <jesusabdullah>oohoohoohoo
09:13:29  <substack>emilisto: the webaudio module on npm is a drop-in replacement for baudio
12:21:14  * yorickjoined
12:23:45  * thl0started writing github-all-pages
12:24:05  <substack>pagesize?
12:24:19  <thl0>max for github is 100
12:24:41  <thl0>so if I wanna get all your repos for instance I gotta get multiple pages
12:25:06  <thl0>substack: http://developer.github.com/v3/#pagination
12:25:48  <thl0>really annoying actually - requires multiple requests, which will each reduce the rate limit for the hour
12:25:59  <thl0>so mining github for information is quite a challenge :)
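[editor's note] The paging loop thl0 describes can be sketched like this: keep requesting pages until one comes back short of `per_page` (GitHub's v3 max is 100). `fetchPage(page, cb)` is an assumed callback — in real use it would hit e.g. `/users/:user/repos?per_page=100&page=N` and mind the rate limit.

```javascript
// Accumulate all pages; a page shorter than perPage signals the last one.
function getAllPages(fetchPage, perPage, cb) {
  var all = [];
  (function next(page) {
    fetchPage(page, function (err, items) {
      if (err) return cb(err);
      all = all.concat(items);
      if (items.length < perPage) return cb(null, all);
      next(page + 1);
    });
  })(1);
}
```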
12:30:46  * brianloveswordsquit (*.net *.split)
12:30:47  * dfggquit (*.net *.split)
12:30:48  * zuquit (*.net *.split)
12:33:46  * jibayjoined
12:35:08  * dfggjoined
12:35:08  * zujoined
12:36:41  * brianloveswordsjoined
12:40:04  * kevino80joined
12:43:09  * thl0quit (Remote host closed the connection)
12:43:36  <substack>https://github.com/substack/sillyscope
12:43:49  * thl0joined
12:43:59  <dominictarr>nice!
12:44:28  <dominictarr>thl0: so, I have this crazy idea for open data...
12:44:37  <dominictarr>lets call it "data liberation"
12:44:50  <dominictarr>basically, combine scraping, with peer-to-peer replication
12:45:45  <dominictarr>so, lets say, currently N people have built apps that must scrape, say, hackernews, and are limited to a rate R requests per hours
12:46:13  <dominictarr>so, if they each scrape independently then that is N*R requests to HN
12:47:09  <dominictarr>but, if they cooperate, and then replicate the data between themselves, that is R/N requests
12:48:01  * thl0quit (Ping timeout: 246 seconds)
12:48:19  <dominictarr>they will be able to stay way more up to date, AND it will not put undue strain on HN's servers
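[editor's note] Back-of-the-envelope numbers for the argument above, with illustrative values for N and R (they are not from the chat): N scrapers hitting the site independently cost N·R requests per hour, while one shared scrape replicated peer-to-peer costs R total, i.e. R/N per peer.

```javascript
var N = 10;               // cooperating scrapers (illustrative)
var R = 60;               // rate limit: requests per hour each (illustrative)
var independent = N * R;  // everyone scrapes alone: 600 req/h hit the site
var cooperative = R;      // one shared scrape, replicated: R total
var perPeer = R / N;      // each peer's share of the scraping work
```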
12:53:41  * rvaggquit (Quit: ta ta)
13:04:08  * paul_irishquit (Ping timeout: 246 seconds)
13:05:02  <dools>dominictarr: in effect what it would be is an open cache of urls
13:05:34  * paul_irishjoined
13:05:35  <dools>dominictarr: so in instead of doing file_get_contents($url); i would do file_get_contents('http://opencache.org/?url='.$url);
13:06:30  <dominictarr>yes, except that opencache.org could also be a service you have running locally
13:06:38  <dominictarr>if you are contributing to the scraping
13:07:18  <dools>right ... p2p caching. pretty interesting idea. in fact, you could literally just use bittorrent couldn't you?
13:07:31  <dools>a "url" is just a file right? it's some data
13:08:12  <dominictarr>hmm, possibly
13:08:22  <dominictarr>although, that may not be ideal
13:08:48  <dominictarr>since generally, most bits of data (such as, a single comment) are fairly small.
13:09:11  * Madarsquit (Ping timeout: 256 seconds)
13:09:16  * st_lukejoined
13:09:22  <dominictarr>bittorrent is designed for large files
13:09:25  * Madarsjoined
13:09:36  <dools>yeah even a typical web page is small by bt standards
13:10:23  <dools>are you imagining, though, that say, i scrape HN and parse out a bunch of comments, that those *comments* become artefacts in the network? i would think a far simpler concept, and implementation (and in fact far more useful) would just be to provide entire pages, cached as of X minutes ago in a p2p network
13:10:31  <st_luke>substack: yes, it's pretty neat
13:11:03  <dools>in a sense, just allowing me to anonymously share my browser cache, but making it programatically available so that if i'm scraping a bunch of sites the data i download is shared with other scrapers
13:11:42  * AvianFlujoined
13:11:59  * nicholasfquit (Remote host closed the connection)
13:16:47  * emilistojoined
13:17:34  <dominictarr>dools: interesting point.
13:18:30  <dominictarr>why would just having the raw pages be more useful?
13:18:52  <dools>i'm just thinking of the way i scrape data normally ...
13:19:13  <dominictarr>sure, but what you really want is the data, not the html
13:19:15  * cubertquit (Ping timeout: 256 seconds)
13:20:11  <dools>dominictarr: like i'll go "okay i want to scrape X, so $data = file_get_contents($x); then foreach(parseLinks($data) as $link) scrapeSomehow($link); and then scrapeSomehow gets the thing i'm after (restaurant review text, movie ratings or whatever)
13:20:24  * st_lukequit (Remote host closed the connection)
13:20:30  * cubertjoined
13:20:51  <dools>dominictarr: so what you'd need is a DSL that created some sort of semantics for the page, where i would say "i want movie ratings" and that you had somehow called it the same thing
13:21:52  <dools>dominictarr: seems like ... impossible :) it's essentially like in order to make what i scrape available to you i would have to parse and store it in a microformat or some kind of RDF that you would then be able to plug into ... when really just allowing me to xpath out the bits i want is far simpler
13:21:53  * missinglinkquit (Ping timeout: 245 seconds)
13:22:15  <dools>dominictarr: and the original goal is not to create a "semantic web" but just to allow me to scrape the shit out of sites without them rate limiting me
13:23:00  <dominictarr>right - but if we all depend on the same sites, and trust each other to replicate the data,
13:23:22  <dominictarr>then why not also share maintanence of the parsing scripts?
13:23:27  * rowbit1joined
13:23:27  * rowbit1quit (Remote host closed the connection)
13:23:47  <dominictarr>otherwise, if the site schema changes, everyone has to update their code,
13:24:11  <dominictarr>instead of whoever notices first, and then merge a patch from that person
13:24:24  * dlmannin1joined
13:24:46  * crankquit (Ping timeout: 246 seconds)
13:25:28  * Ralt_joined
13:25:38  * crankjoined
13:29:06  * yorick_joined
13:29:06  <dominictarr>also, that doesn't mean that replicating the caches for each url isn't a good idea
13:29:32  * emilisto_quit (Ping timeout: 261 seconds)
13:29:47  <dools>right okay you could create a kind of parsing library for high profile and/or popular sites in addition to just having a base level of cache replication i guess
13:29:51  <dominictarr>hmm, though, in some cases, the data might balloon - like if you have pagination with prepends
13:30:16  <dools>it doesn't necessarily need replication though
13:30:30  <dominictarr>dools: yes, that would be up to the scrapers
13:30:34  <dools>all it means is that when i send a request for $url, the data can come from 8 different people near me
13:30:41  <dools>like BT
13:30:44  * dlmanningquit (*.net *.split)
13:30:51  * niftylettucequit (*.net *.split)
13:30:59  * rowbitquit (*.net *.split)
13:31:39  <dominictarr>dools: it's also an opportunity to get notifications on when a page you are interested in changes
13:31:39  <dools>i don't know if BT is the right protocol (i have no idea of the overhead)
13:31:46  <dominictarr>you have to share torrent files, and they describe how to get the actual contents
13:32:08  <dools>okay so that sounds about right:
13:32:51  <dools>when i request a url, it's like a torrent file of the url, which describes how to get the data from that url. then it could compare timestamps to get the most recent versions, and anyone with an outdated version would receive the update pushed to them or something
13:33:01  <dools>s/data from that/data for that/
13:33:41  <dominictarr>yes
13:33:45  <dools>so you'd still have a central server the same as you do with bt, but the data would be shared via p2p, and then overlaid on top of that basic p2p data sharing would be a series of packages designed to provide standard data formats for popular sites
13:34:06  <dools>pretty awesome!
13:34:12  <dools>2 weeks
13:34:29  * timoxleyquit (Ping timeout: 252 seconds)
13:34:30  * Raltquit (Ping timeout: 260 seconds)
13:34:32  * ehdquit (Ping timeout: 261 seconds)
13:34:38  * yorickquit (Ping timeout: 260 seconds)
13:34:43  * Ralt_changed nick to Ralt
13:34:44  * thl0joined
13:35:05  <dominictarr>I need a Dominic#fork method so that I have the bandwidth to work on all this stuff
13:35:19  <dominictarr>or some Mad Grad Students
13:37:29  * purrquit (Ping timeout: 264 seconds)
13:40:17  * purrjoined
13:41:11  * harrowquit (Ping timeout: 256 seconds)
13:41:30  * harrowjoined
13:41:32  * fallsemojoined
13:44:23  * dlmannin1quit (Remote host closed the connection)
13:44:35  * dlmanningjoined
13:44:36  <dools>haha
13:45:24  * tmcwjoined
13:49:18  * st_lukejoined
13:49:29  * yorick_changed nick to Guest471
13:50:08  * no9quit (Ping timeout: 240 seconds)
14:03:45  * brianloveswordsquit (Excess Flood)
14:04:12  * brianloveswordsjoined
14:04:15  * no9joined
14:07:01  * mikolalysenkojoined
14:08:48  <dominictarr>thl0: I have another case where I might need to pass info to a prehook
14:08:58  <dominictarr>just thinking through it now
14:09:54  * defunctzombie_zzchanged nick to defunctzombie
14:10:00  <dominictarr>for replication, when you save key->value, you also save node_id:timestamp->key
14:10:16  <dominictarr>but when you receive replicated data
14:10:48  <dominictarr>you want to save it with the node_id:timestamp that you received with it… not create a new ts.
14:11:04  * st_lukequit (Remote host closed the connection)
14:11:04  <dominictarr>hmm, but if you can just iterate over the batch, that is good enough...
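[editor's note] The write pattern dominictarr describes, sketched as a plain batch builder: a local put also records `node_id:timestamp -> key`, and a replicated put reuses the incoming clock entry instead of minting a new timestamp. `makeBatch` is a hypothetical name, not his actual prehook API; the batch shape matches levelup's `{type, key, value}` convention.

```javascript
// Build a two-entry batch: the data itself plus its replication-log entry.
function makeBatch(nodeId, key, value, receivedClock) {
  // Replicated writes pass the originating node's clock; local writes mint one.
  var clock = receivedClock || nodeId + ':' + Date.now();
  return [
    { type: 'put', key: key, value: value },
    { type: 'put', key: clock, value: key }
  ];
}
```

Iterating over such a batch in a prehook is what makes the "good enough" case above work: the hook can see whether a clock entry is already present.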
14:13:27  <dominictarr>dools: do you know this album http://www.youtube.com/watch?v=30Acy8Z9jkw
14:14:21  * st_lukejoined
14:28:16  * emilisto_joined
14:29:07  * AvianFlu_joined
14:30:32  * AvianFluquit (Ping timeout: 240 seconds)
14:30:35  * owen1quit (Ping timeout: 240 seconds)
14:30:37  * emilistoquit (Ping timeout: 240 seconds)
14:30:43  * crankquit (Ping timeout: 240 seconds)
14:30:47  * crank_joined
14:31:07  * crank_changed nick to crank
14:31:09  * owen1joined
14:31:58  <dools>dominictarr: nah never heard of it .freaky deaky, might put it on while i work tomorrow
14:33:03  <dominictarr>you won't regret it
14:33:52  * mikealquit (Quit: Leaving.)
14:36:27  * crankquit (*.net *.split)
14:36:32  * st_lukequit (*.net *.split)
14:36:33  * tmcwquit (*.net *.split)
14:36:35  * dfggquit (*.net *.split)
14:36:36  * zuquit (*.net *.split)
14:37:47  * crankjoined
14:46:12  * AvianFlu_changed nick to AvianFlu
14:48:51  * tmcwjoined
14:48:52  * dfggjoined
14:48:52  * zujoined
14:58:48  * st_lukejoined
15:00:05  * jaz303quit (*.net *.split)
15:00:10  * rchquit (*.net *.split)
15:00:22  * jaz303joined
15:00:58  * rchjoined
15:01:48  * harrowquit (*.net *.split)
15:01:49  * paul_irishquit (*.net *.split)
15:06:05  * harrowjoined
15:08:04  * paul_irishjoined
15:10:46  * AvianFluquit (Remote host closed the connection)
15:19:27  * mikealjoined
15:21:42  * defunctzombiechanged nick to defunctzombie_zz
15:22:00  * ehdjoined
15:24:58  * st_lukequit (Remote host closed the connection)
15:27:19  * harrowquit (Ping timeout: 256 seconds)
15:30:07  * harrowjoined
15:30:22  * niftylettucejoined
15:41:00  * tim_smartquit (Ping timeout: 256 seconds)
15:41:03  * pikpikquit (Ping timeout: 256 seconds)
15:41:04  * niftylettucequit (Ping timeout: 256 seconds)
15:41:07  * harrowquit (Ping timeout: 256 seconds)
15:41:09  * crankquit (Ping timeout: 256 seconds)
15:41:09  * no9quit (Ping timeout: 256 seconds)
15:41:12  * ehdquit (Ping timeout: 256 seconds)
15:41:12  * hij1nxquit (Ping timeout: 256 seconds)
15:41:14  * harrowjoined
15:42:38  * pikpikjoined
15:42:40  * hij1nxjoined
15:43:05  * tim_smart|awayjoined
15:43:29  * tim_smart|awaychanged nick to tim_smart
15:44:03  * jesusabd1llahjoined
15:48:16  * brianloveswordsquit (Ping timeout: 264 seconds)
15:48:45  * owen1quit (Ping timeout: 264 seconds)
15:49:21  * mikolalysenkoquit (Ping timeout: 264 seconds)
15:49:35  * owen1joined
15:52:24  * brianloveswords_joined
15:55:34  * paul_irish_joined
15:55:37  * brianloveswords_changed nick to brianloveswords
15:56:30  <jlord>sup jesusabdullah!
15:57:18  * crankjoined
20:02:32  * thl0quit (*.net *.split)
20:07:57  * thl0joined
20:10:35  * thl0_joined
20:10:59  * thl0quit (Ping timeout: 240 seconds)
20:12:49  * AvianFluquit (Remote host closed the connection)
20:16:35  * paul_irish_changed nick to paul_irish
20:17:11  * kesslerquit (Quit: Page closed)
20:17:36  * kesslerjoined
20:23:06  * douglaslassancequit (Quit: Leaving...)
20:33:54  * brianloveswordsquit (Excess Flood)
20:35:14  * brianloveswordsjoined
20:37:30  * no9joined
20:44:05  * jolissjoined
20:51:08  * blobaumquit (Read error: No route to host)
21:05:46  * jolissquit (Quit: joliss)
21:12:01  * AvianFlujoined
21:12:45  * thl0_quit (Remote host closed the connection)
21:13:50  * timoxleyquit (Quit: Computer has gone to sleep.)
21:16:07  * kesslerquit (Ping timeout: 250 seconds)
21:20:03  * ehdjoined
21:22:36  * jolissjoined
21:24:34  * jcrugzzquit (Ping timeout: 276 seconds)
21:27:33  * jcrugzzjoined
21:30:47  * thl0joined
21:33:45  * wolfeida_changed nick to wolfeidau
21:43:07  * nicholasfjoined
21:49:25  * fallsemoquit (Quit: Leaving.)
22:01:41  * fallsemojoined
22:01:59  * jesusabd1llahchanged nick to jesusabdullah
22:06:07  * kevino80quit (Remote host closed the connection)
22:12:51  * dguttmanjoined
22:18:11  * thl0quit (Remote host closed the connection)
22:18:26  * jibayquit (Remote host closed the connection)
22:22:23  * dguttmanquit (Ping timeout: 240 seconds)
22:27:32  * dguttmanjoined
22:27:32  * mikolalysenkoquit (Read error: Connection reset by peer)
22:37:00  * ralphtheninjajoined
22:38:16  * tmcwquit (Remote host closed the connection)
22:45:23  * nicholasfquit (Read error: Connection reset by peer)
22:45:51  * nicholasfjoined
22:48:37  * nicholasfquit (Remote host closed the connection)
22:49:13  * nicholasfjoined
22:53:25  * nicholasfquit (Ping timeout: 248 seconds)
22:55:00  * ralphtheninjaquit (Quit: leaving)
22:56:35  * fotoveritequit (Quit: fotoverite)
22:59:02  <isaacs>substack: you still need a google glass?
22:59:10  <isaacs>substack: npm just got approved
22:59:18  <isaacs>substack: of course, it's not free or anything
22:59:36  <substack>I don't have money.
23:03:07  <mbalho>lol
23:03:08  <substack>plus I can't even buy anything online right now
23:04:20  * tilgovijoined
23:08:57  <chapel>isaacs: can I buy one?
23:09:52  <isaacs>chapel: there are a few other people i've already promised dibs on it to, if substack were to not want it
23:10:00  <chapel>:P
23:10:37  <isaacs>substack: i guess you'd have to show up at the place with money, but i don't know how much it costs.
23:10:46  <chapel>I really couldn't afford one, but would love to try a pair out someday
23:10:47  <isaacs>substack: and they don't have red, so npmjs is feeling kinda meh about this.
23:11:50  <substack>if I was going to buy technology I should probably buy a new laptop instead
23:12:00  <substack>this laptop has been slowly failing for over a year
23:12:54  <isaacs>substack: you should try kickstartering for a laptop
23:13:43  <isaacs>substack: like, "Hey, module users! would you like substack to keep making modules? if you are one of the first 10 people to donate $100, he'll make a module that is a parody of your name, with a drawing of your face in the readme!"
23:13:53  <chapel>:P
23:13:59  <chapel>I might donate for that :)
23:14:20  <substack>if people want to give me money they can already do that
23:14:34  <isaacs>substack: nono, kickstarter makes them feel like they're a part of something, though
23:14:34  * thl0joined
23:14:46  <isaacs>substack: it tricks us into feeling like it's a team activity
23:15:30  * thl0quit (Remote host closed the connection)
23:16:30  <substack>kickstarter feels really trashy
23:17:05  <isaacs>substack: yeah, it kinda makes sense for products that are tangible
23:17:17  <substack>I keep reading about products that shouldn't exist being funded with it.
23:19:30  * AvianFluquit (Remote host closed the connection)
23:34:54  <Raynos>isaacs, substack: $100 for a parody of myself, where do I sign up?
23:35:31  <jesusabdullah>hahaha
23:35:38  <jesusabdullah>I'm not sure how to parody myself
23:35:52  <jesusabdullah>probably stop taking my meds XD
23:36:10  <chapel>hows it going jesusabdullah?
23:36:24  <jesusabdullah>not bad
23:36:28  <jesusabdullah>just had a phone chat with Eran
23:37:14  <jesusabdullah>might be flying down for a day or two
23:37:16  <jesusabdullah>sounds like
23:37:30  <chapel>cool
23:37:32  * thl0joined
23:37:46  <chapel>bah, make sure you say I referred you! :P
23:37:56  <jesusabdullah>vaguely surprised eran hasn't pointed me to an issue on spumko
23:38:20  <chapel>maybe he feels you're vetted already
23:38:27  * tilgoviquit (Remote host closed the connection)
23:38:34  <chapel>you definitely have the experience, very easy to see on github
23:38:51  <jesusabdullah>naw, he straight told me that
23:39:11  <jesusabdullah>that he felt like I was g2g based on my github and nodejitsu's recommendation
23:39:19  <chapel>yeah
23:40:41  * yorickquit (Remote host closed the connection)
23:40:59  * mikolalysenkojoined
23:41:11  * fallsemoquit (Quit: Leaving.)
23:41:47  <jesusabdullah>dangit I lost kessler's email :(
23:43:39  <mikolalysenko>check it out: a basic partial evaluator for javascript in javascript!
23:43:40  <mikolalysenko>https://github.com/mikolalysenko/specialize
23:43:58  <mikolalysenko>it automatically inlines closures so you don't have to
23:44:21  <mikolalysenko>substack: you might get a kick out of this ^^^
23:44:32  <mikolalysenko>probably has lots of bugs still though
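[editor's note] A hand-worked illustration of what "inlining a closure" buys — this is the transformation specialize aims to automate, not its actual implementation or API. Specializing `scale` for a fixed factor lets the captured constant be folded straight into the body.

```javascript
// Generic version: every call goes through a closure over `factor`.
function scale(factor) {
  return function (x) { return x * factor; };
}
var double = scale(2);

// What a partial evaluator could emit for scale(2): the closure variable
// is inlined as a literal, leaving a flat function with no captured scope.
var doubleSpecialized = function (x) { return x * 2; };
```

Both behave identically; the specialized form is what engines can optimize most aggressively, which is the point of doing the evaluation ahead of time.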
23:45:18  * kevino80joined
23:45:45  <pkrumins>jesusabdullah: https://api.github.com/users/kessler/events/public? -> "email": "[email protected]"
23:47:18  * ralphtheninjajoined
23:47:19  * thl0quit (Remote host closed the connection)
23:47:33  <jesusabdullah>pkrumins: different kessler!
23:47:36  <jesusabdullah>pkrumins: inorite?
23:48:26  <pkrumins>jesusabdullah: ahh idk
23:48:31  <jesusabdullah>yeah
23:48:42  <pkrumins>does he haz githubs?
23:48:50  * tmcwjoined
23:48:51  <jesusabdullah>[email protected]
23:49:03  <jesusabdullah>naw we were just talking yesterday and, uhh, I forgot to ask anything about him XD
23:49:11  <jesusabdullah>I got caught up in what we were talking about
23:49:58  <pkrumins>gotcha
23:50:10  <jesusabdullah>oh sick got it
23:51:09  <jesusabdullah>I realized that google could tell me immediately if it was a legit gmail account
23:51:18  <jesusabdullah>due to goog+
23:54:04  * tmcwquit (Ping timeout: 276 seconds)
23:55:52  * kevinohara80joined
23:56:51  <substack>sillyscope on a log₂ scale https://github.com/substack/sillyscope#example
23:57:37  <jesusabdullah>substack: that's pretty sick \m/ I dig it
23:57:58  * kevino80quit (Ping timeout: 276 seconds)
23:58:21  <substack>mikolalysenko: neat!
23:58:55  <mikolalysenko>substack: probably has lots of bugs, but the basic idea works
23:59:12  <mikolalysenko>it optimistically tries to inline closures, but bails out if it isn't possible
23:59:43  <substack>mikolalysenko: how feasible would it be to automatically detect which functions have side-effects, inlining functions that are pure?
23:59:49  <mikolalysenko>the key to making it all work is that it finds the statements where it needs to inline, then it just smashes the surrounding code into single static assignment