samth
2018-7-6 14:27:53

@ccshan when should we expect you?


ccshan
2018-7-6 14:28:14

Getting coffee downstairs, sorry


samth
2018-7-6 14:28:50

ah, we are in my office



rjnw
2018-7-6 15:12:11

Yes, I changed that but haven’t committed.



ccshan
2018-7-6 17:10:10

It turns out that PSI exact-inference benchmarks aren’t too hard to run. https://github.com/eth-sri/psi


samth
2018-7-6 18:41:37

for the future this has some nice potential benchmarks


ccshan
2018-7-6 20:52:05

@rjnw How’s it going? Sam and I think you’re working on the 6 figures (2 GMM, 2 NB, 2 LDA). Adding to the exact inference evaluation comes later.


rjnw
2018-7-6 20:53:11

It’s going well; benchmarks are running. For GMM 50, I fixed the issue that we had in the morning.


rjnw
2018-7-6 20:55:30

I changed the time settings for LDA and accidentally broke GMM.


rjnw
2018-7-6 20:56:09

Also, is 30 seconds okay, or should I add more time?


ccshan
2018-7-6 23:48:27

Although 30 seconds is enough, I personally think it’s better to show (if within say 100 seconds) where the Haskell backend reaches the same plateau as the LLVM backend, just more slowly.


ccshan
2018-7-6 23:50:07

Also, please use the name AugurV2 throughout. It’s supposedly quite different from Augur (V1), and so they consistently call it AugurV2 (one word, with A and V capitalized), not Augur.


ccshan
2018-7-6 23:50:39

Thanks for the update! Please keep us posted.


carette
2018-7-7 02:09:36

So I’ll be on a plane all day tomorrow — but I’ve got the ICFP reviews and other comments on the paper on my laptop, so I intend to take care of those during that time. [I’ve dealt with all my other deadline-driven obligations, so this one’s at the top now.]


carette
2018-7-7 02:13:58

@rjnw Can you resend the Docker link? It seems to have expired (i.e. it is more than 10000 messages ago on this Slack. I had saved a link to Slack…)



carette
2018-7-7 02:52:52

Thanks. Downloaded (and I’ve got Docker as well).


carette
2018-7-7 02:53:04

Hopefully that’s enough for me to work offline…


rjnw
2018-7-7 02:54:19

There might be some issues during initialization, when it fetches the latest version from GitHub. There is an older commit of Hakaru in the Docker image, which should have the same issue.


rjnw
2018-7-7 02:55:13

Also, there was a specific Docker command to run this image with Maple working; let me see if I can find it.


carette
2018-7-7 02:55:14

I can try running it for the first time while I wait at the gate (there’s free wifi at the airport). Hopefully that will do it.


carette
2018-7-7 02:55:37

Yes please!


carette
2018-7-7 02:56:12

[I’m going offline now, but I’ll check Slack before I go tomorrow morning… 6am, yuck.]


rjnw
2018-7-7 02:56:27

okay, I will post the commands here.


rjnw
2018-7-7 02:58:15
sudo docker import data/hakaru-docker.tar.gz testshare    # load the image from the tarball
sudo docker run --entrypoint "/init.sh" --mac-address 1c:1b:0d:e3:fc:22 -it testshare    # run init.sh interactively

rjnw
2018-7-7 02:59:05

It will download a lot of stuff in the beginning, especially the latest GHC and dependencies for compiling Hakaru.


rjnw
2018-7-7 03:02:46

You can Ctrl-C it; the older compiled files should be in the hkbin folder, and the Hakaru code is in testcode/hksrc/. It starts in the hakaru-benchmarks directory, and the Maple binary is /hakaru-benchmarks/maple2017/bin/maple.
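[A minimal sketch of how one might poke around the running container to verify the layout rjnw describes — the container name and the idea of using a second shell via `docker exec` are assumptions, not commands from the chat:]

```shell
# Assumes the container started from the `testshare` image above is still running.
sudo docker ps                                  # find the running container's ID

# Open a second shell in it (substitute the ID from `docker ps`):
sudo docker exec -it <container-id> /bin/bash

# Inside the container, check the paths mentioned above:
ls hkbin                                        # older compiled files
ls testcode/hksrc/                              # Hakaru source
ls /hakaru-benchmarks/maple2017/bin/maple       # Maple binary (license is tied to the MAC address)
```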


rjnw
2018-7-7 03:04:51

The MAC address is my personal machine’s, since the Maple license is tied to that MAC. I hope you are not going to use it for any nefarious purposes. :sweat_smile:


rjnw
2018-7-7 03:28:02

I plotted against sweeps, as sweep time is 45s vs 0.2s


rjnw
2018-7-7 03:28:22

*with number of sweeps