--- Log opened Thu Nov 13 00:00:06 2014
-!- DSrupt [~DSrupt@] has joined #shogun [01:18]
-!- DSrupt [~DSrupt@] has quit [Quit: (null)] [03:06]
-!- txomon|home [~txomon@unaffiliated/txomon] has quit [Ping timeout: 258 seconds] [03:17]
-!- txomon|home [~txomon@unaffiliated/txomon] has joined #shogun [03:18]
<wiking> shogun-buildbot: stop build nightly_default "restarting" [03:44]
<shogun-buildbot> build 929 interrupted [03:44]
<wiking> shogun-buildbot: force build --develop=branch 'nightly_default' [03:45]
<shogun-buildbot> Something bad happened (see logs) [03:45]
<wiking> shogun-buildbot: force build --branch=develop 'nightly_default' [03:45]
<shogun-buildbot> The build has been queued, I'll give a shout when it starts [03:45]
<shogun-buildbot> build #930 forced [03:51]
<shogun-buildbot> I'll give a shout when the build finishes [03:51]
-!- pickle27 [] has joined #shogun [03:55]
-!- shogun-notifier- [] has joined #shogun [03:57]
<shogun-notifier-> shogun: Viktor Gal :develop * cc1b966 / doc/,doc/
<shogun-notifier-> shogun: Fix mathjax script location in doxygen config files [03:57]
<shogun-notifier-> shogun: [ci skip] [03:57]
<wiking> shogun-buildbot: stop build nightly_default "restarting" [03:57]
<shogun-buildbot> build 930 interrupted [03:57]
<shogun-buildbot> Hey! build nightly_default #930 is complete: Exception [exception interrupted] [03:57]
<shogun-buildbot> Build details are at
<wiking> shogun-buildbot: force build --branch=develop 'nightly_default' [03:57]
<shogun-buildbot> build #931 forced [03:57]
<shogun-buildbot> I'll give a shout when the build finishes [03:57]
-!- Floatingman [] has quit [Remote host closed the connection] [05:36]
-!- Floatingman [] has joined #shogun [05:41]
-!- pickle27 [] has quit [Remote host closed the connection] [06:21]
<shogun-buildbot> build #931 of nightly_default is complete: Failure [failed notebooks]  Build details are at
<shogun-buildbot> build #886 of precise - libshogun is complete: Failure [failed compile]  Build details are at  blamelist: Viktor Gal <> [06:49]
<shogun-buildbot> build #913 of FCRH - libshogun is complete: Failure [failed test]  Build details are at  blamelist: Viktor Gal <> [06:53]
-!- shogun-notifier- [] has quit [Quit: transmission timeout] [06:57]
<shogun-buildbot> build #119 of osx2 - modular_interfaces is complete: Failure [failed csharp modular]  Build details are at  blamelist: Viktor Gal <> [06:59]
<shogun-buildbot> build #480 of debian wheezy - memcheck is complete: Success [build successful]  Build details are at
-!- HeikoS [] has joined #shogun [12:05]
-!- mode/#shogun [+o HeikoS] by ChanServ [12:05]
<wiking> HeikoS: ping [12:17]
<@HeikoS> wiking: pong [12:18]
-!- Heikotablet [] has joined #shogun [12:20]
<Heikotablet> wiking, whatsup [12:20]
<wiking> Heikotablet: do u have matlab at UCL? [12:23]
<wiking> Heikotablet: ok then get a machine + matlab + buildbot [12:24]
<@HeikoS> wiking: can do that I think [12:27]
<@HeikoS> wiking: can I use my desktop? [12:27]
<@HeikoS> wiking: its 24/7 [12:27]
<wiking> u can [12:27]
<@HeikoS> but can only install ubuntu package [12:27]
<wiking> that's ok [12:27]
<@HeikoS> I dont have full admin rights [12:27]
<@HeikoS> but can instal anything from repo [12:27]
<wiking> mmm question of course is [12:27]
<wiking> whether u can open a custom port :P [12:27]
<wiking> although no [12:27]
<wiking> it's ok [12:27]
<wiking> so u just need to [12:27]
<wiking> apt-get install buildbot-slave [12:28]
<@HeikoS> let me try [12:28]
<wiking> and then the rest we can figure out [12:28]
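[Editor's note: the slave-side setup wiking describes can be sketched as below, using the buildbot 0.8.x `buildslave` commands shipped by the Ubuntu package. The master hostname, port, slave name, and password are placeholders for values agreed with the buildmaster admin, not the project's real ones.]

```shell
# install the slave from the distro repo (the only admin right needed)
sudo apt-get install buildbot-slave

# create the slave directory; host, port, name and password below are
# placeholders to be agreed with the buildmaster admin
buildslave create-slave ~/shogun-slave buildmaster.example.org:9989 ucl-desktop changeme

# start it; the slave connects *out* to the master, so no inbound
# firewall port needs opening on the desktop
buildslave start ~/shogun-slave
```

This matches the point made above: because the connection is outbound from the slave, no custom port has to be opened on the UCL machine.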
<wiking> HeikoS: do u have g++ on that machine? :P [12:28]
<wiking> i guess os or? [12:28]
<@HeikoS> wiking: yeah yeah, I also use shogun on it [12:28]
<@HeikoS> remember all my questions a while ago? [12:28]
<@HeikoS> our cluster consists of all our desktop machines [12:28]
<wiking> yeah ok because then we can haz swig-matlab [12:28]
<@HeikoS> wiking: awesome! [12:29]
<wiking> now the only question remains [12:29]
<@HeikoS> ok installed buildbot slave [12:29]
<wiking> lemme set it up [12:29]
<@HeikoS> wiking: what do you need from me? [12:29]
<wiking> HeikoS: i'm just writin you in pm [12:29]
-!- shogun-buildbot [] has quit [Quit: buildmaster reconfigured: bot disconnecting] [12:32]
-!- shogun-buildbot [] has joined #shogun [12:32]
-!- rajul [~rajul@] has joined #shogun [14:37]
<wiking> matlab_modular interface generated [14:44]
-!- rajul [~rajul@] has quit [Ping timeout: 264 seconds] [14:54]
<wiking> but we need to generate the typemaps [14:54]
-!- rajul [~rajul@] has joined #shogun [15:06]
-!- rajul_ [~rajul@] has joined #shogun [15:13]
-!- rajul [~rajul@] has quit [Ping timeout: 245 seconds] [15:15]
-!- rajul_ is now known as rajul [15:16]
<@HeikoS> wiking: whoooooo! :) [15:21]
<@HeikoS> wiking: might be cool for release, at least experimental [15:21]
<@HeikoS> but I dont know how much work typemaps are [15:21]
<@HeikoS> wiking: we should definitely take the opportunity to unit test the maps systematically as a draft for how to test the existing ones [15:22]
-!- HeikoS [] has quit [Quit: Leaving.] [15:37]
-!- HeikoS [] has joined #shogun [15:46]
-!- mode/#shogun [+o HeikoS] by ChanServ [15:46]
<@lisitsyn> HeikoS: hey [15:47]
<@lisitsyn> HeikoS: jfyi I am ok with your date to talk about vw [15:48]
<@HeikoS> lisitsyn: good to know [15:54]
<@HeikoS> lets see what john says [15:54]
<@HeikoS> sorry for postponing all the time [15:54]
<@lisitsyn> HeikoS: np I am sick today anyway [15:54]
<@HeikoS> lisitsyn: did he reply? [15:54]
<@HeikoS> nope not yet [15:54]
<@lisitsyn> HeikoS: not yet [15:54]
<@HeikoS> lisitsyn: haha I saw that paper [15:56]
<@HeikoS> nice that they have a notebook [15:56]
<@lisitsyn> HeikoS: it ain't them [15:56]
<@lisitsyn> it's jake from scikit-learn [15:57]
<@lisitsyn> HeikoS: can you recommend me something to read about variational inference? [15:57]
<@HeikoS> chris bishops book [15:58]
<@lisitsyn> I am tired of being stupid :D [15:58]
<@HeikoS> pattern recognition and machine learning [15:58]
<@HeikoS> its the probabilistic ML bible [15:58]
<@lisitsyn> hmm I think I glanced through it before [15:58]
<@HeikoS> or Wu's notebook [15:58]
<@HeikoS> its an easy idea [15:58]
<@HeikoS> distribution is intractable [15:58]
<@HeikoS> so you select one that you can deal with (mostly gaussian) [15:58]
<@HeikoS> and then minimise KL div between approximation and true [15:59]
<@lisitsyn> so in layman terms [15:59]
<@lisitsyn> you just fit multidimensional gaussian to the distribution? [15:59]
<@lisitsyn> via KL? [15:59]
<@HeikoS> kind of [16:00]
<@HeikoS> its not really the same as fitting the gaussian [16:00]
<@HeikoS> minimizing KL is different [16:00]
<@HeikoS> kullback leibler divergence [16:00]
<@lisitsyn> yeah I know [16:00]
<@lisitsyn> so you have that integral [16:00]
<@HeikoS> some kind of distance between distributions [16:00]
<@HeikoS> but not symmetric [16:00]
<@HeikoS> and then you usually do this via maximising a lower bound [16:00]
<@HeikoS> the KL gives you a lower bound on the likelihood of the distribution you care about [16:00]
<@lisitsyn> I don't get one thing yet [16:01]
<@HeikoS> the bound is usually not tight (thats why its approximate) but you just hope for the best [16:01]
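[Editor's note: the lower-bound argument HeikoS sketches is the standard variational-inference identity, written out here for reference. For data x, latent variables z, and an approximating distribution q:]

```latex
\log p(x)
  = \underbrace{\mathbb{E}_{q(z)}\!\left[\log \frac{p(x, z)}{q(z)}\right]}_{\text{ELBO}}
  \;+\; \mathrm{KL}\!\left(q(z) \,\middle\|\, p(z \mid x)\right)
```

Since the KL term is non-negative, the first term (the evidence lower bound, ELBO) lower-bounds the log likelihood; and because the left-hand side does not depend on q, maximising the ELBO over q is exactly minimising the KL divergence between q and the true posterior. The bound is tight only when q equals the posterior, which is the "not tight" caveat above.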
<@lisitsyn> so are we talking about inference or training? [16:01]
<@HeikoS> getting posterior [16:03]
<@HeikoS> have a representation of posterior [16:03]
<@HeikoS> that is kind of training [16:03]
<@lisitsyn> is posterior usually a gaussian? [16:04]
<@HeikoS> but if you have a gaussian posterior, usually inference is easy [16:04]
<@HeikoS> no usually not [16:04]
<@HeikoS> thats why the approximation [16:04]
<@lisitsyn> but you approximate with gaussian, right? [16:04]
<@HeikoS> kind of most cases [16:05]
<@HeikoS> but sometimes you just assume a certain type of factorisation [16:05]
<@HeikoS> that is you assume certain variables in posterior are independent [16:05]
<@HeikoS> and then a parametric form of posterior (that is not gaussian) drops out of the model math [16:05]
<@HeikoS> for example for LDA for topic modelling that happens [16:05]
<@lisitsyn> HeikoS: hmm I see [16:06]
<@HeikoS> variational bayes there is very similar to gibbs sampling updates [16:06]
<@HeikoS> with the posterior approximations happening to be discrete / dirichlet [16:06]
<@HeikoS> I gotta run off now, we can discuss a little later today if you want [16:06]
<@HeikoS> see you :) [16:06]
<@lisitsyn> see you [16:06]
<@HeikoS> Ill be back soon, just a talk now [16:06]
-!- HeikoS [] has quit [Quit: Leaving.] [16:06]
* wiking just had 3 shots in a row.... :D [16:09]
-!- rajul [~rajul@] has quit [Ping timeout: 256 seconds] [16:24]
-!- rajul [~rajul@] has joined #shogun [16:50]
-!- HeikoS [] has joined #shogun [17:15]
-!- mode/#shogun [+o HeikoS] by ChanServ [17:15]
<@HeikoS> lisitsyn: re [17:15]
<@lisitsyn> HeikoS: cool [17:15]
<@lisitsyn> HeikoS: I can ask you random questions if you are not busy :D [17:15]
<@HeikoS> lisitsyn: please do [17:17]
<@lisitsyn> HeikoS: you were talking about gaussian posterior [17:17]
<@lisitsyn> but what about other distributions? [17:17]
<@lisitsyn> is it kind of engineering to choose the distribution [17:18]
<@HeikoS> as said, for discrete posteriors, things are usually different [17:18]
<@HeikoS> lisitsyn: not really, the point about Gaussians is that one can integrate over them [17:18]
<@HeikoS> but if you say learn a Gaussian mixture model [17:18]
<@HeikoS> you do the same thing [17:18]
<@HeikoS> you minimise the KL between the posterior and the mixture [17:18]
<@HeikoS> for standard GMM, one can do that in closed form and gets the EM algorithm [17:19]
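[Editor's note: the closed-form updates HeikoS mentions are the classic EM algorithm for a Gaussian mixture. A minimal 1-D toy sketch, not Shogun code; the data and initial values are arbitrary illustrative choices:]

```python
import numpy as np

rng = np.random.default_rng(0)
# toy data: two well-separated Gaussian clusters
x = np.concatenate([rng.normal(-3.0, 1.0, 200), rng.normal(3.0, 1.0, 200)])

# initial guesses for means, stddevs, and mixing weights
mu = np.array([-1.0, 1.0])
sigma = np.array([1.0, 1.0])
weight = np.array([0.5, 0.5])

def normal_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

for _ in range(50):
    # E-step: posterior responsibility of each component for each point
    resp = weight * normal_pdf(x[:, None], mu, sigma)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: closed-form parameter updates (the "closed form" in question)
    n = resp.sum(axis=0)
    mu = (resp * x[:, None]).sum(axis=0) / n
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / n)
    weight = n / len(x)

print(sorted(mu))  # recovered means should land near -3 and 3
```

Each iteration maximises the same kind of lower bound discussed earlier, with the responsibilities playing the role of the variational distribution over the latent component assignments.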
<@HeikoS> but for other mixture models, that might not be possible [17:19]
<@HeikoS> so usually Gaussian, yes [17:19]
<@lisitsyn> how inaccurate is it? [17:19]
<@HeikoS> it depends on your posterior [17:20]
<@HeikoS> if it doesnt look like a Gaussian [17:20]
<@HeikoS> its not accurate [17:20]
<@HeikoS> but you dont know how it looks [17:20]
<@lisitsyn> hmm say I have features obtained from layer 2 of some deep learning net [17:20]
<@HeikoS> so you dont know how accurate variational inference is [17:20]
<@lisitsyn> will it be consistent to just try gaussian? [17:21]
<@lisitsyn> I am just trying to get the thought behind it [17:21]
<@HeikoS> what is the model [17:21]
<@lisitsyn> what do you mean? [17:21]
<@HeikoS> if you want to do variational inference you need a model [17:22]
<@HeikoS> usually one starts with the model, and then comes up with inference algorithms for it [17:22]
<@lisitsyn> ahh no I am speaking quite general [17:22]
<@HeikoS> so what do you want to do? [17:22]
<@lisitsyn> is the class of possible models broad? [17:22]
<@HeikoS> it is an algorithm that characterises a posterior distribution [17:22]
<@HeikoS> so *any* probabilistic model [17:22]
<@lisitsyn> ahm I see [17:23]
<@lisitsyn> HeikoS: I was talking about gps recently [17:23]
<@lisitsyn> and was asked [17:23]
<@lisitsyn> what is this limitation of 'answer' being gaussian [17:23]
<@lisitsyn> like is it very restrictive [17:24]
<@lisitsyn> and I still don't have a real answer :D [17:24]
<@lisitsyn> HeikoS: what do you think? [17:24]
<@HeikoS> the limitation is that your predictive uncertainty when you integrate over the posterior might be wrong [17:24]
<@HeikoS> btw depends on what kind of gp [17:25]
<@HeikoS> since for example regression is analytically tractable, posterior *is* gaussian [17:25]
<@HeikoS> classification, it is not [17:25]
<@HeikoS> but it usually is close to being Gaussian [17:25]
<@lisitsyn> is regression always gaussian-like? [17:25]
<@HeikoS> posterior is closed form [17:26]
<@lisitsyn> ok I see [17:26]
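[Editor's note: the closed-form GP-regression posterior being referred to can be sketched as below; the RBF kernel, noise level, and toy data are illustrative choices, not project code.]

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # squared-exponential kernel k(a, b) = variance * exp(-(a - b)^2 / (2 l^2))
    d = A[:, None] - B[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xstar, noise_var=0.01):
    # closed-form posterior for GP regression with Gaussian observation noise
    K = rbf(X, X) + noise_var * np.eye(len(X))
    Ks = rbf(X, Xstar)
    mean = Ks.T @ np.linalg.solve(K, y)
    cov = rbf(Xstar, Xstar) - Ks.T @ np.linalg.solve(K, Ks)
    return mean, cov

X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(X)
mean, cov = gp_posterior(X, y, np.array([0.0]))
# at an observed input with small noise, the posterior mean stays close to
# the observation and the posterior variance is small
```

For classification the likelihood is non-Gaussian, so no such closed form exists and the posterior has to be approximated, e.g. variationally, which is the distinction HeikoS draws above.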
<@HeikoS> wu put pictures of posterior and approximations to it [17:26]
<@lisitsyn> let me check [17:26]
<@lisitsyn> HeikoS: okay next random :D [17:26]
<@lisitsyn> there is a book [17:27]
<@lisitsyn> let me find [17:27]
<@HeikoS> ah he did not put the plot [17:27]
<@HeikoS> lisitsyn: but you can download and run [17:27]
<@lisitsyn> ten lectures on statistical and structural pattern recognition [17:27]
<@lisitsyn> have you seen that? [17:28]
<@HeikoS> but i really recommend the GP book by rasmussen and the ML book by bishop [17:28]
<@HeikoS> these two are really really good books [17:28]
<@lisitsyn> ah yeah that's for sure [17:28]
<@lisitsyn> but it was a random question ;) [17:28]
<@lisitsyn> HeikoS: the thing is this book comes from some years back [17:29]
<@lisitsyn> and they think about say Wald, Neyman-Pearson tasks [17:29]
<@lisitsyn> some Anderson task etc [17:29]
<@lisitsyn> like P(error) < e [17:29]
<@lisitsyn> or anything like that [17:29]
<@lisitsyn> HeikoS: so my question is whether it is dead [17:29]
<@HeikoS> I dont like this kind of statistics [17:29]
<@HeikoS> personal taste [17:30]
<@HeikoS> not bayesian [17:30]
<@HeikoS> not powerful [17:30]
<@HeikoS> small d, small n [17:30]
<@HeikoS> more like foundations [17:30]
<@lisitsyn> hmm I see [17:30]
<@HeikoS> im more a computational statistics boy :) [17:30]
-!- rajul [~rajul@] has quit [Ping timeout: 265 seconds] [17:30]
<@lisitsyn> HeikoS: you mentioned small d, small n [17:31]
<@lisitsyn> is it working well under these circumstances? [17:31]
<@HeikoS> lisitsyn: these methods attack different problems [17:31]
<@HeikoS> so you cannot really compare [17:31]
<@lisitsyn> HeikoS: I am just trying to get what's really wrong about it [17:32]
<@lisitsyn> the only thing I can say is that you don't know P anyway [17:32]
<@lisitsyn> so speaking about P(error) is kind of wrong [17:32]
<@HeikoS> nothing wrong about it [17:33]
<@HeikoS> but everything depends on what you want to do [17:33]
<@HeikoS> if you want to do hypothesis testing, then this is the way to go [17:33]
<@HeikoS> all tools [17:33]
<@HeikoS> how good they are depends on what you want to do [17:33]
<@lisitsyn> say when would you choose variational inference? [17:34]
<@lisitsyn> or more important - when would you choose something else :) [17:34]
<@lisitsyn> HeikoS: let me attack it like that - NNs are good at images/audio now [17:35]
<@HeikoS> variational inference is an inference algorithm [17:36]
<@HeikoS> it is not like you decide to use it [17:36]
<@HeikoS> you first need to define your problem [17:36]
<@HeikoS> and your model [17:36]
<@HeikoS> the model is the critical part [17:36]
<@HeikoS> not the inference algorithm [17:37]
<@HeikoS> you dont choose to use variational inference, you choose to do a probabilistic model [17:37]
<@lisitsyn> HeikoS: I see [17:37]
<@lisitsyn> HeikoS: okay then how would you choose to do a probabilistic model? [17:37]
<@HeikoS> it depends on what you want to do [17:37]
<wiking> choose to drink :D [17:38]
<@HeikoS> for example if you want to understand the process you are modelling [17:38]
<@lisitsyn> wiking: haha [17:38]
<@HeikoS> or if uncertainty is important for you [17:38]
<@HeikoS> wiking: haha [17:38]
<@lisitsyn> HeikoS: oh that seems legit [17:38]
<@lisitsyn> answers my question :) [17:38]
<@HeikoS> lisitsyn: example might be classification [17:38]
<@HeikoS> if you are interested in confidence [17:38]
<@HeikoS> and maybe have some complicated relationship that you would like to take into account for classification [17:39]
<wiking> 'Column indexes start at 1 in JDBC' [17:39]
<@HeikoS> say a group-like structure that shares hyper-parameters [17:39]
<@HeikoS> and are interested in how these parameters look since that tells you something about the world [17:39]
<wiking> why on fucking earth would anybody start indexing with 1 [17:39]
<@HeikoS> then a probabilistic model might be good for you [17:39]
<@HeikoS> wiking: I recently got the same question with 0 :) [17:39]
<@lisitsyn> HeikoS: is it moving towards something more automatic? [17:40]
<wiking> HeikoS: lemme guess... matlab developer? :D [17:40]
<@lisitsyn> I mean neural guys claim they have feature learning [17:40]
<@lisitsyn> is there any work on trying to pass raw features and get something out of it? [17:40]
<wiking> you can learn anything with an NN.. given that you have enough time/examples :d [17:41]
<@lisitsyn> wiking: yeah absolutely but that's why I don't like what neural is all about [17:41]
<@HeikoS> wiking: no, proper scientists :) [17:42]
<@HeikoS> lisitsyn: that is about representation learning [17:42]
<@HeikoS> lisitsyn: kind of orthogonal to probabilistic modelling, and quite a different goal also [17:42]
<@HeikoS> probabilistic models are something for scientists who want to understand the world better [17:42]
<@lisitsyn> HeikoS: I see [17:45]
<@HeikoS> lisitsyn: but there are lots of connections between back propagation and stochastic variational inference, kind of the same in fact ;) [17:46]
-!- rajul [~rajul@] has joined #shogun [17:47]
-!- HeikoS [] has quit [Quit: Leaving.] [19:03]
<@lisitsyn> wiking: hah so now I am on sick leave for like a week [19:16]
<@lisitsyn> wiking: curious if I can switch to help you on the release [19:16]
-!- rajul [~rajul@] has quit [Ping timeout: 272 seconds] [19:35]
-!- rajul [~rajul@] has joined #shogun [19:48]
-!- rajul [~rajul@] has quit [Ping timeout: 255 seconds] [20:26]
-!- rajul [~rajul@] has joined #shogun [20:40]
-!- rajul [~rajul@] has quit [Ping timeout: 250 seconds] [21:32]
--- Log closed Fri Nov 14 00:00:08 2014