--- Log opened Thu Dec 08 00:00:48 2016
05:02 -!- zero1hac [~zerooneha@139.59.16.180] has quit [Ping timeout: 268 seconds]
05:03 -!- zero1hac [~zerooneha@139.59.16.180] has joined #shogun
06:15 <@sukey> Pull Request #3575 "LinalgRefacotr - Fix GPU Vector/Matrix deep cloning" opened by OXPHOS - https://github.com/shogun-toolbox/shogun/pull/3575
06:32 -!- CaBa [~Diu7saig@unaffiliated/caba] has quit [Ping timeout: 268 seconds]
06:33 -!- CaBa [~Diu7saig@unaffiliated/caba] has joined #shogun
08:16 <@sukey> Pull Request #3534 "LinalgRefactor - Cholesky - CPU only" synchronized by OXPHOS - https://github.com/shogun-toolbox/shogun/pull/3534
08:41 -!- praisethemoon [~praisethe@41.226.248.123] has joined #shogun
11:24 <CaBa> wiking: ping
11:25 <@wiking> pong
11:25 <CaBa> wiking: good evening! :)
11:25 <CaBa> wiking: do you have any info on my 22:35:03 question for me? :)
11:26 <@wiking> lemme see
11:26 <@wiking> [05:35] <CaBa> wiking: is there any harm in calling init(l,r) multiple times, i.e. with different features?
11:26 <@wiking> no
11:26 <@wiking> it should be fine
11:27 <@wiking> unless there's a bug
11:37 <CaBa> wiking: hm, ok
11:38 <CaBa> wiking: i'm trying to write kind of a wrapper class around a kernel, derived from the latter. it is meant to accept a different feature type and convert it whenever needed to the type accepted by the upstream kernel... which is the case (1) if the features change and (2) if an extra parameter changed
11:39 <CaBa> wiking: does that seem realizable to you?
11:45 <@wiking> ah yeah
11:45 <@wiking> sure
11:45 <@wiking> although
11:45 <@wiking> why not a preprocessor?
11:49 <CaBa> wiking: can that be hooked in in a way such that it is re-run if a parameter of the kernel was changed through model selection?
11:50 <@wiking> could you rephrase this question?
11:51 <@wiking> https://github.com/shogun-toolbox/shogun/blob/develop/examples/undocumented/python_modular/preprocessor_normone_modular.py
11:51 <CaBa> wiking: i don't know what the design goal of the preprocessors in sg is. i was wondering if they are automatically called in certain situations
11:52 <@wiking> http://beta.shogun.ml/api/latest/classshogun_1_1CFeatures.html#a53d5b6b17d8e4b44117d404b119faca6
11:53 <CaBa> wiking: when are those called?
11:53 <@wiking> when you call force_preprocessing
11:54 <@wiking> i meant apply_preprocessor
11:54 <@wiking> on the features
11:56 <CaBa> wiking: i see. i don't think that interface is helping much here. i need something that is automatically rerun (or checks if it has to rerun) upon ::compute calls
11:56 <@wiking> kernel normalizer? :D
11:56 <@wiking> they are actually rerun
11:57 <CaBa> prior to the actual compute() logic ;)
12:18 <CaBa> wiking: oh, things like get_kernel_matrix() are parallelized...
12:18 <@wiking> yes
12:18 <CaBa> wiking: my new compute() function isn't thread safe anymore
12:18 <@wiking> :)
12:19 <@wiking> then just set the num of threads to 1
12:20 <CaBa> CRITICAL should work, too, right?
12:50 <@wiking> ?
12:50 <@wiking> critical?
12:50 <@wiking> in what sense?
12:56 <CaBa>     #pragma omp critical
12:56 <CaBa> worked :)
12:56 <CaBa> wrapper seems to work fine now, at least so far...
13:01 <@wiking> :)
13:02 <@wiking> mmm
13:02 <CaBa> mmm?
13:02 <@wiking> nevermind
13:02 <CaBa> critical about the critical? :D
13:02 <@wiking> i mean i was about to say that we'll change
13:02 <CaBa> to c++11 threads?
13:02 <CaBa> :D
13:02 <@wiking> the backend for kernel.cpp (get_kernel_matrix etc)
13:02 <@wiking> nono
13:02 <@wiking> omp
13:02 <@wiking> so it's all good
13:02 <@wiking> the commit is there
13:02 <@wiking> feature/KernelOMP
13:02 <@wiking> it's just that it introduces some fixes that reveal some bugs
13:02 <@wiking> in the parallel random forest etc
13:03 <@wiking> hence it hasn't got merged into develop yet
13:04 <@wiking> but yeah because now
13:04 <@wiking> the #pragma omp critical
13:04 <@wiking> is only working by chance :P
13:04 <CaBa> why?
13:04 <@wiking> because what if you compile the thing with clang
13:04 <@wiking> :P
13:04 <@wiking> then the #pragma omp critical
13:04 <@wiking> will be ignored
13:05 <CaBa> yes, but it'll also be single-threaded then
13:05 <@wiking> but the Kernel.cpp parallel things are implemented with posix threads
13:05 <@wiking> no
13:05 <@wiking> as you can see, the currently parallel stuff in kernel.cpp is implemented using native posix threads
13:06 <CaBa> well then the lock is also ignored in gcc
13:06 <@wiking> ?
13:06 <@wiking> lock?
13:06 <@wiking> ?
13:06 <@wiking> what lock?
13:06 <CaBa> my omp critical
13:06 <@wiking> mmm gcc has been openmp-ready for a long time
13:07 <@wiking> :)
13:07 <@wiking> so it won't be ignored
13:07 <CaBa> oh so the pthread code is only compiled when openmp is unavailable?
13:07 <CaBa> i thought there is pthread code
13:07 <@wiking> no
13:07 <@wiking> one has nothing to do with the other
13:07 <@wiking> but anyyyyyyyyyhow
13:07 <@wiking> it's ok
13:07 <@wiking> soon it'll be omp only
13:07 <CaBa> exactly.. so the pthread code will not care about openmp critical statements
13:07 <@wiking> yes
13:07 <CaBa> even if there's openmp support
13:07 <CaBa> bbl, lunch
13:07 <@wiking> if you only have PTHREAD
13:08 <@wiking> but not openmp
13:08 <@wiking> no
13:08 <@wiking> if you have openmp support
13:08 <@wiking> then it's fine
13:08 <@wiking> as the pragma is not ignored
13:08 <@wiking> but ok
13:08 <@wiking> think about it
13:08 <@wiking> and you'll see what i mean
13:08 <@wiking> :)
13:46 -!- fhal3 [~fhal@my83-216-94-208.cust.relish.net] has quit [Read error: Connection reset by peer]
14:05 <CaBa> wiking: the pragma works even if the underlying code uses pthreads?
14:14 <@sukey> Issue #3576 "Unable to install package on Ubuntu 16.04" opened by hrushikesht - https://github.com/shogun-toolbox/shogun/issues/3576
14:23 -!- praisethemoon [~praisethe@41.226.248.123] has quit [Changing host]
14:23 -!- praisethemoon [~praisethe@unaffiliated/praisethemoon] has joined #shogun
15:42 <CaBa> wiking: switched to an std::mutex, that should cover either case...
16:04 <@sukey> Pull Request #3534 "LinalgRefactor - Cholesky - CPU only" synchronized by OXPHOS - https://github.com/shogun-toolbox/shogun/pull/3534
18:08 -!- praisethemoon [~praisethe@unaffiliated/praisethemoon] has quit [Ping timeout: 250 seconds]
--- Log closed Fri Dec 09 00:00:50 2016