
SVM Prediction Accuracy issue #155

Open
Codezhak opened this issue Oct 10, 2019 · 3 comments

Comments

@Codezhak

Hi,
I am using LIBSVM for my face recognition application, where I also need to detect when a person does not match. So I used prediction with the probability feature, and now predict_label (always 1) and target_label differ at the check if (predict_label == target_label).
Can you please explain how accuracy is calculated?
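For context, LIBSVM's svm_predict.java computes accuracy as the fraction of test lines whose predicted label exactly equals the label written at the start of that line. Here is a minimal sketch of that loop (the class and method names are illustrative, not from LIBSVM; only the comparison mirrors the real code):

```java
// Sketch of how svm_predict.java derives accuracy: count exact matches
// between the label read from the test file and the predicted label.
public class AccuracySketch {
    static double accuracy(double[] targetLabels, double[] predictedLabels) {
        int correct = 0;
        for (int i = 0; i < targetLabels.length; i++) {
            // A prediction counts as correct only on an exact label match.
            if (predictedLabels[i] == targetLabels[i]) ++correct;
        }
        return 100.0 * correct / targetLabels.length;
    }

    public static void main(String[] args) {
        double[] target = {1, 1, 2};
        double[] pred   = {2, 2, 2};
        System.out.println(accuracy(target, pred) + "%");
    }
}
```

So if every test line is labeled 1 but the classifier always predicts 2, accuracy is 0%.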

@cjlin1
Owner

cjlin1 commented Oct 10, 2019 via email

@Codezhak
Author

Codezhak commented Oct 11, 2019

@cjlin1 Thank you for replying.
Sorry, I edited my comment above: I am getting target_label as 1 always.
In my case the sample trained model data is:

svm_type c_svc
kernel_type linear
nr_class 2
total_sv 7
rho 0.0297609
label 1 2
probA -2.95891
probB -0.00770355
nr_sv 3 4
SV
1 0:0.03694541 1:0.071669891 2:-0.092664048 3:-0.064463556 4:0.040241361 5:-0.024055684 6:-0.050895944 7:-0.12840363 8:0.012196725 9:0.0043732212 10:-0.023462353 11:0.097336009 12:0.018651906 13:-0.019849913 14:0.12040991 15:-0.047160998 16:-0.0051556467 17:-0.1239234 18:0.094275281 19:0.025820591 20:0.093659908 21:-0.074600518 22:0.063219473 23:0.079827301 24:0.0051653702 25:0.10770133 26:0.0073127155 27:-0.083744101 28:0.10648216 29:0.20111726 30:0.0022260793 31:0.085927598 32:-0.15429094 33:-0.15328857 34:-0.11767324 35:0.15361258 36:0.090266727 37:0.029450133 38:-0.0015768503 39:-0.027409237 40:-0.041641694 41:-0.076887839 42:-0.046940949 43:-0.06171843 44:-0.10300581 45:0.031150701 46:0.050628506 47:0.038913436 48:-0.029119743 49:-0.0057133124 50:-0.048775818 51:0.21805395 52:0.15209258 53:-0.024650376 54:0.073181532 55:-0.050327212 56:-0.010669088 57:0.15376899 58:0.13702595 59:0.080807798 60:0.12286413 61:-0.043400086 62:-0.15764433 63:-0.10736537 64:0.043983907 65:-0.061820917 66:0.019312553 67:0.055988599 68:0.084513105 69:-0.015690949 70:-0.005046647 71:-0.030403625 72:0.1430871 73:0.0087323552 74:0.018460998 75:-0.037979741 76:-0.041962337 77:0.041415408 78:-0.11663452 79:-0.0015341573 80:0.067158952 81:-0.095703341 82:0.06591215 83:-0.16004115 84:-0.011555639 85:-0.062305722 86:0.022989202 87:-0.060655482 88:-0.017563539 89:-0.099483833 90:-0.030519256 91:0.12542349 92:0.11623575 93:-0.052308142 94:-0.049938764 95:-0.055695452 96:-0.10933288 97:0.093588851 98:0.024587478 99:-0.041493692 100:0.050807275 101:0.049351905 102:-0.087455541 103:-0.040753484 104:-0.03087288 105:-0.26198933 106:-0.10154732 107:0.1199614 108:0.023676826 109:-0.078760251 110:-0.17657734 111:-0.035594635 112:0.084338345 113:-0.19602571 114:0.044273514 115:-0.12129731 116:-0.031900968 117:0.11768025 118:-0.015815748 119:-0.10057496 120:-0.14539227 121:0.1714884 122:0.088544421 123:0.041577563 124:0.1119957 125:0.082128972 126:0.006462872 127:0.016765771
0.1889599372325675 0:0.041713811 1:0.043081999 2:-0.091815926 3:-0.052683767 4:0.056648146 5:-0.011095919 6:-0.069263957 7:-0.14922883 8:0.033324637 9:0.0055900682 10:-0.05558107 11:0.089534208 12:0.0022743463 13:-0.012010496 14:0.11310712 15:-0.061973896 16:-0.023768619 17:-0.13515078 18:0.083318055 19:0.017259121 20:0.066347726 21:-0.085175194 22:0.086797602 23:0.065747224 24:0.0052062133 25:0.07093101 26:0.010732016 27:-0.077712461 28:0.11797749 29:0.19820464 30:0.0066847624 31:0.071149319 32:-0.1533028 33:-0.14795704 34:-0.10148879 35:0.1580372 36:0.11301838 37:0.018611133 38:0.0090597058 39:-0.010343821 40:-0.058600508 41:-0.079146877 42:-0.046183411 43:-0.058465112 44:-0.086292095 45:0.030071596 46:0.020489795 47:0.031289976 48:-0.0032570586 49:0.0052882233 50:-0.064024411 51:0.25447732 52:0.15144359 53:-0.011038651 54:0.076717973 55:-0.074079104 56:-0.0020892932 57:0.16353971 58:0.11866079 59:0.069231793 60:0.10091815 61:-0.028204583 62:-0.14298119 63:-0.099274755 64:0.058262665 65:-0.071819142 66:0.025884148 67:0.092924461 68:0.082155719 69:-0.0095548769 70:0.010507485 71:-0.025696786 72:0.15661329 73:0.0058229789 74:-0.0061575463 75:-0.064822204 76:-0.0096050911 77:0.048721731 78:-0.12949704 79:-0.00048309757 80:0.062589251 81:-0.098267935 82:0.084113844 83:-0.13733053 84:-0.043749731 85:-0.05488167 86:0.019123366 87:-0.056474254 88:-0.0053559933 89:-0.097428255 90:-0.021323465 91:0.10394122 92:0.12347198 93:-0.076476045 94:-0.060722698 95:-0.036500942 96:-0.12584874 97:0.085880749 98:0.044110652 99:-0.035416689 100:0.040731356 101:0.066288203 102:-0.10000805 103:-0.069261909 104:-0.025538819 105:-0.27970597 106:-0.09797553 107:0.13365084 108:0.012209886 109:-0.067893468 110:-0.18558654 111:-0.010446768 112:0.042384554 113:-0.17227599 114:0.068865821 115:-0.11540295 116:-0.029485077 117:0.10943904 118:-0.0093773063 119:-0.085469157 120:-0.14196877 121:0.15316218 122:0.071081549 123:0.042866059 124:0.090841711 125:0.067990765 126:0.024296788 127:0.048967447
0.356171101509426 0:0.083090477 1:0.035881896 2:-0.10264729 3:-0.079361714 4:0.027152631 5:-0.042443685 6:-0.013319381 7:-0.16197763 8:-0.010782876 9:0.01493198 10:0.0058063525 11:0.10427868 12:0.029111298 13:-0.024883699 14:0.11180219 15:-0.057255425 16:0.010482853 17:-0.086959057 18:0.098207258 19:0.0224507 20:0.081880637 21:-0.042264793 22:0.08765018 23:0.095444337 24:-0.026729977 25:0.1065691 26:-0.018091729 27:-0.064589836 28:0.11149803 29:0.21726081 30:0.009135887 31:0.070676178 32:-0.16184628 33:-0.15616919 34:-0.17255081 35:0.16075869 36:0.028972693 37:0.042580094 38:0.01003663 39:-0.00467184 40:-0.024269635 41:-0.065995075 42:-0.034469862 43:-0.048962977 44:-0.082396112 45:0.037605796 46:-0.0025566856 47:0.018316301 48:-0.059912283 49:-0.014124106 50:-0.07081975 51:0.16963245 52:0.12059624 53:-0.046095464 54:0.049106538 55:-0.072645411 56:-0.026484162 57:0.12954633 58:0.14860867 59:0.068982646 60:0.12171584 61:-0.042759087 62:-0.13893534 63:-0.16924043 64:0.014791356 65:0.021300251 66:-0.017508401 67:0.034966994 68:0.10768812 69:-0.046231978 70:-0.034645122 71:-0.039464865 72:0.10507275 73:-0.01931507 74:0.038020778 75:-0.028173592 76:-0.041284557 77:0.047523338 78:-0.072214954 79:0.002086533 80:0.1087334 81:-0.10463254 82:0.08927884 83:-0.16290236 84:-0.003801903 85:-0.050465878 86:0.0098502366 87:-0.054596622 88:0.0059317877 89:-0.10018605 90:-0.035635069 91:0.11927779 92:0.11243607 93:-0.025809038 94:-0.023808565 95:-0.097227544 96:-0.10358892 97:0.11509036 98:0.06610816 99:-0.063595138 100:0.01977131 101:0.026821002 102:-0.027907167 103:0.017522112 104:-0.054464977 105:-0.27559504 106:-0.10004856 107:0.10960201 108:0.045142367 109:-0.080977954 110:-0.13065027 111:-0.019427745 112:0.052484564 113:-0.20115958 114:0.015309382 115:-0.1238234 116:-0.021599023 117:0.14501688 118:-0.0018643811 119:-0.097507559 120:-0.15468878 121:0.16368541 122:0.084232152 123:0.045826145 124:0.14640544 125:0.10077457 126:0.0048952643 127:0.00095724832
-0.05612757379791415 0:0.10378603 1:0.021830684 2:-0.011668648 3:-0.084020823 4:0.11523163 5:0.16795045 6:-0.084651552 7:0.12808238 8:0.046218965 9:-0.067068748 10:-0.01050463 11:0.037231192 12:-0.014205315 13:0.076333605 14:-0.011784149 15:-0.016914591 16:0.034444924 17:-0.02370771 18:-0.0016922341 19:-0.16857755 20:0.11190296 21:0.0012244456 22:-0.042000797 23:0.029082764 24:-0.16308941 25:0.22678798 26:0.021373808 27:0.05087572 28:0.20579952 29:0.063835986 30:-0.08218275 31:-0.044234004 32:-0.024407379 33:-0.12314597 34:-0.10220771 35:-0.040952764 36:0.028546901 37:0.1307694 38:-0.070454165 39:0.010071487 40:0.0668962 41:-0.026295347 42:-0.052212123 43:-0.14998785 44:0.047474016 45:-0.065023087 46:-0.070706002 47:0.073601142 48:-0.11741056 49:-0.018383877 50:0.045340795 51:0.10432319 52:-0.077799886 53:-0.076302603 54:0.028700817 55:-0.045304008 56:-0.11087329 57:0.05321845 58:0.05872241 59:0.036005571 60:-0.070289031 61:-0.1281295 62:-0.03553167 63:-0.056643508 64:0.045226291 65:-0.14386991 66:0.10926843 67:0.0022273255 68:0.078352213 69:0.010188424 70:0.0087441616 71:0.10819614 72:0.049959213 73:-0.11445835 74:-0.032216325 75:-0.18426321 76:0.03625394 77:-0.0029627706 78:-0.0071457434 79:0.094908178 80:-0.02274899 81:-0.038744092 82:0.0095565729 83:-0.19630904 84:0.038964059 85:0.1870421 86:0.073824778 87:-0.16518088 88:-0.039775614 89:-0.045463841 90:-0.11044233 91:0.12782677 92:0.12027829 93:0.051015019 94:0.11951199 95:-0.09570279 96:-0.15588029 97:0.0034993039 98:-0.024270836 99:-0.09093646 100:0.081239678 101:0.083011054 102:0.03517209 103:0.0083131017 104:-0.14240597 105:0.01265295 106:0.10143626 107:-0.058350902 108:-0.11041536 109:-0.082397714 110:-0.042524699 111:-0.10735162 112:0.099507801 113:0.017388441 114:0.037999351 115:0.0036242837 116:-0.028528795 117:0.033692822 118:0.20023839 119:-0.042116225 120:0.020272024 121:-0.05128555 122:0.15354888 123:-0.015401131 124:-0.085192993 125:0.0098382728 126:-0.14329131 127:-0.11760122
-0.6523596134892871 0:0.13689755 1:0.014449684 2:-0.030574 3:-0.050823014 4:0.11932206 5:0.15471569 6:-0.16640002 7:0.11125171 8:0.0052616284 9:0.01403259 10:-0.032884885 11:0.0087129278 12:0.038635194 13:0.015237736 14:0.0073078414 15:0.020260196 16:0.011844252 17:-0.032883942 18:0.033286341 19:-0.14778154 20:0.092751756 21:0.040973999 22:-0.052001238 23:0.045099244 24:-0.15364288 25:0.21488971 26:0.090180978 27:-0.022570407 28:0.12159217 29:0.061767418 30:-0.08382608 31:0.010872589 32:-0.0021496902 33:-0.07860145 34:-0.05700729 35:-0.063313715 36:0.033368964 37:0.14979771 38:-0.072100207 39:0.035494391 40:0.16490227 41:-0.0039108163 42:-0.056940868 43:-0.13951644 44:0.0090367692 45:-0.010388664 46:-3.2001597e-05 47:0.086377874 48:-0.13013911 49:-0.09070573 50:0.02763425 51:0.12158474 52:-0.099271178 53:-0.049203459 54:0.075354479 55:-0.023029407 56:-0.17907001 57:0.070354752 58:0.14282243 59:0.0034687456 60:-0.060810432 61:-0.1761537 62:-0.075805165 63:-0.003830468 64:0.065249689 65:-0.20971306 66:0.13624606 67:0.021719791 68:0.052806269 69:0.0068715853 70:-0.055908296 71:0.10070759 72:0.089928359 73:-0.049604196 74:0.022550801 75:-0.12249383 76:0.022329353 77:-0.030147461 78:-0.026255237 79:0.063345417 80:-0.070957951 81:-0.063197508 82:0.047862567 83:-0.24243608 84:0.046294484 85:0.15487838 86:0.039656956 87:-0.16902366 88:-0.021909911 89:-0.015627258 90:0.013633505 91:0.072777428 92:0.09876547 93:0.054467425 94:0.11272025 95:-0.075604454 96:-0.12564114 97:-0.024715342 98:-0.040374335 99:-0.056050438 100:0.036961775 101:0.063147619 102:0.016683416 103:-0.047588877 104:-0.081007153 105:0.0026808486 106:0.0083495537 107:-0.053726487 108:-0.11652298 109:-0.082661085 110:-0.12038574 111:-0.089432918 112:0.12036726 113:-0.078808278 114:0.084529169 115:0.02178623 116:-0.056848515 117:0.0095838895 118:0.12701577 119:-0.062937416 120:-0.0060079782 121:-0.029166792 122:0.18190886 123:-0.034365032 124:-0.058088694 125:0.12685141 126:-0.15889789 127:-0.043237004
-0.5320817125661508 0:0.15741834 1:0.014924879 2:-0.035109822 3:-0.055807102 4:0.091201551 5:0.14285181 6:-0.16160582 7:0.078945652 8:0.016520264 9:0.014707695 10:-0.048823401 11:0.013858371 12:0.018422166 13:0.042749647 14:0.015012794 15:0.0094923889 16:0.0039524287 17:-0.012119943 18:0.048706971 19:-0.1658283 20:0.1037154 21:0.055486903 22:-0.034293398 23:0.05810919 24:-0.15656899 25:0.23826149 26:0.057583347 27:-0.041036516 28:0.10227452 29:0.059531745 30:-0.10823473 31:0.024269462 32:0.0075430735 33:-0.10053313 34:-0.063572705 35:-0.062977888 36:0.059993926 37:0.13624999 38:-0.051278029 39:0.015626792 40:0.15322593 41:0.018794317 42:-0.058517464 43:-0.14719258 44:0.040837467 45:0.0027287044 46:-0.010575648 47:0.071904741 48:-0.12061081 49:-0.079948276 50:0.041890781 51:0.10523295 52:-0.078399718 53:-0.065114535 54:0.067006841 55:-0.026176615 56:-0.17716697 57:0.066973113 58:0.10637022 59:-0.0097468011 60:-0.045944788 61:-0.16955529 62:-0.080382779 63:0.019127198 64:0.071513884 65:-0.1861883 66:0.15397038 67:0.049339689 68:0.030867839 69:0.0051336759 70:-0.064921267 71:0.079986088 72:0.075313807 73:-0.042432252 74:0.021379892 75:-0.15234084 76:0.05353561 77:-0.043357972 78:-0.0055541229 79:0.06609083 80:-0.046766903 81:-0.053260706 82:0.014653953 83:-0.25568551 84:0.061650831 85:0.17406726 86:0.034611583 87:-0.17037567 88:-0.016943805 89:0.012813309 90:0.0075516831 91:0.092334665 92:0.091752619 93:0.07002423 94:0.10976368 95:-0.082288817 96:-0.14476596 97:-0.022506956 98:-0.050708015 99:-0.098836608 100:0.022161616 101:0.085575208 102:-0.018403456 103:-0.03394964 104:-0.099122658 105:0.0045376578 106:0.017750494 107:-0.042136353 108:-0.08477331 109:-0.078130946 110:-0.11206643 111:-0.077971175 112:0.13284877 113:-0.09061271 114:0.078785606 115:0.021656556 116:-0.034394644 117:0.043125138 118:0.13405509 119:-0.036743887 120:0.0030887157 121:-0.019994602 122:0.16902868 123:-0.036131356 124:-0.06050599 125:0.097624555 126:-0.1638049 127:-0.05946305
-0.3045621388886415 0:0.13265789 1:0.04287947 2:-0.022533843 3:-0.029676065 4:0.13877369 5:0.093323424 6:-0.16009481 7:0.14464186 8:0.078488223 9:-0.019689847 10:-0.047618113 11:0.050374396 12:-0.026399303 13:0.03577666 14:-0.064175278 15:-0.0031519399 16:0.030996069 17:-0.012105824 18:0.074721694 19:-0.13510875 20:0.034691051 21:0.044050481 22:-0.024635581 23:0.023730418 24:-0.15296528 25:0.18736264 26:-0.016271692 27:-0.026707249 28:0.18839835 29:0.071683466 30:-0.029572923 31:-0.0069101928 32:0.03177242 33:-0.10820068 34:-0.056612954 35:-0.066101216 36:0.024351677 37:0.1544853 38:-0.095620312 39:0.049340796 40:0.062288772 41:-0.018870469 42:-0.10460676 43:-0.12368365 44:0.063146904 45:-0.0096856831 46:-0.055579446 47:0.01499547 48:-0.059010319 49:-0.020819254 50:0.022097986 51:0.13511468 52:-0.061164215 53:-0.084607273 54:0.024798114 55:-0.115839 56:-0.12259501 57:0.022010641 58:0.10695752 59:-0.023524566 60:-0.017427064 61:-0.12032808 62:-0.045925461 63:0.032045227 64:0.020688491 65:-0.17340675 66:0.098170549 67:0.065298922 68:0.077701703 69:-0.020727865 70:-0.043914281 71:0.094675548 72:0.086765252 73:-0.16631667 74:-0.0040651988 75:-0.14209117 76:0.028333241 77:0.040182944 78:-0.071837462 79:0.086242184 80:-0.038932201 81:-0.01737581 82:-0.01370337 83:-0.25589895 84:-0.0022302177 85:0.22972676 86:0.021776997 87:-0.18178098 88:-0.043261051 89:0.0097726109 90:-0.056673307 91:0.16500759 92:0.085731834 93:0.026602793 94:0.11517385 95:-0.099840604 96:-0.16715655 97:0.024331942 98:-0.067629561 99:-0.067295313 100:0.053321753 101:0.066249423 102:0.050632149 103:-0.040784474 104:-0.11061683 105:-0.010278202 106:0.080082864 107:-0.0037509578 108:-0.095562667 109:-0.15826115 110:-0.085019626 111:-0.059350546 112:0.11749499 113:0.024804683 114:0.10132024 115:0.0097814417 116:-0.031180723 117:-0.015148983 118:0.1490718 119:0.041099191 120:0.11612652 121:-0.016521929 122:0.11280422 123:-0.013151463 124:-0.093105428 125:0.051658563 126:-0.12643787 127:-0.057971671

Sample testing file is :

1 0:0.08869078010320663 1:0.11265850812196732 2:-0.08408842980861664 3:-0.10734429955482483 4:0.09569554775953293 5:-0.04100192338228226 6:0.034704867750406265 7:-0.10822495073080063 8:0.012405091896653175 9:-0.029127100482583046 10:0.0064252461306750774 11:0.05538499355316162 12:0.04487818852066994 13:-0.028461096808314323 14:0.08426535129547119 15:-0.06002195551991463 16:0.06701210141181946 17:-0.1009158119559288 18:0.12260399013757706 19:-0.01649876870214939 20:0.06636760383844376 21:-0.020320240408182144 22:0.02745750918984413 23:0.06801094114780426 24:-0.022455191239714622 25:0.17613570392131805 26:0.03230143338441849 27:-0.07424364238977432 28:0.09025981277227402 29:0.25953787565231323 30:-0.03251327574253082 31:0.044713057577610016 32:-0.12395288795232773 33:-0.1393536627292633 34:-0.08701413869857788 35:0.23229508101940155 36:0.058736104518175125 37:0.04533463716506958 38:-0.09867251664400101 39:0.009782051667571068 40:-0.030127104371786118 41:-0.11337379366159439 42:-0.06302675604820251 43:-0.07429847121238708 44:-0.009119205176830292 45:-0.019430872052907944 46:-0.006596052553504705 47:0.07191171497106552 48:-0.04393550381064415 49:0.040151696652173996 50:0.01908142678439617 51:0.1764426976442337 52:0.10995174199342728 53:-0.064067043364048 54:-0.004895928781479597 55:-0.06563697010278702 56:0.08175726979970932 57:0.16051332652568817 58:0.1841152012348175 59:0.11843384802341461 60:0.13972434401512146 61:-0.007530031260102987 62:-0.13817663490772247 63:-0.1228240355849266 64:0.056343164294958115 65:-0.03253542259335518 66:-0.00896674208343029 67:0.018803920596837997 68:0.07428935170173645 69:-0.038438934832811356 70:0.01932407356798649 71:-0.017723485827445984 72:0.1281740367412567 73:0.02006290853023529 74:1.9832190446322784E-5 75:-0.0075545888394117355 76:-0.03605575114488602 77:0.05641820654273033 78:-0.10415695607662201 79:-0.001241249730810523 80:0.09665007889270782 81:-0.0515027679502964 82:0.05948938429355621 83:-0.08536400645971298 
84:-0.07551930099725723 85:-0.07200975716114044 86:0.05448233708739281 87:-0.06424335390329361 88:0.0023951532784849405 89:-0.08267461508512497 90:-0.060305897146463394 91:0.09392296522855759 92:0.17346027493476868 93:-0.05862409621477127 94:-0.04794041067361832 95:-0.09105176478624344 96:-0.11403479427099228 97:0.04632284864783287 98:0.0655788704752922 99:-0.1020464226603508 100:0.03114120289683342 101:-1.2855100794695318E-4 102:0.01882193237543106 103:0.018400922417640686 104:-0.04462011530995369 105:-0.245097354054451 106:-0.07447024434804916 107:0.08699245750904083 108:-0.003961259964853525 109:-0.020035304129123688 110:-0.16071002185344696 111:-0.035762544721364975 112:0.06791653484106064 113:-0.16253145039081573 114:0.0739869624376297 115:-0.06037476286292076 116:2.73046171059832E-4 117:0.1137760728597641 118:0.0011637017596513033 119:-0.11534024029970169 120:-0.07720830291509628 121:0.20741181075572968 122:0.08804057538509369 123:0.025654399767518044 124:0.12709087133407593 125:0.07892488688230515 126:-0.05567362904548645 127:-0.008171903900802135

My doubt is this: my testing file has only one data point (one line), and its label is 1, so I get target_label as 1 every time. My model file contains data for 2 persons, and the target face belongs to class 2, so I get predict_label as 2 (which is correct). But in the code (svm_predict.java), at the check if(predict_label == target_label) ++correct;, the variable correct never gets incremented.

So accuracy is always 0.0%.

Please help me with this case.
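One likely source of confusion here: in the LIBSVM test-file format, the first token on each line is the true class label (target_label), not a line index or a feature index. svm_predict.java tokenizes each line and reads the label first, then the index:value feature pairs. A sketch of that parsing step (the helper name is illustrative; the tokenizer delimiters match what svm_predict.java uses):

```java
import java.util.StringTokenizer;

// Illustrative helper: how svm_predict.java reads the target label
// from a test line. The FIRST token is the true class label.
public class TestLineSketch {
    static double targetLabel(String line) {
        StringTokenizer st = new StringTokenizer(line, " \t\n\r\f:");
        // The remaining tokens alternate index and value; only the
        // first token is the label the prediction is compared against.
        return Double.parseDouble(st.nextToken());
    }

    public static void main(String[] args) {
        System.out.println(targetLabel("1 0:0.0886 1:0.1126"));
        System.out.println(targetLabel("2 0:0.0886 1:0.1126"));
    }
}
```

So if the single test line above describes the person trained as class 2, it should begin with `2 ...` rather than `1 ...`; then predict_label and target_label would both be 2 and accuracy would be 100%.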

@Codezhak
Author

Hi,

Could you please help me with this case? I am attaching the predict output data below. Could you please explain what these values mean?

labels 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 85 86 87 88 89 90 91 92 93 94 95 96

69.0 0.009766350027215745 0.008718788435991671 0.008714248367779589 0.01009612910406905 0.008810651760798765 0.007199669815463763 0.011279611860364866 0.007876555804444537 0.008771739007047094 0.010226308566184304 0.007987598090904446 0.013106959757897264 0.012791730642928609 0.013306778616485385 0.010275457089693834 0.010381747063016636 0.01048066547401133 0.012130526059080481 0.011596904443943094 0.008525880016375508 0.013414159773918148 0.011080507534170378 0.010220942129122727 0.011335087843791102 0.012599536513399808 0.007963997885801002 0.011967775983568104 0.010809823695305287 0.008077167983766769 0.009239526561179642 0.011225212243597642 0.009698110646958864 0.012332212996859214 0.0016028208496402278 0.010039914982045957 0.013777257243892135 0.008441028976934514 0.0087232354065756 0.009852114786750818 0.010543586117699196 0.014802952166708195 0.013204612745489714 0.01105621159172587 0.015241204046930735 0.011400324980291008 0.011703140846543602 0.005625613086914375 0.01330125846852294 0.012927454682873481 0.010569029308808374 0.005529270401969083 0.013183109915957053 0.010996940370459424 0.013776714701248008 0.010684849080162637 0.012601060135684591 0.009348879873881624 0.0074177779597208535 0.01099654683054148 0.006905452504926945 0.012869029339493254 0.011889579446811767 8.114384939217007E-4 0.012949375124371191 9.99447613272359E-4 0.009111369709154448 0.013002952980131162 0.009118058694081534 0.02296124258017252 0.007067531681322627 0.010405418275145323 0.009865121234733812 0.009910737045806908 0.00995742795217193 0.009089866311820006 0.010290719239926034 7.396414939486157E-4 0.010983618119280952 0.009701946614437403 0.009865429548498493 0.011044470061753736 0.007626160404856159 0.011514783926442175 0.010377611151098002 0.010029471694536468 0.017120699872452055 0.009992492331929706 0.00956459869570664 0.01148371498765532 0.012745678554922937 0.018075933514996353 8.351298078053558E-4 0.022526354673576 0.013679062638080873 0.014729257738683164 
8.039045889690222E-4
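For reference, when prediction is run with probability estimates enabled (-b 1), the output begins with a header line listing the class labels; each following line gives the predicted label first (69.0 above), then one probability per class, in the header's label order. The predicted label corresponds to the column with the largest probability. A sketch of reading such a line (class and method names are illustrative):

```java
// Sketch of interpreting one probability-output line from svm-predict -b 1.
// Assumption: the header line ("labels 1 2 ... 96") fixes the column order.
public class ProbLineSketch {
    static int argmax(double[] probs) {
        int best = 0;
        for (int i = 1; i < probs.length; i++)
            if (probs[i] > probs[best]) best = i;
        return best;
    }

    public static void main(String[] args) {
        int[] labels = {1, 2, 3};
        // First value on an output line is the predicted label; the rest
        // are per-class probabilities aligned with the header's labels.
        double[] probs = {0.2, 0.7, 0.1};
        System.out.println("predicted label: " + labels[argmax(probs)]);
    }
}
```

In the output above, all 96 class probabilities are small and close together (mostly around 0.01), which suggests the probability model cannot confidently assign the test vector to any class.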
