Banged my head on this way too long before I realized: when you add a new path to your environment variables, you probably want to restart PyCharm to make sure its terminal reloads the new PATH info and can now resolve whatever new resources were added to it. =_=+ embarrassingly long.
Fun fun fun installing Matlab on CentOS 7…. on a server…. without a display.
Our resident magical IT guru managed to get the X server up and running, but even then… we saw a splash screen and the installation just reported FINISHED, exactly like what is described here: https://www.mathworks.com/matlabcentral/answers/229857-why-do-i-see-preparing-installation-files-installing-finished-in-the-terminal-window-wh
This problem blocked us for a while and never got resolved. Then our magical IT guru Rolando took another stab at it and solved it, all thanks to this post by John Miezitis (posted recently, on the 24th of last month), who even more magically found that the libXtst library was MISSING and was "silently" crashing the installation, as in this post: https://www.mathworks.com/matlabcentral/answers/229857-why-do-i-see-preparing-installation-files-installing-finished-in-the-terminal-window-wh#answer_330037
Kudos to John, wherever you are. We owe you a beer in Montreal.
Yeah, we need a GUI on a server (sigh) because SPM/Matlab and a few other programs are heavily GUI-driven, and to cut down debugging time, a GUI really helped with putting the analysis script integration together. I really wish this abomination had never been born, but there are so many requests for SPM-based analysis pipelines (looking at MANTIS right now…)
Wow… fun times working with Matlab/Linux/precompiled binaries…
So, I got this awesome software I had never heard of called iBeat, which does pretty epic neonatal segmentation. Except no one I know has ever used it. So. I am the dude who shall eat and relish this proverbial crab.
Pre-compiled binary. Joyous.
Then, after editing .bashrc to source their bash setup script,
MCR crashed. MCR, the Matlab Compiler Runtime, is what this GUI and others like it were built on. FUNNNNN… A cryptic message, just like the one reported in the solution thread:
Fatal error loading library /home/lz225/v76/bin/glnxa64/libmwmclmcr.so Error: libXp.so.6: cannot open shared object file: No such file or directory
Down the search rabbit hole I went.
Glad to see I was not the only one running into this issue: https://www.linuxquestions.org/questions/linux-newbie-8/error-libxp-so-6-cannot-open-shared-object-file-no-such-file-or-directory-924315/
TL;DR so far: this libmwmclmcr.so file (god… what could that abbreviation possibly mean???) requires a dynamically linked library called libXp.so.6, which is apparently used for printing?
Then, a few more stack-something-google searches later, I found this thread: https://askubuntu.com/questions/944838/libxp-so-6-missing-for-ubuntu-17-04
Which led me to try that fix too. Downloaded the package from https://packages.ubuntu.com/trusty/libxp6
sudo dpkg -i ./libxp6_1.0.2-1ubuntu1_amd64.deb
sudo apt-get install -f
and even though the prompt said NOTHING was updated, it worked, miraculously, by some dark menacing black magic I guess.
And now I can actually launch iBeat.
How did people solve these problems PRE search engines??? It must have taken forever on mailing lists, or snail-mailing the author for giggles?
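For future me: before hunting packages, it can help to confirm which shared library is actually unresolvable. This is a minimal Python sketch (not part of the iBeat/MCR tooling) that probes whether the dynamic linker can load a given library name, like the one from the error message above:

```python
import ctypes

def can_load(libname):
    """Return True if the dynamic linker can resolve and load `libname`."""
    try:
        ctypes.CDLL(libname)
        return True
    except OSError:
        return False

# Probing the library from the MCR error message above:
#   can_load("libXp.so.6")  -> False until a package providing it is installed
```

(`ldd /path/to/binary` gives the full dependency picture; this probe just checks one name at a time.)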
Update 1: Okay… maybe I spoke too soon. The moment iBeat ran once, the system crashed. Super impressive, as this is the SECOND time this has happened: previously with a super-stable CentOS, now with Ubuntu. Something feels odd. Don't try the crossed-out fix until you have a VM snapshot or the like set up. This is getting personal 😛 I reinstalled Ubuntu, snapshotted it to hell, and am going to do SCIENCE on iBeat so hard until it works.
Update 2: Retried that patch method… seeing if the Ubuntu terminal dies.
Update 3: Nope. Ubuntu is still alive and kicking THIS time. Hmmm… then this weird death of both a VM and my old laptop must have just been a freaky coincidence. I did install Anaconda right before this last time… wondering if they are related… Either way, VM snapshots all the time now.
So… be me. Tried to install VirtualBox on OSX High Sierra. The install always fails, yet it shows up as installed. When I run it, it says the kernel extension and other components are not even installed. How can this even happen?
So furious googling ensued.
Then I found this StackExchange thread on the issue… and the cause is even more bizarre.
Please review the following two posts to see the excellent problem descriptions:
TL;DR: apparently, OSX DELIBERATELY HIDES the "Allow" button (the Windows UAC equivalent? XD) WHEN a mouse-sharing program is running, PREVENTING you from allowing the installation, elevating permissions, etc. Neat security bulletproofing, but at the cost of user convenience with ZERO warning WHATSOEVER. I had ShareMouse running, and maybe even TeamViewer; closed both of them, and the "Allow" button in the Security & Privacy section worked and VirtualBox installed fine. What.The.Hell…
Ran into this enough times that I'm frustrated enough to write it down:
When running Python from an old Anaconda distro on Windows and you need to update pip, don't use pip to update pip, because Windows won't let that happen (pip.exe can't replace itself while it's running). Instead:
- Launch Admin Anaconda Prompt
- Run: python -m pip install --upgrade pip
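The reason the two-step recipe above works, as I understand it: `python -m pip` runs pip inside the current interpreter process instead of launching pip.exe, which Windows locks and therefore cannot overwrite mid-upgrade. A tiny sketch (the helper name is mine, not from pip):

```python
import subprocess
import sys

def pip_upgrade_cmd(package="pip"):
    """Build the upgrade command against THIS interpreter's pip module."""
    return [sys.executable, "-m", "pip", "install", "--upgrade", package]

# From an admin Anaconda Prompt this would perform the actual upgrade:
#   subprocess.run(pip_upgrade_cmd(), check=True)
```

Using `sys.executable` also guarantees you upgrade the pip belonging to the Anaconda Python you are actually running, not some other install on PATH.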
Hmm… that was surprisingly painful to get it all working.
- Make sure Python is installed. Running pip updates and installing all dependencies from an admin prompt seemed to help. Even though I started from Anaconda, this took longer than expected.
- “pip install datalad”
- Install Git (latest 32-bit, as recommended by Git-Annex): be careful here, as the PATH setting is best confined strictly to the Git console (so as not to mess up your other tools).
- Install Git-Annex from their website (this is NOT installed by DataLad, but required).
- Launch Git Console.
- Test Git
- Test Git Annex.
- … and it seems datalad works from that console only.
I still ran into bugs, though, and I failed to see anything being majorly changed in the folder I was running this from (no .datalad, .git, etc.):
$ datalad save . -m "Initial commit of all T2 data from CNBP registry before any kind of conversion"
Failed to run ['git', '-c', 'receive.autogc=0', '-c', 'gc.auto=0', '-c', 'core.bare=False', '--work-tree=.', 'annex', 'proxy', '--', 'git', 'add', '--update', '--verbose'] under 'H:\\DataLadExp'. Exit code=1.
out= failed
err=git-annex: .git\annex\misctmp\proxy.0: removeDirectoryRecursive:removeContentsRecursive:RemoveDirectory "\\\\?\\H:\\DataLadExp\\.git\\annex\\misctmp\\proxy.0": unsatisfied constraints (The directory is not empty.)
git-annex: proxy: 1 failed
failed
git-annex: .git\annex\misctmp\proxy.0: removeDirectoryRecursive:removeContentsRecursive:RemoveDirectory "\\\\?\\H:\\DataLadExp\\.git\\annex\\misctmp\\proxy.0": unsatisfied constraints (The directory is not empty.)
git-annex: proxy: 1 failed
So I guess it is not entirely working yet…
Update: on a second try with a SHORTER commit message, it seems to have worked. Not sure if it is a folder creation/permission bug, but… it saved (okay, wow).
Update 1: Just tried it on another computer… and it is complaining that datalad is missing from the Git-Bash window. Hmmmm…… I am definitely doing something wrong here… The mystery deepens.
Update 2: Finally figured out that C:\ProgramData\Anaconda\Scripts must be added to the environment search path for Git Bash to find the datalad executable. So to recap: 1) it must be run from a Git-Bash window, because only that shell can reference Git/Git-Annex properly; 2) the PATH must include the directory of the DataLad executable. Once both are set, datalad SHOULD work in the Git-Bash window. 3) Testing shows the Anaconda console/terminal/bash will not work, as it does not see Git/Git-Annex properly… unless you somehow configure that too? I swear this is how DLL hell used to start…
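A quick sanity check I wish I had from the start: from Python launched inside whichever shell you are debugging, ask which of the three tools that shell can actually see on its PATH. A minimal sketch (the helper name is mine):

```python
import shutil

def missing_tools(tools):
    """Return the subset of `tools` NOT found on the current shell's PATH."""
    return [t for t in tools if shutil.which(t) is None]

# Run from a Python started inside the shell you are debugging, e.g.:
#   missing_tools(["git", "git-annex", "datalad"])
# An empty list means that shell should be able to run datalad.
```

If `datalad` shows up as missing from Git Bash, that points straight at the Anaconda Scripts directory not being on that shell's PATH, as above.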
Another funny quirk about Python's fnmatch algorithm.
I was playing around with Pydicom, which has a function called get_testdata_files; it calls get_files in its data_manager.py, which calls fnmatch from fnmatch.py, part of the Anaconda 3 standard lib. However, fnmatch itself is quirky in that it enforces case-sensitive matching on Linux but relaxes this on Windows. So calling the Pydicom function with "JPEG" matches fewer test files on Ubuntu (e.g. on Travis-CI) than on Windows (my dev machine).
Using fnmatchcase is probably a better idea.
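The quirk above boils down to fnmatch running both the filename and the pattern through os.path.normcase, which lowercases on Windows but does nothing on Linux. A small demo:

```python
import fnmatch
import os

# fnmatch.fnmatch() normalizes BOTH the name and the pattern with
# os.path.normcase(), so matching is case-insensitive on Windows but
# case-sensitive on Linux -- the Travis-CI vs dev-machine mismatch above:
print(fnmatch.fnmatch("IMAGE.JPEG", "*.jpeg"))  # True on Windows, False on Linux

# fnmatch.fnmatchcase() skips the normalization and behaves the same
# on every platform:
assert fnmatch.fnmatchcase("IMAGE.JPEG", "*.JPEG")
assert not fnmatch.fnmatchcase("IMAGE.JPEG", "*.jpeg")
```

So with fnmatchcase, the pattern you write is the pattern you get, regardless of OS.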
Just spent a day finding out why Emgu was giving me one hell of a time computing a simple PCA for a bunch of 2D points.
For some reason we haven't pinned down (yet replicated across two dev machines), if you pass the wrong matrix size as an output parameter to functions like CvInvoke.CalcCovarMatrix or CvInvoke.Eigen, it fails in a very insidious way: no error is given, the output matrices just contain all zeros, leaving you wondering what the hell went wrong…
Also, CalcCovarMatrix's mandatory flags apparently need to be OR'ed together somehow; for our layout, "Emgu.CV.CvEnum.CovarMethod.Normal | Emgu.CV.CvEnum.CovarMethod.Cols" worked for us. Afterwards, the covariance matrix needs to be normalized by the number of samples in order to get the same results as numpy.
Lastly, this one is on me: eigenvalues and eigenvectors are arranged in descending order in EmguCV/OpenCV.
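For my own sanity, here is the same math in NumPy (not EmguCV/C#), under the assumption that the EmguCV covariance call with the Normal flag returns the UNSCALED scatter matrix, which is why dividing by the sample count was needed to match numpy:

```python
import numpy as np

rng = np.random.default_rng(0)
pts = rng.normal(size=(100, 2))        # 100 two-dimensional points

centered = pts - pts.mean(axis=0)
scatter = centered.T @ centered        # the unscaled sum-of-outer-products
cov = scatter / len(pts)               # normalize by the number of samples...

# ...which matches numpy's BIASED covariance (divide by N, not N-1):
assert np.allclose(cov, np.cov(pts.T, bias=True))

# Ordering gotcha: numpy's eigh returns eigenvalues ASCENDING, while
# OpenCV/EmguCV returns them DESCENDING -- flip before comparing.
vals, vecs = np.linalg.eigh(cov)
vals_desc = vals[::-1]                 # OpenCV-style ordering
assert vals_desc[0] >= vals_desc[1]
```

Note numpy's default `np.cov` divides by N-1, so whether your results "match numpy" also depends on which estimator (biased vs unbiased) you compare against.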
Almost have my PCA working as intended to identify the principal directions of two-dimensional cluster distributions…
Words cannot express the amount of joy when I first started using YNAB. It felt leaps and bounds better than old Mint, which hasn't really been updated in a decade.
So… then the YNAB TD Canada Trust importer broke around Feb/March of this year. Not YNAB's fault, I know. But the fact that it was unable to connect to my bank account for about three months, while Mint was back up 3 days after the TD Canada interface update, makes me wonder if YNAB should switch to a better bank connector.
That part… is not the worst.
So today, after everything was finally back up three months later (btw, I pay for YNAB, and not for Mint), I was happily synchronizing everything when I realized that the bank balances were not correct... this is where shxt really hits the fan. The core principle and competency of finance/accounting software, its number one creed above and beyond everything else, is to get its shit together and import/transfer transactions correctly from the bank to the BEST of its abilities. It turned out YNAB only imported ONE month back, and even within that month, some dates were offset by as much as 4 days.
If YNAB requires ME to manually cross-validate every single transaction to rebalance the book against the bank record, we have a major problem here: not only does it make my life harder than Mint did, it is now forcing me to run a full investigation into WHICH lines were not imported or dated properly! So far, rough comparisons suggest everything in March is pretty much gone.
A few quick comparisons (note: the blue transactions are only there because of a LINK from those credit cards, etc.).
I will update again once I begin to sort through the mess.
Update 1: spent the past 24 h cleaning up the mess. A few pieces of good news: manual import of the QBX format has auto-deduplication built in (at least for my main banking account), so that really helped. I had to bulk-import the past 6 months of statements to patch up the missing bits. My suspicion is that when an import is missed, those transactions get neglected and are never imported again, i.e. they completely fall off the book. =_=…
While doing this for a different account, I noticed that the deduplication process failed, resulting in crap-tons of duplicated entries and causing issues…. again, more headaches. This requirement to manually hunt down when and how the out-of-sync occurred really leaves a terrible taste in my mouth, because it fails at the basic level I mentioned earlier: if this software is PULLING data from the bank in the first place, after I gave it permission, it should not require me to go over every single transaction again to ensure it synced over. That is not the job of the user; that is the job of the software. As a user of financial software, my goal and sole purpose in using YNAB or Mint or the like is to budget and track expenses, not to question whether the copy from the bank to the software actually occurred. If I wanted to do that, I could have used a spreadsheet… /rant.