BrainQuant3D setup for the Hands-on, Neural, Behavioural and Histological Data Analysis Workshop by the UBC Dynamic Brain Circuit Cluster, as part of the CAN 2024 meeting.
This repository and tutorial notebooks were developed at the Djavad Mowafaghian Centre for Brain Health by the NINC/UBC Brain Circuits team. If you find the tutorials useful you can acknowledge us with the following statement: "This work was supported by resources made available through the Dynamic Brain Circuits cluster and the NeuroImaging and NeuroComputation Core at the UBC Djavad Mowafaghian Centre for Brain Health (RRID:SCR_019086).” If you use the packages in your research, please cite the originators as per their documentation.
Here is what the final "bq3d-demo-ubc" folder will look like at the end of the demo:
```
.
└── bq3d-demo-ubc/                          ### Setup step 1
    ├── bq3d_demo-main/                     ### Setup step 2
    │   ├── 000_create_subset.ipynb
    │   ├── 00_bq3d_setup.ipynb
    │   ├── 01_bq3d_tutorial.ipynb
    │   ├── bq3d_env.yml
    │   ├── filters/
    │   │   ├── filters.ilp                 ### Downloading Data step 2
    │   │   └── ...
    │   ├── parameter_template.py           ### will be edited in 00_bq3d_setup.ipynb
    │   ├── process_template.py
    │   ├── README.md
    │   └── demo_data/                      ### will be created in 01_bq3d_tutorial.ipynb
    │       ├── analysis/
    │       │   └── ...
    │       └── data/
    │           ├── auto/                   ### Downloading Data step 3; move later
    │           │   ├── Z001.tif
    │           │   ├── Z002.tif
    │           │   └── ...
    │           ├── signal/                 ### Downloading Data step 4; move later
    │           │   ├── Z001.tif
    │           │   ├── Z002.tif
    │           │   └── ...
    │           ├── parameter.py
    │           └── process.py
    └── Warping/                            ### Downloading Data step 1
        ├── ARA2/
        │   └── ...
        └── ...
```
- Create a folder on your computer called "bq3d-demo-ubc" in a convenient location (such as your downloads folder). You can store all of the downloaded information here.
- Download all the files in this GitHub repository as a ZIP file.
- Click the green "Code" button at the top of the page and select "Download ZIP". This will download "bq3d_demo-main.zip" onto your computer. Unzip this file.
- Move the unzipped "bq3d_demo-main" folder into the "bq3d-demo-ubc" folder.
All data used for this tutorial is located here:
- Allen Brain Atlas Registration files are in the Brain Registration OSF component.
- Click "Warping.zip" and select "Download" to download it (477.9 MB zipped, 2.32 GB unzipped).
- Move the unzipped "Warping" folder into "bq3d-demo-ubc".
- The Ilastik filter used is in the ilastik filter OSF component.
- Click "cfos_6h_nov2.ilp.zip" and select "Download" to download it (794.6 MB zipped, 2.75 GB unzipped).
- Move the "cfos_6h_nov2.ilp" filter file into the "bq3d-demo-ubc/bq3d_demo-main/filters" folder.
- Demo CFOS signal brain image slices are in the Demo Data Subset OSF component.
- Click "signal.zip" to download it (91.6 MB zipped, 217.7 MB unzipped).
- Move the unzipped "signal" folder into "bq3d-demo-ubc" for now.
- Demo background autofluorescence brain image slices are in the Demo Data Subset OSF component.
- Click "auto.zip" to download it (77.6 MB zipped, 217.7 MB unzipped).
- Move the unzipped "auto" folder into "bq3d-demo-ubc" for now.
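The download-and-move steps above can be sketched as shell commands. This is a hypothetical sketch: the `mkdir`/`touch` lines only create empty stand-ins for the unzipped OSF downloads so the commands run end-to-end; in practice, run the `mv` lines from the folder containing your real unzipped downloads (and `bq3d_demo-main/filters` will already exist from the setup steps).

```shell
# Stand-ins for the unzipped OSF downloads (hypothetical; your real
# downloads replace these).
mkdir -p Warping/ARA2 signal auto
touch cfos_6h_nov2.ilp

# The moves described above: everything ends up under bq3d-demo-ubc/.
mkdir -p bq3d-demo-ubc/bq3d_demo-main/filters
mv Warping bq3d-demo-ubc/                                  # Downloading Data step 1
mv cfos_6h_nov2.ilp bq3d-demo-ubc/bq3d_demo-main/filters/  # step 2
mv signal auto bq3d-demo-ubc/                              # steps 3-4, "for now"
```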
The data we use in this demo is a subset of a mouse brain hemisphere. We created the subset by choosing every 12th image in the z-stack, narrowing the initial 741 files (Z001, Z002, Z003...) down to 61 files (Z012, Z024, Z036...). This reduces the processing time needed to achieve a result (from around 3 hours for the whole set to about 10 minutes), but it means the cell counts are greatly underestimated compared to the original. Code to create this subset is in the file 000_create_subset.ipynb. We do not recommend using this step for your own images, as the data will be incomplete.
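The subsetting itself is done in Python in 000_create_subset.ipynb; as a rough illustration of the selection rule, the same "every 12th slice" pick can be sketched in the shell (folder names here are hypothetical, and `touch` just creates empty stand-ins for the real .tif slices):

```shell
# Hypothetical stand-in for the full 741-slice stack.
mkdir -p full_stack subset
for i in $(seq -w 1 741); do touch "full_stack/Z${i}.tif"; done

# Keep every 12th slice: Z012, Z024, ..., Z732 -> 61 files.
# seq -w zero-pads the indices to match the Z###.tif naming.
for i in $(seq -w 12 12 741); do cp "full_stack/Z${i}.tif" subset/; done
```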
- Windows Subsystem for Linux (WSL) (allows Windows users to use Linux applications and command-line tools)
- You can find the WSL application by searching "WSL" in your applications. The WSL application is similar to the Linux terminal. Run all of your commands through the WSL application.
- Even if you already have git and anaconda/miniconda installed on your Windows computer, you will need to install them again in WSL, following the "Linux" installation instructions. Use command-line installations through WSL.
- git (For getting the BrainQuant3D package from GitHub)
- Run this line in WSL:

```
sudo apt-get update && sudo apt-get install git
```

- Verify the git installation with `git --version` in WSL.
- Install miniconda (for isolating packages in their own Python environment)
- Run these lines in WSL:

```
mkdir -p ~/miniconda3
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda3/miniconda.sh
bash ~/miniconda3/miniconda.sh -b -u -p ~/miniconda3
rm -rf ~/miniconda3/miniconda.sh
```

and then:

```
~/miniconda3/bin/conda init bash
~/miniconda3/bin/conda init zsh
```

Close and re-open WSL to complete the conda installation.
- Verify the anaconda installation with `conda --version` in WSL.
NOTE FOR WINDOWS USERS: In WSL, you may need to specify the disk that your files are on. For example, the address might be `/mnt/c/Users/<your username>/Downloads` for the equivalent of the Downloads folder on your C: drive (`C:\Users\<your username>\Downloads`). You may also need to run `sudo apt install gcc` and `sudo apt install g++` to successfully create the environment.
- git (For getting the BrainQuant3D package from GitHub)
- Verify the git installation with `git --version` in the Terminal.
- Verify the anaconda installation with `conda --version` in the Terminal.
- Homebrew and Xcode (to download missing libraries and edit Homebrew installations)
- You will need to download the opencv@3 library, an older version of OpenCV that is currently marked as "disabled".
- To download the "disabled" opencv@3 files, follow the instructions here and replace the `xxxx` placeholders with `opencv@3` in your terminal commands.
- git (For getting the BrainQuant3D package from GitHub)
- Verify the git installation with `git --version` in the Terminal.
- miniconda (For isolating packages in its own Python environment)
- Verify the anaconda installation with `conda --version` in the Terminal.
- git (For getting the BrainQuant3D package from GitHub)
- Verify the git installation with `git --version` in the Terminal.
- Verify the anaconda installation with `conda --version` in the Terminal.
Open your Terminal (Mac, Linux) or WSL (Windows) and run these commands:
cd <path to bq3d_demo-main> #e.g. cd /Users/ytcao/Documents/bq3d-demo-ubc/bq3d_demo-main
conda env create -f bq3d_env.yml
conda activate bq3d
pip install git+https://github.com/MehwishUBC/BrainQuant3D_cFos.git
NOTE: You can tell which Python environment you are using by looking at the parentheses before your command. You should see (bq3d). If you see (base), this means you are using the main environment on your computer.
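For example, when the `bq3d` environment is active, your prompt should look something like this (the username and machine name are placeholders):

```
(base) user@machine:~$ conda activate bq3d
(bq3d) user@machine:~$
```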
NOTE for Windows users in WSL: If you run into an error at the `pip install git+...` command that contains `ERROR: could not build wheels for...` with `error: command 'gcc' failed...` a few lines above it, look here for a reference on installing gcc. After you've confirmed that gcc is installed, try `pip install git+https://github.com/MehwishUBC/BrainQuant3D_cFos.git` again.
Run this command:
conda activate bq3d
To activate Jupyter Notebooks, you can either:
- Open the Anaconda Navigator application and press "Launch" on Jupyter Notebook.
- Open Terminal/Anaconda Prompt and run this command in the bq3d environment:
jupyter notebook
This will open up the Jupyter Notebook Home Page in your default browser. You will need to navigate to the "bq3d-demo-ubc" and then "bq3d_demo-main" folder. Start with "00_bq3d_setup.ipynb".
Then, you need to run the Jupyter Notebook using the `bq3d` environment we just created. You can change this by going to Kernel > Change Kernel > bq3d in the menu bar.