
Commit 4e11b0e

Merge pull request #122 from MouseLand/dev
Change display layout

2 parents a383298 + 5a58318

10 files changed: +1256 −1186 lines

README.md (+13 −5)
@@ -27,7 +27,7 @@ Syeda, A., Zhong, L., Tung, R., Long, W., Pachitariu, M.\*, & Stringer, C.\* (20
 Stringer, C.\*, Pachitariu, M.\*, Steinmetz, N., Reddy, C. B., Carandini, M., & Harris, K. D. (2019). Spontaneous behaviors drive multidimensional, brainwide activity. <em>Science, 364</em>(6437), eaav7893.
 [[bibtex](https://scholar.googleusercontent.com/scholar.bib?q=info:DNVOkEas4K8J:scholar.google.com/&output=citation&scisdr=CgXHFLYtEMb9qP1Bt0Q:AAGBfm0AAAAAY3JHr0TJourtY6W2vbjy7opKXX2jOX9Z&scisig=AAGBfm0AAAAAY3JHryiZnvgWM1ySwd_xQ9brvQxH71UM&scisf=4&ct=citation&cd=-1&hl=en&scfhb=1)]
 
-The MATLAB version of the GUI is no longer supported (old [documentation](docs/svd_matlab_tutorial.md)).
+The MATLAB version of the GUI is no longer supported (see old [documentation](https://github.com/MouseLand/facemap/blob/main/docs/svd_matlab_tutorial.md)).
 
 ## Installation

@@ -42,6 +42,14 @@ If you are using a GPU, make sure its drivers and the cuda libraries are correct
 5. To install the minimal version of facemap, run `python -m pip install facemap`.
 6. To install facemap and the GUI, run `python -m pip install facemap[gui]`. If you are using a zsh shell, you may need quotes around facemap[gui]: `python -m pip install 'facemap[gui]'`.
+
+For MacBook users with M1/M2 chips, please follow these instructions for installation:
+1. Install an [Anaconda](https://www.anaconda.com/products/distribution) distribution of Python. Note you might need to use an anaconda prompt if you did not add anaconda to the path.
+2. Open an anaconda prompt / command prompt which has `conda` for **python 3** in the path.
+3. Create a new environment with `conda create -y -n facemap python=3.9 pyqt imagecodecs pyqtgraph matplotlib`.
+4. To activate this new environment, run `conda activate facemap`.
+5. Next, install facemap in the environment: `pip install facemap`.
+6. Finally, run `python -m facemap` to launch the facemap GUI.
 
 To upgrade facemap (package [here](https://pypi.org/project/facemap/)), run the following in the environment:
 
 ~~~sh
@@ -86,7 +94,7 @@ Facemap supports grayscale and RGB movies. The software can process multi-camera
 '.mj2','.mp4','.mkv','.avi','.mpeg','.mpg','.asf'
 
-For more details, please refer to the [data acquisition page](docs/data_acquisition.md).
+For more details, please refer to the [data acquisition page](https://github.com/MouseLand/facemap/blob/main/docs/data_acquisition.md).
 
 ## Support
@@ -118,7 +126,7 @@ Facemap provides a trained network for tracking distinct keypoints on the mouse
 Keypoints will be predicted in the selected bounding box region so please ensure the bounding box focuses on the face. See example frames [here](figs/mouse_views.png).
 
-For more details on using the tracker, please refer to the [GUI Instructions](docs/pose_tracking_gui_tutorial.md). See [command line interface (CLI) instructions](docs/pose_tracking_cli_tutorial.md) and for more examples, please see [tutorial notebooks](https://github.com/MouseLand/facemap/tree/dev/notebooks).
+For more details on using the tracker, please refer to the [GUI Instructions](https://github.com/MouseLand/facemap/blob/main/docs/pose_tracking_gui_tutorial.md). Check out the [notebook](https://github.com/MouseLand/facemap/blob/main/docs/notebooks/process_keypoints.ipynb) for processing keypoints in Colab.
 
 <p float="middle">
 <img src="https://raw.githubusercontent.com/MouseLand/facemap/main/figs/mouse_face1_keypoints.png" width="310" height="290" title="View 1" alt="view1" align="left" vspace = "10" hspace="30" style="border: 0.5px solid white" />
@@ -134,7 +142,7 @@ Facemap allows pupil tracking, blink tracking and running estimation, see more d
 You can draw ROIs to compute the motion/movie SVD within the ROI, and/or compute the full video SVD by checking `multivideo`. Then check `motSVD` and/or `movSVD` and click `process`. The processed SVD `*_proc.npy` (and optionally `*_proc.mat`) file will be saved in the output folder selected.
 
-For more details see [SVD python tutorial](docs/svd_python_tutorial.md) or [SVD MATLAB tutorial](docs/svd_matlab_tutorial.md).
+For more details see [SVD python tutorial](https://github.com/MouseLand/facemap/blob/main/docs/svd_python_tutorial.md) or [SVD MATLAB tutorial](https://github.com/MouseLand/facemap/blob/main/docs/svd_matlab_tutorial.md).
 
 ([video](https://www.youtube.com/watch?v=Rq8fEQ-DOm4) with old install instructions)
@@ -149,5 +157,5 @@ The encoding model used for prediction is described as follows:
 <img src="https://raw.githubusercontent.com/MouseLand/facemap/main/figs/encoding_model.png" width="70%" height="300" title="neural model" alt="neural model" align="center" vspace = "10" hspace="30" style="border: 0.5px solid white" />
 </p>
 
-Please see neural activity prediction [tutorial](docs/neural_activity_prediction_tutorial.md) for more details.
+Please see neural activity prediction [tutorial](https://github.com/MouseLand/facemap/blob/main/docs/neural_activity_prediction_tutorial.md) for more details.
+12 −11

@@ -1,23 +1,24 @@
 # Neural activity prediction
 
-This tutorial shows how to use the deep neural network encoding model for neural prediction using mouse orofacial behavior.
+This tutorial shows how to use the deep neural network encoding model to predict neural activity based on mouse orofacial behavior.
 
-To process neural activity prediction using pose estimates extracted using the tracker:
+To process neural activity prediction using pose estimates extracted using the keypoint tracker:
 
-1. Load or process keypoints ([see pose tracking tutorial](docs/pose_tracking_gui_tutorial.md)).
-2. Select `Neural activity` from file menu to `Load neural data`.
-3. Load neural activity data (2D-array stored in *.npy) and (optionally) timestamps for neural and behavioral data (1D-array stored in *.npy) then click `Done`.
-4. Select `Run neural prediction` from the `Neural activity` file menu.
-5. Select `Keypoints` as input data and set whether the output of the model's prediction to be `neural PCs` or neural activity. Use help button to set training parameters for the model.
-6. The predicted neural activity *.npy file will be saved in the selected output folder.
+1. Load or process keypoints ([see pose tracking tutorial](https://github.com/MouseLand/facemap/blob/main/docs/pose_tracking_gui_tutorial.md)).
+2. Select `Neural activity` from the file menu.
+3. Click on `Launch neural activity window`.
+4. Select `Load neural activity` (2D-array stored in *.npy) and (optionally) timestamps for neural and behavioral data (1D-arrays stored in *.npy), then click `Done`.
+5. Once the neural data is loaded, click on `Run neural predictions`.
+6. Select `Keypoints` as input data and select one of the options for the output of the model's prediction, which can be `Neural PCs` or neural activity. Click on the `Help` button for more information.
+7. The predicted neural activity (*.npy) file will be saved in the selected output folder.
 
-To process neural activity prediction using pose estimates extracted using the tracker:
+To predict neural activity using SVDs from Facemap:
 
-1. Load or process SVDs for the video. ([see SVD tutorial](docs/svd_tutorial.md)).
+1. Load or process SVDs for the video. ([see SVD tutorial](https://github.com/MouseLand/facemap/blob/main/docs/svd_python_tutorial.md)).
 2. Follow steps 2-5 above.
 
 Note: a linear model is used for prediction using SVDs.
 
-Predicted neural activity will be plotted in the bottom-right window of the GUI. You can highlight test data by selecting `Highlight test data` from the `Neural activity` file menu. Further information about neural prediction, including variance explained can be found in the saved neural prediction file.
+Predicted neural activity will be plotted in the neural activity window. Toggle `Highlight test data` to highlight time segments not used for training, i.e., test data. Further information about neural prediction, including variance explained, can be found in the saved neural prediction file (*.npy).

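The inputs to the `Load neural activity` step can be sketched with numpy. The array sizes, orientation, and file names below are illustrative; the tutorial only specifies a 2D array for neural activity and optional 1D timestamp arrays, all saved as *.npy:

```python
import numpy as np

# Illustrative inputs for the `Load neural activity` step:
# a 2D neural activity array (here neurons x timepoints, an assumed
# orientation) plus optional 1D timestamp arrays, all saved as *.npy.
n_neurons, n_neural_frames, n_video_frames = 100, 3000, 4500

neural = np.random.rand(n_neurons, n_neural_frames).astype("float32")
t_neural = np.linspace(0.0, 100.0, n_neural_frames)    # seconds
t_behavior = np.linspace(0.0, 100.0, n_video_frames)   # seconds

np.save("neural_activity.npy", neural)        # file names are arbitrary
np.save("neural_timestamps.npy", t_neural)
np.save("behavior_timestamps.npy", t_behavior)
```

Timestamps are only needed when the neural and behavioral recordings were sampled at different rates, as in this sketch (3000 vs. 4500 frames over the same 100 s).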
+19 −20

@@ -1,37 +1,36 @@
 Neural activity prediction
 ==========================
 
-This tutorial shows how to use the deep neural network encoding model
-for neural prediction using mouse orofacial behavior.
+This tutorial shows how to use the deep neural network encoding model to
+predict neural activity based on mouse orofacial behavior.
 
 To process neural activity prediction using pose estimates extracted
-using the tracker:
+using the keypoint tracker:
 
 1. Load or process keypoints (`see pose tracking
    tutorial <https://github.com/MouseLand/facemap/blob/main/docs/pose_tracking_gui_tutorial.md>`__).
-2. Select ``Neural activity`` from file menu to ``Load neural data``.
-3. Load neural activity data (2D-array stored in \*.npy) and (optionally)
-   timestamps for neural and behavioral data (1D-array stored in \*.npy)
-   then click ``Done``.
-4. Select ``Run neural prediction`` from the ``Neural activity`` file
-   menu.
-5. Select ``Keypoints`` as input data and set whether the output of the
-   model's prediction to be ``neural PCs`` or neural activity. Use help
-   button to set training parameters for the model.
-6. The predicted neural activity \*.npy file will be saved in the
+2. Select ``Neural activity`` from the file menu.
+3. Click on ``Launch neural activity window``.
+4. Select ``Load neural activity`` (2D-array stored in \*.npy) and
+   (optionally) timestamps for neural and behavioral data (1D-arrays
+   stored in \*.npy), then click ``Done``.
+5. Once the neural data is loaded, click on ``Run neural predictions``.
+6. Select ``Keypoints`` as input data and select one of the options for
+   the output of the model's prediction, which can be ``Neural PCs`` or
+   neural activity. Click on the ``Help`` button for more information.
+7. The predicted neural activity (\*.npy) file will be saved in the
    selected output folder.
 
-To process neural activity prediction using pose estimates extracted
-using the tracker:
+To predict neural activity using SVDs from Facemap:
 
 1. Load or process SVDs for the video. (`see SVD
    tutorial <https://github.com/MouseLand/facemap/blob/main/docs/svd_python_tutorial.md>`__).
 2. Follow steps 2-5 above.
 
 Note: a linear model is used for prediction using SVDs.
 
-Predicted neural activity will be plotted in the bottom-right window of
-the GUI. You can highlight test data by selecting
-``Highlight test data`` from the ``Neural activity`` file menu. Further
-information about neural prediction, including variance explained can be
-found in the saved neural prediction file.
+Predicted neural activity will be plotted in the neural activity window.
+Toggle ``Highlight test data`` to highlight time segments not used for
+training, i.e., test data. Further information about neural prediction,
+including variance explained, can be found in the saved neural
+prediction file (\*.npy).

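Since a linear model is used for SVD-based prediction (per the note above), the idea can be sketched as a ridge-regularized regression from SVD components to neural activity. This is a minimal sketch on synthetic data, not facemap's actual implementation, and the parameter names are illustrative:

```python
import numpy as np

# Predict neural activity Y (timepoints x neurons) from behavior SVD
# components X (timepoints x components) with a ridge-regularized
# linear model. All data here is synthetic.
rng = np.random.default_rng(0)
T, n_comps, n_neurons = 1000, 50, 20
X = rng.standard_normal((T, n_comps))
W_true = rng.standard_normal((n_comps, n_neurons))
Y = X @ W_true + 0.1 * rng.standard_normal((T, n_neurons))

lam = 1e-3  # ridge penalty
W = np.linalg.solve(X.T @ X + lam * np.eye(n_comps), X.T @ Y)
Y_pred = X @ W

# Variance explained, the kind of summary statistic stored in the
# saved neural prediction file
varexp = 1.0 - ((Y - Y_pred) ** 2).sum() / ((Y - Y.mean(axis=0)) ** 2).sum()
```

In practice the model would be fit on training time segments only and `varexp` evaluated on held-out test segments, which is what `Highlight test data` marks in the GUI.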