diff --git a/src/.vuepress/config.js b/src/.vuepress/config.js
index 889c917..f0ac3ab 100755
--- a/src/.vuepress/config.js
+++ b/src/.vuepress/config.js
@@ -97,6 +97,7 @@ function getMaintenanceSidebar () {
'projection',
'reward',
'stage',
+ 'positioning',
'miscellaneous'
]
}
diff --git a/src/maintenance/index.md b/src/maintenance/index.md
index 372ef87..dbb550f 100644
--- a/src/maintenance/index.md
+++ b/src/maintenance/index.md
@@ -9,4 +9,4 @@ lang: en-US
This website is a central repository for the documentation regarding the **maintenance** of mini virtual reality rigs as part of BRAIN CoGS at Princeton Neuroscience Institute.
-Maintenance documentation is also divided into modules, each module contains the process and tools needed to perform both preventive and corrective maintenance, as well as common troubleshooting.
\ No newline at end of file
+Maintenance documentation is also divided into modules; each module contains the processes and tools needed to perform both preventive and corrective maintenance, as well as common troubleshooting.
diff --git a/src/maintenance/positioning.md b/src/maintenance/positioning.md
index d46cb70..dc12d6e 100644
--- a/src/maintenance/positioning.md
+++ b/src/maintenance/positioning.md
@@ -5,6 +5,20 @@ lang: en-US
# {{ $frontmatter.title }}
+# Automated positioning
+
Maintenance of the automated positioning system consists of checking the motors and making sure everything is properly tightened (use Loctite if necessary, although using it is recommended). Troubleshooting is mostly related to the motor drivers.
+## Calibrating
+
+If the motors don't respond correctly, open the Zaber software. Select the 'Basic Movements' option and click on the home button. This action will take the motors to their absolute zero position.
+
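+The same homing step can also be scripted with Zaber's `zaber-motion` Python library; the snippet below is only an illustrative sketch (it assumes the package is installed and that the controller is on `COM3`, so adjust the port for your rig):
+
+```python
+# Home every detected Zaber axis (moves each motor to its absolute zero position).
+from zaber_motion.ascii import Connection
+
+with Connection.open_serial_port("COM3") as connection:
+    for device in connection.detect_devices():
+        for axis_number in range(1, device.axis_count + 1):
+            device.get_axis(axis_number).home()
+```
+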
+## Restore
+If sending the motors home doesn't work, the next step is to restore the affected motor. In the basic options, select the motor experiencing the issue, click on the settings icon, and choose 'Restore'.
+
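+This can also be done from a script: Zaber devices accept the ASCII command `system restore`, which resets the device settings to their defaults. The sketch below assumes the same `zaber-motion` setup as above and that the affected motor is the first device on the chain; treat it as an illustration, not a tested procedure for the rig.
+
+```python
+# Restore default settings on the first detected device (assumed to be the faulty one).
+from zaber_motion.ascii import Connection
+
+with Connection.open_serial_port("COM3") as connection:
+    device = connection.detect_devices()[0]
+    device.generic_command("system restore")  # assumed equivalent of the GUI 'Restore' option
+```
+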
+### Recommendation
+After completing the training, please send all the motors home.
+
+# Manual positioning
+
The manual positioning system is low maintenance and consists mainly of tightening the Thorlabs parts and keeping the positioning tool in shape.
\ No newline at end of file
diff --git a/src/software/automation_pipeline_developer.md b/src/software/automation_pipeline_developer.md
index 8dd1448..2d96f92 100644
--- a/src/software/automation_pipeline_developer.md
+++ b/src/software/automation_pipeline_developer.md
@@ -7,7 +7,7 @@ lang: en-US
The main goals of the Ephys/Imaging Automation Pipeline in BRAINCoGS are:
-+ Automate spike sorting and imaging segmentation for all recordings
++ Automate spike sorting and imaging segmentation for all recordings
+ Centralize/Standardize paths for Recording Data Storage
+ Unify & Register Ephys/Imaging Processing
+ Store processed data in BRAINCoGS Database (DJ)
@@ -16,7 +16,7 @@ To accomplish this we developed three tools:
+ Ephys/Imaging Automation GUI (RecordingProcessJobGUI)
+ Recording Workflow management (Automatic_job directory in U19-pipeline_python )
-+ Collab reposiotries to handle Ephys/Imaging Processing (BrainCogsEphysSorters and BrainCogsImagingSegmentation )
++ Collab repositories to handle Ephys/Imaging Processing (BrainCogsEphysSorters and BrainCogsImagingSegmentation)
## Ephys/Imaging Automation GUI
@@ -33,31 +33,31 @@ Workflow management is composed mainly by two classes that handles recordings an
+ Ephys recordings are composed of one or many independent probe electrophysiology recordings. Each probe recording corresponds to a job in the workflow management.
+ Calcium imaging recordings are composed of one or many independent field-of-view image stacks. Each field-of-view image stack corresponds to a job in the workflow management.
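+As a rough sketch of that decomposition (the class and field names below are made up for illustration, not the actual U19-pipeline_python code):
+
+```python
+# Illustrative only: a recording fans out into one job per probe (ephys) or per FOV (imaging).
+from dataclasses import dataclass
+from typing import List
+
+@dataclass
+class RecordingJob:
+    recording_id: int
+    unit: str            # e.g. "probe0" or "fov2"
+    status: str = "new"
+
+@dataclass
+class Recording:
+    recording_id: int
+    modality: str        # "ephys" or "imaging"
+    units: List[str]     # probe names or field-of-view names
+
+    def make_jobs(self) -> List[RecordingJob]:
+        return [RecordingJob(self.recording_id, u) for u in self.units]
+
+print(Recording(101, "ephys", ["probe0", "probe1"]).make_jobs())
+```
+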
-The class that manages workflow at the recording level is (RecordingHandler)
+The class that manages workflow at the recording level is (RecordingHandler)
-### Main functions and variables in recording workflow manager
+### Main functions and variables in recording workflow manager
-+ **recording_status_dict** in (Params Config file): This dictionary defines status definitions and corresponding functions to execute.
++ **recording_status_dict** in (Params Config file): This dictionary defines status definitions and corresponding functions to execute.
+ **pipeline_handler_main** in (RecordingHandler): Main function in recording workflow
-1. Executes corresponding functions based in status.
-2. Executed every 30 minutes to check for new recordings to be handled.
-3. Send notifications for processed and failed functions.
-+ **exception_handler** in (RecordingHandler): Python decorator for error handling.
-+ **modality_preingestion** in (RecordingHandler): Main ingestion function from recording to recording_process tables. There are subcalls depending on modality of recording (ephys or imaging).
+ 1. Executes corresponding functions based on status.
+ 2. Executed every 30 minutes to check for new recordings to be handled.
+ 3. Sends notifications for processed and failed functions.
+ + **exception_handler** in (RecordingHandler): Python decorator for error handling.
++ **modality_preingestion** in (RecordingHandler): Main ingestion function from recording to recording_process tables. There are subcalls depending on modality of recording (ephys or imaging).
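+As an illustration of the dispatch pattern these bullets describe (a status dictionary, a main handler that runs periodically, and an error-handling decorator), here is a minimal sketch; the names, statuses, and bodies are placeholders, not the actual params config or RecordingHandler code:
+
+```python
+# Illustrative sketch of a status-driven dispatcher with an error-handling decorator.
+import functools
+import traceback
+
+def exception_handler(func):
+    """Catch failures so one bad recording does not stop the whole loop."""
+    @functools.wraps(func)
+    def wrapper(*args, **kwargs):
+        try:
+            return True, func(*args, **kwargs)
+        except Exception:
+            return False, traceback.format_exc()
+    return wrapper
+
+@exception_handler
+def modality_preingestion(recording):
+    # placeholder for ingestion into the recording_process tables
+    return f"preingested recording {recording['id']} ({recording['modality']})"
+
+# status value -> function to execute next (analogue of recording_status_dict)
+recording_status_dict = {0: modality_preingestion}
+
+def pipeline_handler_main(recordings):
+    """Intended to run every ~30 minutes: dispatch each recording by its status."""
+    for rec in recordings:
+        handler = recording_status_dict.get(rec["status"])
+        if handler is None:
+            continue
+        success, result = handler(rec)
+        # stand-in for the notifications sent for processed and failed functions
+        print(("processed: " if success else "failed: ") + str(result))
+
+pipeline_handler_main([{"id": 1, "modality": "ephys", "status": 0}])
+```
+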
-#### Imaging preingestion main steps:
+#### Imaging preingestion main steps:
+ **imaging_preingestion** in (RecordingHandler): Ingestion to recording_process table for an imaging recording. Get all FOVs (TIFF stacks) for the recording and assign a new job for each one with corresponding parameters fetched from selection done in automation GUI.
**AcquiredTiff populate function** in (Imaging pipeline): Auxiliary script to call the **populate_Imaging_AcquiredTiff** script in MATLAB.
+ **populate_Imaging_AcquiredTiff** in (populate_Imaging_AcquiredTiff): Population calls to:
-1. **u19_imaging_pipeline.AcquiredTiff**: Each recording is divided into Tiff Splits (e.g. Mesoscope recordings contain multiple tiff stacks that will be processed independently). (Code here)
-2. **u19_imaging_pipeline.SyncImagingBehavior**: Find correspondence between virtual reality frame in the behavior experiment and Calcium Imaging frame in recording.
-(Code here)
+ 1. **u19_imaging_pipeline.AcquiredTiff**: Each recording is divided into Tiff Splits (e.g. Mesoscope recordings contain multiple tiff stacks that will be processed independently). (Code here)
+ 2. **u19_imaging_pipeline.SyncImagingBehavior**: Find correspondence between virtual reality frame in the behavior experiment and Calcium Imaging frame in recording.
+ (Code here)
@@ -66,7 +66,7 @@ The class that manages workflow at the recording level is (
+ Ephys preingestion in (RecordingHandler): Ingestion to recording_process table for an ephys recording. Get all probes for the recording and assign a new job for each one with corresponding parameters fetched from selection done in automation GUI.
1. Ingest **ephys_pipeline.EphysPipelineSession** table
@@ -81,9 +81,9 @@ The class that manages workflow at the recording level is (
+ + **recording_process_status_dict** in (Params Config file): This dictionary defines status definitions and corresponding functions to execute.
+ **pipeline_handler_main** in (RecProcessHandler): Main function in recording process workflow
1. Executes corresponding functions based on status.
2. Executed every 30 minutes to check for new recordings to be handled.
@@ -110,4 +110,4 @@ The class that manages workflow at the recording level is (