Tickets/sitcom 1794 #2

Open · wants to merge 5 commits into base: main
11 changes: 11 additions & 0 deletions .github/dependabot.yml
@@ -0,0 +1,11 @@
version: 2
updates:
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"

  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"
40 changes: 8 additions & 32 deletions .github/workflows/ci.yaml
@@ -1,36 +1,12 @@
name: CI

"on": [push, pull_request]
'on': [push, pull_request, workflow_dispatch]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0 # full history for metadata
          submodules: true

      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: 3.9

      - name: Python install
        run: |
          python -m pip install --upgrade pip
          python -m pip install -r requirements.txt
          python -m pip install "ltd-conveyor<2.0.0"

      - name: Build
        run: |
          make html

      - name: Upload
        if: ${{ github.event_name == 'push' }}
        env:
          LTD_PASSWORD: ${{ secrets.LTD_PASSWORD }}
          LTD_USERNAME: ${{ secrets.LTD_USERNAME }}
        run: |
          ltd upload --gh --dir _build/html --product sitcomtn-029

  call-workflow:
    uses: lsst-sqre/rubin-sphinx-technote-workflows/.github/workflows/ci.yaml@v1
    with:
      handle: sitcomtn-029
    secrets:
      ltd_username: ${{ secrets.LTD_USERNAME }}
      ltd_password: ${{ secrets.LTD_PASSWORD }}
4 changes: 4 additions & 0 deletions .gitignore
@@ -1 +1,5 @@
_build/
.technote/
.tox/
venv/
.venv/
7 changes: 7 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,7 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.5.0
    hooks:
      # - id: trailing-whitespace
      - id: check-yaml
      - id: check-toml
72 changes: 20 additions & 52 deletions Makefile
@@ -1,58 +1,26 @@
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS = -n
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = _build

# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Try 'running pip install -r requirements.txt' to get the necessary Python dependencies.)
endif

# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .

.PHONY: help clean html epub changes linkcheck refresh-bib

help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html         to make standalone HTML files"
	@echo "  epub         to make an epub"
	@echo "  linkcheck    to check all external links for integrity"
	@echo "  refresh-bib  to update LSST bibliographies in lsstbib/"

clean:
	rm -rf $(BUILDDIR)/*
.PHONY:
init:
	pip install tox pre-commit
	pre-commit install

.PHONY:
html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
	tox run -e html

epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
.PHONY:
lint:
	tox run -e lint,linkcheck

changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."
.PHONY:
add-author:
	tox run -e add-author

linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output " \
	      "or in $(BUILDDIR)/linkcheck/output.txt."
.PHONY:
sync-authors:
	tox run -e sync-authors

refresh-bib:
	refresh-lsst-bib -d lsstbib
	@echo
	@echo "Commit the new bibliographies: git add lsstbib && git commit -m \"Update bibliographies.\""
.PHONY:
clean:
	rm -rf _build
	rm -rf .technote
	rm -rf .tox
58 changes: 15 additions & 43 deletions README.rst
@@ -1,11 +1,7 @@
.. image:: https://img.shields.io/badge/sitcomtn--029-lsst.io-brightgreen.svg
:target: https://sitcomtn-029.lsst.io
:target: https://sitcomtn-029.lsst.io/
.. image:: https://github.com/lsst-sitcom/sitcomtn-029/workflows/CI/badge.svg
:target: https://github.com/lsst-sitcom/sitcomtn-029/actions/
..
Uncomment this section and modify the DOI strings to include a Zenodo DOI badge in the README
.. image:: https://zenodo.org/badge/doi/10.5281/zenodo.#####.svg
:target: http://dx.doi.org/10.5281/zenodo.#####

##############################
LATISS Filter Change Procedure
@@ -18,61 +14,37 @@ This document explains the procedure of what to do when a new filter is added (o

**Links:**

- Publication URL: https://sitcomtn-029.lsst.io
- Publication URL: https://sitcomtn-029.lsst.io/
- Alternative editions: https://sitcomtn-029.lsst.io/v
- GitHub repository: https://github.com/lsst-sitcom/sitcomtn-029
- Build system: https://github.com/lsst-sitcom/sitcomtn-029/actions/


Build this technical note
=========================

You can clone this repository and build the technote locally with `Sphinx`_:
You can clone this repository and build the technote locally if your system has Python 3.11 or later:

.. code-block:: bash

git clone https://github.com/lsst-sitcom/sitcomtn-029
cd sitcomtn-029
pip install -r requirements.txt
make init
make html

.. note::

In a Conda_ environment, ``pip install -r requirements.txt`` doesn't work as expected.
Instead, ``pip`` install the packages listed in ``requirements.txt`` individually.
Repeat the ``make html`` command to rebuild the technote after making changes.
If you need to delete any intermediate files for a clean build, run ``make clean``.

The built technote is located at ``_build/html/index.html``.

Editing this technical note
===========================

You can edit the ``index.rst`` file, which is a reStructuredText document.
The `DM reStructuredText Style Guide`_ is a good resource for how we write reStructuredText.

Remember that images and other types of assets should be stored in the ``_static/`` directory of this repository.
See ``_static/README.rst`` for more information.

The published technote at https://sitcomtn-029.lsst.io will be automatically rebuilt whenever you push your changes to the ``main`` branch on `GitHub <https://github.com/lsst-sitcom/sitcomtn-029>`_.
Publishing changes to the web
=============================

Updating metadata
=================
This technote is published to https://sitcomtn-029.lsst.io/ whenever you push changes to the ``main`` branch on GitHub.
When you push changes to another branch, a preview of the technote is published to https://sitcomtn-029.lsst.io/v.

This technote's metadata is maintained in ``metadata.yaml``.
In this metadata you can edit the technote's title, authors, publication date, etc..
``metadata.yaml`` is self-documenting with inline comments.

Using the bibliographies
========================

The bibliography files in ``lsstbib/`` are copies from `lsst-texmf`_.
You can update them to the current `lsst-texmf`_ versions with::

make refresh-bib

Add new bibliography items to the ``local.bib`` file in the root directory (and later add them to `lsst-texmf`_).
Editing this technical note
===========================

.. _Sphinx: http://sphinx-doc.org
.. _DM reStructuredText Style Guide: https://developer.lsst.io/restructuredtext/style.html
.. _this repo: ./index.rst
.. _Conda: http://conda.pydata.org/docs/
.. _lsst-texmf: https://lsst-texmf.lsst.io
The main content of this technote is in ``index.rst`` (a reStructuredText file).
Metadata and configuration are in the ``technote.toml`` file.
For guidance on creating content and information about specifying metadata and configuration, see the Documenteer documentation: https://documenteer.lsst.io/technotes.
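
For orientation, a minimal ``technote.toml`` looks roughly like the sketch below; the exact keys shown are assumptions based on the current Documenteer format, so treat the Documenteer documentation as authoritative.

.. code-block:: toml

   # Illustrative sketch only -- confirm the key names against the Documenteer docs.
   [technote]
   id = "SITCOMTN-029"
   series_id = "SITCOMTN"
   canonical_url = "https://sitcomtn-029.lsst.io/"
   github_url = "https://github.com/lsst-sitcom/sitcomtn-029"

   [[technote.authors]]
   name.given = "First"
   name.family = "Last"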
24 changes: 3 additions & 21 deletions conf.py
@@ -1,22 +1,4 @@
#!/usr/bin/env python
#
# Sphinx configuration file
# see metadata.yaml in this repo to update document-specific metadata
# See the Documenteer docs for how to customize conf.py:
# https://documenteer.lsst.io/technotes/

import os
from documenteer.sphinxconfig.technoteconf import configure_technote

# Ingest settings from metadata.yaml and use documenteer's configure_technote()
# to build a Sphinx configuration that is injected into this script's global
# namespace.
metadata_path = os.path.join(os.path.dirname(__file__), 'metadata.yaml')
with open(metadata_path, 'r') as f:
confs = configure_technote(f)
g = globals()
g.update(confs)

# Add intersphinx inventories as needed
# http://www.sphinx-doc.org/en/stable/ext/intersphinx.html
# Example:
#
# intersphinx_mapping['python'] = ('https://docs.python.org/3', None)
from documenteer.conf.technote import * # noqa F401 F403
81 changes: 50 additions & 31 deletions index.rst
@@ -1,3 +1,12 @@
##############################
LATISS Filter Change Procedure
##############################

.. abstract::

This document explains the required procedures when replacing or adding a *new*, previously uninstalled, filter or grating to LATISS.
These steps are required before any data is taken, in order to guarantee the data is correctly handled by the acquisition and ingestion system.

..
Technote content.

@@ -36,70 +45,80 @@

Feel free to delete this instructional comment.

:tocdepth: 1

.. Please do not modify tocdepth; will be fixed when a new Sphinx theme is shipped.

.. sectnum::



Introduction
============

Adding a filter or disperser to LATISS is a multi-step procedure that spans the DM and T&S subsystems.
This technote is a list of things to do in order to have the data be taken with the correct metadata, as well as what it takes to get the files ingested on the summit.
Replacing a filter or disperser in LATISS requires a few steps to ensure that the instrument is properly configured and that data ingestion is working correctly.
When adding a new, previously uninstalled filter or disperser, there are a few additional steps that span both DM and T&S subsystems.
Some of these steps can take time before they are ready to be rolled out to the summit, so they should be done well in advance to avoid delaying operations.

.. important::

The most time critical step is updating filters.py (see below) and having it be built in a regular weekly that is accessible from the summit environment.
When adding a new, previously uninstalled filter, the most time critical step is updating the `filters definition file <https://github.com/lsst/obs_lsst/blob/main/python/lsst/obs/lsst/filters.py>`_ (see below) and having it be built in a regular weekly that is accessible from the summit environment.
The DM weeklies are built on Wednesday nights and are generally available Thursday morning; the T&S container then builds on top of that, adding several hours.
Performing last-minute filter additions is high risk and puts strain on personnel.
Therefore, any new filter or grating names should be added as early as possible, preferably weeks in advance.

.. note::

A future update to this technote will include the steps required to ingest the data at USDF.



Procedure
=========

1. Make a ticket on the SUMMIT project that dictates which filter should be removed and which should be added.
#. Make a ticket on the SUMMIT project that dictates which filter should be removed and which should be added.
Use a previous ticket as a reference.
Roberto Tighe and Mario Rivera are trained in performing the physical filter change.

2. Create a DM ticket (Team = Telescope and Site) to update the config file in the `ts_config_latiss <https://github.com/lsst-ts/ts_config_latiss>`_ repository.
Follow the TSSW development process.
Never change the names of filters.
Update the name and any focus offsets.
Use the previous configs as the reference for the correct values.
#. Create a DM ticket (Team = Telescope and Site) to update the config file in the `ts_config_latiss <https://github.com/lsst-ts/ts_config_latiss>`_ repository.
Use the previous configs as the reference for the correct values of filter names and focus offsets.
Follow the `TSSW development process <https://tssw-developer.lsst.io/work_management/development_workflow.html#development-workflow-release-process>`_ to tag and release a new version of the `ts_config_latiss <https://github.com/lsst-ts/ts_config_latiss>`_ package once the changes are merged.
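
As a rough illustration of the kind of change involved, a filter entry in the ATSpectrograph configuration might look something like the sketch below. The field names here are placeholders rather than the real schema; always copy the structure and values from the existing config files in the repository.

.. code-block:: yaml

   # Hypothetical sketch only -- field names are illustrative placeholders.
   # Mirror the existing ts_config_latiss configuration files.
   filters:
     - name: SDSSr_65mm
       focus_offset: 0.0
   gratings:
     - name: holo4_003
       focus_offset: 0.0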

3. After the filter is installed, and before taking any images, update the CSC to use the tag (or branch) containing the new config.
This needs to be done either inside the container (temporary) or by updating the cycle.env file, then rebuilding and redeploying the ATSpectrograph CSC (which is the proper way).
#. If you are installing a new, previously unused filter, you will also need to open a PR to update the `filters definition file <https://github.com/lsst/obs_lsst/blob/main/python/lsst/obs/lsst/filters.py>`_.
Follow the DM development process of making a ticket, making the change, running the CI tests, and getting it reviewed and merged.
The changes will be available in the next DM weekly.
If you are installing a known filter, you can skip directly to step 10.
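
The change to the filters definition file typically amounts to adding one entry to the LATISS filter (or grating) definitions. The snippet below is a sketch of what such an addition might look like; take the exact constructor arguments and naming conventions from the existing entries in ``filters.py`` rather than from this example.

.. code-block:: python

   # Sketch of a new LATISS filter definition -- mirror the existing
   # entries in obs_lsst's filters.py; names and arguments are illustrative.
   FilterDefinition(
       physical_filter="SDSSr_65mm~empty",
       band="r",
   ),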

#. Using the new DM weekly with the changes to the `filters definition file <https://github.com/lsst/obs_lsst/blob/main/python/lsst/obs/lsst/filters.py>`_, register the new filter definitions in all of the different standard butler repositories using the command below.
Note that $OODS_REPO_PATH needs to be replaced with the appropriate path depending on where you are running the command.

.. code-block:: bash

4. Check that the filter is in the `filters.py file of obs_lsst <https://github.com/lsst/obs_lsst/blob/main/python/lsst/obs/lsst/filters.py>`_.
If not, follow the DM development process of making a ticket, making the change, running the CI tests, and getting it reviewed and merged.
The changes will be available in the next weekly.
butler register-instrument $OODS_REPO_PATH lsst.obs.lsst.Latiss
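
For example, on the summit, where the repo path is ``/repo/LATISS`` (see the list below), the command is:

.. code-block:: bash

   butler register-instrument /repo/LATISS lsst.obs.lsst.Latiss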

5. Update the OODS container to use the new ``obs_lsst`` that contains the changes (if not done by a cycle deployment).
Currently, the repo paths are:

6. Once everything built and deployed, the butler needs to have the registry updated to say the filter exists. So each butler repo (on each instance such as the test stands) need to run the following from the command line wherever the new ``obs_lsst`` is installed. At the summit, this can be done from inside a Nublado instance (and probably the other places too). Note that $OODS_REPO_PATH needs to be populated or replaced. On the summit it is ``/repo/LATISS``
- **At USDF**: /repo/embargo and /repo/main
- **At Summit**: /repo/LATISS
- **At BTS**: /repo/LATISS
- **At TTS**: /repo/LATISS

.. code-block::
#. Update the cycle build. Change the weekly versions of summit_utils, summit_extras, atmospec, spectractor, and lsst-sqre to use the new DM weekly.

butler register-instrument $OODS_REPO_PATH lsst.obs.lsst.Latiss

7. The USDF embargo Butler auto ingestion service must also be restarted with the appropriate weekly, after registering the instrument in the embargo Butler repo (and also the main Butler repo).
#. Rebuild the following containers using the `current cycle build on Jenkins <https://ts-cycle-build.lsst.io/user-guide/user-guide.html#fig-jenkins-build-with-parameters>`_:

8. Start taking images with the new filter and/or grating and verify ingestion works and that images appear on RubinTV.

- deploy_lsstsqre
- build_scriptqueue
- rapid_analysis
- build-sciplat
- build-sciplat-recommended
- build-oods

#. Once the container builds are complete, redeploy the ATOODS container and RubinTV pods.

#. The USDF embargo Butler auto ingestion service must also be restarted with the appropriate weekly, after registering the instrument in the embargo Butler repo (and also the main Butler repo).

#. After the filter is physically installed, and before taking any images, update the ATSpectrograph CSC to use the tag (or branch) containing the new config.
This needs to be done either inside the container (temporary) or by updating the cycle.env file, then rebuilding and redeploying the ATSpectrograph CSC (which is the proper way).
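
As an illustration only, the cycle.env change amounts to bumping the pinned version of the relevant configuration package; the variable name and tag below are hypothetical placeholders, so locate the real entry in the cycle.env file.

.. code-block:: bash

   # Hypothetical cycle.env fragment -- the variable name and tag are placeholders.
   ts_config_latiss=v0.9.0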

#. Once the new version of the ATSpectrograph CSC is deployed, you are ready to enable the spectrograph and take a few test images with the new configuration.
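
A minimal sketch of such a test, assuming you are working from a Nublado notebook with the ``lsst.ts.observatory.control`` classes available, might look like the following; the filter and grating names are placeholders for the newly installed elements.

.. code-block:: python

   # Sketch only: enable LATISS and take one test frame with the new
   # filter/grating (the names below are placeholders).
   from lsst.ts import salobj
   from lsst.ts.observatory.control.auxtel import LATISS

   domain = salobj.Domain()
   latiss = LATISS(domain)
   await latiss.start_task
   await latiss.enable()
   await latiss.take_object(exptime=5.0, n=1, filter="SDSSr_65mm", grating="empty_1")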

#. During the filter change process, it is possible that the grating stage itself was moved out of its nominal position, so be sure to start by running the `LATISS checkout procedure <https://obs-ops.lsst.io/AuxTel/Standard-Operations/Daytime-Operations/Daytime-Checkout.html#auxtel-daytime-checkout-latiss-checkout-py>`_ to check the position of this stage.

#. Check that the images are properly ingested in RubinTV by looking at the filter and grating entries and ensuring that the values are correct.
If the values for filter and grating are correct, you are finished.

.. Add content here.
.. Do not include the document title (it's automatically added from metadata.yaml).
@@ -109,4 +128,4 @@ Procedure
.. Make in-text citations with: :cite:`bibkey`.

.. .. bibliography:: local.bib lsstbib/books.bib lsstbib/lsst.bib lsstbib/lsst-dm.bib lsstbib/refs.bib lsstbib/refs_ads.bib
.. :style: lsst_aa
.. :style: lsst_aa