diff --git a/.buildinfo b/.buildinfo new file mode 100644 index 00000000..dbadbc81 --- /dev/null +++ b/.buildinfo @@ -0,0 +1,4 @@ +# Sphinx build info version 1 +# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done. +config: 53d416767d6454373a24714add246362 +tags: 645f666f9bcd5a90fca523b33c5a78b7 diff --git a/.nojekyll b/.nojekyll new file mode 100644 index 00000000..e69de29b diff --git a/_images/DISCO-Workflow.png b/_images/DISCO-Workflow.png new file mode 100644 index 00000000..90cf6dfd Binary files /dev/null and b/_images/DISCO-Workflow.png differ diff --git a/_images/SMART-DS-flowchart.png b/_images/SMART-DS-flowchart.png new file mode 100644 index 00000000..3a7c615c Binary files /dev/null and b/_images/SMART-DS-flowchart.png differ diff --git a/_images/db-tables.png b/_images/db-tables.png new file mode 100644 index 00000000..6a3adc0f Binary files /dev/null and b/_images/db-tables.png differ diff --git a/_images/feeder.png b/_images/feeder.png new file mode 100644 index 00000000..dee9f613 Binary files /dev/null and b/_images/feeder.png differ diff --git a/_images/hca__pf1.png b/_images/hca__pf1.png new file mode 100644 index 00000000..89c4ded2 Binary files /dev/null and b/_images/hca__pf1.png differ diff --git a/_images/max_voltage_pri_sec.png b/_images/max_voltage_pri_sec.png new file mode 100644 index 00000000..f2082b5d Binary files /dev/null and b/_images/max_voltage_pri_sec.png differ diff --git a/_images/thermal_upgrades.png b/_images/thermal_upgrades.png new file mode 100644 index 00000000..71ac2280 Binary files /dev/null and b/_images/thermal_upgrades.png differ diff --git a/_images/thermal_workflow.png b/_images/thermal_workflow.png new file mode 100644 index 00000000..398786d2 Binary files /dev/null and b/_images/thermal_workflow.png differ diff --git a/_images/thermalafter_thermalupgrades.png b/_images/thermalafter_thermalupgrades.png new file mode 100644 index 00000000..430561e4 Binary files /dev/null and b/_images/thermalafter_thermalupgrades.png differ diff --git a/_images/thermalbefore_thermalupgrades.png b/_images/thermalbefore_thermalupgrades.png new file mode 100644 index 00000000..53f75bac Binary files /dev/null and b/_images/thermalbefore_thermalupgrades.png differ diff --git a/_images/upgrades.png b/_images/upgrades.png new file mode 100644 index 00000000..bc87abde Binary files /dev/null and b/_images/upgrades.png differ diff --git a/_images/voltage_upgrades.png b/_images/voltage_upgrades.png new file mode 100644 index 00000000..8a424ffd Binary files /dev/null and b/_images/voltage_upgrades.png differ diff --git a/_images/voltage_workflow.png b/_images/voltage_workflow.png new file mode 100644 index 00000000..81bb519d Binary files /dev/null and b/_images/voltage_workflow.png differ diff --git a/_images/voltageafter_thermalupgrades.png b/_images/voltageafter_thermalupgrades.png new file mode 100644 index 00000000..e35c4039 Binary files /dev/null and b/_images/voltageafter_thermalupgrades.png differ diff --git a/_images/voltageafter_voltageupgrades.png b/_images/voltageafter_voltageupgrades.png new file mode 100644 index 00000000..654064c6 Binary files /dev/null and b/_images/voltageafter_voltageupgrades.png differ diff --git a/_images/voltagebefore_thermalupgrades.png b/_images/voltagebefore_thermalupgrades.png new file mode 100644 index 00000000..5c20f631 Binary files /dev/null and b/_images/voltagebefore_thermalupgrades.png differ diff --git a/_images/voltagebefore_voltageupgrades.png 
b/_images/voltagebefore_voltageupgrades.png new file mode 100644 index 00000000..7f25751f Binary files /dev/null and b/_images/voltagebefore_voltageupgrades.png differ diff --git a/_sources/advanced-guide.rst.txt b/_sources/advanced-guide.rst.txt new file mode 100644 index 00000000..b48a7e5b --- /dev/null +++ b/_sources/advanced-guide.rst.txt @@ -0,0 +1,27 @@ +.. _advanced_guide: + +Advanced Guide +############## + +.. Content here is commented-out because it doesn't currently fit. Might need it again. + +.. This page describes how to use the DISCO package to create, modify, and run +.. simulations locally or on an HPC. + +.. DISCO create JADE extensions in DISCO, and calls high-level interfaces of PyDSS +.. to run simulations on top of OpenDSS.The supported simulations in DISCO currently +.. include: + +.. * DISCO PV Deployment Simulation via ``pydss_simulation`` extension. + + +.. Please refer to the following links and check the simulation types in detail. + +.. toctree:: + :maxdepth: 1 + + advanced-guide/upgrade-cost-analysis-generic-models.rst + +.. If you need to create your own extension, the +.. `JADE documentation `_ +.. provides step-by-step instructions. diff --git a/_sources/advanced-guide/upgrade-cost-analysis-generic-models.rst.txt b/_sources/advanced-guide/upgrade-cost-analysis-generic-models.rst.txt new file mode 100644 index 00000000..51049ec1 --- /dev/null +++ b/_sources/advanced-guide/upgrade-cost-analysis-generic-models.rst.txt @@ -0,0 +1,20 @@ +.. _upgrade_cost_analysis_schemas: + +********************************** +Upgrade Cost Analysis JSON Schemas +********************************** + +UpgradeCostAnalysisSimulationModel +================================== +.. literalinclude:: ../../build/json_schemas/UpgradeCostAnalysisSimulationModel.json + :language: json + +UpgradesCostResultSummaryModel +============================== +.. literalinclude:: ../../build/json_schemas/UpgradesCostResultSummaryModel.json + :language: json + +JobUpgradeSummaryOutputModel +============================ +.. literalinclude:: ../../build/json_schemas/JobUpgradeSummaryOutputModel.json + :language: json diff --git a/_sources/analysis-workflows.rst.txt b/_sources/analysis-workflows.rst.txt new file mode 100644 index 00000000..c6620582 --- /dev/null +++ b/_sources/analysis-workflows.rst.txt @@ -0,0 +1,24 @@ +.. _analysis_workflows: + +****************** +Analysis Workflows +****************** + +DISCO implements analysis workflows that allow post-processing of individual jobs, +batches of jobs, or a pipeline of batches. + +The supported analyses include: + +* Static Hosting Capacity Analysis +* Dynamic Hosting Capacity Analysis +* Upgrade Cost Analysis Analysis +* Snapshot/Time Series Impact Analysis + +The following sections show the analysis workflows in detail. + +.. toctree:: + :maxdepth: 2 + + analysis-workflows/hosting-capacity-analysis + analysis-workflows/upgrade-cost-analysis + analysis-workflows/impact-analysis diff --git a/_sources/analysis-workflows/hosting-capacity-analysis.rst.txt b/_sources/analysis-workflows/hosting-capacity-analysis.rst.txt new file mode 100644 index 00000000..e3c78871 --- /dev/null +++ b/_sources/analysis-workflows/hosting-capacity-analysis.rst.txt @@ -0,0 +1,220 @@ +Hosting Capacity Analysis +========================= + +This section shows how to conduct *hosting capacity analysis* using DISCO pipeline with *snapshot* +and *time-series* models as inputs. 
This tutorial assumes there's an existing ``snapshot-feeder-models`` +directory generated from the ``transform-model`` command as below. The workflow below can also be +applied to ``time-series-feeder-models``. + +**1. Config Pipeline** + +Check the ``--help`` option for creating pipeline template. + +.. code-block:: bash + + $ disco create-pipeline template --help + Usage: disco create-pipeline template [OPTIONS] INPUTS + + Create pipeline template file + + Options: + -T, --task-name TEXT The task name of the simulation/analysis + [required] + -P, --preconfigured Whether inputs models are preconfigured + [default: False] + -s, --simulation-type [snapshot|time-series|upgrade] + Choose a DISCO simulation type [default: + snapshot] + --with-loadshape / --no-with-loadshape + Indicate if loadshape file used for Snapshot + simulation. + --auto-select-time-points / --no-auto-select-time-points + Automatically select the time point based on + max PV-load ratio for snapshot simulations. + Only applicable if --with-loadshape. + [default: auto-select-time-points] + -d, --auto-select-time-points-search-duration-days INTEGER + Search duration in days. Only applicable + with --auto-select-time-points. [default: + 365] + -i, --impact-analysis Enable impact analysis computations + [default: False] + -h, --hosting-capacity Enable hosting capacity computations + [default: False] + -u, --upgrade-analysis Enable upgrade cost computations [default: + False] + -c, --cost-benefit Enable cost benefit computations [default: + False] + -p, --prescreen Enable PV penetration level prescreening + [default: False] + -t, --template-file TEXT Output pipeline template file [default: + pipeline-template.toml] + -r, --reports-filename TEXT PyDSS report options. If None, use the + default for the simulation type. + -S, --enable-singularity Add Singularity parameters and set the + config to run in a container. [default: + False] + -C, --container PATH Path to container + -D, --database PATH The path of new or existing SQLite database + [default: results.sqlite] + -l, --local Run in local mode (non-HPC). [default: + False] + --help Show this message and exit. + + +Given an output directory from ``transform-model``, we use this command with ``--preconfigured`` option +to create the template. + +.. code-block:: bash + + $ disco create-pipeline template -T SnapshotTask -s snapshot -h -P snapshot-feeder-models --with-loadshape + + +.. note:: For configuring a dynamic hosting capacity pipeline, use ``-s time-series`` + + +It creates ``pipeline-template.toml`` with configurable parameters of different sections. Update +parameter values if needed. Then run + +.. code-block:: bash + + $ disco create-pipeline config pipeline-template.toml + +This command creates a ``pipeline.json`` file containing two stages: + +* stage 1 - simulation +* stage 2 - post-process + +Accordingly, there will be an output directory for each stage, + +* output-stage1 +* output-stage2 + +**2. Submit Pipeline** + +With a configured DISCO pipeline in ``pipeline.json`` the next step is to submit the pipeline with +JADE: + +.. code-block:: bash + + $ jade pipeline submit pipeline.json -o output + +What does each stage do? + +* In the simulation stage DISCO runs a power flow simulation for each job through PyDSS and stores + per-job metrics. +* In the post-process stage DISCO aggregates the metrics from each simulation job, calculates + the hosting capacity, and then ingests results into a SQLite database. + + +**3. 
Check Results** + +The post-process stage aggregates metrics in the following tables in ``output/output-stage1``: + +* ``feeder_head_table.csv`` +* ``feeder_losses_table.csv`` +* ``metadata_table.csv`` +* ``thermal_metrics_table.csv`` +* ``voltage_metrics_table.csv`` + +Each table contains metrics related to the *snapshot* or *time-series* simulation. DISCO +computes hosting capacity results from these metrics and then writes them to the following files, +also in ``output/output-stage1``: + +* ``hosting_capacity_summary__.json`` +* ``hosting_capacity_overall__.json`` + +The scenario name will be ``scenario``, ``pf1`` and/or ``control_mode``, depending on your +simulation type and/or ``--with-loadshape`` option. + + +Note that DISCO also produces prototypical visualizations for hosting capacity automatically after each run: + +* ``hca__{scenario_name}.png`` + +.. image:: ../images/hca__pf1.png + :scale: 60 + + +The voltage plot examples for the first feeder comparing pf1 vs. voltvar and comparing primary and secondary voltages: + +* ``max_voltage_pf1_voltvar.png`` +* ``max_voltage_pri_sec.png`` + +.. image:: ../images/max_voltage_pri_sec.png + :scale: 60 + +**4. Results database** + +DISCO ingests the hosting capacity results and report metrics into a SQLite database named +``output/output-stage1/results.sqlite``. You can use standard SQL to query data, and perform +further analysis. + +If you want to ingest the results into an existing database, please specify the absolute path +of the database in ``pipeline.toml``. + +For sqlite query examples, please refer to the Jupyter notebook ``notebooks/db-query.ipynb`` in +the source code repo. + +If you would like to use the CLI tool ``sqlite3`` directly, here are some examples. Note that in +this case the database contains the results from a single task, and so the queries are not first +pre-filtering the tables. + +If you don't already have ``sqlite3`` installed, please refer to their +`website `_. + +Run this command to start the CLI utility: + +.. code-block:: bash + + $ sqlite3 -table + +.. note:: If your version of sqlite3 doesn't support ``-table``, use ``-header -column`` instead. + +1. View DISCO's hosting capacity results for all feeders. + +.. code-block:: bash + + sqlite> SELECT * from hosting_capacity WHERE hc_type = 'overall'; + +2. View voltage violations for one feeder and scenario. + +.. code-block:: bash + + sqlite> SELECT feeder, scenario, sample, penetration_level, node_type, min_voltage, max_voltage + FROM voltage_metrics + WHERE (max_voltage > 1.05 or min_voltage < 0.95) + AND scenario = 'pf1' + AND feeder = 'p19udt14287'; + +3. View the min and max voltages for each penetration_level (across samples) for one feeder. + +.. code-block:: bash + + sqlite> SELECT feeder, sample, penetration_level + ,MIN(min_voltage) as min_voltage_overall + ,MAX(max_voltage) as max_voltage_overall + ,MAX(num_nodes_any_outside_ansi_b) as num_nodes_any_outside_ansi_b_overall + ,MAX(num_time_points_with_ansi_b_violations) as num_time_points_with_ansi_b_violations_overall + FROM voltage_metrics + WHERE scenario = 'pf1' + AND feeder = 'p19udt14287' + GROUP BY feeder, penetration_level; + +4. View the max thermal loadings for each penetration_level (across samples) for one feeder. + +.. 
code-block:: bash + + sqlite> SELECT feeder, sample, penetration_level + ,MAX(line_max_instantaneous_loading_pct) as line_max_inst + ,MAX(line_max_moving_average_loading_pct) as line_max_mavg + ,MAX(line_num_time_points_with_instantaneous_violations) as line_num_inst + ,MAX(line_num_time_points_with_moving_average_violations) as line_num_mavg + ,MAX(transformer_max_instantaneous_loading_pct) as xfmr_max_inst + ,MAX(transformer_max_moving_average_loading_pct) as xfmr_max_mavg + ,MAX(transformer_num_time_points_with_instantaneous_violations) as xfmr_num_inst + ,MAX(transformer_num_time_points_with_moving_average_violations) as xfmr_num_mavg + FROM thermal_metrics + WHERE scenario = 'pf1' + AND feeder = 'p19udt14287' + GROUP BY feeder, penetration_level; diff --git a/_sources/analysis-workflows/impact-analysis.rst.txt b/_sources/analysis-workflows/impact-analysis.rst.txt new file mode 100644 index 00000000..3e3372b3 --- /dev/null +++ b/_sources/analysis-workflows/impact-analysis.rst.txt @@ -0,0 +1,219 @@ +Impact Analysis +=============== +This section shows how to perform customized analysis with time-series simulations on a set of input models. +Note that you could generally substitute "snapshot" for "time-series" for that type of simulation. + +Other sections of this documentation describe workflows that rely on DISCO/PyDSS to collect +specific metrics from each simulation. For example, the dynamic hosting capacity analysis workflow +collects metrics for max instantaneous and moving-average line and transformer loading violations. +It does not store values for every line and transformer at every time point because of the amount +of storage space that requires. This section shows how to collect all of the data. + +Transform Model +--------------- +As with earlier sections this assumes that you have cloned the disco repo to the ``~/disco`` directory. +Transform the source models into DISCO models with this command: + +.. code-block:: bash + + $ disco transform-model ~/disco/tests/data/smart-ds/substations/ time-series + Transformed data from ~/disco/tests/data/smart-ds/substations/ to time-series-feeder-models for TimeSeries Analysis. + +.. note:: For your own models you will likely need to set ``--start``, ``--end``, and ``--resolution``. + + +Config Jobs +----------- + +1. Copy this text into a file called ``exports.toml``. This will instruct PyDSS to store each of these + properties for each element at each time point. + +:: + + [Loads.Powers] + store_values_type = "all" + + [PVSystems.Powers] + store_values_type = "all" + + [Circuits.TotalPower] + store_values_type = "all" + + [Circuits.LineLosses] + store_values_type = "all" + + [Circuits.Losses] + store_values_type = "all" + + [Lines.Currents] + store_values_type = "all" + + [Lines.Losses] + store_values_type = "all" + + [Lines.Powers] + store_values_type = "all" + +2. Create the configuration with all reports disabled, custom exports, all data exported, a custom + DC-AC ratio, and a specific volt-var curve. + +.. note:: If you're using a Windows terminal, the ``\`` characters used here for line breaks probably won't work. + +.. 
code-block:: bash + + $ disco config time-series time-series-feeder-models \ + --config-file time-series-config.json \ + --volt-var-curve volt_var_ieee_1547_2018_catB \ + --dc-ac-ratio=1.0 \ + --exports-filename exports.toml \ + --export-data-tables \ + --store-all-time-points \ + --store-per-element-data \ + --thermal-metrics=true \ + --voltage-metrics=true \ + --feeder-losses=true + + +Submit Jobs +----------- +Run the jobs with JADE. Two examples are shown: one on a local machine and one on an HPC. + +.. code-block:: bash + + $ jade submit-jobs --local time-series-config.json -o time-series-output + $ jade submit-jobs -h hpc_config.toml time-series-config.json -o time-series-output + +Confirm that all jobs passed. + +.. code-block:: bash + + $ jade show-results -o time-series-output + +View Output Files +----------------- +Each job's outputs will be stored in ``time-series-output/job-outputs//pydss_project/project.zip``. +Extract one zip file. You will see exported data for all element properties. For example, this file +contains bus voltages for the volt-var scenario: ``Exports/control_mode/Buses__puVmagAngle.csv``. +``Exports/control_mode/CktElement__ExportLoadingsMetric.csv`` contains thermal loading values. +The same files will exist for the pf1 scenario. + +Summary files will be available for thermal and voltage metrics. Refer to ``Reports/thermal_metrics.json`` +and ``Reports/voltage_metrics.json``. + +Make metric table files +----------------------- +Run this command to convert the thermal and voltage metrics into tabular form. + +.. code-block:: bash + + $ disco make-summary-tables time-series-output + + +Access Results Programmatically +------------------------------- +DISCO includes analysis code to help look at thermal loading and voltage violations. Here is +some example code: + +.. code-block:: python + + import logging + import os + + from jade.loggers import setup_logging + from disco.pydss.pydss_analysis import PyDssAnalysis, PyDssScenarioAnalysis + from disco.extensions.pydss_simulation.pydss_configuration import PyDssConfiguration + + logger = setup_logging("config", "log.txt", console_level=logging.INFO) + + # Path to the JADE output directory created by submit-jobs. + output_path = "time-series-output" + config = PyDssConfiguration.deserialize(os.path.join(output_path, "config.json")) + analysis = PyDssAnalysis(output_path, config) + analysis.show_results() + + # Copy name from the output of show_results(). + name = analysis.list_results()[1].name + + # Look up job-specific parameters. + job = analysis.get_job(name) + print(job) + print(job.model.deployment) + print(job.model.deployment.project_data) + + simulation = analysis.get_simulation(name) + + # Get access to result dataframes. + results = analysis.read_results(simulation) + scenario = results.scenarios[0] + scenario_analysis = PyDssScenarioAnalysis(simulation, results, scenario.name) + + # Get list of voltage magnitudes for each bus. + voltages_per_bus = scenario_analysis.get_pu_bus_voltage_magnitudes() + + # Get loading percentages. + line_loading = scenario_analysis.get_line_loading_percentages() + transformer_loading = scenario_analysis.get_transformer_loading_percentages() + + # Find out what classes and properties are available. + for element_class in scenario.list_element_classes(): + for prop in scenario.list_element_properties(element_class): + print(element_class, prop) + + for name in scenario.list_element_names("Lines", "Currents"): + df = scenario.get_dataframe("Lines", "Currents", name) + print(df.head()) + + # Browse static element information.
+ for filename in scenario.list_element_info_files(): + print(filename) + df = scenario.read_element_info_file(filename) + print(df.head()) + + # Use class names to read specific element infomation. + df = scenario.read_element_info_file("Loads") + df = scenario.read_element_info_file("PVSystems") + + # Read events from the OpenDSS event log. + event_log = scenario.read_event_log() + + # Get the count of each capacitor's state changes from the event log. + capacitor_changes = scenario.read_capacitor_changes() + + +Use the PyDSS Data Viewer +------------------------- +PyDSS includes a data viewer that makes it easy to plot circuit element values in a Jupyter +notebook. Refer to its `docs `_. + + +Generic Models +-------------- +This section follows the same workflow except that it uses pre-defined OpenDSS models. Unlike +the previous example, DISCO will not make any changes to the model files. + +Refer to :ref:`GenericPowerFlowModels` for specific details about the input file +``time_series_generic.json``. + +.. code-block:: bash + + $ disco config-generic-models time-series ~/disco/tests/data/time_series_generic.json \ + --config-file time-series-config.json \ + --volt-var-curve volt_var_ieee_1547_2018_catB \ + --exports-filename exports.toml \ + --export-data-tables \ + --store-all-time-points \ + --store-per-element-data \ + --thermal-metrics=true \ + --voltage-metrics=true \ + --feeder-losses=true + +.. code-block:: bash + + $ jade submit-jobs --local time-series-config.json -o time-series-output + +.. code-block:: bash + + $ jade show-results -o time-series-output + +.. code-block:: bash + + $ disco make-summary-tables time-series-output diff --git a/_sources/analysis-workflows/upgrade-cost-analysis.rst.txt b/_sources/analysis-workflows/upgrade-cost-analysis.rst.txt new file mode 100644 index 00000000..382e4429 --- /dev/null +++ b/_sources/analysis-workflows/upgrade-cost-analysis.rst.txt @@ -0,0 +1,338 @@ +Upgrade Cost Analysis +===================== + +This chapter introduces the workflow for conducting *upgrade cost analysis* by using DISCO commands +step by step or DISCO pipeline, where the pipeline chains the individual steps and runs upgrade cost +analysis seamlessly. In the following two sections we will introduce the two methods separately. + +There is a third method that bypasses the normal DISCO processes. This generic workflow allows you +to run the upgrade simulations on existing, non-standardized OpenDSS models without any transformations. + +The following commands run with default options. If you need any customization, please run ``--help`` on +the commands to see the available options. + +Step-by-Step Workflow +--------------------- + +**1. Transform Model** + +Prepare the model with PV deployments by using DISCO model transformation. + +.. code-block:: bash + + $ disco transform-model tests/data/smart-ds/substations upgrade -o upgrade-models + +Load shape profiles for ``Load`` elements are not used by the upgrade module, and so we recommend that +you remove them from the models in order to speed-up the simulations. Do so with this option: + +.. code-block:: bash + + $ disco transform-model tests/data/smart-ds/substations upgrade --exclude-load-profile -o upgrade-models + +**2. Create Config** + +With the transformed model, create the `config.json` file with submittable jobs. + +.. code-block:: bash + + $ disco config upgrade upgrade-models + +DISCO will use default upgrade parameters if the option ``--params-file`` is not specified. 
+If ``--params-file`` is specified, that file must contain all required parameters. + +Here are optional parameters that you can customize in the same file: + +.. code-block:: + + [thermal_upgrade_params] + parallel_transformers_limit = 4 + parallel_lines_limit = 4 + upgrade_iteration_threshold = 5 + timepoint_multipliers = {} + + [voltage_upgrade_params] + capacitor_sweep_voltage_gap = 1.0 + reg_control_bands = [1, 2] + reg_v_delta = 0.5 + max_regulators = 4 + place_new_regulators = true + use_ltc_placement = true + timepoint_multipliers = {} + capacitor_action_flag = true + existing_regulator_sweep_action = true + + +**3. Submit Jobs** + +Submit jobs by using JADE and conduct upgrade cost analysis within each job. +This command assumes that you are running on a local system. Please remove the option +``--local`` if you run on an HPC. + +.. code-block:: bash + + $ jade submit-jobs config.json --local + +This step will generate the directory ``output``, which contains all upgrade results. + +**4. Upgrade Analysis** + +Run post-processing to aggregate upgrade cost analysis results and create analysis CSV tables. + +.. code-block:: bash + + $ disco-internal make-upgrade-tables output + +If everything succeeds, it produces aggregated json file: ``upgrade_summary.json`` + + +Pipeline Workflow +----------------- + +**1. Create Template** + +Create a DISCO pipeline template file. By default, the output file is ``pipeline-template.toml``. + +.. code-block:: bash + + $ disco create-pipeline template --task-name UpgradeTask --simulation-type upgrade --upgrade-analysis ~/Workspace/disco/tests/data/smart-ds/substations + +Here, we need to enable the ``--upgrade-analysis`` option. + +**2. Config Pipeline** + +Update the pipeline template file for customization if needed. Then create the pipeline config file +``pipeline.json`` with this command. + +.. code-block:: bash + + $ disco create-pipeline config pipeline-template.toml + + +**3. Submit Pipeline** + +Submit the pipeline with JADE + +.. code-block:: bash + + $ jade pipeline submit pipeline.json + +If everything succeeds, it produces same aggregated upgrade tables in ``output-stage1``. + +Generic Workflow +---------------- +Let's assume that you have multiple networks defined in OpenDSS model files where each network has +its own ``Master.dss``. + +- ``./custom_models/model1/Master.dss`` +- ``./custom_models/model2/Master.dss`` + +Single Execution Mode +~~~~~~~~~~~~~~~~~~~~~ +1. Configure the simulation parameters and in an input JSON file called ``upgrades.json``. +Refer to this +`file `_ +as an example. The JSON schemas are defined in :ref:`upgrade_cost_analysis_schemas`. + +Each job represents one OpenDSS network and one upgrade simulation. + +2. Run the simulation. + +.. code-block:: bash + + $ disco upgrade-cost-analysis run upgrades.json + +Refer to ``disco upgrade-cost-analysis run --help`` for additional options. + +Parallel Execution Mode through JADE +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ +1. Configure ``upgrades.json`` as described in the previous step. + +2. Create the JADE configuration file. + +.. code-block:: bash + + $ disco upgrade-cost-analysis config upgrades.json + +3. Modify the generated ``config.json`` if necessary. + +4. Run the jobs through JADE. This will aggregate results across all jobs. + This example assumes local-mode execution. + +.. 
code-block:: bash + + jade submit-jobs --local config.json + + +Technical Details +----------------- +The automated upgrades module consists of three components as shown in the figure: it performs traditional infrastructure upgrades to resolve both thermal and voltage violations, +and then computes the costs associated with each of those upgrades. + +.. image:: ../images/upgrades.png + :width: 250 + +A high level overview of thermal and voltage upgrades considerations is shown below: + +.. image:: ../images/thermal_upgrades.png + :width: 250 + +.. image:: ../images/voltage_upgrades.png + :width: 250 + + + +**1. Thermal Upgrades Workflow** + +In this sub-module, the thermal equipment (lines and transformers) violations are identified, and upgrades are determined as per the flowchart given below. + +.. image:: ../images/thermal_workflow.png + :height: 650 + + +The technical equipment database is a catalog of available lines and transformers and can optionally be provided as an input. +All the equipment in this database will be considered as available options while determining thermal upgrades. +If this file is not provided, a technical database will be automatically generated from the given feeder model. +This would provide the thermal upgrades module with a limited set of upgrade options. +Refer to this `sample technical equipment catalog +`_ +for more information. + + +For an overloaded equipment, if a higher rated equipment of similar configuration is available in the technical catalog, that is considered as an upgrade and is chosen. +Else, similar configuration equipment are added in parallel to resolve the observed violations. +Sometimes, extreme thermal equipment overloaded can also cause voltage issues. So, it can be seen that thermal upgrades also resolve some undervoltage violations. + + + +**2. Voltage Upgrades Workflow** + +In this sub-module, the voltage violations present in the feeder are identified, and are resolved as shown in flowchart below: + +.. image:: ../images/voltage_workflow.png + :width: 250 + + +*a. Existing Capacitors:* + +* If capacitors are present + + + * If capacitor control is present for a capacitor: correct capacitor control parameters i.e. PT ratio is checked and corrected (if needed) + * If capacitor control is present, it is changed to voltage-controlled (if it is of any other kind) + * If capacitor control is not present, voltage-controlled capacitor control is added and default control settings are applied to any newly added controller + +* A settings sweep is performed through all capacitor settings, and setting with least number of violations is chosen. If initial settings are best, no changes are made. In the capacitor settings sweep method, same settings are applied to all capacitors. + + +*b. Existing Regulators:* + +* If voltage regulators are present, regulator control parameters (like ptratio) are corrected (if needed), including for substation LTC. + + +* A settings sweep is performed for existing regulator control devices (excluding substation LTC). + * In this settings sweep method, same settings are applied to all regulators + + +*c. Add new Regulator:* + +* A new regulator is added by clustering nearby buses with violations and testing regulator placement (one at a time) on each of the common upstream nodes. The placement option with least number of violations is chosen. + + + +**3. 
Upgrades Cost Computation** +A unit cost database is used to determine the total costs associated with the thermal and voltage upgrades determined through the workflows described above. +A sample input cost database can be found `here `_ + + + +Input parameters +~~~~~~~~~~~~~~~~ + +In order to run this simulation, the following inputs are needed. For required fields, example inputs are provided, and for optional parameters, default inputs are shown. + + +*1. Thermal Upgrade Inputs* + +The input parameters for thermal upgrades are shown in the table below. + +.. csv-table:: Thermal Upgrade Inputs + :file: ../images/thermal_inputs.csv + :header-rows: 1 + + +*2. Voltage Upgrade Inputs* + +The input parameters for voltage upgrades are shown in the table below. + +.. csv-table:: Voltage Upgrade Inputs + :file: ../images/voltage_inputs.csv + :header-rows: 1 + + +*3. Simulation Input Parameters* + +In addition to the thermal and voltage input parameters, there are a few other simulation parameters which need to be provided. + +.. csv-table:: Simulation input parameters + :file: ../images/simulation_params.csv + :header-rows: 1 + + +Outputs +~~~~~~~ + +*1. Costs* + +.. csv-table:: Output costs + :file: ../images/output_costs.csv + :header-rows: 1 + + + + +*2. Summary* + +.. csv-table:: Output summary + :file: ../images/output_summary.csv + :header-rows: 1 + + +Example +~~~~~~~ + +For a feeder with thermal and voltage violations, the following figures show the violations before and after upgrades. + +.. image:: ../images/feeder.png + :width: 400 + + +*1. Thermal Upgrades* + +The following figures show the thermal violations in a feeder before and after thermal upgrades: + +.. image:: ../images/thermalbefore_thermalupgrades.png + :width: 400 + + +.. image:: ../images/thermalafter_thermalupgrades.png + :width: 400 + +The following figures show the voltage violations in a feeder before and after thermal upgrades: + +.. image:: ../images/voltagebefore_thermalupgrades.png + :width: 400 + + +.. image:: ../images/voltageafter_thermalupgrades.png + :width: 400 + + +*2. Voltage Upgrades* + +The following figures show the voltage violations in a feeder before and after voltage upgrades: + +.. image:: ../images/voltagebefore_voltageupgrades.png + :width: 400 + +.. image:: ../images/voltageafter_voltageupgrades.png + :width: 400 diff --git a/_sources/build-docs.rst.txt b/_sources/build-docs.rst.txt new file mode 100644 index 00000000..2e7a2f53 --- /dev/null +++ b/_sources/build-docs.rst.txt @@ -0,0 +1,23 @@ +********** +Build docs +********** + +To build the docs locally, use the commands below: + +.. code-block:: + + $ cd docs + + # Rebuild the API pages on first build or if code has changed. + $ rm -rf source/disco + $ sphinx-apidoc -o source/disco ../disco + + $ make html + + +To publish the updated docs on GitHub pages, use these commands: + +.. code-block:: bash + + $ cd docs + $ make github diff --git a/_sources/data-ingestion.rst.txt b/_sources/data-ingestion.rst.txt new file mode 100644 index 00000000..d3ba671a --- /dev/null +++ b/_sources/data-ingestion.rst.txt @@ -0,0 +1,69 @@ +************** +Data Ingestion +************** + +DISCO can ingest simulation metrics and analysis results into a SQLite database, which +facilitates data sharing between researchers and data queries for further investigation.
+ +Ingest to New Database +====================== + +Suppose we are assigned a task which requires us to run a DISCO pipeline for +static hosting capacity analysis and the generated pipeline output directory +is ``/data/snapshot-output/``. + +Run the command below to ingest the results into a database. + +.. code-block:: bash + + $ disco ingest-tables --task-name "SFO P1U Snapshot" --model-inputs /data/input-models/ --database=test.sqlite /data/snapshot-output/ + +It will create a database named ``test.sqlite`` with data tables like below, + +.. image:: images/db-tables.png + + +Ingest to Existing Database +=========================== + +Now we are assigned a second task, and need to run DISCO pipeline for dynamic hosting capacity. +We generated the output directory ``/data/time-series-output``. +Again, we would like to ingest the data into a database. + +We have two choices for data ingestion: + +1. Ingest the results into a new database, let's say, ``data.sqlite``. +2. Ingest the results into an existing database, for example, the one we created above ``test.sqlite``. + +If choose option 1, then just run the command above with new ``--database`` value specified. +Here, we would like to choose option2, and ingest the results of second task into an existing database, +`test.sqlite` created before. To perform this, we need to assign ``--task-name`` a different value, +otherwise, it would prevent the data ingestion, as the task for each ingestion must be unique. + +.. code-block:: bash + + $ disco ingest-tables --task-name "SFO P1U Time-series" --model-inputs /data/input-models/ --database=test.sqlite /data/time-series-output/ + +.. note:: + + Task names must be unique. It's recommended to use a naming convention like this: `` ``. + +Run Database Queries +==================== + +Create a db connection, + +.. code-block:: python + + import sqlite3 + conn = sqlite3.connect("test.sqlite") + +Run sql query with ``pandas``, + +.. code-block:: python + + import pandas as pd + query1 = "SELECT * FROM task" + df = pd.read_sql_query(query1, conn) + +For more query examples, please refer to the Jupyter notebook in this repository ``db-query.ipynb``. diff --git a/_sources/data-sources.rst.txt b/_sources/data-sources.rst.txt new file mode 100644 index 00000000..01c6cc51 --- /dev/null +++ b/_sources/data-sources.rst.txt @@ -0,0 +1,192 @@ +************ +Data Sources +************ + +DISCO currently supports OpenDSS models stored in several source formats, +namely Generic Models, SourceTree1, SourceTree2, GEM, EPRI. + +The following sections show how to +prepare the source feeder models which are used as *input paths* for +transforming models with a given analysis type. + +.. _GenericPowerFlowModels: + +Generic Power Flow Models +========================= +You can use this format to run power-flow simulations on your own OpenDSS models. +Unlike simulations run in the other formats, DISCO will not make any dynamic changes to the +models (as it does for DC-AC ratio for PVSystems). + +Refer to these input JSON files as examples: + +- `Snapshot `_ +- `Time Series `_ + +This `test file `_ +demonstrates the workflow. + +.. note:: If you enable external controls for PVSystems through PyDSS then the file specified as + ``opendss_model_file`` must contain the PVSystem definitions. + +The inputs must conform to the JSON schemas below. + +PowerFlowSnapshotSimulationModel +---------------------------------- +.. 
literalinclude:: ../build/json_schemas/PowerFlowSnapshotSimulationModel.json + :language: json + +PowerFlowTimeSeriesSimulationModel +---------------------------------- +.. literalinclude:: ../build/json_schemas/PowerFlowTimeSeriesSimulationModel.json + :language: json + +.. _SourceTree1Model: + +SourceTree1 Model +================= + +This format requires the following directory structure: + +.. code-block:: bash + + source_model_directory + ├── format.toml + ├── + │   ├── *.dss + │   └── -- + │   ├── *.dss + │   └── hc_pv_deployments + │   ├── feeder_summary.csv + │   └── + │   ├── + │   │   ├── + │   │   │   └── PVSystems.dss + │   │   │   └── PVSystems.dss + │   │   └── pv_config.json + └── profiles + └── .csv + +Where in *format.toml*, it defines ``type = "SourceTree1Model"``. +To see how to generate the PV deployments data in ``hc_pv_deployments`` directory, please +refer to :ref:`SourceTree1PVDeployments`. + +The `SMART-DS `_ dataset is an open-source dataset which is in the SourceTree1 format. +This dataset is prepared for performing DISCO hosting capacity analysis after some pre-processing which is described in the link below: + +.. toctree:: + :maxdepth: 1 + + data-sources/smart-ds-model-preparation + + +.. _SourceTree2Model: + +SourceTree2 Model +================= + +This format requires the following directory structure: + +.. code-block:: bash + + source_model_directory + ├── inputs + │   ├── + │   │   ├── LoadShapes + │   │   │   ├── .csv + │   │   ├── OpenDSS + │   │   │   ├── *.dss + │   │   ├── PVDeployments + │   │   │   └── new + │   │   │   ├── + │   │   │   │   ├── + │   │   │   │   │   ├── + │   │   │   │   │   │   ├── + │   │   │   │   │   │   │   ├── PV_Gen__.txt + ├── format.toml + +Where in *format.toml*, it defines ``type = "SourceTree2Model"``. + + +.. _GEM_JSON_Schema: + +GEM Model +========= + +A GEM config file (JSON) contains paths to source models on a filesystem along with +descriptor schema that describe all feeders and their possible deployments. + +Here is an example JSON file: + +.. code-block:: json + + { + "include_voltage_deviation": false, + "path_base": "gem/feeder_models", + "type": "GemModel", + "feeders": [ + { + "base_case": "deployment0", + "deployments": [ + { + "dc_ac_ratios": [], + "kva_to_kw_ratings": [], + "loadshape_file": null, + "loadshape_location": null, + "name": "deployment0", + "placement_type": null, + "project_data": { + "pydss_other_loads_dss_files": { + "2010-03-11_12-00-00-000": ["/data/path/loads1.dss"], + "2010-12-25_15-00-00-000": ["/data/path/loads2.dss"] + }, + "pydss_other_pvs_dss_files": { + "2010-03-11_12-00-00-000": ["/data/path/pvs1.dss"], + "2010-12-25_15-00-00-000": ["/data/path/pvs2.dss"], + } + }, + "pv_locations": [], + "sample": null, + "pydss_controllers": null, + "job_order": 0 + } + ], + "end_time": "2010-08-11_15:00:00.000", + "simulation_type": "Snapshot", + "load_multipliers": [ + 0.3, + 1.0, + 0.2, + 0.9 + ], + "loadshape_location": null, + "name": "MyFeeder", + "opendss_location": "/opendss/location/path/", + "start_time": "2010-08-11_15:00:00.000", + "step_resolution": 900 + }, + ] + } + + +Rules: + + * ``start_time`` and ``end_time`` must be set with timestamps. + * If ``simulation_type == "Snapshot"``, then ``start_time`` and ``end_time`` must be the same. + * ``dc_ac_ratios``, ``kva_to_kw_ratings`` may be empty arrays to represent no-PV scenarios. + * ``pydss_controllers`` has three attributes, + + - ``controller_type``: One controller type defined in PyDSS, for example, "PvController". 
+ - ``name``: One controller name registered in PyDSS registry. + - ``targets`` (optional): null, a DSS file, or a list of DSS files. If null, then DISCO automatically sets the deployment file. + + +EPRI Model +========== + +The source URL of EPRI J1, K1, and M1 feeder models is +https://dpv.epri.com/feeder_models.html. You can download the source data with +this command: + +.. code-block:: bash + + $ disco download-source epri J1 K1 M1 --directory ./epri-feeders diff --git a/_sources/data-sources/smart-ds-model-preparation.rst.txt b/_sources/data-sources/smart-ds-model-preparation.rst.txt new file mode 100644 index 00000000..5d0cc703 --- /dev/null +++ b/_sources/data-sources/smart-ds-model-preparation.rst.txt @@ -0,0 +1,80 @@ +********************************** +SMART-DS OpenDSS Model Preparation +********************************** +Hosting Capacity Analysis makes use of the OpenDSS models from the `SMART-DS `_ dataset. +More documentation on the open source SMART-DS datasets can be found at the `SMART-DS website `_. +Pre-processing is performed on this dataset to prepare it for the analysis. The chart below shows the various stages of pre-processing performed on the SMART-DS OpenDSS Models. + +.. image:: ../images/SMART-DS-flowchart.png + :width: 400 + + +Copy SMART-DS Dataset +===================== +The dataset can be copied to the analysis location using https://github.com/NREL/disco/blob/main/scripts/copy_smart_ds_dataset.py + + +Usage: + +.. code-block:: bash + + $ python ~/sandboxes/disco/scripts/copy_smart_ds_dataset.py -y 2018 -c SFO -v v1.0 /projects/distcosts3/SMART-DS + +Here is the help: + +.. code-block:: bash + + $ python ~/sandboxes/disco/scripts/copy_smart_ds_dataset.py --help + + Usage: copy_smart_ds_dataset.py [OPTIONS] OUTPUT_DIR + + Copy a SMART-DS from the Eagle source directory to a destination directory. + + Options: + -f, --force overwrite output-dir if it exists + -c, --city TEXT dataset city [required] + -y, --year TEXT dataset year [required] + -v, --version TEXT dataset version [required] + --help Show this message and exit. + +Restructure to substation transformer +===================================== +The SMART-DS dataset has Open DSS models defined at the feeder and substation level. In this stage, Open DSS models are restructured and defined such that the analysis can be performed at the substation transformer level. +This can be performed using https://github.com/NREL/disco/blob/main/scripts/smartds_restructuring_transformer_folder.py + +Usage: + +.. code-block:: bash + + $ python ~/sandboxes/disco/scripts/smartds_restructuring_transformer_folder.py BASE_PATH ORIGINAL_DATASET NEW_DATASET LIST_OF_REGIONS + +Example: + +.. code-block:: bash + + $ python ~/sandboxes/disco/scripts/smartds_restructuring_transformer_folder.py /projects/distcosts3/SMART-DS/v1.0/2018 SFO SFO_xfmr P1U,P1R,P2U + + +Feeder screening & model fixes +============================== +In this, all the base-case feeders are passed through a preliminary screening process. +Here, disconnected nodes are removed, and the models are checked for connectivity, isolated nodes and extreme cases of thermal or voltage violations. +These would need to be addressed before proceeding to the hosting capacity analysis. This can be done using https://github.com/NREL/Distribution-Integration-Analysis/blob/master/scripts-simulation/generate_screening_jobs.py + +Usage: + +.. code-block:: bash + + $ python generate_screening_jobs.py PATH_TO_REGIONS + +Example: + +.. 
code-block:: bash + + $ python generate_screening_jobs.py /projects/distcosts3/SMART-DS/v1.0/2018/SFO + + +Create PV deployments +===================== +In this stage, PV deployments are generated for hosting capacity analysis. There are 10 sample PV deployments for every placement type (close, random, far) for every 5% increment upto 200% PV to load ratio . +This can be done using disco, refer to the :ref:`PVDeployments` documentation. diff --git a/_sources/debugging-issues.rst.txt b/_sources/debugging-issues.rst.txt new file mode 100644 index 00000000..f510b3d6 --- /dev/null +++ b/_sources/debugging-issues.rst.txt @@ -0,0 +1,150 @@ +**************** +Debugging Issues +**************** + +This page describes debugging techniques for issues encountered during the +simulation and analysis process. All of these tools produce output data in both +unstructured (.log) and structured form (.csv, .json, etc.). Aggregating data +from a batch with thousands of jobs will often require use of UNIX tools (find, +grep, awk, etc.) along with bash or Python scripts to process data in stages. + +.. _disco_return_codes: + +DISCO Return Codes +================== +DISCO processes (snapshot, time-series, upgrades simulations) return these codes for known +conditions. + +.. csv-table:: + :file: ../build/tables/return_codes.csv + :widths: 5, 30, 30 + :header-rows: 1 + + +Using JADE +========== + +Please refer to JADE documentation - +https://nrel.github.io/jade/tutorial.html#debugging + +Note that if you need result status in structured form, such as if you want to +find all failed jobs, refer to ``/results.json``. + +Using PyDSS +=========== + +DISCO creates a PyDSS project directory for each simulation job. The directory +will have the following contents: + +* ``project.zip`` +* ``store.h5`` + +When running on an HPC the directory contents will always be zipped because +huge numbers of directories can be problematic for the shared filesystem. + +Here is example content of an extracted job: + +.. code-block:: bash + + $ find output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project + + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/DSSfiles + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/DSSfiles/deployment.dss + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Exports + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Exports/control_mode + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Logs + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Logs/pydss.log + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Logs/pydss_project__control_mode__reports.log + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Scenarios + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Scenarios/control_mode + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Scenarios/control_mode/ExportLists/Exports.toml + output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Scenarios/control_mode/pyControllerList/PvControllers.toml + +To debug a problem you can unzip the contents. However, this can be problematic +if you need to inspect lots of jobs. You may be better off using a tool like +``Vim`` that lets you view compressed files in place. 
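For example, you can stream a single file out of the archive, or open the archive itself in an editor, without extracting anything. This is a minimal sketch, assuming ``unzip`` is available and reusing the example job path from the searches below:

.. code-block:: bash

   # Print one log from the archive to stdout and page through it.
   $ unzip -p output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss_project/project.zip Logs/pydss.log | less

   # Or open the archive in Vim and browse its contents in place.
   $ vim output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss_project/project.zip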
+ +You can also use ``zipgrep`` to search specific files within the .zip for +patterns. This is extremely helpful if you need to inspect many jobs. This tool +uses ``egrep`` so you may need to consult help from both locations to customize +searches. + +Errors +------ +All errors get logged in ``pydss.log``. Look there for problems reported by +OpenDSS. + +Searching for errors +-------------------- + Here is an example of searching for a pattern without unzipping: + +.. code-block:: bash + + $ zipgrep "Convergence error" output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss_project/project.zip Logs/pydss.log + +Here is an example that searches all jobs: + +.. code-block:: bash + + $ for x in `find output/job-outputs -name project.zip`; do echo "$x"; zipgrep "Convergence error" $x Logs/pydss.log; done + +You will likely want to redirect that command's output to another file for +further processing (or pipe it to another command). + +Convergence errors +------------------ +PyDSS creates a report showing each instance of a convergence error for a PV +controller. An example name of this file is +``pydss_project__control_mode__reports.log``. This file contains line-delimited +JSON objects. This means that each line is valid JSON but the entire file is +not. + +Here is an example of one line of the file pretty-printed as JSON: + +.. code-block:: json + + { + "Report": "Convergence", + "Scenario": "control_mode", + "Time": 523800, + "DateTime": "2020-01-07 01:30:00", + "Controller": "pyCont_PVSystem_small_p1ulv32837_1_2_pv", + "Controlled element": "PVSystem.small_p1ulv32837_1_2_pv", + "Error": 0.00241144335086263, + "Control algorithm": "VVar" + } + +Here are some example commands to convert the file to JSON. This example uses +an excellent 3rd-party JSON-parsing tool called ``jq`` which you have to +install. (On Eagle: ``conda install -c conda-forge jq``). You may have a +different method. + +.. code-block:: bash + + $ zipgrep -h Convergence output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss_project/project.zip Logs/pydss_project__control_mode__reports.log | jq . -s + +**Note**: That command used ``-h`` to suppress the filename from the output. + +This next command will do do the same for all jobs. Note that it loses the +association between job and error. You would need to do some extra work to keep +those associations. + +.. code-block:: bash + + $ for x in `find output/job-outputs -name project.zip`; do zipgrep -h "Convergence" $x Logs/pydss_project__control_mode__reports.log; done | jq . -s + +.. warning:: Be aware of how much CPU and memory will be consumed by these + operations. You may want to redirect this output to a temporary text file + first. + +In both cases you will probably want to redirect the output to a JSON file for +further processing. + +Running searches in parallel +---------------------------- +The DISCO repository has a script that extracts data from ``project.zip`` with +the Python multiprocessing library. You can use this as an example to speed up +large searches. Do not run this kind of search on an HPC login node. + +Refer to ``disco/cli/make_summary_tables.py``. diff --git a/_sources/index.rst.txt b/_sources/index.rst.txt new file mode 100644 index 00000000..eef05d13 --- /dev/null +++ b/_sources/index.rst.txt @@ -0,0 +1,84 @@ +.. disco documentation master file, created by + sphinx-quickstart on Mon May 6 14:12:42 2019. + You can adapt this file completely to your liking, but it should at least + contain the root `toctree` directive. 
+ +******************* +DISCO Documentation +******************* +DISCO (Distribution Integration Solution Cost Options) is an NREL-developed, +python-based software tool for conducting scalable, repeatable distribution analyses. +While DISCO was originally developed to support photovoltaic (PV) impact analyses, +it can also be used to understand the impact of other distributed energy resources (DER) +and load changes on distribution systems. Analysis modules currently included in DISCO are: + +* Snapshot hosting capacity analysis, in which hosting capacity is based on a + traditional definition of if operating thresholds are exceeded for + worst-case/bounding snapshots in time +* Snapshot impact analysis, which calculates the same impact metrics as + hosting capacity, but for specific user-defined PV deployment scenarios +* Dynamic hosting capacity analysis, in which hosting capacity is calculating + using quasi-static time-series (QSTS) simulations and dynamic impact metrics + for voltage and thermal loading. PV curtailment, + number of device (voltage regulator, capacitor switch) operations, and + energy losses are also calculated as part of this analysis because excessive + PV curtailment, increases in device operations and associated replacement + cost increases, and energy losses can also serve to limit how much PV can + be economically interconnected to a given feeder. +* Dynamic impact analysis, which is to dynamic hosting capacity analysis as + snapshot impact analysis is to snapshot hosting capacity analyses. + + +DISCO analysis is based on power flow modeling with OpenDSS used as the simulation engine. +PyDSS (https://nrel.github.io/PyDSS) is used to interface with OpenDSS +and provide additional control layers. + +The benefit of using DISCO instead of just directly using OpenDSS or PyDSS is two-fold: + +* DISCO provides the infrastructure required to run a large number of analyses + by managing job submission and execution through JADE (https://nrel.github.io/jade/). +* DISCO provides ready-made, tested code to calculate snapshot and dynamic impact + metrics, allowing for repeatable analyses across projects and teams without + having to re-create code to process OpenDSS results. + +Examples of how DISCO has been or is currently being used are: + +* Evaluating curtailment risk associated with using advanced inverter controls + and flexible interconnection options for PV grid integration on 100’s of circuits. + +DISCO does not yet have the ability to conduct end-to-end techno-economic analyses +of different distribution integration solutions, including looking at impact on +customer bills, utility revenue, or the economic impact to customers and utilities +of reduced electricity demand. Thus, this is not a tool for comprehensive +techno-economic analysis. + +.. toctree:: + :maxdepth: 1 + :caption: User Guide + + installation + overview + quick-start + data-sources + pv-deployments + transform-models + analysis-workflows + pipelines + debugging-issues + data-ingestion + advanced-guide + +.. toctree:: + :maxdepth: 1 + :caption: Contribution + + build-docs + license + + +Indices and tables +================== + +* :ref:`genindex` +* :ref:`modindex` +* :ref:`search` diff --git a/_sources/installation.rst.txt b/_sources/installation.rst.txt new file mode 100644 index 00000000..06f28842 --- /dev/null +++ b/_sources/installation.rst.txt @@ -0,0 +1,67 @@ +.. 
_installation: + +************ +Installation +************ +We recommend that you install DISCO in a virtual environment such as ``Conda``. + +Conda Installation +================== + +1. Create a Conda virtual environment. This example uses the name ``disco`` + as a convention. + +.. code-block:: bash + + $ conda create -n disco python=3.10 + $ conda activate disco + +Optional: Install extra packages. + +.. code-block:: bash + + $ conda install ipython + +2. Install DISCO from the PyPi repository. + +.. code-block:: bash + + $ pip install NREL-disco + +**Known Windows installation problem**: DISCO requires PyDSS which requires the +Shapely package. In some cases Shapely will fail to install. +pip will report an error about ``geos_c.dll``. Install it from conda and then +retry. + +.. code-block:: bash + + $ conda install shapely + +Then retry the DISCO installation command. + +3. If you will run your jobs through JADE, install DISCO's extensions. + +.. code-block:: bash + + $ disco install-extensions + +Now, the Conda environment ``disco`` is ready to use. +To deactivate it, run the command below: + +.. code-block:: bash + + $ conda deactivate + + +Developer Installation +====================== +Follow these instructions if you will be developing DISCO code and running tests. + +.. code-block:: bash + + $ git clone https://github.com/NREL/disco.git + $ cd disco + $ pip install -e '.[dev]' + + # Run this command if your git version is lower than 2.19. + $ conda install git">=2.19" diff --git a/_sources/license.rst.txt b/_sources/license.rst.txt new file mode 100644 index 00000000..34ce45f3 --- /dev/null +++ b/_sources/license.rst.txt @@ -0,0 +1,5 @@ +******* +License +******* + +This project is licensed under the terms of the BSD 3-clause "New" or "Revised" license. diff --git a/_sources/overview.rst.txt b/_sources/overview.rst.txt new file mode 100644 index 00000000..fa60455c --- /dev/null +++ b/_sources/overview.rst.txt @@ -0,0 +1,73 @@ +******** +Overview +******** + +This section gives an overview about DISCO and its workflows. + +DISCO can be used for distributed grid system simulation analysis. +The analysis types are: + +* Snapshot Impact Analysis +* Static Hosting Capacity Analysis +* Time Series Impact Analysis +* Dynamic Hosting Capacity Analysis + +The diagram below shows the DISCO workflow: + +.. image:: images/DISCO-Workflow.png + :align: center + +As shown from the diagram, the main steps to run an analysis workflow are: + +* Prepare the OpenDSS models with a given data source. +* Transform the source OpenDSS models into DISCO models. +* Configure JADE jobs with the DISCO models. +* Run the jobs with JADE. + + +Data Sources +============ + +DISCO supports OpenDSS models in several data formats: + +#. Generic Model, this format supports any user-defined OpenDSS model - :ref:`GenericPowerFlowModels`. + +#. SourceTree1 Model, this format requires directory structure *tree1* defined by DISCO - :ref:`SourceTree1Model`. + +#. SourceTree2 Model, this format requires directory structure *tree2* defined by DISCO - :ref:`SourceTree2Model`. + +#. GEM Model, Grid-connected Energy systems Modeling + +#. EPRI Model - J1, K1, and M1, https://dpv.epri.com/feeder_models.html + + +Transform Model +=============== + +Given an analysis type, the source OpenDSS models need to be transformed into +DISCO models which then can be used as inputs for configuring JADE jobs. + +.. note:: The Generic Model workflows do not use this step. DISCO uses those models directly. 
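For example, a minimal snapshot transformation looks like this (a sketch reusing the test models shipped in the DISCO repository; substitute your own source directory and analysis type):

.. code-block:: bash

   # Transform source OpenDSS models into DISCO snapshot models.
   $ disco transform-model tests/data/smart-ds/substations snapshot -o snapshot-feeder-models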
+
+
+Config Jobs
+===========
+
+DISCO configures JADE jobs from standard DISCO models for specific analysis
+types. The output is a configuration JSON file.
+
+
+Submit Jobs
+===========
+
+JADE parallelizes execution of the jobs on a local computer or an HPC.
+Execution on an HPC is highly configurable depending on the job resource
+requirements.
+
+
+Result Analysis
+===============
+
+After jobs complete, JADE can assist with analysis by showing summaries of
+individual job status, errors and events, job execution times, and compute
+resource utilization statistics. diff --git a/_sources/pipelines.rst.txt b/_sources/pipelines.rst.txt new file mode 100644 index 00000000..dd5bda5e --- /dev/null +++ b/_sources/pipelines.rst.txt @@ -0,0 +1,293 @@ +*********
+Pipelines
+*********
+
+To conduct power flow simulations and analysis, you normally need to perform several steps:
+transform models, create configurations, submit jobs, and run post-processing scripts/commands.
+To streamline this workflow, DISCO leverages the JADE pipeline and manages these steps as stages.
+
+A pipeline contains one or more stages; each stage configures and submits jobs
+to generate stage results. The output of a prior stage can be passed to the subsequent stage
+as input to produce further results. DISCO uses a pipeline ``template`` and a pipeline
+``config`` to manage the DISCO analysis workflow.
+
+To generate a pipeline template file and create a pipeline config file based on it,
+use this group command:
+
+.. code-block:: bash
+
+    $ disco create-pipeline --help
+
+The source models that the DISCO pipeline currently supports include:
+
+ * :ref:`SourceTree1Model`
+
+
+SourceTree1Model
+================
+
+Snapshot Hosting Capacity Analysis
+----------------------------------
+
+**1. Create Pipeline Template File**
+
+To create a pipeline template, use this command:
+
+.. code-block:: bash
+
+    $ disco create-pipeline template
+
+The OpenDSS model inputs - ```` - can be source models or preconfigured models.
+
+
+.. note::
+
+    When creating the pipeline template for snapshot simulation, the flag ``--with-loadshape``
+    or ``--no-with-loadshape`` must be set according to whether the Loads or PVSystems in the
+    models use load shapes.
+
+    * if ``--no-with-loadshape``, DISCO runs the snapshot simulation in ``Snapshot`` mode.
+    * if ``--with-loadshape``, DISCO runs the snapshot simulation in ``QSTS`` mode with only one timestamp.
+
+a. Source Model Inputs
+
+.. code-block:: bash
+
+    $ disco create-pipeline template tests/data/smart-ds/substations --task-name TestTask -s snapshot --hosting-capacity -t pipeline-template.toml
+
+b. Preconfigured Model Inputs
+
+Create preconfigured models:
+
+.. code-block:: bash
+
+    $ disco transform-model tests/data/smart-ds/substations snapshot -o snapshot-feeder-models
+
+Then use the ``--preconfigured`` flag to indicate that the input models ``snapshot-feeder-models`` are preconfigured.
+
+.. code-block:: bash
+
+    $ disco create-pipeline template snapshot-feeder-models --task-name TestTask --preconfigured -s snapshot --hosting-capacity -t pipeline-template.toml
+
+The commands above create a pipeline template file named ``pipeline-template.toml``.
+
+
+**2. 
Update Pipeline Template File** + +In the template generated above, there are 3 sections, including: + + * ``model`` + * ``simulation`` + * ``postprocess`` + +You can modify the different types of parameters in each section based on your task requirements +on model transform, config/submit jobs, and postprocess. To check the meaning of each parameter, +run ``--help`` on its command. + + * ``model.transform-params`` from ``disco transform-model snapshot`` + * ``simulation.config-params`` from ``disco config snapshot``. + * ``simulation.submitter-params`` from ``jade submit-jobs``. + * ``postprocess.config-params`` from ``jade config create``. + * ``postprocess.submitter-params`` from ``jade submit-jobs`` + +Note that simulation and postprocess can use different JADE submitter parameters. Check the default values +chosen by DISCO and consider whether they can be optimized for your run. If you're not familiar with the terms +used in this section, please refer to `JADE docs `_. + +For snapshot simulations DISCO uses a default value for ``per-node-batch-size``. You may be able to pick +a better value for the simulation stage. + +For time-series simulations DISCO estimates job runtimes and then uses JADE time-based-batching. So, you +should not need to worry about ``per-node-batch-size``. However, you might need to adjust the ``walltime`` +value in ``hpc_config.toml`` to account for your longest jobs. + + +**3. Create Pipeline Config File** + +.. code-block:: bash + + $ disco create-pipeline config pipeline-template.toml -c pipeline.json + +This step creates the pipeline config file named ``pipeline.json``, which contains the stage +information. In this example, there are 2 stages, JADE run each of the stage in order, and manages +the status of each util it completes the whole workflow. + + +**4. Sumbit Pipeline Using JADE** + +.. code-block:: bash + + $ jade pipeline submit pipeline.json -o snapshot-pipeline-output + +Pipeline output directory is ``snapshot-pipeline-output`` in this example, +which contains two stages' results, as shown below: + +.. code-block:: bash + + $tree snapshot-pipeline-output/ -L 2 + snapshot-pipeline-output/ + ├── config-stage1.json + ├── config-stage2.json + ├── output-stage1 + │   ├── config.json + │   ├── disco-diff.patch + │   ├── errors.txt + │   ├── events + │   ├── feeder_head_table.csv + │   ├── feeder_losses_table.csv + │   ├── jade-diff.patch + │   ├── job-outputs + │   ├── metadata_table.csv + │   ├── processed_results.csv + │   ├── results.csv + │   ├── results.json + │   ├── results.txt + │   ├── run_jobs_batch_0_events.log + │   ├── thermal_metrics_table.csv + │   └── voltage_metrics_table.csv + ├── output-stage2 + │   ├── config.json + │   ├── disco-diff.patch + │   ├── errors.txt + │   ├── events + │   ├── jade-diff.patch + │   ├── job-outputs + │   ├── processed_results.csv + │   ├── results.csv + │   ├── results.json + │   ├── results.txt + │   └── run_jobs_batch_0_events.log + ├── pipeline.json + └── pipeline_submit.log + +From the result tree, the metrics summary tables ``*.csv`` were created in ``output-stage1`` +by the postprocess job from stage 2. + + +Time-series Hosting Capacity Analysis +------------------------------------- + +Similarly, you can run time-series hosting capacity analysis using pipeline. 
+However, there is one difference for the time-series pipeline: an additional
+stage named ``prescreen`` can be enabled to prescreen PV penetration levels
+and avoid running jobs that are likely to fail, which helps reduce the consumption of
+HPC compute node hours.
+
+**1. Create Pipeline Template File**
+
+.. code-block:: bash
+
+    $ disco create-pipeline template tests/data/smart-ds/substations --task-name TestTask -s time-series --hosting-capacity -t pipeline-template.toml
+
+If you need to prescreen PV penetration levels, use the flag ``--prescreen`` to create the template.
+
+.. code-block:: bash
+
+    $ disco create-pipeline template tests/data/smart-ds/substations --task-name TestTask -s time-series --prescreen --hosting-capacity -t pipeline-template.toml
+
+This step creates the ``pipeline-template.toml`` file.
+
+**2. Update Pipeline Template File**
+
+There are 3 (or 4, with ``--prescreen`` enabled) sections in the template file generated above.
+ * ``model``
+ * ``prescreen`` (optional)
+ * ``simulation``
+ * ``postprocess``
+
+Update the parameters in each section based on your task requirements,
+
+ * ``model.transform-params`` from ``disco transform-model time-series``
+ * ``prescreen.config-params`` from ``disco config time-series``
+ * ``prescreen.prescreen-params`` from ``disco prescreen-pv-penetration-levels create``
+   and ``disco prescreen-pv-penetration-levels filter-config``.
+ * ``simulation.submitter-params`` from ``jade submit-jobs``.
+ * ``postprocess.config-params`` from ``jade config create``.
+ * ``postprocess.submitter-params`` from ``jade submit-jobs``.
+
+then save the file.
+
+
+**3. Create Pipeline Config File**
+
+.. code-block:: bash
+
+    $ disco create-pipeline config pipeline-template.toml -c pipeline.json
+
+This command creates the pipeline config file named ``pipeline.json``. It contains 3 stages if
+you have ``--prescreen`` enabled; otherwise, 2 stages - ``simulation`` and ``postprocess``.
+
+
+**4. Submit Pipeline Using JADE**
+
+.. code-block:: bash
+
+    $ jade pipeline submit pipeline.json -o time-series-pipeline-output
+
+The pipeline output directory is ``time-series-pipeline-output`` in this example,
+which contains the results of 3 stages with ``--prescreen`` enabled.
+
+.. 
code-block:: bash + + $tree time-series-pipeline-output/ -L 2 + time-series-pipeline-output + ├── config-stage1.json + ├── config-stage2.json + ├── config-stage3.json + ├── output-stage1 + │   ├── config.json + │   ├── disco-diff.patch + │   ├── errors.txt + │   ├── events + │   ├── filter_prescreened_jobs.log + │   ├── jade-diff.patch + │   ├── job-outputs + │   ├── processed_results.csv + │   ├── results.csv + │   ├── results.json + │   ├── results.txt + │   └── run_jobs_batch_0_events.log + │ ├── output-stage2 + │   ├── config.json + │   ├── disco-diff.patch + │   ├── errors.txt + │   ├── events + │   ├── feeder_head_table.csv + │   ├── feeder_losses_table.csv + │   ├── jade-diff.patch + │   ├── job-outputs + │   ├── metadata_table.csv + │   ├── processed_results.csv + │   ├── results.csv + │   ├── results.json + │   ├── results.txt + │   ├── run_jobs_batch_0_events.log + │   ├── thermal_metrics_table.csv + │   └── voltage_metrics_table.csv + ├── output-stage3 + │   ├── config.json + │   ├── disco-diff.patch + │   ├── errors.txt + │   ├── events + │   ├── jade-diff.patch + │   ├── job-outputs + │   ├── processed_results.csv + │   ├── results.csv + │   ├── results.json + │   ├── results.txt + │   └── run_jobs_batch_0_events.log + ├── pipeline.json + └── pipeline_submit.log + +As shown above, the metrics summary tables ``*.csv`` were created in ``output-stage2`` +by postprocess job from stage 3. + +**5. Check Results and Plots** + +Based on the metrics results, DISCO integrate plot functions which help create 3 plots. + +1. compare voltage primary and secondary on the first feeder. +2. compare pf1 and volt-var on the first feeder. +3. heatmap for max and min hosting capacity for all feeders. + +These visualizations show the feeder operational conditions under different PV penetration levels. diff --git a/_sources/pv-deployments.rst.txt b/_sources/pv-deployments.rst.txt new file mode 100644 index 00000000..32ab4c80 --- /dev/null +++ b/_sources/pv-deployments.rst.txt @@ -0,0 +1,264 @@ +.. _PVDeployments: + +************** +PV Deployments +************** + +This section shows how to generate DISCO PV deployments from raw opendss models. + +.. _SourceTree1PVDeployments: + +SourceTree1 PV Deployments +========================== + +The main command going to be used is the one below, + +.. code-block:: bash + + $ disco pv-deployments source-tree-1 --action --hierarchy --placement INPUT_PATH + +There are several actions here related to PV deployments manipulation, including + +* ``redirect-pvshapes``: Redirect PVShape.dss in both substation and feeder Master.dss files. +* ``transform-loads``: Transform Loads.dss file before conducting PV deployments. +* ``generate-jobs``: Help generate ``create-pv`` and ``create-configs`` jobs in JSON, i.e., jade config. +* ``restore-feeders``: Before and during PV deployments, Loads.dss and Master.dss files were modified, need to restore after that. +* ``create-pv``: create PV deployments on feeders based on `placement`, `sample` and `penetration` levels. +* ``check-pv``: check if there are PV deployments missing at each `placement`, `sample` and `penetration` level. +* ``remove-pv``: delete PV deployments in case there's something wrong. +* ``create-configs``: create PV config files at each `sample` deployment level. +* ``check-configs``: check if there are PV config files missing in deployment directories. +* ``remove-configs``: remove PV config files in case there's something wrong. 
+* ``list-feeders``: list feeder paths given an input region, substation, or feeder.
+
+
+Redirect PVShapes
+-----------------
+This workflow will generate OpenDSS files with varying counts and sizes of PVSystems. It will
+assign load shape profiles to those PVSystems from a pool of profiles. You must define these
+profiles in a ``PVShapes.dss`` file and copy that file to all substation and/or feeder
+directories.
+
+All ``Master.dss`` files need to redirect to ``PVShapes.dss``. We recommend that you add the redirect
+lines to your files. If you do that, you can skip to the next section.
+
+If your directory structure aligns with the ``source-tree-1`` expectations, the disco CLI command
+below will add the redirects automatically.
+
+.. todo:: Make this code handle all cases generically.
+
+Run this command:
+
+.. code-block:: bash
+
+    $ disco pv-deployments source-tree-1 -a redirect-pvshapes -h INPUT_PATH
+
+
+Transform Loads
+---------------
+The ``Loads.dss`` file under each feeder also needs to be transformed before PV deployments
+in order to change the load model to a suitable center-tap schema if needed. The command to run this is:
+
+.. code-block:: bash
+
+    $ disco pv-deployments source-tree-1 -a transform-loads -h INPUT_PATH
+
+
+Generate Jobs
+-------------
+DISCO provides a command to help generate JADE job config files for PV deployments and PV configs:
+
+.. code-block:: bash
+
+    $ disco pv-deployments source-tree-1 -a generate-jobs -h INPUT_PATH
+
+The hierarchy options are:
+
+ * ``city``
+ * ``region``
+ * ``substation``
+ * ``feeder``
+
+We recommend running this ``generate-jobs`` command with ``--hierarchy=city`` to generate jobs on all
+feeders within the city path. If your simulation/analysis jobs run relatively stably, this avoids
+repeating the job generation work on regions, substations, or feeders.
+For test or debug purposes, it's better to specify ``--hierarchy=feeder`` to generate a config file with one job,
+or ``--hierarchy=substation`` for a few jobs.
+
+
+This command will generate two JADE config files:
+
+ * ``create-pv-jobs.json`` contains jobs for PV deployments.
+ * ``create-config-jobs.json`` contains jobs for PV configs.
+
+You can submit the jobs via the ``jade submit-jobs `` command.
+
+.. warning::
+
+   Since PV configs are based on the result of PV deployments, you will need to wait for the PV deployment
+   jobs to complete before submitting the PV config jobs.
+
+
+PV Deployments
+--------------
+
+Submit Jobs
+^^^^^^^^^^^
+
+To generate PV deployments, submit the jobs via JADE:
+
+.. code-block:: bash
+
+    $ jade submit-jobs create-pv-jobs.json
+
+If the jobs pass, the PV deployments task is done. If you'd like to explore the details of the
+``create-pv`` action based on your hierarchy and corresponding input path, please check the section below.
+
+Details Exploration
+^^^^^^^^^^^^^^^^^^^
+
+Here are some example commands showing how to create, check, and remove PV deployments.
+
+1. Create PV deployments on feeder1 with ``--placement random``.
+
+.. code-block:: bash
+
+    $ disco pv-deployments source-tree-1 -a create-pv -h feeder -p random --pv-upscale
+
+
+2. Create PV deployments on substation1 with a few feeders.
+
+.. code-block:: bash
+
+    $ disco pv-deployments source-tree-1 -a create-pv -h substation -p random --pv-upscale
+
+
+3. Create PV deployments on region1 with many feeders in parallel by using JADE.
+
+As each region has a large number of feeders, it is recommended to use JADE to parallelize the jobs.
+
+.. 
code-block:: bash + + $ disco pv-deployments source-tree-1 -a list-feeders -h region + # Create a file which contains create-pv commands on feeders as above. + $ jade config create -c config1.json + $ jade submit-jobs config1.json + + +4. If you like to check which PV deployments are missing due to job failures, + +.. code-block:: bash + + $ disco pv-deployments source-tree-1 -a check-pv -h feeder -p random + $ disco pv-deployments source-tree-1 -a check-pv -h substation -p random + $ disco pv-deployments source-tree-1 -a check-pv -h region -p random + +It returns the missing samples and penetrations on each feeder. If don't have ``--placement`` specified, +the result would include `placement` missing information on each feeder. + + +5. If you found some issues with the PV deployments, and like to delete them, here are example commands, + +.. code-block:: bash + + $ disco pv-deployments source-tree-1 -a remove-pv -h feeder -p random + $ disco pv-deployments source-tree-1 -a remove-pv -h substation -p random + $ disco pv-deployments source-tree-1 -a remove-pv -h region -p random + + +PV Configs +---------- + +Submit Jobs +^^^^^^^^^^^ + +To generate PV configs, you will need to submit the jobs via JADE, that is, + +.. code-block:: bash + + $ jade submit-jobs create-config-jobs.json + +If the jobs pass, then the PV configs task is done. If you'd like to explore details +about ``create-configs`` action based on your hierarchy and according input path, please check the section below. + +Details Exploration +^^^^^^^^^^^^^^^^^^^ + +After creating PV deployments we need to generate PV config files that define a load shape +profile for each PV system. The config files get created in ``sample`` directories. +The examples below show commands for creating, checking or removing PV config files. + +1. Create PV configs on feeder1 based on PV deployments data. + +.. code-block:: bash + + $ disco pv-deployments source-tree-1 -a create-configs -h feeder + + +2. Create PV configs on substation1 with a few feeders. + +.. code-block:: bash + + $ disco pv-deployments source-tree-1 -a create-configs -h substation + +.. warning:: + + The option ``-p`` or ``--placement`` does not apply to ``create-configs`` action, as after all + pv configs created in each feeder, a sum group file based on customer types would be created + based on the pv configs of the feeder. + +3. Create PV configs on region1 with many feeders in parallel by using JADE. + +.. code-block:: bash + + $ disco pv-deployments source-tree-1 -a list-feeders -h region + # Create a file which contains create-configs commands on feeders as above. + $ jade config create -c config2.json + $ jade submit-jobs config2.json + +4. Check if any feeder is missing PV config files. + +.. code-block:: bash + + $ disco pv-deployments source-tree-1 -a check-configs -h feeder -p random + $ disco pv-deployments source-tree-1 -a check-configs -h substation -p random + $ disco pv-deployments source-tree-1 -a check-configs -h region -p random + +5. Remove PV configs if something is wrong. + +.. code-block:: bash + + $ disco pv-deployments source-tree-1 -a remove-configs -h feeder -p random + $ disco pv-deployments source-tree-1 -a remove-configs -h substation -p random + $ disco pv-deployments source-tree-1 -a remove-configs -h region -p random + + +Restore Feeders +--------------- + +As the ``Loads.dss`` in SourceTree1 models needs to be transformed during PV deployments, and the +content of ``Loads.dss`` was modified. 
However, we backed up the original ``Loads.dss`` before
+PV deployments, so we can rename it back afterward.
+
+In addition, to speed up PV deployments, ``LoadShapes.dss`` was commented out in the master
+files before PV deployments, and this needs to be reverted after PV deployments. The steps look like this.
+
+1. Before PV deployments:
+
+* Rename the raw ``Loads.dss`` to ``Original_Loads.dss``.
+
+2. During PV deployments:
+
+* The DISCO PV deployment program transforms ``Loads.dss`` in place and strips ``yearly=`` from the load lines.
+
+3. After PV deployments:
+
+* Rename the transformed ``Loads.dss`` file to ``PV_Loads.dss``.
+* Rename ``Original_Loads.dss`` back to ``Loads.dss``.
+
+Run the command below to rename the ``Loads.dss`` file and revert the related changes:
+
+.. code-block:: bash
+
+    $ disco pv-deployments source-tree-1 -a restore-feeders -h INPUT_PATH diff --git a/_sources/quick-start.rst.txt b/_sources/quick-start.rst.txt new file mode 100644 index 00000000..3c7f5762 --- /dev/null +++ b/_sources/quick-start.rst.txt @@ -0,0 +1,114 @@ +***********
+Quick Start
+***********
+
+This tutorial shows an example using `SMART-DS `_
+models with snapshot impact analysis. Note that you could generally substitute "time-series" for
+"snapshot" for that type of simulation.
+
+Source Data
+===========
+
+Suppose the DISCO repo is downloaded to the ``~/disco`` directory, where the
+SMART-DS data is located in the directory ``tests/data/smart-ds/substations/``.
+
+
+Transform Model
+===============
+DISCO transforms the SMART-DS models into DISCO models with this command.
+
+.. code-block:: bash
+
+    $ disco transform-model ~/disco/tests/data/smart-ds/substations/ snapshot
+    Transformed data from ~/disco/tests/data/smart-ds/substations/ to snapshot-feeder-models for Snapshot Analysis.
+
+By default, it generates a directory named ``snapshot-feeder-models`` with the transformed models.
+
+
+Config Jobs
+===========
+
+Configure jobs for execution through JADE with this command:
+
+.. code-block:: bash
+
+    $ disco config snapshot ./snapshot-feeder-models
+    Created config.json for Snapshot Analysis
+
+A job config file named ``config.json`` was created.
+
+Parameters that you may want to configure:
+
+- By default, the PyDSS-exported circuit element properties are taken from
+  `snapshot-exports.toml `_.
+  Specify a different file with ``-e ``.
+- PyDSS will not automatically export results to CSV files by default.
+  You can set ``export_data_tables`` to ``true`` in ``config.json``.
+- DISCO applies a DC-AC ratio of 1.15 to all PVSystems by default. You can customize it with the
+  option ``--dc-ac-ratio``. Set it to ``1.0`` to prevent any changes to your models.
+- DISCO uses a standard IEEE volt-var curve by default. You can customize the value with the option
+  ``--volt-var-curve``. This must be a controller name registered with PyDSS.
+  Run ``pydss controllers show`` to see the registered controllers.
+- DISCO does not store per-element data in reports by default. For example, it stores max/min
+  voltages across all buses and not the max/min voltages for each bus.
+  You can set ``store_per_element_data`` to ``true`` in ``config.json``.
+- Other PyDSS parameters: Refer to the ``pydss_inputs`` section of ``config.json`` and the
+  `PyDSS documentation `_.
+
+Submit Jobs
+===========
+
+The batch of jobs in ``config.json`` can then be submitted through JADE. Two examples are shown below:
+one on a local machine and one on an HPC.
+
+.. 
code-block:: bash + + $ jade submit-jobs --local config.json + $ jade submit-jobs -h hpc_config.toml config.json + +.. note:: + + Create hpc_config.toml with ``jade config hpc`` and modify it as necessary. + Refer to `JADE instructions `_ + for additional information on how to customize execution. + +The submitted jobs run to completion and generate an output directory named ``output``. + +Result Analysis +=============== + +To get a quick summary of job results using JADE: + +.. code-block:: bash + + $ jade show-results + Results from directory: output + JADE Version: 0.1.1 + 01/04/2021 08:52:36 + + +-----------------------------------------+-------------+----------+--------------------+----------------------------+ + | Job Name | Return Code | Status | Execution Time (s) | Completion Time | + +-----------------------------------------+-------------+----------+--------------------+----------------------------+ + | p1uhs10_1247__p1udt14394__random__1__5 | 0 | finished | 23.069955110549927 | 2021-01-04 08:52:35.939785 | + | p1uhs10_1247__p1udt14394__random__1__10 | 0 | finished | 23.06603503227234 | 2021-01-04 08:52:35.942345 | + | p1uhs10_1247__p1udt14394__random__2__5 | 0 | finished | 23.062479972839355 | 2021-01-04 08:52:35.943899 | + | p1uhs10_1247__p1udt14394__random__2__10 | 0 | finished | 23.05748414993286 | 2021-01-04 08:52:35.944780 | + +-----------------------------------------+-------------+----------+--------------------+----------------------------+ + + Num successful: 4 + Num failed: 0 + Total: 4 + + Avg execution time (s): 23.06 + Min execution time (s): 23.06 + Max execution time (s): 23.07 + + +Each job output directory contains PyDSS-exported data and reports. + +- Reports (ex: thermal_metrics.json, voltage_metrics.json) are stored in ``/job-outputs//pydss_project/project.zip`` in the ``Results`` sub-directory. +- Exported data tables, if enabled, are stored in the ``Exports`` sub-directory. +- You can access the PyDSS-exported data in a Jupyter notebook data-viewer UI or programmatically + as shown in this `documentation `_. + +This is the complete workflow for conducting snapshot impact analysis on SMART_DS feeders. diff --git a/_sources/transform-models.rst.txt b/_sources/transform-models.rst.txt new file mode 100644 index 00000000..6cd63629 --- /dev/null +++ b/_sources/transform-models.rst.txt @@ -0,0 +1,207 @@ +**************** +Transform Models +**************** + +Transform Model Help +==================== + +This process transforms user OpenDSS models into a format understood by DISCO +so that it can perform simulation and analysis with the models. + +Given an input path of source data DISCO can determine the types of analysis +it supports. The input path can be one of: + + * a GEM config file; the JSON schema definition is here - :ref:`GEM_JSON_Schema`. + * a directory path which contains a ``format.toml`` with a source type definition. + The source types are: + + - EpriModel + - SourceTree1Model + - SourceTree2Model + +Input File +---------- + +The ``--help`` option displays the types of analysis the source models support. +For example, if the input path is a GEM file: + +.. code-block:: bash + + $ disco transform-model ./gem-file.json --help + + Available analysis types: snapshot + + For additional help run one of the following: + disco transform-model ./gem-file.json snapshot --help + + +Input Directory +--------------- + +If the input path is a directory, for example, with ``type = SourceTree1Model`` +in *format.toml*. + +.. 
code-block:: bash + + $ disco transform-model tests/data/smart-ds/substations/ --help + + Available analysis types: snapshot time-series + + For additional help run one of the following: + disco transform-model tests/data/smart-ds/substations/ snapshot --help + disco transform-model tests/data/smart-ds/substations/ time-series --help + +.. note:: + + By default, the name of PV deployments directory is ``hc_pv_deployments``, if the PV deployments + are located in another directory, please specify the right directory by using option ``-P/--pv-deployments-dirname`` + in the ``transform-model`` command. + + +Load Shape Data files +--------------------- +By default, DISCO replaces relative paths to load shape data files with absolute +paths and does not copy them. This reduces time and consumed storage space. +However, it also makes the directory non-portable to other systems. + +If you want to create a portable directory with copies of these files, add +this flag to the command: + +.. code-block:: bash + + $ disco transform-model tests/data/smart-ds/substations time-series -c + $ disco transform-model tests/data/smart-ds/substations time-series --copy-load-shape-data-files + + +DISCO Model in Depth +==================== + +PyDSS Controllers +----------------- + +If you have custom *controllers* that need to be applied to simulation, +please make the controllers are registered via PyDSS first. + +Suppose we have particular controller settings defined in a ``my-custom-controllers.toml`` file: + +.. code-block:: python + + [my_volt_var_curve] + Control1 = "VVar" + Control2 = "None" + Control3 = "None" + ... + +.. code-block:: bash + + $ pydss controllers register PvController /path/my-custom-controllers.toml + +Once registered, the following information could be used to create the input +config related to ``pydss_controllers``. + +.. code-block:: json + + { + "name": "project123", + "controller_type": "PvController" + } + +By default, the target PyDSS file that the PyDSS controller would be applied to +is the deployment file, you do not need to specify the target DSS files. However, +if you want to specify the target DSS files here, other than the deployment file, + +.. code-block:: json + + { + "name": "project123", + "controller_type": "PvController", + "targets": [ + "/data/dss/file1.dss", + "/data/dss/file2.dss" + ] + } + +And, ``pydss_controllers`` supports multiple PyDSS controllers here, + +.. code-block:: json + + [ + { + "name": "project123", + "controller_type": "PvController" + }, + { + "name": "project123", + "type": "StorageController" + }, + ] + + +Model Schema +------------ + +DISCO uses `pydantic `_ +models to define the schema of model inputs for each type of analysis. Given a +type of analysis in DISCO, the schema shows all attributes used to define +the analysis models. + +*Show Schema* + +The input configurations in JSON should meet the specifications defined +by DISCO. To show the schema of a given analysis type, for example, +``SnapshotImpactAnalysisModel`` using this command with ``--mode show-schema`` +option, + +.. code-block:: bash + + $ disco simulation-models --mode show-schema SnapshotImpactAnalysisModel + +*Show Example* + +A data example may be more straightforward, use ``--mode show-example`` option, + +.. 
code-block:: bash + + $ disco simulation-models --mode show-example SnapshotImpactAnalysisModel --output-file=disco-models/configurations.json + $ cat disco-models/configurations.json + [ + { + "feeder": "J1", + "tag": "2010", + "deployment": { + "name": "deployment_001.dss", + "dc_ac_ratio": 1.15, + "directory": "disco-models", + "kva_to_kw_rating": 1.0, + "project_data": {}, + "pv_locations": [], + "pydss_controllers": null + }, + "simulation": { + "start_time": "2013-06-17T15:00:00.000", + "end_time": "2014-06-17T15:00:00.000", + "step_resolution": 900, + "simulation_type": "Snapshot" + }, + "name": "J1_123_Sim_456", + "base_case": null, + "include_voltage_deviation": false, + "blocked_by": [], + "job_order": null + } + ] + + +Validate Inputs +--------------- + +If you want to prepare the models manually then you must generate them in a +JSON file and then validate them to make sure they match the schema. + +.. code-block:: bash + + $ disco simulation-models validate-file disco-models/configurations.json + +The ``ValidationError`` will be raised if any input does not meet the +specification defined by DISCO. The error messages should provide corrective +action. diff --git a/_static/basic.css b/_static/basic.css new file mode 100644 index 00000000..30fee9d0 --- /dev/null +++ b/_static/basic.css @@ -0,0 +1,925 @@ +/* + * basic.css + * ~~~~~~~~~ + * + * Sphinx stylesheet -- basic theme. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ + +/* -- main layout ----------------------------------------------------------- */ + +div.clearer { + clear: both; +} + +div.section::after { + display: block; + content: ''; + clear: left; +} + +/* -- relbar ---------------------------------------------------------------- */ + +div.related { + width: 100%; + font-size: 90%; +} + +div.related h3 { + display: none; +} + +div.related ul { + margin: 0; + padding: 0 0 0 10px; + list-style: none; +} + +div.related li { + display: inline; +} + +div.related li.right { + float: right; + margin-right: 5px; +} + +/* -- sidebar --------------------------------------------------------------- */ + +div.sphinxsidebarwrapper { + padding: 10px 5px 0 10px; +} + +div.sphinxsidebar { + float: left; + width: 230px; + margin-left: -100%; + font-size: 90%; + word-wrap: break-word; + overflow-wrap : break-word; +} + +div.sphinxsidebar ul { + list-style: none; +} + +div.sphinxsidebar ul ul, +div.sphinxsidebar ul.want-points { + margin-left: 20px; + list-style: square; +} + +div.sphinxsidebar ul ul { + margin-top: 0; + margin-bottom: 0; +} + +div.sphinxsidebar form { + margin-top: 10px; +} + +div.sphinxsidebar input { + border: 1px solid #98dbcc; + font-family: sans-serif; + font-size: 1em; +} + +div.sphinxsidebar #searchbox form.search { + overflow: hidden; +} + +div.sphinxsidebar #searchbox input[type="text"] { + float: left; + width: 80%; + padding: 0.25em; + box-sizing: border-box; +} + +div.sphinxsidebar #searchbox input[type="submit"] { + float: left; + width: 20%; + border-left: none; + padding: 0.25em; + box-sizing: border-box; +} + + +img { + border: 0; + max-width: 100%; +} + +/* -- search page ----------------------------------------------------------- */ + +ul.search { + margin: 10px 0 0 20px; + padding: 0; +} + +ul.search li { + padding: 5px 0 5px 20px; + background-image: url(file.png); + background-repeat: no-repeat; + background-position: 0 7px; +} + +ul.search li a { + font-weight: bold; +} + +ul.search li p.context { + color: #888; + 
margin: 2px 0 0 30px; + text-align: left; +} + +ul.keywordmatches li.goodmatch a { + font-weight: bold; +} + +/* -- index page ------------------------------------------------------------ */ + +table.contentstable { + width: 90%; + margin-left: auto; + margin-right: auto; +} + +table.contentstable p.biglink { + line-height: 150%; +} + +a.biglink { + font-size: 1.3em; +} + +span.linkdescr { + font-style: italic; + padding-top: 5px; + font-size: 90%; +} + +/* -- general index --------------------------------------------------------- */ + +table.indextable { + width: 100%; +} + +table.indextable td { + text-align: left; + vertical-align: top; +} + +table.indextable ul { + margin-top: 0; + margin-bottom: 0; + list-style-type: none; +} + +table.indextable > tbody > tr > td > ul { + padding-left: 0em; +} + +table.indextable tr.pcap { + height: 10px; +} + +table.indextable tr.cap { + margin-top: 10px; + background-color: #f2f2f2; +} + +img.toggler { + margin-right: 3px; + margin-top: 3px; + cursor: pointer; +} + +div.modindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +div.genindex-jumpbox { + border-top: 1px solid #ddd; + border-bottom: 1px solid #ddd; + margin: 1em 0 1em 0; + padding: 0.4em; +} + +/* -- domain module index --------------------------------------------------- */ + +table.modindextable td { + padding: 2px; + border-collapse: collapse; +} + +/* -- general body styles --------------------------------------------------- */ + +div.body { + min-width: 360px; + max-width: 800px; +} + +div.body p, div.body dd, div.body li, div.body blockquote { + -moz-hyphens: auto; + -ms-hyphens: auto; + -webkit-hyphens: auto; + hyphens: auto; +} + +a.headerlink { + visibility: hidden; +} + +a:visited { + color: #551A8B; +} + +h1:hover > a.headerlink, +h2:hover > a.headerlink, +h3:hover > a.headerlink, +h4:hover > a.headerlink, +h5:hover > a.headerlink, +h6:hover > a.headerlink, +dt:hover > a.headerlink, +caption:hover > a.headerlink, +p.caption:hover > a.headerlink, +div.code-block-caption:hover > a.headerlink { + visibility: visible; +} + +div.body p.caption { + text-align: inherit; +} + +div.body td { + text-align: left; +} + +.first { + margin-top: 0 !important; +} + +p.rubric { + margin-top: 30px; + font-weight: bold; +} + +img.align-left, figure.align-left, .figure.align-left, object.align-left { + clear: left; + float: left; + margin-right: 1em; +} + +img.align-right, figure.align-right, .figure.align-right, object.align-right { + clear: right; + float: right; + margin-left: 1em; +} + +img.align-center, figure.align-center, .figure.align-center, object.align-center { + display: block; + margin-left: auto; + margin-right: auto; +} + +img.align-default, figure.align-default, .figure.align-default { + display: block; + margin-left: auto; + margin-right: auto; +} + +.align-left { + text-align: left; +} + +.align-center { + text-align: center; +} + +.align-default { + text-align: center; +} + +.align-right { + text-align: right; +} + +/* -- sidebars -------------------------------------------------------------- */ + +div.sidebar, +aside.sidebar { + margin: 0 0 0.5em 1em; + border: 1px solid #ddb; + padding: 7px; + background-color: #ffe; + width: 40%; + float: right; + clear: right; + overflow-x: auto; +} + +p.sidebar-title { + font-weight: bold; +} + +nav.contents, +aside.topic, +div.admonition, div.topic, blockquote { + clear: left; +} + +/* -- topics ---------------------------------------------------------------- */ + 
+nav.contents, +aside.topic, +div.topic { + border: 1px solid #ccc; + padding: 7px; + margin: 10px 0 10px 0; +} + +p.topic-title { + font-size: 1.1em; + font-weight: bold; + margin-top: 10px; +} + +/* -- admonitions ----------------------------------------------------------- */ + +div.admonition { + margin-top: 10px; + margin-bottom: 10px; + padding: 7px; +} + +div.admonition dt { + font-weight: bold; +} + +p.admonition-title { + margin: 0px 10px 5px 0px; + font-weight: bold; +} + +div.body p.centered { + text-align: center; + margin-top: 25px; +} + +/* -- content of sidebars/topics/admonitions -------------------------------- */ + +div.sidebar > :last-child, +aside.sidebar > :last-child, +nav.contents > :last-child, +aside.topic > :last-child, +div.topic > :last-child, +div.admonition > :last-child { + margin-bottom: 0; +} + +div.sidebar::after, +aside.sidebar::after, +nav.contents::after, +aside.topic::after, +div.topic::after, +div.admonition::after, +blockquote::after { + display: block; + content: ''; + clear: both; +} + +/* -- tables ---------------------------------------------------------------- */ + +table.docutils { + margin-top: 10px; + margin-bottom: 10px; + border: 0; + border-collapse: collapse; +} + +table.align-center { + margin-left: auto; + margin-right: auto; +} + +table.align-default { + margin-left: auto; + margin-right: auto; +} + +table caption span.caption-number { + font-style: italic; +} + +table caption span.caption-text { +} + +table.docutils td, table.docutils th { + padding: 1px 8px 1px 5px; + border-top: 0; + border-left: 0; + border-right: 0; + border-bottom: 1px solid #aaa; +} + +th { + text-align: left; + padding-right: 5px; +} + +table.citation { + border-left: solid 1px gray; + margin-left: 1px; +} + +table.citation td { + border-bottom: none; +} + +th > :first-child, +td > :first-child { + margin-top: 0px; +} + +th > :last-child, +td > :last-child { + margin-bottom: 0px; +} + +/* -- figures --------------------------------------------------------------- */ + +div.figure, figure { + margin: 0.5em; + padding: 0.5em; +} + +div.figure p.caption, figcaption { + padding: 0.3em; +} + +div.figure p.caption span.caption-number, +figcaption span.caption-number { + font-style: italic; +} + +div.figure p.caption span.caption-text, +figcaption span.caption-text { +} + +/* -- field list styles ----------------------------------------------------- */ + +table.field-list td, table.field-list th { + border: 0 !important; +} + +.field-list ul { + margin: 0; + padding-left: 1em; +} + +.field-list p { + margin: 0; +} + +.field-name { + -moz-hyphens: manual; + -ms-hyphens: manual; + -webkit-hyphens: manual; + hyphens: manual; +} + +/* -- hlist styles ---------------------------------------------------------- */ + +table.hlist { + margin: 1em 0; +} + +table.hlist td { + vertical-align: top; +} + +/* -- object description styles --------------------------------------------- */ + +.sig { + font-family: 'Consolas', 'Menlo', 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', monospace; +} + +.sig-name, code.descname { + background-color: transparent; + font-weight: bold; +} + +.sig-name { + font-size: 1.1em; +} + +code.descname { + font-size: 1.2em; +} + +.sig-prename, code.descclassname { + background-color: transparent; +} + +.optional { + font-size: 1.3em; +} + +.sig-paren { + font-size: larger; +} + +.sig-param.n { + font-style: italic; +} + +/* C++ specific styling */ + +.sig-inline.c-texpr, +.sig-inline.cpp-texpr { + font-family: unset; +} + +.sig.c .k, .sig.c .kt, +.sig.cpp 
.k, .sig.cpp .kt { + color: #0033B3; +} + +.sig.c .m, +.sig.cpp .m { + color: #1750EB; +} + +.sig.c .s, .sig.c .sc, +.sig.cpp .s, .sig.cpp .sc { + color: #067D17; +} + + +/* -- other body styles ----------------------------------------------------- */ + +ol.arabic { + list-style: decimal; +} + +ol.loweralpha { + list-style: lower-alpha; +} + +ol.upperalpha { + list-style: upper-alpha; +} + +ol.lowerroman { + list-style: lower-roman; +} + +ol.upperroman { + list-style: upper-roman; +} + +:not(li) > ol > li:first-child > :first-child, +:not(li) > ul > li:first-child > :first-child { + margin-top: 0px; +} + +:not(li) > ol > li:last-child > :last-child, +:not(li) > ul > li:last-child > :last-child { + margin-bottom: 0px; +} + +ol.simple ol p, +ol.simple ul p, +ul.simple ol p, +ul.simple ul p { + margin-top: 0; +} + +ol.simple > li:not(:first-child) > p, +ul.simple > li:not(:first-child) > p { + margin-top: 0; +} + +ol.simple p, +ul.simple p { + margin-bottom: 0; +} + +aside.footnote > span, +div.citation > span { + float: left; +} +aside.footnote > span:last-of-type, +div.citation > span:last-of-type { + padding-right: 0.5em; +} +aside.footnote > p { + margin-left: 2em; +} +div.citation > p { + margin-left: 4em; +} +aside.footnote > p:last-of-type, +div.citation > p:last-of-type { + margin-bottom: 0em; +} +aside.footnote > p:last-of-type:after, +div.citation > p:last-of-type:after { + content: ""; + clear: both; +} + +dl.field-list { + display: grid; + grid-template-columns: fit-content(30%) auto; +} + +dl.field-list > dt { + font-weight: bold; + word-break: break-word; + padding-left: 0.5em; + padding-right: 5px; +} + +dl.field-list > dd { + padding-left: 0.5em; + margin-top: 0em; + margin-left: 0em; + margin-bottom: 0em; +} + +dl { + margin-bottom: 15px; +} + +dd > :first-child { + margin-top: 0px; +} + +dd ul, dd table { + margin-bottom: 10px; +} + +dd { + margin-top: 3px; + margin-bottom: 10px; + margin-left: 30px; +} + +.sig dd { + margin-top: 0px; + margin-bottom: 0px; +} + +.sig dl { + margin-top: 0px; + margin-bottom: 0px; +} + +dl > dd:last-child, +dl > dd:last-child > :last-child { + margin-bottom: 0; +} + +dt:target, span.highlighted { + background-color: #fbe54e; +} + +rect.highlighted { + fill: #fbe54e; +} + +dl.glossary dt { + font-weight: bold; + font-size: 1.1em; +} + +.versionmodified { + font-style: italic; +} + +.system-message { + background-color: #fda; + padding: 5px; + border: 3px solid red; +} + +.footnote:target { + background-color: #ffa; +} + +.line-block { + display: block; + margin-top: 1em; + margin-bottom: 1em; +} + +.line-block .line-block { + margin-top: 0; + margin-bottom: 0; + margin-left: 1.5em; +} + +.guilabel, .menuselection { + font-family: sans-serif; +} + +.accelerator { + text-decoration: underline; +} + +.classifier { + font-style: oblique; +} + +.classifier:before { + font-style: normal; + margin: 0 0.5em; + content: ":"; + display: inline-block; +} + +abbr, acronym { + border-bottom: dotted 1px; + cursor: help; +} + +.translated { + background-color: rgba(207, 255, 207, 0.2) +} + +.untranslated { + background-color: rgba(255, 207, 207, 0.2) +} + +/* -- code displays --------------------------------------------------------- */ + +pre { + overflow: auto; + overflow-y: hidden; /* fixes display issues on Chrome browsers */ +} + +pre, div[class*="highlight-"] { + clear: both; +} + +span.pre { + -moz-hyphens: none; + -ms-hyphens: none; + -webkit-hyphens: none; + hyphens: none; + white-space: nowrap; +} + +div[class*="highlight-"] { + margin: 1em 0; +} + 
+td.linenos pre { + border: 0; + background-color: transparent; + color: #aaa; +} + +table.highlighttable { + display: block; +} + +table.highlighttable tbody { + display: block; +} + +table.highlighttable tr { + display: flex; +} + +table.highlighttable td { + margin: 0; + padding: 0; +} + +table.highlighttable td.linenos { + padding-right: 0.5em; +} + +table.highlighttable td.code { + flex: 1; + overflow: hidden; +} + +.highlight .hll { + display: block; +} + +div.highlight pre, +table.highlighttable pre { + margin: 0; +} + +div.code-block-caption + div { + margin-top: 0; +} + +div.code-block-caption { + margin-top: 1em; + padding: 2px 5px; + font-size: small; +} + +div.code-block-caption code { + background-color: transparent; +} + +table.highlighttable td.linenos, +span.linenos, +div.highlight span.gp { /* gp: Generic.Prompt */ + user-select: none; + -webkit-user-select: text; /* Safari fallback only */ + -webkit-user-select: none; /* Chrome/Safari */ + -moz-user-select: none; /* Firefox */ + -ms-user-select: none; /* IE10+ */ +} + +div.code-block-caption span.caption-number { + padding: 0.1em 0.3em; + font-style: italic; +} + +div.code-block-caption span.caption-text { +} + +div.literal-block-wrapper { + margin: 1em 0; +} + +code.xref, a code { + background-color: transparent; + font-weight: bold; +} + +h1 code, h2 code, h3 code, h4 code, h5 code, h6 code { + background-color: transparent; +} + +.viewcode-link { + float: right; +} + +.viewcode-back { + float: right; + font-family: sans-serif; +} + +div.viewcode-block:target { + margin: -1px -10px; + padding: 0 10px; +} + +/* -- math display ---------------------------------------------------------- */ + +img.math { + vertical-align: middle; +} + +div.body div.math p { + text-align: center; +} + +span.eqno { + float: right; +} + +span.eqno a.headerlink { + position: absolute; + z-index: 1; +} + +div.math:hover a.headerlink { + visibility: visible; +} + +/* -- printout stylesheet --------------------------------------------------- */ + +@media print { + div.document, + div.documentwrapper, + div.bodywrapper { + margin: 0 !important; + width: 100%; + } + + div.sphinxsidebar, + div.related, + div.footer, + #top-link { + display: none; + } +} \ No newline at end of file diff --git a/_static/css/badge_only.css b/_static/css/badge_only.css new file mode 100644 index 00000000..c718cee4 --- /dev/null +++ b/_static/css/badge_only.css @@ -0,0 +1 @@ +.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}@font-face{font-family:FontAwesome;font-style:normal;font-weight:400;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#FontAwesome) format("svg")}.fa:before{font-family:FontAwesome;font-style:normal;font-weight:400;line-height:1}.fa:before,a .fa{text-decoration:inherit}.fa:before,a .fa,li .fa{display:inline-block}li .fa-large:before{width:1.875em}ul.fas{list-style-type:none;margin-left:2em;text-indent:-.8em}ul.fas li .fa{width:.8em}ul.fas li 
.fa-large:before{vertical-align:baseline}.fa-book:before,.icon-book:before{content:"\f02d"}.fa-caret-down:before,.icon-caret-down:before{content:"\f0d7"}.fa-caret-up:before,.icon-caret-up:before{content:"\f0d8"}.fa-caret-left:before,.icon-caret-left:before{content:"\f0d9"}.fa-caret-right:before,.icon-caret-right:before{content:"\f0da"}.rst-versions{position:fixed;bottom:0;left:0;width:300px;color:#fcfcfc;background:#1f1d1d;font-family:Lato,proxima-nova,Helvetica Neue,Arial,sans-serif;z-index:400}.rst-versions a{color:#2980b9;text-decoration:none}.rst-versions .rst-badge-small{display:none}.rst-versions .rst-current-version{padding:12px;background-color:#272525;display:block;text-align:right;font-size:90%;cursor:pointer;color:#27ae60}.rst-versions .rst-current-version:after{clear:both;content:"";display:block}.rst-versions .rst-current-version .fa{color:#fcfcfc}.rst-versions .rst-current-version .fa-book,.rst-versions .rst-current-version .icon-book{float:left}.rst-versions .rst-current-version.rst-out-of-date{background-color:#e74c3c;color:#fff}.rst-versions .rst-current-version.rst-active-old-version{background-color:#f1c40f;color:#000}.rst-versions.shift-up{height:auto;max-height:100%;overflow-y:scroll}.rst-versions.shift-up .rst-other-versions{display:block}.rst-versions .rst-other-versions{font-size:90%;padding:12px;color:grey;display:none}.rst-versions .rst-other-versions hr{display:block;height:1px;border:0;margin:20px 0;padding:0;border-top:1px solid #413d3d}.rst-versions .rst-other-versions dd{display:inline-block;margin:0}.rst-versions .rst-other-versions dd a{display:inline-block;padding:6px;color:#fcfcfc}.rst-versions.rst-badge{width:auto;bottom:20px;right:20px;left:auto;border:none;max-width:300px;max-height:90%}.rst-versions.rst-badge .fa-book,.rst-versions.rst-badge .icon-book{float:none;line-height:30px}.rst-versions.rst-badge.shift-up .rst-current-version{text-align:right}.rst-versions.rst-badge.shift-up .rst-current-version .fa-book,.rst-versions.rst-badge.shift-up .rst-current-version .icon-book{float:left}.rst-versions.rst-badge>.rst-current-version{width:auto;height:30px;line-height:30px;padding:0 6px;display:block;text-align:center}@media screen and (max-width:768px){.rst-versions{width:85%;display:none}.rst-versions.shift{display:block}} \ No newline at end of file diff --git a/_static/css/fonts/Roboto-Slab-Bold.woff b/_static/css/fonts/Roboto-Slab-Bold.woff new file mode 100644 index 00000000..6cb60000 Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Bold.woff differ diff --git a/_static/css/fonts/Roboto-Slab-Bold.woff2 b/_static/css/fonts/Roboto-Slab-Bold.woff2 new file mode 100644 index 00000000..7059e231 Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Bold.woff2 differ diff --git a/_static/css/fonts/Roboto-Slab-Regular.woff b/_static/css/fonts/Roboto-Slab-Regular.woff new file mode 100644 index 00000000..f815f63f Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Regular.woff differ diff --git a/_static/css/fonts/Roboto-Slab-Regular.woff2 b/_static/css/fonts/Roboto-Slab-Regular.woff2 new file mode 100644 index 00000000..f2c76e5b Binary files /dev/null and b/_static/css/fonts/Roboto-Slab-Regular.woff2 differ diff --git a/_static/css/fonts/fontawesome-webfont.eot b/_static/css/fonts/fontawesome-webfont.eot new file mode 100644 index 00000000..e9f60ca9 Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.eot differ diff --git a/_static/css/fonts/fontawesome-webfont.svg b/_static/css/fonts/fontawesome-webfont.svg new file 
mode 100644 index 00000000..855c845e --- /dev/null +++ b/_static/css/fonts/fontawesome-webfont.svg @@ -0,0 +1,2671 @@ + + + + +Created by FontForge 20120731 at Mon Oct 24 17:37:40 2016 + By ,,, +Copyright Dave Gandy 2016. All rights reserved. + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/_static/css/fonts/fontawesome-webfont.ttf b/_static/css/fonts/fontawesome-webfont.ttf new file mode 100644 index 00000000..35acda2f Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.ttf differ diff --git a/_static/css/fonts/fontawesome-webfont.woff b/_static/css/fonts/fontawesome-webfont.woff new file mode 100644 index 00000000..400014a4 Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.woff differ diff --git a/_static/css/fonts/fontawesome-webfont.woff2 b/_static/css/fonts/fontawesome-webfont.woff2 new file mode 100644 index 00000000..4d13fc60 Binary files /dev/null and b/_static/css/fonts/fontawesome-webfont.woff2 differ diff --git a/_static/css/fonts/lato-bold-italic.woff b/_static/css/fonts/lato-bold-italic.woff new file mode 100644 index 00000000..88ad05b9 Binary files /dev/null and b/_static/css/fonts/lato-bold-italic.woff differ diff --git a/_static/css/fonts/lato-bold-italic.woff2 b/_static/css/fonts/lato-bold-italic.woff2 new file mode 100644 index 00000000..c4e3d804 Binary files /dev/null and b/_static/css/fonts/lato-bold-italic.woff2 differ diff --git a/_static/css/fonts/lato-bold.woff b/_static/css/fonts/lato-bold.woff new file mode 100644 index 00000000..c6dff51f Binary files /dev/null and b/_static/css/fonts/lato-bold.woff differ diff --git a/_static/css/fonts/lato-bold.woff2 b/_static/css/fonts/lato-bold.woff2 new file mode 100644 index 00000000..bb195043 Binary files /dev/null and b/_static/css/fonts/lato-bold.woff2 differ diff --git a/_static/css/fonts/lato-normal-italic.woff b/_static/css/fonts/lato-normal-italic.woff new file mode 100644 index 00000000..76114bc0 Binary files /dev/null and b/_static/css/fonts/lato-normal-italic.woff differ diff --git a/_static/css/fonts/lato-normal-italic.woff2 b/_static/css/fonts/lato-normal-italic.woff2 new file mode 100644 index 00000000..3404f37e 
Binary files /dev/null and b/_static/css/fonts/lato-normal-italic.woff2 differ diff --git a/_static/css/fonts/lato-normal.woff b/_static/css/fonts/lato-normal.woff new file mode 100644 index 00000000..ae1307ff Binary files /dev/null and b/_static/css/fonts/lato-normal.woff differ diff --git a/_static/css/fonts/lato-normal.woff2 b/_static/css/fonts/lato-normal.woff2 new file mode 100644 index 00000000..3bf98433 Binary files /dev/null and b/_static/css/fonts/lato-normal.woff2 differ diff --git a/_static/css/theme.css b/_static/css/theme.css new file mode 100644 index 00000000..19a446a0 --- /dev/null +++ b/_static/css/theme.css @@ -0,0 +1,4 @@ +html{box-sizing:border-box}*,:after,:before{box-sizing:inherit}article,aside,details,figcaption,figure,footer,header,hgroup,nav,section{display:block}audio,canvas,video{display:inline-block;*display:inline;*zoom:1}[hidden],audio:not([controls]){display:none}*{-webkit-box-sizing:border-box;-moz-box-sizing:border-box;box-sizing:border-box}html{font-size:100%;-webkit-text-size-adjust:100%;-ms-text-size-adjust:100%}body{margin:0}a:active,a:hover{outline:0}abbr[title]{border-bottom:1px dotted}b,strong{font-weight:700}blockquote{margin:0}dfn{font-style:italic}ins{background:#ff9;text-decoration:none}ins,mark{color:#000}mark{background:#ff0;font-style:italic;font-weight:700}.rst-content code,.rst-content tt,code,kbd,pre,samp{font-family:monospace,serif;_font-family:courier new,monospace;font-size:1em}pre{white-space:pre}q{quotes:none}q:after,q:before{content:"";content:none}small{font-size:85%}sub,sup{font-size:75%;line-height:0;position:relative;vertical-align:baseline}sup{top:-.5em}sub{bottom:-.25em}dl,ol,ul{margin:0;padding:0;list-style:none;list-style-image:none}li{list-style:none}dd{margin:0}img{border:0;-ms-interpolation-mode:bicubic;vertical-align:middle;max-width:100%}svg:not(:root){overflow:hidden}figure,form{margin:0}label{cursor:pointer}button,input,select,textarea{font-size:100%;margin:0;vertical-align:baseline;*vertical-align:middle}button,input{line-height:normal}button,input[type=button],input[type=reset],input[type=submit]{cursor:pointer;-webkit-appearance:button;*overflow:visible}button[disabled],input[disabled]{cursor:default}input[type=search]{-webkit-appearance:textfield;-moz-box-sizing:content-box;-webkit-box-sizing:content-box;box-sizing:content-box}textarea{resize:vertical}table{border-collapse:collapse;border-spacing:0}td{vertical-align:top}.chromeframe{margin:.2em 0;background:#ccc;color:#000;padding:.2em 0}.ir{display:block;border:0;text-indent:-999em;overflow:hidden;background-color:transparent;background-repeat:no-repeat;text-align:left;direction:ltr;*line-height:0}.ir br{display:none}.hidden{display:none!important;visibility:hidden}.visuallyhidden{border:0;clip:rect(0 0 0 0);height:1px;margin:-1px;overflow:hidden;padding:0;position:absolute;width:1px}.visuallyhidden.focusable:active,.visuallyhidden.focusable:focus{clip:auto;height:auto;margin:0;overflow:visible;position:static;width:auto}.invisible{visibility:hidden}.relative{position:relative}big,small{font-size:100%}@media print{body,html,section{background:none!important}*{box-shadow:none!important;text-shadow:none!important;filter:none!important;-ms-filter:none!important}a,a:visited{text-decoration:underline}.ir a:after,a[href^="#"]:after,a[href^="javascript:"]:after{content:""}blockquote,pre{page-break-inside:avoid}thead{display:table-header-group}img,tr{page-break-inside:avoid}img{max-width:100%!important}@page{margin:.5cm}.rst-content 
.toctree-wrapper>p.caption,h2,h3,p{orphans:3;widows:3}.rst-content .toctree-wrapper>p.caption,h2,h3{page-break-after:avoid}}.btn,.fa:before,.icon:before,.rst-content .admonition,.rst-content .admonition-title:before,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .code-block-caption .headerlink:before,.rst-content .danger,.rst-content .eqno .headerlink:before,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-alert,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li button.toctree-expand:before,input[type=color],input[type=date],input[type=datetime-local],input[type=datetime],input[type=email],input[type=month],input[type=number],input[type=password],input[type=search],input[type=tel],input[type=text],input[type=time],input[type=url],input[type=week],select,textarea{-webkit-font-smoothing:antialiased}.clearfix{*zoom:1}.clearfix:after,.clearfix:before{display:table;content:""}.clearfix:after{clear:both}/*! 
+ * Font Awesome 4.7.0 by @davegandy - http://fontawesome.io - @fontawesome + * License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License) + */@font-face{font-family:FontAwesome;src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713);src:url(fonts/fontawesome-webfont.eot?674f50d287a8c48dc19ba404d20fe713?#iefix&v=4.7.0) format("embedded-opentype"),url(fonts/fontawesome-webfont.woff2?af7ae505a9eed503f8b8e6982036873e) format("woff2"),url(fonts/fontawesome-webfont.woff?fee66e712a8a08eef5805a46892932ad) format("woff"),url(fonts/fontawesome-webfont.ttf?b06871f281fee6b241d60582ae9369b9) format("truetype"),url(fonts/fontawesome-webfont.svg?912ec66d7572ff821749319396470bde#fontawesomeregular) format("svg");font-weight:400;font-style:normal}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{display:inline-block;font:normal normal normal 14px/1 FontAwesome;font-size:inherit;text-rendering:auto;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale}.fa-lg{font-size:1.33333em;line-height:.75em;vertical-align:-15%}.fa-2x{font-size:2em}.fa-3x{font-size:3em}.fa-4x{font-size:4em}.fa-5x{font-size:5em}.fa-fw{width:1.28571em;text-align:center}.fa-ul{padding-left:0;margin-left:2.14286em;list-style-type:none}.fa-ul>li{position:relative}.fa-li{position:absolute;left:-2.14286em;width:2.14286em;top:.14286em;text-align:center}.fa-li.fa-lg{left:-1.85714em}.fa-border{padding:.2em .25em .15em;border:.08em solid #eee;border-radius:.1em}.fa-pull-left{float:left}.fa-pull-right{float:right}.fa-pull-left.icon,.fa.fa-pull-left,.rst-content .code-block-caption .fa-pull-left.headerlink,.rst-content .eqno .fa-pull-left.headerlink,.rst-content .fa-pull-left.admonition-title,.rst-content code.download span.fa-pull-left:first-child,.rst-content dl dt .fa-pull-left.headerlink,.rst-content h1 .fa-pull-left.headerlink,.rst-content h2 .fa-pull-left.headerlink,.rst-content h3 .fa-pull-left.headerlink,.rst-content h4 .fa-pull-left.headerlink,.rst-content h5 .fa-pull-left.headerlink,.rst-content h6 .fa-pull-left.headerlink,.rst-content p .fa-pull-left.headerlink,.rst-content table>caption .fa-pull-left.headerlink,.rst-content tt.download span.fa-pull-left:first-child,.wy-menu-vertical li.current>a button.fa-pull-left.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-left.toctree-expand,.wy-menu-vertical li button.fa-pull-left.toctree-expand{margin-right:.3em}.fa-pull-right.icon,.fa.fa-pull-right,.rst-content .code-block-caption .fa-pull-right.headerlink,.rst-content .eqno .fa-pull-right.headerlink,.rst-content .fa-pull-right.admonition-title,.rst-content code.download span.fa-pull-right:first-child,.rst-content dl dt .fa-pull-right.headerlink,.rst-content h1 .fa-pull-right.headerlink,.rst-content h2 .fa-pull-right.headerlink,.rst-content h3 .fa-pull-right.headerlink,.rst-content h4 .fa-pull-right.headerlink,.rst-content h5 .fa-pull-right.headerlink,.rst-content h6 
.fa-pull-right.headerlink,.rst-content p .fa-pull-right.headerlink,.rst-content table>caption .fa-pull-right.headerlink,.rst-content tt.download span.fa-pull-right:first-child,.wy-menu-vertical li.current>a button.fa-pull-right.toctree-expand,.wy-menu-vertical li.on a button.fa-pull-right.toctree-expand,.wy-menu-vertical li button.fa-pull-right.toctree-expand{margin-left:.3em}.pull-right{float:right}.pull-left{float:left}.fa.pull-left,.pull-left.icon,.rst-content .code-block-caption .pull-left.headerlink,.rst-content .eqno .pull-left.headerlink,.rst-content .pull-left.admonition-title,.rst-content code.download span.pull-left:first-child,.rst-content dl dt .pull-left.headerlink,.rst-content h1 .pull-left.headerlink,.rst-content h2 .pull-left.headerlink,.rst-content h3 .pull-left.headerlink,.rst-content h4 .pull-left.headerlink,.rst-content h5 .pull-left.headerlink,.rst-content h6 .pull-left.headerlink,.rst-content p .pull-left.headerlink,.rst-content table>caption .pull-left.headerlink,.rst-content tt.download span.pull-left:first-child,.wy-menu-vertical li.current>a button.pull-left.toctree-expand,.wy-menu-vertical li.on a button.pull-left.toctree-expand,.wy-menu-vertical li button.pull-left.toctree-expand{margin-right:.3em}.fa.pull-right,.pull-right.icon,.rst-content .code-block-caption .pull-right.headerlink,.rst-content .eqno .pull-right.headerlink,.rst-content .pull-right.admonition-title,.rst-content code.download span.pull-right:first-child,.rst-content dl dt .pull-right.headerlink,.rst-content h1 .pull-right.headerlink,.rst-content h2 .pull-right.headerlink,.rst-content h3 .pull-right.headerlink,.rst-content h4 .pull-right.headerlink,.rst-content h5 .pull-right.headerlink,.rst-content h6 .pull-right.headerlink,.rst-content p .pull-right.headerlink,.rst-content table>caption .pull-right.headerlink,.rst-content tt.download span.pull-right:first-child,.wy-menu-vertical li.current>a button.pull-right.toctree-expand,.wy-menu-vertical li.on a button.pull-right.toctree-expand,.wy-menu-vertical li button.pull-right.toctree-expand{margin-left:.3em}.fa-spin{-webkit-animation:fa-spin 2s linear infinite;animation:fa-spin 2s linear infinite}.fa-pulse{-webkit-animation:fa-spin 1s steps(8) infinite;animation:fa-spin 1s steps(8) infinite}@-webkit-keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}@keyframes fa-spin{0%{-webkit-transform:rotate(0deg);transform:rotate(0deg)}to{-webkit-transform:rotate(359deg);transform:rotate(359deg)}}.fa-rotate-90{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=1)";-webkit-transform:rotate(90deg);-ms-transform:rotate(90deg);transform:rotate(90deg)}.fa-rotate-180{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2)";-webkit-transform:rotate(180deg);-ms-transform:rotate(180deg);transform:rotate(180deg)}.fa-rotate-270{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=3)";-webkit-transform:rotate(270deg);-ms-transform:rotate(270deg);transform:rotate(270deg)}.fa-flip-horizontal{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=0, mirror=1)";-webkit-transform:scaleX(-1);-ms-transform:scaleX(-1);transform:scaleX(-1)}.fa-flip-vertical{-ms-filter:"progid:DXImageTransform.Microsoft.BasicImage(rotation=2, mirror=1)";-webkit-transform:scaleY(-1);-ms-transform:scaleY(-1);transform:scaleY(-1)}:root .fa-flip-horizontal,:root .fa-flip-vertical,:root .fa-rotate-90,:root .fa-rotate-180,:root 
.fa-rotate-270{filter:none}.fa-stack{position:relative;display:inline-block;width:2em;height:2em;line-height:2em;vertical-align:middle}.fa-stack-1x,.fa-stack-2x{position:absolute;left:0;width:100%;text-align:center}.fa-stack-1x{line-height:inherit}.fa-stack-2x{font-size:2em}.fa-inverse{color:#fff}.fa-glass:before{content:""}.fa-music:before{content:""}.fa-search:before,.icon-search:before{content:""}.fa-envelope-o:before{content:""}.fa-heart:before{content:""}.fa-star:before{content:""}.fa-star-o:before{content:""}.fa-user:before{content:""}.fa-film:before{content:""}.fa-th-large:before{content:""}.fa-th:before{content:""}.fa-th-list:before{content:""}.fa-check:before{content:""}.fa-close:before,.fa-remove:before,.fa-times:before{content:""}.fa-search-plus:before{content:""}.fa-search-minus:before{content:""}.fa-power-off:before{content:""}.fa-signal:before{content:""}.fa-cog:before,.fa-gear:before{content:""}.fa-trash-o:before{content:""}.fa-home:before,.icon-home:before{content:""}.fa-file-o:before{content:""}.fa-clock-o:before{content:""}.fa-road:before{content:""}.fa-download:before,.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{content:""}.fa-arrow-circle-o-down:before{content:""}.fa-arrow-circle-o-up:before{content:""}.fa-inbox:before{content:""}.fa-play-circle-o:before{content:""}.fa-repeat:before,.fa-rotate-right:before{content:""}.fa-refresh:before{content:""}.fa-list-alt:before{content:""}.fa-lock:before{content:""}.fa-flag:before{content:""}.fa-headphones:before{content:""}.fa-volume-off:before{content:""}.fa-volume-down:before{content:""}.fa-volume-up:before{content:""}.fa-qrcode:before{content:""}.fa-barcode:before{content:""}.fa-tag:before{content:""}.fa-tags:before{content:""}.fa-book:before,.icon-book:before{content:""}.fa-bookmark:before{content:""}.fa-print:before{content:""}.fa-camera:before{content:""}.fa-font:before{content:""}.fa-bold:before{content:""}.fa-italic:before{content:""}.fa-text-height:before{content:""}.fa-text-width:before{content:""}.fa-align-left:before{content:""}.fa-align-center:before{content:""}.fa-align-right:before{content:""}.fa-align-justify:before{content:""}.fa-list:before{content:""}.fa-dedent:before,.fa-outdent:before{content:""}.fa-indent:before{content:""}.fa-video-camera:before{content:""}.fa-image:before,.fa-photo:before,.fa-picture-o:before{content:""}.fa-pencil:before{content:""}.fa-map-marker:before{content:""}.fa-adjust:before{content:""}.fa-tint:before{content:""}.fa-edit:before,.fa-pencil-square-o:before{content:""}.fa-share-square-o:before{content:""}.fa-check-square-o:before{content:""}.fa-arrows:before{content:""}.fa-step-backward:before{content:""}.fa-fast-backward:before{content:""}.fa-backward:before{content:""}.fa-play:before{content:""}.fa-pause:before{content:""}.fa-stop:before{content:""}.fa-forward:before{content:""}.fa-fast-forward:before{content:""}.fa-step-forward:before{content:""}.fa-eject:before{content:""}.fa-chevron-left:before{content:""}.fa-chevron-right:before{content:""}.fa-plus-circle:before{content:""}.fa-minus-circle:before{content:""}.fa-times-circle:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before{content:""}.fa-check-circle:before,.wy-inline-validate.wy-inline-validate-success 
.wy-input-context:before{content:""}.fa-question-circle:before{content:""}.fa-info-circle:before{content:""}.fa-crosshairs:before{content:""}.fa-times-circle-o:before{content:""}.fa-check-circle-o:before{content:""}.fa-ban:before{content:""}.fa-arrow-left:before{content:""}.fa-arrow-right:before{content:""}.fa-arrow-up:before{content:""}.fa-arrow-down:before{content:""}.fa-mail-forward:before,.fa-share:before{content:""}.fa-expand:before{content:""}.fa-compress:before{content:""}.fa-plus:before{content:""}.fa-minus:before{content:""}.fa-asterisk:before{content:""}.fa-exclamation-circle:before,.rst-content .admonition-title:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before{content:""}.fa-gift:before{content:""}.fa-leaf:before{content:""}.fa-fire:before,.icon-fire:before{content:""}.fa-eye:before{content:""}.fa-eye-slash:before{content:""}.fa-exclamation-triangle:before,.fa-warning:before{content:""}.fa-plane:before{content:""}.fa-calendar:before{content:""}.fa-random:before{content:""}.fa-comment:before{content:""}.fa-magnet:before{content:""}.fa-chevron-up:before{content:""}.fa-chevron-down:before{content:""}.fa-retweet:before{content:""}.fa-shopping-cart:before{content:""}.fa-folder:before{content:""}.fa-folder-open:before{content:""}.fa-arrows-v:before{content:""}.fa-arrows-h:before{content:""}.fa-bar-chart-o:before,.fa-bar-chart:before{content:""}.fa-twitter-square:before{content:""}.fa-facebook-square:before{content:""}.fa-camera-retro:before{content:""}.fa-key:before{content:""}.fa-cogs:before,.fa-gears:before{content:""}.fa-comments:before{content:""}.fa-thumbs-o-up:before{content:""}.fa-thumbs-o-down:before{content:""}.fa-star-half:before{content:""}.fa-heart-o:before{content:""}.fa-sign-out:before{content:""}.fa-linkedin-square:before{content:""}.fa-thumb-tack:before{content:""}.fa-external-link:before{content:""}.fa-sign-in:before{content:""}.fa-trophy:before{content:""}.fa-github-square:before{content:""}.fa-upload:before{content:""}.fa-lemon-o:before{content:""}.fa-phone:before{content:""}.fa-square-o:before{content:""}.fa-bookmark-o:before{content:""}.fa-phone-square:before{content:""}.fa-twitter:before{content:""}.fa-facebook-f:before,.fa-facebook:before{content:""}.fa-github:before,.icon-github:before{content:""}.fa-unlock:before{content:""}.fa-credit-card:before{content:""}.fa-feed:before,.fa-rss:before{content:""}.fa-hdd-o:before{content:""}.fa-bullhorn:before{content:""}.fa-bell:before{content:""}.fa-certificate:before{content:""}.fa-hand-o-right:before{content:""}.fa-hand-o-left:before{content:""}.fa-hand-o-up:before{content:""}.fa-hand-o-down:before{content:""}.fa-arrow-circle-left:before,.icon-circle-arrow-left:before{content:""}.fa-arrow-circle-right:before,.icon-circle-arrow-right:before{content:""}.fa-arrow-circle-up:before{content:""}.fa-arrow-circle-down:before{content:""}.fa-globe:before{content:""}.fa-wrench:before{content:""}.fa-tasks:before{content:""}.fa-filter:before{content:""}.fa-briefcase:before{content:""}.fa-arrows-alt:before{content:""}.fa-group:before,.fa-users:before{content:""}.fa-chain:before,.fa-link:before,.icon-link:before{content:""}.fa-cloud:before{content:""}.fa-flask:before{content:""}.fa-cut:before,.fa-scissors:before{content:""}.fa-copy:before,.fa-files-o:before{content:""}.fa-paperclip:before{content:""}.fa-floppy-o:before,.fa-save:before{content:""}.fa
-square:before{content:""}.fa-bars:before,.fa-navicon:before,.fa-reorder:before{content:""}.fa-list-ul:before{content:""}.fa-list-ol:before{content:""}.fa-strikethrough:before{content:""}.fa-underline:before{content:""}.fa-table:before{content:""}.fa-magic:before{content:""}.fa-truck:before{content:""}.fa-pinterest:before{content:""}.fa-pinterest-square:before{content:""}.fa-google-plus-square:before{content:""}.fa-google-plus:before{content:""}.fa-money:before{content:""}.fa-caret-down:before,.icon-caret-down:before,.wy-dropdown .caret:before{content:""}.fa-caret-up:before{content:""}.fa-caret-left:before{content:""}.fa-caret-right:before{content:""}.fa-columns:before{content:""}.fa-sort:before,.fa-unsorted:before{content:""}.fa-sort-desc:before,.fa-sort-down:before{content:""}.fa-sort-asc:before,.fa-sort-up:before{content:""}.fa-envelope:before{content:""}.fa-linkedin:before{content:""}.fa-rotate-left:before,.fa-undo:before{content:""}.fa-gavel:before,.fa-legal:before{content:""}.fa-dashboard:before,.fa-tachometer:before{content:""}.fa-comment-o:before{content:""}.fa-comments-o:before{content:""}.fa-bolt:before,.fa-flash:before{content:""}.fa-sitemap:before{content:""}.fa-umbrella:before{content:""}.fa-clipboard:before,.fa-paste:before{content:""}.fa-lightbulb-o:before{content:""}.fa-exchange:before{content:""}.fa-cloud-download:before{content:""}.fa-cloud-upload:before{content:""}.fa-user-md:before{content:""}.fa-stethoscope:before{content:""}.fa-suitcase:before{content:""}.fa-bell-o:before{content:""}.fa-coffee:before{content:""}.fa-cutlery:before{content:""}.fa-file-text-o:before{content:""}.fa-building-o:before{content:""}.fa-hospital-o:before{content:""}.fa-ambulance:before{content:""}.fa-medkit:before{content:""}.fa-fighter-jet:before{content:""}.fa-beer:before{content:""}.fa-h-square:before{content:""}.fa-plus-square:before{content:""}.fa-angle-double-left:before{content:""}.fa-angle-double-right:before{content:""}.fa-angle-double-up:before{content:""}.fa-angle-double-down:before{content:""}.fa-angle-left:before{content:""}.fa-angle-right:before{content:""}.fa-angle-up:before{content:""}.fa-angle-down:before{content:""}.fa-desktop:before{content:""}.fa-laptop:before{content:""}.fa-tablet:before{content:""}.fa-mobile-phone:before,.fa-mobile:before{content:""}.fa-circle-o:before{content:""}.fa-quote-left:before{content:""}.fa-quote-right:before{content:""}.fa-spinner:before{content:""}.fa-circle:before{content:""}.fa-mail-reply:before,.fa-reply:before{content:""}.fa-github-alt:before{content:""}.fa-folder-o:before{content:""}.fa-folder-open-o:before{content:""}.fa-smile-o:before{content:""}.fa-frown-o:before{content:""}.fa-meh-o:before{content:""}.fa-gamepad:before{content:""}.fa-keyboard-o:before{content:""}.fa-flag-o:before{content:""}.fa-flag-checkered:before{content:""}.fa-terminal:before{content:""}.fa-code:before{content:""}.fa-mail-reply-all:before,.fa-reply-all:before{content:""}.fa-star-half-empty:before,.fa-star-half-full:before,.fa-star-half-o:before{content:""}.fa-location-arrow:before{content:""}.fa-crop:before{content:""}.fa-code-fork:before{content:""}.fa-chain-broken:before,.fa-unlink:before{content:""}.fa-question:before{content:""}.fa-info:before{content:""}.fa-exclamation:before{content:""}.fa-superscript:before{content:""}.fa-subscript:before{content:""}.fa-eraser:before{content:""}.fa-puzzle-piece:before{content:""}.fa-microphone:before{content:""}.fa-microphone-slash:
before{content:""}.fa-shield:before{content:""}.fa-calendar-o:before{content:""}.fa-fire-extinguisher:before{content:""}.fa-rocket:before{content:""}.fa-maxcdn:before{content:""}.fa-chevron-circle-left:before{content:""}.fa-chevron-circle-right:before{content:""}.fa-chevron-circle-up:before{content:""}.fa-chevron-circle-down:before{content:""}.fa-html5:before{content:""}.fa-css3:before{content:""}.fa-anchor:before{content:""}.fa-unlock-alt:before{content:""}.fa-bullseye:before{content:""}.fa-ellipsis-h:before{content:""}.fa-ellipsis-v:before{content:""}.fa-rss-square:before{content:""}.fa-play-circle:before{content:""}.fa-ticket:before{content:""}.fa-minus-square:before{content:""}.fa-minus-square-o:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before{content:""}.fa-level-up:before{content:""}.fa-level-down:before{content:""}.fa-check-square:before{content:""}.fa-pencil-square:before{content:""}.fa-external-link-square:before{content:""}.fa-share-square:before{content:""}.fa-compass:before{content:""}.fa-caret-square-o-down:before,.fa-toggle-down:before{content:""}.fa-caret-square-o-up:before,.fa-toggle-up:before{content:""}.fa-caret-square-o-right:before,.fa-toggle-right:before{content:""}.fa-eur:before,.fa-euro:before{content:""}.fa-gbp:before{content:""}.fa-dollar:before,.fa-usd:before{content:""}.fa-inr:before,.fa-rupee:before{content:""}.fa-cny:before,.fa-jpy:before,.fa-rmb:before,.fa-yen:before{content:""}.fa-rouble:before,.fa-rub:before,.fa-ruble:before{content:""}.fa-krw:before,.fa-won:before{content:""}.fa-bitcoin:before,.fa-btc:before{content:""}.fa-file:before{content:""}.fa-file-text:before{content:""}.fa-sort-alpha-asc:before{content:""}.fa-sort-alpha-desc:before{content:""}.fa-sort-amount-asc:before{content:""}.fa-sort-amount-desc:before{content:""}.fa-sort-numeric-asc:before{content:""}.fa-sort-numeric-desc:before{content:""}.fa-thumbs-up:before{content:""}.fa-thumbs-down:before{content:""}.fa-youtube-square:before{content:""}.fa-youtube:before{content:""}.fa-xing:before{content:""}.fa-xing-square:before{content:""}.fa-youtube-play:before{content:""}.fa-dropbox:before{content:""}.fa-stack-overflow:before{content:""}.fa-instagram:before{content:""}.fa-flickr:before{content:""}.fa-adn:before{content:""}.fa-bitbucket:before,.icon-bitbucket:before{content:""}.fa-bitbucket-square:before{content:""}.fa-tumblr:before{content:""}.fa-tumblr-square:before{content:""}.fa-long-arrow-down:before{content:""}.fa-long-arrow-up:before{content:""}.fa-long-arrow-left:before{content:""}.fa-long-arrow-right:before{content:""}.fa-apple:before{content:""}.fa-windows:before{content:""}.fa-android:before{content:""}.fa-linux:before{content:""}.fa-dribbble:before{content:""}.fa-skype:before{content:""}.fa-foursquare:before{content:""}.fa-trello:before{content:""}.fa-female:before{content:""}.fa-male:before{content:""}.fa-gittip:before,.fa-gratipay:before{content:""}.fa-sun-o:before{content:""}.fa-moon-o:before{content:""}.fa-archive:before{content:""}.fa-bug:before{content:""}.fa-vk:before{content:""}.fa-weibo:before{content:""}.fa-renren:before{content:""}.fa-pagelines:before{content:""}.fa-stack-exchange:before{content:""}.fa-arrow-circle-o-right:before{content:""}.fa-arrow-circle-o-left:before{content:""}.fa-caret-square-o-left:before,.fa-toggle-left:before{content:""}.fa-dot-circle-o:before{content:""}.fa-wheelchair:before{content:""}.fa-
vimeo-square:before{content:""}.fa-try:before,.fa-turkish-lira:before{content:""}.fa-plus-square-o:before,.wy-menu-vertical li button.toctree-expand:before{content:""}.fa-space-shuttle:before{content:""}.fa-slack:before{content:""}.fa-envelope-square:before{content:""}.fa-wordpress:before{content:""}.fa-openid:before{content:""}.fa-bank:before,.fa-institution:before,.fa-university:before{content:""}.fa-graduation-cap:before,.fa-mortar-board:before{content:""}.fa-yahoo:before{content:""}.fa-google:before{content:""}.fa-reddit:before{content:""}.fa-reddit-square:before{content:""}.fa-stumbleupon-circle:before{content:""}.fa-stumbleupon:before{content:""}.fa-delicious:before{content:""}.fa-digg:before{content:""}.fa-pied-piper-pp:before{content:""}.fa-pied-piper-alt:before{content:""}.fa-drupal:before{content:""}.fa-joomla:before{content:""}.fa-language:before{content:""}.fa-fax:before{content:""}.fa-building:before{content:""}.fa-child:before{content:""}.fa-paw:before{content:""}.fa-spoon:before{content:""}.fa-cube:before{content:""}.fa-cubes:before{content:""}.fa-behance:before{content:""}.fa-behance-square:before{content:""}.fa-steam:before{content:""}.fa-steam-square:before{content:""}.fa-recycle:before{content:""}.fa-automobile:before,.fa-car:before{content:""}.fa-cab:before,.fa-taxi:before{content:""}.fa-tree:before{content:""}.fa-spotify:before{content:""}.fa-deviantart:before{content:""}.fa-soundcloud:before{content:""}.fa-database:before{content:""}.fa-file-pdf-o:before{content:""}.fa-file-word-o:before{content:""}.fa-file-excel-o:before{content:""}.fa-file-powerpoint-o:before{content:""}.fa-file-image-o:before,.fa-file-photo-o:before,.fa-file-picture-o:before{content:""}.fa-file-archive-o:before,.fa-file-zip-o:before{content:""}.fa-file-audio-o:before,.fa-file-sound-o:before{content:""}.fa-file-movie-o:before,.fa-file-video-o:before{content:""}.fa-file-code-o:before{content:""}.fa-vine:before{content:""}.fa-codepen:before{content:""}.fa-jsfiddle:before{content:""}.fa-life-bouy:before,.fa-life-buoy:before,.fa-life-ring:before,.fa-life-saver:before,.fa-support:before{content:""}.fa-circle-o-notch:before{content:""}.fa-ra:before,.fa-rebel:before,.fa-resistance:before{content:""}.fa-empire:before,.fa-ge:before{content:""}.fa-git-square:before{content:""}.fa-git:before{content:""}.fa-hacker-news:before,.fa-y-combinator-square:before,.fa-yc-square:before{content:""}.fa-tencent-weibo:before{content:""}.fa-qq:before{content:""}.fa-wechat:before,.fa-weixin:before{content:""}.fa-paper-plane:before,.fa-send:before{content:""}.fa-paper-plane-o:before,.fa-send-o:before{content:""}.fa-history:before{content:""}.fa-circle-thin:before{content:""}.fa-header:before{content:""}.fa-paragraph:before{content:""}.fa-sliders:before{content:""}.fa-share-alt:before{content:""}.fa-share-alt-square:before{content:""}.fa-bomb:before{content:""}.fa-futbol-o:before,.fa-soccer-ball-o:before{content:""}.fa-tty:before{content:""}.fa-binoculars:before{content:""}.fa-plug:before{content:""}.fa-slideshare:before{content:""}.fa-twitch:before{content:""}.fa-yelp:before{content:""}.fa-newspaper-o:before{content:""}.fa-wifi:before{content:""}.fa-calculator:before{content:""}.fa-paypal:before{content:""}.fa-google-wallet:before{content:""}.fa-cc-visa:before{content:""}.fa-cc-mastercard:before{content:""}.fa-cc-discover:before{content:""}.fa-cc-amex:before{content:""}.fa-cc-paypal:before{content:""}.fa-cc-stripe:before{content:""}.fa-b
ell-slash:before{content:""}.fa-bell-slash-o:before{content:""}.fa-trash:before{content:""}.fa-copyright:before{content:""}.fa-at:before{content:""}.fa-eyedropper:before{content:""}.fa-paint-brush:before{content:""}.fa-birthday-cake:before{content:""}.fa-area-chart:before{content:""}.fa-pie-chart:before{content:""}.fa-line-chart:before{content:""}.fa-lastfm:before{content:""}.fa-lastfm-square:before{content:""}.fa-toggle-off:before{content:""}.fa-toggle-on:before{content:""}.fa-bicycle:before{content:""}.fa-bus:before{content:""}.fa-ioxhost:before{content:""}.fa-angellist:before{content:""}.fa-cc:before{content:""}.fa-ils:before,.fa-shekel:before,.fa-sheqel:before{content:""}.fa-meanpath:before{content:""}.fa-buysellads:before{content:""}.fa-connectdevelop:before{content:""}.fa-dashcube:before{content:""}.fa-forumbee:before{content:""}.fa-leanpub:before{content:""}.fa-sellsy:before{content:""}.fa-shirtsinbulk:before{content:""}.fa-simplybuilt:before{content:""}.fa-skyatlas:before{content:""}.fa-cart-plus:before{content:""}.fa-cart-arrow-down:before{content:""}.fa-diamond:before{content:""}.fa-ship:before{content:""}.fa-user-secret:before{content:""}.fa-motorcycle:before{content:""}.fa-street-view:before{content:""}.fa-heartbeat:before{content:""}.fa-venus:before{content:""}.fa-mars:before{content:""}.fa-mercury:before{content:""}.fa-intersex:before,.fa-transgender:before{content:""}.fa-transgender-alt:before{content:""}.fa-venus-double:before{content:""}.fa-mars-double:before{content:""}.fa-venus-mars:before{content:""}.fa-mars-stroke:before{content:""}.fa-mars-stroke-v:before{content:""}.fa-mars-stroke-h:before{content:""}.fa-neuter:before{content:""}.fa-genderless:before{content:""}.fa-facebook-official:before{content:""}.fa-pinterest-p:before{content:""}.fa-whatsapp:before{content:""}.fa-server:before{content:""}.fa-user-plus:before{content:""}.fa-user-times:before{content:""}.fa-bed:before,.fa-hotel:before{content:""}.fa-viacoin:before{content:""}.fa-train:before{content:""}.fa-subway:before{content:""}.fa-medium:before{content:""}.fa-y-combinator:before,.fa-yc:before{content:""}.fa-optin-monster:before{content:""}.fa-opencart:before{content:""}.fa-expeditedssl:before{content:""}.fa-battery-4:before,.fa-battery-full:before,.fa-battery:before{content:""}.fa-battery-3:before,.fa-battery-three-quarters:before{content:""}.fa-battery-2:before,.fa-battery-half:before{content:""}.fa-battery-1:before,.fa-battery-quarter:before{content:""}.fa-battery-0:before,.fa-battery-empty:before{content:""}.fa-mouse-pointer:before{content:""}.fa-i-cursor:before{content:""}.fa-object-group:before{content:""}.fa-object-ungroup:before{content:""}.fa-sticky-note:before{content:""}.fa-sticky-note-o:before{content:""}.fa-cc-jcb:before{content:""}.fa-cc-diners-club:before{content:""}.fa-clone:before{content:""}.fa-balance-scale:before{content:""}.fa-hourglass-o:before{content:""}.fa-hourglass-1:before,.fa-hourglass-start:before{content:""}.fa-hourglass-2:before,.fa-hourglass-half:before{content:""}.fa-hourglass-3:before,.fa-hourglass-end:before{content:""}.fa-hourglass:before{content:""}.fa-hand-grab-o:before,.fa-hand-rock-o:before{content:""}.fa-hand-paper-o:before,.fa-hand-stop-o:before{content:""}.fa-hand-scissors-o:before{content:""}.fa-hand-lizard-o:before{content:""}.fa-hand-spock-o:before{content:""}.fa-hand-pointer-o:before{content:""}.fa-hand-peace-o:before{content:""}.fa-trademark:before{content:""}.fa-register
ed:before{content:""}.fa-creative-commons:before{content:""}.fa-gg:before{content:""}.fa-gg-circle:before{content:""}.fa-tripadvisor:before{content:""}.fa-odnoklassniki:before{content:""}.fa-odnoklassniki-square:before{content:""}.fa-get-pocket:before{content:""}.fa-wikipedia-w:before{content:""}.fa-safari:before{content:""}.fa-chrome:before{content:""}.fa-firefox:before{content:""}.fa-opera:before{content:""}.fa-internet-explorer:before{content:""}.fa-television:before,.fa-tv:before{content:""}.fa-contao:before{content:""}.fa-500px:before{content:""}.fa-amazon:before{content:""}.fa-calendar-plus-o:before{content:""}.fa-calendar-minus-o:before{content:""}.fa-calendar-times-o:before{content:""}.fa-calendar-check-o:before{content:""}.fa-industry:before{content:""}.fa-map-pin:before{content:""}.fa-map-signs:before{content:""}.fa-map-o:before{content:""}.fa-map:before{content:""}.fa-commenting:before{content:""}.fa-commenting-o:before{content:""}.fa-houzz:before{content:""}.fa-vimeo:before{content:""}.fa-black-tie:before{content:""}.fa-fonticons:before{content:""}.fa-reddit-alien:before{content:""}.fa-edge:before{content:""}.fa-credit-card-alt:before{content:""}.fa-codiepie:before{content:""}.fa-modx:before{content:""}.fa-fort-awesome:before{content:""}.fa-usb:before{content:""}.fa-product-hunt:before{content:""}.fa-mixcloud:before{content:""}.fa-scribd:before{content:""}.fa-pause-circle:before{content:""}.fa-pause-circle-o:before{content:""}.fa-stop-circle:before{content:""}.fa-stop-circle-o:before{content:""}.fa-shopping-bag:before{content:""}.fa-shopping-basket:before{content:""}.fa-hashtag:before{content:""}.fa-bluetooth:before{content:""}.fa-bluetooth-b:before{content:""}.fa-percent:before{content:""}.fa-gitlab:before,.icon-gitlab:before{content:""}.fa-wpbeginner:before{content:""}.fa-wpforms:before{content:""}.fa-envira:before{content:""}.fa-universal-access:before{content:""}.fa-wheelchair-alt:before{content:""}.fa-question-circle-o:before{content:""}.fa-blind:before{content:""}.fa-audio-description:before{content:""}.fa-volume-control-phone:before{content:""}.fa-braille:before{content:""}.fa-assistive-listening-systems:before{content:""}.fa-american-sign-language-interpreting:before,.fa-asl-interpreting:before{content:""}.fa-deaf:before,.fa-deafness:before,.fa-hard-of-hearing:before{content:""}.fa-glide:before{content:""}.fa-glide-g:before{content:""}.fa-sign-language:before,.fa-signing:before{content:""}.fa-low-vision:before{content:""}.fa-viadeo:before{content:""}.fa-viadeo-square:before{content:""}.fa-snapchat:before{content:""}.fa-snapchat-ghost:before{content:""}.fa-snapchat-square:before{content:""}.fa-pied-piper:before{content:""}.fa-first-order:before{content:""}.fa-yoast:before{content:""}.fa-themeisle:before{content:""}.fa-google-plus-circle:before,.fa-google-plus-official:before{content:""}.fa-fa:before,.fa-font-awesome:before{content:""}.fa-handshake-o:before{content:""}.fa-envelope-open:before{content:""}.fa-envelope-open-o:before{content:""}.fa-linode:before{content:""}.fa-address-book:before{content:""}.fa-address-book-o:before{content:""}.fa-address-card:before,.fa-vcard:before{content:""}.fa-address-card-o:before,.fa-vcard-o:before{content:""}.fa-user-circle:before{content:""}.fa-user-circle-o:before{content:""}.fa-user-o:before{content:""}.fa-id-badge:before{content:""}.fa-drivers-license:before,.fa-id-card:before{content:""}.fa-drivers-license-o:before,.fa-id-card-o:before{c
ontent:""}.fa-quora:before{content:""}.fa-free-code-camp:before{content:""}.fa-telegram:before{content:""}.fa-thermometer-4:before,.fa-thermometer-full:before,.fa-thermometer:before{content:""}.fa-thermometer-3:before,.fa-thermometer-three-quarters:before{content:""}.fa-thermometer-2:before,.fa-thermometer-half:before{content:""}.fa-thermometer-1:before,.fa-thermometer-quarter:before{content:""}.fa-thermometer-0:before,.fa-thermometer-empty:before{content:""}.fa-shower:before{content:""}.fa-bath:before,.fa-bathtub:before,.fa-s15:before{content:""}.fa-podcast:before{content:""}.fa-window-maximize:before{content:""}.fa-window-minimize:before{content:""}.fa-window-restore:before{content:""}.fa-times-rectangle:before,.fa-window-close:before{content:""}.fa-times-rectangle-o:before,.fa-window-close-o:before{content:""}.fa-bandcamp:before{content:""}.fa-grav:before{content:""}.fa-etsy:before{content:""}.fa-imdb:before{content:""}.fa-ravelry:before{content:""}.fa-eercast:before{content:""}.fa-microchip:before{content:""}.fa-snowflake-o:before{content:""}.fa-superpowers:before{content:""}.fa-wpexplorer:before{content:""}.fa-meetup:before{content:""}.sr-only{position:absolute;width:1px;height:1px;padding:0;margin:-1px;overflow:hidden;clip:rect(0,0,0,0);border:0}.sr-only-focusable:active,.sr-only-focusable:focus{position:static;width:auto;height:auto;margin:0;overflow:visible;clip:auto}.fa,.icon,.rst-content .admonition-title,.rst-content .code-block-caption .headerlink,.rst-content .eqno .headerlink,.rst-content code.download span:first-child,.rst-content dl dt .headerlink,.rst-content h1 .headerlink,.rst-content h2 .headerlink,.rst-content h3 .headerlink,.rst-content h4 .headerlink,.rst-content h5 .headerlink,.rst-content h6 .headerlink,.rst-content p.caption .headerlink,.rst-content p .headerlink,.rst-content table>caption .headerlink,.rst-content tt.download span:first-child,.wy-dropdown .caret,.wy-inline-validate.wy-inline-validate-danger .wy-input-context,.wy-inline-validate.wy-inline-validate-info .wy-input-context,.wy-inline-validate.wy-inline-validate-success .wy-input-context,.wy-inline-validate.wy-inline-validate-warning .wy-input-context,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li button.toctree-expand{font-family:inherit}.fa:before,.icon:before,.rst-content .admonition-title:before,.rst-content .code-block-caption .headerlink:before,.rst-content .eqno .headerlink:before,.rst-content code.download span:first-child:before,.rst-content dl dt .headerlink:before,.rst-content h1 .headerlink:before,.rst-content h2 .headerlink:before,.rst-content h3 .headerlink:before,.rst-content h4 .headerlink:before,.rst-content h5 .headerlink:before,.rst-content h6 .headerlink:before,.rst-content p.caption .headerlink:before,.rst-content p .headerlink:before,.rst-content table>caption .headerlink:before,.rst-content tt.download span:first-child:before,.wy-dropdown .caret:before,.wy-inline-validate.wy-inline-validate-danger .wy-input-context:before,.wy-inline-validate.wy-inline-validate-info .wy-input-context:before,.wy-inline-validate.wy-inline-validate-success .wy-input-context:before,.wy-inline-validate.wy-inline-validate-warning .wy-input-context:before,.wy-menu-vertical li.current>a button.toctree-expand:before,.wy-menu-vertical li.on a button.toctree-expand:before,.wy-menu-vertical li 
button.toctree-expand:before{font-family:FontAwesome;display:inline-block;font-style:normal;font-weight:400;line-height:1;text-decoration:inherit}.rst-content .code-block-caption a .headerlink,.rst-content .eqno a .headerlink,.rst-content a .admonition-title,.rst-content code.download a span:first-child,.rst-content dl dt a .headerlink,.rst-content h1 a .headerlink,.rst-content h2 a .headerlink,.rst-content h3 a .headerlink,.rst-content h4 a .headerlink,.rst-content h5 a .headerlink,.rst-content h6 a .headerlink,.rst-content p.caption a .headerlink,.rst-content p a .headerlink,.rst-content table>caption a .headerlink,.rst-content tt.download a span:first-child,.wy-menu-vertical li.current>a button.toctree-expand,.wy-menu-vertical li.on a button.toctree-expand,.wy-menu-vertical li a button.toctree-expand,a .fa,a .icon,a .rst-content .admonition-title,a .rst-content .code-block-caption .headerlink,a .rst-content .eqno .headerlink,a .rst-content code.download span:first-child,a .rst-content dl dt .headerlink,a .rst-content h1 .headerlink,a .rst-content h2 .headerlink,a .rst-content h3 .headerlink,a .rst-content h4 .headerlink,a .rst-content h5 .headerlink,a .rst-content h6 .headerlink,a .rst-content p.caption .headerlink,a .rst-content p .headerlink,a .rst-content table>caption .headerlink,a .rst-content tt.download span:first-child,a .wy-menu-vertical li button.toctree-expand{display:inline-block;text-decoration:inherit}.btn .fa,.btn .icon,.btn .rst-content .admonition-title,.btn .rst-content .code-block-caption .headerlink,.btn .rst-content .eqno .headerlink,.btn .rst-content code.download span:first-child,.btn .rst-content dl dt .headerlink,.btn .rst-content h1 .headerlink,.btn .rst-content h2 .headerlink,.btn .rst-content h3 .headerlink,.btn .rst-content h4 .headerlink,.btn .rst-content h5 .headerlink,.btn .rst-content h6 .headerlink,.btn .rst-content p .headerlink,.btn .rst-content table>caption .headerlink,.btn .rst-content tt.download span:first-child,.btn .wy-menu-vertical li.current>a button.toctree-expand,.btn .wy-menu-vertical li.on a button.toctree-expand,.btn .wy-menu-vertical li button.toctree-expand,.nav .fa,.nav .icon,.nav .rst-content .admonition-title,.nav .rst-content .code-block-caption .headerlink,.nav .rst-content .eqno .headerlink,.nav .rst-content code.download span:first-child,.nav .rst-content dl dt .headerlink,.nav .rst-content h1 .headerlink,.nav .rst-content h2 .headerlink,.nav .rst-content h3 .headerlink,.nav .rst-content h4 .headerlink,.nav .rst-content h5 .headerlink,.nav .rst-content h6 .headerlink,.nav .rst-content p .headerlink,.nav .rst-content table>caption .headerlink,.nav .rst-content tt.download span:first-child,.nav .wy-menu-vertical li.current>a button.toctree-expand,.nav .wy-menu-vertical li.on a button.toctree-expand,.nav .wy-menu-vertical li button.toctree-expand,.rst-content .btn .admonition-title,.rst-content .code-block-caption .btn .headerlink,.rst-content .code-block-caption .nav .headerlink,.rst-content .eqno .btn .headerlink,.rst-content .eqno .nav .headerlink,.rst-content .nav .admonition-title,.rst-content code.download .btn span:first-child,.rst-content code.download .nav span:first-child,.rst-content dl dt .btn .headerlink,.rst-content dl dt .nav .headerlink,.rst-content h1 .btn .headerlink,.rst-content h1 .nav .headerlink,.rst-content h2 .btn .headerlink,.rst-content h2 .nav .headerlink,.rst-content h3 .btn .headerlink,.rst-content h3 .nav .headerlink,.rst-content h4 .btn .headerlink,.rst-content h4 .nav .headerlink,.rst-content h5 .btn 
.headerlink,.rst-content h5 .nav .headerlink,.rst-content h6 .btn .headerlink,.rst-content h6 .nav .headerlink,.rst-content p .btn .headerlink,.rst-content p .nav .headerlink,.rst-content table>caption .btn .headerlink,.rst-content table>caption .nav .headerlink,.rst-content tt.download .btn span:first-child,.rst-content tt.download .nav span:first-child,.wy-menu-vertical li .btn button.toctree-expand,.wy-menu-vertical li.current>a .btn button.toctree-expand,.wy-menu-vertical li.current>a .nav button.toctree-expand,.wy-menu-vertical li .nav button.toctree-expand,.wy-menu-vertical li.on a .btn button.toctree-expand,.wy-menu-vertical li.on a .nav button.toctree-expand{display:inline}.btn .fa-large.icon,.btn .fa.fa-large,.btn .rst-content .code-block-caption .fa-large.headerlink,.btn .rst-content .eqno .fa-large.headerlink,.btn .rst-content .fa-large.admonition-title,.btn .rst-content code.download span.fa-large:first-child,.btn .rst-content dl dt .fa-large.headerlink,.btn .rst-content h1 .fa-large.headerlink,.btn .rst-content h2 .fa-large.headerlink,.btn .rst-content h3 .fa-large.headerlink,.btn .rst-content h4 .fa-large.headerlink,.btn .rst-content h5 .fa-large.headerlink,.btn .rst-content h6 .fa-large.headerlink,.btn .rst-content p .fa-large.headerlink,.btn .rst-content table>caption .fa-large.headerlink,.btn .rst-content tt.download span.fa-large:first-child,.btn .wy-menu-vertical li button.fa-large.toctree-expand,.nav .fa-large.icon,.nav .fa.fa-large,.nav .rst-content .code-block-caption .fa-large.headerlink,.nav .rst-content .eqno .fa-large.headerlink,.nav .rst-content .fa-large.admonition-title,.nav .rst-content code.download span.fa-large:first-child,.nav .rst-content dl dt .fa-large.headerlink,.nav .rst-content h1 .fa-large.headerlink,.nav .rst-content h2 .fa-large.headerlink,.nav .rst-content h3 .fa-large.headerlink,.nav .rst-content h4 .fa-large.headerlink,.nav .rst-content h5 .fa-large.headerlink,.nav .rst-content h6 .fa-large.headerlink,.nav .rst-content p .fa-large.headerlink,.nav .rst-content table>caption .fa-large.headerlink,.nav .rst-content tt.download span.fa-large:first-child,.nav .wy-menu-vertical li button.fa-large.toctree-expand,.rst-content .btn .fa-large.admonition-title,.rst-content .code-block-caption .btn .fa-large.headerlink,.rst-content .code-block-caption .nav .fa-large.headerlink,.rst-content .eqno .btn .fa-large.headerlink,.rst-content .eqno .nav .fa-large.headerlink,.rst-content .nav .fa-large.admonition-title,.rst-content code.download .btn span.fa-large:first-child,.rst-content code.download .nav span.fa-large:first-child,.rst-content dl dt .btn .fa-large.headerlink,.rst-content dl dt .nav .fa-large.headerlink,.rst-content h1 .btn .fa-large.headerlink,.rst-content h1 .nav .fa-large.headerlink,.rst-content h2 .btn .fa-large.headerlink,.rst-content h2 .nav .fa-large.headerlink,.rst-content h3 .btn .fa-large.headerlink,.rst-content h3 .nav .fa-large.headerlink,.rst-content h4 .btn .fa-large.headerlink,.rst-content h4 .nav .fa-large.headerlink,.rst-content h5 .btn .fa-large.headerlink,.rst-content h5 .nav .fa-large.headerlink,.rst-content h6 .btn .fa-large.headerlink,.rst-content h6 .nav .fa-large.headerlink,.rst-content p .btn .fa-large.headerlink,.rst-content p .nav .fa-large.headerlink,.rst-content table>caption .btn .fa-large.headerlink,.rst-content table>caption .nav .fa-large.headerlink,.rst-content tt.download .btn span.fa-large:first-child,.rst-content tt.download .nav span.fa-large:first-child,.wy-menu-vertical li .btn 
button.fa-large.toctree-expand,.wy-menu-vertical li .nav button.fa-large.toctree-expand{line-height:.9em}.btn .fa-spin.icon,.btn .fa.fa-spin,.btn .rst-content .code-block-caption .fa-spin.headerlink,.btn .rst-content .eqno .fa-spin.headerlink,.btn .rst-content .fa-spin.admonition-title,.btn .rst-content code.download span.fa-spin:first-child,.btn .rst-content dl dt .fa-spin.headerlink,.btn .rst-content h1 .fa-spin.headerlink,.btn .rst-content h2 .fa-spin.headerlink,.btn .rst-content h3 .fa-spin.headerlink,.btn .rst-content h4 .fa-spin.headerlink,.btn .rst-content h5 .fa-spin.headerlink,.btn .rst-content h6 .fa-spin.headerlink,.btn .rst-content p .fa-spin.headerlink,.btn .rst-content table>caption .fa-spin.headerlink,.btn .rst-content tt.download span.fa-spin:first-child,.btn .wy-menu-vertical li button.fa-spin.toctree-expand,.nav .fa-spin.icon,.nav .fa.fa-spin,.nav .rst-content .code-block-caption .fa-spin.headerlink,.nav .rst-content .eqno .fa-spin.headerlink,.nav .rst-content .fa-spin.admonition-title,.nav .rst-content code.download span.fa-spin:first-child,.nav .rst-content dl dt .fa-spin.headerlink,.nav .rst-content h1 .fa-spin.headerlink,.nav .rst-content h2 .fa-spin.headerlink,.nav .rst-content h3 .fa-spin.headerlink,.nav .rst-content h4 .fa-spin.headerlink,.nav .rst-content h5 .fa-spin.headerlink,.nav .rst-content h6 .fa-spin.headerlink,.nav .rst-content p .fa-spin.headerlink,.nav .rst-content table>caption .fa-spin.headerlink,.nav .rst-content tt.download span.fa-spin:first-child,.nav .wy-menu-vertical li button.fa-spin.toctree-expand,.rst-content .btn .fa-spin.admonition-title,.rst-content .code-block-caption .btn .fa-spin.headerlink,.rst-content .code-block-caption .nav .fa-spin.headerlink,.rst-content .eqno .btn .fa-spin.headerlink,.rst-content .eqno .nav .fa-spin.headerlink,.rst-content .nav .fa-spin.admonition-title,.rst-content code.download .btn span.fa-spin:first-child,.rst-content code.download .nav span.fa-spin:first-child,.rst-content dl dt .btn .fa-spin.headerlink,.rst-content dl dt .nav .fa-spin.headerlink,.rst-content h1 .btn .fa-spin.headerlink,.rst-content h1 .nav .fa-spin.headerlink,.rst-content h2 .btn .fa-spin.headerlink,.rst-content h2 .nav .fa-spin.headerlink,.rst-content h3 .btn .fa-spin.headerlink,.rst-content h3 .nav .fa-spin.headerlink,.rst-content h4 .btn .fa-spin.headerlink,.rst-content h4 .nav .fa-spin.headerlink,.rst-content h5 .btn .fa-spin.headerlink,.rst-content h5 .nav .fa-spin.headerlink,.rst-content h6 .btn .fa-spin.headerlink,.rst-content h6 .nav .fa-spin.headerlink,.rst-content p .btn .fa-spin.headerlink,.rst-content p .nav .fa-spin.headerlink,.rst-content table>caption .btn .fa-spin.headerlink,.rst-content table>caption .nav .fa-spin.headerlink,.rst-content tt.download .btn span.fa-spin:first-child,.rst-content tt.download .nav span.fa-spin:first-child,.wy-menu-vertical li .btn button.fa-spin.toctree-expand,.wy-menu-vertical li .nav button.fa-spin.toctree-expand{display:inline-block}.btn.fa:before,.btn.icon:before,.rst-content .btn.admonition-title:before,.rst-content .code-block-caption .btn.headerlink:before,.rst-content .eqno .btn.headerlink:before,.rst-content code.download span.btn:first-child:before,.rst-content dl dt .btn.headerlink:before,.rst-content h1 .btn.headerlink:before,.rst-content h2 .btn.headerlink:before,.rst-content h3 .btn.headerlink:before,.rst-content h4 .btn.headerlink:before,.rst-content h5 .btn.headerlink:before,.rst-content h6 .btn.headerlink:before,.rst-content p .btn.headerlink:before,.rst-content table>caption 
.btn.headerlink:before,.rst-content tt.download span.btn:first-child:before,.wy-menu-vertical li button.btn.toctree-expand:before{opacity:.5;-webkit-transition:opacity .05s ease-in;-moz-transition:opacity .05s ease-in;transition:opacity .05s ease-in}.btn.fa:hover:before,.btn.icon:hover:before,.rst-content .btn.admonition-title:hover:before,.rst-content .code-block-caption .btn.headerlink:hover:before,.rst-content .eqno .btn.headerlink:hover:before,.rst-content code.download span.btn:first-child:hover:before,.rst-content dl dt .btn.headerlink:hover:before,.rst-content h1 .btn.headerlink:hover:before,.rst-content h2 .btn.headerlink:hover:before,.rst-content h3 .btn.headerlink:hover:before,.rst-content h4 .btn.headerlink:hover:before,.rst-content h5 .btn.headerlink:hover:before,.rst-content h6 .btn.headerlink:hover:before,.rst-content p .btn.headerlink:hover:before,.rst-content table>caption .btn.headerlink:hover:before,.rst-content tt.download span.btn:first-child:hover:before,.wy-menu-vertical li button.btn.toctree-expand:hover:before{opacity:1}.btn-mini .fa:before,.btn-mini .icon:before,.btn-mini .rst-content .admonition-title:before,.btn-mini .rst-content .code-block-caption .headerlink:before,.btn-mini .rst-content .eqno .headerlink:before,.btn-mini .rst-content code.download span:first-child:before,.btn-mini .rst-content dl dt .headerlink:before,.btn-mini .rst-content h1 .headerlink:before,.btn-mini .rst-content h2 .headerlink:before,.btn-mini .rst-content h3 .headerlink:before,.btn-mini .rst-content h4 .headerlink:before,.btn-mini .rst-content h5 .headerlink:before,.btn-mini .rst-content h6 .headerlink:before,.btn-mini .rst-content p .headerlink:before,.btn-mini .rst-content table>caption .headerlink:before,.btn-mini .rst-content tt.download span:first-child:before,.btn-mini .wy-menu-vertical li button.toctree-expand:before,.rst-content .btn-mini .admonition-title:before,.rst-content .code-block-caption .btn-mini .headerlink:before,.rst-content .eqno .btn-mini .headerlink:before,.rst-content code.download .btn-mini span:first-child:before,.rst-content dl dt .btn-mini .headerlink:before,.rst-content h1 .btn-mini .headerlink:before,.rst-content h2 .btn-mini .headerlink:before,.rst-content h3 .btn-mini .headerlink:before,.rst-content h4 .btn-mini .headerlink:before,.rst-content h5 .btn-mini .headerlink:before,.rst-content h6 .btn-mini .headerlink:before,.rst-content p .btn-mini .headerlink:before,.rst-content table>caption .btn-mini .headerlink:before,.rst-content tt.download .btn-mini span:first-child:before,.wy-menu-vertical li .btn-mini button.toctree-expand:before{font-size:14px;vertical-align:-15%}.rst-content .admonition,.rst-content .admonition-todo,.rst-content .attention,.rst-content .caution,.rst-content .danger,.rst-content .error,.rst-content .hint,.rst-content .important,.rst-content .note,.rst-content .seealso,.rst-content .tip,.rst-content .warning,.wy-alert{padding:12px;line-height:24px;margin-bottom:24px;background:#e7f2fa}.rst-content .admonition-title,.wy-alert-title{font-weight:700;display:block;color:#fff;background:#6ab0de;padding:6px 12px;margin:-12px -12px 12px}.rst-content .danger,.rst-content .error,.rst-content .wy-alert-danger.admonition,.rst-content .wy-alert-danger.admonition-todo,.rst-content .wy-alert-danger.attention,.rst-content .wy-alert-danger.caution,.rst-content .wy-alert-danger.hint,.rst-content .wy-alert-danger.important,.rst-content .wy-alert-danger.note,.rst-content .wy-alert-danger.seealso,.rst-content .wy-alert-danger.tip,.rst-content 
dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) dl:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt .headerlink{color:#404040;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils)>dt:first-child,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple)>dt:first-child{margin-top:0}html.writer-html4 .rst-content dl:not(.docutils) code.descclassname,html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descclassname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{background-color:transparent;border:none;padding:0;font-size:100%!important}html.writer-html4 .rst-content dl:not(.docutils) code.descname,html.writer-html4 .rst-content dl:not(.docutils) tt.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) code.descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) tt.descname{font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .optional,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .optional{display:inline-block;padding:0 4px;color:#000;font-weight:700}html.writer-html4 .rst-content dl:not(.docutils) .property,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .property{display:inline-block;padding-right:8px;max-width:100%}html.writer-html4 .rst-content dl:not(.docutils) .k,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .k{font-style:italic}html.writer-html4 .rst-content dl:not(.docutils) .descclassname,html.writer-html4 .rst-content dl:not(.docutils) .descname,html.writer-html4 .rst-content dl:not(.docutils) .sig-name,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descclassname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .descname,html.writer-html5 .rst-content dl[class]:not(.option-list):not(.field-list):not(.footnote):not(.citation):not(.glossary):not(.simple) .sig-name{font-family:SFMono-Regular,Menlo,Monaco,Consolas,Liberation Mono,Courier New,Courier,monospace;color:#000}.rst-content .viewcode-back,.rst-content .viewcode-link{display:inline-block;color:#27ae60;font-size:80%;padding-left:24px}.rst-content .viewcode-back{display:block;float:right}.rst-content p.rubric{margin-bottom:12px;font-weight:700}.rst-content 
code.download,.rst-content tt.download{background:inherit;padding:inherit;font-weight:400;font-family:inherit;font-size:inherit;color:inherit;border:inherit;white-space:inherit}.rst-content code.download span:first-child,.rst-content tt.download span:first-child{-webkit-font-smoothing:subpixel-antialiased}.rst-content code.download span:first-child:before,.rst-content tt.download span:first-child:before{margin-right:4px}.rst-content .guilabel,.rst-content .menuselection{font-size:80%;font-weight:700;border-radius:4px;padding:2.4px 6px;margin:auto 2px}.rst-content .guilabel,.rst-content .menuselection{border:1px solid #7fbbe3;background:#e7f2fa}.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>.kbd,.rst-content :not(dl.option-list)>:not(dt):not(kbd):not(.kbd)>kbd{color:inherit;font-size:80%;background-color:#fff;border:1px solid #a6a6a6;border-radius:4px;box-shadow:0 2px grey;padding:2.4px 6px;margin:auto 0}.rst-content .versionmodified{font-style:italic}@media screen and (max-width:480px){.rst-content .sidebar{width:100%}}span[id*=MathJax-Span]{color:#404040}.math{text-align:center}@font-face{font-family:Lato;src:url(fonts/lato-normal.woff2?bd03a2cc277bbbc338d464e679fe9942) format("woff2"),url(fonts/lato-normal.woff?27bd77b9162d388cb8d4c4217c7c5e2a) format("woff");font-weight:400;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold.woff2?cccb897485813c7c256901dbca54ecf2) format("woff2"),url(fonts/lato-bold.woff?d878b6c29b10beca227e9eef4246111b) format("woff");font-weight:700;font-style:normal;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-bold-italic.woff2?0b6bb6725576b072c5d0b02ecdd1900d) format("woff2"),url(fonts/lato-bold-italic.woff?9c7e4e9eb485b4a121c760e61bc3707c) format("woff");font-weight:700;font-style:italic;font-display:block}@font-face{font-family:Lato;src:url(fonts/lato-normal-italic.woff2?4eb103b4d12be57cb1d040ed5e162e9d) format("woff2"),url(fonts/lato-normal-italic.woff?f28f2d6482446544ef1ea1ccc6dd5892) format("woff");font-weight:400;font-style:italic;font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:400;src:url(fonts/Roboto-Slab-Regular.woff2?7abf5b8d04d26a2cafea937019bca958) format("woff2"),url(fonts/Roboto-Slab-Regular.woff?c1be9284088d487c5e3ff0a10a92e58c) format("woff");font-display:block}@font-face{font-family:Roboto Slab;font-style:normal;font-weight:700;src:url(fonts/Roboto-Slab-Bold.woff2?9984f4a9bda09be08e83f2506954adbe) format("woff2"),url(fonts/Roboto-Slab-Bold.woff?bed5564a116b05148e3b3bea6fb1162a) format("woff");font-display:block} \ No newline at end of file diff --git a/_static/doctools.js b/_static/doctools.js new file mode 100644 index 00000000..d06a71d7 --- /dev/null +++ b/_static/doctools.js @@ -0,0 +1,156 @@ +/* + * doctools.js + * ~~~~~~~~~~~ + * + * Base JavaScript utilities for all Sphinx HTML documentation. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +const BLACKLISTED_KEY_CONTROL_ELEMENTS = new Set([ + "TEXTAREA", + "INPUT", + "SELECT", + "BUTTON", +]); + +const _ready = (callback) => { + if (document.readyState !== "loading") { + callback(); + } else { + document.addEventListener("DOMContentLoaded", callback); + } +}; + +/** + * Small JavaScript module for the documentation. 
+ */ +const Documentation = { + init: () => { + Documentation.initDomainIndexTable(); + Documentation.initOnKeyListeners(); + }, + + /** + * i18n support + */ + TRANSLATIONS: {}, + PLURAL_EXPR: (n) => (n === 1 ? 0 : 1), + LOCALE: "unknown", + + // gettext and ngettext don't access this so that the functions + // can safely bound to a different name (_ = Documentation.gettext) + gettext: (string) => { + const translated = Documentation.TRANSLATIONS[string]; + switch (typeof translated) { + case "undefined": + return string; // no translation + case "string": + return translated; // translation exists + default: + return translated[0]; // (singular, plural) translation tuple exists + } + }, + + ngettext: (singular, plural, n) => { + const translated = Documentation.TRANSLATIONS[singular]; + if (typeof translated !== "undefined") + return translated[Documentation.PLURAL_EXPR(n)]; + return n === 1 ? singular : plural; + }, + + addTranslations: (catalog) => { + Object.assign(Documentation.TRANSLATIONS, catalog.messages); + Documentation.PLURAL_EXPR = new Function( + "n", + `return (${catalog.plural_expr})` + ); + Documentation.LOCALE = catalog.locale; + }, + + /** + * helper function to focus on search bar + */ + focusSearchBar: () => { + document.querySelectorAll("input[name=q]")[0]?.focus(); + }, + + /** + * Initialise the domain index toggle buttons + */ + initDomainIndexTable: () => { + const toggler = (el) => { + const idNumber = el.id.substr(7); + const toggledRows = document.querySelectorAll(`tr.cg-${idNumber}`); + if (el.src.substr(-9) === "minus.png") { + el.src = `${el.src.substr(0, el.src.length - 9)}plus.png`; + toggledRows.forEach((el) => (el.style.display = "none")); + } else { + el.src = `${el.src.substr(0, el.src.length - 8)}minus.png`; + toggledRows.forEach((el) => (el.style.display = "")); + } + }; + + const togglerElements = document.querySelectorAll("img.toggler"); + togglerElements.forEach((el) => + el.addEventListener("click", (event) => toggler(event.currentTarget)) + ); + togglerElements.forEach((el) => (el.style.display = "")); + if (DOCUMENTATION_OPTIONS.COLLAPSE_INDEX) togglerElements.forEach(toggler); + }, + + initOnKeyListeners: () => { + // only install a listener if it is really needed + if ( + !DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS && + !DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS + ) + return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.altKey || event.ctrlKey || event.metaKey) return; + + if (!event.shiftKey) { + switch (event.key) { + case "ArrowLeft": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const prevLink = document.querySelector('link[rel="prev"]'); + if (prevLink && prevLink.href) { + window.location.href = prevLink.href; + event.preventDefault(); + } + break; + case "ArrowRight": + if (!DOCUMENTATION_OPTIONS.NAVIGATION_WITH_KEYS) break; + + const nextLink = document.querySelector('link[rel="next"]'); + if (nextLink && nextLink.href) { + window.location.href = nextLink.href; + event.preventDefault(); + } + break; + } + } + + // some keyboard layouts may need Shift to get / + switch (event.key) { + case "/": + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) break; + Documentation.focusSearchBar(); + event.preventDefault(); + } + }); + }, +}; + +// quick alias for translations +const _ = Documentation.gettext; + +_ready(Documentation.init); diff --git 
a/_static/documentation_options.js b/_static/documentation_options.js new file mode 100644 index 00000000..e21c068c --- /dev/null +++ b/_static/documentation_options.js @@ -0,0 +1,13 @@ +const DOCUMENTATION_OPTIONS = { + VERSION: '0.1', + LANGUAGE: 'en', + COLLAPSE_INDEX: false, + BUILDER: 'html', + FILE_SUFFIX: '.html', + LINK_SUFFIX: '.html', + HAS_SOURCE: true, + SOURCELINK_SUFFIX: '.txt', + NAVIGATION_WITH_KEYS: false, + SHOW_SEARCH_SUMMARY: true, + ENABLE_SEARCH_SHORTCUTS: true, +}; \ No newline at end of file diff --git a/_static/file.png b/_static/file.png new file mode 100644 index 00000000..a858a410 Binary files /dev/null and b/_static/file.png differ diff --git a/_static/js/badge_only.js b/_static/js/badge_only.js new file mode 100644 index 00000000..526d7234 --- /dev/null +++ b/_static/js/badge_only.js @@ -0,0 +1 @@ +!function(e){var t={};function r(n){if(t[n])return t[n].exports;var o=t[n]={i:n,l:!1,exports:{}};return e[n].call(o.exports,o,o.exports,r),o.l=!0,o.exports}r.m=e,r.c=t,r.d=function(e,t,n){r.o(e,t)||Object.defineProperty(e,t,{enumerable:!0,get:n})},r.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},r.t=function(e,t){if(1&t&&(e=r(e)),8&t)return e;if(4&t&&"object"==typeof e&&e&&e.__esModule)return e;var n=Object.create(null);if(r.r(n),Object.defineProperty(n,"default",{enumerable:!0,value:e}),2&t&&"string"!=typeof e)for(var o in e)r.d(n,o,function(t){return e[t]}.bind(null,o));return n},r.n=function(e){var t=e&&e.__esModule?function(){return e.default}:function(){return e};return r.d(t,"a",t),t},r.o=function(e,t){return Object.prototype.hasOwnProperty.call(e,t)},r.p="",r(r.s=4)}({4:function(e,t,r){}}); \ No newline at end of file diff --git a/_static/js/html5shiv-printshiv.min.js b/_static/js/html5shiv-printshiv.min.js new file mode 100644 index 00000000..2b43bd06 --- /dev/null +++ b/_static/js/html5shiv-printshiv.min.js @@ -0,0 +1,4 @@ +/** +* @preserve HTML5 Shiv 3.7.3-pre | @afarkas @jdalton @jon_neal @rem | MIT/GPL2 Licensed +*/ +!function(a,b){function c(a,b){var c=a.createElement("p"),d=a.getElementsByTagName("head")[0]||a.documentElement;return c.innerHTML="x",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=y.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=y.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof a&&(a=a.join(" ")),y.elements=c+" "+a,j(b)}function f(a){var b=x[a[v]];return b||(b={},w++,a[v]=w,x[w]=b),b}function g(a,c,d){if(c||(c=b),q)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():u.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||t.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),q)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return y.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(y,b.frag)}function j(a){a||(a=b);var 
d=f(a);return!y.shivCSS||p||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),q||i(a,d),a}function k(a){for(var b,c=a.getElementsByTagName("*"),e=c.length,f=RegExp("^(?:"+d().join("|")+")$","i"),g=[];e--;)b=c[e],f.test(b.nodeName)&&g.push(b.applyElement(l(b)));return g}function l(a){for(var b,c=a.attributes,d=c.length,e=a.ownerDocument.createElement(A+":"+a.nodeName);d--;)b=c[d],b.specified&&e.setAttribute(b.nodeName,b.nodeValue);return e.style.cssText=a.style.cssText,e}function m(a){for(var b,c=a.split("{"),e=c.length,f=RegExp("(^|[\\s,>+~])("+d().join("|")+")(?=[[\\s,>+~#.:]|$)","gi"),g="$1"+A+"\\:$2";e--;)b=c[e]=c[e].split("}"),b[b.length-1]=b[b.length-1].replace(f,g),c[e]=b.join("}");return c.join("{")}function n(a){for(var b=a.length;b--;)a[b].removeNode()}function o(a){function b(){clearTimeout(g._removeSheetTimer),d&&d.removeNode(!0),d=null}var d,e,g=f(a),h=a.namespaces,i=a.parentWindow;return!B||a.printShived?a:("undefined"==typeof h[A]&&h.add(A),i.attachEvent("onbeforeprint",function(){b();for(var f,g,h,i=a.styleSheets,j=[],l=i.length,n=Array(l);l--;)n[l]=i[l];for(;h=n.pop();)if(!h.disabled&&z.test(h.media)){try{f=h.imports,g=f.length}catch(o){g=0}for(l=0;g>l;l++)n.push(f[l]);try{j.push(h.cssText)}catch(o){}}j=m(j.reverse().join("")),e=k(a),d=c(a,j)}),i.attachEvent("onafterprint",function(){n(e),clearTimeout(g._removeSheetTimer),g._removeSheetTimer=setTimeout(b,500)}),a.printShived=!0,a)}var p,q,r="3.7.3",s=a.html5||{},t=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,u=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,v="_html5shiv",w=0,x={};!function(){try{var a=b.createElement("a");a.innerHTML="",p="hidden"in a,q=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof a.createElement}()}catch(c){p=!0,q=!0}}();var y={elements:s.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time video",version:r,shivCSS:s.shivCSS!==!1,supportsUnknownElements:q,shivMethods:s.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=y,j(b);var z=/^$|\b(?:all|print)\b/,A="html5shiv",B=!q&&function(){var c=b.documentElement;return!("undefined"==typeof b.namespaces||"undefined"==typeof b.parentWindow||"undefined"==typeof c.applyElement||"undefined"==typeof c.removeNode||"undefined"==typeof a.attachEvent)}();y.type+=" print",y.shivPrint=o,o(b),"object"==typeof module&&module.exports&&(module.exports=y)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/_static/js/html5shiv.min.js b/_static/js/html5shiv.min.js new file mode 100644 index 00000000..cd1c674f --- /dev/null +++ b/_static/js/html5shiv.min.js @@ -0,0 +1,4 @@ +/** +* @preserve HTML5 Shiv 3.7.3 | @afarkas @jdalton @jon_neal @rem | MIT/GPL2 Licensed +*/ +!function(a,b){function c(a,b){var c=a.createElement("p"),d=a.getElementsByTagName("head")[0]||a.documentElement;return c.innerHTML="x",d.insertBefore(c.lastChild,d.firstChild)}function d(){var a=t.elements;return"string"==typeof a?a.split(" "):a}function e(a,b){var c=t.elements;"string"!=typeof c&&(c=c.join(" ")),"string"!=typeof 
a&&(a=a.join(" ")),t.elements=c+" "+a,j(b)}function f(a){var b=s[a[q]];return b||(b={},r++,a[q]=r,s[r]=b),b}function g(a,c,d){if(c||(c=b),l)return c.createElement(a);d||(d=f(c));var e;return e=d.cache[a]?d.cache[a].cloneNode():p.test(a)?(d.cache[a]=d.createElem(a)).cloneNode():d.createElem(a),!e.canHaveChildren||o.test(a)||e.tagUrn?e:d.frag.appendChild(e)}function h(a,c){if(a||(a=b),l)return a.createDocumentFragment();c=c||f(a);for(var e=c.frag.cloneNode(),g=0,h=d(),i=h.length;i>g;g++)e.createElement(h[g]);return e}function i(a,b){b.cache||(b.cache={},b.createElem=a.createElement,b.createFrag=a.createDocumentFragment,b.frag=b.createFrag()),a.createElement=function(c){return t.shivMethods?g(c,a,b):b.createElem(c)},a.createDocumentFragment=Function("h,f","return function(){var n=f.cloneNode(),c=n.createElement;h.shivMethods&&("+d().join().replace(/[\w\-:]+/g,function(a){return b.createElem(a),b.frag.createElement(a),'c("'+a+'")'})+");return n}")(t,b.frag)}function j(a){a||(a=b);var d=f(a);return!t.shivCSS||k||d.hasCSS||(d.hasCSS=!!c(a,"article,aside,dialog,figcaption,figure,footer,header,hgroup,main,nav,section{display:block}mark{background:#FF0;color:#000}template{display:none}")),l||i(a,d),a}var k,l,m="3.7.3-pre",n=a.html5||{},o=/^<|^(?:button|map|select|textarea|object|iframe|option|optgroup)$/i,p=/^(?:a|b|code|div|fieldset|h1|h2|h3|h4|h5|h6|i|label|li|ol|p|q|span|strong|style|table|tbody|td|th|tr|ul)$/i,q="_html5shiv",r=0,s={};!function(){try{var a=b.createElement("a");a.innerHTML="",k="hidden"in a,l=1==a.childNodes.length||function(){b.createElement("a");var a=b.createDocumentFragment();return"undefined"==typeof a.cloneNode||"undefined"==typeof a.createDocumentFragment||"undefined"==typeof a.createElement}()}catch(c){k=!0,l=!0}}();var t={elements:n.elements||"abbr article aside audio bdi canvas data datalist details dialog figcaption figure footer header hgroup main mark meter nav output picture progress section summary template time video",version:m,shivCSS:n.shivCSS!==!1,supportsUnknownElements:l,shivMethods:n.shivMethods!==!1,type:"default",shivDocument:j,createElement:g,createDocumentFragment:h,addElements:e};a.html5=t,j(b),"object"==typeof module&&module.exports&&(module.exports=t)}("undefined"!=typeof window?window:this,document); \ No newline at end of file diff --git a/_static/js/theme.js b/_static/js/theme.js new file mode 100644 index 00000000..1fddb6ee --- /dev/null +++ b/_static/js/theme.js @@ -0,0 +1 @@ +!function(n){var e={};function t(i){if(e[i])return e[i].exports;var o=e[i]={i:i,l:!1,exports:{}};return n[i].call(o.exports,o,o.exports,t),o.l=!0,o.exports}t.m=n,t.c=e,t.d=function(n,e,i){t.o(n,e)||Object.defineProperty(n,e,{enumerable:!0,get:i})},t.r=function(n){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(n,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(n,"__esModule",{value:!0})},t.t=function(n,e){if(1&e&&(n=t(n)),8&e)return n;if(4&e&&"object"==typeof n&&n&&n.__esModule)return n;var i=Object.create(null);if(t.r(i),Object.defineProperty(i,"default",{enumerable:!0,value:n}),2&e&&"string"!=typeof n)for(var o in n)t.d(i,o,function(e){return n[e]}.bind(null,o));return i},t.n=function(n){var e=n&&n.__esModule?function(){return n.default}:function(){return n};return t.d(e,"a",e),e},t.o=function(n,e){return Object.prototype.hasOwnProperty.call(n,e)},t.p="",t(t.s=0)}([function(n,e,t){t(1),n.exports=t(3)},function(n,e,t){(function(){var e="undefined"!=typeof 
window?window.jQuery:t(2);n.exports.ThemeNav={navBar:null,win:null,winScroll:!1,winResize:!1,linkScroll:!1,winPosition:0,winHeight:null,docHeight:null,isRunning:!1,enable:function(n){var t=this;void 0===n&&(n=!0),t.isRunning||(t.isRunning=!0,e((function(e){t.init(e),t.reset(),t.win.on("hashchange",t.reset),n&&t.win.on("scroll",(function(){t.linkScroll||t.winScroll||(t.winScroll=!0,requestAnimationFrame((function(){t.onScroll()})))})),t.win.on("resize",(function(){t.winResize||(t.winResize=!0,requestAnimationFrame((function(){t.onResize()})))})),t.onResize()})))},enableSticky:function(){this.enable(!0)},init:function(n){n(document);var e=this;this.navBar=n("div.wy-side-scroll:first"),this.win=n(window),n(document).on("click","[data-toggle='wy-nav-top']",(function(){n("[data-toggle='wy-nav-shift']").toggleClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift")})).on("click",".wy-menu-vertical .current ul li a",(function(){var t=n(this);n("[data-toggle='wy-nav-shift']").removeClass("shift"),n("[data-toggle='rst-versions']").toggleClass("shift"),e.toggleCurrent(t),e.hashChange()})).on("click","[data-toggle='rst-current-version']",(function(){n("[data-toggle='rst-versions']").toggleClass("shift-up")})),n("table.docutils:not(.field-list,.footnote,.citation)").wrap("
"),n("table.docutils.footnote").wrap("
"),n("table.docutils.citation").wrap("
"),n(".wy-menu-vertical ul").not(".simple").siblings("a").each((function(){var t=n(this);expand=n(''),expand.on("click",(function(n){return e.toggleCurrent(t),n.stopPropagation(),!1})),t.prepend(expand)}))},reset:function(){var n=encodeURI(window.location.hash)||"#";try{var e=$(".wy-menu-vertical"),t=e.find('[href="'+n+'"]');if(0===t.length){var i=$('.document [id="'+n.substring(1)+'"]').closest("div.section");0===(t=e.find('[href="#'+i.attr("id")+'"]')).length&&(t=e.find('[href="#"]'))}if(t.length>0){$(".wy-menu-vertical .current").removeClass("current").attr("aria-expanded","false"),t.addClass("current").attr("aria-expanded","true"),t.closest("li.toctree-l1").parent().addClass("current").attr("aria-expanded","true");for(let n=1;n<=10;n++)t.closest("li.toctree-l"+n).addClass("current").attr("aria-expanded","true");t[0].scrollIntoView()}}catch(n){console.log("Error expanding nav for anchor",n)}},onScroll:function(){this.winScroll=!1;var n=this.win.scrollTop(),e=n+this.winHeight,t=this.navBar.scrollTop()+(n-this.winPosition);n<0||e>this.docHeight||(this.navBar.scrollTop(t),this.winPosition=n)},onResize:function(){this.winResize=!1,this.winHeight=this.win.height(),this.docHeight=$(document).height()},hashChange:function(){this.linkScroll=!0,this.win.one("hashchange",(function(){this.linkScroll=!1}))},toggleCurrent:function(n){var e=n.closest("li");e.siblings("li.current").removeClass("current").attr("aria-expanded","false"),e.siblings().find("li.current").removeClass("current").attr("aria-expanded","false");var t=e.find("> ul li");t.length&&(t.removeClass("current").attr("aria-expanded","false"),e.toggleClass("current").attr("aria-expanded",(function(n,e){return"true"==e?"false":"true"})))}},"undefined"!=typeof window&&(window.SphinxRtdTheme={Navigation:n.exports.ThemeNav,StickyNav:n.exports.ThemeNav}),function(){for(var n=0,e=["ms","moz","webkit","o"],t=0;t0 + var meq1 = "^(" + C + ")?" + V + C + "(" + V + ")?$"; // [C]VC[V] is m=1 + var mgr1 = "^(" + C + ")?" + V + C + V + C; // [C]VCVC... is m>1 + var s_v = "^(" + C + ")?" 
+ v; // vowel in stem + + this.stemWord = function (w) { + var stem; + var suffix; + var firstch; + var origword = w; + + if (w.length < 3) + return w; + + var re; + var re2; + var re3; + var re4; + + firstch = w.substr(0,1); + if (firstch == "y") + w = firstch.toUpperCase() + w.substr(1); + + // Step 1a + re = /^(.+?)(ss|i)es$/; + re2 = /^(.+?)([^s])s$/; + + if (re.test(w)) + w = w.replace(re,"$1$2"); + else if (re2.test(w)) + w = w.replace(re2,"$1$2"); + + // Step 1b + re = /^(.+?)eed$/; + re2 = /^(.+?)(ed|ing)$/; + if (re.test(w)) { + var fp = re.exec(w); + re = new RegExp(mgr0); + if (re.test(fp[1])) { + re = /.$/; + w = w.replace(re,""); + } + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1]; + re2 = new RegExp(s_v); + if (re2.test(stem)) { + w = stem; + re2 = /(at|bl|iz)$/; + re3 = new RegExp("([^aeiouylsz])\\1$"); + re4 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re2.test(w)) + w = w + "e"; + else if (re3.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + else if (re4.test(w)) + w = w + "e"; + } + } + + // Step 1c + re = /^(.+?)y$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(s_v); + if (re.test(stem)) + w = stem + "i"; + } + + // Step 2 + re = /^(.+?)(ational|tional|enci|anci|izer|bli|alli|entli|eli|ousli|ization|ation|ator|alism|iveness|fulness|ousness|aliti|iviti|biliti|logi)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step2list[suffix]; + } + + // Step 3 + re = /^(.+?)(icate|ative|alize|iciti|ical|ful|ness)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + suffix = fp[2]; + re = new RegExp(mgr0); + if (re.test(stem)) + w = stem + step3list[suffix]; + } + + // Step 4 + re = /^(.+?)(al|ance|ence|er|ic|able|ible|ant|ement|ment|ent|ou|ism|ate|iti|ous|ive|ize)$/; + re2 = /^(.+?)(s|t)(ion)$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + if (re.test(stem)) + w = stem; + } + else if (re2.test(w)) { + var fp = re2.exec(w); + stem = fp[1] + fp[2]; + re2 = new RegExp(mgr1); + if (re2.test(stem)) + w = stem; + } + + // Step 5 + re = /^(.+?)e$/; + if (re.test(w)) { + var fp = re.exec(w); + stem = fp[1]; + re = new RegExp(mgr1); + re2 = new RegExp(meq1); + re3 = new RegExp("^" + C + v + "[^aeiouwxy]$"); + if (re.test(stem) || (re2.test(stem) && !(re3.test(stem)))) + w = stem; + } + re = /ll$/; + re2 = new RegExp(mgr1); + if (re.test(w) && re2.test(w)) { + re = /.$/; + w = w.replace(re,""); + } + + // and turn initial Y back to y + if (firstch == "y") + w = firstch.toLowerCase() + w.substr(1); + return w; + } +} + diff --git a/_static/minus.png b/_static/minus.png new file mode 100644 index 00000000..d96755fd Binary files /dev/null and b/_static/minus.png differ diff --git a/_static/plus.png b/_static/plus.png new file mode 100644 index 00000000..7107cec9 Binary files /dev/null and b/_static/plus.png differ diff --git a/_static/pygments.css b/_static/pygments.css new file mode 100644 index 00000000..0d49244e --- /dev/null +++ b/_static/pygments.css @@ -0,0 +1,75 @@ +pre { line-height: 125%; } +td.linenos .normal { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +span.linenos { color: inherit; background-color: transparent; padding-left: 5px; padding-right: 5px; } +td.linenos .special { color: #000000; background-color: #ffffc0; padding-left: 5px; padding-right: 5px; } +span.linenos.special { color: #000000; background-color: #ffffc0; padding-left: 
5px; padding-right: 5px; } +.highlight .hll { background-color: #ffffcc } +.highlight { background: #eeffcc; } +.highlight .c { color: #408090; font-style: italic } /* Comment */ +.highlight .err { border: 1px solid #FF0000 } /* Error */ +.highlight .k { color: #007020; font-weight: bold } /* Keyword */ +.highlight .o { color: #666666 } /* Operator */ +.highlight .ch { color: #408090; font-style: italic } /* Comment.Hashbang */ +.highlight .cm { color: #408090; font-style: italic } /* Comment.Multiline */ +.highlight .cp { color: #007020 } /* Comment.Preproc */ +.highlight .cpf { color: #408090; font-style: italic } /* Comment.PreprocFile */ +.highlight .c1 { color: #408090; font-style: italic } /* Comment.Single */ +.highlight .cs { color: #408090; background-color: #fff0f0 } /* Comment.Special */ +.highlight .gd { color: #A00000 } /* Generic.Deleted */ +.highlight .ge { font-style: italic } /* Generic.Emph */ +.highlight .ges { font-weight: bold; font-style: italic } /* Generic.EmphStrong */ +.highlight .gr { color: #FF0000 } /* Generic.Error */ +.highlight .gh { color: #000080; font-weight: bold } /* Generic.Heading */ +.highlight .gi { color: #00A000 } /* Generic.Inserted */ +.highlight .go { color: #333333 } /* Generic.Output */ +.highlight .gp { color: #c65d09; font-weight: bold } /* Generic.Prompt */ +.highlight .gs { font-weight: bold } /* Generic.Strong */ +.highlight .gu { color: #800080; font-weight: bold } /* Generic.Subheading */ +.highlight .gt { color: #0044DD } /* Generic.Traceback */ +.highlight .kc { color: #007020; font-weight: bold } /* Keyword.Constant */ +.highlight .kd { color: #007020; font-weight: bold } /* Keyword.Declaration */ +.highlight .kn { color: #007020; font-weight: bold } /* Keyword.Namespace */ +.highlight .kp { color: #007020 } /* Keyword.Pseudo */ +.highlight .kr { color: #007020; font-weight: bold } /* Keyword.Reserved */ +.highlight .kt { color: #902000 } /* Keyword.Type */ +.highlight .m { color: #208050 } /* Literal.Number */ +.highlight .s { color: #4070a0 } /* Literal.String */ +.highlight .na { color: #4070a0 } /* Name.Attribute */ +.highlight .nb { color: #007020 } /* Name.Builtin */ +.highlight .nc { color: #0e84b5; font-weight: bold } /* Name.Class */ +.highlight .no { color: #60add5 } /* Name.Constant */ +.highlight .nd { color: #555555; font-weight: bold } /* Name.Decorator */ +.highlight .ni { color: #d55537; font-weight: bold } /* Name.Entity */ +.highlight .ne { color: #007020 } /* Name.Exception */ +.highlight .nf { color: #06287e } /* Name.Function */ +.highlight .nl { color: #002070; font-weight: bold } /* Name.Label */ +.highlight .nn { color: #0e84b5; font-weight: bold } /* Name.Namespace */ +.highlight .nt { color: #062873; font-weight: bold } /* Name.Tag */ +.highlight .nv { color: #bb60d5 } /* Name.Variable */ +.highlight .ow { color: #007020; font-weight: bold } /* Operator.Word */ +.highlight .w { color: #bbbbbb } /* Text.Whitespace */ +.highlight .mb { color: #208050 } /* Literal.Number.Bin */ +.highlight .mf { color: #208050 } /* Literal.Number.Float */ +.highlight .mh { color: #208050 } /* Literal.Number.Hex */ +.highlight .mi { color: #208050 } /* Literal.Number.Integer */ +.highlight .mo { color: #208050 } /* Literal.Number.Oct */ +.highlight .sa { color: #4070a0 } /* Literal.String.Affix */ +.highlight .sb { color: #4070a0 } /* Literal.String.Backtick */ +.highlight .sc { color: #4070a0 } /* Literal.String.Char */ +.highlight .dl { color: #4070a0 } /* Literal.String.Delimiter */ +.highlight .sd { color: #4070a0; 
font-style: italic } /* Literal.String.Doc */ +.highlight .s2 { color: #4070a0 } /* Literal.String.Double */ +.highlight .se { color: #4070a0; font-weight: bold } /* Literal.String.Escape */ +.highlight .sh { color: #4070a0 } /* Literal.String.Heredoc */ +.highlight .si { color: #70a0d0; font-style: italic } /* Literal.String.Interpol */ +.highlight .sx { color: #c65d09 } /* Literal.String.Other */ +.highlight .sr { color: #235388 } /* Literal.String.Regex */ +.highlight .s1 { color: #4070a0 } /* Literal.String.Single */ +.highlight .ss { color: #517918 } /* Literal.String.Symbol */ +.highlight .bp { color: #007020 } /* Name.Builtin.Pseudo */ +.highlight .fm { color: #06287e } /* Name.Function.Magic */ +.highlight .vc { color: #bb60d5 } /* Name.Variable.Class */ +.highlight .vg { color: #bb60d5 } /* Name.Variable.Global */ +.highlight .vi { color: #bb60d5 } /* Name.Variable.Instance */ +.highlight .vm { color: #bb60d5 } /* Name.Variable.Magic */ +.highlight .il { color: #208050 } /* Literal.Number.Integer.Long */ \ No newline at end of file diff --git a/_static/searchtools.js b/_static/searchtools.js new file mode 100644 index 00000000..7918c3fa --- /dev/null +++ b/_static/searchtools.js @@ -0,0 +1,574 @@ +/* + * searchtools.js + * ~~~~~~~~~~~~~~~~ + * + * Sphinx JavaScript utilities for the full-text search. + * + * :copyright: Copyright 2007-2023 by the Sphinx team, see AUTHORS. + * :license: BSD, see LICENSE for details. + * + */ +"use strict"; + +/** + * Simple result scoring code. + */ +if (typeof Scorer === "undefined") { + var Scorer = { + // Implement the following function to further tweak the score for each result + // The function takes a result array [docname, title, anchor, descr, score, filename] + // and returns the new score. + /* + score: result => { + const [docname, title, anchor, descr, score, filename] = result + return score + }, + */ + + // query matches the full name of an object + objNameMatch: 11, + // or matches in the last dotted part of the object name + objPartialMatch: 6, + // Additive scores depending on the priority of the object + objPrio: { + 0: 15, // used to be importantResults + 1: 5, // used to be objectResults + 2: -5, // used to be unimportantResults + }, + // Used when the priority is not in the mapping. 
+ objPrioDefault: 0, + + // query found in title + title: 15, + partialTitle: 7, + // query found in terms + term: 5, + partialTerm: 2, + }; +} + +const _removeChildren = (element) => { + while (element && element.lastChild) element.removeChild(element.lastChild); +}; + +/** + * See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions#escaping + */ +const _escapeRegExp = (string) => + string.replace(/[.*+\-?^${}()|[\]\\]/g, "\\$&"); // $& means the whole matched string + +const _displayItem = (item, searchTerms, highlightTerms) => { + const docBuilder = DOCUMENTATION_OPTIONS.BUILDER; + const docFileSuffix = DOCUMENTATION_OPTIONS.FILE_SUFFIX; + const docLinkSuffix = DOCUMENTATION_OPTIONS.LINK_SUFFIX; + const showSearchSummary = DOCUMENTATION_OPTIONS.SHOW_SEARCH_SUMMARY; + const contentRoot = document.documentElement.dataset.content_root; + + const [docName, title, anchor, descr, score, _filename] = item; + + let listItem = document.createElement("li"); + let requestUrl; + let linkUrl; + if (docBuilder === "dirhtml") { + // dirhtml builder + let dirname = docName + "/"; + if (dirname.match(/\/index\/$/)) + dirname = dirname.substring(0, dirname.length - 6); + else if (dirname === "index/") dirname = ""; + requestUrl = contentRoot + dirname; + linkUrl = requestUrl; + } else { + // normal html builders + requestUrl = contentRoot + docName + docFileSuffix; + linkUrl = docName + docLinkSuffix; + } + let linkEl = listItem.appendChild(document.createElement("a")); + linkEl.href = linkUrl + anchor; + linkEl.dataset.score = score; + linkEl.innerHTML = title; + if (descr) { + listItem.appendChild(document.createElement("span")).innerHTML = + " (" + descr + ")"; + // highlight search terms in the description + if (SPHINX_HIGHLIGHT_ENABLED) // set in sphinx_highlight.js + highlightTerms.forEach((term) => _highlightText(listItem, term, "highlighted")); + } + else if (showSearchSummary) + fetch(requestUrl) + .then((responseData) => responseData.text()) + .then((data) => { + if (data) + listItem.appendChild( + Search.makeSearchSummary(data, searchTerms) + ); + // highlight search terms in the summary + if (SPHINX_HIGHLIGHT_ENABLED) // set in sphinx_highlight.js + highlightTerms.forEach((term) => _highlightText(listItem, term, "highlighted")); + }); + Search.output.appendChild(listItem); +}; +const _finishSearch = (resultCount) => { + Search.stopPulse(); + Search.title.innerText = _("Search Results"); + if (!resultCount) + Search.status.innerText = Documentation.gettext( + "Your search did not match any documents. Please make sure that all words are spelled correctly and that you've selected enough categories." + ); + else + Search.status.innerText = _( + `Search finished, found ${resultCount} page(s) matching the search query.` + ); +}; +const _displayNextItem = ( + results, + resultCount, + searchTerms, + highlightTerms, +) => { + // results left, load the summary and display it + // this is intended to be dynamic (don't sub resultsCount) + if (results.length) { + _displayItem(results.pop(), searchTerms, highlightTerms); + setTimeout( + () => _displayNextItem(results, resultCount, searchTerms, highlightTerms), + 5 + ); + } + // search finished, update title and status message + else _finishSearch(resultCount); +}; + +/** + * Default splitQuery function. Can be overridden in ``sphinx.search`` with a + * custom function per language. 
+ * + * The regular expression works by splitting the string on consecutive characters + * that are not Unicode letters, numbers, underscores, or emoji characters. + * This is the same as ``\W+`` in Python, preserving the surrogate pair area. + */ +if (typeof splitQuery === "undefined") { + var splitQuery = (query) => query + .split(/[^\p{Letter}\p{Number}_\p{Emoji_Presentation}]+/gu) + .filter(term => term) // remove remaining empty strings +} + +/** + * Search Module + */ +const Search = { + _index: null, + _queued_query: null, + _pulse_status: -1, + + htmlToText: (htmlString) => { + const htmlElement = new DOMParser().parseFromString(htmlString, 'text/html'); + htmlElement.querySelectorAll(".headerlink").forEach((el) => { el.remove() }); + const docContent = htmlElement.querySelector('[role="main"]'); + if (docContent !== undefined) return docContent.textContent; + console.warn( + "Content block not found. Sphinx search tries to obtain it via '[role=main]'. Could you check your theme or template." + ); + return ""; + }, + + init: () => { + const query = new URLSearchParams(window.location.search).get("q"); + document + .querySelectorAll('input[name="q"]') + .forEach((el) => (el.value = query)); + if (query) Search.performSearch(query); + }, + + loadIndex: (url) => + (document.body.appendChild(document.createElement("script")).src = url), + + setIndex: (index) => { + Search._index = index; + if (Search._queued_query !== null) { + const query = Search._queued_query; + Search._queued_query = null; + Search.query(query); + } + }, + + hasIndex: () => Search._index !== null, + + deferQuery: (query) => (Search._queued_query = query), + + stopPulse: () => (Search._pulse_status = -1), + + startPulse: () => { + if (Search._pulse_status >= 0) return; + + const pulse = () => { + Search._pulse_status = (Search._pulse_status + 1) % 4; + Search.dots.innerText = ".".repeat(Search._pulse_status); + if (Search._pulse_status >= 0) window.setTimeout(pulse, 500); + }; + pulse(); + }, + + /** + * perform a search for something (or wait until index is loaded) + */ + performSearch: (query) => { + // create the required interface elements + const searchText = document.createElement("h2"); + searchText.textContent = _("Searching"); + const searchSummary = document.createElement("p"); + searchSummary.classList.add("search-summary"); + searchSummary.innerText = ""; + const searchList = document.createElement("ul"); + searchList.classList.add("search"); + + const out = document.getElementById("search-results"); + Search.title = out.appendChild(searchText); + Search.dots = Search.title.appendChild(document.createElement("span")); + Search.status = out.appendChild(searchSummary); + Search.output = out.appendChild(searchList); + + const searchProgress = document.getElementById("search-progress"); + // Some themes don't use the search progress node + if (searchProgress) { + searchProgress.innerText = _("Preparing search..."); + } + Search.startPulse(); + + // index already loaded, the browser was quick! 
+ if (Search.hasIndex()) Search.query(query); + else Search.deferQuery(query); + }, + + /** + * execute search (requires search index to be loaded) + */ + query: (query) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + const allTitles = Search._index.alltitles; + const indexEntries = Search._index.indexentries; + + // stem the search terms and add them to the correct list + const stemmer = new Stemmer(); + const searchTerms = new Set(); + const excludedTerms = new Set(); + const highlightTerms = new Set(); + const objectTerms = new Set(splitQuery(query.toLowerCase().trim())); + splitQuery(query.trim()).forEach((queryTerm) => { + const queryTermLower = queryTerm.toLowerCase(); + + // maybe skip this "word" + // stopwords array is from language_data.js + if ( + stopwords.indexOf(queryTermLower) !== -1 || + queryTerm.match(/^\d+$/) + ) + return; + + // stem the word + let word = stemmer.stemWord(queryTermLower); + // select the correct list + if (word[0] === "-") excludedTerms.add(word.substr(1)); + else { + searchTerms.add(word); + highlightTerms.add(queryTermLower); + } + }); + + if (SPHINX_HIGHLIGHT_ENABLED) { // set in sphinx_highlight.js + localStorage.setItem("sphinx_highlight_terms", [...highlightTerms].join(" ")) + } + + // console.debug("SEARCH: searching for:"); + // console.info("required: ", [...searchTerms]); + // console.info("excluded: ", [...excludedTerms]); + + // array of [docname, title, anchor, descr, score, filename] + let results = []; + _removeChildren(document.getElementById("search-progress")); + + const queryLower = query.toLowerCase(); + for (const [title, foundTitles] of Object.entries(allTitles)) { + if (title.toLowerCase().includes(queryLower) && (queryLower.length >= title.length/2)) { + for (const [file, id] of foundTitles) { + let score = Math.round(100 * queryLower.length / title.length) + results.push([ + docNames[file], + titles[file] !== title ? `${titles[file]} > ${title}` : title, + id !== null ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // search for explicit entries in index directives + for (const [entry, foundEntries] of Object.entries(indexEntries)) { + if (entry.includes(queryLower) && (queryLower.length >= entry.length/2)) { + for (const [file, id] of foundEntries) { + let score = Math.round(100 * queryLower.length / entry.length) + results.push([ + docNames[file], + titles[file], + id ? "#" + id : "", + null, + score, + filenames[file], + ]); + } + } + } + + // lookup as object + objectTerms.forEach((term) => + results.push(...Search.performObjectSearch(term, objectTerms)) + ); + + // lookup as search terms in fulltext + results.push(...Search.performTermsSearch(searchTerms, excludedTerms)); + + // let the scorer override scores with a custom scoring function + if (Scorer.score) results.forEach((item) => (item[4] = Scorer.score(item))); + + // now sort the results by score (in opposite order of appearance, since the + // display function below uses pop() to retrieve items) and then + // alphabetically + results.sort((a, b) => { + const leftScore = a[4]; + const rightScore = b[4]; + if (leftScore === rightScore) { + // same score: sort alphabetically + const leftTitle = a[1].toLowerCase(); + const rightTitle = b[1].toLowerCase(); + if (leftTitle === rightTitle) return 0; + return leftTitle > rightTitle ? -1 : 1; // inverted is intentional + } + return leftScore > rightScore ? 
1 : -1; + }); + + // remove duplicate search results + // note the reversing of results, so that in the case of duplicates, the highest-scoring entry is kept + let seen = new Set(); + results = results.reverse().reduce((acc, result) => { + let resultStr = result.slice(0, 4).concat([result[5]]).map(v => String(v)).join(','); + if (!seen.has(resultStr)) { + acc.push(result); + seen.add(resultStr); + } + return acc; + }, []); + + results = results.reverse(); + + // for debugging + //Search.lastresults = results.slice(); // a copy + // console.info("search results:", Search.lastresults); + + // print the results + _displayNextItem(results, results.length, searchTerms, highlightTerms); + }, + + /** + * search for object names + */ + performObjectSearch: (object, objectTerms) => { + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const objects = Search._index.objects; + const objNames = Search._index.objnames; + const titles = Search._index.titles; + + const results = []; + + const objectSearchCallback = (prefix, match) => { + const name = match[4] + const fullname = (prefix ? prefix + "." : "") + name; + const fullnameLower = fullname.toLowerCase(); + if (fullnameLower.indexOf(object) < 0) return; + + let score = 0; + const parts = fullnameLower.split("."); + + // check for different match types: exact matches of full name or + // "last name" (i.e. last dotted part) + if (fullnameLower === object || parts.slice(-1)[0] === object) + score += Scorer.objNameMatch; + else if (parts.slice(-1)[0].indexOf(object) > -1) + score += Scorer.objPartialMatch; // matches in last name + + const objName = objNames[match[1]][2]; + const title = titles[match[0]]; + + // If more than one term searched for, we require other words to be + // found in the name/title/description + const otherTerms = new Set(objectTerms); + otherTerms.delete(object); + if (otherTerms.size > 0) { + const haystack = `${prefix} ${name} ${objName} ${title}`.toLowerCase(); + if ( + [...otherTerms].some((otherTerm) => haystack.indexOf(otherTerm) < 0) + ) + return; + } + + let anchor = match[3]; + if (anchor === "") anchor = fullname; + else if (anchor === "-") anchor = objNames[match[1]][1] + "-" + fullname; + + const descr = objName + _(", in ") + title; + + // add custom score for some objects according to scorer + if (Scorer.objPrio.hasOwnProperty(match[2])) + score += Scorer.objPrio[match[2]]; + else score += Scorer.objPrioDefault; + + results.push([ + docNames[match[0]], + fullname, + "#" + anchor, + descr, + score, + filenames[match[0]], + ]); + }; + Object.keys(objects).forEach((prefix) => + objects[prefix].forEach((array) => + objectSearchCallback(prefix, array) + ) + ); + return results; + }, + + /** + * search for full-text terms in the index + */ + performTermsSearch: (searchTerms, excludedTerms) => { + // prepare search + const terms = Search._index.terms; + const titleTerms = Search._index.titleterms; + const filenames = Search._index.filenames; + const docNames = Search._index.docnames; + const titles = Search._index.titles; + + const scoreMap = new Map(); + const fileMap = new Map(); + + // perform the search on the required terms + searchTerms.forEach((word) => { + const files = []; + const arr = [ + { files: terms[word], score: Scorer.term }, + { files: titleTerms[word], score: Scorer.title }, + ]; + // add support for partial matches + if (word.length > 2) { + const escapedWord = _escapeRegExp(word); + Object.keys(terms).forEach((term) => { + if (term.match(escapedWord) && 
!terms[word]) + arr.push({ files: terms[term], score: Scorer.partialTerm }); + }); + Object.keys(titleTerms).forEach((term) => { + if (term.match(escapedWord) && !titleTerms[word]) + arr.push({ files: titleTerms[word], score: Scorer.partialTitle }); + }); + } + + // no match but word was a required one + if (arr.every((record) => record.files === undefined)) return; + + // found search word in contents + arr.forEach((record) => { + if (record.files === undefined) return; + + let recordFiles = record.files; + if (recordFiles.length === undefined) recordFiles = [recordFiles]; + files.push(...recordFiles); + + // set score for the word in each file + recordFiles.forEach((file) => { + if (!scoreMap.has(file)) scoreMap.set(file, {}); + scoreMap.get(file)[word] = record.score; + }); + }); + + // create the mapping + files.forEach((file) => { + if (fileMap.has(file) && fileMap.get(file).indexOf(word) === -1) + fileMap.get(file).push(word); + else fileMap.set(file, [word]); + }); + }); + + // now check if the files don't contain excluded terms + const results = []; + for (const [file, wordList] of fileMap) { + // check if all requirements are matched + + // as search terms with length < 3 are discarded + const filteredTermCount = [...searchTerms].filter( + (term) => term.length > 2 + ).length; + if ( + wordList.length !== searchTerms.size && + wordList.length !== filteredTermCount + ) + continue; + + // ensure that none of the excluded terms is in the search result + if ( + [...excludedTerms].some( + (term) => + terms[term] === file || + titleTerms[term] === file || + (terms[term] || []).includes(file) || + (titleTerms[term] || []).includes(file) + ) + ) + break; + + // select one (max) score for the file. + const score = Math.max(...wordList.map((w) => scoreMap.get(file)[w])); + // add result to the result list + results.push([ + docNames[file], + titles[file], + "", + null, + score, + filenames[file], + ]); + } + return results; + }, + + /** + * helper function to return a node containing the + * search summary for a given text. keywords is a list + * of stemmed words. + */ + makeSearchSummary: (htmlText, keywords) => { + const text = Search.htmlToText(htmlText); + if (text === "") return null; + + const textLower = text.toLowerCase(); + const actualStartPosition = [...keywords] + .map((k) => textLower.indexOf(k.toLowerCase())) + .filter((i) => i > -1) + .slice(-1)[0]; + const startWithContext = Math.max(actualStartPosition - 120, 0); + + const top = startWithContext === 0 ? "" : "..."; + const tail = startWithContext + 240 < text.length ? "..." : ""; + + let summary = document.createElement("p"); + summary.classList.add("context"); + summary.textContent = top + text.substr(startWithContext, 240).trim() + tail; + + return summary; + }, +}; + +_ready(Search.init); diff --git a/_static/sphinx_highlight.js b/_static/sphinx_highlight.js new file mode 100644 index 00000000..8a96c69a --- /dev/null +++ b/_static/sphinx_highlight.js @@ -0,0 +1,154 @@ +/* Highlighting utilities for Sphinx HTML documentation. */ +"use strict"; + +const SPHINX_HIGHLIGHT_ENABLED = true + +/** + * highlight a given string on a node by wrapping it in + * span elements with the given class name. 
+ */ +const _highlight = (node, addItems, text, className) => { + if (node.nodeType === Node.TEXT_NODE) { + const val = node.nodeValue; + const parent = node.parentNode; + const pos = val.toLowerCase().indexOf(text); + if ( + pos >= 0 && + !parent.classList.contains(className) && + !parent.classList.contains("nohighlight") + ) { + let span; + + const closestNode = parent.closest("body, svg, foreignObject"); + const isInSVG = closestNode && closestNode.matches("svg"); + if (isInSVG) { + span = document.createElementNS("http://www.w3.org/2000/svg", "tspan"); + } else { + span = document.createElement("span"); + span.classList.add(className); + } + + span.appendChild(document.createTextNode(val.substr(pos, text.length))); + const rest = document.createTextNode(val.substr(pos + text.length)); + parent.insertBefore( + span, + parent.insertBefore( + rest, + node.nextSibling + ) + ); + node.nodeValue = val.substr(0, pos); + /* There may be more occurrences of search term in this node. So call this + * function recursively on the remaining fragment. + */ + _highlight(rest, addItems, text, className); + + if (isInSVG) { + const rect = document.createElementNS( + "http://www.w3.org/2000/svg", + "rect" + ); + const bbox = parent.getBBox(); + rect.x.baseVal.value = bbox.x; + rect.y.baseVal.value = bbox.y; + rect.width.baseVal.value = bbox.width; + rect.height.baseVal.value = bbox.height; + rect.setAttribute("class", className); + addItems.push({ parent: parent, target: rect }); + } + } + } else if (node.matches && !node.matches("button, select, textarea")) { + node.childNodes.forEach((el) => _highlight(el, addItems, text, className)); + } +}; +const _highlightText = (thisNode, text, className) => { + let addItems = []; + _highlight(thisNode, addItems, text, className); + addItems.forEach((obj) => + obj.parent.insertAdjacentElement("beforebegin", obj.target) + ); +}; + +/** + * Small JavaScript module for the documentation. + */ +const SphinxHighlight = { + + /** + * highlight the search words provided in localstorage in the text + */ + highlightSearchWords: () => { + if (!SPHINX_HIGHLIGHT_ENABLED) return; // bail if no highlight + + // get and clear terms from localstorage + const url = new URL(window.location); + const highlight = + localStorage.getItem("sphinx_highlight_terms") + || url.searchParams.get("highlight") + || ""; + localStorage.removeItem("sphinx_highlight_terms") + url.searchParams.delete("highlight"); + window.history.replaceState({}, "", url); + + // get individual terms from highlight string + const terms = highlight.toLowerCase().split(/\s+/).filter(x => x); + if (terms.length === 0) return; // nothing to do + + // There should never be more than one element matching "div.body" + const divBody = document.querySelectorAll("div.body"); + const body = divBody.length ? 
divBody[0] : document.querySelector("body"); + window.setTimeout(() => { + terms.forEach((term) => _highlightText(body, term, "highlighted")); + }, 10); + + const searchBox = document.getElementById("searchbox"); + if (searchBox === null) return; + searchBox.appendChild( + document + .createRange() + .createContextualFragment( + '" + ) + ); + }, + + /** + * helper function to hide the search marks again + */ + hideSearchWords: () => { + document + .querySelectorAll("#searchbox .highlight-link") + .forEach((el) => el.remove()); + document + .querySelectorAll("span.highlighted") + .forEach((el) => el.classList.remove("highlighted")); + localStorage.removeItem("sphinx_highlight_terms") + }, + + initEscapeListener: () => { + // only install a listener if it is really needed + if (!DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS) return; + + document.addEventListener("keydown", (event) => { + // bail for input elements + if (BLACKLISTED_KEY_CONTROL_ELEMENTS.has(document.activeElement.tagName)) return; + // bail with special keys + if (event.shiftKey || event.altKey || event.ctrlKey || event.metaKey) return; + if (DOCUMENTATION_OPTIONS.ENABLE_SEARCH_SHORTCUTS && (event.key === "Escape")) { + SphinxHighlight.hideSearchWords(); + event.preventDefault(); + } + }); + }, +}; + +_ready(() => { + /* Do not call highlightSearchWords() when we are on the search page. + * It will highlight words from the *previous* search query. + */ + if (typeof Search === "undefined") SphinxHighlight.highlightSearchWords(); + SphinxHighlight.initEscapeListener(); +}); diff --git a/_static/style.css b/_static/style.css new file mode 100644 index 00000000..408f5ba8 --- /dev/null +++ b/_static/style.css @@ -0,0 +1,10 @@ +/* override table no-wrap */ +.wy-table-responsive table td, .wy-table-responsive table th { + white-space: normal; +} +.wy-nav-content { + max-width: none; +} +.highlight { + background: #FBFDFD; +} diff --git a/advanced-guide.html b/advanced-guide.html new file mode 100644 index 00000000..5afb04d9 --- /dev/null +++ b/advanced-guide.html @@ -0,0 +1,184 @@ + + + + + + + Advanced Guide — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/advanced-guide/upgrade-cost-analysis-generic-models.html b/advanced-guide/upgrade-cost-analysis-generic-models.html new file mode 100644 index 00000000..8c77f61d --- /dev/null +++ b/advanced-guide/upgrade-cost-analysis-generic-models.html @@ -0,0 +1,1110 @@ + + + + + + + Upgrade Cost Analysis JSON Schemas — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Upgrade Cost Analysis JSON Schemas

+
+

UpgradeCostAnalysisSimulationModel

+
{
+  "title": "UpgradeCostAnalysisSimulationModel",
+  "description": "Defines the jobs in an upgrade cost analysis simulation.",
+  "type": "object",
+  "properties": {
+    "thermal_upgrade_params": {
+      "title": "Thermal Upgrade Params",
+      "default": {
+        "transformer_upper_limit": 1.25,
+        "line_upper_limit": 1.25,
+        "line_design_pu": 0.75,
+        "transformer_design_pu": 0.75,
+        "voltage_upper_limit": 1.05,
+        "voltage_lower_limit": 0.95,
+        "read_external_catalog": false,
+        "external_catalog": "",
+        "create_plots": true,
+        "parallel_transformers_limit": 4,
+        "parallel_lines_limit": 4,
+        "upgrade_iteration_threshold": 5,
+        "timepoint_multipliers": null
+      },
+      "allOf": [
+        {
+          "$ref": "#/definitions/ThermalUpgradeParamsModel"
+        }
+      ]
+    },
+    "voltage_upgrade_params": {
+      "title": "Voltage Upgrade Params",
+      "default": {
+        "initial_upper_limit": 1.05,
+        "initial_lower_limit": 0.95,
+        "final_upper_limit": 1.05,
+        "final_lower_limit": 0.95,
+        "nominal_voltage": 120.0,
+        "create_plots": true,
+        "capacitor_sweep_voltage_gap": 1.0,
+        "reg_control_bands": [
+          1,
+          2
+        ],
+        "reg_v_delta": 0.5,
+        "max_regulators": 4,
+        "place_new_regulators": true,
+        "use_ltc_placement": true,
+        "timepoint_multipliers": null,
+        "capacitor_action_flag": true,
+        "existing_regulator_sweep_action": true,
+        "max_control_iterations": 50
+      },
+      "allOf": [
+        {
+          "$ref": "#/definitions/VoltageUpgradeParamsModel"
+        }
+      ]
+    },
+    "upgrade_cost_database": {
+      "title": "upgrade_cost_database",
+      "description": "Database containing costs for each equipment type",
+      "type": "string"
+    },
+    "calculate_costs": {
+      "title": "calculate_costs",
+      "description": "If True, calculate upgrade costs from database.",
+      "default": true,
+      "type": "boolean"
+    },
+    "upgrade_order": {
+      "title": "Upgrade Order",
+      "description": "Order of upgrade algorithm. 'thermal' or 'voltage' can be removed from the simulation by excluding them from this parameter.",
+      "default": [
+        "thermal",
+        "voltage"
+      ],
+      "type": "array",
+      "items": {
+        "type": "string"
+      }
+    },
+    "pydss_controllers": {
+      "title": "pydss_controllers",
+      "description": "If enable_pydss_controllers is True, these PyDSS controllers are applied to each corresponding element type.",
+      "default": {
+        "pv_controller": null
+      },
+      "allOf": [
+        {
+          "$ref": "#/definitions/PyDssControllerModels"
+        }
+      ]
+    },
+    "plot_violations": {
+      "title": "plot_violations",
+      "description": "If True, create plots of violations before and after simulation.",
+      "default": true,
+      "type": "boolean"
+    },
+    "enable_pydss_controllers": {
+      "title": "enable_pydss_controllers",
+      "description": "Flag to enable/disable use of PyDSS controllers",
+      "default": false,
+      "type": "boolean"
+    },
+    "include_pf1": {
+      "title": "include_pf1",
+      "description": "Include PF1 scenario (no controls) if pydss_controllers are defined.",
+      "default": true,
+      "type": "boolean"
+    },
+    "dc_ac_ratio": {
+      "title": "dc_ac_ratio",
+      "description": "Apply DC-AC ratio for PV Systems",
+      "type": "number"
+    },
+    "jobs": {
+      "title": "Jobs",
+      "type": "array",
+      "items": {
+        "$ref": "#/definitions/UpgradeCostAnalysisGenericModel"
+      }
+    }
+  },
+  "required": [
+    "upgrade_cost_database",
+    "jobs"
+  ],
+  "additionalProperties": false,
+  "definitions": {
+    "ThermalUpgradeParamsModel": {
+      "title": "UpgradeParamsBaseModel",
+      "description": "Thermal Upgrade Parameters for all jobs in a simulation",
+      "type": "object",
+      "properties": {
+        "transformer_upper_limit": {
+          "title": "transformer_upper_limit",
+          "description": "Transformer upper limit in per unit (example: 1.25)",
+          "type": "number"
+        },
+        "line_upper_limit": {
+          "title": "line_upper_limit",
+          "description": "Line upper limit in per unit (example: 1.25)",
+          "type": "number"
+        },
+        "line_design_pu": {
+          "title": "line_design_pu",
+          "description": "Line design in per unit (example: 0.75)",
+          "type": "number"
+        },
+        "transformer_design_pu": {
+          "title": "transformer_design_pu",
+          "description": "Transformer design in per unit (example: 0.75)",
+          "type": "number"
+        },
+        "voltage_upper_limit": {
+          "title": "voltage_upper_limit",
+          "description": "Voltage upper limit in per unit (example: 1.05)",
+          "type": "number"
+        },
+        "voltage_lower_limit": {
+          "title": "voltage_lower_limit",
+          "description": "Voltage lower limit in per unit (example: 0.95)",
+          "type": "number"
+        },
+        "read_external_catalog": {
+          "title": "read_external_catalog",
+          "description": "Flag to determine whether external catalog is to be used (example: False)",
+          "type": "boolean"
+        },
+        "external_catalog": {
+          "title": "external_catalog",
+          "description": "Location to external upgrades technical catalog json file",
+          "type": "string"
+        },
+        "create_plots": {
+          "title": "create_plots",
+          "description": "Flag to enable or disable figure creation",
+          "default": true,
+          "type": "boolean"
+        },
+        "parallel_transformers_limit": {
+          "title": "parallel_transformers_limit",
+          "description": "Parallel transformer limit",
+          "default": 4,
+          "type": "integer"
+        },
+        "parallel_lines_limit": {
+          "title": "parallel_lines_limit",
+          "description": "Parallel lines limit",
+          "default": 4,
+          "type": "integer"
+        },
+        "upgrade_iteration_threshold": {
+          "title": "upgrade_iteration_threshold",
+          "description": "Upgrade iteration threshold",
+          "default": 5,
+          "type": "integer"
+        },
+        "timepoint_multipliers": {
+          "title": "timepoint_multipliers",
+          "description": "Dictionary to provide timepoint multipliers. example: timepoint_multipliers={\"load_multipliers\": {\"with_pv\": [1.2], \"without_pv\": [0.6]}}",
+          "type": "object"
+        }
+      },
+      "required": [
+        "transformer_upper_limit",
+        "line_upper_limit",
+        "line_design_pu",
+        "transformer_design_pu",
+        "voltage_upper_limit",
+        "voltage_lower_limit",
+        "read_external_catalog",
+        "external_catalog"
+      ],
+      "additionalProperties": false
+    },
+    "VoltageUpgradeParamsModel": {
+      "title": "UpgradeParamsBaseModel",
+      "description": "Voltage Upgrade Parameters for all jobs in a simulation",
+      "type": "object",
+      "properties": {
+        "initial_upper_limit": {
+          "title": "initial_upper_limit",
+          "description": "Initial upper limit in per unit (example: 1.05)",
+          "type": "number"
+        },
+        "initial_lower_limit": {
+          "title": "initial_lower_limit",
+          "description": "Initial lower limit in per unit (example: 0.95)",
+          "type": "number"
+        },
+        "final_upper_limit": {
+          "title": "final_upper_limit",
+          "description": "Final upper limit in per unit (example: 1.05)",
+          "type": "number"
+        },
+        "final_lower_limit": {
+          "title": "final_lower_limit",
+          "description": "Final lower limit in per unit (example: 0.95)",
+          "type": "number"
+        },
+        "nominal_voltage": {
+          "title": "nominal_voltage",
+          "description": "Nominal voltage (volts) (example: 120)",
+          "type": "number"
+        },
+        "create_plots": {
+          "title": "create_plots",
+          "description": "Flag to enable or disable figure creation",
+          "default": true,
+          "type": "boolean"
+        },
+        "capacitor_sweep_voltage_gap": {
+          "title": "capacitor_sweep_voltage_gap",
+          "description": "Capacitor sweep voltage gap (example: 1)",
+          "default": 1.0,
+          "type": "number"
+        },
+        "reg_control_bands": {
+          "title": "reg_control_bands",
+          "description": "Regulator control bands (example: [1, 2])",
+          "default": [
+            1,
+            2
+          ],
+          "type": "array",
+          "items": {
+            "type": "integer"
+          }
+        },
+        "reg_v_delta": {
+          "title": "reg_v_delta",
+          "description": "Regulator voltage delta (example: 0.5)",
+          "default": 0.5,
+          "type": "number"
+        },
+        "max_regulators": {
+          "title": "max_regulators",
+          "description": "Maximum number of regulators",
+          "default": 4,
+          "type": "integer"
+        },
+        "place_new_regulators": {
+          "title": "place_new_regulators",
+          "description": "Flag to enable or disable new regulator placement",
+          "default": true,
+          "type": "boolean"
+        },
+        "use_ltc_placement": {
+          "title": "use_ltc_placement",
+          "description": "Flag to enable or disable substation LTC upgrades module",
+          "default": true,
+          "type": "boolean"
+        },
+        "timepoint_multipliers": {
+          "title": "timepoint_multipliers",
+          "description": "Dictionary to provide timepoint multipliers. example: timepoint_multipliers={\"load_multipliers\": {\"with_pv\": [1.2], \"without_pv\": [0.6]}}",
+          "type": "object"
+        },
+        "capacitor_action_flag": {
+          "title": "capacitor_action_flag",
+          "description": "Flag to enable or disable capacitor controls settings sweep module",
+          "default": true,
+          "type": "boolean"
+        },
+        "existing_regulator_sweep_action": {
+          "title": "existing_regulator_sweep_action",
+          "description": "Flag to enable or disable existing regulator controls settings sweep module",
+          "default": true,
+          "type": "boolean"
+        },
+        "max_control_iterations": {
+          "title": "max_control_iterations",
+          "description": "Max control iterations to be set for OpenDSS",
+          "default": 50,
+          "type": "integer"
+        }
+      },
+      "required": [
+        "initial_upper_limit",
+        "initial_lower_limit",
+        "final_upper_limit",
+        "final_lower_limit",
+        "nominal_voltage"
+      ],
+      "additionalProperties": false
+    },
+    "PvControllerModel": {
+      "title": "ControllerBaseModel",
+      "type": "object",
+      "properties": {
+        "Control1": {
+          "title": "Control1",
+          "description": "TODO",
+          "type": "string"
+        },
+        "Control2": {
+          "title": "Control1",
+          "description": "TODO",
+          "type": "string"
+        },
+        "Control3": {
+          "title": "Control3",
+          "description": "TODO",
+          "type": "string"
+        },
+        "pf": {
+          "title": "pf",
+          "description": "TODO",
+          "type": "integer"
+        },
+        "pfMin": {
+          "title": "pfMin",
+          "description": "TODO",
+          "type": "number"
+        },
+        "pfMax": {
+          "title": "pfMax",
+          "description": "TODO",
+          "type": "number"
+        },
+        "Pmin": {
+          "title": "Pmin",
+          "description": "TODO",
+          "type": "number"
+        },
+        "Pmax": {
+          "title": "Pmax",
+          "description": "TODO",
+          "type": "number"
+        },
+        "uMin": {
+          "title": "uMin",
+          "description": "TODO",
+          "type": "number"
+        },
+        "uDbMin": {
+          "title": "uDbMin",
+          "description": "TODO",
+          "type": "number"
+        },
+        "uDbMax": {
+          "title": "uDbMax",
+          "description": "TODO",
+          "type": "number"
+        },
+        "uMax": {
+          "title": "uMax",
+          "description": "TODO",
+          "type": "number"
+        },
+        "QlimPU": {
+          "title": "QlimPU",
+          "description": "TODO",
+          "type": "number"
+        },
+        "PFlim": {
+          "title": "PFlim",
+          "description": "TODO",
+          "type": "number"
+        },
+        "Enable PF limit": {
+          "title": "EnablePFLimit",
+          "description": "TODO",
+          "type": "boolean"
+        },
+        "uMinC": {
+          "title": "uMinC",
+          "description": "TODO",
+          "type": "number"
+        },
+        "uMaxC": {
+          "title": "uMaxC",
+          "description": "TODO",
+          "type": "number"
+        },
+        "PminVW": {
+          "title": "PminVW",
+          "description": "TODO",
+          "type": "number"
+        },
+        "VWtype": {
+          "title": "VWtype",
+          "description": "TODO",
+          "type": "string"
+        },
+        "%PCutin": {
+          "title": "PCutin",
+          "description": "TODO",
+          "type": "number"
+        },
+        "%PCutout": {
+          "title": "%PCutout",
+          "description": "TODO",
+          "type": "number"
+        },
+        "Efficiency": {
+          "title": "Efficiency",
+          "description": "TODO",
+          "type": "number"
+        },
+        "Priority": {
+          "title": "Priority",
+          "description": "TODO",
+          "type": "string"
+        },
+        "DampCoef": {
+          "title": "DampCoef",
+          "description": "TODO",
+          "type": "number"
+        }
+      },
+      "required": [
+        "Control1",
+        "pf",
+        "pfMin",
+        "pfMax",
+        "Pmin",
+        "Pmax",
+        "uMin",
+        "uDbMin",
+        "uDbMax",
+        "uMax",
+        "QlimPU",
+        "PFlim",
+        "Enable PF limit",
+        "uMinC",
+        "uMaxC",
+        "PminVW",
+        "VWtype",
+        "%PCutin",
+        "%PCutout",
+        "Efficiency",
+        "Priority",
+        "DampCoef"
+      ],
+      "additionalProperties": false
+    },
+    "PyDssControllerModels": {
+      "title": "UpgradeParamsBaseModel",
+      "description": "Defines the settings for PyDSS controllers",
+      "type": "object",
+      "properties": {
+        "pv_controller": {
+          "title": "pv_controller",
+          "description": "Settings for a PV controller",
+          "allOf": [
+            {
+              "$ref": "#/definitions/PvControllerModel"
+            }
+          ]
+        }
+      },
+      "additionalProperties": false
+    },
+    "UpgradeCostAnalysisGenericModel": {
+      "title": "UpgradeCostAnalysisGenericModel",
+      "description": "Parameters for each job in a simulation",
+      "type": "object",
+      "properties": {
+        "model_type": {
+          "title": "model_type",
+          "description": "Model type",
+          "default": "UpgradeCostAnalysisGenericModel",
+          "type": "string"
+        },
+        "name": {
+          "title": "name",
+          "description": "Unique name identifying the job",
+          "type": "string"
+        },
+        "blocked_by": {
+          "title": "blocked_by",
+          "description": "Names of jobs that must finish before this job starts",
+          "default": [],
+          "type": "array",
+          "items": {
+            "type": "string"
+          },
+          "uniqueItems": true
+        },
+        "job_order": {
+          "title": "job_order",
+          "description": "The execution order of the simulation job.",
+          "anyOf": [
+            {
+              "type": "integer",
+              "minimum": 0.0
+            },
+            {
+              "type": "number",
+              "minimum": 0.0
+            }
+          ]
+        },
+        "opendss_model_file": {
+          "title": "opendss_model_file",
+          "description": "Path to file used load the simulation model files",
+          "type": "string"
+        },
+        "estimated_run_minutes": {
+          "title": "estimated_run_minutes",
+          "description": "Optionally advises the job execution manager on how long the job will run",
+          "type": "integer"
+        }
+      },
+      "required": [
+        "name",
+        "opendss_model_file"
+      ]
+    }
+  }
+}
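The schema above only requires upgrade_cost_database and a jobs list, where each job needs a unique name and an opendss_model_file; everything else falls back to the defaults shown. Below is a minimal sketch (not an official example from the repository) of building such an input file from Python. The file names and paths are placeholders.

import json

simulation = {
    # Required: database containing costs for each equipment type (placeholder file name).
    "upgrade_cost_database": "upgrade_cost_database.xlsx",
    # Required: one entry per job; each job needs a unique name and an OpenDSS model file.
    "jobs": [
        {
            "name": "model1",
            "opendss_model_file": "custom_models/model1/Master.dss",
        }
    ],
    # Optional properties such as thermal_upgrade_params, voltage_upgrade_params, and
    # upgrade_order use the defaults listed in the schema when omitted.
}

with open("upgrades.json", "w") as f:
    json.dump(simulation, f, indent=2)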
+
+
+
+
+

UpgradesCostResultSummaryModel

+
{
+  "title": "UpgradeParamsBaseModel",
+  "description": "Contains individual equipment output",
+  "type": "object",
+  "properties": {
+    "name": {
+      "title": "name",
+      "description": "Job name",
+      "type": "string"
+    },
+    "equipment_type": {
+      "title": "equipment_type",
+      "description": "Type of equipment",
+      "type": "string"
+    },
+    "equipment_name": {
+      "title": "equipment_name",
+      "description": "Name of equipment",
+      "type": "string"
+    },
+    "status": {
+      "title": "status",
+      "description": "Status",
+      "type": "string"
+    },
+    "total_cost_usd": {
+      "title": "total_cost_usd",
+      "description": "Total cost in US dollars",
+      "type": "number"
+    },
+    "parameter1_name": {
+      "title": "parameter1_name",
+      "description": "Name of parameter1",
+      "type": "string"
+    },
+    "parameter1_original": {
+      "title": "parameter1_original",
+      "description": "Original value of parameter1"
+    },
+    "parameter1_upgraded": {
+      "title": "parameter1_upgraded",
+      "description": "Upgraded value of parameter1"
+    },
+    "parameter2_name": {
+      "title": "parameter2_name",
+      "description": "Name of parameter2",
+      "default": "",
+      "type": "string"
+    },
+    "parameter2_original": {
+      "title": "parameter2_original",
+      "description": "Original value of parameter2"
+    },
+    "parameter2_upgraded": {
+      "title": "parameter2_upgraded",
+      "description": "Upgraded value of parameter2"
+    },
+    "parameter3_name": {
+      "title": "parameter3_name",
+      "description": "Name of parameter3",
+      "default": "",
+      "type": "string"
+    },
+    "parameter3_original": {
+      "title": "parameter3_original",
+      "description": "Original value of parameter3"
+    },
+    "parameter3_upgraded": {
+      "title": "parameter3_upgraded",
+      "description": "Upgraded value of parameter3"
+    }
+  },
+  "required": [
+    "name",
+    "equipment_type",
+    "equipment_name",
+    "status",
+    "total_cost_usd",
+    "parameter1_name"
+  ],
+  "additionalProperties": false
+}
+
+
+
+
+

JobUpgradeSummaryOutputModel

+
{
+  "title": "UpgradeParamsBaseModel",
+  "description": "Contains results from all jobs in the simulation.",
+  "type": "object",
+  "properties": {
+    "results": {
+      "title": "results",
+      "description": "Results summary for each job",
+      "type": "array",
+      "items": {}
+    },
+    "outputs": {
+      "title": "outputs",
+      "description": "Outputs for each job in the simulation.",
+      "allOf": [
+        {
+          "$ref": "#/definitions/UpgradeSimulationOutputModel"
+        }
+      ]
+    },
+    "violation_summary": {
+      "title": "upgrade_summary",
+      "description": "Contains thermal or voltage upgrade results for each job",
+      "type": "array",
+      "items": {
+        "$ref": "#/definitions/UpgradeViolationResultModel"
+      }
+    },
+    "costs_per_equipment": {
+      "title": "costs_per_equipment",
+      "description": "Contains upgrade cost information for each job by equipment type",
+      "type": "array",
+      "items": {
+        "$ref": "#/definitions/TotalUpgradeCostsResultModel"
+      }
+    },
+    "equipment": {
+      "title": "equipment",
+      "description": "Contains equipment information for each job",
+      "type": "array",
+      "items": {}
+    }
+  },
+  "required": [
+    "results",
+    "outputs",
+    "violation_summary",
+    "costs_per_equipment",
+    "equipment"
+  ],
+  "additionalProperties": false,
+  "definitions": {
+    "UpgradeJobOutputModel": {
+      "title": "UpgradeParamsBaseModel",
+      "description": "Contains outputs from one job.",
+      "type": "object",
+      "properties": {
+        "upgraded_opendss_model_file": {
+          "title": "upgraded_opendss_model_file",
+          "description": "Path to file that will load the upgraded network.",
+          "type": "string"
+        },
+        "feeder_stats": {
+          "title": "feeder_stats",
+          "description": "Path to file containing feeder metadata and equipment details before and after upgrades.",
+          "type": "string"
+        },
+        "return_code": {
+          "title": "return_code",
+          "description": "Return code from process. Zero is success, non-zero is a failure.",
+          "type": "integer"
+        }
+      },
+      "required": [
+        "upgraded_opendss_model_file",
+        "feeder_stats",
+        "return_code"
+      ],
+      "additionalProperties": false
+    },
+    "UpgradeSimulationOutputModel": {
+      "title": "UpgradeParamsBaseModel",
+      "description": "Contains outputs from all jobs in the simulation.",
+      "type": "object",
+      "properties": {
+        "log_file": {
+          "title": "log_file",
+          "description": "Path to log file for the simulation.",
+          "type": "string"
+        },
+        "jobs": {
+          "title": "jobs",
+          "description": "Outputs for each job in the simulation.",
+          "type": "array",
+          "items": {
+            "$ref": "#/definitions/UpgradeJobOutputModel"
+          }
+        }
+      },
+      "required": [
+        "log_file",
+        "jobs"
+      ],
+      "additionalProperties": false
+    },
+    "UpgradeViolationResultModel": {
+      "title": "UpgradeParamsBaseModel",
+      "description": "Defines result parameters for thermal upgrades.",
+      "type": "object",
+      "properties": {
+        "name": {
+          "title": "name",
+          "description": "Job name",
+          "type": "string"
+        },
+        "scenario": {
+          "title": "scenario",
+          "description": "Simulation scenario describing the controls being used",
+          "default": "control_mode",
+          "type": "string"
+        },
+        "stage": {
+          "title": "stage",
+          "description": "Stage of upgrades: initial (before upgrades) or final (after upgrades)",
+          "type": "string"
+        },
+        "upgrade_type": {
+          "title": "upgrade_type",
+          "description": "Type of upgrade: thermal or voltage",
+          "type": "string"
+        },
+        "simulation_time_s": {
+          "title": "simulation_time_s",
+          "description": "Simulation time to perform upgrades (seconds)",
+          "type": "number"
+        },
+        "thermal_violations_present": {
+          "title": "thermal_violations_present",
+          "description": "Flag indicating whether thermal violations are present",
+          "type": "boolean"
+        },
+        "voltage_violations_present": {
+          "title": "voltage_violations_present",
+          "description": "Flag indicating whether voltage violations are present",
+          "type": "boolean"
+        },
+        "max_bus_voltage": {
+          "title": "max_bus_voltage",
+          "description": "Maximum voltage recorded on any bus",
+          "units": "pu",
+          "type": "number"
+        },
+        "min_bus_voltage": {
+          "title": "min_bus_voltage",
+          "description": "Minimum voltage recorded on any bus",
+          "units": "pu",
+          "type": "number"
+        },
+        "num_voltage_violation_buses": {
+          "title": "num_voltage_violation_buses",
+          "description": "Number of buses with voltage violations",
+          "type": "integer"
+        },
+        "num_overvoltage_violation_buses": {
+          "title": "num_overvoltage_violation_buses",
+          "description": "Number of buses with voltage above voltage_upper_limit",
+          "type": "integer"
+        },
+        "voltage_upper_limit": {
+          "title": "voltage_upper_limit",
+          "description": "Voltage upper limit, the threshold considered for determining overvoltages",
+          "units": "pu",
+          "type": "number"
+        },
+        "num_undervoltage_violation_buses": {
+          "title": "num_undervoltage_violation_buses",
+          "description": "Number of buses with voltage below voltage_lower_limit",
+          "type": "integer"
+        },
+        "voltage_lower_limit": {
+          "title": "voltage_lower_limit",
+          "description": "Voltage lower limit, the threshold considered for determining undervoltages",
+          "units": "pu",
+          "type": "number"
+        },
+        "max_line_loading": {
+          "title": "max_line_loading",
+          "description": "Maximum line loading",
+          "units": "pu",
+          "type": "number"
+        },
+        "max_transformer_loading": {
+          "title": "max_transformer_loading",
+          "description": "Maximum transformer loading",
+          "units": "pu",
+          "type": "number"
+        },
+        "num_line_violations": {
+          "title": "num_line_violations",
+          "description": "Number of lines with loading above line upper limit",
+          "type": "integer"
+        },
+        "line_upper_limit": {
+          "title": "line_upper_limit",
+          "description": "Line upper limit, the threshold considered for determining line overloading",
+          "units": "pu",
+          "type": "number"
+        },
+        "num_transformer_violations": {
+          "title": "num_transformer_violations",
+          "description": "Number of transformers with loading above transformer upper limit",
+          "type": "integer"
+        },
+        "transformer_upper_limit": {
+          "title": "transformer_upper_limit",
+          "description": "Transformer upper limit, the threshold considered for determining transformer overloading",
+          "units": "pu",
+          "type": "number"
+        }
+      },
+      "required": [
+        "name",
+        "stage",
+        "upgrade_type",
+        "simulation_time_s",
+        "thermal_violations_present",
+        "voltage_violations_present",
+        "max_bus_voltage",
+        "min_bus_voltage",
+        "num_voltage_violation_buses",
+        "num_overvoltage_violation_buses",
+        "voltage_upper_limit",
+        "num_undervoltage_violation_buses",
+        "voltage_lower_limit",
+        "max_line_loading",
+        "max_transformer_loading",
+        "num_line_violations",
+        "line_upper_limit",
+        "num_transformer_violations",
+        "transformer_upper_limit"
+      ],
+      "additionalProperties": false
+    },
+    "TotalUpgradeCostsResultModel": {
+      "title": "UpgradeParamsBaseModel",
+      "description": "Provides total output costs for upgrading a type of equipment.",
+      "type": "object",
+      "properties": {
+        "name": {
+          "title": "name",
+          "description": "Job name",
+          "type": "string"
+        },
+        "type": {
+          "title": "type",
+          "description": "Equipment type",
+          "type": "string"
+        },
+        "count": {
+          "title": "count",
+          "description": "Count of upgraded equipment",
+          "type": "integer"
+        },
+        "total_cost_usd": {
+          "title": "total_cost_usd",
+          "description": "Total cost in US dollars",
+          "units": "dollars",
+          "type": "number"
+        }
+      },
+      "required": [
+        "name",
+        "type",
+        "count",
+        "total_cost_usd"
+      ],
+      "additionalProperties": false
+    }
+  }
+}
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/analysis-workflows.html b/analysis-workflows.html new file mode 100644 index 00000000..d158f3e3 --- /dev/null +++ b/analysis-workflows.html @@ -0,0 +1,212 @@ + + + + + + + Analysis Workflows — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Analysis Workflows

+

DISCO implements analysis workflows that allow post-processing of individual jobs, +batches of jobs, or a pipeline of batches.

+

The supported analyses include:

+
    +
  • Static Hosting Capacity Analysis

  • +
  • Dynamic Hosting Capacity Analysis

  • +
  • Upgrade Cost Analysis

  • +
  • Snapshot/Time Series Impact Analysis

  • +
+

The following sections show the analysis workflows in detail.

+ +
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/analysis-workflows/hosting-capacity-analysis.html b/analysis-workflows/hosting-capacity-analysis.html new file mode 100644 index 00000000..dbd5e325 --- /dev/null +++ b/analysis-workflows/hosting-capacity-analysis.html @@ -0,0 +1,371 @@ + + + + + + + Hosting Capacity Analysis — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Hosting Capacity Analysis

+

This section shows how to conduct hosting capacity analysis using the DISCO pipeline with snapshot and time-series models as inputs. This tutorial assumes an existing snapshot-feeder-models directory generated by the transform-model command; the workflow below can also be applied to time-series-feeder-models.

+

1. Config Pipeline

+

Check the --help option for creating a pipeline template.

+
$ disco create-pipeline template --help
+Usage: disco create-pipeline template [OPTIONS] INPUTS
+
+Create pipeline template file
+
+Options:
+-T, --task-name TEXT            The task name of the simulation/analysis
+                                [required]
+-P, --preconfigured             Whether inputs models are preconfigured
+                                [default: False]
+-s, --simulation-type [snapshot|time-series|upgrade]
+                                Choose a DISCO simulation type  [default:
+                                snapshot]
+--with-loadshape / --no-with-loadshape
+                                Indicate if loadshape file used for Snapshot
+                                simulation.
+--auto-select-time-points / --no-auto-select-time-points
+                                Automatically select the time point based on
+                                max PV-load ratio for snapshot simulations.
+                                Only applicable if --with-loadshape.
+                                [default: auto-select-time-points]
+-d, --auto-select-time-points-search-duration-days INTEGER
+                                Search duration in days. Only applicable
+                                with --auto-select-time-points.  [default:
+                                365]
+-i, --impact-analysis           Enable impact analysis computations
+                                [default: False]
+-h, --hosting-capacity          Enable hosting capacity computations
+                                [default: False]
+-u, --upgrade-analysis          Enable upgrade cost computations  [default:
+                                False]
+-c, --cost-benefit              Enable cost benefit computations  [default:
+                                False]
+-p, --prescreen                 Enable PV penetration level prescreening
+                                [default: False]
+-t, --template-file TEXT        Output pipeline template file  [default:
+                                pipeline-template.toml]
+-r, --reports-filename TEXT     PyDSS report options. If None, use the
+                                default for the simulation type.
+-S, --enable-singularity        Add Singularity parameters and set the
+                                config to run in a container.  [default:
+                                False]
+-C, --container PATH            Path to container
+-D, --database PATH             The path of new or existing SQLite database
+                                [default: results.sqlite]
+-l, --local                     Run in local mode (non-HPC).  [default:
+                                False]
+--help                          Show this message and exit.
+
+
+

Given an output directory from transform-model, we use this command with the --preconfigured option to create the template.

+
$ disco create-pipeline template -T SnapshotTask -s snapshot -h -P snapshot-feeder-models --with-loadshape
+
+
+
+

Note

+

For configuring a dynamic hosting capacity pipeline, use -s time-series

+
+

This creates pipeline-template.toml with configurable parameters for the different sections. Update parameter values if needed, then run:

+
$ disco create-pipeline config pipeline-template.toml
+
+
+

This command creates a pipeline.json file containing two stages:

+
    +
  • stage 1 - simulation

  • +
  • stage 2 - post-process

  • +
+

Accordingly, there will be an output directory for each stage,

+
    +
  • output-stage1

  • +
  • output-stage2

  • +
+

2. Submit Pipeline

+

With a configured DISCO pipeline in pipeline.json, the next step is to submit the pipeline with JADE:

+
$ jade pipeline submit pipeline.json -o output
+
+
+

What does each stage do?

+
    +
  • In the simulation stage DISCO runs a power flow simulation for each job through PyDSS and stores +per-job metrics.

  • +
  • In the post-process stage DISCO aggregates the metrics from each simulation job, calculates +the hosting capacity, and then ingests results into a SQLite database.

  • +
+

3. Check Results

+

The post-process stage aggregates metrics into the following tables in output/output-stage1:

+
    +
  • feeder_head_table.csv

  • +
  • feeder_losses_table.csv

  • +
  • metadata_table.csv

  • +
  • thermal_metrics_table.csv

  • +
  • voltage_metrics_table.csv

  • +
+

Each table contains metrics related to the snapshot or time-series simulation. DISCO +computes hosting capacity results from these metrics and then writes them to the following files, +also in output/output-stage1:

+
    +
  • hosting_capacity_summary__<scenario_name>.json

  • +
  • hosting_capacity_overall__<scenario_name>.json

  • +
+

The scenario name will be scenario, pf1 and/or control_mode, depending on your +simulation type and/or --with-loadshape option.

+

Note that DISCO also produces prototypical visualizations for hosting capacity automatically after each run:

+
    +
  • hca__{scenario_name}.png

  • +
+../_images/hca__pf1.png +

Example voltage plots for the first feeder compare pf1 vs. voltvar and compare primary and secondary voltages:

+
    +
  • max_voltage_pf1_voltvar.png

  • +
  • max_voltage_pri_sec.png

  • +
+../_images/max_voltage_pri_sec.png +

4. Results database

+

DISCO ingests the hosting capacity results and report metrics into a SQLite database named output/output-stage1/results.sqlite. You can use standard SQL to query the data and perform further analysis.

+

If you want to ingest the results into an existing database, please specify the absolute path of the database in the pipeline template file (pipeline-template.toml).

+

For sqlite query examples, please refer to the Jupyter notebook notebooks/db-query.ipynb in +the source code repo.

+

If you would like to use the CLI tool sqlite3 directly, here are some examples. Note that in this case the database contains the results from a single task, so the queries do not first pre-filter the tables.

+

If you don’t already have sqlite3 installed, please refer to their +website.

+

Run this command to start the CLI utility:

+
$ sqlite3 -table <path-to-db.sqlite>
+
+
+
+

Note

+

If your version of sqlite3 doesn’t support -table, use -header -column instead.

+
+
    +
  1. View DISCO’s hosting capacity results for all feeders.

  2. +
+
sqlite> SELECT * from hosting_capacity WHERE hc_type = 'overall';
+
+
+
    +
  2. View voltage violations for one feeder and scenario.

  2. +
+
sqlite> SELECT feeder, scenario, sample, penetration_level, node_type, min_voltage, max_voltage
+        FROM voltage_metrics
+        WHERE (max_voltage > 1.05 or min_voltage < 0.95)
+        AND scenario = 'pf1'
+        AND feeder = 'p19udt14287';
+
+
+
    +
  3. View the min and max voltages for each penetration_level (across samples) for one feeder.

  2. +
+
sqlite> SELECT feeder, sample, penetration_level
+        ,MIN(min_voltage) as min_voltage_overall
+        ,MAX(max_voltage) as max_voltage_overall
+        ,MAX(num_nodes_any_outside_ansi_b) as num_nodes_any_outside_ansi_b_overall
+        ,MAX(num_time_points_with_ansi_b_violations) as num_time_points_with_ansi_b_violations_overall
+        FROM voltage_metrics
+        WHERE scenario = 'pf1'
+        AND feeder = 'p19udt14287'
+        GROUP BY feeder, penetration_level;
+
+
+
    +
  4. View the max thermal loadings for each penetration_level (across samples) for one feeder.

  2. +
+
sqlite> SELECT feeder, sample, penetration_level
+        ,MAX(line_max_instantaneous_loading_pct) as line_max_inst
+        ,MAX(line_max_moving_average_loading_pct) as line_max_mavg
+        ,MAX(line_num_time_points_with_instantaneous_violations) as line_num_inst
+        ,MAX(line_num_time_points_with_moving_average_violations) as line_num_mavg
+        ,MAX(transformer_max_instantaneous_loading_pct) as xfmr_max_inst
+        ,MAX(transformer_max_moving_average_loading_pct) as xfmr_max_mavg
+        ,MAX(transformer_num_time_points_with_instantaneous_violations) as xfmr_num_inst
+        ,MAX(transformer_num_time_points_with_moving_average_violations) as xfmr_num_mavg
+        FROM thermal_metrics
+        WHERE scenario = 'pf1'
+        AND feeder = 'p19udt14287'
+        GROUP BY feeder, penetration_level;
+
+
+
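The same queries can also be run programmatically with Python's built-in sqlite3 module. This is a short sketch assuming the database path produced by this task; adjust the path for your output directory.

import sqlite3

con = sqlite3.connect("output/output-stage1/results.sqlite")
con.row_factory = sqlite3.Row

# Same query as example 1 above: overall hosting capacity results for all feeders.
for row in con.execute("SELECT * FROM hosting_capacity WHERE hc_type = 'overall'"):
    print(dict(row))

con.close()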
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/analysis-workflows/impact-analysis.html b/analysis-workflows/impact-analysis.html new file mode 100644 index 00000000..84a0bdfa --- /dev/null +++ b/analysis-workflows/impact-analysis.html @@ -0,0 +1,383 @@ + + + + + + + Impact Analysis — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Impact Analysis

+

This section shows how to perform customized analysis with time-series simulations on a set of input models. +Note that you could generally substitute “snapshot” for “time-series” for that type of simulation.

+

Other sections of this documentation describe workflows that rely on DISCO/PyDSS to collect +specific metrics from each simulation. For example, the dynamic hosting capacity analysis workflow +collects metrics for max instantaneous and moving-average line and transformer loading violations. +It does not store values for every line and transformer at every time point because of the amount +of storage space that requires. This section shows how to collect all of the data.

+
+

Transform Model

+

As with earlier sections, this assumes that you have cloned the disco repo to the ~/disco directory. Transform the source models into DISCO models with this command:

+
$ disco transform-model ~/disco/tests/data/smart-ds/substations/ time-series
+Transformed data from ~/disco/tests/data/smart-ds/substations/ to time-series-feeder-models for TimeSeries Analysis.
+
+
+
+

Note

+

For your own models you will likely need to set --start, --end, and --resolution.

+
+
+
+

Config Jobs

+
    +
  1. Copy this text into a file called exports.toml. This will instruct PyDSS to store each of these +properties for each element at each time point.

  2. +
+
[Loads.Powers]
+store_values_type = "all"
+
+[PVSystems.Powers]
+store_values_type = "all"
+
+[Circuits.TotalPower]
+store_values_type = "all"
+
+[Circuits.LineLosses]
+store_values_type = "all"
+
+[Circuits.Losses]
+store_values_type = "all"
+
+[Lines.Currents]
+store_values_type = "all"
+
+[Lines.Losses]
+store_values_type = "all"
+
+[Lines.Powers]
+store_values_type = "all"
+
+
+
    +
  2. Create the configuration with all reports disabled, custom exports, all data exported, a custom DC-AC ratio, and a specific volt-var curve.

  2. +
+
+

Note

+

If you’re using a Windows terminal, the \ characters used here for line breaks probably won’t work.

+
+
$ disco config time-series time-series-feeder-models \
+    --config-file time-series-config.json \
+    --volt-var-curve volt_var_ieee_1547_2018_catB \
+    --dc-ac-ratio=1.0 \
+    --exports-filename exports.toml \
+    --export-data-tables \
+    --store-all-time-points \
+    --store-per-element-data \
+    --thermal-metrics=true \
+    --voltage-metrics=true \
+    --feeder-losses=true
+
+
+
+
+

Submit Jobs

+

Run the jobs with JADE. Two examples are shown: one on a local machine and one on an HPC.

+
$ jade submit-jobs --local time-series-config.json -o time-series-output
+$ jade submit-jobs -h hpc_config.toml time-series-config.json -o time-series-output
+
+
+

Confirm that all jobs passed.

+
$ jade show-results -o time-series-output
+
+
+
+
+

View Output Files

+

Each job’s outputs will be stored in time-series-output/job-outputs/<job-name>/pydss_project/project.zip. +Extract one zip file. You will see exported data for all element properties. For example, this file +contains bus voltages for the volt-var scenario: Exports/control_mode/Buses__puVmagAngle.csv. +Exports/control_mode/CktElement__ExportLoadingsMetric.csv contains thermal loading values. +The same files will exist for the pf1 scenario.

+

Summary files will be available for thermal and voltage metrics. Refer to Reports/thermal_metrics.json +and Reports/voltage_metrics.json.
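As a quick way to inspect the exported CSV files mentioned above, here is a small sketch using pandas. It assumes you have extracted one job's pydss_project/project.zip into the current directory; the exact column layout depends on the circuit, so it only summarizes whatever numeric columns are present.

import pandas as pd

# Bus voltage magnitudes and angles exported for the volt-var (control_mode) scenario.
df = pd.read_csv("Exports/control_mode/Buses__puVmagAngle.csv", index_col=0)
numeric = df.select_dtypes("number")

# Per-column minimum and maximum (magnitude and angle columns are mixed in this export).
print(numeric.min())
print(numeric.max())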

+
+
+

Make metric table files

+

Run this command to convert the thermal and voltage metrics into tabular form.

+
$ disco make-summary-tables time-series-output
+
+
+
+
+

Access Results Programmatically

+

DISCO includes analysis code to help look at thermal loading and voltage violations. Here is +some example code:

+
import logging
+import os
+
+from jade.loggers import setup_logging
+from disco.pydss.pydss_analysis import PyDssAnalysis, PyDssScenarioAnalysis
+from disco.extensions.pydss_simulation.pydss_configuration import PyDssConfiguration
+
+logger = setup_logging("config", "log.txt", console_level=logging.INFO)
+
+output_dir = "time-series-output"
+config = PyDssConfiguration.deserialize(os.path.join(output_path, "config.json"))
+analysis = PyDssAnalysis(output_path, config)
+analysis.show_results()
+
+# Copy name from the output of show_results().
+name = analysis.list_results()[1].name
+
+# Look up job-specific parameters.
+job = analysis.get_job(name)
+print(job)
+print(job.model.deployment)
+print(job.model.deployment.project_data)
+
+simulation = analysis.get_simulation(name)
+
+# Get access to result dataframes.
+results = analysis.read_results(simulation)
+scenario = results.scenarios[0]
+scenario_analysis = PyDssScenarioAnalysis(simulation, results, scenario.name)
+
+# Get list of voltage magnitudes for each bus.
+voltages_per_bus = scenario_analysis.get_pu_bus_voltage_magnitudes()
+
+# Get loading percentages.
+line_loading = scenario_analysis.get_line_loading_percentages()
+transformer_loading = scenario_analysis.get_transformer_loading_percentages()
+
+# Find out what classes and properties are available.
+for element_class in scenario.list_element_classes():
+    for prop in scenario.list_element_properties(element_class):
+        print(element_class, prop)
+
+for name in scenario.list_element_names("Lines", "Currents"):
+    df = scenario.get_dataframe("Lines", "Currents", name)
+    print(df.head())
+
+# Browse static element information.
+for filename in scenario.list_element_info_files():
+    print(filename)
+    df = scenario.read_element_info_file(filename)
+    print(df.head())
+
+# Use class names to read specific element information.
+df = scenario.read_element_info_file("Loads")
+df = scenario.read_element_info_file("PVSystems")
+
+# Read events from the OpenDSS event log.
+event_log = scenario.read_event_log()
+
+# Get the count of each capacitor's state changes from the event log.
+capacitor_changes = scenario.read_capacitor_changes()
+
+
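As a small, hedged extension of the example above (it reuses the scenario object created there), this sketch finds the lines with the largest current magnitudes seen over the simulation. abs() is applied because the Currents values may be complex, and the exact column layout is circuit-dependent.

worst_currents = {}
for name in scenario.list_element_names("Lines", "Currents"):
    df = scenario.get_dataframe("Lines", "Currents", name)
    # Largest magnitude across all phases/terminals and all time points for this line.
    worst_currents[name] = df.abs().to_numpy().max()

# Print the ten lines with the highest observed current magnitude.
for name, value in sorted(worst_currents.items(), key=lambda item: item[1], reverse=True)[:10]:
    print(name, value)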
+
+
+

Use the PyDSS Data Viewer

+

PyDSS includes a data viewer that makes it easy to plot circuit element values in a Jupyter +notebook. Refer to its docs.

+
+
+

Generic Models

+

This section follows the same workflow except that it uses pre-defined OpenDSS models. Unlike +the previous example, DISCO will not make any changes to the model files.

+

Refer to Generic Power Flow Models for specific details about the input file +time_series_generic.json.

+
$ disco config-generic-models time-series ~/disco/tests/data/time_series_generic.json \
+    --config-file time-series-config.json \
+    --volt-var-curve volt_var_ieee_1547_2018_catB \
+    --exports-filename exports.toml \
+    --export-data-tables \
+    --store-all-time-points \
+    --store-per-element-data \
+    --thermal-metrics=true \
+    --voltage-metrics=true \
+    --feeder-losses=true
+
+
+
$ jade submit-jobs --local time-series-config.json -o time-series-output
+
+
+
$ jade show-results -o time-series-output
+
+
+
$ disco make-summary-tables time-series-output
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/analysis-workflows/upgrade-cost-analysis.html b/analysis-workflows/upgrade-cost-analysis.html new file mode 100644 index 00000000..51276e12 --- /dev/null +++ b/analysis-workflows/upgrade-cost-analysis.html @@ -0,0 +1,744 @@ + + + + + + + Upgrade Cost Analysis — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Upgrade Cost Analysis

+

This chapter introduces the workflow for conducting upgrade cost analysis, either by running DISCO commands step by step or by using the DISCO pipeline, which chains the individual steps together and runs the upgrade cost analysis end to end. The following two sections introduce these two methods separately.

+

There is a third method that bypasses the normal DISCO processes. This generic workflow allows you +to run the upgrade simulations on existing, non-standardized OpenDSS models without any transformations.

+

The following commands run with default options. If you need any customization, please run --help on +the commands to see the available options.

+
+

Step-by-Step Workflow

+

1. Transform Model

+

Prepare the model with PV deployments by using DISCO model transformation.

+
$ disco transform-model tests/data/smart-ds/substations upgrade -o upgrade-models
+
+
+

Load shape profiles for Load elements are not used by the upgrade module, so we recommend that you remove them from the models to speed up the simulations. Do so with this option:

+
$ disco transform-model tests/data/smart-ds/substations upgrade --exclude-load-profile -o upgrade-models
+
+
+

2. Create Config

+

With the transformed model, create the config.json file with submittable jobs.

+
$ disco config upgrade upgrade-models
+
+
+

DISCO will use default upgrade parameters if the option --params-file is not specified. +If --params-file is specified, that file must contain all required parameters.

+

Here are optional parameters that you can customize in the same file:

+
[thermal_upgrade_params]
+parallel_transformers_limit = 4
+parallel_lines_limit = 4
+upgrade_iteration_threshold = 5
+timepoint_multipliers = {}
+
+[voltage_upgrade_params]
+capacitor_sweep_voltage_gap = 1.0
+reg_control_bands = [1, 2]
+reg_v_delta = 0.5
+max_regulators = 4
+place_new_regulators = true
+use_ltc_placement = true
+timepoint_multipliers = {}
+capacitor_action_flag = true
+existing_regulator_sweep_action = true
+
+
+

3. Submit Jobs

+

Submit jobs by using JADE and conduct upgrade cost analysis within each job. +This command assumes that you are running on a local system. Please remove the option +--local if you run on an HPC.

+
$ jade submit-jobs config.json --local
+
+
+

This step generates a directory named output, which contains all upgrade results.

+

4. Upgrade Analysis

+

Run post-processing to aggregate upgrade cost analysis results and create analysis CSV tables.

+
$ disco-internal make-upgrade-tables output
+
+
+

If everything succeeds, it produces an aggregated JSON file: upgrade_summary.json
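As an optional, hedged sketch for inspecting the aggregated results, the snippet below assumes that upgrade_summary.json follows the JobUpgradeSummaryOutputModel schema documented in the Upgrade Cost Analysis JSON Schemas section; the file's exact location under your output directory may differ.

import json
from collections import defaultdict

with open("output/upgrade_summary.json") as f:  # adjust the path to where the file was written
    summary = json.load(f)

# Sum upgrade costs by equipment type across all jobs; per the schema, each
# costs_per_equipment record carries name, type, count, and total_cost_usd.
totals = defaultdict(float)
for record in summary.get("costs_per_equipment", []):
    totals[record["type"]] += record["total_cost_usd"]

for equipment_type, cost in sorted(totals.items()):
    print(f"{equipment_type}: ${cost:,.2f}")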

+
+
+

Pipeline Workflow

+

1. Create Template

+

Create a DISCO pipeline template file. By default, the output file is pipeline-template.toml.

+
$ disco create-pipeline template --task-name UpgradeTask --simulation-type upgrade --upgrade-analysis ~/Workspace/disco/tests/data/smart-ds/substations
+
+
+

Here, we need to enable the --upgrade-analysis option.

+

2. Config Pipeline

+

Update the pipeline template file for customization if needed. Then create the pipeline config file +pipeline.json with this command.

+
$ disco create-pipeline config pipeline-template.toml
+
+
+

3. Submit Pipeline

+

Submit the pipeline with JADE:

+
$ jade pipeline submit pipeline.json
+
+
+

If everything succeeds, it produces the same aggregated upgrade tables in output-stage1.

+
+
+

Generic Workflow

+

Let’s assume that you have multiple networks defined in OpenDSS model files where each network has +its own Master.dss.

+
    +
  • ./custom_models/model1/Master.dss

  • +
  • ./custom_models/model2/Master.dss

  • +
+
+

Single Execution Mode

+

1. Configure the simulation parameters in an input JSON file called upgrades.json. Refer to this file as an example. The JSON schemas are defined in Upgrade Cost Analysis JSON Schemas.

+

Each job represents one OpenDSS network and one upgrade simulation.

+
    +
  2. Run the simulation.

  2. +
+
$ disco upgrade-cost-analysis run upgrades.json
+
+
+

Refer to disco upgrade-cost-analysis run --help for additional options.

+
+
+

Parallel Execution Mode through JADE

+
    +
  1. Configure upgrades.json as described in the previous step.

  2. +
  2. Create the JADE configuration file.

  4. +
+
$ disco upgrade-cost-analysis config upgrades.json
+
+
+
    +
  3. Modify the generated config.json if necessary.

  2. +
  4. Run the jobs through JADE. This will aggregate results across all jobs. This example assumes local-mode execution.

  4. +
+
$ jade submit-jobs --local config.json
+
+
+
+
+
+

Technical Details

+

The automated upgrades module consists of three components as shown in the figure: it performs traditional infrastructure upgrades to resolve both thermal and voltage violations, +and then computes the costs associated with each of those upgrades.

+../_images/upgrades.png +

A high-level overview of thermal and voltage upgrade considerations is shown below:

+../_images/thermal_upgrades.png +../_images/voltage_upgrades.png +

1. Thermal Upgrades Workflow

+

In this sub-module, the thermal equipment (lines and transformers) violations are identified, and upgrades are determined as per the flowchart given below.

+../_images/thermal_workflow.png +

The technical equipment database is a catalog of available lines and transformers and can optionally be provided as an input. All the equipment in this database is considered as available options when determining thermal upgrades. If this file is not provided, a technical database is automatically generated from the given feeder model, which gives the thermal upgrades module a more limited set of upgrade options. Refer to this sample technical equipment catalog for more information.

+

For overloaded equipment, if a higher-rated equipment of similar configuration is available in the technical catalog, it is chosen as the upgrade. Otherwise, equipment of similar configuration is added in parallel to resolve the observed violations. Sometimes, extreme thermal equipment overloads can also cause voltage issues, so thermal upgrades may also resolve some undervoltage violations.
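
The selection logic described above can be summarized in a short Python sketch. This is only an illustration of the decision rule, not DISCO's implementation; the catalog is reduced to a list of kVA ratings for brevity.

+
import math
+
+def choose_thermal_upgrade(required_kva, existing_kva, catalog_kvas, parallel_limit=4):
+    """Illustrative sketch: prefer the smallest adequately rated option of similar
+    configuration from the catalog; otherwise add identical units in parallel,
+    up to the configured parallel equipment limit."""
+    adequate = sorted(kva for kva in catalog_kvas if kva >= required_kva)
+    if adequate:
+        return {"action": "replace", "new_kva": adequate[0]}
+    n_units = math.ceil(required_kva / existing_kva)
+    if n_units > parallel_limit:
+        raise RuntimeError("Parallel equipment limit exceeded (see return codes 115/116)")
+    return {"action": "add_parallel", "units": n_units}
+
+# Example: a 500 kVA transformer loaded to 800 kVA with a sparse catalog.
+print(choose_thermal_upgrade(800, 500, catalog_kvas=[300, 500]))
+
+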

+

2. Voltage Upgrades Workflow

+

In this sub-module, the voltage violations present in the feeder are identified and resolved as shown in the flowchart below:

+../_images/voltage_workflow.png +

a. Existing Capacitors:

+
    +
  • If capacitors are present

    +
    +
      +
    • If a capacitor control is present for a capacitor, its control parameters (i.e., the PT ratio) are checked and corrected if needed

    • +
    • If capacitor control is present, it is changed to voltage-controlled (if it is of any other kind)

    • +
    • If capacitor control is not present, voltage-controlled capacitor control is added and default control settings are applied to any newly added controller

    • +
    +
    +
  • +
  • A settings sweep is performed through all capacitor settings, and the setting with the least number of violations is chosen. If the initial settings are best, no changes are made. In the capacitor settings sweep method, the same settings are applied to all capacitors (see the sketch after this list).

  • +
+
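
A minimal sketch of the sweep logic referenced above is shown below. It only illustrates "apply a common setting to every capacitor, keep the setting with the fewest violations, and keep the initial setting on ties"; in a real run the violation count would come from re-solving the feeder, and this is not DISCO's implementation.

+
def sweep_capacitor_settings(candidate_settings, initial_setting, count_violations):
+    """Illustrative sketch, not DISCO's implementation."""
+    best_setting, best_count = initial_setting, count_violations(initial_setting)
+    for setting in candidate_settings:
+        n = count_violations(setting)
+        if n < best_count:  # strict comparison: ties keep the initial setting
+            best_setting, best_count = setting, n
+    return best_setting
+
+# Example with a stand-in violation counter.
+violations = {118.0: 12, 119.0: 5, 120.0: 5, 121.0: 9}
+print(sweep_capacitor_settings(violations, 120.0, violations.get))  # -> 120.0
+
+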

b. Existing Regulators:

+
    +
  • If voltage regulators are present, regulator control parameters (like ptratio) are corrected (if needed), including for substation LTC.

  • +
  • +
    A settings sweep is performed for existing regulator control devices (excluding substation LTC).
      +
    • In this settings sweep method, the same settings are applied to all regulators

    • +
    +
    +
    +
  • +
+

c. Add new Regulator:

+
    +
  • A new regulator is added by clustering nearby buses with violations and testing regulator placement (one at a time) on each of the common upstream nodes. The placement option with the least number of violations is chosen (see the sketch after this list).

  • +
+
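
An illustrative sketch of this placement search is shown below (not DISCO's implementation). It assumes the clustering and the per-placement violation counts are already available; a real run would re-solve the feeder for each trial placement.

+
def place_new_regulators(violation_bus_clusters, common_upstream_nodes, violations_with_reg_at):
+    """For each cluster of nearby buses with violations, try a regulator on each
+    common upstream node (one at a time) and keep the node with the fewest violations."""
+    placements = []
+    for cluster in violation_bus_clusters:
+        candidates = common_upstream_nodes(cluster)
+        placements.append(min(candidates, key=violations_with_reg_at))
+    return placements
+
+# Example with stand-in data.
+upstream = {"bus7": {"node1", "node3"}, "bus8": {"node1", "node3"}, "bus21": {"node5"}}
+common = lambda cluster: sorted(set.intersection(*(upstream[b] for b in cluster)))
+violations_if_placed_at = {"node1": 9, "node3": 2, "node5": 0}
+print(place_new_regulators([["bus7", "bus8"], ["bus21"]], common, violations_if_placed_at.get))
+
+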

3. Upgrades Cost Computation
A unit cost database is used to determine the total costs associated with the thermal and voltage upgrades determined through the workflows described above. A sample input cost database can be found here

+
+

Input parameters

+

In order to run this simulation, the following inputs are needed. For required fields, example inputs are provided, and for optional parameters, default inputs are shown.

+

1. Thermal Upgrade Inputs

+

The input parameters for thermal upgrades are shown in the table below. For required fields, example inputs are provided; for optional parameters, default values are shown.

+
Thermal Upgrade Inputs

Input Parameter

Type

Description

Required/Optional

Default/Example

transformer_upper_limit

float

Transformer upper limit in per unit

Required

1.25

line_upper_limit

float

Line upper limit in per unit

Required

1.25

line_design_pu

float

Line design in per unit

Required

0.75

transformer_design_pu

float

Transformer design in per unit

Required

0.75

voltage_upper_limit

float

Voltage upper limit in per unit

Required

1.05

voltage_lower_limit

float

Voltage lower limit in per unit

Required

0.95

read_external_catalog

bool

Flag to determine whether external catalog is to be used

Required

FALSE

external_catalog

str

Path to the external upgrades technical catalog JSON file. Can be an empty string if read_external_catalog is False

Required

“”

create_plots

bool

Flag to enable or disable figure creation

Optional

TRUE

parallel_transformers_limit

int

Parallel transformer limit

Optional

4

parallel_lines_limit

int

Parallel lines limit

Optional

4

upgrade_iteration_threshold

int

Upgrade iteration threshold

Optional

5

timepoint_multipliers

dict

Dictionary to provide timepoint multipliers

Optional

None

+

2. Voltage Upgrade Inputs

+

The input parameters for voltage upgrades are shown in the table below. A small unit-conversion example follows the table.

+
Voltage Upgrade Inputs

Input Parameter

Type

Description

Required/Optional

Default/Example

initial_upper_limit

float

Initial upper limit in per unit

Required

1.05

initial_lower_limit

float

Initial lower limit in per unit

Required

0.95

final_upper_limit

float

Final upper limit in per unit

Required

1.05

final_lower_limit

float

Final lower limit in per unit

Required

0.95

nominal_voltage

float

Nominal voltage (volts)

Required

120

create_plots

bool

Flag to enable or disable figure creation

Optional

TRUE

capacitor_sweep_voltage_gap

float

Capacitor sweep voltage gap

Optional

1

reg_control_bands

list(int)

Regulator control bands

Optional

[1,2]

reg_v_delta

float

Regulator voltage delta

Optional

0.5

max_regulators

int

Maximum number of new regulators that can be placed

Optional

4

place_new_regulators

bool

Flag to enable or disable new regulator placement

Optional

FALSE

use_ltc_placement

bool

Flag to enable or disable substation LTC upgrades module

Optional

FALSE

capacitor_action_flag

bool

Flag to enable or disable capacitor controls settings sweep module

Optional

TRUE

existing_regulator_sweep_action

bool

Flag to enable or disable existing regulator controls settings sweep module

Optional

TRUE

timepoint_multipliers

dict

Dictionary to provide timepoint multipliers

Optional

None

+
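
As a quick check on the limits above, per-unit thresholds translate directly to volts on the default 120 V base (simple arithmetic, shown here as Python):

+
nominal_voltage = 120.0
+# With final limits of 0.95 / 1.05 per unit, the acceptable voltage band is:
+print(0.95 * nominal_voltage, "to", 1.05 * nominal_voltage, "volts")  # 114.0 to 126.0 volts
+
+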

3. Simulation Input Parameters

+

In addition to the thermal and voltage input parameters, there are a few other simulation parameters which need to be provided.

+
Simulation input parameters

Input Parameter

Type

Description

Required/Optional

Default/Example

upgrade_cost_database

string

Path to the unit upgrade cost database

Required

pydss_controllers

PyDSSControllerModel

If enable_pydss_controllers is True, these PyDSS controllers are applied to each corresponding element type

Required

enable_pydss_controllers

bool

Flag to enable/disable use of PyDSS controllers

Required

FALSE

include_pf1

bool

Include PF1 scenario (no controls) if pydss_controllers are defined.

Required

FALSE

dc_ac_ratio

float

Apply DC-AC ratio for PV Systems

Optional

None

+
+
+

Outputs

+

1. Costs

+
Output costs

Output Parameter

Type

Description

name

str

Job name

type

str

Equipment type

count

str

Count of upgraded equipment

total_cost_usd

float

Total cost in US dollars

+

2. Summary

+
Output summary

Output Parameter

Type

Description

name

str

Job name that produced the result

upgrade_type

str

Type of upgrade: thermal or voltage

scenario

str

Simulation scenario describing the controls being used

stage

str

Stage of upgrades: Initial (before upgrades) or Final (after upgrades)

simulation_time_s

float

Simulation time to perform upgrades (seconds). This will be present when stage=Final

thermal_violations_present

bool

Flag indicating whether thermal violations are present

voltage_violations_present

bool

Flag indicating whether voltage violations are present

max_bus_voltage

float

Maximum voltage recorded on any bus

min_bus_voltage

float

Minimum voltage recorded on any bus

num_voltage_violation_buses

int

Number of buses with voltage violations

num_overvoltage_violation_buses

int

Number of buses with voltage above voltage_upper_limit

voltage_upper_limit

float

Voltage upper limit, the threshold considered for determining overvoltages

num_undervoltage_violation_buses

int

Number of buses with voltage below voltage_lower_limit

voltage_lower_limit

float

Voltage lower limit, the threshold considered for determining undervoltages

max_line_loading

float

Maximum line loading

max_transformer_loading

float

Maximum transformer loading

num_line_violations

int

Number of lines with loading above line upper limit

line_upper_limit

float

Line upper limit, the threshold considered for determining line overloading

num_transformer_violations

int

Number of transformers with loading above transformer upper limit

transformer_upper_limit

float

Transformer upper limit, the threshold considered for determining transformer overloading

+
+
+

Example

+

For a feeder with thermal and voltage violations, the following figures show the violations before and after upgrades.

+../_images/feeder.png +

1. Thermal Upgrades

+

The following figures show the thermal violations in a feeder before and after thermal upgrades:

+../_images/thermalbefore_thermalupgrades.png +../_images/thermalafter_thermalupgrades.png +

The following figures show the voltage violations in a feeder before and after thermal upgrades:

+../_images/voltagebefore_thermalupgrades.png +../_images/voltageafter_thermalupgrades.png +

2. Voltage Upgrades

+

The following figures show the voltage violations in a feeder before and after voltage upgrades:

+../_images/voltagebefore_voltageupgrades.png +../_images/voltageafter_voltageupgrades.png +
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/build-docs.html b/build-docs.html new file mode 100644 index 00000000..e4b6ff48 --- /dev/null +++ b/build-docs.html @@ -0,0 +1,194 @@ + + + + + + + Build docs — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Build docs

+

To build the docs locally, use the commands below:

+
$ cd docs
+
+# Rebuild the API pages on first build or if code has changed.
+$ rm -rf source/disco
+$ sphinx-apidoc -o source/disco ../disco
+
+$ make html
+
+
+

To publish the updated docs on GitHub Pages, use these commands:

+
$ cd docs
+$ make github
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/data-ingestion.html b/data-ingestion.html new file mode 100644 index 00000000..49e1465e --- /dev/null +++ b/data-ingestion.html @@ -0,0 +1,230 @@ + + + + + + + Data Ingestion — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Data Ingestion

+

DISCO can ingest simulation metrics and analysis results into a SQLite database, which facilitates data sharing between researchers and data queries for further investigation.

+
+

Ingest to New Database

+

Suppose we are assigned a task which requires us to run a DISCO pipeline for +static hosting capacity analysis and the generated pipeline output directory +is /data/snapshot-output/.

+

Run the command below to ingest the results into a database.

+
$ disco ingest-tables --task-name "SFO P1U Snapshot" --model-inputs /data/input-models/ --database=test.sqlite /data/snapshot-output/
+
+
+

It will create a database named test.sqlite with data tables as shown below:

+_images/db-tables.png +
+
+

Ingest to Existing Database

+

Now we are assigned a second task and need to run a DISCO pipeline for dynamic hosting capacity analysis. We generated the output directory /data/time-series-output. Again, we would like to ingest the data into a database.

+

We have two choices for data ingestion:

+
    +
  1. Ingest the results into a new database, let’s say, data.sqlite.

  2. +
  3. Ingest the results into an existing database, for example, the one we created above test.sqlite.

  4. +
+

If you choose option 1, just run the command above with a new --database value specified. Here, we choose option 2 and ingest the results of the second task into the existing database, test.sqlite, created above. To do this, we must assign --task-name a different value; otherwise, the data ingestion is blocked, because the task name for each ingestion must be unique.

+
$ disco ingest-tables --task-name "SFO P1U Time-series" --model-inputs /data/input-models/ --database=test.sqlite /data/time-series-output/
+
+
+
+

Note

+

Task names must be unique. It’s recommended to use a naming convention like this: <geography> <simulation_type>.

+
+
+
+

Run Database Queries

+

Create a database connection:

+
import sqlite3
+conn = sqlite3.connect("test.sqlite")
+
+
+

Run a SQL query with pandas:

+
import pandas as pd
+query1 = "SELECT * FROM task"
+df = pd.read_sql_query(query1, conn)
+
+
+
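
If you do not remember which tables the ingestion created (see the figure above), you can list them first with a standard SQLite query; a minimal sketch, reusing conn and pd from the snippets above:

+
tables = pd.read_sql_query(
+    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name", conn
+)
+print(tables)
+
+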

For more query examples, please refer to the Jupyter notebook in this repository db-query.ipynb.

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/data-sources.html b/data-sources.html new file mode 100644 index 00000000..852ae021 --- /dev/null +++ b/data-sources.html @@ -0,0 +1,739 @@ + + + + + + + Data Sources — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Data Sources

+

DISCO currently supports OpenDSS models stored in several source formats, namely Generic Models, SourceTree1, SourceTree2, GEM, and EPRI.

+

The following sections show how to +prepare the source feeder models which are used as input paths for +transforming models with a given analysis type.

+
+

Generic Power Flow Models

+

You can use this format to run power-flow simulations on your own OpenDSS models. Unlike simulations run with the other formats, DISCO will not make any dynamic changes to the models (such as applying a DC-AC ratio to PVSystems).

+

Refer to these input JSON files as examples:

+ +

This test file +demonstrates the workflow.

+
+

Note

+

If you enable external controls for PVSystems through PyDSS then the file specified as +opendss_model_file must contain the PVSystem definitions.

+
+

The inputs must conform to the JSON schemas below.

+
+

PowerFlowSnapshotSimulationModel

+
{
+  "title": "PowerFlowSimulationBaseModel",
+  "description": "Defines a snapshot power-flow simulation.",
+  "type": "object",
+  "properties": {
+    "jobs": {
+      "title": "jobs",
+      "description": "Jobs to run as part of the simulation.",
+      "type": "array",
+      "items": {
+        "$ref": "#/definitions/PowerFlowGenericModel"
+      }
+    },
+    "include_control_mode": {
+      "title": "include_control_mode",
+      "description": "Include a control mode (such as volt-var controls) scenario for each job.",
+      "default": true,
+      "type": "boolean"
+    },
+    "include_pf1": {
+      "title": "include_pf1",
+      "description": "Include a Power Factor 1 scenario for each job.",
+      "default": true,
+      "type": "boolean"
+    },
+    "model_type": {
+      "title": "Model Type",
+      "default": "PowerFlowSnapshotSimulationModel",
+      "type": "string"
+    },
+    "start_time": {
+      "title": "start_time",
+      "description": "Start time of simulation. May be overridden by auto-time-point-selection.",
+      "default": "2020-04-15 14:00:00",
+      "type": "string",
+      "format": "date-time"
+    }
+  },
+  "required": [
+    "jobs"
+  ],
+  "additionalProperties": false,
+  "definitions": {
+    "ControllerType": {
+      "title": "ControllerType",
+      "description": "An enumeration.",
+      "enum": [
+        "FaultController",
+        "GenController",
+        "MotorStall",
+        "MotorStallSimple",
+        "PvController",
+        "PvDynamic",
+        "PvFrequencyRideThru",
+        "PvVoltageRideThru",
+        "SocketController",
+        "StorageController",
+        "ThermostaticLoad",
+        "xmfrController"
+      ]
+    },
+    "PyDSSControllerModel": {
+      "title": "PyDSSControllerModel",
+      "description": "PV Controller on deployment",
+      "type": "object",
+      "properties": {
+        "controller_type": {
+          "title": "controller_type",
+          "description": "The controller type defined in PyDSS.",
+          "example_value": "PvController",
+          "allOf": [
+            {
+              "$ref": "#/definitions/ControllerType"
+            }
+          ]
+        },
+        "name": {
+          "title": "name",
+          "description": "The name of the controller",
+          "maxLength": 120,
+          "example_value": "ctrl-name",
+          "type": "string"
+        },
+        "targets": {
+          "title": "targets",
+          "description": "The PV system files that need to apply controller.",
+          "anyOf": [
+            {
+              "type": "string",
+              "format": "file-path"
+            },
+            {
+              "type": "array",
+              "items": {
+                "type": "string",
+                "format": "file-path"
+              }
+            }
+          ]
+        }
+      },
+      "required": [
+        "controller_type",
+        "name"
+      ]
+    },
+    "PowerFlowGenericModel": {
+      "title": "PowerFlowGenericModel",
+      "description": "Parameters for each job in a power-flow simulation",
+      "type": "object",
+      "properties": {
+        "model_type": {
+          "title": "model_type",
+          "description": "Model type",
+          "default": "PowerFlowGenericModel",
+          "type": "string"
+        },
+        "name": {
+          "title": "name",
+          "description": "Unique name identifying the job",
+          "type": "string"
+        },
+        "blocked_by": {
+          "title": "blocked_by",
+          "description": "Names of jobs that must finish before this job starts",
+          "default": [],
+          "type": "array",
+          "items": {
+            "type": "string"
+          },
+          "uniqueItems": true
+        },
+        "job_order": {
+          "title": "job_order",
+          "description": "The execution order of the simulation job.",
+          "anyOf": [
+            {
+              "type": "integer",
+              "minimum": 0.0
+            },
+            {
+              "type": "number",
+              "minimum": 0.0
+            }
+          ]
+        },
+        "opendss_model_file": {
+          "title": "opendss_model_file",
+          "description": "Path to file used load the simulation model files",
+          "type": "string"
+        },
+        "estimated_run_minutes": {
+          "title": "estimated_run_minutes",
+          "description": "Optionally advises the job execution manager on how long the job will run",
+          "type": "integer"
+        },
+        "substation": {
+          "title": "substation",
+          "description": "Substation for the job",
+          "type": "string"
+        },
+        "feeder": {
+          "title": "feeder",
+          "description": "Feeder for the job",
+          "type": "string"
+        },
+        "pydss_controllers": {
+          "title": "pydss_controllers",
+          "description": "Apply these PyDSS controllers to each corresponding element type. If empty, use pf1.",
+          "default": [],
+          "type": "array",
+          "items": {
+            "$ref": "#/definitions/PyDSSControllerModel"
+          }
+        },
+        "project_data": {
+          "title": "project_data",
+          "description": "Optional user-defined metadata for the job",
+          "type": "object"
+        }
+      },
+      "required": [
+        "name",
+        "opendss_model_file",
+        "project_data"
+      ]
+    }
+  }
+}
+
+
+
+
+

PowerFlowTimeSeriesSimulationModel

+
{
+  "title": "PowerFlowSimulationBaseModel",
+  "description": "Defines a time-series power-flow simulation.",
+  "type": "object",
+  "properties": {
+    "jobs": {
+      "title": "jobs",
+      "description": "Jobs to run as part of the simulation.",
+      "type": "array",
+      "items": {
+        "$ref": "#/definitions/PowerFlowGenericModel"
+      }
+    },
+    "include_control_mode": {
+      "title": "include_control_mode",
+      "description": "Include a control mode (such as volt-var controls) scenario for each job.",
+      "default": true,
+      "type": "boolean"
+    },
+    "include_pf1": {
+      "title": "include_pf1",
+      "description": "Include a Power Factor 1 scenario for each job.",
+      "default": true,
+      "type": "boolean"
+    },
+    "model_type": {
+      "title": "Model Type",
+      "default": "PowerFlowTimeSeriesSimulationModel",
+      "type": "string"
+    },
+    "start_time": {
+      "title": "start_time",
+      "description": "Start time of simulation.",
+      "default": "2020-01-01 00:00:00",
+      "type": "string",
+      "format": "date-time"
+    },
+    "end_time": {
+      "title": "end_time",
+      "description": "End time of simulation.",
+      "default": "2020-12-31 23:45:00",
+      "type": "string",
+      "format": "date-time"
+    },
+    "step_resolution": {
+      "title": "step_resolution",
+      "description": "Step resolution of simulation in seconds.",
+      "default": 900,
+      "type": "integer"
+    }
+  },
+  "required": [
+    "jobs"
+  ],
+  "additionalProperties": false,
+  "definitions": {
+    "ControllerType": {
+      "title": "ControllerType",
+      "description": "An enumeration.",
+      "enum": [
+        "FaultController",
+        "GenController",
+        "MotorStall",
+        "MotorStallSimple",
+        "PvController",
+        "PvDynamic",
+        "PvFrequencyRideThru",
+        "PvVoltageRideThru",
+        "SocketController",
+        "StorageController",
+        "ThermostaticLoad",
+        "xmfrController"
+      ]
+    },
+    "PyDSSControllerModel": {
+      "title": "PyDSSControllerModel",
+      "description": "PV Controller on deployment",
+      "type": "object",
+      "properties": {
+        "controller_type": {
+          "title": "controller_type",
+          "description": "The controller type defined in PyDSS.",
+          "example_value": "PvController",
+          "allOf": [
+            {
+              "$ref": "#/definitions/ControllerType"
+            }
+          ]
+        },
+        "name": {
+          "title": "name",
+          "description": "The name of the controller",
+          "maxLength": 120,
+          "example_value": "ctrl-name",
+          "type": "string"
+        },
+        "targets": {
+          "title": "targets",
+          "description": "The PV system files that need to apply controller.",
+          "anyOf": [
+            {
+              "type": "string",
+              "format": "file-path"
+            },
+            {
+              "type": "array",
+              "items": {
+                "type": "string",
+                "format": "file-path"
+              }
+            }
+          ]
+        }
+      },
+      "required": [
+        "controller_type",
+        "name"
+      ]
+    },
+    "PowerFlowGenericModel": {
+      "title": "PowerFlowGenericModel",
+      "description": "Parameters for each job in a power-flow simulation",
+      "type": "object",
+      "properties": {
+        "model_type": {
+          "title": "model_type",
+          "description": "Model type",
+          "default": "PowerFlowGenericModel",
+          "type": "string"
+        },
+        "name": {
+          "title": "name",
+          "description": "Unique name identifying the job",
+          "type": "string"
+        },
+        "blocked_by": {
+          "title": "blocked_by",
+          "description": "Names of jobs that must finish before this job starts",
+          "default": [],
+          "type": "array",
+          "items": {
+            "type": "string"
+          },
+          "uniqueItems": true
+        },
+        "job_order": {
+          "title": "job_order",
+          "description": "The execution order of the simulation job.",
+          "anyOf": [
+            {
+              "type": "integer",
+              "minimum": 0.0
+            },
+            {
+              "type": "number",
+              "minimum": 0.0
+            }
+          ]
+        },
+        "opendss_model_file": {
+          "title": "opendss_model_file",
+          "description": "Path to file used load the simulation model files",
+          "type": "string"
+        },
+        "estimated_run_minutes": {
+          "title": "estimated_run_minutes",
+          "description": "Optionally advises the job execution manager on how long the job will run",
+          "type": "integer"
+        },
+        "substation": {
+          "title": "substation",
+          "description": "Substation for the job",
+          "type": "string"
+        },
+        "feeder": {
+          "title": "feeder",
+          "description": "Feeder for the job",
+          "type": "string"
+        },
+        "pydss_controllers": {
+          "title": "pydss_controllers",
+          "description": "Apply these PyDSS controllers to each corresponding element type. If empty, use pf1.",
+          "default": [],
+          "type": "array",
+          "items": {
+            "$ref": "#/definitions/PyDSSControllerModel"
+          }
+        },
+        "project_data": {
+          "title": "project_data",
+          "description": "Optional user-defined metadata for the job",
+          "type": "object"
+        }
+      },
+      "required": [
+        "name",
+        "opendss_model_file",
+        "project_data"
+      ]
+    }
+  }
+}
+
+
+
+
+
+

SourceTree1 Model

+

This format requires the following directory structure:

+
source_model_directory
+├── format.toml
+├── <substation>
+│   ├── *.dss
+│   └── <substation>--<feeder>
+│       ├── *.dss
+│       └── hc_pv_deployments
+│           ├── feeder_summary.csv
+│           └── <placement>
+│               ├── <sample>
+│                  ├── <penetration-level>
+│                     └── PVSystems.dss
+│                     └── PVSystems.dss
+│                  └── pv_config.json
+└── profiles
+    └── <profile>.csv
+
+
+

In format.toml, the type is defined as type = "SourceTree1Model". To see how to generate the PV deployments data in the hc_pv_deployments directory, please refer to SourceTree1 PV Deployments.

+

The SMART-DS dataset is an open-source dataset in the SourceTree1 format. After some pre-processing, described in the link below, it can be used for DISCO hosting capacity analysis:

+ +
+
+

SourceTree2 Model

+

This format requires the following directory structure:

+
source_model_directory
+├── inputs
+│   ├── <feeder>
+│      ├── LoadShapes
+│         ├── <profile>.csv
+│      ├── OpenDSS
+│         ├── *.dss
+│      ├── PVDeployments
+│         └── new
+│             ├── <dc-ac-ratio>
+│                ├── <scale>
+│                   ├── <placement>
+│                      ├── <sample>
+│                         ├── PV_Gen_<sample>_<penetration-level>.txt
+├── format.toml
+
+
+

In format.toml, the type is defined as type = "SourceTree2Model".

+
+
+

GEM Model

+

A GEM config file (JSON) contains paths to source models on a filesystem along with a descriptor schema that describes all feeders and their possible deployments.

+

Here is an example JSON file:

+
{
+  "include_voltage_deviation": false,
+  "path_base": "gem/feeder_models",
+  "type": "GemModel",
+  "feeders": [
+    {
+      "base_case": "deployment0",
+      "deployments": [
+        {
+          "dc_ac_ratios": [],
+          "kva_to_kw_ratings": [],
+          "loadshape_file": null,
+          "loadshape_location": null,
+          "name": "deployment0",
+          "placement_type": null,
+          "project_data": {
+            "pydss_other_loads_dss_files": {
+              "2010-03-11_12-00-00-000": ["/data/path/loads1.dss"],
+              "2010-12-25_15-00-00-000": ["/data/path/loads2.dss"]
+            },
+            "pydss_other_pvs_dss_files": {
+              "2010-03-11_12-00-00-000": ["/data/path/pvs1.dss"],
+              "2010-12-25_15-00-00-000": ["/data/path/pvs2.dss"]
+            }
+          },
+          "pv_locations": [],
+          "sample": null,
+          "pydss_controllers": null,
+          "job_order": 0
+        }
+      ],
+      "end_time": "2010-08-11_15:00:00.000",
+      "simulation_type": "Snapshot",
+      "load_multipliers": [
+        0.3,
+        1.0,
+        0.2,
+        0.9
+      ],
+      "loadshape_location": null,
+      "name": "MyFeeder",
+      "opendss_location": "/opendss/location/path/",
+      "start_time": "2010-08-11_15:00:00.000",
+      "step_resolution": 900
+    }
+  ]
+}
+
+
+

Rules:

+
+
    +
  • start_time and end_time must be set with timestamps.

  • +
  • If simulation_type == "Snapshot", then start_time and end_time must be the same.

  • +
  • dc_ac_ratios, kva_to_kw_ratings may be empty arrays to represent no-PV scenarios.

  • +
  • pydss_controllers has three attributes,

    +
    +
      +
    • controller_type: One controller type defined in PyDSS, for example, “PvController”.

    • +
    • name: One controller name registered in PyDSS registry.

    • +
    • targets (optional): null, a DSS file, or a list of DSS files. If null, then DISCO automatically sets the deployment file.

    • +
    +
    +
  • +
+
+
+
+

EPRI Model

+

The source URL of EPRI J1, K1, and M1 feeder models is +https://dpv.epri.com/feeder_models.html. You can download the source data with +this command:

+
$ disco download-source epri J1 K1 M1 --directory ./epri-feeders
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/data-sources/smart-ds-model-preparation.html b/data-sources/smart-ds-model-preparation.html new file mode 100644 index 00000000..01c30795 --- /dev/null +++ b/data-sources/smart-ds-model-preparation.html @@ -0,0 +1,239 @@ + + + + + + + SMART-DS OpenDSS Model Preparation — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

SMART-DS OpenDSS Model Preparation

+

Hosting Capacity Analysis makes use of the OpenDSS models from the SMART-DS dataset. +More documentation on the open source SMART-DS datasets can be found at the SMART-DS website. +Pre-processing is performed on this dataset to prepare it for the analysis. The chart below shows the various stages of pre-processing performed on the SMART-DS OpenDSS Models.

+../_images/SMART-DS-flowchart.png +
+

Copy SMART-DS Dataset

+

The dataset can be copied to the analysis location using https://github.com/NREL/disco/blob/main/scripts/copy_smart_ds_dataset.py

+

Usage:

+
$ python ~/sandboxes/disco/scripts/copy_smart_ds_dataset.py -y 2018 -c SFO -v v1.0 /projects/distcosts3/SMART-DS
+
+
+

Here is the help:

+
$ python ~/sandboxes/disco/scripts/copy_smart_ds_dataset.py --help
+
+ Usage: copy_smart_ds_dataset.py [OPTIONS] OUTPUT_DIR
+
+   Copy a SMART-DS from the Eagle source directory to a destination directory.
+
+ Options:
+   -f, --force         overwrite output-dir if it exists
+   -c, --city TEXT     dataset city  [required]
+   -y, --year TEXT     dataset year  [required]
+   -v, --version TEXT  dataset version  [required]
+   --help              Show this message and exit.
+
+
+
+
+

Restructure to substation transformer

+

The SMART-DS dataset has OpenDSS models defined at the feeder and substation level. In this stage, the OpenDSS models are restructured so that the analysis can be performed at the substation transformer level. This can be performed using https://github.com/NREL/disco/blob/main/scripts/smartds_restructuring_transformer_folder.py

+

Usage:

+
$ python ~/sandboxes/disco/scripts/smartds_restructuring_transformer_folder.py BASE_PATH ORIGINAL_DATASET NEW_DATASET LIST_OF_REGIONS
+
+
+

Example:

+
$ python ~/sandboxes/disco/scripts/smartds_restructuring_transformer_folder.py /projects/distcosts3/SMART-DS/v1.0/2018 SFO SFO_xfmr P1U,P1R,P2U
+
+
+
+
+

Feeder screening & model fixes

+

In this stage, all base-case feeders are passed through a preliminary screening process. Disconnected nodes are removed, and the models are checked for connectivity, isolated nodes, and extreme cases of thermal or voltage violations. These issues must be addressed before proceeding to hosting capacity analysis. This can be done using https://github.com/NREL/Distribution-Integration-Analysis/blob/master/scripts-simulation/generate_screening_jobs.py

+

Usage:

+
$ python generate_screening_jobs.py PATH_TO_REGIONS
+
+
+

Example:

+
$ python generate_screening_jobs.py /projects/distcosts3/SMART-DS/v1.0/2018/SFO
+
+
+
+
+

Create PV deployments

+

In this stage, PV deployments are generated for hosting capacity analysis. There are 10 sample PV deployments for every placement type (close, random, far) at every 5% increment up to a 200% PV-to-load ratio. This can be done using DISCO; refer to the PV Deployments documentation.

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/debugging-issues.html b/debugging-issues.html new file mode 100644 index 00000000..65cb19b0 --- /dev/null +++ b/debugging-issues.html @@ -0,0 +1,372 @@ + + + + + + + Debugging Issues — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Debugging Issues

+

This page describes debugging techniques for issues encountered during the +simulation and analysis process. All of these tools produce output data in both +unstructured (.log) and structured form (.csv, .json, etc.). Aggregating data +from a batch with thousands of jobs will often require use of UNIX tools (find, +grep, awk, etc.) along with bash or Python scripts to process data in stages.

+
+

DISCO Return Codes

+

DISCO processes (snapshot, time-series, upgrades simulations) return these codes for known +conditions.

+

Return Code

Description

Corrective Action

0

Success

1

Generic error

114

The input configuration is invalid.

115

An Upgrades simulation exceeded the limit for parallel lines.

If not already done, enable ThermalUpgradeParamsModel.read_external_catalog and provide an external catalog. Or, increase ThermalUpgradeParamsModel.parallel_lines_limit to allow more parallel equipment to be placed to resolve thermal violations.

116

An Upgrades simulation exceeded the limit for parallel transformers.

If not already done, enable ThermalUpgradeParamsModel.read_external_catalog and provide an external catalog. Or, increase ThermalUpgradeParamsModel.parallel_transformers_limit to allow more parallel equipment to be placed to resolve thermal violations.

117

An OpenDSS element has unexpected properties.

Check the error message and fix the OpenDSS element definitions.

118

OpenDSS failed to compile a model.

Check the error message and fix the OpenDSS model definitions.

119

OpenDSS failed to converge.

Check the OpenDSS model. Also, refer to the OpenDSS manual to vary settings for convergence.

120

PyDSS failed to find a solution in its external controls.

121

PyDSS external controls exceeded the threshold for error counts.

122

PyDSS external controls exceeded the max tolerance error threshold.

123

An Upgrades simulation requires an external catalog for thermal upgrades in order to add or upgrade a component.

Provide an external catalog or disable ThermalUpgradeParamsModel.read_external_catalog.

124

The Upgrades external catalog does not define a required object.

Ensure the external catalog defines all required objects. Refer to the error message for specific details.

125

An Upgrades simulation detected an invalid increase in violations.

This could happen in cases when lines or transformers are extremely overloaded. Check and modify OpenDSS model for such instances.

+
+
+

Using JADE

+

Please refer to JADE documentation - +https://nrel.github.io/jade/tutorial.html#debugging

+

Note that if you need result status in structured form, such as if you want to +find all failed jobs, refer to <output-dir>/results.json.

+
+
+

Using PyDSS

+

DISCO creates a PyDSS project directory for each simulation job. The directory +will have the following contents:

+
    +
  • project.zip

  • +
  • store.h5

  • +
+

When running on an HPC the directory contents will always be zipped because +huge numbers of directories can be problematic for the shared filesystem.

+

Here is example content of an extracted job:

+
$ find output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project
+
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/DSSfiles
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/DSSfiles/deployment.dss
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Exports
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Exports/control_mode
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Logs
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Logs/pydss.log
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Logs/pydss_project__control_mode__reports.log
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Scenarios
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Scenarios/control_mode
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Scenarios/control_mode/ExportLists/Exports.toml
+output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss-project/Scenarios/control_mode/pyControllerList/PvControllers.toml
+
+
+

To debug a problem you can unzip the contents. However, this can be problematic +if you need to inspect lots of jobs. You may be better off using a tool like +Vim that lets you view compressed files in place.

+

You can also use zipgrep to search specific files within the .zip for +patterns. This is extremely helpful if you need to inspect many jobs. This tool +uses egrep so you may need to consult help from both locations to customize +searches.

+
+

Errors

+

All errors get logged in pydss.log. Look there for problems reported by +OpenDSS.

+
+
+

Searching for errors

+
+

Here is an example of searching for a pattern without unzipping:

+
+
$ zipgrep "Convergence error" output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss_project/project.zip Logs/pydss.log
+
+
+

Here is an example that searches all jobs:

+
$ for x in `find output/job-outputs -name project.zip`; do echo "$x"; zipgrep "Convergence error" $x Logs/pydss.log; done
+
+
+

You will likely want to redirect that command’s output to another file for +further processing (or pipe it to another command).

+
+
+

Convergence errors

+

PyDSS creates a report showing each instance of a convergence error for a PV +controller. An example name of this file is +pydss_project__control_mode__reports.log. This file contains line-delimited +JSON objects. This means that each line is valid JSON but the entire file is +not.

+

Here is an example of one line of the file pretty-printed as JSON:

+
{
+  "Report": "Convergence",
+  "Scenario": "control_mode",
+  "Time": 523800,
+  "DateTime": "2020-01-07 01:30:00",
+  "Controller": "pyCont_PVSystem_small_p1ulv32837_1_2_pv",
+  "Controlled element": "PVSystem.small_p1ulv32837_1_2_pv",
+  "Error": 0.00241144335086263,
+  "Control algorithm": "VVar"
+}
+
+
+

Here are some example commands to convert the file to JSON. This example uses +an excellent 3rd-party JSON-parsing tool called jq which you have to +install. (On Eagle: conda install -c conda-forge jq). You may have a +different method.

+
$ zipgrep -h Convergence output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/pydss_project/project.zip Logs/pydss_project__control_mode__reports.log | jq . -s
+
+
+

Note: That command used -h to suppress the filename from the output.

+

This next command will do the same for all jobs. Note that it loses the association between job and error. You would need to do some extra work to keep those associations.

+
$ for x in `find output/job-outputs -name project.zip`; do zipgrep -h "Convergence" $x Logs/pydss_project__control_mode__reports.log; done | jq . -s
+
+
+
+

Warning

+

Be aware of how much CPU and memory will be consumed by these +operations. You may want to redirect this output to a temporary text file +first.

+
+

In both cases you will probably want to redirect the output to a JSON file for +further processing.
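
If you prefer Python over jq, the line-delimited report can also be read straight out of project.zip; here is a minimal sketch (the archive and member paths follow the example job above):

+
import json
+import zipfile
+
+zip_path = ("output/job-outputs/p1uhs9_1247__p1udt6854__random__9__100/"
+            "pydss_project/project.zip")
+report = "Logs/pydss_project__control_mode__reports.log"
+
+# Each line of the report is an independent JSON object.
+with zipfile.ZipFile(zip_path) as zf, zf.open(report) as f:
+    records = [json.loads(line) for line in f if line.strip()]
+print(len(records), "convergence records")
+
+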

+
+
+

Running searches in parallel

+

The DISCO repository has a script that extracts data from project.zip with +the Python multiprocessing library. You can use this as an example to speed up +large searches. Do not run this kind of search on an HPC login node.
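
A stripped-down version of that pattern might look like the sketch below; it is illustrative only, so see the referenced script for the real implementation.

+
import multiprocessing
+import zipfile
+from pathlib import Path
+
+def search_zip(zip_path, pattern=b"Convergence error", member="Logs/pydss.log"):
+    """Return (zip_path, matching lines) for one job's project.zip."""
+    try:
+        with zipfile.ZipFile(zip_path) as zf, zf.open(member) as f:
+            return zip_path, [line for line in f if pattern in line]
+    except KeyError:  # the member is missing from this archive
+        return zip_path, []
+
+if __name__ == "__main__":
+    zips = sorted(Path("output/job-outputs").rglob("project.zip"))
+    with multiprocessing.Pool() as pool:
+        for path, hits in pool.map(search_zip, zips):
+            if hits:
+                print(path, len(hits))
+
+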

+

Refer to disco/cli/make_summary_tables.py.

+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/genindex.html b/genindex.html new file mode 100644 index 00000000..8aae30fe --- /dev/null +++ b/genindex.html @@ -0,0 +1,175 @@ + + + + + + Index — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/index.html b/index.html new file mode 100644 index 00000000..b613e393 --- /dev/null +++ b/index.html @@ -0,0 +1,251 @@ + + + + + + + DISCO Documentation — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

DISCO Documentation

+

DISCO (Distribution Integration Solution Cost Options) is an NREL-developed, +python-based software tool for conducting scalable, repeatable distribution analyses. +While DISCO was originally developed to support photovoltaic (PV) impact analyses, +it can also be used to understand the impact of other distributed energy resources (DER) +and load changes on distribution systems. Analysis modules currently included in DISCO are:

+
    +
  • Snapshot hosting capacity analysis, in which hosting capacity is based on a +traditional definition of if operating thresholds are exceeded for +worst-case/bounding snapshots in time

  • +
  • Snapshot impact analysis, which calculates the same impact metrics as +hosting capacity, but for specific user-defined PV deployment scenarios

  • +
  • Dynamic hosting capacity analysis, in which hosting capacity is calculated using quasi-static time-series (QSTS) simulations and dynamic impact metrics for voltage and thermal loading. PV curtailment, number of device (voltage regulator, capacitor switch) operations, and energy losses are also calculated as part of this analysis because excessive PV curtailment, increases in device operations and associated replacement cost increases, and energy losses can also serve to limit how much PV can be economically interconnected to a given feeder.

  • +
  • Dynamic impact analysis, which is to dynamic hosting capacity analysis as +snapshot impact analysis is to snapshot hosting capacity analyses.

  • +
+

DISCO analysis is based on power flow modeling with OpenDSS used as the simulation engine. +PyDSS (https://nrel.github.io/PyDSS) is used to interface with OpenDSS +and provide additional control layers.

+

The benefit of using DISCO instead of just directly using OpenDSS or PyDSS is two-fold:

+
    +
  • DISCO provides the infrastructure required to run a large number of analyses +by managing job submission and execution through JADE (https://nrel.github.io/jade/).

  • +
  • DISCO provides ready-made, tested code to calculate snapshot and dynamic impact +metrics, allowing for repeatable analyses across projects and teams without +having to re-create code to process OpenDSS results.

  • +
+

Examples of how DISCO has been or is currently being used are:

+
    +
  • Evaluating curtailment risk associated with using advanced inverter controls +and flexible interconnection options for PV grid integration on 100’s of circuits.

  • +
+

DISCO does not yet have the ability to conduct end-to-end techno-economic analyses +of different distribution integration solutions, including looking at impact on +customer bills, utility revenue, or the economic impact to customers and utilities +of reduced electricity demand. Thus, this is not a tool for comprehensive +techno-economic analysis.

+ +
+

Contribution

+ +
+
+

Indices and tables

+ +
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/installation.html b/installation.html new file mode 100644 index 00000000..fc727340 --- /dev/null +++ b/installation.html @@ -0,0 +1,232 @@ + + + + + + + Installation — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Installation

+

We recommend that you install DISCO in a virtual environment such as Conda.

+
+

Conda Installation

+
    +
  1. Create a Conda virtual environment. This example uses the name disco +as a convention.

  2. +
+
$ conda create -n disco python=3.10
+$ conda activate disco
+
+
+

Optional: Install extra packages.

+
$ conda install ipython
+
+
+
    +
  1. Install DISCO from the PyPi repository.

  2. +
+
$ pip install NREL-disco
+
+
+

Known Windows installation problem: DISCO requires PyDSS which requires the +Shapely package. In some cases Shapely will fail to install. +pip will report an error about geos_c.dll. Install it from conda and then +retry.

+
$ conda install shapely
+
+
+

Then retry the DISCO installation command.

+
    +
  1. If you will run your jobs through JADE, install DISCO’s extensions.

  2. +
+
$ disco install-extensions
+
+
+

Now, the Conda environment disco is ready to use. +To deactivate it, run the command below:

+
$ conda deactivate
+
+
+
+
+

Developer Installation

+

Follow these instructions if you will be developing DISCO code and running tests.

+
$ git clone https://github.com/NREL/disco.git
+$ cd disco
+$ pip install -e '.[dev]'
+
+# Run this command if your git version is lower than 2.19.
+$ conda install git">=2.19"
+
+
+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/license.html b/license.html new file mode 100644 index 00000000..9db8ab68 --- /dev/null +++ b/license.html @@ -0,0 +1,178 @@ + + + + + + + License — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + + + + + \ No newline at end of file diff --git a/objects.inv b/objects.inv new file mode 100644 index 00000000..accfb1f1 Binary files /dev/null and b/objects.inv differ diff --git a/overview.html b/overview.html new file mode 100644 index 00000000..b3f76061 --- /dev/null +++ b/overview.html @@ -0,0 +1,234 @@ + + + + + + + Overview — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Overview

+

This section gives an overview of DISCO and its workflows.

+

DISCO can be used for distribution grid simulation and analysis. The analysis types are:

+
    +
  • Snapshot Impact Analysis

  • +
  • Static Hosting Capacity Analysis

  • +
  • Time Series Impact Analysis

  • +
  • Dynamic Hosting Capacity Analysis

  • +
+

The diagram below shows the DISCO workflow:

+_images/DISCO-Workflow.png +

As shown from the diagram, the main steps to run an analysis workflow are:

+
    +
  • Prepare the OpenDSS models with a given data source.

  • +
  • Transform the source OpenDSS models into DISCO models.

  • +
  • Configure JADE jobs with the DISCO models.

  • +
  • Run the jobs with JADE.

  • +
+
+

Data Sources

+

DISCO supports OpenDSS models in several data formats:

+
    +
  1. Generic Model, this format supports any user-defined OpenDSS model - Generic Power Flow Models.

  2. +
  3. SourceTree1 Model, this format requires directory structure tree1 defined by DISCO - SourceTree1 Model.

  4. +
  5. SourceTree2 Model, this format requires directory structure tree2 defined by DISCO - SourceTree2 Model.

  6. +
  7. GEM Model, Grid-connected Energy systems Modeling

  8. +
  9. EPRI Model - J1, K1, and M1, https://dpv.epri.com/feeder_models.html

  10. +
+
+
+

Transform Model

+

Given an analysis type, the source OpenDSS models need to be transformed into +DISCO models which then can be used as inputs for configuring JADE jobs.

+
+

Note

+

The Generic Model workflows do not use this step. DISCO uses those models directly.

+
+
+
+

Config Jobs

+

DISCO configures JADE jobs from standard DISCO models for specific analysis +types. The output is a configuration JSON file.

+
+
+

Submit Jobs

+

JADE parallelizes execution of the jobs on a local computer or an HPC. +Execution on an HPC is highly configurable depending on the job resource +requirements.

+
+
+

Result Analysis

+

After jobs complete, JADE can assist with analysis by showing summaries of individual job status, errors and events, job execution times, and compute resource utilization statistics.

+
+
+ + +
+
+ +
+
+
+
+ + + + \ No newline at end of file diff --git a/pipelines.html b/pipelines.html new file mode 100644 index 00000000..f57bcefe --- /dev/null +++ b/pipelines.html @@ -0,0 +1,434 @@ + + + + + + + Pipelines — DISCO 0.1 documentation + + + + + + + + + + + + + + + + + + + +
+ + +
+ +
+
+
+ +
+
+
+
+ +
+

Pipelines

+

To conduct power flow simulations and analysis, you normally need to perform several steps: transform models, create configurations, submit jobs, and run post-processing scripts/commands. To streamline this workflow, DISCO leverages the JADE pipeline and manages the steps as stages in a simpler manner.

+

A pipeline can contain one or more stages; each stage configures and submits jobs to generate stage results. The output of a prior stage can be passed to the subsequent stage as input to produce further results. DISCO uses a pipeline template and a pipeline config to manage the DISCO analysis workflow.

+

To generate a pipeline template file and create a pipeline config file based on it, +use this group command:

+
$ disco create-pipeline --help
+
+
+

The source models that DISCO pipeline currently supports include:

+
+
+
+

SourceTree1Model

+
+

Snapshot Hosting Capacity Analysis

+

1. Create Pipeline Template File

+

To create pipeline template, use this command:

+
$ disco create-pipeline template <INPUTS>
+
+
+

The OpenDSS model inputs <INPUTS> can be source models or preconfigured models.

+
+

Note

+

When creating the pipeline template for snapshot simulation the flag --with-loadshape +or --no-with-loadshape must be set according to whether the Loads or PVSystems in the +models use load shapes.

+
    +
  • if --no-with-loadshape, DISCO runs snapshot simulation by using Snapshot mode.

  • +
  • if --with-loadshape, DISCO runs snapshot simulation by using QSTS mode with only one timestamp.

  • +
+
+
    +
  1. Source Model Inputs

  2. +
+
$ disco create-pipeline template tests/data/smart-ds/substations --task-name TestTask -s snapshot --hosting-capacity -t pipeline-template.toml
+
+
+
    +
  1. Preconfigured Model Inputs

  2. +
+

Create preconfigured models:

+
$ disco transform-model tests/data/smart-ds/substations snapshot -o snapshot-feeder-models
+
+
+

Then, use the --preconfigured flag to indicate that the input models in snapshot-feeder-models are preconfigured.

+
$ disco create-pipeline template snapshot-feeder-models  --task-name TestTask --preconfigured -s snapshot --hosting-capacity -t pipeline-template.toml
+
+
+

The commands above create a pipeline template file named pipeline-template.toml.

+

2. Update Pipeline Template File

+

In the template generated above, there are 3 sections:

+
+
    +
  • model

  • +
  • simulation

  • +
  • postprocess

  • +
+
+

You can modify the different types of parameters in each section based on your task requirements for model transformation, job configuration/submission, and postprocessing. To check the meaning of each parameter, run --help on its command.

+
+
    +
  • model.transform-params from disco transform-model <INPUTS> snapshot

  • +
  • simulation.config-params from disco config snapshot.

  • +
  • simulation.submitter-params from jade submit-jobs.

  • +
  • postprocess.config-params from jade config create.

  • +
  • postprocess.submitter-params from jade submit-jobs

  • +
+
+

Note that simulation and postprocess can use different JADE submitter parameters. Check the default values +chosen by DISCO and consider whether they can be optimized for your run. If you’re not familiar with the terms +used in this section, please refer to JADE docs.

+

For snapshot simulations DISCO uses a default value for per-node-batch-size. You may be able to pick +a better value for the simulation stage.

+

For time-series simulations DISCO estimates job runtimes and then uses JADE time-based-batching. So, you +should not need to worry about per-node-batch-size. However, you might need to adjust the walltime +value in hpc_config.toml to account for your longest jobs.

+

3. Create Pipeline Config File

+
$ disco create-pipeline config pipeline-template.toml -c pipeline.json
+
+
+

This step creates the pipeline config file named pipeline.json, which contains the stage information. In this example, there are 2 stages; JADE runs each stage in order and manages the status of each until the whole workflow completes.

+

4. Submit Pipeline Using JADE

+
$ jade pipeline submit pipeline.json -o snapshot-pipeline-output
+
+
+

The pipeline output directory is snapshot-pipeline-output in this example; it contains the two stages' results, as shown below:

+
$tree snapshot-pipeline-output/ -L 2
+snapshot-pipeline-output/
+├── config-stage1.json
+├── config-stage2.json
+├── output-stage1
+│   ├── config.json
+│   ├── disco-diff.patch
+│   ├── errors.txt
+│   ├── events
+│   ├── feeder_head_table.csv
+│   ├── feeder_losses_table.csv
+│   ├── jade-diff.patch
+│   ├── job-outputs
+│   ├── metadata_table.csv
+│   ├── processed_results.csv
+│   ├── results.csv
+│   ├── results.json
+│   ├── results.txt
+│   ├── run_jobs_batch_0_events.log
+│   ├── thermal_metrics_table.csv
+│   └── voltage_metrics_table.csv
+├── output-stage2
+│   ├── config.json
+│   ├── disco-diff.patch
+│   ├── errors.txt
+│   ├── events
+│   ├── jade-diff.patch
+│   ├── job-outputs
+│   ├── processed_results.csv
+│   ├── results.csv
+│   ├── results.json
+│   ├── results.txt
+│   └── run_jobs_batch_0_events.log
+├── pipeline.json
+└── pipeline_submit.log
+
+
+

From the result tree, the metrics summary tables *.csv were created in output-stage1 +by the postprocess job from stage 2.

+
+
+

Time-series Hosting Capacity Analysis

+

Similarly, you can run time-series hosting capacity analysis using a pipeline. The difference for the time-series pipeline is that one more stage, named prescreen, can be enabled to prescreen PV penetration levels and avoid running jobs with a high failure potential, which helps reduce the consumption of HPC compute node hours.

+

1. Create Pipeline Template File

+
$ disco create-pipeline template tests/data/smart-ds/substations  --task-name TestTask -s time-series --hosting-capacity -t pipeline-template.toml
+
+
+

If you need to prescreen PV penetration levels, use the flag --prescreen to create the template.

+
$ disco create-pipeline template tests/data/smart-ds/substations  --task-name TestTask -s time-series --prescreen --hosting-capacity -t pipeline-template.toml
+
+
+

This step creates the pipeline-template.toml file.

+

2. Update Pipeline Template File

+
+
There are 3 (or 4, with --prescreen enabled) sections in the template file generated above.
    +
  • model

  • +
  • prescreen (optional)

  • +
  • simulation

  • +
  • postprocess

  • +
+
+
+

Update the params in each section based on your task requirements,

+
+
  • model.transform-params from disco transform-model <INPUTS> time-series
  • prescreen.config-params from disco config time-series
  • prescreen.prescreen-params from disco prescreen-pv-penetration-levels create and disco prescreen-pv-penetration-levels filter-config
  • simulation.submitter-params from jade submit-jobs
  • postprocess.config-params from jade config create
  • postprocess.submitter-params from jade submit-jobs
+
+

Then save the file.

+

3. Create Pipeline Config File

+
$ disco create-pipeline config pipeline-template.toml -c pipeline.json
+
+
+

This command creates the pipeline config file named pipeline.json. There are 3 stages if you have --prescreen enabled; otherwise there are 2 stages - simulation and postprocess.

+

4. Submit Pipeline Using JADE

+
$ jade pipeline submit pipeline.json -o time-series-pipeline-output
+
+
+

The pipeline output directory is time-series-pipeline-output in this example, which contains the results of 3 stages because --prescreen was enabled.

+
$tree time-series-pipeline-output/ -L 2
+time-series-pipeline-output
+├── config-stage1.json
+├── config-stage2.json
+├── config-stage3.json
+├── output-stage1
+│   ├── config.json
+│   ├── disco-diff.patch
+│   ├── errors.txt
+│   ├── events
+│   ├── filter_prescreened_jobs.log
+│   ├── jade-diff.patch
+│   ├── job-outputs
+│   ├── processed_results.csv
+│   ├── results.csv
+│   ├── results.json
+│   ├── results.txt
+│   └── run_jobs_batch_0_events.log
+├── output-stage2
+│   ├── config.json
+│   ├── disco-diff.patch
+│   ├── errors.txt
+│   ├── events
+│   ├── feeder_head_table.csv
+│   ├── feeder_losses_table.csv
+│   ├── jade-diff.patch
+│   ├── job-outputs
+│   ├── metadata_table.csv
+│   ├── processed_results.csv
+│   ├── results.csv
+│   ├── results.json
+│   ├── results.txt
+│   ├── run_jobs_batch_0_events.log
+│   ├── thermal_metrics_table.csv
+│   └── voltage_metrics_table.csv
+├── output-stage3
+│   ├── config.json
+│   ├── disco-diff.patch
+│   ├── errors.txt
+│   ├── events
+│   ├── jade-diff.patch
+│   ├── job-outputs
+│   ├── processed_results.csv
+│   ├── results.csv
+│   ├── results.json
+│   ├── results.txt
+│   └── run_jobs_batch_0_events.log
+├── pipeline.json
+└── pipeline_submit.log
+
+
+

As shown above, the metrics summary tables (*.csv) in output-stage2 were created by the postprocess job from stage 3.

+

5. Check Results and Plots

+

Based on the metrics results, DISCO integrates plotting functions that create 3 plots:

+
  1. Comparison of primary and secondary voltages on the first feeder.
  2. Comparison of the pf1 and volt-var scenarios on the first feeder.
  3. Heatmap of max and min hosting capacity for all feeders.
+

These visualizations show the feeder operational conditions under different PV penetration levels.

+
+
+
\ No newline at end of file diff --git a/pv-deployments.html b/pv-deployments.html new file mode 100644 index 00000000..78b6a5f3 --- /dev/null +++ b/pv-deployments.html
PV Deployments

+

This section shows how to generate DISCO PV deployments from raw OpenDSS models.

+
+

SourceTree1 PV Deployments

+

The main command to be used is shown below:

+
$ disco pv-deployments source-tree-1 --action <action-name> --hierarchy <hierarchy> --placement <placement type> INPUT_PATH
+
+
+

There are several actions related to manipulating PV deployments, including:

+
  • redirect-pvshapes: Redirect PVShapes.dss in both substation and feeder Master.dss files.
  • transform-loads: Transform the Loads.dss file before conducting PV deployments.
  • generate-jobs: Generate create-pv and create-configs jobs in JSON format, i.e., JADE configs.
  • restore-feeders: Restore the Loads.dss and Master.dss files that were modified before and during PV deployments.
  • create-pv: Create PV deployments on feeders based on placement, sample, and penetration levels.
  • check-pv: Check whether any PV deployments are missing at each placement, sample, and penetration level.
  • remove-pv: Delete PV deployments in case something went wrong.
  • create-configs: Create PV config files at each sample deployment level.
  • check-configs: Check whether any PV config files are missing in the deployment directories.
  • remove-configs: Remove PV config files in case something went wrong.
  • list-feeders: List feeder paths given a region, substation, or feeder input.
+
+
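Each action accepts its own options. If you are unsure which options an action supports, the standard --help flag on the command group prints them:

$ disco pv-deployments source-tree-1 --help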

Redirect PVShapes

+

This workflow will generate OpenDSS files with varying counts and sizes of PVSystems. It will assign load shape profiles to those PVSystems from a pool of profiles. You must define these profiles in a PVShapes.dss file and copy that file to all substation and/or feeder directories.

+

All Master.dss files need to redirect to PVShapes.dss. We recommend that you add the redirect line to your files yourself; if you do, you can skip to the next section.
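As a sketch of what such a line looks like, a standard OpenDSS redirect statement is enough; place it near the other Redirect lines in each Master.dss so the PV shapes are defined before they are referenced:

! Inside each substation/feeder Master.dss
Redirect PVShapes.dss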

+

If your directory structure aligns with the source-tree-1 expectations, the disco CLI command +below will add the redirects automatically.

+
+

Todo

+

Make this code handle all cases generically.

+
+

Run this command:

+
$ disco pv-deployments source-tree-1 -a redirect-pvshapes -h <hierarchy> INPUT_PATH
+
+
+
+
+

Transform Loads

+

The Loads.dss file under each feeder also needs to be transformed before PV deployments, in order to change the load model to a suitable center-tap schema if needed. The command to run this is:

+
$ disco pv-deployments source-tree-1 -a transform-loads -h <hierarchy> INPUT_PATH
+
+
+
+
+

Generate Jobs

+

DISCO provides a command to help generate JADE job config files for PV deployments and PV configs:

+
$ disco pv-deployments source-tree-1 -a generate-jobs -h <hierarchy> INPUT_PATH
+
+
+

The hierarchy options are:

+
+
  • city
  • region
  • substation
  • feeder
+
+

We recommend running this generate-jobs command with --hierarchy=city to generate jobs on all feeders within the city path. If your simulation/analysis jobs run relatively reliably, this avoids repeating the job generation work on regions, substations, or feeders. For test or debug purposes, it is useful to specify --hierarchy=feeder to generate a config file with one job, or --hierarchy=substation for a few jobs.
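As a concrete sketch (the paths are placeholders):

$ disco pv-deployments source-tree-1 -a generate-jobs -h city <city_path>
$ disco pv-deployments source-tree-1 -a generate-jobs -h feeder <feeder1_path>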

+

This command will generate two JADE config files:

+
+
  • create-pv-jobs.json contains jobs for PV deployments.
  • create-config-jobs.json contains jobs for PV configs.
+
+

You can then submit the jobs via the jade submit-jobs <config_file> command.
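For example:

$ jade submit-jobs create-pv-jobs.json
# wait for the PV deployment jobs to finish (see the warning below), then:
$ jade submit-jobs create-config-jobs.json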

+
+

Warning

+

Since PV configs are based on the results of PV deployments, you will need to wait for the PV deployment jobs to complete before submitting the PV config jobs.

+
+
+
+

PV Deployments

+
+

Submit Jobs

+

To generate PV deployments, submit the jobs via JADE:

+
$ jade submit-jobs <OPTIONS> create-pv-jobs.json
+
+
+

If the jobs pass, the PV deployments task is done. If you'd like to explore the details of the create-pv action for your hierarchy and the corresponding input path, please check the section below.

+
+
+

Details Exploration

+

Here are some example commands showing how to create, check and remove PV deployments.

+
  1. Create PV deployments on feeder1 with --placement random.
+
$ disco pv-deployments source-tree-1 -a create-pv -h feeder -p random --pv-upscale <feeder1_path>
+
+
+
  2. Create PV deployments on substation1 with a few feeders.
+
$ disco pv-deployments source-tree-1 -a create-pv -h substation -p random --pv-upscale <substation1_path>
+
+
+
  3. Create PV deployments on region1 with many feeders in parallel by using JADE.
+

As each region has a large number of feeders, it is recommended to use JADE to parallelize the jobs.

+
$ disco pv-deployments source-tree-1 -a list-feeders -h region <region1_path>
+# Create a <commands.txt> file which contains create-pv commands on feeders as above.
+$ jade config create <commands.txt> -c config1.json
+$ jade submit-jobs config1.json
+
+
+
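One way to build the <commands.txt> file referenced above is to loop over the list-feeders output in the shell. This sketch assumes list-feeders prints one feeder path per line; adapt it to the output format you actually see:

$ disco pv-deployments source-tree-1 -a list-feeders -h region <region1_path> > feeders.txt
$ while read -r feeder; do
      echo "disco pv-deployments source-tree-1 -a create-pv -h feeder -p random --pv-upscale $feeder"
  done < feeders.txt > commands.txt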
  4. If you'd like to check which PV deployments are missing due to job failures:
+
$ disco pv-deployments source-tree-1 -a check-pv -h feeder -p random <feeder1_path>
+$ disco pv-deployments source-tree-1 -a check-pv -h substation  -p random <substation1_path>
+$ disco pv-deployments source-tree-1 -a check-pv -h region  -p random <region1_path>
+
+
+

This returns the missing samples and penetration levels on each feeder. If --placement is not specified, the result also includes missing-placement information for each feeder.
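For example, running the check without -p reports the missing placements as well (the path is a placeholder):

$ disco pv-deployments source-tree-1 -a check-pv -h feeder <feeder1_path>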

+
  5. If you find issues with the PV deployments and would like to delete them, here are example commands:
+
$ disco pv-deployments source-tree-1 -a remove-pv -h feeder  -p random <feeder1_path>
+$ disco pv-deployments source-tree-1 -a remove-pv -h substation  -p random <substation1_path>
+$ disco pv-deployments source-tree-1 -a remove-pv -h region  -p random <region1_path>
+
+
+
+
+
+

PV Configs

+
+

Submit Jobs

+

To generate PV configs, submit the jobs via JADE:

+
$ jade submit-jobs <OPTIONS> create-config-jobs.json
+
+
+

If the jobs pass, the PV configs task is done. If you'd like to explore the details of the create-configs action for your hierarchy and the corresponding input path, please check the section below.

+
+
+

Details Exploration

+

After creating PV deployments, we need to generate PV config files that define a load shape profile for each PV system. The config files are created in the sample directories. The examples below show commands for creating, checking, or removing PV config files.

+
  1. Create PV configs on feeder1 based on PV deployments data.
+
$ disco pv-deployments source-tree-1 -a create-configs -h feeder <feeder1_path>
+
+
+
  2. Create PV configs on substation1 with a few feeders.
+
$ disco pv-deployments source-tree-1 -a create-configs -h substation <substation1_path>
+
+
+
+

Warning

+

The option -p or --placement does not apply to the create-configs action. After all PV configs are created in each feeder, a sum group file based on customer types is created from the PV configs of that feeder.

+
+
  3. Create PV configs on region1 with many feeders in parallel by using JADE.
+
$ disco pv-deployments source-tree-1 -a list-feeders -h region <region1_path>
+# Create a <commands.txt> file which contains create-configs commands on feeders as above.
+$ jade config create <commands.txt> -c config2.json
+$ jade submit-jobs config2.json
+
+
+
  4. Check if any feeder is missing PV config files.
+
$ disco pv-deployments source-tree-1 -a check-configs -h feeder -p random <feeder1_path>
+$ disco pv-deployments source-tree-1 -a check-configs -h substation -p random <substation1_path>
+$ disco pv-deployments source-tree-1 -a check-configs -h region -p random <region1_path>
+
+
+
  5. Remove PV configs if something is wrong.
+
$ disco pv-deployments source-tree-1 -a remove-configs -h feeder -p random <feeder1_path>
+$ disco pv-deployments source-tree-1 -a remove-configs -h substation -p random <substation1_path>
+$ disco pv-deployments source-tree-1 -a remove-configs -h region -p random <region1_path>
+
+
+
+
+
+

Restore Feeders

+

The Loads.dss files in SourceTree1 models need to be transformed during PV deployments, so their content is modified. However, the original Loads.dss is backed up before PV deployments, so it can be renamed back afterward. The steps look like this.

+

In addition, to speed up PV deployments, LoadShapes.dss is commented out in the master files before PV deployments; this needs to be reverted after PV deployments.

+
  1. Before PV deployments:
+
     • Rename the raw Loads.dss into Original_Loads.dss.
+
  2. During PV deployments:
+
     • The DISCO PV deployment program transforms Loads.dss in place.
     • It also strips yearly=<pv-profile> from the load lines.
+
  3. After PV deployments:
+
     • Rename the transformed Loads.dss file into PV_Loads.dss.
     • Rename Original_Loads.dss back to Loads.dss.
+

Run the command below to rename the Loads.dss file and related files:

+
$ disco pv-deployments source-tree-1 -a restore-feeders -h <hierarchy> INPUT_PATH
+
+
+
+
+
\ No newline at end of file diff --git a/quick-start.html b/quick-start.html new file mode 100644 index 00000000..a10d9f62 --- /dev/null +++ b/quick-start.html

Quick Start

+

This tutorial shows an example using SMART-DS models with snapshot impact analysis. Note that you can generally substitute “time-series” for “snapshot” to run that type of simulation.

+
+

Source Data

+

Suppose the DISCO repo is downloaded to the ~/disco directory, where the +SMART-DS data is located in the directory tests/data/smart-ds/substations/.

+
+
+

Transform Model

+

DISCO transforms the SMART-DS models into DISCO models with this command.

+
$ disco transform-model ~/disco/tests/data/smart-ds/substations/ snapshot
+Transformed data from ~/disco/tests/data/smart-ds/substations/ to snapshot-feeder-models for Snapshot Analysis.
+
+
+

By default, it generates a directory named snapshot-feeder-models with transformed models.
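If you want a time-series study instead, the same source data can be transformed with the time-series analysis type; the rest of the workflow is unchanged apart from using time-series in the later commands:

$ disco transform-model ~/disco/tests/data/smart-ds/substations/ time-series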

+
+
+

Config Jobs

+

Configure jobs for execution through JADE with this command:

+
$ disco config snapshot ./snapshot-feeder-models
+Created config.json for Snapshot Analysis
+
+
+

A job config file named config.json was created.

+

Parameters that you may want to configure:

+
  • By default, the PyDSS-exported circuit element properties are taken from snapshot-exports.toml. Specify a different file with -e <your-file>.
  • PyDSS will not automatically export results to CSV files by default. You can set export_data_tables to true in config.json.
  • DISCO applies a DC-AC ratio of 1.15 to all PVSystems by default. You can customize it with the option --dc-ac-ratio. Set it to 1.0 to prevent any changes to your models.
  • DISCO uses a standard IEEE volt-var curve by default. You can customize the value with the option --volt-var-curve. This must be a controller name registered with PyDSS. Run pydss controllers show to see the registered controllers. A combined example is shown after this list.
  • DISCO does not store per-element data in reports by default. For example, it stores max/min voltages across all buses and not the max/min voltages for each bus. You can set store_per_element_data to true in config.json.
  • Other PyDSS parameters: Refer to the pydss_inputs section of config.json and the PyDSS documentation.
+
+
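As a combined sketch of these options (the exports file name is a placeholder, and the volt-var curve must be a controller name already registered with PyDSS):

$ disco config snapshot ./snapshot-feeder-models -e my-exports.toml --dc-ac-ratio 1.0 --volt-var-curve <registered-controller-name>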
+

Submit Jobs

+

The batch of jobs in config.json can then be submitted through JADE. Two examples are shown below: one on a local machine and one on an HPC.

+
$ jade submit-jobs --local config.json
+$ jade submit-jobs -h hpc_config.toml config.json
+
+
+
+

Note

+

Create hpc_config.toml with jade config hpc and modify it as necessary. +Refer to JADE instructions +for additional information on how to customize execution.

+
+

The submitted jobs run to completion and generate an output directory named output.

+
+
+

Result Analysis

+

To get a quick summary of job results using JADE:

+
$ jade show-results
+Results from directory: output
+JADE Version: 0.1.1
+01/04/2021 08:52:36
+
++-----------------------------------------+-------------+----------+--------------------+----------------------------+
+|                 Job Name                | Return Code |  Status  | Execution Time (s) |      Completion Time       |
++-----------------------------------------+-------------+----------+--------------------+----------------------------+
+|  p1uhs10_1247__p1udt14394__random__1__5 |      0      | finished | 23.069955110549927 | 2021-01-04 08:52:35.939785 |
+| p1uhs10_1247__p1udt14394__random__1__10 |      0      | finished | 23.06603503227234  | 2021-01-04 08:52:35.942345 |
+|  p1uhs10_1247__p1udt14394__random__2__5 |      0      | finished | 23.062479972839355 | 2021-01-04 08:52:35.943899 |
+| p1uhs10_1247__p1udt14394__random__2__10 |      0      | finished | 23.05748414993286  | 2021-01-04 08:52:35.944780 |
++-----------------------------------------+-------------+----------+--------------------+----------------------------+
+
+Num successful: 4
+Num failed: 0
+Total: 4
+
+Avg execution time (s): 23.06
+Min execution time (s): 23.06
+Max execution time (s): 23.07
+
+
+

Each job output directory contains PyDSS-exported data and reports.

+
  • Reports (ex: thermal_metrics.json, voltage_metrics.json) are stored in <output-dir>/job-outputs/<job-name>/pydss_project/project.zip in the Results sub-directory. See the extraction sketch after this list.
  • Exported data tables, if enabled, are stored in the Exports sub-directory.
  • You can access the PyDSS-exported data in a Jupyter notebook data-viewer UI or programmatically as shown in this documentation.
+
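As a quick sketch of extracting one report on the command line (this assumes the report sits under the Results sub-directory inside the archive, as described above):

$ cd output/job-outputs/<job-name>/pydss_project
$ unzip -p project.zip Results/voltage_metrics.json | python -m json.tool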

This is the complete workflow for conducting snapshot impact analysis on SMART-DS feeders.

+
+
\ No newline at end of file diff --git a/search.html b/search.html new file mode 100644 index 00000000..0958258d
diff --git a/searchindex.js b/searchindex.js new file mode 100644 index 00000000..f7af7ed2
diff --git a/transform-models.html b/transform-models.html new file mode 100644 index 00000000..de898619 --- /dev/null +++ b/transform-models.html

Transform Models

+
+

Transform Model Help

+

This process transforms user OpenDSS models into a format understood by DISCO +so that it can perform simulation and analysis with the models.

+

Given an input path of source data DISCO can determine the types of analysis +it supports. The input path can be one of:

+
+
  • a GEM config file; the JSON schema definition is here - GEM Model.
  • a directory path which contains a format.toml with a source type definition. The source types are:
      • EpriModel
      • SourceTree1Model
      • SourceTree2Model
+
+
+

Input File

+

The --help option displays the types of analysis the source models support. +For example, if the input path is a GEM file:

+
$ disco transform-model ./gem-file.json --help
+
+Available analysis types: snapshot
+
+For additional help run one of the following:
+    disco transform-model ./gem-file.json snapshot --help
+
+
+
+
+

Input Directory

+

If the input path is a directory, for example, one with type = SourceTree1Model in format.toml:

+
$ disco transform-model tests/data/smart-ds/substations/ --help
+
+Available analysis types: snapshot time-series
+
+For additional help run one of the following:
+    disco transform-model tests/data/smart-ds/substations/ snapshot --help
+    disco transform-model tests/data/smart-ds/substations/ time-series --help
+
+
+
+

Note

+

By default, the name of the PV deployments directory is hc_pv_deployments. If the PV deployments are located in another directory, specify it with the option -P/--pv-deployments-dirname in the transform-model command.
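For example, assuming the deployments live in a directory named my_pv_deployments:

$ disco transform-model tests/data/smart-ds/substations/ time-series -P my_pv_deployments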

+
+
+
+

Load Shape Data files

+

By default, DISCO replaces relative paths to load shape data files with absolute +paths and does not copy them. This reduces time and consumed storage space. +However, it also makes the directory non-portable to other systems.

+

If you want to create a portable directory with copies of these files, add +this flag to the command:

+
$ disco transform-model tests/data/smart-ds/substations time-series -c
+$ disco transform-model tests/data/smart-ds/substations time-series --copy-load-shape-data-files
+
+
+
+
+
+

DISCO Model in Depth

+
+

PyDSS Controllers

+

If you have custom controllers that need to be applied to the simulation, please make sure the controllers are registered via PyDSS first.

+

Suppose we have particular controller settings defined in a my-custom-controllers.toml file:

+
[my_volt_var_curve]
+Control1 = "VVar"
+Control2 = "None"
+Control3 = "None"
+...
+
+
+
$ pydss controllers register PvController /path/my-custom-controllers.toml
+
+
+

Once registered, the following information can be used to create the input config related to pydss_controllers.

+
{
+    "name": "project123",
+    "controller_type": "PvController"
+}
+
+
+

By default, the target PyDSS file that the PyDSS controller is applied to is the deployment file, so you do not need to specify the target DSS files. However, if you want to apply the controller to target DSS files other than the deployment file, list them explicitly:

+
{
+    "name": "project123",
+    "controller_type": "PvController",
+    "targets": [
+        "/data/dss/file1.dss",
+        "/data/dss/file2.dss"
+    ]
+}
+
+
+

The pydss_controllers field also supports multiple PyDSS controllers:

+
[
+    {
+        "name": "project123",
+        "controller_type": "PvController"
+    },
+    {
+        "name": "project123",
+        "controller_type": "StorageController"
+    },
+]
+
+
+
+
+

Model Schema

+

DISCO uses pydantic +models to define the schema of model inputs for each type of analysis. Given a +type of analysis in DISCO, the schema shows all attributes used to define +the analysis models.

+

Show Schema

+

The input configurations in JSON should meet the specifications defined by DISCO. To show the schema of a given analysis type, for example SnapshotImpactAnalysisModel, use this command with the --mode show-schema option:

+
$ disco simulation-models --mode show-schema SnapshotImpactAnalysisModel
+
+
+

Show Example

+

A data example may be more straightforward; use the --mode show-example option:

+
$ disco simulation-models --mode show-example SnapshotImpactAnalysisModel --output-file=disco-models/configurations.json
+$ cat disco-models/configurations.json
+[
+    {
+        "feeder": "J1",
+        "tag": "2010",
+        "deployment": {
+            "name": "deployment_001.dss",
+            "dc_ac_ratio": 1.15,
+            "directory": "disco-models",
+            "kva_to_kw_rating": 1.0,
+            "project_data": {},
+            "pv_locations": [],
+            "pydss_controllers": null
+        },
+        "simulation": {
+            "start_time": "2013-06-17T15:00:00.000",
+            "end_time": "2014-06-17T15:00:00.000",
+            "step_resolution": 900,
+            "simulation_type": "Snapshot"
+        },
+        "name": "J1_123_Sim_456",
+        "base_case": null,
+        "include_voltage_deviation": false,
+        "blocked_by": [],
+        "job_order": null
+    }
+]
+
+
+
+
+

Validate Inputs

+

If you want to prepare the models manually then you must generate them in a +JSON file and then validate them to make sure they match the schema.

+
$ disco simulation-models validate-file disco-models/configurations.json
+
+
+

The ValidationError will be raised if any input does not meet the +specification defined by DISCO. The error messages should provide corrective +action.

+
+
+
\ No newline at end of file