Commit 1.103.0

cjdsellers authored Feb 16, 2021
2 parents c05f692 + b423699 commit 8a1cbeb
Showing 20 changed files with 155 additions and 125 deletions.
9 changes: 6 additions & 3 deletions README.md
@@ -99,7 +99,10 @@ The documentation for the latest version of the package is available at _readthe

## Installation

The latest version is tested against Python 3.7 - 3.9 on Linux and MacOS.
The `master` branch will always reflect the code of the latest release version.
Also, the documentation is always current for the latest version.

The package is tested against Python 3.7 - 3.9 on both Linux and MacOS.

We recommend users set up a virtual environment to isolate the dependencies, and run the platform
with the latest stable version of Python.
@@ -239,7 +242,7 @@ at commit.

The following steps are for Unix-like systems, and only need to be completed once.

1. Install the pre-commit package:
1. Install the `pre-commit` package by running:

pip install pre-commit

@@ -255,7 +258,7 @@ The following steps are for Unix-like systems, and only need to be completed onc

poetry install

5. Setup the pre-commit hook which will then run automatically at commit:
5. Setup the `pre-commit` hook which will then run automatically at commit by running:

pre-commit run --all-files

32 changes: 16 additions & 16 deletions docs/source/developer_guide/coding_standards.rst
@@ -3,24 +3,24 @@ Coding Standards

Code Style
----------
`Black` is a PEP-8 compliant opinionated formatter.
``Black`` is a PEP-8 compliant opinionated formatter.

> https://github.com/psf/black

We philosophically agree with the `Black` formatting style, however it does not
We philosophically agree with the ``Black`` formatting style; however, it does not
currently run over Cython code. So you could say we are "handcrafting towards"
`Blacks` stylistic conventions.
``Black``'s stylistic conventions.

The current codebase can be used as a guide for formatting.

- For longer lines of code, and when passing more than a couple of arguments -
- For longer lines of code, and when passing more than a couple of arguments
it's common to take a new line which aligns at the next logical indent (rather
than attempting a hanging alignment off an opening parenthesis).

- The closing parenthesis should be located on a new line, aligned at the logical
indent.

- Also ensure multiple hanging parameters or arguments end with a comma `,`::
- Also ensure multiple hanging parameters or arguments end with a comma::

LongCodeLine(
some_arg1,
@@ -34,22 +34,22 @@ PEP-8
The codebase generally follows the PEP-8 style guide.

One notable departure is that Python `truthiness` is not always taken advantage
of to check if an argument is `None`, or if a collection is empty/has elements.
of to check if an argument is ``None``, or if a collection is empty/has elements.

There are two reasons for this:

1- Cython can generate more efficient C code from `is None` and `is not None`,
rather than entering the Python runtime to check the `PyObject` truthiness.
1- Cython can generate more efficient C code from ``is None`` and ``is not None``,
rather than entering the Python runtime to check the ``PyObject`` truthiness.

2- As per the `Google Python Style Guide`, it's discouraged to use truthiness to
check whether an argument is/is not None, since an unexpected object passed into
the function or method could yield an unexpected truthiness evaluation, resulting
in a logical error type bug.

_"Always use if foo is None: (or is not None) to check for a None value.
"Always use if foo is None: (or is not None) to check for a None value.
E.g., when testing whether a variable or argument that defaults to None was set
to some other value. The other value might be a value that’s false in a boolean
context!"_
context!"

> https://google.github.io/styleguide/pyguide.html
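
As a minimal illustration (not from the codebase) of why the explicit check is
preferred, note how a truthiness check and an ``is None`` check diverge for a
valid but "falsey" argument::

    def describe(value=None):
        # Truthiness check: any falsey value takes the "missing" path
        if not value:
            return "missing"
        return f"got {value!r}"

    def describe_explicit(value=None):
        # Explicit check: only a genuinely absent argument takes the "missing" path
        if value is None:
            return "missing"
        return f"got {value!r}"

    describe(0)           # "missing" - 0 is a valid value but evaluates falsey
    describe_explicit(0)  # "got 0"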

@@ -64,22 +64,22 @@ NumPy Docstrings
----------------
The NumPy docstring syntax is used throughout the codebase. This needs to be
adhered to consistently to ensure the docs build correctly during pushes to the
`master` branch.
``master`` branch.

> https://numpydoc.readthedocs.io/en/latest/format.html
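
A minimal example of the NumPy docstring format (a generic function, not taken
from the codebase)::

    def scale(values, factor=1.0):
        """
        Return the given values multiplied by a scaling factor.

        Parameters
        ----------
        values : list of float
            The values to scale.
        factor : float, optional
            The multiplier applied to each value (default 1.0).

        Returns
        -------
        list of float
            The scaled values.

        """
        return [value * factor for value in values]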

Flake8
------
`flake8` is utilized to lint the codebase. Current ignores can be found in the
`.flake8` config file, the majority of which are required so that valid Cython
``flake8`` is utilized to lint the codebase. Current ignores can be found in the
``.flake8`` config file, the majority of which are required so that valid Cython
code is not picked up as flake8 failures.

Cython
------
Ensure that all functions and methods returning `void` or a primitive C type
(such as `bint`, `int`, `double`) include the `except *` keyword in the signature.
Ensure that all functions and methods returning ``void`` or a primitive C type
(such as ``bint``, ``int``, ``double``) include the ``except *`` keyword in the signature.

This will ensure Python exceptions are not ignored, but instead are `bubbled up`
This will ensure Python exceptions are not ignored, but instead are "bubbled up"
to the caller as expected.
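
A minimal Cython sketch of this convention (hypothetical names only, not from
the codebase)::

    cdef class ExampleComponent:

        cpdef void on_event(self, object event) except *:
            # Without `except *` an exception raised here would be ignored,
            # as a `void` return leaves no value with which to signal an error.
            if event is None:
                raise ValueError("event was None")

        cdef double calculate_ratio(self, double a, double b) except *:
            # A primitive C return type also needs `except *` so the raised
            # error propagates to the caller rather than being ignored.
            if b == 0:
                raise ZeroDivisionError("b was zero")
            return a / b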

More information on Cython syntax and conventions can be found by reading the
12 changes: 6 additions & 6 deletions docs/source/developer_guide/environment.rst
@@ -5,11 +5,11 @@ For development we recommend using the PyCharm _Professional_ edition IDE, as it
interprets Cython syntax. Alternatively, you could use Visual Studio Code with
a Cython extension.

`poetry` is the preferred tool for handling all Python package and dev dependencies.
``poetry`` is the preferred tool for handling all Python package and dev dependencies.

> https://python-poetry.org/

`pre-commit` is used to automatically run various checks, auto-formatters and linting tools
``pre-commit`` is used to automatically run various checks, auto-formatters and linting tools
at commit.

> https://pre-commit.com/
@@ -18,29 +18,29 @@ Setup
-----
The following steps are for Unix-like systems, and only need to be completed once.

1. Install pre-commit:
1. Install ``pre-commit`` by running:

pip install pre-commit

2. Install the Cython package by running:

pip install -U Cython==3.0a6

3. Install `poetry` by running:
3. Install ``poetry`` by running:

curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -

4. Then install all Python package dependencies, and compile the C extensions by running:

poetry install

5. Setup the pre-commit hook which will then run automatically at commit:
5. Setup the ``pre-commit`` hook which will then run automatically at commit by running:

pre-commit run --all-files

Builds
------

Following any changes to `.pyx` or `.pxd` files, you can re-compile by running:
Following any changes to ``.pyx`` or ``.pxd`` files, you can re-compile by running:

python build.py
4 changes: 2 additions & 2 deletions docs/source/developer_guide/packaged_data.rst
@@ -1,14 +1,14 @@
Packaged Data
=============

Various data is contained internally in the `tests/test_kit/data` folder.
Various data is contained internally in the ``tests/test_kit/data`` folder.

Libor Rates
-----------
The libor rates for 1 month USD can be updated by downloading the CSV data
from https://fred.stlouisfed.org/series/USD1MTD156N

Ensure you select `Max` for the time window.
Ensure you select ``Max`` for the time window.

Short Term Interest Rates
-------------------------
10 changes: 5 additions & 5 deletions docs/source/developer_guide/testing.rst
@@ -15,19 +15,19 @@ Tests can be run using either Pytest or the Nox tool.
If you're using PyCharm then tests should run directly by right-clicking on the
respective folder (or the top-level tests folder) and clicking 'Run pytest'.

Alternatively you can use the `pytest .` command from the root level `tests`
Alternatively you can use the ``pytest .`` command from the root level ``tests``
folder, or the other sub folders.

Nox
---
Nox sessions are defined within the `noxfile.py`, to run various test collections.
Nox sessions are defined within the ``noxfile.py``, to run various test collections.

To run unit tests with nox::

nox -s tests


If you have `redis-server` up you can run integration tests with nox::
If you have ``redis-server`` up you can run integration tests with nox::

nox -s integration_tests
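
The repository's actual ``noxfile.py`` will differ, but a minimal sketch of how
such sessions are typically declared (the install steps and test paths below are
assumptions for illustration only) might look like::

    import nox

    @nox.session
    def tests(session):
        # Install the package plus pytest into an isolated virtualenv,
        # then run the unit test suite.
        session.install(".", "pytest")
        session.run("pytest", "tests/unit_tests")

    @nox.session
    def integration_tests(session):
        # Assumes a local redis-server is already running.
        session.install(".", "pytest")
        session.run("pytest", "tests/integration_tests")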

@@ -40,11 +40,11 @@ Mocks
-----
Unit tests will often include other components acting as mocks. The intent of
this is to simplify the test suite and avoid extensive use of a mocking framework,
although `MagicMock` objects are currently used in particular cases.
although ``MagicMock`` objects are currently used in particular cases.
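
As a generic illustration (not from the project's test suite) of how a
``MagicMock`` can stand in for a collaborating component::

    from unittest.mock import MagicMock

    def test_handler_receives_event():
        # The mock records the call; the assertion checks only that the
        # interaction happened, not any real behaviour.
        handler = MagicMock()

        handler("order_filled")

        handler.assert_called_once_with("order_filled")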

Code Coverage
-------------
Code coverage output is generated using `coverage` and reported using `codecov`.
Code coverage output is generated using ``coverage`` and reported using ``codecov``.

High test coverage is a goal for the project, however not at the expense of
appropriate error handling, or causing "test induced damage" to the architecture.
17 changes: 8 additions & 9 deletions docs/source/getting_started/core_concepts.rst
@@ -1,7 +1,7 @@
Core Concepts
=============

`NautilusTrader` has been built from the ground up to deliver the
NautilusTrader has been built from the ground up to deliver the
highest quality performance and user experience.

There are two main use cases for this software package.
@@ -16,7 +16,7 @@ There are two main reasons for conducting backtests on historical data;
- Verify the logic of trading strategy implementations
- Getting an indication of likely performance **if the alpha of the strategy remains into the future**.

Backtesting with an event-driven engine such as `NautilusTrader` is not meant to be the primary
Backtesting with an event-driven engine such as NautilusTrader is not meant to be the primary
research method for alpha discovery, but merely facilitates the above.

One of the primary benefits of this platform is
@@ -27,18 +27,17 @@ endpoints through adapters provided with this package (and/or developed by end u
This helps ensure consistency when seeking to capitalize on alpha through a large
sample size of trades, as expressed in the logic of the trading strategies.

Only a small amount of example data is available in the `test`
directory of the repository - as used in the examples. There are many sources of financial
market data, and it is left to the user to supply this for backtesting
purposes.
Only a small amount of example data is available in the ``test`` directory of
the repository - as used in the examples. There are many sources of financial
market data, and it is left to the user to supply this for backtesting purposes.

The platform is extremely flexible and open-ended; you could inject dozens of different
datasets into a backtest engine, running them simulatenously with time being
accurately simulated down to the smallest `timedelta` definable by Python.
datasets into a backtest engine, running them simultaneously with time being
accurately simulated down to the smallest ``timedelta`` definable by Python.
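
For reference, the smallest ``timedelta`` Python can represent is one
microsecond, which can be confirmed with::

    from datetime import timedelta

    # Python's timedelta type has microsecond resolution
    assert timedelta.resolution == timedelta(microseconds=1)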

Trading Live
------------
A `TradingNode` is able to host a fleet of trading strategies,
A ``TradingNode`` is able to host a fleet of trading strategies,
with data ingested from multiple data clients, and
order execution and management handled through multiple execution clients.

52 changes: 40 additions & 12 deletions docs/source/getting_started/installation.rst
@@ -1,30 +1,58 @@
Installation
============

The ``master`` branch will always reflect the code of the latest release version.
Also, the documentation is always current for the latest version.

The package is tested against Python versions 3.7 - 3.9 on both Linux and
MacOS. Users are encouraged to use the latest stable version of Python.

It is a goal for the project to keep dependencies focused; however, there are
still a large number of dependencies as found in the `pyproject.toml` file. Therefore we recommend you create a new
virtual environment for `NautilusTrader`.
still a large number of dependencies as found in the ``pyproject.toml`` file.
Therefore we recommend you create a new virtual environment for NautilusTrader.

There are various ways of achieving this - the easiest being to use the `Poetry`
There are various ways of achieving this - the easiest being to use the ``poetry``
tool. https://python-poetry.org/docs/

If you're not used to working with virtual environments, you will find a great
explanation in the `Poetry` documentation under the `Managing environments`
explanation in the ``poetry`` documentation under the `Managing environments`
sub-menu.

The latest version of `NautilusTrader` can be downloaded
as a binary wheel from `PyPI`, just run::
Installation for Unix-like systems can be achieved through *one* of the following options:

From PyPI
---------

To install the latest binary wheel (or sdist package) from PyPI, run::

pip install -U nautilus_trader

From GitHub Release
-------------------

To install a binary wheel from GitHub, first navigate to the latest release.

> https://github.com/nautechsystems/nautilus_trader/releases/latest/

Download the appropriate ``.whl`` for your operating system and Python version, then run::

pip install <file-name>.whl

From Source
-----------

Installation from source requires Cython to compile the Python C extensions.

1. To install Cython, run::

pip install -U nautilus_trader
pip install -U Cython==3.0a6

2. Then to install NautilusTrader using ``pip``, run::

Alternatively, you can install from source via pip by running::
pip install -U git+https://github.com/nautechsystems/nautilus_trader

pip install .
**Or** clone the source with ``git``, and install from the project's root directory by running::

The master branch will always reflect the code of the latest release version.
Also, the documentation found here on `readthedocs` is always current for the
latest version.
git clone https://github.com/nautechsystems/nautilus_trader
cd nautilus_trader
pip install .
18 changes: 8 additions & 10 deletions docs/source/user_guide/framework.rst
@@ -11,16 +11,14 @@ architecture, allowing pluggable implementations of key components with a
feature-rich yet straightforward API. `Domain Driven Design` (DDD) and message passing
have been central philosophies in the design.

From a high level
view - a `Trader` can host any number of infinitely customizable
`TradingStrategy`s. A central `Portfolio` has access to `Account`s which can all be queried. A common
`DataEngine` and `ExecutionEngine` then allow asynchronous ingest of any data
and trade events, with the core componentry common to both backtesting and live
implementations.
From a high level view - a ``Trader`` can host any number of infinitely customizable
``TradingStrategy``s. A central ``Portfolio`` has access to ``Account``s which can all be queried.
A common ``DataEngine`` and ``ExecutionEngine`` then allow asynchronous ingest of any data
and trade events, with the core componentry common to both backtesting and live implementations.

Currently a performant `Redis` execution database maintains
state persistence (swapped out for an in-memory only implementation for backtesting).
Currently a performant Redis execution database maintains state persistence
(swapped out for an in-memory only implementation for backtesting).
It should be noted that the flexibility of the framework even allows the live trading
`Redis` database to be plugged into the backtest engine. Interestingly there is
only a 4x performance overhead which speaks to the raw speed of `Redis` and the
Redis database to be plugged into the backtest engine. Interestingly there is
only a 4x performance overhead which speaks to the raw speed of Redis and the
platform itself.
