Merge pull request #3 from ami-iit/initialStructure
Initial structure and first definition of the variables and parameters classes
S-Dafarra authored Mar 13, 2023
2 parents e34ff71 + 67c68b9 commit 22845ca
Showing 20 changed files with 229 additions and 57 deletions.
71 changes: 19 additions & 52 deletions .github/workflows/ci_cd.yml
@@ -9,45 +9,8 @@ on:

 jobs:

-  package:
-    name: Package the project
-    runs-on: ubuntu-22.04
-
-    steps:
-
-      - uses: actions/checkout@v3
-        with:
-          fetch-depth: 0
-
-      - name: Set up Python
-        uses: actions/setup-python@v4
-        with:
-          python-version: "3.10"
-
-      - name: Install Python tools
-        run: pip install build twine
-
-      - name: Create distributions
-        run: python -m build -o dist/
-
-      - name: Inspect dist folder
-        run: ls -lah dist/
-
-      - name: Check wheel's abi and platform tags
-        run: test $(find dist/ -name *-none-any.whl | wc -l) -gt 0
-
-      - name: Run twine check
-        run: twine check dist/*
-
-      - name: Upload artifacts
-        uses: actions/upload-artifact@v3
-        with:
-          path: dist/*
-          name: dist
-
   test:
     name: 'Python${{ matrix.python }}@${{ matrix.os }}'
-    needs: package
     runs-on: ${{ matrix.os }}
     strategy:
       fail-fast: false
@@ -62,20 +25,24 @@ jobs:

     steps:

-      - name: Set up Python
-        uses: actions/setup-python@v4
-        with:
-          python-version: ${{ matrix.python }}
+      - uses: actions/checkout@v2

-      - name: Download Python packages
-        uses: actions/download-artifact@v3
-        with:
-          path: dist
-          name: dist
+      - uses: conda-incubator/setup-miniconda@v2
+        with:
+          miniforge-variant: Mambaforge
+          miniforge-version: latest

-      - name: Install wheel
-        shell: bash
-        run: pip install dist/*.whl
-
-      - name: Import the package
-        run: python -c "import hippopt"
+      - name: Dependencies
+        shell: bash -l {0}
+        run: |
+          mamba install python=${{ matrix.python }} casadi pytest
+      - name: Install
+        shell: bash -l {0}
+        run: |
+          pip install --no-deps -e .[all]
+      - name: Test
+        shell: bash -l {0}
+        run: |
+          pytest
13 changes: 8 additions & 5 deletions README.md
@@ -1,9 +1,10 @@
 # hippopt
-### HIghly Pythonized Planning and OPTimization framework
+### HIgh Performance* Planning and OPTimization framework

 hippopt is an open-source framework for generating whole-body trajectories for legged robots, with a focus on direct transcription of optimal control problems solved with multiple-shooting methods. The framework takes as input the robot model and generates optimized trajectories that include both kinematic and dynamic quantities.

+*supposedly
+
 ## Features

 - [ ] Direct transcription of optimal control problems with multiple-shooting methods
@@ -13,8 +14,11 @@ hippopt is an open-source framework for generating whole-body trajectories for l
 - [ ] Extensive documentation and examples to help you get started

 ## Installation

-TODO
+It is suggested to use [``conda``](https://docs.conda.io/en/latest/).
+```bash
+conda install -c conda-forge casadi pytest
+pip install --no-deps -e .[all]
+```

## Citing this work

@@ -41,4 +45,3 @@ This repository is maintained by:
 | | |
 | :----------------------------------------------------------: | :--------------------------------------------------: |
 | [<img src="https://github.com/S-Dafarra.png" width="40">](https://github.com/S-Dafarra) | [@S-Dafarra](https://github.com/S-Dafarra) |
-
7 changes: 7 additions & 0 deletions pyproject.toml
@@ -15,3 +15,10 @@ line-length = 88
 [tool.isort]
 profile = "black"
 multi_line_output = 3
+
+[tool.pytest.ini_options]
+minversion = "6.0"
+addopts = "-ra -q"
+testpaths = [
+    "test",
+]
5 changes: 5 additions & 0 deletions setup.cfg
@@ -50,13 +50,18 @@ package_dir =
     =src
 python_requires = >=3.10
 install_requires =
     casadi
     numpy

 [options.extras_require]
 style =
     black
     isort
+testing=
+    pytest
+all =
+    %(style)s
+    %(testing)s

 [options.packages.find]
 where = src
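The `all` extra above reuses the other extras through setuptools' configparser-style `%(…)s` interpolation: `%(style)s` expands to the value of the `style` key in the same section. A minimal stand-alone sketch of that expansion, using Python's `configparser` (which setuptools' `setup.cfg` parsing builds on):

```python
import configparser

# Reproduce the [options.extras_require] section from setup.cfg; the default
# BasicInterpolation resolves %(style)s and %(testing)s within the section.
cfg = configparser.ConfigParser()
cfg.read_string(
    """
[options.extras_require]
style =
    black
    isort
testing =
    pytest
all =
    %(style)s
    %(testing)s
"""
)

# The interpolated value of `all` contains every package from both extras.
print(cfg["options.extras_require"]["all"].split())  # ['black', 'isort', 'pytest']
```

This is why `pip install -e .[all]` pulls in the style tools and pytest in one go.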
9 changes: 9 additions & 0 deletions src/hippopt/__init__.py
@@ -0,0 +1,9 @@
from . import base
from .base.optimization_object import (
    OptimizationObject,
    StorageType,
    TOptimizationObject,
    default_storage_field,
)
from .base.parameter import Parameter, TParameter
from .base.variable import TVariable, Variable
1 change: 1 addition & 0 deletions src/hippopt/base/__init__.py
@@ -0,0 +1 @@
from . import optimization_object, parameter, variable
59 changes: 59 additions & 0 deletions src/hippopt/base/optimization_object.py
@@ -0,0 +1,59 @@
import abc
import copy
import dataclasses
from typing import Any, ClassVar, Type, TypeVar

import casadi as cs
import numpy as np

TOptimizationObject = TypeVar("TOptimizationObject", bound="OptimizationObject")
StorageType = cs.MX | np.ndarray


@dataclasses.dataclass
class OptimizationObject(abc.ABC):
    StorageType: ClassVar[str] = "generic"
    StorageTypeMetadata: ClassVar[dict[str, Any]] = dict(StorageType=StorageType)

    def get_default_initialization(
        self: TOptimizationObject, field_name: str
    ) -> np.ndarray:
        """
        Get the default initialization of a given field
        It is supposed to be called only for the fields having the StorageType metadata
        """
        return np.zeros(dataclasses.asdict(self)[field_name].shape)

    def get_default_initialized_object(
        self: TOptimizationObject,
    ) -> TOptimizationObject:
        """
        :return: A copy of the object with its initial values
        """

        output = copy.deepcopy(self)
        output_dict = dataclasses.asdict(output)

        for field in dataclasses.fields(output):
            if "StorageType" in field.metadata:
                output.__setattr__(
                    field.name, output.get_default_initialization(field.name)
                )
                continue

            if isinstance(output.__getattribute__(field.name), OptimizationObject):
                output.__setattr__(
                    field.name,
                    output.__getattribute__(
                        field.name
                    ).get_default_initialized_object(),
                )

        return output


def default_storage_field(cls: Type[OptimizationObject]):
    return dataclasses.field(
        default=None,
        metadata=cls.StorageTypeMetadata,
    )
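The core idea of `optimization_object.py` is that `default_storage_field` attaches class-level metadata to a dataclass field, and the default-initialization pass later finds those fields by inspecting `dataclasses.fields(...)`. A self-contained sketch of that mechanism, without importing hippopt (the `DemoVariable` class and its field are illustrative, not part of the commit):

```python
import dataclasses

import numpy as np


@dataclasses.dataclass
class DemoVariable:
    """Hypothetical user class mirroring how default_storage_field tags a field."""

    # The metadata dict plays the role of Variable.StorageTypeMetadata.
    storage: np.ndarray = dataclasses.field(
        default=None, metadata={"StorageType": "variable"}
    )

    def __post_init__(self):
        self.storage = np.ones(shape=3)


demo = DemoVariable()

# Re-implement the core of get_default_initialized_object: every field tagged
# with StorageType metadata is replaced by zeros of the same shape.
for f in dataclasses.fields(demo):
    if "StorageType" in f.metadata:
        setattr(demo, f.name, np.zeros(getattr(demo, f.name).shape))

print(demo.storage)  # [0. 0. 0.]
```

Untagged fields (plain strings, nested objects) are skipped by the metadata check, which is what lets `get_default_initialized_object` recurse into nested `OptimizationObject` fields while leaving fields such as `other: str` untouched.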
14 changes: 14 additions & 0 deletions src/hippopt/base/parameter.py
@@ -0,0 +1,14 @@
import dataclasses
from typing import Any, ClassVar, TypeVar

from hippopt.base.optimization_object import OptimizationObject

TParameter = TypeVar("TParameter", bound="Parameter")


@dataclasses.dataclass
class Parameter(OptimizationObject):
    """"""

    StorageType: ClassVar[str] = "parameter"
    StorageTypeMetadata: ClassVar[dict[str, Any]] = dict(StorageType=StorageType)
Empty file added src/hippopt/base/problem.py
Empty file.
Empty file added src/hippopt/base/solver.py
Empty file.
14 changes: 14 additions & 0 deletions src/hippopt/base/variable.py
@@ -0,0 +1,14 @@
import dataclasses
from typing import Any, ClassVar, TypeVar

from hippopt.base.optimization_object import OptimizationObject

TVariable = TypeVar("TVariable", bound="Variable")


@dataclasses.dataclass
class Variable(OptimizationObject):
    """"""

    StorageType: ClassVar[str] = "variable"
    StorageTypeMetadata: ClassVar[dict[str, Any]] = dict(StorageType=StorageType)
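`Parameter` and `Variable` differ only in their class-level `StorageType` tag, which flows into the field metadata via `default_storage_field`. A minimal sketch of that pattern (the `Demo*` names are illustrative stand-ins, not hippopt classes):

```python
import dataclasses
from typing import Any, ClassVar


@dataclasses.dataclass
class TaggedObject:
    """Stand-in for OptimizationObject: just carries the class-level tag."""

    StorageType: ClassVar[str] = "generic"


@dataclasses.dataclass
class DemoVariable(TaggedObject):
    StorageType: ClassVar[str] = "variable"
    StorageTypeMetadata: ClassVar[dict[str, Any]] = dict(StorageType=StorageType)


@dataclasses.dataclass
class DemoParameter(TaggedObject):
    StorageType: ClassVar[str] = "parameter"
    StorageTypeMetadata: ClassVar[dict[str, Any]] = dict(StorageType=StorageType)


# A future solver (problem.py and solver.py are still empty in this commit)
# could inspect the tag to decide what becomes a decision variable versus a
# fixed parameter of the transcribed problem.
print(DemoVariable.StorageTypeMetadata["StorageType"])   # variable
print(DemoParameter.StorageTypeMetadata["StorageType"])  # parameter
```

Note that `dict(StorageType=StorageType)` is evaluated in the class body, so each subclass captures its own tag in the metadata it hands to fields.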
Empty file.
Empty file.
Empty file.
Empty file.
Empty file.
Empty file.
Empty file.
Empty file.
93 changes: 93 additions & 0 deletions test/test_base.py
@@ -0,0 +1,93 @@
import dataclasses

import numpy as np

from hippopt import (
    OptimizationObject,
    Parameter,
    StorageType,
    TOptimizationObject,
    Variable,
    default_storage_field,
)


@dataclasses.dataclass
class TestVariable(OptimizationObject):
    storage: StorageType = default_storage_field(cls=Variable)

    def __post_init__(self):
        self.storage = np.ones(shape=3)


@dataclasses.dataclass
class TestParameter(OptimizationObject):
    storage: StorageType = default_storage_field(cls=Parameter)

    def __post_init__(self):
        self.storage = np.ones(shape=3)


def test_zero_variable():
    test_var = TestVariable()
    test_var_zero = test_var.get_default_initialized_object()
    assert test_var_zero.storage.shape == (3,)
    assert np.all(test_var_zero.storage == 0)


def test_zero_parameter():
    test_par = TestParameter()
    test_par_zero = test_par.get_default_initialized_object()
    assert test_par_zero.storage.shape == (3,)
    assert np.all(test_par_zero.storage == 0)


@dataclasses.dataclass
class CustomInitializationVariable(OptimizationObject):
    variable: StorageType = default_storage_field(cls=Variable)
    parameter: StorageType = default_storage_field(cls=Parameter)

    def __post_init__(self):
        self.variable = np.ones(shape=3)
        self.parameter = np.ones(shape=3)

    def get_default_initialization(
        self: TOptimizationObject, field_name: str
    ) -> np.ndarray:
        if field_name == "variable":
            return 2 * np.ones(2)

        return OptimizationObject.get_default_initialization(self, field_name)


def test_custom_initialization():
    test_var = CustomInitializationVariable()
    test_var_init = test_var.get_default_initialized_object()
    assert test_var_init.parameter.shape == (3,)
    assert np.all(test_var_init.parameter == 0)
    assert test_var_init.variable.shape == (2,)
    assert np.all(test_var_init.variable == 2)


@dataclasses.dataclass
class AggregateClass(OptimizationObject):
    aggregated: CustomInitializationVariable
    other_parameter: StorageType = default_storage_field(cls=Parameter)
    other: str = ""

    def __post_init__(self):
        self.aggregated = CustomInitializationVariable()
        self.other_parameter = np.ones(3)
        self.other = "untouched"


def test_aggregated():
    test_var = AggregateClass(aggregated=CustomInitializationVariable())
    test_var_init = test_var.get_default_initialized_object()
    assert test_var_init.aggregated.parameter.shape == (3,)
    assert np.all(test_var_init.aggregated.parameter == 0)
    assert test_var_init.aggregated.variable.shape == (2,)
    assert np.all(test_var_init.aggregated.variable == 2)
    assert test_var_init.other_parameter.shape == (3,)
    assert np.all(test_var_init.other_parameter == 0)
    assert test_var_init.other == "untouched"
