
Commit

Deploying to gh-pages from @ 9185ac9 🚀
elliottower committed Mar 13, 2024
1 parent ab1a9d7 commit fb144af
Showing 4 changed files with 4 additions and 4 deletions.
2 changes: 1 addition & 1 deletion main/.buildinfo
@@ -1,4 +1,4 @@
  # Sphinx build info version 1
  # This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
- config: c7b194ee5828776da45d387e7343d57b
+ config: 0a9a654f36eb6815b4af662934eeb931
  tags: d77d1c0d9ca2f4c8421862c7c5a0d620
@@ -420,7 +420,7 @@ <h1>Source code for pettingzoo.mpe.simple_adversary.simple_adversary</h1>
  target landmark, but negatively rewarded based on how close the adversary is to the target landmark. The adversary is rewarded based on distance to the target, but it doesn't know which landmark is the target landmark. All rewards are unscaled Euclidean distance (see main MPE documentation for
  average distance). This means good agents have to learn to 'split up' and cover all landmarks to deceive the adversary.

- Agent observation space: `[self_pos, self_vel, goal_rel_position, landmark_rel_position, other_agent_rel_positions]`
+ Agent observation space: `[goal_rel_position, landmark_rel_position, other_agent_rel_positions]`

  Adversary observation space: `[landmark_rel_position, other_agents_rel_positions]`
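The reward structure in the docstring above can be sketched in a few lines. This is an illustrative sketch with made-up coordinates, not the library's actual reward code (the real implementation in `pettingzoo.mpe` includes additional details), but it captures the stated idea: good agents share a reward that improves when any of them is near the target and when the adversary is far from it, while the adversary simply wants to be near the target it cannot identify.

```python
from math import dist  # Euclidean distance (Python 3.8+)

# Hypothetical positions: 2 good agents, 1 adversary, one target landmark.
goal = (1.0, 1.0)                       # the target landmark
good_agents = [(0.8, 0.9), (-1.0, 0.4)]
adversary = (0.0, 0.0)

# Good agents: rewarded by the closest good agent's proximity to the target,
# penalized by the adversary's proximity to the target.
good_reward = -min(dist(a, goal) for a in good_agents) + dist(adversary, goal)

# Adversary: rewarded (negatively, as a distance) for being near the target.
adversary_reward = -dist(adversary, goal)

print(round(good_reward, 3), round(adversary_reward, 3))  # 1.191 -1.414
```

Because both terms are unscaled Euclidean distances, good agents maximize reward by covering all landmarks so the adversary cannot tell which one to approach.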
2 changes: 1 addition & 1 deletion main/environments/mpe/simple_adversary/index.html
@@ -445,7 +445,7 @@ <h1>Simple Adversary</h1>
  <p>In this environment, there is 1 adversary (red), N good agents (green), N landmarks (default N=2). All agents observe the position of landmarks and other agents. One landmark is the ‘target landmark’ (colored green). Good agents are rewarded based on how close the closest one of them is to the target landmark, but negatively rewarded based on how close the adversary is to the target landmark. The adversary is rewarded based on distance to the target, but it doesn’t know which landmark is the target landmark. All rewards are unscaled Euclidean distance (see main MPE documentation for average distance). This means good agents have to learn to ‘split up’ and cover all landmarks to deceive the adversary.</p>
- <p>Agent observation space: <code>[self_pos, self_vel, goal_rel_position, landmark_rel_position, other_agent_rel_positions]</code></p>
+ <p>Agent observation space: <code>[goal_rel_position, landmark_rel_position, other_agent_rel_positions]</code></p>
  <p>Adversary observation space: <code>[landmark_rel_position, other_agents_rel_positions]</code></p>
  <p>Agent action space: <code>[no_action, move_left, move_right, move_down, move_up]</code></p>
  <p>Adversary action space: <code>[no_action, move_left, move_right, move_down, move_up]</code></p>
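The corrected observation layouts can be illustrated with a small sketch. The coordinates below are hypothetical, and the vector lengths follow from the listed layout with the default N=2 (2 landmarks, 2 good agents, 1 adversary), not from inspecting the library's internals:

```python
def rel(a, b):
    """Relative position of point a with respect to point b."""
    return (a[0] - b[0], a[1] - b[1])

def flatten(pairs):
    """Flatten a list of (dx, dy) pairs into one observation vector."""
    return [c for p in pairs for c in p]

# Hypothetical positions for the default setup.
agent = (0.0, 0.0)                     # a good agent
other_good = (-0.2, 0.3)
adversary = (0.5, -0.5)
landmarks = [(1.0, 1.0), (-1.0, 0.5)]  # landmarks[0] is the target
goal = landmarks[0]

# Good agent: [goal_rel_position, landmark_rel_position, other_agent_rel_positions]
agent_obs = flatten([rel(goal, agent)] +
                    [rel(lm, agent) for lm in landmarks] +
                    [rel(p, agent) for p in (adversary, other_good)])

# Adversary: [landmark_rel_position, other_agents_rel_positions] -- no goal entry,
# since the adversary does not know which landmark is the target.
adversary_obs = flatten([rel(lm, adversary) for lm in landmarks] +
                        [rel(p, adversary) for p in (agent, other_good)])

print(len(agent_obs), len(adversary_obs))  # 10 8
```

The commit's fix is visible in the length difference: the good agent's vector has two extra entries for `goal_rel_position`, while neither observation contains the agent's own position or velocity.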
2 changes: 1 addition & 1 deletion main/searchindex.js

Large diffs are not rendered by default.

0 comments on commit fb144af
