
something wrong with the code #8

Description

@XiaoqiangZhou

In 'models/graph_memory.py', line 274, i.e., in the forward function of class 'Memory':

m_out_all[:, :, x, :, :] = hiden_state

It seems that the updated memory cell m_out_all will not influence the next graph propagation, because, according to line 258, m_out_all = torch.cat((m_out, q_out.unsqueeze(2)), dim=2).contiguous()  # B, D_o, T+1, H, W, the variable m_out_all is built from the initial memory state m_out rather than from the refined m_out_all produced by the previous propagations.
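
For concreteness, here is a minimal sketch of the data flow I am asking about. It is not the repository's actual code: the propagate() helper, the loop structure, and the tensor shapes are hypothetical placeholders. It only illustrates that the write-back at line 274 cannot affect later steps if each propagation reads its neighbourhood from the initial m_out.

```python
import torch

B, D_o, T, H, W = 1, 8, 3, 4, 4
m_out = torch.randn(B, D_o, T, H, W)   # initial memory readout
q_out = torch.randn(B, D_o, H, W)      # query readout

# line 258: concatenate the query as the (T+1)-th node
m_out_all = torch.cat((m_out, q_out.unsqueeze(2)), dim=2).contiguous()  # B, D_o, T+1, H, W

def propagate(node, neighbours):
    # hypothetical stand-in for the graph update that produces hiden_state
    return node + neighbours.mean(dim=2)

for x in range(T + 1):
    # If the neighbours come from the initial m_out rather than from the
    # refined m_out_all, the write-back below never reaches later iterations.
    hiden_state = propagate(m_out_all[:, :, x], m_out)  # reads initial m_out
    m_out_all[:, :, x, :, :] = hiden_state              # line 274: write-back
```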

Could you please answer my question?
