networkx to deepsnap: raise TypeError(f"Unknown type {key} in edge attributes.")

I have recently been learning about graph neural networks. After building a weighted networkx graph following the official documentation, I tried to convert it into a deepsnap graph and hit the following error:

  File "/home/ //Deep-reinforcement-learning-with-pytorch-7b9fac7e5e40ffdc6f7ccb8b0a81e7841370a996/Char10 TD3/util.py", line 25, in nx2batch
    batch = Batch.from_data_list([DSGraph(graph)])
  File "/home//.local/lib/python3.8/site-packages/deepsnap/graph.py", line 100, in __init__
    self._update_tensors(init=True)
  File "/home//.local/lib/python3.8/site-packages/deepsnap/graph.py", line 515, in _update_tensors
    self._update_attributes()
  File "/home//.local/lib/python3.8/site-packages/deepsnap/graph.py", line 549, in _update_attributes
    self[key] = self._get_edge_attributes(key)
  File "/home//.local/lib/python3.8/site-packages/deepsnap/graph.py", line 608, in _get_edge_attributes
    raise TypeError(f"Unknown type {key} in edge attributes.")
TypeError: Unknown type weight in edge attributes.
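
For context, the error can be reproduced with a small weighted graph whose edge weights are numpy values instead of plain Python floats. This is only a minimal sketch, not my actual project code; the nx2batch helper in the traceback did essentially the Batch.from_data_list call shown below, and the edge data here is made up:

    import networkx as nx
    import numpy as np
    from deepsnap.batch import Batch
    from deepsnap.graph import Graph as DSGraph

    # Edge weights coming out of a numpy computation (placeholder values).
    # float32 is used here because np.float32 is NOT a subclass of Python
    # float, so deepsnap's isinstance check fails; a per-edge np.array
    # value triggers the same error.
    weights = np.array([0.5, 1.5], dtype=np.float32)

    G = nx.Graph()
    G.add_weighted_edges_from([(0, 1, weights[0]), (1, 2, weights[1])])
    nx.set_node_attributes(G, 1.0, "node_feature")  # dummy node feature for the example

    batch = Batch.from_data_list([DSGraph(G)])
    # TypeError: Unknown type weight in edge attributes.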

After searching all over Chinese and English sites with no luck, I finally read the source code carefully and realized that networkx happily accepts np.array values as edge attributes, but deepsnap does not. Converting the np.array weights into a plain Python list before building the graph solves it, as sketched below.
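
A minimal sketch of that fix, using the same placeholder graph as in the reproduction above:

    import networkx as nx
    import numpy as np
    from deepsnap.batch import Batch
    from deepsnap.graph import Graph as DSGraph

    weights = np.array([0.5, 1.5], dtype=np.float32)

    G = nx.Graph()
    # tolist() turns the numpy array into plain Python floats, which pass
    # deepsnap's isinstance(attributes[0], float) check.
    w = weights.tolist()
    G.add_weighted_edges_from([(0, 1, w[0]), (1, 2, w[1])])
    nx.set_node_attributes(G, 1.0, "node_feature")  # dummy node feature for the example

    batch = Batch.from_data_list([DSGraph(G)])  # no more "Unknown type weight" error

For reference, this is deepsnap's _get_edge_attributes (deepsnap/graph.py), where the type check that raises the error lives: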

    def _get_edge_attributes(self, key: str) -> torch.tensor:
        r"""
        Returns the edge attributes in the graph.
        Multiple attributes will be stacked.

        Args:
            key(string): the name of the attributes to return.

        Returns:
            :class:`torch.tensor`: Edge attributes.
        """

        # new: concat
        attributes = []
        for x in self.G.edges(data=True):
            if key in x[-1]:
                attributes.append(x[-1][key])
        if len(attributes) == 0:
            return None
        if torch.is_tensor(attributes[0]):
            attributes = torch.stack(attributes, dim=0)
        elif isinstance(attributes[0], float):
            attributes = torch.tensor(attributes, dtype=torch.float)
        elif isinstance(attributes[0], int):
            attributes = torch.tensor(attributes, dtype=torch.long)
        else:
            raise TypeError(f"Unknown type {key} in edge attributes.")

As the code above shows, deepsnap only handles torch tensors, Python ints, and Python floats as edge attribute values, while my weights were numpy values (np.array), which fall into the final else branch and raise the TypeError.
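
Given those branches, another workaround (a sketch I have only checked against the type check itself) is to store each edge weight as a torch tensor, so that the torch.is_tensor branch picks it up:

    import networkx as nx
    import torch
    from deepsnap.graph import Graph as DSGraph

    G = nx.Graph()
    G.add_edge(0, 1, weight=torch.tensor(0.5))  # tensor weights go through torch.stack
    G.add_edge(1, 2, weight=torch.tensor(1.5))
    nx.set_node_attributes(G, 1.0, "node_feature")  # dummy node feature for the example

    dsg = DSGraph(G)  # the weight attributes are stacked instead of raising TypeError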
