[LeetCode]Network Delay Time@Python

Network Delay Time

There are N network nodes, labelled 1 to N.
Given times, a list of travel times as directed edges times[i] = (u, v, w), where u is the source node, v is the target node, and w is the time it takes for a signal to travel from source to target.
Now, we send a signal from a certain node K. How long will it take for all nodes to receive the signal? If it is impossible, return -1.

Note

N will be in the range [1, 100].
K will be in the range [1, N].
The length of times will be in the range [1, 6000].
All edges times[i] = (u, v, w) will have 1 <= u, v <= N and 0 <= w <= 100.

Solution

# Dijkstra's algorithm; the heap handles picking the closest
# unvisited node, so no manual distance comparisons are needed.
import collections
import heapq
from typing import List

class Solution:
    def networkDelayTime(self, times: List[List[int]], N: int, K: int) -> int:
        # Min-heap of (elapsed time, node), shortest-time map, adjacency list.
        dis, used, graph = [(0, K)], {}, collections.defaultdict(list)
        for u, v, w in times:
            graph[u].append((w, v))
        while dis:
            time, node = heapq.heappop(dis)
            if node not in used:
                used[node] = time  # First pop of a node fixes its shortest time.
                for w, v in graph[node]:
                    heapq.heappush(dis, (w + time, v))
        # If every node was reached, the answer is the slowest arrival; else -1.
        return max(used.values()) if len(used) == N else -1
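As a quick sanity check, here is a minimal standalone sketch of the same heap-based Dijkstra routine run on LeetCode's sample input (the `dijkstra` helper name and the second test graph are illustrative, not part of the original post):

```python
import collections
import heapq

def dijkstra(times, n, k):
    # Same idea as the solution above: pop nodes in order of arrival time;
    # the first pop of a node fixes its shortest distance from k.
    graph = collections.defaultdict(list)
    for u, v, w in times:
        graph[u].append((w, v))
    heap, dist = [(0, k)], {}
    while heap:
        time, node = heapq.heappop(heap)
        if node not in dist:
            dist[node] = time
            for w, v in graph[node]:
                heapq.heappush(heap, (w + time, v))
    return max(dist.values()) if len(dist) == n else -1

# LeetCode's sample: the signal from node 2 reaches node 4 last, at time 2.
print(dijkstra([[2, 1, 1], [2, 3, 1], [3, 4, 1]], 4, 2))  # -> 2
# Node 1 has no outgoing edges, so starting from node 2 cannot reach it... here
# we start from node 2 in a graph where node 1 is unreachable, so the answer is -1.
print(dijkstra([[1, 2, 1]], 2, 2))  # -> -1
```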
