Know Thy Complexities!

Original source: http://bigocheatsheet.com/

 

Hi there! This article collects the time and space complexities of algorithms commonly encountered in computer science. When I was preparing for technical interviews, I spent a lot of time searching the web so that questions about the best, worst, and average-case complexities of searching and sorting algorithms would not trip me up. Over the past few years I have interviewed with Silicon Valley startups as well as larger companies such as Yahoo, eBay, LinkedIn, and Google, and every time I prepared, I wondered why nobody had assembled all of this into a set of tables. So, to save everyone time, I made a start and put these tables together. Enjoy!

 



 

Searching

| Algorithm | Data Structure | Time: Average | Time: Worst | Space: Worst |
|---|---|---|---|---|
| Depth-First Search (DFS) | Graph of V vertices and E edges | - | O(E + V) | O(V) |
| Breadth-First Search (BFS) | Graph of V vertices and E edges | - | O(E + V) | O(V) |
| Binary search | Sorted array of n elements | O(log(n)) | O(log(n)) | O(1) |
| Linear search (brute force) | Array | O(n) | O(n) | O(1) |
| Dijkstra's shortest path, using a min-heap as priority queue | Graph of V vertices and E edges | O((V + E) log(V)) | O((V + E) log(V)) | O(V) |
| Dijkstra's shortest path, using an unsorted array as priority queue | Graph of V vertices and E edges | O(V^2) | O(V^2) | O(V) |
| Bellman-Ford shortest path | Graph of V vertices and E edges | O(VE) | O(VE) | O(V) |
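To make the binary-search row concrete, here is a minimal Python sketch (the function name is mine, not from the original sheet): each iteration halves the search interval, which is where the O(log(n)) time and O(1) extra space in the table come from.

```python
def binary_search(a, target):
    """Return the index of target in sorted list a, or -1 if absent.

    Time: O(log n) -- the interval [lo, hi] halves each iteration.
    Space: O(1) -- only two index variables, no recursion.
    """
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1   # target can only be in the right half
        else:
            hi = mid - 1   # target can only be in the left half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```

Contrast this with the linear (brute force) row: scanning the array element by element needs no sorted precondition but costs O(n) comparisons.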

Sorting

| Algorithm | Data Structure | Time: Best | Time: Average | Time: Worst | Worst-Case Auxiliary Space |
|---|---|---|---|---|---|
| Quicksort | Array | O(n log(n)) | O(n log(n)) | O(n^2) | O(n) |
| Mergesort | Array | O(n log(n)) | O(n log(n)) | O(n log(n)) | O(n) |
| Heapsort | Array | O(n log(n)) | O(n log(n)) | O(n log(n)) | O(1) |
| Bubble Sort | Array | O(n) | O(n^2) | O(n^2) | O(1) |
| Insertion Sort | Array | O(n) | O(n^2) | O(n^2) | O(1) |
| Selection Sort | Array | O(n^2) | O(n^2) | O(n^2) | O(1) |
| Bucket Sort | Array | O(n+k) | O(n+k) | O(n^2) | O(nk) |
| Radix Sort | Array | O(nk) | O(nk) | O(nk) | O(n+k) |
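A short sketch of mergesort shows why its worst case stays O(n log(n)) while quicksort's degrades to O(n^2): the input is always split exactly in half, so the recursion depth is log(n) regardless of the data, and each level does O(n) merging work using O(n) auxiliary buffers.

```python
def merge_sort(a):
    """Return a sorted copy of list a.

    Time: O(n log n) in every case -- log n levels of splitting,
    O(n) merge work per level.
    Auxiliary space: O(n) for the merge buffers.
    """
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])

    # Merge the two sorted halves into one sorted output list.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```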

Data Structures

| Data Structure | Indexing (Avg) | Search (Avg) | Insertion (Avg) | Deletion (Avg) | Indexing (Worst) | Search (Worst) | Insertion (Worst) | Deletion (Worst) | Space (Worst) |
|---|---|---|---|---|---|---|---|---|---|
| Basic Array | O(1) | O(n) | - | - | O(1) | O(n) | - | - | O(n) |
| Dynamic Array | O(1) | O(n) | O(n) | O(n) | O(1) | O(n) | O(n) | O(n) | O(n) |
| Singly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
| Doubly-Linked List | O(n) | O(n) | O(1) | O(1) | O(n) | O(n) | O(1) | O(1) | O(n) |
| Skip List | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n log(n)) |
| Hash Table | - | O(1) | O(1) | O(1) | - | O(n) | O(n) | O(n) | O(n) |
| Binary Search Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) | O(n) | O(n) | O(n) | O(n) |
| Cartesian Tree | - | O(log(n)) | O(log(n)) | O(log(n)) | - | O(n) | O(n) | O(n) | O(n) |
| B-Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| Red-Black Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| Splay Tree | - | O(log(n)) | O(log(n)) | O(log(n)) | - | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
| AVL Tree | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(n) |
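The hash-table row's gap between average O(1) and worst O(n) is easiest to see in a toy separate-chaining implementation (the class below is my own illustration, not from the original sheet): with a good hash function chains stay short, but if every key landed in the same bucket, each operation would degrade to a linear scan of all n entries.

```python
class ChainedHashTable:
    """Minimal separate-chaining hash table (illustrative sketch).

    Average case: O(1) search/insert, assuming keys spread evenly
    across buckets so chains stay short.
    Worst case: O(n), when all keys collide into one bucket and each
    operation scans the whole chain.
    """
    def __init__(self, num_buckets=8):
        self.buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        # hash() maps the key to one of the buckets.
        return self.buckets[hash(key) % len(self.buckets)]

    def insert(self, key, value):
        chain = self._bucket(key)
        for i, (k, _) in enumerate(chain):
            if k == key:
                chain[i] = (key, value)  # overwrite an existing key
                return
        chain.append((key, value))

    def search(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return None

t = ChainedHashTable()
t.insert("a", 1)
t.insert("b", 2)
print(t.search("a"))  # 1
```

Production hash tables (e.g. Python's `dict`) add resizing so the average load per bucket stays constant as n grows.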

Heaps

| Heap Type | Heapify | Find Max | Extract Max | Increase Key | Insert | Delete | Merge |
|---|---|---|---|---|---|---|---|
| Sorted Linked List | - | O(1) | O(1) | O(n) | O(n) | O(1) | O(m+n) |
| Unsorted Linked List | - | O(n) | O(n) | O(1) | O(1) | O(1) | O(1) |
| Binary Heap | O(n) | O(1) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(m+n) |
| Binomial Heap | - | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) | O(log(n)) |
| Fibonacci Heap | - | O(1) | O(log(n))* | O(1)* | O(1) | O(log(n))* | O(1) |

(* amortized time)
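The binary-heap row maps directly onto Python's standard-library `heapq` module, which implements a binary min-heap over a plain list (so the table's "Find Max"/"Extract Max" correspond to the min here):

```python
import heapq

data = [9, 4, 7, 1, 8]
heapq.heapify(data)           # heapify: O(n), builds the heap in place
smallest = data[0]            # find min: O(1), the root sits at index 0
print(heapq.heappop(data))    # extract min: O(log n), re-sifts the root -> 1
heapq.heappush(data, 3)       # insert: O(log n), sifts the new item up
```

Heapsort from the sorting table is exactly this pair of ideas: heapify once, then extract the root n times.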

Graphs

| Node / Edge Management | Storage | Add Vertex | Add Edge | Remove Vertex | Remove Edge | Query |
|---|---|---|---|---|---|---|
| Adjacency list | O(V+E) | O(1) | O(1) | O(V + E) | O(E) | O(V) |
| Incidence list | O(V+E) | O(1) | O(1) | O(E) | O(E) | O(E) |
| Adjacency matrix | O(V^2) | O(V^2) | O(1) | O(V^2) | O(1) | O(1) |
| Incidence matrix | O(V·E) | O(V·E) | O(V·E) | O(V·E) | O(V·E) | O(E) |

Notation for asymptotic growth

| Letter | Bound | Growth |
|---|---|---|
| Θ (theta) | upper and lower, tight [1] | equal [2] |
| O (big-oh) | upper, tightness unknown | less than or equal [3] |
| o (small-oh) | upper, not tight | less than |
| Ω (big omega) | lower, tightness unknown | greater than or equal |
| ω (small omega) | lower, not tight | greater than |

[1] Big O is an upper bound, while Omega is a lower bound. Theta requires both, which is why it is called a tight bound (it must be both an upper and a lower bound). For example, an algorithm taking Omega(n log n) takes at least n log n time but has no stated upper limit. An algorithm taking Theta(n log n) is far preferable, since it takes AT LEAST n log n (Omega(n log n)) and NO MORE THAN n log n (O(n log n)).

[2] f(n) = Θ(g(n)) means f (the running time of the algorithm) grows exactly like g as n (the input size) grows. In other words, the growth rate of f(n) is asymptotically proportional to g(n).

[3] The same idea, except here the growth rate is no faster than g(n). Big-oh is the most useful bound in practice because it represents worst-case behavior.

In short, if an algorithm is __ then its performance is __:

| Algorithm | Performance |
|---|---|
| o(n) | < n |
| O(n) | ≤ n |
| Θ(n) | = n |
| Ω(n) | ≥ n |
| ω(n) | > n |
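To put footnote [1] in numbers, a quick comparison shows how fast n log n and n^2 diverge, which is why a Θ(n log n) sort is preferred over an O(n^2) one as inputs grow:

```python
import math

# Rough operation counts for an n log n vs. an n^2 algorithm.
# At n = 1,000,000: n log2(n) is about 2 * 10^7, while n^2 is 10^12 --
# a factor of roughly 50,000.
for n in (10, 1000, 1_000_000):
    print(n, round(n * math.log2(n)), n * n)
```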

Big-O Complexity Chart

[Figure: Big-O complexity chart comparing the growth rates of common complexity classes]
