List Deduplication

A recent task involved deduplicating a list, and a quick look around turned up quite a few good approaches.

Noting them down here; reposted from http://yxmhero1989.blog.163.com/blog/static/112157956201381443244790/:


1. {}.fromkeys(list).keys()

list2 = {}.fromkeys(list1).keys()
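
As a quick illustration (these snippets are Python 2, where dict.keys() returns a list; on Python 3, keys() returns a view, so you would wrap the call in list()):

list1 = [1,4,3,3,4,2,3,4,5,6,1]
# fromkeys builds a dict whose keys are the unique elements of list1;
# keys() then hands those unique values back
list2 = {}.fromkeys(list1).keys()
print list2  # unique values; key order is not guaranteed under Python 2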

2. set

list2 = list(set(list1))
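
For example (a set is unordered, so the result comes back in arbitrary order; the extra list3 here just shows that sorted() can replace list() when a sorted result is fine):

list1 = [1,4,3,3,4,2,3,4,5,6,1]
list2 = list(set(list1))    # unique values, arbitrary order
list3 = sorted(set(list1))  # unique values in ascending order: [1, 2, 3, 4, 5, 6]
print list2, list3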

3. itertools.groupby
import itertools

ids = [1,4,3,3,4,2,3,4,5,6,1]
ids.sort()
it = itertools.groupby(ids)
for k, g in it:
    print k
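
The same technique collapses into a list comprehension; sorted() is used here so ids itself is left untouched, and the name unique_ids is just for illustration:

import itertools

ids = [1,4,3,3,4,2,3,4,5,6,1]
# groupby only merges adjacent equal elements, hence the sort first
unique_ids = [k for k, g in itertools.groupby(sorted(ids))]
print unique_ids  # [1, 2, 3, 4, 5, 6]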

4. The brute-force way

ids = [1,2,3,3,4,2,3,4,5,6,1]
news_ids = []
for id in ids:
    if id not in news_ids:
        news_ids.append(id)

print news_ids

These approaches share a drawback: the element order changes after deduplication (strictly speaking, the brute-force loop in method 4 does keep the order of first occurrence; it is the dict, set, and groupby approaches that reorder). As for efficiency, the first method is said to be a little faster than the second.
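
That speed claim is easy to check rather than take on faith. A rough timeit sketch under Python 2 (numbers will vary with machine and data) could look like this:

import timeit

setup = "list1 = range(1000) * 5"  # 5000 ints with plenty of duplicates
print timeit.timeit("{}.fromkeys(list1).keys()", setup=setup, number=1000)
print timeit.timeit("list(set(list1))", setup=setup, number=1000)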


5. Sort by original index: deduplicate while keeping element order

# we want [1, 4, 3, 2, 5, 6], not [1, 2, 3, 4, 5, 6]
ids = [1,4,3,3,4,2,3,4,5,6,1]
news_ids = list(set(ids))
news_ids.sort(key=ids.index)
print news_ids  # [1, 4, 3, 2, 5, 6]
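
Note that ids.index rescans the list for every unique element when the sort keys are computed, so this is roughly quadratic on long lists with many distinct values. The same technique also fits on one line:

ids = [1,4,3,3,4,2,3,4,5,6,1]
# sort the unique values by the position of their first occurrence in ids
news_ids = sorted(set(ids), key=ids.index)
print news_ids  # [1, 4, 3, 2, 5, 6]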

6. reduce

ids = [1,4,3,3,4,2,3,4,5,6,1]
func = lambda x, y: x if y in x else x + [y]
print reduce(func, [[], ] + ids)  # [1, 4, 3, 2, 5, 6]
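
The snippet above relies on the Python 2 built-in reduce; under Python 3 the same idea needs an import from functools, roughly:

from functools import reduce

ids = [1,4,3,3,4,2,3,4,5,6,1]
func = lambda x, y: x if y in x else x + [y]
# the empty list is passed as the initial accumulator
# instead of being prepended to ids
print(reduce(func, ids, []))  # [1, 4, 3, 2, 5, 6]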


References:

http://the5fire.com/python-remove-duplicates-in-list.html
http://xcw31.diandian.com/post/2012-11-28/40042801718
http://www.benben.cc/blog/?p=386
http://blog.csdn.net/zhengnz/article/details/6265282

