Understanding the DensePose Code and Paper

Code:

  1. Training: same as Faster R-CNN.
  2. Testing: same as Faster R-CNN.
  3. Visualization:

Texture Transfer Using Estimated Dense Coordinates
image ---- the RGB input
the image's corresponding IUV map ------ coordinates on the 3D body surface
the template image used for "re-dressing": a texture atlas of the 3D surface (25 classes: 24 body parts plus background)
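
Before walking through the code, it helps to see what an IUV map looks like. The following is a minimal sketch using a synthetic array (a real IUV map comes from DensePose inference; the shapes and values here are stand-ins): channel 0 holds the part index (1..24, 0 for background), and channels 1 and 2 hold the U and V surface coordinates scaled to 0..255.

```python
import numpy as np

# Synthetic stand-in for a DensePose IUV map of a 4x4 image.
H, W = 4, 4
IUV = np.zeros((H, W, 3), dtype=np.uint8)
IUV[1:3, 1:3, 0] = 2    # channel 0: part index (0 = background)
IUV[1:3, 1:3, 1] = 128  # channel 1: U coordinate scaled to 0..255
IUV[1:3, 1:3, 2] = 64   # channel 2: V coordinate scaled to 0..255

# All pixel coordinates belonging to part 2 -- the same np.where
# pattern the texture-transfer code uses below.
x, y = np.where(IUV[:, :, 0] == 2)
print(len(x))  # 4 pixels belong to part 2
```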

Per-part steps:

  1. Get the coordinates of the pixels belonging to the part:
     x, y = np.where(IUV[:,:,0] == PartInd)
  2. Read the U, V values at those pixels (their 3D surface coordinates):
     u_current_points = U[x, y]
  3. Map each (U, V) value to a pixel of that part's 200x200 texture tile and sample its color:
     r_current_points = R[((255-v_current_points)*199./255.).astype(int), (u_current_points*199./255.).astype(int)]*255
  4. Paste the sampled texture back onto the original image at the part's pixels:
     R_im[IUV[:,:,0] == PartInd] = r_current_points
import numpy as np

def TransferTexture(TextureIm, im, IUV):
    # TextureIm: (24, 200, 200, 3) array of texture tiles, one per body part.
    # im: the original BGR image; IUV: (H, W, 3) map of part index, U, V.
    U = IUV[:,:,1]
    V = IUV[:,:,2]
    #
    R_im = np.zeros(U.shape)
    G_im = np.zeros(U.shape)
    B_im = np.zeros(U.shape)
    ###
    for PartInd in range(1,23):    ## Set to range(1,23) to ignore the face part.
        tex = TextureIm[PartInd-1,:,:,:].squeeze() # get texture for each part.
        #####
        R = tex[:,:,0]
        G = tex[:,:,1]
        B = tex[:,:,2]
        ###############
        x,y = np.where(IUV[:,:,0]==PartInd)
        u_current_points = U[x,y]   #  Pixels that belong to this specific part.
        v_current_points = V[x,y]
        ##
        # Map (U, V) in 0..255 to row/column indices of the 200x200 tile;
        # V is flipped so the tile origin matches the image origin.
        r_current_points = R[((255-v_current_points)*199./255.).astype(int),(u_current_points*199./255.).astype(int)]*255
        g_current_points = G[((255-v_current_points)*199./255.).astype(int),(u_current_points*199./255.).astype(int)]*255
        b_current_points = B[((255-v_current_points)*199./255.).astype(int),(u_current_points*199./255.).astype(int)]*255
        ##  Get the RGB values from the texture images.
        R_im[IUV[:,:,0]==PartInd] = r_current_points
        G_im[IUV[:,:,0]==PartInd] = g_current_points
        B_im[IUV[:,:,0]==PartInd] = b_current_points
    generated_image = np.concatenate((B_im[:,:,np.newaxis],G_im[:,:,np.newaxis],R_im[:,:,np.newaxis]), axis =2 ).astype(np.uint8)
    BG_MASK = generated_image==0
    generated_image[BG_MASK] = im[BG_MASK]  ## Set the BG as the old image.
    return generated_image
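
The TextureIm argument above is assumed to be a (24, 200, 200, 3) stack of per-part tiles. A common way to obtain it is to cut up a single texture atlas image; the sketch below assumes a 4x6 grid of 200x200 tiles (this layout and the row-major part ordering are assumptions, not stated in the paper), with a zero array standing in for the loaded atlas image.

```python
import numpy as np

# Stand-in for the loaded texture atlas, e.g. cv2.imread of an atlas file;
# assumed layout: 4 rows x 6 columns of 200x200 tiles, one tile per part.
atlas = np.zeros((4 * 200, 6 * 200, 3), dtype=np.uint8)

# Slice the atlas into the (24, 200, 200, 3) stack TransferTexture expects.
TextureIm = np.zeros((24, 200, 200, 3), dtype=atlas.dtype)
for i in range(4):
    for j in range(6):
        TextureIm[6 * i + j] = atlas[200 * i:200 * (i + 1),
                                     200 * j:200 * (j + 1), :]

print(TextureIm.shape)  # (24, 200, 200, 3)
```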

Paper:

  1. Network structure
    Pipeline: 1. First run Faster R-CNN detection to obtain the person regions. 2. A convolutional network then partitions each region into body parts. 3. Within each part, another convolutional head localizes the surface points. 4. The points are converted into IUV heatmaps.
    Loss function: cross-entropy loss.
    Input and output: from one RGB input image, the network produces a segmentation map, an IUV map (I: which of the 25 body-part classes a pixel belongs to; U, V: the UV-map coordinates on the 3D surface), and a detection image (bounding boxes around people).
  2. The DensePose-COCO dataset
    Annotation procedure
    Dataset overview
  3. Results
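
As a small illustration of the cross-entropy loss mentioned above, here is the per-pixel 25-way part classification term (24 parts plus background) in NumPy. The logits are random stand-ins, not real network outputs, and this sketches only the classification part of the training objective.

```python
import numpy as np

rng = np.random.default_rng(0)
logits = rng.normal(size=(25,))  # one pixel's class scores (stand-in values)
target = 3                       # ground-truth part index for this pixel

# Numerically stable softmax followed by negative log-likelihood.
probs = np.exp(logits - logits.max())
probs /= probs.sum()
loss = -np.log(probs[target])    # cross-entropy for this pixel
print(loss > 0.0)
```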
