An EMNLP-style LaTeX table

\begin{table*}[!ht]
\renewcommand{\arraystretch}{1.0}
\centering
{
\begin{tabular}{L{0.85\columnwidth}|C{0.2\columnwidth}C{0.2\columnwidth}|C{0.2\columnwidth}C{0.2\columnwidth}}
\hline
\multirow{2}{*}{\textbf{Model}}   & \multicolumn{2}{c|}{\textbf{TrecQA}}\Tstrut  & \multicolumn{2}{c}{\textbf{WikiQA}} \\ 
\cline{2-5}              & \multicolumn{1}{c}{MAP}\Tstrut   & \multicolumn{1}{c|}{MRR}   & \multicolumn{1}{c}{MAP}   & \multicolumn{1}{c}{MRR} \\
\hline \hline
BERT + \modelacronym + Transfer & \textbf{0.914} & \textbf{0.957} & \textbf{0.857} & \textbf{0.872} \Tstrut  \\
BERT + Transformers + Transfer & 0.895 & 0.939 & 0.831 & 0.848\Tstrut \\
\hline
BERT + \modelacronym & 0.906 & 0.949 & 0.821 & 0.832 \Tstrut \\
BERT + Transformers & 0.886 & 0.926 & 0.813 & 0.828 \Tstrut \\
ELMo + Compare-Aggregate & 0.850 & 0.898 & 0.746 & 0.762 \Tstrut \\
\hline
BERT + Transfer & 0.902 & 0.949 & 0.832  & 0.849 \Tstrut \\
BERT & 0.877 & 0.922 & 0.810 & 0.827 \Tstrut \\
\hline
QC + PR + MP CNN \shortcite{tayyar-madabushi-etal-2018-integrating} & 0.865 & 0.904 & --- & --- \Tstrut \\
IWAN + sCARNN \shortcite{Tran2018TheCA} & 0.829 & 0.875 & 0.716 & 0.722 \Tstrut\\
IWAN \shortcite{Shen2017InterWeightedAN} & 0.822 & 0.889 & 0.733 & 0.750 \Tstrut \\
Compare-Aggregate \shortcite{Bian2017ACM} & 0.821 & 0.899 & 0.748 & 0.758 \Tstrut\\
BiMPM \shortcite{Wang2017BilateralMM} & 0.802 & 0.875 & 0.718 & 0.731 \Tstrut \\
HyperQA \shortcite{HyperQA} & 0.784 & 0.865 & 0.705 & 0.720 \Tstrut \\
NCE-CNN \shortcite{rao2016noise} & 0.801 & 0.877 & --- & --- \Tstrut\\
Attentive Pooling CNN \shortcite{Santos2016AttentivePN} & 0.753 & 0.851 & 0.689 & 0.696 \Tstrut\\
W\&I \shortcite{Wang2015FAQbasedQA} & 0.746 & 0.820 & --- & --- \Tstrut\\
\hline
\end{tabular}
}
\caption{Results on the TrecQA and WikiQA datasets}
\label{tab:TrecQA_WikiQA_overall}
\end{table*}
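The table relies on several macros that are not defined in the snippet itself: the fixed-width column types `L{...}` and `C{...}`, the row strut `\Tstrut`, and the placeholder `\modelacronym`. Below is a minimal preamble sketch showing one plausible set of definitions; the exact values (strut height, model name) are assumptions, not the paper's originals. `\shortcite` is supplied by the ACL/EMNLP style files, so it needs no definition here.

```latex
% Minimal preamble sketch (assumed definitions, not from the paper)
\usepackage{array}     % enables >{...} column specifiers
\usepackage{multirow}  % provides \multirow for the header cell

% Fixed-width columns: L = left-aligned, C = centered
\newcolumntype{L}[1]{>{\raggedright\arraybackslash}p{#1}}
\newcolumntype{C}[1]{>{\centering\arraybackslash}p{#1}}

% \Tstrut adds vertical space above a row following an \hline
% (2.6ex is a typical value; adjust to taste)
\newcommand{\Tstrut}{\rule{0pt}{2.6ex}}

% Placeholder for the model acronym used in the paper
\newcommand{\modelacronym}{XYZ}
```

With these definitions in the preamble, the table body compiles as-is under the EMNLP template, which already loads the natbib support needed for `\shortcite`.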

Source paper: https://arxiv.org/pdf/1909.09696.pdf
