Spark SQL error: expression 'a.id' is neither present in the group by, nor is it an aggregate function.

Today a Spark SQL query with a GROUP BY clause failed. The query was:

select a.id as model_id,
       a.model as model_name,
       count(a.model) as total,
       sum(b.token) as token
from chatnio_qa as a
left join model_count as b on a.model = b.model
group by a.model

It raised the following error:

expression 'a.`id`' is neither present in the group by, nor is it an aggregate function. Add to group by or wrap in first() (or first_value) if you don't care which value you get.

The cause: the query groups by a.model, so a single group can contain several different values of a.id (rows with the same a.model may have different a.id), and Spark has no way to decide which one to return. Wrapping the column in first() tells Spark to take the first a.id it encounters within each a.model group. The corrected Spark SQL is:

select first(a.id) as model_id,
       a.model as model_name,
       count(a.model) as total,
       sum(b.token) as token
from chatnio_qa as a
left join model_count as b on a.model = b.model
group by a.model
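The ambiguity and the fix can be reproduced in a minimal sketch, with SQLite standing in for Spark and made-up data. SQLite has no first() aggregate, so MIN(a.id) serves as a deterministic stand-in that likewise reduces each a.model group to a single id; the table and column names follow the article, everything else is an assumption for illustration.

```python
import sqlite3

# In-memory database; table/column names follow the article, data is made up.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE chatnio_qa (id INTEGER, model TEXT)")
cur.execute("CREATE TABLE model_count (model TEXT, token INTEGER)")
# Two different ids share the model 'gpt-4', so a.id is ambiguous
# once rows are grouped by a.model -- exactly the situation Spark rejects.
cur.executemany("INSERT INTO chatnio_qa VALUES (?, ?)",
                [(1, "gpt-4"), (2, "gpt-4"), (3, "claude")])
cur.executemany("INSERT INTO model_count VALUES (?, ?)",
                [("gpt-4", 10), ("gpt-4", 20), ("claude", 5)])

# MIN(a.id) stands in for Spark's first(): one id per a.model group.
rows = cur.execute("""
    SELECT MIN(a.id)      AS model_id,
           a.model        AS model_name,
           COUNT(a.model) AS total,
           SUM(b.token)   AS token
    FROM chatnio_qa AS a
    LEFT JOIN model_count AS b ON a.model = b.model
    GROUP BY a.model
    ORDER BY model_name
""").fetchall()
print(rows)  # [(3, 'claude', 1, 5), (1, 'gpt-4', 4, 60)]
```

If every a.id should survive rather than being collapsed, the other option the error message hints at is to add a.id to the GROUP BY, which yields one row per (id, model) pair instead of one row per model.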
