As mentioned earlier, a session can contain multiple streams, such as an audio stream and a video stream. This section looks at what makes up an audio stream.
The media stream
struct pjmedia_stream
{
    pjmedia_endpt        *endpt;        /**< Media endpoint.            */
    pjmedia_codec_mgr    *codec_mgr;    /**< Codec manager instance.    */
    pjmedia_stream_info   si;           /**< Creation parameter.        */
    pjmedia_port          port;         /**< Port interface.            */
    pjmedia_channel      *enc;          /**< Encoding channel.          */
    pjmedia_channel      *dec;          /**< Decoding channel.          */
    pjmedia_transport    *transport;    /**< Stream transport.          */
    pjmedia_codec        *codec;        /**< Codec instance being used. */
    pjmedia_codec_param   codec_param;  /**< Codec param.               */
    pj_mutex_t           *jb_mutex;     /**< Protects the jitter buffer.*/
    pjmedia_jbuf         *jb;           /**< Jitter buffer.             */
    pjmedia_rtcp_session  rtcp;         /**< RTCP for incoming RTP.     */
};
This struct is very large and most of its members are omitted here. endpt points to the stream's parent media endpoint, and transport points to the transport object created in the previous section.
Creating the media stream
In the sample program simpleua.c, call_on_media_update is invoked once the SIP/SDP negotiation succeeds, and it is there that the stream is created and started.
/* Create new audio media stream, passing the stream info, and also the
 * media socket that we created earlier.
 */
status = pjmedia_stream_create(g_med_endpt, inv->dlg->pool, &stream_info,
                               g_med_transport[0], NULL, &g_med_stream);

/* Start the UDP media transport */
pjmedia_transport_media_start(g_med_transport[0], 0, 0, 0, 0);
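simpleua.c then starts the stream it just created with a single call. A minimal sketch with error handling omitted (the implementation of pjmedia_stream_start is shown further below):

/* Start the audio stream: clears the pause flag on both channels. */
status = pjmedia_stream_start(g_med_stream);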
Let's look at stream creation first.
Creating a stream involves quite a few steps; only some key code is extracted here:
1. Allocate the media stream
2. Initialize a number of stream parameters
3. Get the codec manager and perform the codec-related setup
4. Register the first pair of callbacks, put_frame and get_frame; these are used by the audio device and are covered in the next section
5. Create the jitter buffer, which will be covered separately later
6. Create the encoding and decoding channels
7. Call the media transport attach described in the previous section, passing in the second pair of callbacks, on_rx_rtp and on_rx_rtcp
/*
 * Create media stream.
 */
PJ_DEF(pj_status_t) pjmedia_stream_create( pjmedia_endpt *endpt,
                                           pj_pool_t *pool,
                                           const pjmedia_stream_info *info,
                                           pjmedia_transport *tp,
                                           void *user_data,
                                           pjmedia_stream **p_stream)
{
    pjmedia_stream *stream;
    pjmedia_transport_attach_param att_param;
    pj_status_t status;

    /* Allocate the media stream: */
    stream = PJ_POOL_ZALLOC_T(pool, pjmedia_stream);

    /* Init stream: */
    stream->endpt = endpt;
    stream->codec_mgr = pjmedia_endpt_get_codec_mgr(endpt);
    stream->user_data = user_data;

    /* Keep a copy of the stream info: */
    pj_memcpy(&stream->si, info, sizeof(*info));

    /* Create and initialize codec: */
    status = pjmedia_codec_mgr_alloc_codec( stream->codec_mgr,
                                            &info->fmt, &stream->codec);

    /* First pair of callbacks, used by the sound device side: */
    stream->port.put_frame = &put_frame;
    stream->port.get_frame = &get_frame;

    /* Create jitter buffer */
    status = pjmedia_jbuf_create(pool, &stream->port.info.name,
                                 stream->frame_size,
                                 stream->codec_param.info.frm_ptime,
                                 jb_max, &stream->jb);

    /* Create decoder channel: */
    status = create_channel( pool, stream, PJMEDIA_DIR_DECODING,
                             info->rx_pt, info, &stream->dec);

    /* Create encoder channel: */
    status = create_channel( pool, stream, PJMEDIA_DIR_ENCODING,
                             info->tx_pt, info, &stream->enc);

    /* Second pair of callbacks, handed to the media transport: */
    pj_bzero(&att_param, sizeof(att_param));
    att_param.stream = stream;
    att_param.media_type = PJMEDIA_TYPE_AUDIO;
    att_param.user_data = stream;
    pj_sockaddr_cp(&att_param.rem_addr, &info->rem_addr);
    pj_sockaddr_cp(&stream->rem_rtp_addr, &info->rem_addr);
    if (stream->si.rtcp_mux) {
        pj_sockaddr_cp(&att_param.rem_rtcp, &info->rem_addr);
    } else if (pj_sockaddr_has_addr(&info->rem_rtcp.addr)) {
        pj_sockaddr_cp(&att_param.rem_rtcp, &info->rem_rtcp);
    }
    att_param.addr_len = pj_sockaddr_get_len(&info->rem_addr);
    att_param.rtp_cb2 = &on_rx_rtp;
    att_param.rtcp_cb = &on_rx_rtcp;

    /* Only attach transport when stream is ready. */
    status = pjmedia_transport_attach2(tp, &att_param);

    *p_stream = stream;
    return PJ_SUCCESS;
}
At this point the direction of the data flow is clear: audio packets are first received by pjmedia_transport through the ioqueue, then pass through several layers of callbacks and finally reach the stream's on_rx_rtp.
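To make that callback chain concrete, here is an illustrative sketch of the receive path inside the UDP transport. The struct and function names below are made up for illustration; the real code lives in transport_udp.c. The ioqueue read-completion handler fills a pjmedia_tp_cb_param and hands it to whatever rtp_cb2 was registered via pjmedia_transport_attach2(), which in our case is the stream's on_rx_rtp.

/* Illustrative sketch only; these names are hypothetical, the real code
 * is in transport_udp.c. */
struct my_udp_transport {
    void  *user_data;                            /* att_param.user_data (the stream) */
    void (*rtp_cb2)(pjmedia_tp_cb_param *param); /* att_param.rtp_cb2 (on_rx_rtp)    */
};

static void udp_on_read_complete(struct my_udp_transport *udp,
                                 void *pkt, pj_ssize_t bytes_read)
{
    pjmedia_tp_cb_param param;

    pj_bzero(&param, sizeof(param));
    param.user_data = udp->user_data;   /* the pjmedia_stream             */
    param.pkt       = pkt;              /* raw RTP packet from the socket */
    param.size      = bytes_read;

    if (udp->rtp_cb2)
        (*udp->rtp_cb2)(&param);        /* ends up in the stream's on_rx_rtp */
}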
Now look at starting the stream. There is not much to it; it simply clears the pause flag on both channels.
PJ_DEF(pj_status_t) pjmedia_stream_start(pjmedia_stream *stream)
{
    if (stream->enc && (stream->dir & PJMEDIA_DIR_ENCODING)) {
        stream->enc->paused = 0;
    }
    if (stream->dec && (stream->dir & PJMEDIA_DIR_DECODING)) {
        stream->dec->paused = 0;
    }
    return PJ_SUCCESS;
}
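These are the same paused flags that the pause/resume API operates on later. A short usage sketch, assuming the stream created in simpleua.c is already running:

/* Usage sketch: pause only the sending (encoding) direction while still
 * receiving audio, then resume both directions later. */
pjmedia_stream_pause(g_med_stream, PJMEDIA_DIR_ENCODING);
/* ... */
pjmedia_stream_resume(g_med_stream, PJMEDIA_DIR_ENCODING_DECODING);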
on_rx_rtp
When audio data arrives at on_rx_rtp, it:
1. Updates the RTP session state
2. Parses the payload into frames with the codec
3. Puts each frame into the jitter buffer
/*
 * This callback is called by stream transport on receipt of packets
 * in the RTP socket.
 */
static void on_rx_rtp( pjmedia_tp_cb_param *param)
{
    enum { MAX_FRAMES = 16 };
    pjmedia_stream *stream = (pjmedia_stream*) param->user_data;
    void *pkt = param->pkt;
    pj_ssize_t bytes_read = param->size;
    pjmedia_channel *channel = stream->dec;
    const pjmedia_rtp_hdr *hdr;
    const void *payload;
    unsigned payloadlen;
    pjmedia_rtp_status seq_st;
    pj_status_t status;
    pj_bool_t pkt_discarded = PJ_FALSE;
    pj_timestamp ts;
    pjmedia_frame frames[MAX_FRAMES];
    unsigned i, count = MAX_FRAMES;
    unsigned ts_span;                   /* timestamp length of one frame,
                                           computation omitted here */

    /* Update RTP and RTCP session. */
    status = pjmedia_rtp_decode_rtp(&channel->rtp, pkt, (int)bytes_read,
                                    &hdr, &payload, &payloadlen);

    /* Update RTP session (also checks if RTP session can accept
     * the incoming packet).
     */
    pjmedia_rtp_session_update2(&channel->rtp, hdr, &seq_st,
                                hdr->pt != stream->rx_event_pt);

    /* Get the timestamp of the first sample. */
    ts.u64 = pj_ntohl(hdr->ts);

    /* Parse the payload. */
    status = pjmedia_codec_parse(stream->codec, (void*)payload,
                                 payloadlen, &ts, &count, frames);

    /* Put each frame to jitter buffer. */
    for (i=0; i<count; ++i) {
        unsigned ext_seq;
        pj_bool_t discarded;

        ext_seq = (unsigned)(frames[i].timestamp.u64 / ts_span);
        pjmedia_jbuf_put_frame2(stream->jb, frames[i].buf, frames[i].size,
                                frames[i].bit_info, ext_seq, &discarded);
        if (discarded)
            pkt_discarded = PJ_TRUE;
    }
}
Summary
pjmedia_stream is the most complex structure in pjmedia, and most of the work happens inside this object. Its two pairs of callbacks are the key links in the data path: one pair is bound to the network side, the other to the device side, and together they chain the whole media flow together. This section only covers these basics; pjmedia_stream has many more sub-modules, such as the jitter buffer, which will be covered separately later.
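As a small preview of the device-bound pair, simpleua.c exposes the stream's put_frame/get_frame through its pjmedia_port and connects that port to a sound device port. A minimal sketch, assuming g_snd_port has already been created with pjmedia_snd_port_create() as simpleua.c does, and with error handling omitted:

/* Sketch: wire the stream's port (put_frame/get_frame) to the sound
 * device, so captured audio is pushed into the stream and decoded audio
 * is pulled out of it. */
pjmedia_port *media_port;

pjmedia_stream_get_port(g_med_stream, &media_port);
pjmedia_snd_port_connect(g_snd_port, media_port);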