Pushing H.264 + AAC over RTMP

1. Overview

This article describes how to build an RTMP client with an open-source RTMP library and push H.264 and AAC streams to an RTMP server. I tested two open-source projects: rtmpdump, and srslibrtmp (the latter is a project, developed in China, that optimizes and extends rtmpdump).

srslibrtmp supports multiple platforms (Linux/macOS/Windows as well as ARM/MIPS cross targets), but its cross-compilation support is weak: it does not reference the cross toolchain flexibly and only supports the system's default sysroot path, so if your toolchain lives inside some SDK you will have to modify the Makefile. I developed and tested against the SDK of a MIPS-based IP camera; the SRS server and the RTMP client both ran fine, with quite reasonable resource usage. The only drawback is that the interface srslibrtmp provides handles AAC poorly (an 8000 Hz sample rate is not supported; I believe this is a bug, or possibly my own misuse, and I was too lazy to dig into the code — the RTMP client I later built directly on the rtmpdump interface runs fine).

rtmpdump is the official RTMP library, but the project lacks samples and interface documentation (its RTMP server architecture is also said to be mediocre; I only studied the RTMP client side, so the server part is not discussed here). The biggest advantage of building an RTMP client with it is that you control the data packaging yourself (RTMP and FLV both come from Adobe, and the RTMP payload format follows the FLV format — more on this below). For how to call the API I referred to these two posts: "rtmp 推送h264 + aac 的数据" and "使用librtmp库发布直播流"; where anything is unclear, you can fall back on reading the source. Building rtmpdump itself is fairly flexible; see the project's README.

An RTMP server can be set up with nginx (see "手把手教你搭建Nginx-rtmp流媒体服务器+使用ffmpeg推流") or by using the RTMP server included in srslibrtmp (which can run on embedded platforms).

 

2. Implementation

I will not introduce the RTMP protocol here. In fact, to implement an RTMP client purely by calling the API you do not need to understand the protocol at all; it is enough to know that it runs over TCP on port 1935. Readers who want the details can consult other resources.

When pushing H.264 and AAC over RTMP, the stream content falls into four categories. Right after connecting to the RTMP server you must first send the H.264 sequence header and the AAC sequence header (for the format, see the FLV specification, "Video File Format Specification Version 10"); only from these two packets can the decoder learn how to decode the H.264 and AAC data. Everything that follows is the H.264 and AAC payload data.

Recall that an FLV file consists of an FLV file header / previous tag length / FLV tag / previous tag length / ... sequence, where tags come in three types: video, audio and script. RTMP only needs to transport the FLV tag, and with the FLV tag header stripped off (below we call this an rtmp tag). The rtmp tag is wrapped into an RTMP packet as its body, and what RTMP transports is RTMP packets (in the order described above: first the H.264 sequence header tag, then the AAC sequence header tag, then the H.264 and AAC data tags). For the packaging details see "Video File Format Specification Version 10", or my other post, "flv封装H264+AAC[附完整代码]" (my FLV muxer has some limitations: it only supports an 8 kHz sample rate, 16-bit samples, a 1024-byte PCM buffer, and so on; I am too lazy to fix them, but they should all be marked with "TODO" in the source).
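The video payload tags follow the same pattern. As a sketch of what the `rtmp_write_avc_data_tag` helper used in the example might do (the exact helper lives in myrtmp.h; this standalone version is my own reconstruction), the body is a 5-byte VIDEODATA tag header followed by the NALU data whose start codes have already been replaced by big-endian lengths:

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Fill the body of an AVC NALU rtmp tag: a 5-byte VIDEODATA header
 * followed by the NALU data, whose 4-byte start codes have already
 * been replaced with big-endian NALU lengths.  The body size is
 * buf_len + 5, matching the m_nBodySize computed in the example. */
static void write_avc_data_tag(uint8_t *body, const uint8_t *buf,
			       int buf_len, int keyframe)
{
	body[0] = keyframe ? 0x17 : 0x27; /* FrameType(1/2)<<4 | CodecID 7 */
	body[1] = 0x01;			  /* AVCPacketType = 1: NALU */
	body[2] = 0x00;			  /* CompositionTime = 0 ... */
	body[3] = 0x00;
	body[4] = 0x00;			  /* ... (24-bit; no B-frames here) */
	memcpy(&body[5], buf, (size_t)buf_len);
}
```

With `AVCPacketType = 0` and an AVCDecoderConfigurationRecord carrying the SPS/PPS as the payload, the same 5-byte header shape also produces the AVC sequence header tag (11 bytes of record overhead + 5 bytes of tag header is where the `SPS_LEN + PPS_LEN + 16` body size in the example comes from).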

A few points deserve a brief mention:

(1) I am not clear what the absolute/relative timestamp flag m_hasAbsTimestamp in RTMPPacket really means; both settings worked for me (I use 0 as the timestamp base);

(2) RTMP_SendPacket can either queue the packet or send it directly; I have not looked into the difference between the two;

(3) When calling RTMP_SendPacket from different threads, it is best to hold a lock; I have not investigated the underlying reason.
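Note (3) amounts to serializing all sends through one mutex, which is exactly what the example program does with `av_mutex`. A self-contained sketch of the pattern, where `stub_send()` is a stand-in of my own for RTMP_SendPacket so the sketch compiles on its own:

```c
#include <assert.h>
#include <pthread.h>
#include <stdbool.h>

/* Serialize all packet sends through one mutex so the video and audio
 * callbacks, which run on different threads, never interleave writes
 * on the shared RTMP session.  stub_send() stands in for
 * RTMP_SendPacket here so the sketch is self-contained. */

static pthread_mutex_t send_mutex = PTHREAD_MUTEX_INITIALIZER;
static int packets_sent;

static bool stub_send(const void *pkt)
{
	(void)pkt;
	packets_sent++;	/* shared state: this is what the lock protects */
	return true;
}

static bool locked_send(const void *pkt)
{
	bool ok;

	pthread_mutex_lock(&send_mutex);
	ok = stub_send(pkt);
	pthread_mutex_unlock(&send_mutex);
	return ok;
}
```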

 

3. Example

The example below briefly demonstrates how to write RTMP client code with the library provided by rtmpdump and push H.264 and AAC streams to an RTMP server.

For the full source, see my GitHub: https://github.com/steveliu121/pistreamer/tree/master/rtmp

#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <stdbool.h>
#include <signal.h>
#include <unistd.h>
#include <pthread.h>
#include <sys/time.h>

#include "myrtmp.h"
#include "aacenc.h"
#include "my_middle_media.h"
#include <fdk-aac/aacenc_lib.h>	/* HANDLE_AACENCODER */


/* XXX: WARNING: the PCM period buffer length must be a factor of
 * the AAC input PCM frame length, or the AAC timestamps will be wrong.
 * Here the PCM period buffer length == 1024 bytes,
 * and the AAC input PCM frame length == 2048.
 */


#define SPS_LEN		28
#define PPS_LEN		6

#define RES_720P
#ifdef RES_720P
#define RESOLUTION_720P MY_VIDEO_RES_720P
#define RES_WIDTH	1280
#define RES_HEIGHT	720
#endif

#define VIDEO_FPS		15
#define VIDEO_TIME_SCALE	90000
#define VIDEO_SAMPLE_DURATION	(VIDEO_TIME_SCALE / VIDEO_FPS)

#define AUDIO_SAMPLERATE	8000
#define AUDIO_CHANNELS		1
#define AUDIO_TIME_SCALE	(AUDIO_SAMPLERATE * AUDIO_CHANNELS)
/* AUDIO_SAMPLE_DURATION = AUDIO_TIME_SCALE / audio fps:
 * one 1024-byte (512-sample, 16-bit mono) period = 64 ms,
 * so audio fps = 15.625 and 8000 / 15.625 = 512 ticks per frame */
#define AUDIO_SAMPLE_DURATION	512
#define AAC_BITRATE		16000


static const uint8_t sps_buf[SPS_LEN] = {0x27, 0x64, 0x00, 0x29, 0xac, 0x1a, 0xd0, 0x0a,
			0x00, 0xb7, 0x4d, 0xc0, 0x40, 0x40, 0x50, 0x00,
			0x00, 0x03, 0x00, 0x10, 0x00 ,0x00, 0x03, 0x01,
			0xe8, 0xf1 ,0x42, 0x2a};
static const uint8_t pps_buf[PPS_LEN] = {0x28, 0xee, 0x01, 0x34, 0x92, 0x24};
/*
static const uint8_t sps_buf[SPS_LEN + 4] = {0x00, 0x00, 0x00, 0x1c, 0x27, 0x64,
			0x00, 0x29, 0xac, 0x1a, 0xd0, 0x0a,
			0x00, 0xb7, 0x4d, 0xc0, 0x40, 0x40, 0x50, 0x00,
			0x00, 0x03, 0x00, 0x10, 0x00 ,0x00, 0x03, 0x01,
			0xe8, 0xf1 ,0x42, 0x2a};
static const uint8_t pps_buf[PPS_LEN + 4] = {0x00, 0x00, 0x00, 0x06, 0x28, 0xee,
			0x01, 0x04, 0x92, 0x24};
			*/
static int g_exit;
static HANDLE_AACENCODER aac_enc_hd;
static uint8_t aac_decoder_conf[64];
static int aac_decoder_conf_len;
uint32_t g_timestamp_begin;
RTMP *rtmp;
RTMPPacket video_pkt;
RTMPPacket audio_pkt;
pthread_mutex_t av_mutex;

void sig_handle(int sig)
{
	g_exit = 1;
}

void h264_cb(const struct timeval *tv, const void *data,
	const int len, const int keyframe)
{
	int ret = 0;
	uint8_t *buf = NULL;
	int buf_len = 0;
	uint32_t timestamp = 0;
	int buf_payload_len = 0;

	timestamp = (tv->tv_sec * 1000) + (tv->tv_usec / 1000);

	if (g_timestamp_begin == 0)
		g_timestamp_begin = timestamp;

	/* strip sps/pps from I frame and
	 * replace NALU start flag '0x00/0x00/0x00/0x01' with
	 * the length of NALU in BIGENDIAN
	 */
	if (keyframe) {
		buf = (uint8_t *)data + SPS_LEN + PPS_LEN + 2 * 4;
		buf_len = len - SPS_LEN - PPS_LEN - 2 * 4;
	} else {
		buf = (uint8_t *)data;
		buf_len = len;
	}
	buf_payload_len = buf_len - 4;
	buf[0] = buf_payload_len >> 24;
	buf[1] = buf_payload_len >> 16;
	buf[2] = buf_payload_len >> 8;
	buf[3] = buf_payload_len & 0xff;

	video_pkt.m_headerType = RTMP_PACKET_SIZE_LARGE;
	video_pkt.m_nTimeStamp = (timestamp - g_timestamp_begin);
	video_pkt.m_nBodySize = buf_len + 5;//5bytes VIDEODATA tag header
	rtmppacket_alloc(&video_pkt, video_pkt.m_nBodySize);
	rtmp_write_avc_data_tag(video_pkt.m_body, buf, buf_len, keyframe);

	ret = rtmp_isconnected(rtmp);
	if (ret == true) {
		/* true: send to outqueue;false: send directly */
		pthread_mutex_lock(&av_mutex);
		ret = rtmp_sendpacket(rtmp, &video_pkt, true);
		if (ret == false)
			printf("rtmp send video packet fail\n");
		pthread_mutex_unlock(&av_mutex);
	}

	rtmppacket_free(&video_pkt);
}

void audio_cb(const struct timeval *tv, const void *pcm_buf,
	const int pcm_len, const void *spk_buf)
{
	int ret = 0;
	uint8_t *aac_buf = NULL;
	int aac_buf_len = 0;
	uint32_t timestamp = 0;

	timestamp = (tv->tv_sec * 1000) + (tv->tv_usec / 1000);

	if (g_timestamp_begin == 0)
		g_timestamp_begin = timestamp;

	aac_buf_len = aac_encode(aac_enc_hd, pcm_buf, pcm_len, &aac_buf);
	if (aac_buf_len == 0)
		return;

	audio_pkt.m_headerType = RTMP_PACKET_SIZE_LARGE;
	audio_pkt.m_nTimeStamp = (timestamp - g_timestamp_begin);
	audio_pkt.m_nBodySize = aac_buf_len - 7 + 2;//7bytes ADTS header & 2bytes AUDIODATA tag header
	rtmppacket_alloc(&audio_pkt, audio_pkt.m_nBodySize);
	rtmp_write_aac_data_tag(audio_pkt.m_body, aac_buf, aac_buf_len);

	ret = rtmp_isconnected(rtmp);
	if (ret == true) {
		/* true: send to outqueue;false: send directly */
		pthread_mutex_lock(&av_mutex);
		ret = rtmp_sendpacket(rtmp, &audio_pkt, true);
		if (ret == false)
			printf("rtmp send audio packet fail\n");
		pthread_mutex_unlock(&av_mutex);
	}

	rtmppacket_free(&audio_pkt);
}

static int __connect2rtmpsvr(char *url)
{
	int ret = 0;

	rtmp = rtmp_alloc();
	rtmp_init(rtmp);

	rtmp->Link.timeout = 5;	/* default is 30 s */
	ret = rtmp_setupurl(rtmp, url);
	if (ret == false) {
		printf("rtmp setup url fail\n");
		goto exit;
	}

	rtmp_enablewrite(rtmp);

	ret = rtmp_connect(rtmp, NULL);
	if (ret == false) {
		printf("rtmp connect fail\n");
		goto exit;
	}

	ret = rtmp_connectstream(rtmp, 0);
	if (ret == false) {
		printf("rtmp connect stream fail\n");
		rtmp_close(rtmp);
		goto exit;
	}

	return 0;

exit:
	return -1;
}

static void __rtmp_send_sequence_header(void)
{
	int ret = 0;

/* rtmp send audio/video sequence header frame */
	rtmppacket_reset(&video_pkt);
	rtmppacket_reset(&audio_pkt);

	video_pkt.m_packetType = RTMP_PACKET_TYPE_VIDEO;
	video_pkt.m_nChannel = 0x04;
	video_pkt.m_nInfoField2 = rtmp->m_stream_id;
	video_pkt.m_hasAbsTimestamp = false;

	audio_pkt.m_packetType = RTMP_PACKET_TYPE_AUDIO;
	audio_pkt.m_nChannel = 0x04;
	audio_pkt.m_nInfoField2 = rtmp->m_stream_id;
	audio_pkt.m_hasAbsTimestamp = false;

	video_pkt.m_headerType = RTMP_PACKET_SIZE_LARGE;
	video_pkt.m_nTimeStamp = 0;
	video_pkt.m_nBodySize = SPS_LEN + PPS_LEN + 16;
	rtmppacket_alloc(&video_pkt, video_pkt.m_nBodySize);
	rtmp_write_avc_sequence_header_tag(video_pkt.m_body,
						sps_buf, SPS_LEN,
						pps_buf, PPS_LEN);

	ret = rtmp_isconnected(rtmp);
	if (ret == true) {
		/* true: send to outqueue;false: send directly */
		pthread_mutex_lock(&av_mutex);
		ret = rtmp_sendpacket(rtmp, &video_pkt, true);
		if (ret == false)
			printf("rtmp send video packet fail\n");
		pthread_mutex_unlock(&av_mutex);
	}

	rtmppacket_free(&video_pkt);


	audio_pkt.m_headerType = RTMP_PACKET_SIZE_LARGE;
	audio_pkt.m_nTimeStamp = 0;
	audio_pkt.m_nBodySize = 4;
	rtmppacket_alloc(&audio_pkt, audio_pkt.m_nBodySize);
	rtmp_write_aac_sequence_header_tag(audio_pkt.m_body,
					AUDIO_SAMPLERATE, AUDIO_CHANNELS);

	ret = rtmp_isconnected(rtmp);
	if (ret == true) {
		/* true: send to outqueue;false: send directly */
		pthread_mutex_lock(&av_mutex);
		ret = rtmp_sendpacket(rtmp, &audio_pkt, true);
		if (ret == false)
			printf("rtmp send audio packet fail\n");
		pthread_mutex_unlock(&av_mutex);
	}

	rtmppacket_free(&audio_pkt);
}

int main(int argc, char *argv[])
{
	int ret = 0;

	MYVideoInputChannel chn = {
		.channelId = 0,
		.res = RESOLUTION_720P,
		.fps = VIDEO_FPS,
		.bitrate = 1024,
		.gop = 1,
		.vbr = MY_BITRATE_MODE_CBR,
		.cb = h264_cb
	};

	MYVideoInputOSD osd_info = {
		.pic_enable = 0,
		.pic_path = "/usr/osd_char_lib/argb_2222",
		.pic_x = 200,
		.pic_y = 200,
		.time_enable = 1,
		.time_x = 100,
		.time_y  = 100
	};

	MYAudioInputAttr_aec audio_in = {
		.sampleRate = AUDIO_SAMPLERATE,
		.sampleBit = 16,
		.volume = 95,
		.cb = audio_cb
	};


	signal(SIGTERM, sig_handle);
	signal(SIGINT, sig_handle);

	pthread_mutex_init(&av_mutex, NULL);

	rtmp_logsetlevel(RTMP_LOGINFO);

	if (argc <= 1) {
		printf("Usage: %s <rtmp_url>\n"
		"	rtmp_url	RTMP stream url to publish\n"
		"For example:\n"
		"	%s rtmp://127.0.0.1:1935/live/livestream\n",
		argv[0], argv[0]);
		exit(-1);
	}


	ret = __connect2rtmpsvr(argv[1]);
	if (ret < 0)
		goto exit;

/* create aacencoder */
	ret = create_aac_encoder(&aac_enc_hd,
				AUDIO_CHANNELS, AUDIO_SAMPLERATE, AAC_BITRATE,
				aac_decoder_conf, &aac_decoder_conf_len);
	if (ret < 0)
		goto exit;

	__rtmp_send_sequence_header();

/* start audio&video device and receive buffers, do muxer in callback */
	MYAV_Context_Init();

	ret = MYVideoInput_Init();
	if (ret)
		goto out;

	ret = MYVideoInput_AddChannel(chn);
	if (ret)
		goto out;

	ret = MYVideoInput_SetOSD(chn.channelId, &osd_info);
	if (ret)
		goto out;

	ret = MYAudioInputOpen(&audio_in);
	if (ret)
		goto out;

	ret = MYVideoInput_Start();
	if (ret)
		goto out;

	ret = MYAudioInputStart();
	if (ret)
		goto out;

	while (!g_exit)
		sleep(1);

out:
	MYVideoInput_Uninit();
	MYAudioInputStop();
	MYAudioInputClose();

	MYAV_Context_Release();

exit:
	pthread_mutex_destroy(&av_mutex);
	rtmp_close(rtmp);
	rtmp_free(rtmp);
	rtmp = NULL;

	return ret;

}

 
