live555: a detailed walkthrough of the RTSP video-stream receiving source code

The program enters the live555 client code from the rtsp_player_task thread.

1: char *argv[5] = {"openRTSP", "-b", "80000", "-t", ""}; — these are the input arguments. -b sets the FileSink buffer size, here 80000; -t means stream over TCP. run_live_rtsp(int argc, char **argv) is the function that processes these arguments.
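As a rough sketch (the thread creation and the exact body of run_live_rtsp() are the author's own code and are only assumed here), the hand-off looks something like this:

// Sketch only: the player thread builds an openRTSP-style argv and hands it
// to run_live_rtsp(), the author's wrapper around the openRTSP main logic.
extern int run_live_rtsp(int argc, char** argv);

static void rtsp_player_task(void* /*arg*/)
{
  // "-b 80000": FileSink buffer size; "-t": stream over TCP;
  // the empty last slot is presumably filled with the RTSP URL elsewhere.
  char* argv[5] = { (char*)"openRTSP", (char*)"-b", (char*)"80000",
                    (char*)"-t", (char*)"" };
  run_live_rtsp(5, argv);
}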
2: Inside run_live_rtsp(int argc, char **argv), ourClient = createClient(*env, verbosityLevel, progName); creates a client instance. char* optionsResponse = getOptionsResponse(ourClient, url, username, password); sends the OPTIONS request and checks the response. Then
char* sdpDescription = getSDPDescriptionFromURL(ourClient, url, username, password,
      proxyServerName, proxyServerPortNum,
      desiredPortNum);
sends the DESCRIBE request, receives the reply, and returns the SDP:
v=0
o=- 1266888546188420 1 IN IP4 10.0.4.152
s=MPEG Transport Stream, streamed by the LIVE555 Media Server
i=question.ts
t=0 0
a=tool:LIVE555 Streaming Media v2009.07.09
a=type:broadcast
a=control:*
a=range:npt=0-182.512
a=x-qt-text-nam:MPEG Transport Stream, streamed by the LIVE555 Media Server
a=x-qt-text-inf:question.ts
m=video 0 RTP/AVP 33
c=IN IP4 0.0.0.0
a=control:track1
The session is then created from that SDP: session = MediaSession::createNew(*env, sdpDescription); which does MediaSession* newSession = new MediaSession(env); followed by newSession->initializeWithSDP(sdpDescription) to initialize its members — most importantly, creating one subsession per m=... line and selecting the RTP payload to use (see the sketch below).
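A rough sketch of this step, assuming env is the UsageEnvironment* created earlier and sdpDescription is the string returned by DESCRIBE (this is the standard liveMedia pattern, not a copy of run_live_rtsp):

MediaSession* session = MediaSession::createNew(*env, sdpDescription);
if (session == NULL) { /* the SDP could not be parsed */ }

// Walk the subsessions that initializeWithSDP() created, one per "m=" line
MediaSubsessionIterator iter(*session);
MediaSubsession* subsession;
while ((subsession = iter.next()) != NULL) {
  // For this stream: mediumName() == "video", codecName() == "MP2T"
  if (!subsession->initiate(-1 /* simpleRTPoffsetArg */)) {
    // initiate() failed: no RTP socket/source could be set up
  }
}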
subsession->initiate(simpleRTPoffsetArg) sets up everything on the receiving side; it is worth stepping into initiate() carefully. First, fRTPSocket = new Groupsock(env(), tempAddr, rtpPortNum, 255); creates the RTP socket. The call chain is OutputSocket -> Socket(env, port) -> setupDatagramSocket, where int newSocket = socket(AF_INET, SOCK_DGRAM, 0); creates the socket. Because port.num() == 0, ReceivingInterfaceAddr == 0 and INADDR_ANY == 0, the branch if (port.num() != 0 || ReceivingInterfaceAddr != INADDR_ANY) is not taken, so no bind() happens yet. Inside if (!socketJoinGroup(env, socketNum(), groupAddr.s_addr)), the check if (!IsMulticastAddress(groupAddress)) return True; returns immediately. Then ourSourceAddressForMulticast() calls setupDatagramSocket() again, this time with a real port, so bind() does get executed. Back in ourSourceAddressForMulticast(), if (!socketJoinGroup(env, sock, testAddr.s_addr)) now succeeds, and writeSocket()/readSocket() send and read back a hostIdTest packet over that multicast group to obtain the local IP. In short, fRTPSocket = new Groupsock(env(), tempAddr, rtpPortNum, 255); just allocates a socket and leaves it there; a second, temporary socket is created and bound, writeSocket()/readSocket() use multicast loopback on it to learn the local IP, and that temporary socket is then discarded.
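Stripped of the live555 classes, the local-IP discovery described above amounts to something like the following plain BSD-socket sketch (the multicast group and port are only illustrative, not the values live555 actually uses):

#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <unistd.h>
#include <string.h>

// Simplified idea: bind a throwaway UDP socket, join a multicast group,
// send a small test packet to that group and read it back; the source
// address seen on the looped-back packet is our own IP.
static in_addr_t discoverLocalIP()
{
  int sock = socket(AF_INET, SOCK_DGRAM, 0);
  if (sock < 0) return 0;

  struct sockaddr_in addr;
  memset(&addr, 0, sizeof addr);
  addr.sin_family = AF_INET;
  addr.sin_addr.s_addr = INADDR_ANY;
  addr.sin_port = htons(15947);                            // illustrative test port
  bind(sock, (struct sockaddr*)&addr, sizeof addr);

  struct ip_mreq mreq;
  mreq.imr_multiaddr.s_addr = inet_addr("232.251.13.5");   // illustrative group
  mreq.imr_interface.s_addr = INADDR_ANY;
  setsockopt(sock, IPPROTO_IP, IP_ADD_MEMBERSHIP, &mreq, sizeof mreq); // socketJoinGroup

  unsigned testId = 0x12345678;                            // plays the role of hostIdTest
  struct sockaddr_in dest = addr;
  dest.sin_addr = mreq.imr_multiaddr;
  sendto(sock, &testId, sizeof testId, 0, (struct sockaddr*)&dest, sizeof dest); // writeSocket

  unsigned echoed = 0;
  struct sockaddr_in from; socklen_t fromLen = sizeof from;
  recvfrom(sock, &echoed, sizeof echoed, 0, (struct sockaddr*)&from, &fromLen);  // readSocket

  close(sock);                                             // throwaway socket is discarded
  return (echoed == testId) ? from.sin_addr.s_addr : 0;
}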
getSourcePort(env(), fRTPSocket->socketNum(), clientPort) then obtains the (effectively random) RTP port number that was assigned, and the socket created by the first fRTPSocket = new Groupsock call is bound to that RTP port.
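Under the hood this is essentially a getsockname() on the already-bound socket; a hedged sketch of what getSourcePort() boils down to:

#include <sys/socket.h>
#include <netinet/in.h>
#include <arpa/inet.h>

// After a bind() with sin_port = 0, ask the kernel which port it assigned;
// getSourcePort(env(), fRTPSocket->socketNum(), clientPort) does essentially this.
static unsigned short queryBoundPort(int sock)
{
  struct sockaddr_in local;
  socklen_t len = sizeof local;
  if (getsockname(sock, (struct sockaddr*)&local, &len) < 0) return 0;
  return ntohs(local.sin_port);
}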
fRTCPSocket, on the other hand, is not joined to any multicast group. Next, fRTPSource and fReadSource are created according to fCodecName. Since my stream is MP2T, this branch is taken:

if (strcmp(fCodecName, "MP2T") == 0) { // MPEG-2 Transport Stream
  printf("keke\n");
  fRTPSource = SimpleRTPSource::createNew(env(), fRTPSocket, fRTPPayloadFormat,
                                          fRTPTimestampFrequency, "video/MP2T",
                                          0, False);
  fReadSource = MPEG2TransportStreamFramer::createNew(env(), fRTPSource);
  printf("defr\n");
  // this sets "durationInMicroseconds" correctly, based on the PCR values
}
fRTPSource registers a media name and adds itself to the media list, while fReadSource behaves like a filter whose input is fRTPSource. Then

fRTCPInstance = RTCPInstance::createNew(env(), fRTCPSocket,
                                        totSessionBandwidth,
                                        (unsigned char const*)fParent.CNAME(),
                                        NULL /* we're a client */,
                                        fRTPSource);

creates the RTCP instance that performs RTP control.
Now we enter setupStreams(); step by step this reaches setupMediaSubsession(), whose main job is to send
Sending request: SETUP rtsp://10.0.4.152/question.ts/track1 RTSP/1.0
CSeq: 3
Transport: RTP/AVP;unicast;client_port=32770-32771
User-Agent: openRTSP (LIVE555 Streaming Media v2006.11.15)
carrying the local (client) port numbers, and to learn the server's port numbers from the response:
Received SETUP response: RTSP/1.0 200 OK
CSeq: 3
Date: Mon, Mar 01 2010 07:56:37 GMT
Transport: RTP/AVP;unicast;destination=10.0.4.155;source=10.0.4.152;client_port=32770-32771;server_port=6970-6971
Session: 16
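The part the client cares about in that Transport line is the two port pairs. A small, purely illustrative helper (not a live555 function) for pulling them out could look like this:

#include <cstdio>
#include <cstring>

// Illustrative only: extract client_port / server_port from a Transport
// header like the one in the SETUP response above.
static void parseTransportPorts(const char* transport,
                                unsigned short& clientRtp, unsigned short& clientRtcp,
                                unsigned short& serverRtp, unsigned short& serverRtcp)
{
  const char* p;
  if ((p = strstr(transport, "client_port=")) != NULL)
    sscanf(p, "client_port=%hu-%hu", &clientRtp, &clientRtcp);
  if ((p = strstr(transport, "server_port=")) != NULL)
    sscanf(p, "server_port=%hu-%hu", &serverRtp, &serverRtcp);
}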
fileSink = FileSink::createNew(*env, outFileName,
                               fileSinkBufferSize, oneFilePerFrame);
if (strcmp(subsession->mediumName(), "video") == 0) {
  printf("fileSink->set_media_type(1);\n");
  fileSink->set_media_type(1);
}
This creates the file sink and sets its media type.
subsession->sink->startPlaying(*(subsession->readSource()),
                               subsessionAfterPlaying,
                               subsession);
starts pulling data.
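subsessionAfterPlaying is the completion callback handed to startPlaying(); in openRTSP it typically closes this subsession's sink and, once every subsession has finished, tears the whole session down. A minimal sketch of that pattern (the final teardown step is only summarized):

void subsessionAfterPlaying(void* clientData)
{
  MediaSubsession* subsession = (MediaSubsession*)clientData;

  // We're done with this subsession's sink:
  Medium::close(subsession->sink);
  subsession->sink = NULL;

  // If any other subsession of the same session is still playing, keep going:
  MediaSession& session = subsession->parentSession();
  MediaSubsessionIterator iter(session);
  while ((subsession = iter.next()) != NULL) {
    if (subsession->sink != NULL) return;
  }

  // ... otherwise: send TEARDOWN, close the session, leave the event loop ...
}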
startPlayingStreams(-1); tells the server that it can start sending data.
Now we enter the stage of continuously looping to fetch data:
env->taskScheduler().doEventLoop() -> BasicTaskScheduler::SingleStep().
It first copies the descriptor set: fd_set readSet = fReadSet;. fReadSet was built earlier: turnOnBackgroundReadHandling(int socketNum, BackgroundHandlerProc* handlerProc, void* clientData) added each socket to it and organized the handler queue.
From then on the loop keeps checking which socket is readable (a simplified sketch of this step follows below) and calls MultiFramedRTPSource::networkReadHandler to read the data.
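Conceptually, each SingleStep() is just a select() over the registered sockets followed by a dispatch to whatever handler turnOnBackgroundReadHandling() stored for the readable descriptor. A stripped-down sketch (the handler table below is a simplification, not live555's internal data structure):

#include <sys/select.h>
#include <cstddef>
#include <map>

typedef void BackgroundHandlerProc(void* clientData, int mask);

struct HandlerEntry { BackgroundHandlerProc* proc; void* clientData; };
static std::map<int, HandlerEntry> gHandlers; // filled by turnOnBackgroundReadHandling()
static fd_set gReadSet;                       // plays the role of fReadSet
static int gMaxFd = -1;

// Roughly what one pass of BasicTaskScheduler::SingleStep() does for reads:
static void singleStepSketch()
{
  fd_set readSet = gReadSet;                  // copy, because select() modifies it
  struct timeval tv = { 1, 0 };               // live555 derives this from its delay queue
  if (select(gMaxFd + 1, &readSet, NULL, NULL, &tv) <= 0) return;

  for (std::map<int, HandlerEntry>::iterator it = gHandlers.begin();
       it != gHandlers.end(); ++it) {
    if (FD_ISSET(it->first, &readSet) && it->second.proc != NULL) {
      // e.g. MultiFramedRTPSource::networkReadHandler for the RTP socket
      (*it->second.proc)(it->second.clientData, 0x1 /* readable */);
    }
  }
}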
bPacket->fillInData(source->fRTPInterface) fetches the data, eventually reaching

Boolean RTPInterface::handleRead(unsigned char* buffer,
                                 unsigned bufferMaxSize,
                                 unsigned& bytesRead,
                                 struct sockaddr_in& fromAddress)

Because the transport here is UDP, fNextTCPReadStreamSocketNum < 0, so it calls readSuccess = fGS->handleRead(buffer, bufferMaxSize, bytesRead, fromAddress);, i.e.

Boolean Groupsock::handleRead(unsigned char* buffer, unsigned bufferMaxSize,
                              unsigned& bytesRead,
                              struct sockaddr_in& fromAddress)

which in turn calls

int readSocket(UsageEnvironment& env,
               int socket, unsigned char* buffer, unsigned bufferSize,
               struct sockaddr_in& fromAddress,
               struct timeval* timeout)

where bytesRead = recvfrom(socket, (char*)buffer, bufferSize, 0,
                           (struct sockaddr*)&fromAddress,
                           &addressSize); actually reads the data.
statsIncoming.countPacket(numBytes); and statsGroupIncoming.countPacket(numBytes); update the incoming byte counters, and fillInData() is done. The next job is to parse the packet header until "The rest of the packet is the usable data. Record and save it" (see the header-parsing sketch below); storePacket(bPacket) then inserts bPacket into the queue. Next comes MultiFramedRTPSource::doGetNextFrame1() — don't be misled by the name, it simply takes back the packet we just stored:

BufferedPacket* nextPacket
    = fReorderingBuffer->getNextCompletedPacket(packetLossPrecededThis);

and checks whether it is the packet we expect.
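The "parse the packet header" step referred to above is essentially reading the 12-byte RTP fixed header (RFC 3550) at the front of what recvfrom() returned; a hedged sketch (live555's real parsing additionally handles CSRC lists, header extensions and padding):

#include <stdint.h>
#include <string.h>
#include <arpa/inet.h>   // ntohl

// Fields stripped off before "the rest of the packet is the usable data".
struct RtpHeaderInfo {
  unsigned version, payloadType, seqNum;
  bool     marker;
  uint32_t timestamp, ssrc;
};

static bool parseRtpHeader(const unsigned char* pkt, unsigned len, RtpHeaderInfo& h)
{
  if (len < 12) return false;
  uint32_t w0, w1, w2;
  memcpy(&w0, pkt,     4); w0 = ntohl(w0);
  memcpy(&w1, pkt + 4, 4); w1 = ntohl(w1);
  memcpy(&w2, pkt + 8, 4); w2 = ntohl(w2);

  h.version     = (w0 >> 30) & 0x3;          // must be 2
  h.marker      = ((w0 >> 23) & 0x1) != 0;
  h.payloadType = (w0 >> 16) & 0x7F;         // 33 == MP2T, as in the SDP above
  h.seqNum      =  w0        & 0xFFFF;
  h.timestamp   =  w1;
  h.ssrc        =  w2;
  return h.version == 2;
}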
FramedSource::afterGetting is then called, which leads to FileSink::afterGettingFrame, then FileSink::afterGettingFrame1 and

addData(unsigned char* data, unsigned dataSize,
        struct timeval presentationTime)

At this point notify_rtp_frame(get_media_type(), data, dataSize, presentationTime); tells the DMA that data has arrived, and continuePlaying(); starts the cycle over again from the top.
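continuePlaying() is what keeps the whole chain above running: the sink asks its source for the next frame and registers afterGettingFrame() as the completion callback, so each delivered frame immediately triggers the request for the next one. A condensed sketch of that MediaSink pattern (the MyTsSink class below is hypothetical and simplified from what FileSink does; addData()/notify_rtp_frame() would live inside afterGettingFrame()):

#include "liveMedia.hh"

// Hypothetical sink, sketched to show the continuePlaying()/afterGettingFrame()
// cycle that FileSink follows; names starting with "My" are not from the original.
class MyTsSink: public MediaSink {
public:
  MyTsSink(UsageEnvironment& env): MediaSink(env) {}

protected:
  virtual Boolean continuePlaying() {
    if (fSource == NULL) return False;
    // Ask the upstream source (here the MPEG2TransportStreamFramer) for one
    // frame; afterGettingFrame() fires once it has been copied into fBuffer.
    fSource->getNextFrame(fBuffer, sizeof fBuffer,
                          afterGettingFrame, this,
                          onSourceClosure, this);
    return True;
  }

private:
  static void afterGettingFrame(void* clientData, unsigned frameSize,
                                unsigned /*numTruncatedBytes*/,
                                struct timeval presentationTime,
                                unsigned /*durationInMicroseconds*/) {
    MyTsSink* sink = (MyTsSink*)clientData;
    // In the real FileSink this is where afterGettingFrame1() -> addData()
    // runs, i.e. where notify_rtp_frame(...) hands the frame to the DMA.
    (void)frameSize; (void)presentationTime;
    sink->continuePlaying();            // loop: request the next frame
  }

  unsigned char fBuffer[80000];         // cf. the "-b 80000" FileSink buffer size
};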
