Getting the timestamp from RTP packets

Date: 2022-10-12 14:53:46

NTP ------ Network Time Protocol

PTP ------ Precision Time Protocol

On the relationship between PTS and DTS:

http://www.cnblogs.com/qingquan/archive/2011/07/27/2118967.html

As is well known, in the RTSP protocol the actual media data is transported by RTP, and every RTP packet carries a timestamp (a relative timestamp). This timestamp needs to be converted; what I need is to convert it into the corresponding clock time and print it onto every frame the player displays.
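For background, the usual mapping from a relative RTP timestamp to a wall-clock presentation time uses the (wall-clock time, RTP timestamp) pair carried in the most recent RTCP Sender Report together with the media clock rate. The sketch below is only illustrative; the parameters srWallClock, srRtpTimestamp and clockRate are assumed inputs, not live555 APIs (live555 performs this mapping internally before handing you a presentation time):

#include <sys/time.h>
#include <cstdint>

// Hypothetical helper (not part of live555): map a relative RTP timestamp to wall-clock time,
// given the (wall-clock time, RTP timestamp) pair reported in the last RTCP Sender Report.
struct timeval rtpToWallClock(uint32_t rtpTimestamp,
                              struct timeval srWallClock,  // wall-clock time of the last SR
                              uint32_t srRtpTimestamp,     // RTP timestamp of the last SR
                              unsigned clockRate) {        // e.g. 90000 Hz for video
  // Signed 32-bit difference tolerates RTP timestamp wrap-around:
  int32_t diff = (int32_t)(rtpTimestamp - srRtpTimestamp);
  double t = srWallClock.tv_sec + srWallClock.tv_usec / 1000000.0 + (double)diff / clockRate;

  struct timeval result;
  result.tv_sec = (time_t)t;
  result.tv_usec = (long)((t - result.tv_sec) * 1000000.0);
  return result;
}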

However, according to http://*.com/questions/20094998/retrieving-timestamp-in-rtp-rtsp, the frame time handled in the afterGettingFrame callback is already the PTS (the presentationTime timestamp).

Here is a code snippet:

void DummySink::afterGettingFrame(void* clientData, unsigned frameSize, unsigned numTruncatedBytes,
                                  struct timeval presentationTime, unsigned durationInMicroseconds) {
  DummySink* sink = (DummySink*)clientData;
  sink->afterGettingFrame(frameSize, numTruncatedBytes, presentationTime, durationInMicroseconds);
}

// If you don't want to see debugging output for each received frame, then comment out the following line:
#define DEBUG_PRINT_EACH_RECEIVED_FRAME 1

void DummySink::afterGettingFrame(unsigned frameSize, unsigned numTruncatedBytes,
                                  struct timeval presentationTime, unsigned /*durationInMicroseconds*/) {
  // We've just received a frame of data. (Optionally) print out information about it:
#ifdef DEBUG_PRINT_EACH_RECEIVED_FRAME
  if (fStreamId != NULL) envir() << "Stream \"" << fStreamId << "\"; ";
  envir() << fSubsession.mediumName() << "/" << fSubsession.codecName() << ":\tReceived " << frameSize << " bytes";
  if (numTruncatedBytes > 0) envir() << " (with " << numTruncatedBytes << " bytes truncated)";
  char uSecsStr[6+1]; // used to output the 'microseconds' part of the presentation time
  sprintf_s(uSecsStr, "%06u", (unsigned)presentationTime.tv_usec);
  envir() << ".\tPresentation time: " << (int)presentationTime.tv_sec << "." << uSecsStr;
  if (fSubsession.rtpSource() != NULL && !fSubsession.rtpSource()->hasBeenSynchronizedUsingRTCP()) {
    envir() << "!"; // mark the debugging output to indicate that this presentation time is not RTCP-synchronized
  }
//#ifdef DEBUG_PRINT_NPT
  envir() << "\tNPT: " << fSubsession.getNormalPlayTime(presentationTime);
//#endif
  envir() << "\n";
#endif

  // Then continue, to request the next frame of data:
  continuePlaying();
}

Boolean DummySink::continuePlaying() {
  if (fSource == NULL) return False; // sanity check (should not happen)

  // Request the next frame of data from our input source. "afterGettingFrame()" will get called later, when it arrives:
  fSource->getNextFrame(fReceiveBuffer, DUMMY_SINK_RECEIVE_BUFFER_SIZE,
                        afterGettingFrame, this,
                        onSourceClosure, this);
  return True;
}
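If what you want is the raw (relative) RTP timestamp itself rather than the already-converted presentationTime, it can be read from the subsession's RTPSource inside the same callback. A minimal sketch, assuming the same DummySink/fSubsession setup as in the testRTSPClient demo above (verify curPacketRTPTimestamp() and rtpTimestampFrequency() against your live555 headers):

// Inside DummySink::afterGettingFrame(), after a frame has arrived:
if (fSubsession.rtpSource() != NULL) {
  // Relative RTP timestamp of the packet that completed this frame:
  u_int32_t rtpTimestamp = fSubsession.rtpSource()->curPacketRTPTimestamp();
  // Media clock rate for this timestamp, e.g. 90000 Hz for video:
  unsigned timestampFrequency = fSubsession.rtpTimestampFrequency();
  envir() << "RTP timestamp: " << rtpTimestamp
          << " (clock rate: " << timestampFrequency << " Hz)\n";
}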

Here is the timestamp snippet from the openRTSP demo:

  if (notifyTheUser) {
    struct timeval timeNow;
    gettimeofday(&timeNow, NULL);
    char timestampStr[100];
    sprintf_s(timestampStr, "%ld%03ld", timeNow.tv_sec, (long)(timeNow.tv_usec/1000));
    *env << (syncStreams ? "Synchronized d" : "D")
         << "ata packets have begun arriving [" << timestampStr << "]\007\n";
    return;
  }

Of course, LIVE555 itself converts the PTS into an RTP timestamp on the server side before transmission, and on the client side it automatically converts the received RTP timestamp back into a PTS. To relate that PTS to Coordinated Universal Time (UTC), you need to use gettimeofday() for the conversion: the presentation time is itself a struct timeval, and once RTCP synchronization has taken place it is aligned with the sender's wall clock.
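As one concrete way to put this to use, the presentationTime that live555 delivers is a struct timeval, so it can be formatted as a UTC date/time string with standard C calls. A minimal sketch, with formatPresentationTimeUTC as a hypothetical helper name (not a live555 function); the result is only meaningful as wall-clock time once hasBeenSynchronizedUsingRTCP() returns True:

#include <cstdio>
#include <cstddef>
#include <ctime>
#include <sys/time.h>

// Hypothetical helper: format a live555 presentationTime as "YYYY-MM-DD HH:MM:SS.mmm" (UTC).
void formatPresentationTimeUTC(const struct timeval& presentationTime,
                               char* out, size_t outSize) {
  time_t secs = presentationTime.tv_sec;
  struct tm utc;
  gmtime_r(&secs, &utc);  // POSIX; on Windows use gmtime_s(&utc, &secs) instead
  snprintf(out, outSize, "%04d-%02d-%02d %02d:%02d:%02d.%03ld",
           utc.tm_year + 1900, utc.tm_mon + 1, utc.tm_mday,
           utc.tm_hour, utc.tm_min, utc.tm_sec,
           (long)(presentationTime.tv_usec / 1000));
}

Calling this inside afterGettingFrame() with the received presentationTime yields a string that can be stamped onto each frame the player displays, which is what the opening paragraph asks for.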

References:

http://comments.gmane.org/gmane.comp.multimedia.live555.devel/12552

http://bbs.csdn.net/topics/390676557?page=1

http://live-devel.live.narkive.com/LfuvIyZj/rtp-timestamp-to-utc-time