Imagine I have H.264 Annex B frames coming in from a real-time conversation. What is the best way to encapsulate them in an MPEG-2 transport stream while maintaining the timing information for subsequent playback?
I am using the libavcodec and libavformat libraries. After obtaining a pointer (pcc) to an AVCodecContext object, I set the following:

    pcc->codec_id = CODEC_ID_H264;
    pcc->bit_rate = br;
    pcc->width    = 640;
    pcc->height   = 480;
    pcc->time_base.num = 1;
    pcc->time_base.den = fps;
When I receive NAL units, I create an AVPacket and call av_interleaved_write_frame():
    AVPacket pkt;
    av_init_packet( &pkt );
    pkt.flags       |= AV_PKT_FLAG_KEY;
    pkt.stream_index = pst->index;
    pkt.data         = (uint8_t*)p_NALunit;
    pkt.size         = len;
    pkt.dts          = AV_NOPTS_VALUE;
    pkt.pts          = AV_NOPTS_VALUE;
    av_interleaved_write_frame( fc, &pkt );
I basically have two questions:
1) For variable frame rate, is there a way to avoid specifying pcc->time_base.num = 1; pcc->time_base.den = fps; and instead indicate that the frame rate is variable?
2) While submitting packets, what "timestamps" should I assign to pkt.dts and pkt.pts?
Right now, when I play the output using ffplay, it plays at the constant frame rate (fps) that I use in the code above.
I would also love to know how to accommodate varying spatial resolution. In the stream that I receive, each keyframe is preceded by SPS and PPS, so I know whenever the spatial resolution changes. Is there a way to not have to specify pcc->width = 640; pcc->height = 480; upfront? In other words, can I indicate that the spatial resolution may change mid-stream?
Thanks a lot, Eddie
DTS and PTS are measured in a 90 kHz clock. See ISO/IEC 13818-1, section 2.4.3.7 (semantics of the PES packet fields), way down below the syntax table.
As for variable frame rate, your framework may or may not have a way to signal it (vui_parameters.fixed_frame_rate_flag = 0). Whether the playback software handles it is an ENTIRELY different question: most players assume a fixed frame rate regardless of PTS or DTS. mplayer can't even compute the frame rate correctly for a fixed-rate transport stream generated by ffmpeg.
I think if you're going to change the resolution you need to end the coded sequence (an end-of-sequence NAL, nal_unit_type 10, or end-of-stream, nal_unit_type 11) and start a new sequence with fresh SPS/PPS. It can be in the same transport stream (assuming your client's not too simple).