Single frame movie fails due to lastEncodedDuration not having a value #22
Thanks for the report- good catch!

If we automatically set the duration of the frame to (1/timescale), the frame duration may not be what you expect, because it's relatively common to use larger timescales to simplify expressing frame times as the quotient of two integers. For example, a timescale of 600 can describe frame times for 15, 20, 24, 25, and 30 fps timelines. This may not be an issue if you're creating the timestamps yourself, but if you're working with frames from another source you'll encounter it sooner or later.

Hmm...if the framework can't calculate the frame duration because there's only one frame, then we either have to make assumptions about the duration of that frame (not wild about this), or we have to rely on the dev to provide the correct frame duration somehow. I can add a simple setter that lets devs provide a fallback framerate if you're encoding one-frame movies...how does that sound?
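The point about shared timescales can be sketched with a little arithmetic (a hedged illustration, not the framework's code; the names are made up):

```python
# Demo: a single timescale of 600 can express exact per-frame durations
# for several common frame rates as integer counts of timescale units.
TIMESCALE = 600

for fps in (15, 20, 24, 25, 30):
    units = TIMESCALE // fps
    # The representation is exact because 600 is divisible by each rate.
    assert units * fps == TIMESCALE
    print(f"{fps} fps -> {units}/{TIMESCALE} per frame")

# A naive fallback of 1/TIMESCALE would imply a 600 fps frame,
# which is why guessing the duration from the timescale alone is unsafe.
```

This is why the maintainer prefers a dev-supplied fallback over assuming 1/timescale.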
Good to know about timescale. A user-settable fallback frame duration sounds great. Thanks!
Commit d345920 introduces these changes.
Cheers
This is the first time I've needed to do 29.97df, and since that is 1001/30000, the default 1/timescale trick doesn't work. What I discovered is that on line 181 of AVAssetWriterHapInput, the code checks for the existence of a dictionary for AVVideoCompressionPropertiesKey before looking for AVFallbackFPSKey, which shouldn't necessarily have anything to do with the compression settings. So it kept ignoring my fallback framerate, since I hadn't specified a dict with quality, chunk settings, etc. In my fork I just removed that check and it was fine.

What I'm now finding, though, is that the completed media reports a framerate of 30 fps in media players and in the metadata, even though I've verified that throughout the encode process it was always reporting a duration of 1001/30000.
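The NTSC case makes the failure of the 1/timescale trick concrete (a sketch with exact fractions, not the framework's code):

```python
from fractions import Fraction

# 29.97 drop-frame: each frame lasts 1001 units of a 30000 timescale.
timescale = 30000
frame_duration = Fraction(1001, timescale)   # correct NTSC frame duration
naive = Fraction(1, timescale)               # the 1/timescale fallback

print(float(1 / frame_duration))  # about 29.97 fps
print(float(1 / naive))           # 30000.0 fps, wildly wrong
```

Because the frame duration's numerator is 1001, not 1, no guess derived from the timescale alone can recover it; the encoder has to be told.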
If you try to encode a movie with just one frame, it fails because lastEncodedDuration doesn't have a meaningful value. I propose that if there's only one buffer when the writer is marked as finished, 1 unit of the buffer's timescale is used as the duration. I've implemented this in my own private fork, but it hasn't been rigorously tested outside my use case.
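The fallback logic discussed in this thread could be sketched like this (a hypothetical outline, not the framework's actual implementation; the function and parameter names are invented):

```python
# Choosing a duration for the final (or only) frame:
# 1. use the last encoded duration when one exists (multi-frame case),
# 2. else use a dev-supplied fallback framerate, if set,
# 3. else fall back to 1 unit of the timescale, as proposed in this issue.
def final_frame_duration(last_encoded_duration, timescale, fallback_fps=None):
    if last_encoded_duration is not None:
        return last_encoded_duration
    if fallback_fps:
        return round(timescale / fallback_fps)
    return 1

print(final_frame_duration(None, 600, fallback_fps=24))    # 25 units
print(final_frame_duration(None, 30000, fallback_fps=29.97))  # 1001 units
print(final_frame_duration(None, 600))                     # 1 unit (last resort)
```

Note that a fallback framerate handles the 29.97 case correctly (30000 / 29.97 rounds to 1001 units), whereas the 1-unit last resort does not.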