@item 0, passthrough
Each frame is passed with its timestamp from the demuxer to the muxer.
@item 1, cfr
Frames will be duplicated and dropped to achieve exactly the requested
constant frame rate.
@item 2, vfr
Frames are passed through with their timestamp or dropped so as to
prevent 2 frames from having the same timestamp.
    
    @item drop
    As passthrough but destroys all timestamps, making the muxer generate
    fresh timestamps based on frame-rate.
    
@item -1, auto
Chooses between 1 and 2 depending on muxer capabilities. This is the
default method.
    @end table
    
    
Note that the timestamps may be further modified by the muxer after this,
for example when the format option @option{avoid_negative_ts} is enabled.
    
    
    With -map you can select from which stream the timestamps should be
    taken. You can leave either video or audio unchanged and sync the
    remaining stream(s) to the unchanged one.
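
For example (input and output names hypothetical), a command along the following
lines duplicates or drops frames as needed to produce constant 25 fps output:
@example
ffmpeg -i input.mkv -vsync cfr -r 25 output.mp4
@end example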
    
    
    @item -frame_drop_threshold @var{parameter}
    Frame drop threshold, which specifies how much behind video frames can
    be before they are dropped. In frame rate units, so 1.0 is one frame.
The default is -1.1. One possible use case is to avoid frame drops in case
of noisy timestamps or to increase frame drop precision in case of exact
timestamps.
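
As an illustrative sketch (file names hypothetical), a larger threshold makes
frame dropping more tolerant of noisy timestamps when forcing a constant frame
rate:
@example
ffmpeg -i input.mkv -vsync cfr -r 30 -frame_drop_threshold 2.0 output.mp4
@end example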
    
    
    @item -async @var{samples_per_second}
    
Audio sync method. "Stretches/squeezes" the audio stream to match the timestamps;
the parameter is the maximum samples per second by which the audio is changed.
-async 1 is a special case where only the start of the audio stream is corrected
without any later correction.
    
    
Note that the timestamps may be further modified by the muxer after this,
for example when the format option @option{avoid_negative_ts} is enabled.
    
    
    This option has been deprecated. Use the @code{aresample} audio filter instead.
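
For instance (input name hypothetical), an effect roughly corresponding to
@code{-async 1000} can be obtained with the @code{aresample} filter:
@example
ffmpeg -i input.mkv -af aresample=async=1000 output.mkv
@end example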
    
    @item -copyts
    
    Do not process input timestamps, but keep their values without trying
    to sanitize them. In particular, do not remove the initial start time
    offset value.
    
Note that, depending on the @option{vsync} option or on specific muxer
processing (e.g. when the format option @option{avoid_negative_ts}
is enabled), the output timestamps may not match the input
timestamps even when this option is selected.
    
    
    @item -start_at_zero
    When used with @option{copyts}, shift input timestamps so they start at zero.
    
    This means that using e.g. @code{-ss 50} will make output timestamps start at
    50 seconds, regardless of what timestamp the input file started at.
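
For example (input name hypothetical), the following keeps timestamps while
removing the input's own start offset, so the output timestamps begin at 50
seconds:
@example
ffmpeg -ss 50 -i input.ts -copyts -start_at_zero -c copy output.ts
@end example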
    
    
    @item -copytb @var{mode}
    Specify how to set the encoder timebase when stream copying.  @var{mode} is an
    integer numeric value, and can assume one of the following values:
    
    @table @option
    @item 1
    Use the demuxer timebase.
    
    The time base is copied to the output encoder from the corresponding input
demuxer. This is sometimes required to avoid non-monotonically increasing
timestamps when copying video streams with variable frame rate.
    
    @item 0
    Use the decoder timebase.
    
    The time base is copied to the output encoder from the corresponding input
    decoder.
    
    @item -1
    Try to make the choice automatically, in order to generate a sane output.
    @end table
    
    Default value is -1.
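
For instance (file names hypothetical), to force the demuxer time base while
stream copying a variable-frame-rate video:
@example
ffmpeg -i input.mkv -c copy -copytb 1 output.mkv
@end example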
    
    
    @item -shortest (@emph{output})
    
    Finish encoding when the shortest input stream ends.
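
For example (file names hypothetical), to mux a video with a separate audio
track and stop when the shorter of the two ends:
@example
ffmpeg -i video.mp4 -i audio.wav -map 0:v -map 1:a -c:v copy -shortest output.mp4
@end example
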
    @item -dts_delta_threshold
    Timestamp discontinuity delta threshold.
    
    @item -muxdelay @var{seconds} (@emph{input})
    
    Set the maximum demux-decode delay.
    
    @item -muxpreload @var{seconds} (@emph{input})
    
    Set the initial demux-decode delay.
    
    @item -streamid @var{output-stream-index}:@var{new-value} (@emph{output})
    
    Assign a new stream-id value to an output stream. This option should be
    specified prior to the output filename to which it applies.
In the case of multiple output files, a streamid
may be reassigned to a different value.
    
    
    For example, to set the stream 0 PID to 33 and the stream 1 PID to 36 for
    an output mpegts file:
    @example
    ffmpeg -i infile -streamid 0:33 -streamid 1:36 out.ts
    @end example
    
    
    @item -bsf[:@var{stream_specifier}] @var{bitstream_filters} (@emph{output,per-stream})
    
    Set bitstream filters for matching streams. @var{bitstream_filters} is
    
    a comma-separated list of bitstream filters. Use the @code{-bsfs} option
    to get the list of bitstream filters.
    @example
    
    ffmpeg -i h264.mp4 -c:v copy -bsf:v h264_mp4toannexb -an out.h264
    
    @end example
    @example
    
ffmpeg -i file.mov -an -vn -bsf:s mov2textsub -c:s copy -f rawvideo sub.txt
@end example

    @item -tag[:@var{stream_specifier}] @var{codec_tag} (@emph{input/output,per-stream})
    
    Force a tag/fourcc for matching streams.
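
For example (file names hypothetical), to flag an MPEG-4 video stream with the
commonly used @code{xvid} fourcc while stream copying:
@example
ffmpeg -i input.avi -c:v copy -tag:v xvid -c:a copy output.avi
@end example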
    
    
    @item -timecode @var{hh}:@var{mm}:@var{ss}SEP@var{ff}
Specify timecode for writing. @var{SEP} is ':' for non-drop timecode and ';'
(or '.') for drop timecode.
    @example
    ffmpeg -i input.mpg -timecode 01:02:03.04 -r 30000/1001 -s ntsc output.mpg
    @end example
    
    @item -filter_complex @var{filtergraph} (@emph{global})
    
    Define a complex filtergraph, i.e. one with arbitrary number of inputs and/or
    
    outputs. For simple graphs -- those with one input and one output of the same
    type -- see the @option{-filter} options. @var{filtergraph} is a description of
    
    the filtergraph, as described in the ``Filtergraph syntax'' section of the
    
    ffmpeg-filters manual.
    
    
    Input link labels must refer to input streams using the
    @code{[file_index:stream_specifier]} syntax (i.e. the same as @option{-map}
    uses). If @var{stream_specifier} matches multiple streams, the first one will be
    used. An unlabeled input will be connected to the first unused input stream of
    the matching type.
    
    Output link labels are referred to with @option{-map}. Unlabeled outputs are
    added to the first output file.
    
    
    Note that with this option it is possible to use only lavfi sources without
    normal input files.
    
    
    For example, to overlay an image over video
    @example
    
    ffmpeg -i video.mkv -i image.png -filter_complex '[0:v][1:v]overlay[out]' -map
    
    '[out]' out.mkv
    @end example
    Here @code{[0:v]} refers to the first video stream in the first input file,
    which is linked to the first (main) input of the overlay filter. Similarly the
    first video stream in the second input is linked to the second (overlay) input
    of overlay.
    
    Assuming there is only one video stream in each input file, we can omit input
    labels, so the above is equivalent to
    @example
    
    ffmpeg -i video.mkv -i image.png -filter_complex 'overlay[out]' -map
    
    '[out]' out.mkv
    @end example
    
    Furthermore we can omit the output label and the single output from the filter
    graph will be added to the output file automatically, so we can simply write
    @example
    
ffmpeg -i video.mkv -i image.png -filter_complex 'overlay' out.mkv
@end example

    
    To generate 5 seconds of pure red video using lavfi @code{color} source:
    @example
    
    ffmpeg -filter_complex 'color=c=red' -t 5 out.mkv
    
    @end example
    
    
    @item -lavfi @var{filtergraph} (@emph{global})
    
    Define a complex filtergraph, i.e. one with arbitrary number of inputs and/or
    
    outputs. Equivalent to @option{-filter_complex}.
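
For instance, the earlier @code{color} source example can be written
equivalently as:
@example
ffmpeg -lavfi 'color=c=red' -t 5 out.mkv
@end example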
    
    
    @item -filter_complex_script @var{filename} (@emph{global})
This option is similar to @option{-filter_complex}; the only difference is that
its argument is the name of a file from which a complex filtergraph
description is read.
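
As a sketch, assuming a hypothetical file @file{graph.txt} containing the single
line
@example
[0:v] hflip [flipped]; [flipped] vflip [out]
@end example
the graph can be applied with:
@example
ffmpeg -i input.mkv -filter_complex_script graph.txt -map '[out]' output.mkv
@end example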
    
    
    @item -accurate_seek (@emph{input})
    This option enables or disables accurate seeking in input files with the
    @option{-ss} option. It is enabled by default, so seeking is accurate when
    transcoding. Use @option{-noaccurate_seek} to disable it, which may be useful
    e.g. when copying some streams and transcoding the others.
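
For example (file name hypothetical), to seek quickly to a point near the three
minute mark while stream copying:
@example
ffmpeg -noaccurate_seek -ss 00:03:00 -i input.mkv -t 60 -c copy output.mkv
@end example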
    
    
    @item -seek_timestamp (@emph{input})
    This option enables or disables seeking by timestamp in input files with the
    @option{-ss} option. It is disabled by default. If enabled, the argument
    to the @option{-ss} option is considered an actual timestamp, and is not
    offset by the start time of the file. This matters only for files which do
    not start from timestamp 0, such as transport streams.
    
    
    @item -thread_queue_size @var{size} (@emph{input})
    
    This option sets the maximum number of queued packets when reading from the
    file or device. With low latency / high rate live streams, packets may be
    discarded if they are not read in a timely manner; raising this value can
    avoid it.
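
For example, when grabbing from ALSA and video4linux2 devices at the same time
(device names hypothetical), larger input queues reduce the chance of dropped
packets:
@example
ffmpeg -f alsa -thread_queue_size 1024 -i hw:0 \
       -f video4linux2 -thread_queue_size 1024 -i /dev/video0 out.mkv
@end example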
    
    
    @item -override_ffserver (@emph{global})
    
    Overrides the input specifications from @command{ffserver}. Using this
    option you can map any input stream to @command{ffserver} and control
    many aspects of the encoding from @command{ffmpeg}. Without this
    option @command{ffmpeg} will transmit to @command{ffserver} what is
    requested by @command{ffserver}.
    
    
    The option is intended for cases where features are needed that cannot be
    
    specified to @command{ffserver} but can be to @command{ffmpeg}.
    
    @item -sdp_file @var{file} (@emph{global})
Print SDP information to @var{file}.
This allows dumping SDP information when at least one output isn't an
RTP stream.
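
For example (addresses and file names hypothetical), to stream video over RTP
while also writing a local file, dumping the SDP description to
@file{stream.sdp}:
@example
ffmpeg -sdp_file stream.sdp -re -i input.mkv -an -c:v libx264 \
       -f rtp rtp://127.0.0.1:5004 -c copy local_copy.mkv
@end example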
    
    
    @item -discard (@emph{input})
    Allows discarding specific streams or frames of streams at the demuxer.
    Not all demuxers support this.
    
    @table @option
    @item none
    Discard no frame.
    
    @item default
    Default, which discards no frames.
    
    @item noref
    Discard all non-reference frames.
    
    @item bidir
    Discard all bidirectional frames.
    
    @item nokey
Discard all frames except keyframes.
    
    @item all
    Discard all frames.
    @end table
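
For example (file names hypothetical, and assuming the demuxer honours the
request), to keep only keyframes while stream copying:
@example
ffmpeg -discard nokey -i input.mp4 -c:v copy -an keyframes.mp4
@end example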
    
    
    @item -xerror (@emph{global})
Stop and exit on error.
    
    
    As a special exception, you can use a bitmap subtitle stream as input: it
    will be converted into a video with the same size as the largest video in
    
    the file, or 720x576 if no video is present. Note that this is an
    
    experimental and temporary solution. It will be removed once libavfilter has
    proper support for subtitles.
    
    For example, to hardcode subtitles on top of a DVB-T recording stored in
    MPEG-TS format, delaying the subtitles by 1 second:
    @example
    ffmpeg -i input.ts -filter_complex \
      '[#0x2ef] setpts=PTS+1/TB [sub] ; [#0x2d0] [sub] overlay' \
      -sn -map '#0x2dc' output.mkv
    @end example
    (0x2d0, 0x2dc and 0x2ef are the MPEG-TS PIDs of respectively the video,
    audio and subtitles streams; 0:0, 0:3 and 0:7 would have worked too)
    
    
    @section Preset files
    A preset file contains a sequence of @var{option}=@var{value} pairs,
    one for each line, specifying a sequence of options which would be
    awkward to specify on the command line. Lines starting with the hash
    ('#') character are ignored and are used to provide comments. Check
    
    the @file{presets} directory in the FFmpeg source tree for examples.
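
For illustration, a hypothetical preset file containing generic codec options
could look like this:
@example
# hypothetical preset: medium-bitrate video settings
b=800k
g=250
bf=2
@end example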
    
    There are two types of preset files: ffpreset and avpreset files.
    
    @subsection ffpreset files
    ffpreset files are specified with the @code{vpre}, @code{apre},
    
    @code{spre}, and @code{fpre} options. The @code{fpre} option takes the
    filename of the preset instead of a preset name as input and can be
    used for any kind of codec. For the @code{vpre}, @code{apre}, and
    @code{spre} options, the options specified in a preset file are
    
    applied to the currently selected codec of the same type as the preset
    option.
    
    The argument passed to the @code{vpre}, @code{apre}, and @code{spre}
    preset options identifies the preset file to use according to the
    following rules:
    
    
    First ffmpeg searches for a file named @var{arg}.ffpreset in the
    
    directories @file{$FFMPEG_DATADIR} (if set), and @file{$HOME/.ffmpeg}, and in
    the datadir defined at configuration time (usually @file{PREFIX/share/ffmpeg})
    
    or in a @file{ffpresets} folder along the executable on win32,
    
    in that order. For example, if the argument is @code{libvpx-1080p}, it will
    search for the file @file{libvpx-1080p.ffpreset}.
    
    
    If no such file is found, then ffmpeg will search for a file named
    @var{codec_name}-@var{arg}.ffpreset in the above-mentioned
    directories, where @var{codec_name} is the name of the codec to which
    the preset file options will be applied. For example, if you select
    
    the video codec with @code{-vcodec libvpx} and use @code{-vpre 1080p},
    then it will search for the file @file{libvpx-1080p.ffpreset}.
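
Continuing that example, the preset would be picked up with a command such as
(output name hypothetical):
@example
ffmpeg -i input.mkv -vcodec libvpx -vpre 1080p output.webm
@end example
provided @file{libvpx-1080p.ffpreset} exists in one of the directories listed
above.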
    
    
    @subsection avpreset files
avpreset files are specified with the @code{pre} option. They work similarly to
ffpreset files, but they only allow encoder-specific options. Therefore, an
@var{option}=@var{value} pair specifying an encoder cannot be used.
    
    When the @code{pre} option is specified, ffmpeg will look for files with the
    suffix .avpreset in the directories @file{$AVCONV_DATADIR} (if set), and
    @file{$HOME/.avconv}, and in the datadir defined at configuration time (usually
    @file{PREFIX/share/ffmpeg}), in that order.
    
    First ffmpeg searches for a file named @var{codec_name}-@var{arg}.avpreset in
    the above-mentioned directories, where @var{codec_name} is the name of the codec
    to which the preset file options will be applied. For example, if you select the
    video codec with @code{-vcodec libvpx} and use @code{-pre 1080p}, then it will
    search for the file @file{libvpx-1080p.avpreset}.
    
    If no such file is found, then ffmpeg will search for a file named
    @var{arg}.avpreset in the same directories.
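
For example, with a hypothetical @file{libvpx-1080p.avpreset} installed in one
of those directories:
@example
ffmpeg -i input.mkv -vcodec libvpx -pre 1080p output.webm
@end example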
    
    
    @c man end OPTIONS
    
    @chapter Examples
    @c man begin EXAMPLES
    
    @section Video and Audio grabbing
    
    
    If you specify the input format and device then ffmpeg can grab video
    and audio directly.
    
    
    @example
    ffmpeg -f oss -i /dev/dsp -f video4linux2 -i /dev/video0 /tmp/out.mpg
    
    @end example
    
    Or with an ALSA audio source (mono input, card id 1) instead of OSS:
    @example
    ffmpeg -f alsa -ac 1 -i hw:1 -f video4linux2 -i /dev/video0 /tmp/out.mpg
    
    @end example
    
    Note that you must activate the right video source and channel before
    
    launching ffmpeg with any TV viewer such as
    @uref{http://linux.bytesex.org/xawtv/, xawtv} by Gerd Knorr. You also
    
    have to set the audio recording levels correctly with a
    standard mixer.
    
    @section X11 grabbing
    
    
Grab the X11 display with ffmpeg via
@example
ffmpeg -f x11grab -video_size cif -framerate 25 -i :0.0 /tmp/out.mpg
@end example
    
    0.0 is display.screen number of your X11 server, same as
    the DISPLAY environment variable.
    
    @example
    
ffmpeg -f x11grab -video_size cif -framerate 25 -i :0.0+10,20 /tmp/out.mpg
@end example

    0.0 is display.screen number of your X11 server, same as the DISPLAY environment
    variable. 10 is the x-offset and 20 the y-offset for the grabbing.
    
    
    @section Video and Audio file format conversion
    
    
    Any supported file format and protocol can serve as input to ffmpeg:
    
    @itemize
    @item
    You can use YUV files as input:
    
    
    @example
    ffmpeg -i /tmp/test%d.Y /tmp/out.mpg
    @end example
    
    It will use the files:
    @example
    /tmp/test0.Y, /tmp/test0.U, /tmp/test0.V,
    /tmp/test1.Y, /tmp/test1.U, /tmp/test1.V, etc...
    @end example
    
    The Y files use twice the resolution of the U and V files. They are
    raw files, without header. They can be generated by all decent video
    decoders. You must specify the size of the image with the @option{-s} option
    
    if ffmpeg cannot guess it.
    
    @item
    You can input from a raw YUV420P file:
    
    
    @example
    ffmpeg -i /tmp/test.yuv /tmp/out.avi
    @end example
    
    test.yuv is a file containing raw YUV planar data. Each frame is composed
    of the Y plane followed by the U and V planes at half vertical and
    horizontal resolution.
    
    
    @item
    You can output to a raw YUV420P file:
    
    
    @example
    ffmpeg -i mydivx.avi hugefile.yuv
    @end example
    
    
    @item
    You can set several input files and output files:
    
    
    @example
    ffmpeg -i /tmp/a.wav -s 640x480 -i /tmp/a.yuv /tmp/a.mpg
    @end example
    
    Converts the audio file a.wav and the raw YUV video file a.yuv
    to MPEG file a.mpg.
    
    
    @item
    You can also do audio and video conversions at the same time:
    
    
    @example
    ffmpeg -i /tmp/a.wav -ar 22050 /tmp/a.mp2
    @end example
    
    Converts a.wav to MPEG audio at 22050 Hz sample rate.
    
    
    @item
    You can encode to several formats at the same time and define a
    
    mapping from input stream to output streams:
    
    @example
    
    ffmpeg -i /tmp/a.wav -map 0:a -b:a 64k /tmp/a.mp2 -map 0:a -b:a 128k /tmp/b.mp2
    
    @end example
    
    Converts a.wav to a.mp2 at 64 kbits and to b.mp2 at 128 kbits. '-map
    file:index' specifies which input stream is used for each output
    stream, in the order of the definition of output streams.
    
    
@item
You can transcode decrypted VOBs:

@example
ffmpeg -i snatch_1.vob -f avi -c:v mpeg4 -b:v 800k -g 300 -bf 2 -c:a libmp3lame -b:a 128k snatch.avi
    
    @end example
    
    This is a typical DVD ripping example; the input is a VOB file, the
    output an AVI file with MPEG-4 video and MP3 audio. Note that in this
    command we use B-frames so the MPEG-4 stream is DivX5 compatible, and
    GOP size is 300 which means one intra frame every 10 seconds for 29.97fps
    input video. Furthermore, the audio stream is MP3-encoded so you need
    to enable LAME support by passing @code{--enable-libmp3lame} to configure.
    The mapping is particularly useful for DVD transcoding
    to get the desired audio language.
    
    NOTE: To see the supported input formats, use @code{ffmpeg -formats}.
    
    
    @item
    You can extract images from a video, or create a video from many images:
    
    
    For extracting images from a video:
    @example
    ffmpeg -i foo.avi -r 1 -s WxH -f image2 foo-%03d.jpeg
    @end example
    
    This will extract one video frame per second from the video and will
    output them in files named @file{foo-001.jpeg}, @file{foo-002.jpeg},
    etc. Images will be rescaled to fit the new WxH values.
    
    If you want to extract just a limited number of frames, you can use the
    above command in combination with the -vframes or -t option, or in
    combination with -ss to start extracting from a certain point in time.
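
For instance (timestamps and counts chosen arbitrarily), to grab ten frames,
one per second, starting one minute into the input:
@example
ffmpeg -ss 00:01:00 -i foo.avi -r 1 -vframes 10 -f image2 foo-%03d.jpeg
@end example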
    
    For creating a video from many images:
    @example
    
    ffmpeg -f image2 -framerate 12 -i foo-%03d.jpeg -s WxH foo.avi
    
    @end example
    
    The syntax @code{foo-%03d.jpeg} specifies to use a decimal number
    composed of three digits padded with zeroes to express the sequence
    number. It is the same syntax supported by the C printf function, but
    
    only formats accepting a normal integer are suitable.
    
    
    When importing an image sequence, -i also supports expanding
    shell-like wildcard patterns (globbing) internally, by selecting the
    image2-specific @code{-pattern_type glob} option.
    
    For example, for creating a video from filenames matching the glob pattern
    @code{foo-*.jpeg}:
    @example
    
ffmpeg -f image2 -pattern_type glob -framerate 12 -i 'foo-*.jpeg' -s WxH foo.avi
@end example

    @item
    You can put many streams of the same type in the output:
    
@example
ffmpeg -i test1.avi -i test2.avi -map 1:1 -map 1:0 -map 0:1 -map 0:0 -c copy -y test12.nut
@end example
    
    The resulting output file @file{test12.nut} will contain the first four streams
    from the input files in reverse order.
    
    @item
    To force CBR video output:
    @example
    
    ffmpeg -i myfile.avi -b 4000k -minrate 4000k -maxrate 4000k -bufsize 1835k out.m2v
    
    @end example
    
    @item
    The four options lmin, lmax, mblmin and mblmax use 'lambda' units,
    but you may use the QP2LAMBDA constant to easily convert from 'q' units:
    @example
    
ffmpeg -i src.ext -lmax 21*QP2LAMBDA dst.ext
@end example

@end itemize
@c man end EXAMPLES

    @ifset config-all
    
    @ifset config-avutil
    @include utils.texi
    @end ifset
    @ifset config-avcodec
    @include codecs.texi
    @include bitstream_filters.texi
    @end ifset
    @ifset config-avformat
    @include formats.texi
    @include protocols.texi
    @end ifset
    @ifset config-avdevice
    @include devices.texi
    @end ifset
    @ifset config-swresample
    @include resampler.texi
    @end ifset
    @ifset config-swscale
    @include scaler.texi
    @end ifset
    @ifset config-avfilter
    @include filters.texi
@end ifset
@end ifset

@chapter See Also

@ifhtml
    
    @ifset config-all
    @url{ffmpeg.html,ffmpeg}
    @end ifset
    @ifset config-not-all
    @url{ffmpeg-all.html,ffmpeg-all},
    @end ifset
    
    @url{ffplay.html,ffplay}, @url{ffprobe.html,ffprobe}, @url{ffserver.html,ffserver},
    @url{ffmpeg-utils.html,ffmpeg-utils},
    @url{ffmpeg-scaler.html,ffmpeg-scaler},
    @url{ffmpeg-resampler.html,ffmpeg-resampler},
    @url{ffmpeg-codecs.html,ffmpeg-codecs},
    
    @url{ffmpeg-bitstream-filters.html,ffmpeg-bitstream-filters},
    
    @url{ffmpeg-formats.html,ffmpeg-formats},
    @url{ffmpeg-devices.html,ffmpeg-devices},
    @url{ffmpeg-protocols.html,ffmpeg-protocols},
    @url{ffmpeg-filters.html,ffmpeg-filters}
    @end ifhtml
    
    @ifnothtml
    
    @ifset config-all
    ffmpeg(1),
    @end ifset
    @ifset config-not-all
    ffmpeg-all(1),
    @end ifset
    
    ffplay(1), ffprobe(1), ffserver(1),
    ffmpeg-utils(1), ffmpeg-scaler(1), ffmpeg-resampler(1),
    ffmpeg-codecs(1), ffmpeg-bitstream-filters(1), ffmpeg-formats(1),
    ffmpeg-devices(1), ffmpeg-protocols(1), ffmpeg-filters(1)
    @end ifnothtml
    
    
    @settitle ffmpeg video converter