
Normalizing Video Input for the Web

The Video API accepts a wide variety of video formats and CODECs, and will transcode most video files into stream-friendly formats. To do this, imgix normalizes ingested video before processing it through the Video API. This process takes time, so a video cannot be streamed until it has been normalized and processed by our service. You can also standardize videos yourself before serving them with the Video API, which may improve Video API performance.

See below for our requirements for standard video inputs. For non-standard video inputs, imgix will process the video to ensure that each standard input requirement is met.

Acceptable Video File Formats

We accept any video file format as input. To ensure successful processing, an appropriate video content-type should be defined both in the metadata of the video and in the response headers when the video is requested from your Origin.

Standard Inputs

  • 1080p/2K or smaller: Video resolutions up to 2048x2048 are considered standard, including 1080p (1920x1080). Larger videos will be normalized to fit within 2048x2048.
  • H.264 video CODEC: H.264 is the most widely used video CODEC today and is supported by almost every device. While imgix accepts other CODECs as input, we will normalize all other CODECs to H.264.

    We highly recommend using the H.264 CODEC for your origin files. You will see the best performance and quality when using H.264, and processing time will also be faster.

  • Max 10-second keyframe interval: HTTP-based streaming methods (e.g. HLS) require keyframe intervals of less than 10 seconds. If a video's keyframe interval is larger, we will reduce it to less than 10 seconds.
  • Closed GOP (group-of-pictures): Any open-GOP video will be normalized to closed-GOP. Closed-GOP is the default for most video inputs, so if you’re trying to optimize and you’re not sure where to find this setting, you can most likely ignore it.
  • 8Mbps or below: We normalize bitrates to 8Mbps or below; bitrates should also not exceed 16Mbps for any single GOP. This is because higher bitrates are typically difficult for most user connections to sustain.
  • 8-bit 4:2:0 or below: Color depth and chroma subsampling will be normalized to 8-bit 4:2:0. This means that high dynamic range video (HDR) will be normalized to SDR, which can result in color changes in the video output.
  • Simple Edit Decision Lists: Complex Edit Decision Lists (EDLs) added during post-production will be normalized to simple EDLs.
  • Frame rate between 10 and 120 fps: Video frame rates within the range of 10 to 120 fps will be preserved. Video with less than 10 fps or greater than 120 fps will be normalized to 30 fps.
  • Square Pixel Aspect Ratio: Pixel aspect ratio is the ratio of a pixel’s width to its height; 1:1 represents a square pixel aspect ratio. Videos with any other value will be normalized to a 1:1 square pixel aspect ratio.
  • AAC audio CODEC: AAC is the most widely used audio CODEC today and is supported by almost every device. While imgix accepts other CODECs as input, imgix will normalize other audio CODECs to AAC.
  • 12-hour max duration: The maximum duration for any video is 12 hours. Videos longer than this will be truncated to 12 hours.
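Several of these properties can be inspected before upload. The sketch below checks an example resolution against the 2048x2048 cap; the `width`/`height` values are placeholders that you would normally read with ffprobe (the commented command is one way to do so).

```shell
# Check a video's resolution against the 2048x2048 standard-input cap.
# width/height are example values; in practice, probe them with ffprobe, e.g.:
#   ffprobe -v error -select_streams v:0 \
#     -show_entries stream=width,height -of csv=p=0 input.mp4
width=1920
height=1080

if [ "$width" -le 2048 ] && [ "$height" -le 2048 ]; then
  echo "resolution is standard"    # prints for 1920x1080
else
  echo "resolution will be normalized to fit 2048x2048"
fi
```

The same pattern applies to any of the numeric limits above: probe the value, compare it against the standard-input threshold.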

Modifying Videos for Standard Input

If you wish to normalize your videos prior to serving them with the Video API, you can do so using several different tools.

Using ffmpeg

ffmpeg is an open-source CLI tool that can be used to modify video files. To normalize a video with ffmpeg prior to using the Video API, follow these instructions:

  1. Install ffmpeg
  2. Using your CLI, navigate to where your video file is
  3. Assuming input.mp4 is your video file and out.mp4 is the output file you want to save it as, run this command:
ffmpeg -i input.mp4 -c:a copy -vf "scale=w=min(iw\,1920):h=-2" -c:v libx264 \
-profile:v high -b:v 7000k -pix_fmt yuv420p -maxrate 16000k -bufsize 16000k out.mp4

Feel free to modify this by using different presets as long as they meet imgix’s standard input guidelines.
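If you also want to pin the keyframe interval explicitly (an assumption layered on top of the command above, since x264 chooses its own keyframe placement by default), one sketch is to derive x264's `-g` value, which is measured in frames, from the source frame rate:

```shell
# Derive a GOP size (-g, in frames) for at least one keyframe every 10 seconds.
# fps is an example value; read the real rate from your source file.
fps=30
keyint=$((fps * 10))   # 300 frames at 30 fps = one keyframe per 10 s

echo "use: -g $keyint -c:a aac"
```

Appending `-g $keyint` (and `-c:a aac` in place of `-c:a copy`, if your audio is not already AAC) to the ffmpeg command above would force a compliant keyframe interval and audio CODEC.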

Using your mobile device

Most mobile devices capture H.264 8-bit 4:2:0 video by default, though there are some things to watch for when recording video that will be served using the Video API:

  • The total file bitrate should be below 8 Mbps.
  • Use SDR (standard dynamic range) instead of HDR (high dynamic range) for recording.
  • The output file should be 1080p. If it is larger (such as 4K), imgix will rescale the video so that it is served in 1080p.
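To verify the first point, you can compare the file's overall bitrate (in bits per second) against the 8 Mbps guideline. The `bitrate` value below is an example, normally read via the commented ffprobe call:

```shell
# Compare an overall bitrate against the 8 Mbps (8,000,000 bps) guideline.
# bitrate is an example value; in practice, read it with ffprobe, e.g.:
#   ffprobe -v error -show_entries format=bit_rate -of csv=p=0 input.mp4
bitrate=7000000

if [ "$bitrate" -le 8000000 ]; then
  echo "bitrate is within 8 Mbps"
else
  echo "bitrate will be normalized to 8 Mbps or below"
fi
```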

To reiterate, it is not necessary to conform your videos to the standard input guidelines, but doing so can fully optimize your video processing times and input.

Non-Standard Input

Regardless of whether a video meets the imgix video standards, imgix will still be able to transcode and serve it.

However, in the case of a non-standard video input, imgix will be required to normalize the video input prior to transcoding. This means that non-normalized videos will:

  • Take longer to process due to the standardizing process
  • Have static standard input format settings applied
    • For example, if the included frame rate is standard (10-120 fps), imgix will respect it. If it is non-standard, we will convert it to 30 fps
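The frame-rate example above can be sketched as a simple check (the `fps` value is an example standing in for a probed frame rate):

```shell
# Mirror imgix's frame-rate handling: standard rates (10-120 fps) pass
# through; non-standard rates are normalized to 30 fps.
# fps is an example value below the 10 fps threshold.
fps=8

if [ "$fps" -ge 10 ] && [ "$fps" -le 120 ]; then
  echo "frame rate preserved: $fps fps"
else
  echo "frame rate normalized to 30 fps"    # prints for fps=8
fi
```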

Playing Videos on the Web

By using the Video API, you can transcode your videos into a streamable format that can be played on any major platform or device: web browser, mobile app, and TV.

The example below uses hls.js to create an embeddable web player that can be played on any web page, on any device.

<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
 
<video
  id="my-player"
  controls
  style="width: 100%; max-width: 1500px;"
  crossorigin="anonymous"
></video>
 
<script>
  const video = document.querySelector("#my-player")
  const src =
    "https://assets.imgix.video/videos/girl-reading-book-in-library.mp4?fm=hls"
  if (video.canPlayType("application/vnd.apple.mpegurl")) {
    video.src = src
  } else if (Hls.isSupported()) {
    const hls = new Hls()
    hls.loadSource(src)
    hls.attachMedia(video)
  } else {
    console.error(
      "This is a legacy browser that doesn't support Media Source Extensions",
    )
  }
</script>

This creates a video element on the page that streams the HLS source.

Why Should I use HLS?

The HLS (HTTP Live Streaming) format uses Adaptive Bitrate Streaming, a method that measures the client’s connection strength while it watches an HLS video. Once measured, the video quality adjusts dynamically based on the strength of the connection. This method continues to measure the connection throughout the video and adjusts to provide the best quality while reducing buffering.

MP4 videos do not use any Adaptive Bitrate Streaming method. HLS resolves this by splitting the video into short segments of around 10 seconds, which allows the player to continuously check the client’s internet connection and choose the best quality for each segment.

HLS video support is available for most devices and web browsers.

To use HLS, append either fm=hls or format=hls to your video URL’s query string.
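For example, using a hypothetical imgix source URL:

```shell
# Append fm=hls to a video URL to request the HLS rendition.
# BASE is a hypothetical imgix source URL, not a real asset.
BASE="https://example.imgix.net/video.mp4"
HLS_URL="${BASE}?fm=hls"

echo "$HLS_URL"    # prints: https://example.imgix.net/video.mp4?fm=hls
```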