
TL;DR: High Dynamic Range (HDR) provides improved color, contrast, and brightness for visually stunning video experiences in media-rich Android apps. Developers can enhance their Android apps with HDR video capabilities on devices powered by Snapdragon mobile platforms. This blog dives into using Android’s Camera2 API to query for and enable HDR capabilities on supported devices.

The latest Android devices feature powerful mobile compute and rich media capabilities, with ultra-high-quality video in High Dynamic Range (HDR) being one of the most enticing features. HDR allows for brighter, more detailed highlights and shadows, and a wider range of colors than Standard Dynamic Range (SDR). However, because SDR is still a prominent format, users need a seamless way to share their HDR content with other users’ SDR devices, as well as with external systems like social media sites (some of which can reconstruct HDR from a standard 8-bit JPEG).

To support this, Android 13 requires that any Android device with 10-bit YUV capability must also support SDR and use HLG10 as the baseline for HDR capture. Apps can optionally support additional HDR standards including HDR10, HDR10+, and Dolby Vision 8.4.

If you’re an Android app developer integrating a typical camera-to-end-user pipeline that supports HDR, then you’ll want to become more familiar with the Camera2 package in the Android framework API. It provides low-level access to device-specific camera functionality, and although it requires managing device-specific configurations, it lets you handle complex use cases.

Let’s take a closer look at what this entails.

Common Terms

Before getting into Camera2, it’s important to understand a few key terms you’ll encounter when implementing a typical camera-to-end-user pipeline:

  • Capture: Capture data from the onboard camera sensor(s) — either for preview or recording.
  • Edit: Process the raw data as HDR or SDR at the codec level. A key phase of this process is tone mapping, which reduces tonal values so the imagery is suitable for display on digital screens.
  • Encode: Compress the raw data (e.g., for storing and sharing).
  • Transcode: Decompress the video and re-encode it (e.g., to different codec formats, HDR to SDR mode, etc.). Additional alterations can also be made during this phase (e.g., adding watermarks).
  • Decode: Decompress an encoded video for playback.

Check for HLG Support

The first order of business is to check for HLG support. Android’s Camera2 API provides a straightforward interface for this. Android device manufacturers must support HLG10 as the baseline HDR standard for their 10-bit cameras, so start by checking for the presence of a 10-bit camera, as shown in the following code sample:

val cameraCharacteristics = cameraManager.getCameraCharacteristics(cameraId)
val availableCapabilities =
    cameraCharacteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
// DYNAMIC_RANGE_TEN_BIT indicates the camera supports 10-bit (HDR) output.
if (availableCapabilities?.contains(
        CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_DYNAMIC_RANGE_TEN_BIT) == true) {
    // HDR available
}
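
Once the 10-bit capability is confirmed, you can also query which HDR profiles the camera actually supports. A minimal sketch (API 33+), reusing cameraCharacteristics from the snippet above:

val dynamicRangeProfiles = cameraCharacteristics.get(
    CameraCharacteristics.REQUEST_AVAILABLE_DYNAMIC_RANGE_PROFILES)
// HLG10 is the mandatory baseline on 10-bit cameras; HDR10, HDR10+, and
// Dolby Vision profiles are optional and device-dependent.
val supportedProfiles = dynamicRangeProfiles?.supportedProfiles ?: emptySet()
val supportsHdr10Plus = DynamicRangeProfiles.HDR10_PLUS in supportedProfiles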

HDR Video Capture

The next step is to set up a CameraCaptureSession for a CameraDevice to capture video from the camera. A CameraCaptureSession abstracts the process for capturing images from a camera to one or more target surfaces.

The following code sample from the Android developer documentation shows how to create a CameraCaptureSession using different methods depending on the OS version:

private fun setupSessionWithDynamicRangeProfile(
    dynamicRange: Long,
    device: CameraDevice,
    targets: List<Surface>,
    handler: Handler? = null,
    stateCallback: CameraCaptureSession.StateCallback
): Boolean {
    if (android.os.Build.VERSION.SDK_INT >=
            android.os.Build.VERSION_CODES.TIRAMISU) {
        val outputConfigs = mutableListOf<OutputConfiguration>()
        for (target in targets) {
            val outputConfig = OutputConfiguration(target)
            // Sets the dynamic range profile, for example DynamicRangeProfiles.HLG10
            outputConfig.setDynamicRangeProfile(dynamicRange)
            outputConfigs.add(outputConfig)
        }

        device.createCaptureSessionByOutputConfigurations(
            outputConfigs, stateCallback, handler)
        return true
    } else {
        device.createCaptureSession(targets, stateCallback, handler)
        return false
    }
}

Note: A preview stream and its shared streams require a low-latency profile, but this is optional for video streams. An application can determine whether any of the HDR modes adds an extra look-ahead delay by invoking isExtraLatencyPresent() (passing in DynamicRangeProfiles.HDR10_PLUS, DynamicRangeProfiles.HDR10, or DynamicRangeProfiles.HLG10) before invoking setDynamicRangeProfile().
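
For example, a sketch of that latency check, assuming dynamicRangeProfiles was obtained from the camera’s characteristics as shown earlier:

// Returns true if selecting HDR10+ would add look-ahead latency to the stream.
val hasExtraLatency =
    dynamicRangeProfiles?.isExtraLatencyPresent(DynamicRangeProfiles.HDR10_PLUS) == true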

The session object can then be used for both preview and recordings. The code sample below shows how to start a preview by invoking a repeating CaptureRequest:

session.setRepeatingRequest(previewRequest, null, cameraHandler)

Notes:

  • cameraHandler is the thread handler on which the listener should be invoked (or can be set to null to use the current thread).
  • If the application is using different HDR profiles for preview and video, it must check for valid profile combinations using getProfileCaptureRequestConstraints().

A repeating CaptureRequest maintains a continuous stream of frames, without having to continually invoke frame-by-frame capture requests. The first parameter is a CaptureRequest that contains the information required to perform the capture (e.g., capture hardware, output buffer, target surface(s), etc.).
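
For reference, a minimal sketch of how previewRequest itself might be built, assuming previewSurface is one of the session’s configured target surfaces:

// TEMPLATE_PREVIEW tunes the request for a continuous, high-frame-rate preview.
val previewRequest = session.device
    .createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
    .apply { addTarget(previewSurface) }
    .build()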

Similarly, a recording is also started using a repeating request. The following example shows this request with a CaptureCallback that can be used to track the capture progress (e.g., started, stopped, etc.):

session.setRepeatingRequest(recordRequest,
        object : CameraCaptureSession.CaptureCallback() {
    override fun onCaptureCompleted(session: CameraCaptureSession,
            request: CaptureRequest, result: TotalCaptureResult) {
        if (currentlyRecording) {
            encoder.frameAvailable()
        }
    }
}, cameraHandler)

HDR10/10+ Video Editing

Video editing is performed using the MediaCodec class. To determine whether the device supports HDR video editing, invoke the getCapabilitiesForType() method, which returns a MediaCodecInfo.CodecCapabilities object. Then invoke that object’s isFeatureSupported() method, passing in the FEATURE_HdrEditing string. If the method returns true, the device supports YUV and RGB input. In this case, the encoder transforms and tone-maps RGBA1010102 to encodable YUV P010. For example, if a user recorded an HDR video in HLG, they can downscale/rotate it or add a logo/sticker and save it in HDR format.
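
A minimal sketch of that capability check (API 33+); the MIME type default and the encoder scan are illustrative assumptions:

fun isHdrEditingSupported(mimeType: String = MediaFormat.MIMETYPE_VIDEO_HEVC): Boolean {
    val codecList = MediaCodecList(MediaCodecList.REGULAR_CODECS)
    return codecList.codecInfos
        .filter { it.isEncoder }
        .any { info ->
            try {
                info.getCapabilitiesForType(mimeType)
                    .isFeatureSupported(MediaCodecInfo.CodecCapabilities.FEATURE_HdrEditing)
            } catch (e: IllegalArgumentException) {
                false // this codec does not handle the given MIME type
            }
        }
}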

The TransformationRequest.Builder class’s experimental_setEnableHdrEditing() method can be used to construct a transformation for HDR editing.
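
A sketch of how that might look, assuming a Media3 Transformer version that still exposes this experimental flag (context, inputUri, and outputPath are placeholder names):

val request = TransformationRequest.Builder()
    .experimental_setEnableHdrEditing(true) // experimental API; subject to change
    .build()
val transformer = Transformer.Builder(context)
    .setTransformationRequest(request)
    .build()
// Transcodes/edits the input while preserving HDR.
transformer.startTransformation(MediaItem.fromUri(inputUri), outputPath)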

HDR Transcoding to SDR

You may need to support transcoding HDR content to SDR to allow for sharing content across different devices or exporting video to other formats. Snapdragon SoCs feature an optimized pipeline that looks for ways to reduce latency during the transformation and can tone map various HDR formats including HLG10, HDR10, HDR10+, and Dolby Vision (on licensed devices).

You enable transcoding by implementing the Codec.DecoderFactory interface from the Media3 Transformer API. Here, you construct a MediaFormat object for your video’s MIME type and pass MediaFormat.KEY_COLOR_TRANSFER_REQUEST to the object’s setInteger() method, along with the MediaFormat.COLOR_TRANSFER_SDR_VIDEO flag.

The following code sample shows an implementation of the interface’s createForVideoDecoding() method. The implementation configures a codec that tone-maps raw video frames to SDR when requested by the caller:

public Codec createForVideoDecoding(
    Format format, Surface outputSurface, boolean enableRequestSdrToneMapping)
    throws TransformationException {

  MediaFormat mediaFormat =
      MediaFormat.createVideoFormat(
          checkNotNull(format.sampleMimeType), format.width, format.height);

  MediaFormatUtil.maybeSetInteger(mediaFormat, MediaFormat.KEY_ROTATION, format.rotationDegrees);

  MediaFormatUtil.maybeSetInteger(
      mediaFormat, MediaFormat.KEY_MAX_INPUT_SIZE, format.maxInputSize);
  MediaFormatUtil.setCsdBuffers(mediaFormat, format.initializationData);
  if (SDK_INT >= 29) {
    // On API level 29 and above, Transformer decodes as many frames as possible in one render
    // cycle. This key ensures no frame dropping when the decoder's output surface is full.
    mediaFormat.setInteger(MediaFormat.KEY_ALLOW_FRAME_DROP, 0);
  }
  if (SDK_INT >= 31 && enableRequestSdrToneMapping) {
    mediaFormat.setInteger(
        MediaFormat.KEY_COLOR_TRANSFER_REQUEST, MediaFormat.COLOR_TRANSFER_SDR_VIDEO);
  }

  @Nullable
  String mediaCodecName = EncoderUtil.findCodecForFormat(mediaFormat, /* isDecoder= */ true);
  if (mediaCodecName == null) {
    throw createTransformationException(format);
  }
  return new DefaultCodec(
      format, mediaFormat, mediaCodecName, /* isDecoder= */ true, outputSurface);
}

Conclusion

Camera2 is a good starting point for Android app developers to add HDR support. With it, you can query device capabilities at runtime and provide optional code paths that take full advantage of HDR on supported devices, such as those powered by Snapdragon mobile platforms. Best of all, these foundations are available now.

Be sure to check out the High Dynamic Range Playback documentation for more information. And visit the Android on Snapdragon developer portal for more APIs and vendor extensions.

Author: Morris Novello
Co-written by Sunid Wilson, Director of Engineering, Camera, and Satish Goverdhan, Sr. Director of Engineering, Camera

Morris Novello is Staff Manager and Head of Developer Marketing at Qualcomm Technologies, Inc. In this role, he provides marketing leadership, strategy and execution for both Developers and Partners across QTI technologies.

Snapdragon branded products are products of Qualcomm Technologies, Inc. and/or its subsidiaries.

This article was previously published on proandroiddev.com
