{"id":267240,"date":"2024-12-21T16:57:38","date_gmt":"2024-12-21T16:57:38","guid":{"rendered":"https:\/\/michigandigitalnews.com\/index.php\/2024\/12\/21\/whats-new-in-camerax-1-4-0-and-a-sneak-peek-of-jetpack-compose-support\/"},"modified":"2025-06-25T17:09:57","modified_gmt":"2025-06-25T17:09:57","slug":"whats-new-in-camerax-1-4-0-and-a-sneak-peek-of-jetpack-compose-support","status":"publish","type":"post","link":"https:\/\/michigandigitalnews.com\/index.php\/2024\/12\/21\/whats-new-in-camerax-1-4-0-and-a-sneak-peek-of-jetpack-compose-support\/","title":{"rendered":"What&#8217;s new in CameraX 1.4.0 and a sneak peek of Jetpack Compose support"},"content":{"rendered":"<div>\n<p><em>Posted by Scott Nien \u2013 Software Engineer (scottnien@)<\/em><\/p>\n<p>Get ready to level up your Android camera apps! CameraX 1.4.0 just dropped with a load of awesome new features and improvements. We&#8217;re talking expanded HDR capabilities, preview stabilization, a versatile effect framework, and a whole lot more to explore. We&#8217;ll also take a sneak peek at how to seamlessly integrate CameraX with Jetpack Compose! 
Let&#8217;s dive in and see how these enhancements can take your camera app to the next level.<\/p>\n<h2><span style=\"font-size: x-large;\">HDR preview and Ultra HDR<\/span><\/h2>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"A split-screen image compares Standard Dynamic Range (SDR) and High Dynamic Range (HDR) image quality side-by-side using a singular image of a detailed landscape. The HDR side is more vivid and vibrant.\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEgLNYJXd1Kk97dTXZe5M09aMs6gXCEr0DxDIazLXDJOIAvh2JbYnmklTDFFrFVkpqzjyxox8yL7EeYCNnXSxKFJHwbvtCCgEDwo8U1cg2KTJdNe34sldODOzK9JKixr7Dt4XHkoGYFCF6TBxVWGZO6oKM0WHktPHxVGXzlp6G0LhTWKS-5JMvb3m-JIeio\/s1600\/image3.png\" width=\"640\"\/><\/div>\n<p><\/image><\/p>\n<p>High Dynamic Range (HDR) is a game-changer for photography, capturing a wider range of light and detail to create stunningly realistic images.  With CameraX 1.3.0, we brought you HDR video recording capabilities, and now in 1.4.0, we&#8217;re taking it even further! Get ready for <b>HDR Preview<\/b> and <b>Ultra HDR<\/b>. These exciting additions empower you to deliver an even richer visual experience to your users.<\/p>\n<h3><span style=\"font-size: large;\">HDR Preview<\/span><\/h3>\n<p>This new feature allows you to enable HDR on Preview without needing to bind a VideoCapture use case. 
This is especially useful for apps that use a single preview stream for both displaying the preview on screen and recording video with an OpenGL pipeline.<\/p>\n<p>To fully enable HDR, you need to ensure your OpenGL pipeline can process the specific dynamic range format, and then check the camera&#8217;s capabilities.<\/p>\n<p>See the following code snippet for an example of enabling HLG10, the baseline HDR standard that device makers must support on cameras with 10-bit output.<\/p>\n<div style=\"background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;\">\n<pre style=\"line-height: 125%; margin: 0px;\"><span style=\"color: #408080; font-style: italic;\">\/\/ Declare your OpenGL pipeline supported dynamic range format. <\/span>\n<span style=\"color: green; font-weight: bold;\">val<\/span> openGLPipelineSupportedDynamicRange = setOf(\n     DynamicRange.SDR, \n     DynamicRange.HLG_10_BIT\n)\n<span style=\"color: #408080; font-style: italic;\">\/\/ Check camera dynamic range capabilities. <\/span>\n<span style=\"color: green; font-weight: bold;\">val<\/span> isHlg10Supported =  \n     cameraProvider.getCameraInfo(cameraSelector)\n           .querySupportedDynamicRanges(openGLPipelineSupportedDynamicRange)\n           .contains(DynamicRange.HLG_10_BIT)\n\n<span style=\"color: green; font-weight: bold;\">val<\/span> preview = Preview.Builder().apply {\n     <span style=\"color: green; font-weight: bold;\">if<\/span> (isHlg10Supported) {\n        setDynamicRange(DynamicRange.HLG_10_BIT)\n     }\n}.build()\n<\/pre>\n<\/div>\n<h3><span style=\"font-size: large;\">Ultra HDR<\/span><\/h3>\n<p>Introducing Ultra HDR, a new format in Android 14 that lets users capture stunningly realistic photos with incredible dynamic range. And the best part? 
CameraX 1.4.0 makes it incredibly easy to add Ultra HDR capture to your app with just a few lines of code:<\/p>\n<div style=\"background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;\">\n<pre style=\"line-height: 125%; margin: 0px;\"><span style=\"color: green; font-weight: bold;\">val<\/span> cameraSelector = CameraSelector.DEFAULT_BACK_CAMERA\n<span style=\"color: green; font-weight: bold;\">val<\/span> cameraInfo = cameraProvider.getCameraInfo(cameraSelector)\n<span style=\"color: green; font-weight: bold;\">val<\/span> isUltraHdrSupported = \n      ImageCapture.getImageCaptureCapabilities(cameraInfo)\n                  .supportedOutputFormats\n                  .contains(ImageCapture.OUTPUT_FORMAT_JPEG_ULTRA_HDR)\n\n<span style=\"color: green; font-weight: bold;\">val<\/span> imageCapture = ImageCapture.Builder().apply {\n    <span style=\"color: green; font-weight: bold;\">if<\/span> (isUltraHdrSupported) {\n        setOutputFormat(ImageCapture.OUTPUT_FORMAT_JPEG_ULTRA_HDR)\n    }\n}.build()\n<\/pre>\n<\/div>\n<h3><span style=\"font-size: large;\">Jetpack Compose support<\/span><\/h3>\n<p>While this post focuses on 1.4.0, we&#8217;re excited to announce the Jetpack Compose support in CameraX 1.5.0 alpha. 
We\u2019re adding support for a Composable Viewfinder built on top of <span style=\"font-family: Courier;\"><a href=\"https:\/\/developer.android.com\/reference\/kotlin\/androidx\/compose\/foundation\/package-summary#AndroidExternalSurface%28androidx.compose.ui.Modifier,kotlin.Boolean,androidx.compose.ui.unit.IntSize,androidx.compose.foundation.AndroidExternalSurfaceZOrder,kotlin.Boolean,kotlin.Function1%29\" target=\"_blank\" rel=\"noopener\">AndroidExternalSurface<\/a><\/span> and <span style=\"font-family: Courier;\"><a href=\"https:\/\/developer.android.com\/reference\/kotlin\/androidx\/compose\/foundation\/package-summary#AndroidEmbeddedExternalSurface%28androidx.compose.ui.Modifier,kotlin.Boolean,androidx.compose.ui.unit.IntSize,androidx.compose.ui.graphics.Matrix,kotlin.Function1%29\" target=\"_blank\" rel=\"noopener\">AndroidEmbeddedExternalSurface<\/a><\/span>. The <span style=\"font-family: Courier;\"><a href=\"https:\/\/developer.android.com\/reference\/kotlin\/androidx\/camera\/compose\/package-summary#CameraXViewfinder%28androidx.camera.core.SurfaceRequest,androidx.compose.ui.Modifier,androidx.camera.viewfinder.surface.ImplementationMode,androidx.camera.viewfinder.compose.MutableCoordinateTransformer%29\" target=\"_blank\" rel=\"noopener\">CameraXViewfinder<\/a><\/span> Composable hooks up a display surface to a CameraX Preview use case, handling the complexities of rotation, scaling and Surface lifecycle so you don\u2019t need to.<\/p>\n<p><!--HTML generated using hilite.me--><\/p>\n<div style=\"background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;\">\n<pre style=\"line-height: 125%; margin: 0px;\"><span style=\"color: #408080; font-style: italic;\">\/\/ in build.gradle <\/span>\nimplementation (<span style=\"color: #ba2121;\">\"androidx.camera:camera-compose:1.5.0-alpha03\"<\/span>)\n\n\n<span style=\"color: green; font-weight: bold;\">class<\/span> <span style=\"color: blue; font-weight: bold;\">PreviewViewModel<\/span> : 
ViewModel() {\n    <span style=\"color: green; font-weight: bold;\">private<\/span> <span style=\"color: green; font-weight: bold;\">val<\/span> _surfaceRequests = MutableStateFlow&lt;SurfaceRequest?&gt;(<span style=\"color: green; font-weight: bold;\">null<\/span>)\n\n    <span style=\"color: green; font-weight: bold;\">val<\/span> surfaceRequests: StateFlow&lt;SurfaceRequest?&gt;\n        <span style=\"color: green; font-weight: bold;\">get<\/span>() = _surfaceRequests.asStateFlow()\n\n    <span style=\"color: green; font-weight: bold;\">private<\/span> <span style=\"color: green; font-weight: bold;\">fun<\/span> <span style=\"color: blue;\">produceSurfaceRequests<\/span>(previewUseCase: Preview) {\n        <span style=\"color: #408080; font-style: italic;\">\/\/ Always publish new SurfaceRequests from Preview<\/span>\n        previewUseCase.setSurfaceProvider { newSurfaceRequest -&gt;\n            _surfaceRequests.value = newSurfaceRequest\n        }\n    }\n\n    <span style=\"color: #408080; font-style: italic;\">\/\/ ...<\/span>\n}\n\n@Composable\n<span style=\"color: green; font-weight: bold;\">fun<\/span> <span style=\"color: blue;\">MyCameraViewfinder<\/span>(\n    viewModel: PreviewViewModel,\n    modifier: Modifier = Modifier\n) {\n    <span style=\"color: green; font-weight: bold;\">val<\/span> currentSurfaceRequest: SurfaceRequest? by\n        viewModel.surfaceRequests.collectAsState()\n\n    currentSurfaceRequest?.let { surfaceRequest -&gt;\n        CameraXViewfinder(\n            surfaceRequest = surfaceRequest,\n            implementationMode = ImplementationMode.EXTERNAL, <span style=\"color: #408080; font-style: italic;\">\/\/ Or EMBEDDED<\/span>\n            modifier = modifier\n        )\n    }\n}\n<\/pre>\n<\/div>\n<h3><span style=\"font-size: large;\">Kotlin-friendly APIs<\/span><\/h3>\n<p>CameraX is getting even more Kotlin-friendly! 
In 1.4.0, we&#8217;ve introduced two new suspend functions to streamline camera initialization and image capture.<\/p>\n<div style=\"background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;\">\n<pre style=\"line-height: 125%; margin: 0px;\"><span style=\"color: #408080; font-style: italic;\">\/\/ CameraX initialization <\/span>\n<span style=\"color: green; font-weight: bold;\">val<\/span> cameraProvider = ProcessCameraProvider.awaitInstance()\n\n<span style=\"color: green; font-weight: bold;\">val<\/span> imageProxy = imageCapture.takePicture() \n<span style=\"color: #408080; font-style: italic;\">\/\/ Processing imageProxy<\/span>\nimageProxy.close()\n<\/pre>\n<\/div>\n<h2><span style=\"font-size: x-large;\">Preview Stabilization and Mirror mode<\/span><\/h2>\n<h3><span style=\"font-size: large;\">Preview Stabilization<\/span><\/h3>\n<p><a href=\"https:\/\/source.android.com\/docs\/core\/camera\/camera-preview-stabilization\" target=\"_blank\" rel=\"noopener\">Preview stabilization mode<\/a> was added in Android 13 to enable the stabilization on all non-RAW streams, including previews and MediaCodec input surfaces. Compared to the previous <a href=\"https:\/\/developer.android.com\/reference\/android\/hardware\/camera2\/CameraMetadata#CONTROL_VIDEO_STABILIZATION_MODE_ON\" target=\"_blank\" rel=\"noopener\">video stabilization mode<\/a>, which may have inconsistent FoV (Field of View) between the preview and recorded video, this new preview stabilization mode ensures consistency and thus provides a better user experience.  For apps that record the preview directly for video recording, this mode is also the only way to enable stabilization.<\/p>\n<p>Follow the code below to enable preview stabilization. 
Please note that once preview stabilization is turned on, it applies not only to the Preview but also to the VideoCapture, if one is bound.<\/p>\n<div style=\"background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;\">\n<pre style=\"line-height: 125%; margin: 0px;\"><span style=\"color: green; font-weight: bold;\">val<\/span> isPreviewStabilizationSupported =  \n    Preview.getPreviewCapabilities(cameraProvider.getCameraInfo(cameraSelector))\n        .isStabilizationSupported\n<span style=\"color: green; font-weight: bold;\">val<\/span> preview = Preview.Builder().apply {\n    <span style=\"color: green; font-weight: bold;\">if<\/span> (isPreviewStabilizationSupported) {\n      setPreviewStabilizationEnabled(<span style=\"color: green; font-weight: bold;\">true<\/span>)\n    }\n}.build()\n<\/pre>\n<\/div>\n<h3><span style=\"font-size: large;\">MirrorMode<\/span><\/h3>\n<p>While CameraX 1.3.0 introduced mirror mode for VideoCapture, we&#8217;ve now brought this handy feature to Preview in 1.4.0. This is especially useful for devices with outer displays, allowing you to create a more natural selfie experience when using the rear camera.<\/p>\n<p>To enable mirror mode, simply call the <span style=\"font-family: Courier;\"><a href=\"https:\/\/developer.android.com\/reference\/kotlin\/androidx\/camera\/core\/Preview.Builder#setMirrorMode%28int%29\" target=\"_blank\" rel=\"noopener\">Preview.Builder.setMirrorMode<\/a><\/span> API. This feature is supported on Android 13 and above.<\/p>\n<h2><span style=\"font-size: x-large;\">Real-time Effect<\/span><\/h2>\n<p>CameraX 1.3.0 introduced the CameraEffect framework, giving you the power to customize your camera output with OpenGL. Now, in 1.4.0, we&#8217;re taking it a step further. 
In addition to applying your own custom effects, you can now leverage a set of pre-built effects provided by CameraX and Media3, making it easier than ever to enhance your app&#8217;s camera features.<\/p>\n<h3><span style=\"font-size: large;\">Overlay Effect<\/span><\/h3>\n<p>The new camera-effects artifact aims to provide ready-to-use effect implementations, starting with the OverlayEffect. This effect lets you draw overlays on top of camera frames using the familiar <a href=\"https:\/\/developer.android.com\/reference\/android\/graphics\/Canvas\" target=\"_blank\" rel=\"noopener\">Canvas API<\/a>.<\/p>\n<p>The following sample code shows how to detect the QR code and draw the shape of the QR code once it is detected.<\/p>\n<p>By default, drawing is performed in surface frame coordinates. But what if you need to use camera sensor coordinates? No problem! <span style=\"color: #0d904f; font-family: Courier;\">OverlayEffect<\/span> provides the <span style=\"font-family: Courier;\"><a href=\"https:\/\/developer.android.com\/reference\/kotlin\/androidx\/camera\/effects\/Frame#getSensorToBufferTransform%28%29\" target=\"_blank\" rel=\"noopener\">Frame#getSensorToBufferTransform<\/a><\/span> function, allowing you to apply the necessary transformation matrix to your overlayCanvas.<\/p>\n<p>In this example, we use CameraX&#8217;s MLKit Vision APIs (<span style=\"color: #0d904f; font-family: Courier;\">MlKitAnalyzer<\/span>) and specify <span style=\"color: #0d904f; font-family: Courier;\">COORDINATE_SYSTEM_SENSOR<\/span> to obtain QR code corner points in sensor coordinates. 
This ensures accurate overlay placement regardless of device orientation or screen aspect ratio.<\/p>\n<p><!-- HTML generated using hilite.me --><\/p>\n<div style=\"background: #f8f8f8; overflow:auto;width:auto;border:0;\">\n<pre style=\"margin: 0; line-height: 125%\"><span style=\"color: #408080; font-style: italic\">\/\/ in build.gradle <\/span>\nimplementation (<span style=\"color: #BA2121\">\"androidx.camera:camera-effects:1.4.1\"<\/span>)\nimplementation (<span style=\"color: #BA2121\">\"androidx.camera:camera-mlkit-vision:1.4.1\"<\/span>)\n\n<span style=\"color: #008000; font-weight: bold\">var<\/span> qrcodePoints: Array&lt;Point&gt;? = <span style=\"color: #008000; font-weight: bold\">null<\/span>\n<span style=\"color: #008000; font-weight: bold\">var<\/span> qrcodeTimestamp = <span style=\"color: #666666\">0L<\/span>\n<span style=\"color: #408080; font-style: italic\">\/\/ Paint used to draw the QR code outline.<\/span>\n<span style=\"color: #008000; font-weight: bold\">val<\/span> paint = Paint().apply {\n    style = Paint.Style.STROKE\n    strokeWidth = <span style=\"color: #666666\">10f<\/span>\n    color = Color.GREEN\n}\n<span style=\"color: #008000; font-weight: bold\">val<\/span> qrcodeBoxEffect \n    = OverlayEffect(\n        PREVIEW <span style=\"color: #408080; font-style: italic\">\/* applied on the preview only *\/<\/span>,\n        <span style=\"color: #666666\">5<\/span>, <span style=\"color: #408080; font-style: italic\">\/* hold multiple frames in the queue so we can match analysis result <\/span>\n<span style=\"color: #408080; font-style: italic\">              with preview frame *\/<\/span>\n        Handler(Looper.getMainLooper()), {}\n      )\n\n<span style=\"color: #008000; font-weight: bold\">fun<\/span> <span style=\"color: #0000FF\">initCamera<\/span>() {\n    qrcodeBoxEffect.setOnDrawListener { frame -&gt;\n        <span style=\"color: #008000; font-weight: bold\">if<\/span> (frame.timestamp != qrcodeTimestamp) {\n            <span style=\"color: #408080; font-style: italic\">\/\/ Do not change the drawing if the frame doesn\u2019t match the analysis <\/span>\n            <span style=\"color: #408080; font-style: italic\">\/\/ result.<\/span>\n            <span style=\"color: #008000; font-weight: bold\">return<\/span>@setOnDrawListener <span style=\"color: #008000; font-weight: bold\">true<\/span>\n        }\n        frame.overlayCanvas.drawColor(Color.TRANSPARENT, PorterDuff.Mode.CLEAR)\n        qrcodePoints?.let {\n            <span style=\"color: #408080; font-style: italic\">\/\/ Using sensor coordinates to draw.<\/span>\n            frame.overlayCanvas.setMatrix(frame.sensorToBufferTransform)\n            <span style=\"color: #008000; font-weight: bold\">val<\/span> path = android.graphics.Path().apply {\n                it.forEachIndexed { index, point -&gt;\n                    <span style=\"color: #008000; font-weight: bold\">if<\/span> (index == <span style=\"color: #666666\">0<\/span>) {\n                        moveTo(point.x.toFloat(), point.y.toFloat())\n                    } <span style=\"color: #008000; font-weight: bold\">else<\/span> {\n                        lineTo(point.x.toFloat(), point.y.toFloat())\n                    }\n                 }\n                 lineTo(it[<span style=\"color: #666666\">0<\/span>].x.toFloat(), it[<span style=\"color: #666666\">0<\/span>].y.toFloat())\n            }\n            frame.overlayCanvas.drawPath(path, paint)\n        }\n        <span style=\"color: #008000; font-weight: bold\">true<\/span>\n    }\n\n    <span style=\"color: #008000; font-weight: bold\">val<\/span> imageAnalysis = ImageAnalysis.Builder()\n        .build()\n        .apply {\n            setAnalyzer(executor,\n                MlKitAnalyzer(\n                    listOf(barcodeScanner!!),\n                    COORDINATE_SYSTEM_SENSOR,\n                    executor\n                ) { result -&gt;\n                    <span style=\"color: #008000; font-weight: bold\">val<\/span> barcodes = result.getValue(barcodeScanner!!)\n                    qrcodePoints = \n                        barcodes?.takeIf { it.size &gt; <span style=\"color: #666666\">0<\/span> }?.<span style=\"color: #008000; font-weight: bold\">get<\/span>(<span style=\"color: #666666\">0<\/span>)?.cornerPoints\n                    <span style=\"color: #408080; font-style: italic\">\/\/ track the timestamp of the analysis result and release the <\/span>\n                    <span style=\"color: #408080; font-style: italic\">\/\/ preview frame.<\/span>\n                    qrcodeTimestamp = result.timestamp\n                    qrcodeBoxEffect.drawFrameAsync(qrcodeTimestamp)\n                }\n            )\n        }\n\n    <span style=\"color: #008000; font-weight: bold\">val<\/span> useCaseGroup = UseCaseGroup.Builder()\n          .addUseCase(preview)\n          .addUseCase(imageAnalysis)\n          .addEffect(qrcodeBoxEffect)\n          .build()\n\n    cameraProvider.bindToLifecycle(\n        lifecycleOwner, cameraSelector, useCaseGroup)\n  }\n<\/pre>\n<\/div>\n<p>Here is what the effect looks like:<\/p>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"A black and white view from inside a coffee shop looking out at a city street.  The bottom of the photo shows the edge of a table with a laptop and two buttons labeled 'BACK' and 'RECORD'\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEgiBJ6m4-p4Wmh3sI2mhexJqVnSdHDiJ0ifoetK7ccdWWHdmWZgnjPx_IWd95TUx1NTMkokZnN9oZK1fd_rEk-M8KaEZ-LXzSkiQ5P2VtNUYrBRos-nA-dw0yQ0uMlbZVSKtHf9FdlkmvoDSK6Dpt4qIdP2oOtsYceiFrC8lWf0ZOU2BN4nxojG3wCbu5s\/s1600\/image1.png\" width=\"45%\"\/><\/div>\n<p><\/image><\/p>\n<h2><span style=\"font-size: x-large;\">Screen Flash<\/span><\/h2>\n<p>Taking selfies in low light just got easier with CameraX 1.4.0! This release introduces a powerful new feature: screen flash. Instead of relying on a traditional LED flash, which most selfie cameras don\u2019t have, screen flash cleverly utilizes your phone&#8217;s display. 
By momentarily turning the screen bright white, it provides a burst of illumination that helps capture clear and vibrant selfies even in challenging lighting conditions.<\/p>\n<p>Integrating screen flash into your CameraX app is flexible and straightforward. You have two main options:<\/p>\n<ol>\n<li><b>Implement the <span style=\"color: #0d904f; font-family: Courier;\">ScreenFlash<\/span> interface:<\/b> This gives you full control over the screen flash behavior. You can customize the color, intensity, duration, and any other aspect of the flash. This is ideal if you need a highly tailored solution.<\/li>\n<li><b>Use the built-in implementation:<\/b> For a quick and easy solution, leverage the pre-built screen flash functionality in <span style=\"color: #0d904f; font-family: Courier;\">ScreenFlashView<\/span> or <span style=\"color: #0d904f; font-family: Courier;\">PreviewView<\/span>. This implementation handles all the heavy lifting for you.<\/li>\n<\/ol>\n<p>If you&#8217;re already using <span style=\"color: #0d904f; font-family: Courier;\">PreviewView<\/span> in your app, enabling screen flash is incredibly simple. Just enable it directly on the <span style=\"color: #0d904f; font-family: Courier;\">PreviewView<\/span> instance. 
If you need more control or aren&#8217;t using <span style=\"color: #0d904f; font-family: Courier;\">PreviewView<\/span>, you can use <span style=\"color: #0d904f; font-family: Courier;\">ScreenFlashView<\/span> directly.<\/p>\n<p>Here&#8217;s a code example demonstrating how to enable screen flash:<\/p>\n<div style=\"background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;\">\n<pre style=\"line-height: 125%; margin: 0px;\"><span style=\"color: #408080; font-style: italic;\">\/\/ case 1: PreviewView + CameraX core API.<\/span>\npreviewView.setScreenFlashWindow(activity.window)\nimageCapture.screenFlash = previewView.screenFlash\nimageCapture.setFlashMode(ImageCapture.FLASH_MODE_SCREEN)\n\n<span style=\"color: #408080; font-style: italic;\">\/\/ case 2: PreviewView + CameraController<\/span>\npreviewView.setScreenFlashWindow(activity.window)\ncameraController.imageCaptureFlashMode = ImageCapture.FLASH_MODE_SCREEN\n\n<span style=\"color: #408080; font-style: italic;\">\/\/ case 3: use ScreenFlashView<\/span>\nscreenFlashView.setScreenFlashWindow(activity.window)\nimageCapture.screenFlash = screenFlashView.screenFlash\nimageCapture.setFlashMode(ImageCapture.FLASH_MODE_SCREEN)\n<\/pre>\n<\/div>\n<h2><span style=\"font-size: x-large;\">Camera Extensions new features<\/span><\/h2>\n<p>Camera Extensions APIs aim to help apps access the cutting-edge capabilities previously available only on built-in camera apps. And the ecosystem is growing rapidly! In 2024, we&#8217;ve seen major players like Pixel, Samsung, Xiaomi, Oppo, OnePlus, Vivo, and Honor all embrace Camera Extensions, particularly for Night Mode and Bokeh Mode. 
CameraX 1.4.0 takes this even further by adding support for brand-new Android 15 Camera Extensions features, including:<\/p>\n<ul>\n<li><b>Postview:<\/b> Provides a preview of the captured image almost instantly before the long-exposure shots are completed<\/li>\n<li><b>Capture Process Progress:<\/b> Displays a progress indicator so users know how long capturing and processing will take, improving the experience for features like Night Mode<\/li>\n<li><b>Extensions Strength:<\/b> Allows users to fine-tune the intensity of the applied effect<\/li>\n<\/ul>\n<p>Below is an example of the improved UX that uses the postview and capture process progress features on the Samsung S24 Ultra.<\/p>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"moving image capturing process progress features on Samsung S24 Ultra\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEiQQbOgL1fhJBA4nL-eO4dhqbA1POcGQImYsAGg6N1F3iFNLD3F2yQyBKaWUxS0yOv5ycc46h4bndnMmwX10KP1ZMk3L6PS_WWTDK_EPRAOQub4G_ssZarclyqvXbllKS5htzF2QEv4uQtdNyvXsbgma7_ZkezDKTk98-rE6Jz0HHiuBM-trnlsLqp0gV8\/s1600\/image2.gif\" width=\"45%\"\/><\/div>\n<p><\/image><\/p>\n<p>Curious how this can be implemented? 
See the sample code below:<\/p>\n<div style=\"background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;\">\n<pre style=\"line-height: 125%; margin: 0px;\"><span style=\"color: green; font-weight: bold;\">val<\/span> extensionsCameraSelector =  \n    extensionsManager\n        .getExtensionEnabledCameraSelector(DEFAULT_BACK_CAMERA, extensionMode)\n<span style=\"color: green; font-weight: bold;\">val<\/span> isPostviewSupported = ImageCapture.getImageCaptureCapabilities(\n    cameraProvider.getCameraInfo(extensionsCameraSelector)\n).isPostviewSupported\n<span style=\"color: green; font-weight: bold;\">val<\/span> imageCapture = ImageCapture.Builder().apply {\n    setPostviewEnabled(isPostviewSupported)\n}.build()\n\nimageCapture.takePicture(outputFileOptions, executor, \n    object : OnImageSavedCallback {\n        <span style=\"color: green; font-weight: bold;\">override<\/span> <span style=\"color: green; font-weight: bold;\">fun<\/span> <span style=\"color: blue;\">onImageSaved<\/span>(outputFileResults: OutputFileResults) {\n            <span style=\"color: #408080; font-style: italic;\">\/\/ Final image saved.
<\/span>\n        }\n        <span style=\"color: green; font-weight: bold;\">override<\/span> <span style=\"color: green; font-weight: bold;\">fun<\/span> <span style=\"color: blue;\">onPostviewBitmapAvailable<\/span>(bitmap: Bitmap) {\n            <span style=\"color: #408080; font-style: italic;\">\/\/ Postview bitmap is available.<\/span>\n        }\n        <span style=\"color: green; font-weight: bold;\">override<\/span> <span style=\"color: green; font-weight: bold;\">fun<\/span> <span style=\"color: blue;\">onCaptureProcessProgressed<\/span>(progress: Int) {\n            <span style=\"color: #408080; font-style: italic;\">\/\/ Capture process progress update.<\/span>\n        }\n        <span style=\"color: green; font-weight: bold;\">override<\/span> <span style=\"color: green; font-weight: bold;\">fun<\/span> <span style=\"color: blue;\">onError<\/span>(exception: ImageCaptureException) {\n            <span style=\"color: #408080; font-style: italic;\">\/\/ Handle capture errors.<\/span>\n        }\n    })\n<\/pre>\n<\/div>\n<blockquote><p><b>Important:<\/b> If your app ran into the CameraX Extensions issue on Pixel 9 series devices, please use CameraX 1.4.1 instead. This release fixes a critical issue that prevented Night Mode from working correctly with <span style=\"color: #0d904f; font-family: Courier;\">takePicture<\/span>.<\/p><\/blockquote>\n<h3><span style=\"font-size: large;\">What&#8217;s Next<\/span><\/h3>\n<p>We hope you enjoy this new release. Our mission is to make camera development a joy, removing the friction and pain points so you can focus on innovation. With CameraX, you can easily harness the power of Android&#8217;s camera capabilities and build truly amazing app experiences.<\/p>\n<p>Have questions or want to connect with the CameraX team? Join the CameraX developer discussion group or file a bug report.<\/p>\n<p>We can\u2019t wait to see what you create!<\/p>\n<\/div>\n<p><a href=\"http:\/\/android-developers.googleblog.com\/2024\/12\/whats-new-in-camerax-140-and-jetpack-compose-support.html\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Posted by Scott Nien \u2013 Software Engineer (scottnien@) Get ready to level up your Android camera apps! 
CameraX 1.4.0 just dropped with a load<\/p>\n","protected":false},"author":1,"featured_media":267241,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[146],"tags":[],"_links":{"self":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/267240"}],"collection":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/comments?post=267240"}],"version-history":[{"count":0,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/267240\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/media\/267241"}],"wp:attachment":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/media?parent=267240"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/categories?post=267240"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/tags?post=267240"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}