{"id":277875,"date":"2025-06-05T15:06:19","date_gmt":"2025-06-05T15:06:19","guid":{"rendered":"https:\/\/michigandigitalnews.com\/index.php\/2025\/06\/05\/building-powerful-ai-driven-experiences-with-jetpack-compose-gemini-and-camerax\/"},"modified":"2025-06-25T17:08:11","modified_gmt":"2025-06-25T17:08:11","slug":"building-powerful-ai-driven-experiences-with-jetpack-compose-gemini-and-camerax","status":"publish","type":"post","link":"https:\/\/michigandigitalnews.com\/index.php\/2025\/06\/05\/building-powerful-ai-driven-experiences-with-jetpack-compose-gemini-and-camerax\/","title":{"rendered":"Building powerful AI-driven experiences with Jetpack Compose, Gemini and CameraX"},"content":{"rendered":"<p>The Android bot is a beloved mascot for Android users and developers. Previous versions of the bot builder were very popular, so this year we decided to rebuild the bot maker from the ground up, using the latest technology backed by Gemini. Today we are releasing a new <a href=\"http:\/\/github.com\/android\/androidify\" target=\"_blank\" rel=\"noopener\">open source app, Androidify<\/a>, for learning how to build powerful AI-driven experiences on Android using the latest technologies such as Jetpack Compose, Gemini through Firebase, CameraX, and Navigation 3.<\/p>\n<p>Here\u2019s an example of the app running on a device, showing how it converts a photo into an Android bot that represents my likeness:<\/p>\n<div id=\"\">\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"moving image showing the conversion of an image of a woman in a pink dress holding an umbrella into a 3D image of a droid bot wearing a pink dress holding an umbrella\" border=\"0\" id=\"imgCaption\" 
src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEhxOHgFiXuoeTUjM4oXoTr84cmNqihVX2xt7ARBzl_6NOsgYy6VM2lNNpQtjBIH9vh3zP-Ap8cRw-ew2MMvfhaDkX7-1v2t8jiaHiQR6RqtAIGP8aWmN43mlKF7E5_Ynq3IOYM9HIKCWPIk8wi7KFYfvItrDc7IvEbpo_3oTV0p-9EpeQjC2wDUrl__4TQ\/s1080\/androidify-bot-demo-google-io.gif\" width=\"40%\"\/><\/div>\n<h2><span style=\"font-size: x-large;\">Under the hood<\/span><\/h2>\n<p>The app combines a variety of different Google technologies, such as:<\/p>\n<ul>\n<ul>\n<li><b><a href=\"http:\/\/developer.android.com\/ai\/overview-gemini\" target=\"_blank\" rel=\"noopener\">Gemini API<\/a><\/b> &#8211; through Firebase AI Logic SDK, for accessing the underlying Imagen and Gemini models.<\/li>\n<\/ul>\n<ul>\n<li><b><a href=\"http:\/\/d.android.com\/compose\" target=\"_blank\" rel=\"noopener\">Jetpack Compose<\/a><\/b> &#8211; for building the UI with delightful animations and making the app adapt to different screen sizes.<\/li>\n<\/ul>\n<ul>\n<li><b>Navigation 3<\/b> &#8211; the latest navigation library for building up Navigation graphs with Compose.<\/li>\n<\/ul>\n<ul>\n<li><b><a href=\"https:\/\/developer.android.com\/reference\/kotlin\/androidx\/camera\/compose\/package-summary\" target=\"_blank\" rel=\"noopener\">CameraX Compose<\/a> and <a href=\"https:\/\/developer.android.com\/media\/media3\/ui\/compose\" target=\"_blank\" rel=\"noopener\">Media3 Compose<\/a><\/b> &#8211; for building up a custom camera with custom UI controls (rear camera support, zoom support, tap-to-focus) and playing the promotional video.<\/li>\n<\/ul>\n<\/ul>\n<p>This sample app is currently using a standard Imagen model, but we&#8217;ve been working on a fine-tuned model that&#8217;s trained specifically on all of the pieces that make the Android bot cute and fun; we&#8217;ll share that version later this year. 
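<\/p>
<p>The AI flow the app follows (validate the user\u2019s input, enrich it into a detailed caption, then generate the bot image) can be modeled in plain Kotlin. Every name in this sketch is an illustrative stand-in, not the app\u2019s real API or the Firebase SDK:<\/p>

```kotlin
// Illustrative stand-ins for the Gemini/Imagen-backed steps described in this
// post; none of these names come from the real Androidify code or SDKs.
fun interface TextModel { fun generate(prompt: String): String }
fun interface ImageModel { fun generate(prompt: String): ByteArray }

// The pipeline: validate the user's description, ask the text model for a
// richer caption (clothing, colors), then hand the result to image generation.
fun androidifyPipeline(
    text: TextModel,
    image: ImageModel,
    userDescription: String,
    skinTone: String,
): ByteArray {
    require(userDescription.isNotBlank()) { "Prompt must describe the bot" } // prompt validation step
    val caption = text.generate("Describe the clothing and colors: $userDescription")
    return image.generate("Android bot, skin tone $skinTone, $caption")     // final generation step
}
```

<p>Swapping the stand-ins for the real Gemini and Imagen calls keeps the same shape of pipeline.<\/p>
<p>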
In the meantime, don&#8217;t be surprised if the sample app puts out some interesting-looking examples!<\/p>\n<h2><span style=\"font-size: x-large;\">How does the Androidify app work?<\/span><\/h2>\n<p>The app leverages our best practices for Architecture, Testing, and UI to showcase a real-world, modern AI application on device.<\/p>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"Flow chart describing Androidify app flow\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEicZhSlnTWz7A9-stUVfbB-rbgpv38xFKgCjJSYoggrrfARVC2YFtyqNYW5_J3LMq20zfqeADQh52q6DwxBUBC455yKX3WctM23xfh-QlhvZBGoY6Q76UKr44ReVU5QWWNQ2P-IFW86caPOh5dIrIrNGBzQthLv7EU9cOCDCHMERjIUMyOg_h97t7q1GK8\/s16000\/androidify-app-flow-architecture.png\"\/><\/div>\n<p><imgcaption><center><em>Androidify app flow chart detailing how the app works with AI<\/em><\/center><\/imgcaption><\/image><\/p>\n<h2><span style=\"font-size: x-large;\">AI in Androidify with Gemini and ML Kit<\/span><\/h2>\n<p>The Androidify app uses the Gemini models in a multitude of ways to enrich the app experience, all powered by the <a href=\"https:\/\/firebase.google.com\/docs\/vertex-ai\" target=\"_blank\" rel=\"noopener\">Firebase AI Logic SDK<\/a>. The app uses Gemini 2.5 Flash and Imagen 3 under the hood:<\/p>\n<ul>\n<ul>\n<li><b>Image validation:<\/b> We ensure that the captured image contains sufficient information, such as a clearly focused person, and we assess the image for safety. 
This feature uses the multi-modal capabilities of the Gemini API, by giving it a prompt and an image at the same time:<\/li>\n<\/ul>\n<\/ul>\n<p><!--Kotlin--><\/p>\n<div style=\"background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;\">\n<pre style=\"line-height: 125%; margin: 0px;\"><span style=\"color: green; font-weight: bold;\">val<\/span> response = generativeModel.generateContent(\n   content {\n       text(prompt)\n       image(image)\n   },\n)\n<\/pre>\n<\/div>\n<p><\/p>\n<ul>\n<ul>\n<li><b>Text prompt validation:<\/b> If the user opts for text input instead of an image, we use Gemini 2.5 Flash to ensure the text contains a sufficiently descriptive prompt to generate a bot.<\/li>\n<\/ul>\n<p><\/p>\n<ul>\n<li><b>Image captioning:<\/b> Once we\u2019re sure the image has enough information, we use Gemini 2.5 Flash to perform image captioning. We ask Gemini to be as descriptive as possible, focusing on the clothing and its colors.<\/li>\n<\/ul>\n<p><\/p>\n<ul>\n<li><b>\u201cHelp me write\u201d feature:<\/b> Similar to an \u201cI\u2019m feeling lucky\u201d type feature, \u201cHelp me write\u201d uses Gemini 2.5 Flash to create a random description of the clothing and hairstyle of a bot.<\/li>\n<\/ul>\n<p><\/p>\n<ul>\n<li><b>Image generation from the generated prompt:<\/b> As the final step, Imagen generates the image from the generated prompt and the selected skin tone of the bot.<\/li>\n<\/ul>\n<\/ul>\n<p>The app also uses <a href=\"https:\/\/developers.google.com\/ml-kit\/vision\/pose-detection\" target=\"_blank\" rel=\"noopener\">ML Kit pose detection<\/a> to detect a person in the viewfinder and enable the capture button when a person is detected, as well as adding fun indicators around the content to indicate detection.<\/p>\n<p>Explore more detailed information about <a href=\"https:\/\/android-developers.googleblog.com\/2025\/05\/androidify-how-androidify-leverages-gemini-firebase-ml-kit.html\" target=\"_blank\" rel=\"noopener\">AI usage in 
Androidify<\/a>.<\/p>\n<h2><span style=\"font-size: x-large;\">Jetpack Compose<\/span><\/h2>\n<p>The user interface of Androidify is built using Jetpack Compose, the modern UI toolkit that simplifies and accelerates UI development on Android.<\/p>\n<h3><span style=\"font-size: x-large;\">Delightful details with the UI<\/span><\/h3>\n<p>The app uses <a href=\"https:\/\/m3.material.io\/blog\/building-with-m3-expressive?utm_source=blog&amp;utm_medium=motion&amp;utm_campaign=IO25\" target=\"_blank\" rel=\"noopener\">Material 3 Expressive<\/a>, the latest alpha release that makes your apps more premium, desirable, and engaging. It provides delightful bits of UI out-of-the-box, like new shapes and componentry, and we use the <span style=\"color: #0d904f; font-family: courier;\">MotionScheme<\/span> variables wherever a motion spec is needed.<\/p>\n<p><span style=\"font-family: courier;\"><a href=\"https:\/\/developer.android.com\/reference\/kotlin\/androidx\/compose\/material3\/MaterialShapes\" target=\"_blank\" rel=\"noopener\">MaterialShapes<\/a><\/span> are used in various locations. 
These are a preset list of shapes that allow for easy morphing between each other\u2014for example, the cute cookie shape for the camera capture button:<\/p>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"Androidify app UI showing camera button\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEjtTIuUO9OlzwfseVWyceNrYI5DWDcvyMZosXPJr1I5uN8X5vFGjpSQtSCqIGDVZpD8ocxogmtPfKftJb8jSjUt7E4dndHDue4x2Yvq8mwC2JkVuqyeGYGOOUEvSqk7koc7mUDXMTC4FYQYnvsi_ioJu4UClFadiv_HKZ7mLioCLHr3DpvnYrf_kbwq_f4\/s1600\/camera-button-materialshapes-androidify-google-io.png\" width=\"40%\"\/><\/div>\n<p><imgcaption><center><em>Camera button with a <span style=\"color: #0d904f; font-family: courier;\">MaterialShapes.Cookie9Sided<\/span> shape<\/em><\/center><\/imgcaption><\/image><\/p>\n<p>Beyond using the standard Material components, Androidify also features custom composables and delightful transitions tailored to the specific needs of the app:<\/p>\n<ul>\n<ul>\n<li>There are plenty of shared element transitions across the app\u2014for example, a morphing shape shared element transition is performed between the \u201ctake a photo\u201d button and the camera surface.<\/li>\n<\/ul>\n<p><\/p>\n<ul>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"moving example of expressive button shapes in slow motion\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEhWVTQUwtQo13XnUS8EByVxH7Ou3rUWqyvY2BpsJh-Ub6ZA4_S3Z3yPZORg9TAUZvHNsaNjhUsAq3bcGZKDStCaeUCzcqxBDu4zwcalJ9ZSFKcyJyqocu6YVvv3yMhNuz-getfnHY5uVD6royUWUm58nBPCwQmA30uMUqopjL_MhRWYn10yeJgsf-yfiV0\/s1600\/morph-shared-element-androidify-google-io.gif\" width=\"40%\"\/><\/div>\n<p><\/image><\/p>\n<li>Custom enter transitions for the <span style=\"font-family: courier;\"><a 
href=\"https:\/\/github.com\/android\/androidify\/blob\/169b2d521b0743af765e8d52dd714029d4bf24cc\/feature\/results\/src\/main\/java\/com\/android\/developers\/androidify\/results\/ResultsScreen.kt#L101\" target=\"_blank\" rel=\"noopener\">ResultsScreen<\/a><\/span> with the usage of marquee modifiers.<\/li>\n<\/ul>\n<p><\/p>\n<ul>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"animated marquee example\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEivftZEN51EFmOej_1nfp-7zrGWUxbJx7nEx5GgVAtjlK9t4f49VmKE0RSOWP6zhkhMTaj7fnYq5SH5su_5hzxCGoX9wXiyVEROOmPAwIev5dk6O8Uw_cIcPmqERVEA48vD4jcN3pSJvzHhLEchXU5h5locO8qOQFGO-eErP-oXSWU017BaF7Q3Auvjlh4\/s1600\/end-screen-animation-androidify-google-io.gif\" width=\"40%\"\/><\/div>\n<p><\/image><\/p>\n<li>Fun color splash animation as a transition between screens.<\/li>\n<\/ul>\n<p><\/p>\n<ul>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"moving image of a blue color splash transition between Androidify demo screens\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEi_5rJvIIo6ioe_Jo5dkrOD7OuZdo4xJHbmnIqr-kVB6pViOimjtcqZUnxZD2trpDTvoySZWcxx4RjqQEPYhlm1vfr8M_NY7Y9rl7ea7XOUneHLGlIxHWQBjIGKaZZ78SddX1UoUYPoG88fNWcr9t09ShrI_KFZcg8nZaUF59enD7sbcL5x5gsLl-jsQiA\/s1600\/color-splash-androidify-google-io.gif\" width=\"40%\"\/><\/div>\n<p><\/image><\/p>\n<li>Animating gradient buttons for the AI-powered actions.<\/li>\n<\/ul>\n<p><\/p>\n<ul>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"animated gradient button for AI powered actions example\" border=\"0\" id=\"imgCaption\" 
src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEj1RYgxeWS2GZA6Lk9JK0E74VBJBhHeySxP12qXhYwGLl42-o1pdKBCdQY_QUq5z0yW_eHnaRFqT5TiAqyjGJVZ2TqH8WO-QqqJYWhPWjrUJfHveBowy_h5ltkZ9xa54QuAS0e2pANkENbkg0qb5CaaU8Zp5u-ebYgbC-i6HJ_ACzG_sdS6q1aH89cgAZk\/s1600\/animated-gradient-button-androidify-google-io.gif\" width=\"40%\"\/><\/div>\n<p><\/image><\/ul>\n<\/ul>\n<p> To learn more about the unique details of the UI, read <a href=\"https:\/\/android-developers.googleblog.com\/2025\/05\/androidify-building-delightful-ui-with-compose.html\" target=\"_blank\" rel=\"noopener\">Androidify: Building delightful UIs with Compose<\/a><\/p>\n<h2><span style=\"font-size: x-large;\">Adapting to different devices<\/span><\/h2>\n<p>Androidify is designed to look great and function seamlessly across candy bar phones, foldables, and tablets. The general goal of developing adaptive apps is to <b>avoid reimplementing the same app multiple times on each form factor<\/b> by extracting out reusable composables, and leveraging APIs like <span style=\"color: #0d904f; font-family: courier;\">WindowSizeClass<\/span> to determine what kind of layout to display.<\/p>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"a collage of different adaptive layouts for the Androidify app across small and large screens\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEic1TH6JufLXDvVrsaIgTSbPHlP6OnoAj5ZXxbUIr7bVTgGclDnLVJQcp8yPshtg8rrWQfFRG6W37kYYKwNyBiKzbkX2Z-z2WMuDzSHCmmt9fqvqTDObJWIIrWu0v_GznhdgALR8gcJfLjOMyBRB5g8anHmdwyqJyYWx9sAFmwbpu47HiSfQXZVknl5pow\/s16000\/adaptive-layouts-androidify-google-io.png\"\/><\/div>\n<p><imgcaption><center><em>Various adaptive layouts in the app<\/em><\/center><\/imgcaption><\/image><\/p>\n<p>For Androidify, we only needed to leverage the width window size class. 
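<\/p>
<p>As a plain-Kotlin sketch of the width buckets involved: the 600dp and 840dp breakpoints below match the standard window size class boundaries, and the helper mirrors the spirit of the app\u2019s isAtLeastMedium() check (the names here are illustrative, not the real androidx API):<\/p>

```kotlin
// Width-based window size buckets; 600dp and 840dp are the standard
// compact/medium/expanded breakpoints. Names are illustrative stand-ins.
enum class WidthSizeClass { COMPACT, MEDIUM, EXPANDED }

fun widthSizeClassOf(widthDp: Int): WidthSizeClass = when {
    widthDp < 600 -> WidthSizeClass.COMPACT   // typical phone in portrait
    widthDp < 840 -> WidthSizeClass.MEDIUM    // unfolded foldable, small tablet
    else -> WidthSizeClass.EXPANDED           // large tablet, desktop window
}

// In the spirit of the app's isAtLeastMedium(): e.g. choose a side-by-side
// Row layout instead of a ModalBottomSheet for the color picker.
fun isAtLeastMedium(widthDp: Int): Boolean =
    widthSizeClassOf(widthDp) != WidthSizeClass.COMPACT
```

<p>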
Combining this with different layout mechanisms, we were able to reuse or extend the composables to cater to the multitude of different device sizes and capabilities.<\/p>\n<ul>\n<ul>\n<li><b>Responsive layouts:<\/b> The <span style=\"font-family: courier;\"><a href=\"https:\/\/github.com\/android\/androidify\/blob\/169b2d521b0743af765e8d52dd714029d4bf24cc\/feature\/creation\/src\/main\/java\/com\/android\/developers\/androidify\/creation\/CreationScreen.kt#L156\" target=\"_blank\" rel=\"noopener\">CreationScreen<\/a><\/span> demonstrates adaptive design. It uses helper functions like <span style=\"color: #0d904f; font-family: courier;\">isAtLeastMedium()<\/span> to detect window size categories and adjust its layout accordingly. On larger windows, the image\/prompt area and color picker might sit side-by-side in a <span style=\"color: #0d904f; font-family: courier;\">Row<\/span>, while on smaller windows, the color picker is accessed via a <span style=\"color: #0d904f; font-family: courier;\">ModalBottomSheet<\/span>. This pattern, called \u201csupporting pane\u201d, reflects the dependency between the main content and the supporting color picker.<\/li>\n<\/ul>\n<p><\/p>\n<ul>\n<li><b>Foldable support:<\/b> The app actively checks for foldable device features. 
The camera screen uses <span style=\"color: #0d904f; font-family: courier;\">WindowInfoTracker<\/span> to get <span style=\"color: #0d904f; font-family: courier;\">FoldingFeature<\/span> information and adapt the layout accordingly, for example optimizing for tabletop posture.<\/li>\n<\/ul>\n<p><\/p>\n<ul>\n<li><b>Rear display:<\/b> Support for devices with multiple displays is included via the <span style=\"font-family: courier;\"><a href=\"https:\/\/github.com\/android\/androidify\/blob\/169b2d521b0743af765e8d52dd714029d4bf24cc\/feature\/camera\/src\/main\/java\/com\/android\/developers\/androidify\/camera\/RearCameraUseCase.kt#L42\" target=\"_blank\" rel=\"noopener\">RearCameraUseCase<\/a><\/span>, allowing the device camera preview to be shown on the external screen when the device is unfolded (so the main content is usually displayed on the internal screen).<\/li>\n<\/ul>\n<\/ul>\n<p>Using window size classes, coupled with creating a custom <span style=\"font-family: courier;\"><a href=\"https:\/\/github.com\/android\/androidify\/blob\/169b2d521b0743af765e8d52dd714029d4bf24cc\/core\/util\/src\/main\/java\/com\/android\/developers\/androidify\/util\/AdaptivePreview.kt#L59\" target=\"_blank\" rel=\"noopener\">@LargeScreensPreview<\/a><\/span> annotation, helps achieve unique and useful UIs across the spectrum of device sizes and window sizes.<\/p>\n<h2><span style=\"font-size: x-large;\">CameraX and Media3 Compose<\/span><\/h2>\n<p>To allow users to base their bots on photos, Androidify integrates <a href=\"https:\/\/developer.android.com\/media\/camera\/camerax\" target=\"_blank\" rel=\"noopener\">CameraX<\/a>, the Jetpack library that makes camera app development easier.<\/p>\n<p>The app uses a custom <span style=\"font-family: courier;\"><a href=\"https:\/\/github.com\/android\/androidify\/blob\/169b2d521b0743af765e8d52dd714029d4bf24cc\/feature\/camera\/src\/main\/java\/com\/android\/developers\/androidify\/camera\/CameraLayout.kt#L52\" target=\"_blank\" 
rel=\"noopener\">CameraLayout<\/a><\/span> composable that supports the layout of the typical composables that a camera preview screen would include\u2014 for example, zoom buttons, a capture button, and a flip camera button. This layout adapts to different device sizes and more advanced use cases, like the tabletop mode and rear-camera display. For the actual rendering of the camera preview, it uses the new <span style=\"font-family: courier;\"><a href=\"https:\/\/developer.android.com\/reference\/kotlin\/androidx\/camera\/compose\/package-summary#CameraXViewfinder%28androidx.camera.core.SurfaceRequest,androidx.compose.ui.Modifier,androidx.camera.viewfinder.core.ImplementationMode,androidx.camera.viewfinder.compose.MutableCoordinateTransformer%29\" target=\"_blank\" rel=\"noopener\">CameraXViewfinder<\/a><\/span> that is part of the <span style=\"color: #0d904f; font-family: courier;\">camerax-compose<\/span> artifact.<\/p>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"CameraLayout in Compose\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEgI8uGN5mVP0rjmzH6UdJm1lDfrQPg52kArxQaz0t41xh-3cM7dlbS1u6um4bPSIEhUiX8NorFs_tOI_V6K-rd8b7cEUJWIbOzbBlYOoPqz_lfIhfEakWDiqdHG1xQnhJLXvyeVgowNWqTCr3okJ4CiJaKSGzAaAw9ThkhRcIPk6trKZuh6BOCEzdfXb-s\/s16000\/cameralayout-camerax-media3-compose-androidify-google-io.png\"\/><\/div>\n<p><imgcaption><center><em><span style=\"color: #0d904f; font-family: courier;\">CameraLayout<\/span> composable that takes care of different device configurations, such as table top mode<\/em><\/center><\/imgcaption><\/image><br \/><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"CameraLayout in Compose\" border=\"0\" id=\"imgCaption\" 
src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEjvgaQZTySov3PINb0Hce1XNxW6cbhtExTgKwZZbUQ9XYGGuMilMhqEbiVp6W73Bf-AP_uFKbGEYtav1nChf-zTuXCBHzxXkM_jxP4TvGOnTehRq0sJ5zlhqBAPzFbnLNTzr2IHoRAiWLN0Mil9VPC-44IYONEyFJiDece2PCDuFPF6XXPABYlbe2I7KFQ\/s16000\/adaptive-camera-layouts-androidify-google-io.png\"\/><\/div>\n<p><imgcaption><center><em><span style=\"color: #0d904f; font-family: courier;\">CameraLayout<\/span> composable that takes care of different device configurations, such as table top mode<\/em><\/center><\/imgcaption><\/image><\/p>\n<p>The app also integrates with <a href=\"https:\/\/developer.android.com\/media\/media3\" target=\"_blank\" rel=\"noopener\">Media3<\/a> APIs to load an instructional video showing how to get the best bot from a prompt or image. Using the new <span style=\"color: #0d904f; font-family: courier;\">media3-ui-compose<\/span> artifact, we can easily add a <span style=\"font-family: courier;\"><a href=\"https:\/\/github.com\/android\/androidify\/blob\/169b2d521b0743af765e8d52dd714029d4bf24cc\/feature\/home\/src\/main\/java\/com\/android\/developers\/androidify\/home\/HomeScreen.kt#L517\" target=\"_blank\" rel=\"noopener\">VideoPlayer<\/a><\/span> into the app:<\/p>\n<p><!-- Kotlin --><\/p>\n<div style=\"background: #f8f8f8; overflow:auto;width:auto;border:0;\">\n<pre style=\"margin: 0; line-height: 125%\">@Composable\n<span style=\"color: #008000; font-weight: bold\">private<\/span> <span style=\"color: #008000; font-weight: bold\">fun<\/span> <span style=\"color: #0000FF\">VideoPlayer<\/span>(modifier: Modifier = Modifier) {\n    <span style=\"color: #008000; font-weight: bold\">val<\/span> context = LocalContext.current\n    <span style=\"color: #008000; font-weight: bold\">var<\/span> player by remember { mutableStateOf&lt;Player?&gt;(<span style=\"color: #008000; font-weight: bold\">null<\/span>) }\n    LifecycleStartEffect(Unit) {\n        player = ExoPlayer.Builder(context).build().apply {\n            setMediaItem(MediaItem.fromUri(Constants.PROMO_VIDEO))\n            repeatMode = Player.REPEAT_MODE_ONE\n            prepare()\n        }\n        onStopOrDispose {\n            player?.release()\n            player = <span style=\"color: #008000; font-weight: bold\">null<\/span>\n        }\n    }\n    Box(\n        modifier\n            .background(MaterialTheme.colorScheme.surfaceContainerLowest),\n    ) {\n        player?.let { currentPlayer -&gt;\n            PlayerSurface(currentPlayer, surfaceType = SURFACE_TYPE_TEXTURE_VIEW)\n        }\n    }\n}\n<\/pre>\n<\/div>\n<p>Using the new <span style=\"font-family: courier;\">onLayoutRectChanged<\/span> modifier, we also listen for whether the composable is completely visible or not, and play or pause the video based on this information:<\/p>\n<p><!-- Kotlin --><\/p>\n<div style=\"background: #f8f8f8; overflow:auto;width:auto;border:0;\">\n<pre style=\"margin: 0; line-height: 125%\"><span style=\"color: #008000; font-weight: bold\">var<\/span> videoFullyOnScreen by remember { mutableStateOf(<span style=\"color: #008000; font-weight: bold\">false<\/span>) }\n\nLaunchedEffect(videoFullyOnScreen) {\n     <span style=\"color: #008000; font-weight: bold\">if<\/span> (videoFullyOnScreen) currentPlayer.play() <span style=\"color: #008000; font-weight: bold\">else<\/span> currentPlayer.pause()\n}\n\n<span style=\"color: #408080; font-style: italic\">\/\/ We add this to the player composable to determine whether the video composable is visible, and mutate the videoFullyOnScreen variable, which then toggles the player state. 
<\/span>\nModifier.onVisibilityChanged(\n                containerWidth = LocalView.current.width,\n                containerHeight = LocalView.current.height,\n) { fullyVisible -&gt; videoFullyOnScreen = fullyVisible }\n\n<span style=\"color: #408080; font-style: italic\">\/\/ A simple version of visibility changed detection<\/span>\n<span style=\"color: #008000; font-weight: bold\">fun<\/span> Modifier.onVisibilityChanged(\n    containerWidth: Int,\n    containerHeight: Int,\n    onChanged: (visible: Boolean) -&gt; Unit,\n) = <span style=\"color: #008000; font-weight: bold\">this<\/span> then Modifier.onLayoutRectChanged(<span style=\"color: #666666\">100<\/span>, <span style=\"color: #666666\">0<\/span>) { layoutBounds -&gt;\n    onChanged(\n        layoutBounds.boundsInRoot.top &gt; <span style=\"color: #666666\">0<\/span> &amp;&amp;\n            layoutBounds.boundsInRoot.bottom &lt; containerHeight &amp;&amp;\n            layoutBounds.boundsInRoot.left &gt; <span style=\"color: #666666\">0<\/span> &amp;&amp;\n            layoutBounds.boundsInRoot.right &lt; containerWidth,\n    )\n}\n<\/pre>\n<\/div>\n<p>Additionally, using <span style=\"color: #0d904f; font-family: courier;\">rememberPlayPauseButtonState<\/span>, we add on a layer on top of the player to offer a play\/pause button on the video itself:<\/p>\n<p><!-- Kotlin --><\/p>\n<div style=\"background: #f8f8f8; overflow:auto;width:auto;border:0;\">\n<pre style=\"margin: 0; line-height: 125%\"><span style=\"color: #008000; font-weight: bold\">val<\/span> playPauseButtonState = rememberPlayPauseButtonState(currentPlayer)\n            OutlinedIconButton(\n                onClick = playPauseButtonState::onClick,\n                enabled = playPauseButtonState.isEnabled,\n            ) {\n                <span style=\"color: #008000; font-weight: bold\">val<\/span> icon =\n                    <span style=\"color: #008000; font-weight: bold\">if<\/span> (playPauseButtonState.showPlay) R.drawable.play <span 
style=\"color: #008000; font-weight: bold\">else<\/span> R.drawable.pause\n                <span style=\"color: #008000; font-weight: bold\">val<\/span> contentDescription =\n                    <span style=\"color: #008000; font-weight: bold\">if<\/span> (playPauseButtonState.showPlay) R.string.play <span style=\"color: #008000; font-weight: bold\">else<\/span> R.string.pause\n                Icon(\n                    painterResource(icon),\n                    stringResource(contentDescription),\n                )\n            }\n<\/pre>\n<\/div>\n<p>Check out the code for more details on <a href=\"http:\/\/github.com\/android\/androidify\" target=\"_blank\" rel=\"noopener\">how CameraX and Media3 were used in Androidify<\/a>. <\/p>\n<h2><span style=\"font-size: x-large;\">Navigation 3<\/span><\/h2>\n<p>Screen transitions are handled using the new Jetpack Navigation 3 library <span style=\"color: #0d904f; font-family: courier;\">androidx.navigation3<\/span>. The <span style=\"font-family: courier;\"><a href=\"https:\/\/github.com\/android\/androidify\/blob\/169b2d521b0743af765e8d52dd714029d4bf24cc\/app\/src\/main\/java\/com\/android\/developers\/androidify\/navigation\/MainNavigation.kt#L63\" target=\"_blank\" rel=\"noopener\">MainNavigation<\/a><\/span> composable defines the different destinations (Home, Camera, Creation, About) and displays the content associated with each destination  using <span style=\"color: #0d904f; font-family: courier;\">NavDisplay<\/span>. 
You get full control over your back stack, and navigating to and from destinations is as simple as adding and removing items from a list.<\/p>\n<p><!-- Kotlin --><\/p>\n<div style=\"background: #f8f8f8; overflow:auto;width:auto;border:0;\">\n<pre style=\"margin: 0; line-height: 125%\">@Composable\n<span style=\"color: #008000; font-weight: bold\">fun<\/span> <span style=\"color: #0000FF\">MainNavigation<\/span>() {\n   <span style=\"color: #008000; font-weight: bold\">val<\/span> backStack = rememberMutableStateListOf&lt;NavigationRoute&gt;(Home)\n   NavDisplay(\n       backStack = backStack,\n       onBack = { backStack.removeLastOrNull() },\n       entryProvider = entryProvider {\n           entry&lt;Home&gt; { entry -&gt;\n               HomeScreen(\n                   onAboutClicked = {\n                       backStack.add(About)\n                   },\n               )\n           }\n           entry&lt;Camera&gt; {\n               CameraPreviewScreen(\n                   onImageCaptured = { uri -&gt;\n                       backStack.add(Create(uri.toString()))\n                   },\n               )\n           }\n           <span style=\"color: #408080; font-style: italic\">\/\/ etc<\/span>\n       },\n   )\n}\n<\/pre>\n<\/div>\n<p>Notably, Navigation 3 exposes a new composition local, <span style=\"color: #0d904f; font-family: courier;\">LocalNavAnimatedContentScope<\/span>, to easily integrate your shared element transitions without needing to keep track of the scope yourself. 
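<\/p>
<p>The back-stack-as-a-list model can be sketched in plain Kotlin; the route types below are illustrative stand-ins for the app\u2019s NavigationRoute classes rather than real Navigation 3 API:<\/p>

```kotlin
// Minimal model of Navigation 3's list-based back stack; these route types
// are illustrative stand-ins, not the real Androidify or Navigation 3 classes.
sealed interface NavigationRoute
object Home : NavigationRoute
object About : NavigationRoute
data class Create(val imageUri: String) : NavigationRoute

// The back stack starts at Home; the top of the stack is the current screen.
val backStack = mutableListOf<NavigationRoute>(Home)

// Navigating forward is just appending to the list.
fun navigateTo(route: NavigationRoute) = backStack.add(route)

// Going back pops the top destination, keeping at least the root on the stack.
fun goBack(): NavigationRoute? =
    if (backStack.size > 1) backStack.removeAt(backStack.lastIndex) else null
```

<p>A NavDisplay observing such a list re-renders whenever items are added or removed, which is why navigation reduces to list manipulation.<\/p>
<p>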
By default, Navigation 3 also integrates with predictive back, providing delightful back experiences when navigating between screens, as seen in this prior shared element transition:<\/p>\n<p><image><\/p>\n<div style=\"text-align: center;\"><img decoding=\"async\" alt=\"CameraLayout in Compose\" border=\"0\" id=\"imgCaption\" src=\"https:\/\/blogger.googleusercontent.com\/img\/b\/R29vZ2xl\/AVvXsEjMN81Kqzol9hbhrH4CTbCUz5rKw1FkQkYxst1uZFtFarE3CsVcYgc_nuLOlhgmJLphHovpOZB2eTw0lEdi71qEWks1hARucQKbnmOlcsLgrsUwnPftJS0N6Uex_ZYECqWumfUdg-GQcTFxttRhFJkzY2pKKQVrwsIdMieNqX0Sz9WgvdOain2WIFoeKII\/s1600\/shared-element-transition-androidify-app-google-io.gif\" width=\"40%\"\/><\/div>\n<p><\/image><\/p>\n<p>Learn more about <a href=\"http:\/\/goo.gle\/nav3\" target=\"_blank\" rel=\"noopener\">Jetpack Navigation 3, currently in alpha<\/a>.<\/p>\n<h2><span style=\"font-size: x-large;\">Learn more<\/span><\/h2>\n<p>By combining the declarative power of Jetpack Compose, the camera capabilities of CameraX, the intelligent features of Gemini, and thoughtful adaptive design, Androidify is a personalized avatar creation experience that feels right at home on any Android device. 
You can find the full code sample at <a href=\"http:\/\/github.com\/android\/androidify\" target=\"_blank\" rel=\"noopener\">github.com\/android\/androidify<\/a> where you can see the app in action and be inspired to build your own AI-powered app experiences.<\/p>\n<p>Explore this announcement and all Google I\/O 2025 updates on <a href=\"https:\/\/io.google\/2025\/?utm_source=blogpost&amp;utm_medium=pr&amp;utm_campaign=event&amp;utm_content=\" target=\"_blank\" rel=\"noopener\">io.google<\/a> starting May 22.<\/p>\n<\/div>\n<p><a href=\"http:\/\/android-developers.googleblog.com\/2025\/05\/androidify-building-ai-driven-experiences-jetpack-compose-gemini-camerax.html\">Source link<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Android bot is a beloved mascot for Android users and developers, with previous versions of the bot builder being very popular &#8211; we<\/p>\n","protected":false},"author":1,"featured_media":277876,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"_uf_show_specific_survey":0,"_uf_disable_surveys":false,"footnotes":""},"categories":[146],"tags":[],"_links":{"self":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/277875"}],"collection":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/comments?post=277875"}],"version-history":[{"count":0,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/posts\/277875\/r
evisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/media\/277876"}],"wp:attachment":[{"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/media?parent=277875"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/categories?post=277875"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/michigandigitalnews.com\/index.php\/wp-json\/wp\/v2\/tags?post=277875"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}