Now In Android #123

Friday, December 19, 2025 · Daniel Galpin
Featured: android-xr, android-studio, jetpack-compose, now-in-android

Android XR, the Android Studio Otter 2 feature drop, Android 16 QPR2, Jetpack Compose updates, Jetpack Navigation 3, performance, and much, much, much more.

Welcome to Now in Android, your ongoing guide to what’s new and notable in the world of Android development.

You can catch a short subset of what’s in this gigantified update on YouTube, but read on for the full story.

Start building for glasses, new devices for Android XR and more in The Android Show | XR Edition 👓

The Android Show | XR Edition introduced updates to the Android XR platform, focusing on new devices and developer tools. The platform is expanding to include lightweight AI and Display AI glasses from Samsung, Gentle Monster, and Warby Parker, integrating Gemini for features like live translation and visual search. Uber is exploring AI Glasses for contextual directions. Wired XR glasses, such as XREAL’s Project Aura, are scheduled for release next year.

Android XR SDK Developer Preview 3 offers increased stability for headset APIs and opens development for AI Glasses. This includes new libraries like Jetpack Compose Glimmer for transparent display UI and Jetpack Projected for extending your mobile apps to glasses. ARCore for Jetpack XR gains Geospatial capabilities, and new APIs enable detection of device field-of-view for adaptive UIs.

The platform, built on OpenXR, supports Unreal Engine development with a Google vendor plugin for hand tracking coming next year, and Godot Engine now includes Android XR support via its OpenXR vendor plugin v4.2.2 stable.

Check out #TheAndroidShow in 60 seconds for a quick video overview of what we covered.

Build for AI Glasses with the Android XR SDK Developer Preview 3 and unlock new features for immersive experiences 🚀

Android XR SDK Developer Preview 3 is now available, enabling you to build augmented experiences for AI Glasses in addition to immersive experiences for XR Headsets.

Key updates include:

  1. For AI Glasses: New tools and libraries such as Jetpack Projected for accessing sensors, speakers, and displays; Jetpack Compose Glimmer with UI components optimized for display AI Glasses; and an AI Glasses emulator within Android Studio. ARCore for Jetpack XR now supports motion tracking and geospatial capabilities for augmented experiences on AI Glasses.
  2. For Immersive Experiences (XR Headsets and XR Glasses): Increased stability for headset APIs, Geospatial capabilities in ARCore for Jetpack XR, and new APIs that detect device field-of-view for adaptive UIs.

To begin building, update to the Android Studio Canary channel (Otter 3 Canary 4 or later) and emulator version 36.4.3 Canary or later, then visit developer.android.com/xr for libraries and samples.

Check out The Android Show XR Edition Recap to get caught up.

Android Studio Otter 2 Feature Drop is stable! 🚀

The Android Studio Otter 2 Feature Drop is now stable. This release introduces updates to Agent Mode, including the Android Knowledge Base for improved accuracy and the option to use the Gemini 3 model. You can now use Backup and Sync to maintain consistent IDE settings across your machines and opt-in to receive communications from the Android Studio team. Additionally, this release incorporates stability and performance enhancements from the IntelliJ IDEA 2025.2 platform, such as Kotlin compiler and terminal improvements.

We’ve released a bunch of shorts highlighting important Otter 2 features, including the Gemini 3 model for AI assistance in Android Studio, Agent Mode’s Android knowledge, and Android Studio Otter Backup and Sync. Also check out Top 4 agentic experiences for Gemini in Android Studio and What’s new in Android Studio’s AI Agent.

Android 16 QPR2 is Released 🚀

Android 16 QPR2 has been released as the platform’s first minor SDK release. It aims to accelerate innovation by delivering new APIs and features between major yearly platform releases.

Key updates include:

  • Minor SDK Release: You can now check for new APIs at runtime using SDK_INT_FULL and VERSION_CODES_FULL in the Build class.
  • Expanded Dark Theme: This feature provides an option to invert apps that do not have a native dark theme, intended as an accessibility feature. You should declare isLightTheme="false" in your dark theme if your app does not inherit from standard DayNight themes to prevent unintended inversion.
  • Custom Icon Shapes & Auto-Theming: Users can select custom shapes for app icons, and the system can automatically generate themed icons if your app does not provide one.
  • Interactive Chooser Sessions: The sharing experience now supports real-time content updates within the Chooser, keeping the UI interactive.
  • Linux Development Environment with GUI Applications: You can now run Linux GUI applications directly within the terminal environment.
  • Generational Garbage Collection: The Android Runtime (ART) includes a Generational Concurrent Mark-Compact (CMC) Garbage Collector to reduce CPU usage and improve battery efficiency.
  • Widget Engagement Metrics: You can query user interaction events such as clicks, scrolls, and impressions for your widgets.
  • 16KB Page Size Readiness: Debuggable apps not 16KB page-aligned will receive early warning dialogs.
  • IAMF and Audio Sharing: Software decoding support for Immersive Audio Model and Formats (IAMF) is added, and Personal Audio Sharing for Bluetooth LE Audio is integrated into the system Output Switcher.
  • Health Connect Updates: Health Connect automatically tracks steps using device sensors, and you can now track weight, set index, and Rate of Perceived Exertion (RPE) in exercise segments.
  • Smoother Migrations: A new Data Transfer API enables data migration between Android and iOS devices.
  • Developer Verification: APIs support developer verification during app installation, with ADB commands available to simulate outcomes.
  • SMS OTP Protection: The delivery of messages containing an SMS retriever hash is delayed for most apps by three hours to help prevent OTP hijacking.
  • Secure Lock Device: A new system-level security state locks the device immediately, requiring the primary PIN, pattern, or password to unlock and temporarily disabling biometric unlock.
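For the Expanded Dark Theme opt-out, a minimal sketch of the explicit declaration might look like this (the theme name and parent are placeholders, assuming a Material 3 dark parent theme):

```xml
<!-- res/values-night/themes.xml -->
<!-- Declaring isLightTheme="false" tells the system this theme is already
     dark, so Expanded Dark Theme won't invert it. "Theme.MyApp" and the
     parent theme are illustrative. -->
<style name="Theme.MyApp" parent="Theme.Material3.Dark.NoActionBar">
    <item name="android:isLightTheme">false</item>
</style>
```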

To get started, you can get the Android 16 QPR2 release on your Pixel device, or use 64-bit system images with the Android Emulator in Android Studio. Using the latest Canary build of Android Studio Otter is recommended.

What’s new in the Jetpack Compose December ’25 release 🚀

The Jetpack Compose December ’25 release is now stable, including core Compose modules version 1.10 and Material 3 version 1.4. To use this release, update your Compose BOM version to 2025.12.00.
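Updating the BOM is a one-line change in a Gradle Kotlin DSL build file; a sketch (the artifact lines beyond the BOM are the usual Compose modules, shown for illustration — your dependency list will differ):

```kotlin
// app/build.gradle.kts
dependencies {
    // The BOM pins all Compose artifacts to the December '25 release set
    implementation(platform("androidx.compose:compose-bom:2025.12.00"))
    // Versions for these are supplied by the BOM
    implementation("androidx.compose.ui:ui")
    implementation("androidx.compose.material3:material3")
}
```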

Key updates include:

  • Performance Improvements: Scroll performance now matches Views, with pausable composition in lazy prefetch enabled by default to reduce jank. Further optimizations improve Modifier.onPlaced and Modifier.onVisibilityChanged performance.

New Features:

  1. The retain API helps persist non-serializable state across configuration changes, useful for objects like media players.
  2. Material 3 1.4 adds an experimental TextFieldState for TextField, new SecureTextField variants, autoSize support for Text, a HorizontalCenteredHeroCarousel variant, TimePicker input mode switching, and a vertical drag handle for adaptive panes.
  3. Animation features include dynamic shared elements, allowing you to control sharedElement() and sharedBounds() animation transitions via SharedContentConfig’s isEnabled property.
  4. Modifier.skipToLookaheadPosition() helps create “reveal” type shared element animations by preserving a composable’s final position.
  5. A new prepareTransitionWithInitialVelocity API supports passing initial gesture velocity to shared element transitions.
  6. An experimental veil option for EnterTransition and ExitTransition lets you specify a color to scrim content during animations.

Tools: Android Studio adds Transform UI for natural language design iteration, the ability to generate @Preview for composables, customized Material Symbols in the Vector Asset wizard, code generation from screenshots using Gemini (with remote MCP support), and UI quality issue fixes.

Upcoming Changes: Modifier.onFirstVisible will be deprecated in Compose 1.11 due to non-deterministic behavior; migrate to Modifier.onVisibilityChanged. Coroutine dispatch in tests will shift to StandardTestDispatcher by default in a future release to align with production behavior; you can opt-in now using effectContext = StandardTestDispatcher() in createComposeRule.

Jetpack Navigation 3 is stable 🚀

Jetpack Navigation 3 version 1.0 is now stable. This new navigation library is built to embrace Jetpack Compose state, offering full control over your back stack, helping you retain navigation state, and facilitating adaptive layouts. A cross-platform version is also available from JetBrains.

Developed to address the shift to reactive programming and declarative UI, Navigation 3 provides more flexibility and customizability compared to the original Jetpack Navigation (now Nav2) through smaller, decoupled APIs. For example, NavDisplay observes a list of keys backed by Compose state to update the UI. It also allows you to supply your own state for a single source of truth, lets you customize screen animations, and create flexible layouts with the Scenes API.

If you are currently using Navigation Compose with Nav2, you can consider migrating to Nav3. A migration guide is available that outlines key steps, including adding Nav3 dependencies, updating routes to implement NavKey, creating navigation state classes, and replacing NavController with these classes. You also move destinations from NavHost’s NavGraph into an entryProvider and replace NavHost with NavDisplay. You can experiment with an AI agent, like Gemini in Android Studio’s Agent Mode, for this migration by providing the markdown version of the guide as context.
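Nav3’s core idea — the back stack is an observable list of keys that you own — can be sketched with a plain Kotlin list. This is illustrative only: the real NavDisplay observes a list backed by Compose state, and the class names below are hypothetical stand-ins.

```kotlin
// Minimal model of Nav3's "you own the back stack" idea (not the real API).
sealed interface NavKey
data object Home : NavKey
data class Detail(val id: String) : NavKey

class BackStack(start: NavKey) {
    private val keys = mutableListOf(start)
    // NavDisplay would render the entry for the topmost key
    val current: NavKey get() = keys.last()
    fun push(key: NavKey) { keys.add(key) }
    // Pop fails once only the start destination remains.
    fun pop(): Boolean =
        if (keys.size > 1) { keys.removeAt(keys.size - 1); true } else false
}
```

Because the state is just a list you supply, it can serve as the single source of truth and be persisted or transformed however your app needs.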

For common navigation scenarios, a recipes repository is available, covering topics such as multiple back stacks, modularization, dependency injection, passing arguments to ViewModels, and returning results from screens. Deep links and Koin integration recipes are currently in development, and a Compose Multiplatform version of the recipes is also available.

To get started, you can refer to the official documentation and the recipes. You can file any issues you encounter in the issue tracker. We have lots of video content that can help, including a Navigation 3 API overview, our 3 things to know about Jetpack Navigation 3, a recording of our Navigation 3 #AskAndroid session, as well as shorts on Navigation 3 basics, How to animate screen transitions, and Implementing deep links.

Fully Optimized: Wrapping up Performance Spotlight Week 🚀

Performance Spotlight Week concluded with several announcements aimed at optimizing Android app performance.

You can now utilize the R8 optimizer for faster, smaller, and more stable apps, with updated documentation available. For instance, enabling R8 full mode has resulted in 40% faster cold startup and 30% fewer ANR errors for some apps.

Profile Guided Optimizations, including Baseline Profiles and Startup Profiles, can enhance startup speed, scrolling, animation, and rendering performance. Jetpack Compose 1.10 also introduced performance improvements like pausable composition and a customizable cache window for handling complex list items.

To measure performance, a new Performance Leveling Guide outlines a five-step journey, starting with data from Android Vitals and progressing to advanced local tooling like Jetpack Macrobenchmark and the UiAutomator 2.4 API for accurate measurement and verification.

Debugging tools received upgrades, including Automatic Logcat Retrace in Android Studio Narwhal to de-obfuscate stack traces automatically. New guidance on Narrow Keep Rules helps fix runtime crashes, supported by a lint check in Android Studio Otter 3. Additionally, new documentation and the Background Task Inspector offer insights into debugging WorkManager tasks and background work.

Performance optimization is an ongoing process, and the App Performance Score framework can help you integrate continuous improvements into your product roadmap.

You can learn more on video at App Performance Spotlight Week Overview, App Performance #AskAndroid, App performance improvements, and Boost Android app performance with the R8 optimizer.

Introducing CameraX 1.5: Powerful Video Recording and Pro-level Image Capture 📸

CameraX 1.5 introduces features for video recording and image capture, alongside core API enhancements.

For video, you can now capture slow-motion or high-frame-rate videos. The new Feature Group API enables combinations like 10-bit HDR and 60 FPS, supporting features such as HDR (HLG), 60 fps, Preview Stabilization, and Ultra HDR, with plans for 4K recording and ultra-wide zoom.

Concurrent Camera improvements allow binding Preview, ImageCapture, and VideoCapture concurrently, and applying CameraEffects in composition mode. Additionally, CameraX 1.5 includes dynamic audio muting during recording, improved insufficient storage error handling, and a low light boost for dark environments on supported devices.

For image capture, CameraX 1.5 adds support for capturing unprocessed, uncompressed DNG (RAW) files, either standalone or simultaneously with JPEG. You can also leverage Ultra HDR output when using Camera Extensions.

Core API changes include the new SessionConfig API, which centralizes camera setup, removes the need for manual unbind() calls when updating use cases or switching cameras, and provides deterministic frame rate control. The camera-compose library has reached stable version 1.5.1, addressing bugs and preview stretching. Other improvements include fine-grained control over torch strength (querying max strength and setting levels) and NV21 image format support in ImageAnalysis.

To access these features, update your dependencies to CameraX 1.5.1. You can join the CameraX developer discussion group or file bug reports for support.

Health Connect Jetpack v1.1.0 is now available! 🔗

The Health Connect Jetpack library has reached its 1.1.0 stable release, providing a foundation for health and fitness applications. This version incorporates features such as background reads for continuous data monitoring, historical data synchronization, and support for data types including Personal Health records, Exercise Routes, Training Plans, and Skin Temperature. The platform supports over 50 data types across various health and fitness categories.

Additionally, Health Connect is expanding its device type support, which will be available in version 1.2.0-alpha02. New supported device types include Consumer Medical Devices (e.g., Continuous Glucose Monitors, Blood Pressure Cuffs), Glasses (for smart glasses and head-mounted optical devices), Hearables (for earbuds, headphones, and hearing aids with sensing capabilities), and Fitness Machines (for stationary and outdoor equipment). This expansion aims to enhance data representation by specifying the source hardware.

You are encouraged to upgrade to the 1.1.0 library, review the official documentation and release notes for further details, and submit feedback or report issues via the public issue tracker.

ML Kit’s Prompt API: Unlock Custom On-Device Gemini Nano Experiences ✨

ML Kit has released the Alpha version of its GenAI Prompt API, enabling custom on-device Gemini Nano experiences. This API allows you to send natural language and multimodal requests to Gemini Nano, supporting use cases requiring more control and flexibility for generative models.

The Prompt API processes data locally, offering offline functionality and enhanced user privacy. Examples of its application include image understanding, intelligent document scanning, transforming data for UI, content prompting, content analysis, and information extraction.

Implementation involves a few lines of code, using Generation.getClient().generateContent() with optional parameters like temperature, topK, candidateCount, and maxOutputTokens. Detailed examples are available in the official documentation and a GitHub sample.

The API performs optimally on the Pixel 10 device series, which features Gemini Nano (nano-v3), built on the same architecture as Gemma 3n. Developers without a Pixel 10 can prototype features locally using Gemma 3n. Refer to the device support documentation for a comprehensive list of compatible devices.

Kakao Mobility utilized Gemini Nano via ML Kit’s GenAI Prompt API for two main functions:

  • Parking Assistance: It uses multimodal capabilities to detect improperly parked bikes and scooters on yellow tactile paving, reducing server costs and enhancing user privacy compared to cloud-based image recognition.
  • Improved Address Entry: For parcel delivery, it streamlines entity extraction from natural language order requests, which eliminated error-prone manual address entry by drivers.

The implementation of Gemini Nano on-device led to:

  • Cost savings by shifting AI processing from the cloud.
  • Enhanced user privacy by keeping sensitive location data on the device.
  • Reduced order completion time for delivery orders by 24%.
  • Increased conversion rates for new users by 45% and existing users by 6%.
  • Over 200% increase in AI-powered orders during peak seasons.
  • Reduced developer effort and shortened development time.

You can use ML Kit’s GenAI Prompt API to integrate on-device AI capabilities like Gemini Nano into your applications.

Kakao Mobility uses Gemini Nano on-device to reduce costs and boost call conversion by 45%

Articles 📚

Explore AI on Android with Our Sample Catalog App 🤖

The Android team has launched a redesigned, open-source Android AI Sample Catalog app on GitHub to showcase various AI-enabled features using both on-device (ML Kit GenAI API with Gemini Nano) and cloud (Firebase AI Logic SDK) models. The catalog includes samples for tasks like image generation (Imagen), on-device text summarization, a chatbot for image editing (Gemini 3 Pro Image model), on-device image description, a voice-controlled to-do list, and on-device rewrite assistance. The app features a new Material 3 design and provides structured code for easy integration into your own projects.

Learn about our newest Jetpack Navigation library with the Nav3 Spotlight Week 🌟

We had a Nav3 Spotlight Week to help you learn and integrate the library into your app. Nav3 can assist in reducing technical debt, improving separation of concerns, accelerating feature development, and supporting new form factors.

The week featured daily content:

  • API Overview explores core APIs like NavDisplay, NavEntry, and entryProvider, including a coding walkthrough video.
  • Animations demonstrates how to set custom animations for screen transitions and override them for individual screens, with accompanying documentation and recipes.
  • Deep links covers creating deep links with various code recipes, offering a guide and both basic and advanced examples for parsing intents and synthetic back stacks. The Now in Android sample has also migrated to Nav3.
  • Modularization focuses on modularizing navigation code to avoid circular dependencies and using dependency injection and extension functions for feature modules.
  • Ask Me Anything was a live session where the community submitted questions using the #AskAndroid tag on BlueSky, LinkedIn, and X.

#WeArePlay: Solving the dinner dilemma — how DELISH KITCHEN empowers 13 million home cooks 🍲

#WeArePlay spotlights DELISH KITCHEN co-founder Chiharu and her app, which provides 55,000 video recipes to over 13 million Japanese users to solve the “dinner dilemma.” Google Play supports the app’s growth, offering distribution to Android users, developer tools, and feature campaigns. Future plans include an AI-powered cooking assistant, a new health management app, and supermarket partnerships.

Leveling Guide for your Performance Journey 📈

The Android Developers Blog published a “Leveling Guide for your Performance Journey,” outlining five stages for optimizing app performance.

  1. Level 1: Play Console Field Monitoring
    Use Android Vitals within the Play Console to monitor automatically collected field data, including crash rate, ANR rate, and excessive battery usage.
  2. Level 2: App Performance Score Action Items
    Start with the Static Performance Score (configuration and tooling changes like R8 optimization, Baseline Profiles, and Startup Profiles) before moving to a dynamic assessment to validate improvements on a real device, measuring startup time and rendering performance.
  3. Level 3: Local Performance Test Frameworks
    Integrate automated testing with frameworks like Macrobenchmark (for startup time, dropped frames) and UiAutomator (for simulating user interactions).
  4. Level 4: Trace Analysis Tools
    Use deep analysis tools like Perfetto to capture and analyze the entire device state, including kernel scheduling and system services, to provide context for performance issues. You can record traces via developer options, Android Studio CPU Profiler, or the Perfetto UI, then load and analyze them to debug jank, slow startup, and excessive battery/CPU usage.
  5. Level 5: Custom Performance Tracking Framework
    For teams with dedicated resources, you can build a custom performance tracking framework using Android APIs like ApplicationStartInfo (API 35), ProfilingManager (API 35), and ApplicationExitInfo (API 30) to understand why your app process died (e.g., native crashes, ANRs, out-of-memory kills).

Stronger threat detection, simpler integration: Protect your growth with the Play Integrity API 🔒

The Play Integrity API has received updates aimed at improving threat detection and simplifying integration. It verifies that user interactions originate from your unmodified app on a certified Android device installed via Google Play, resulting in an average of 80% lower unauthorized usage for apps utilizing its features.

The API provides various verdicts to detect specific threats, including checks for:

  • App Status: If the user installed or paid for the app via Google Play (accountDetails) and if the app binary is unmodified (appIntegrity).
  • Device Status: If the app runs on a genuine Play Protect certified Android device (deviceIntegrity), if the device has recent security updates (MEETS_STRONG_INTEGRITY), and if Google Play Protect is active and no risky apps are present (playProtectVerdict).
  • Security Risks: If risky apps are running that could capture the screen or control the device (appAccessRiskVerdict).

Improvements also focus on user recovery through new Play in-app remediation prompts.

Other integrity solutions include Google Play’s automatic protection (installer checks, advanced anti-tamper protection), Android platform key attestation (which Play Integrity API leverages, with direct implementers needing to prepare for root certificate rotation in February 2026), Firebase App Check, and reCAPTCHA Enterprise.

How Uber is reducing manual logins by 4 million per year with the Restore Credentials API 📲

Uber has reduced manual logins by an estimated 4 million per year by integrating the Restore Credentials API into its rider app. This feature enables a seamless transition for users when they switch to a new device, eliminating the need for re-authentication.

A five-week A/B experiment confirmed the positive impact, demonstrating:

  • A 3.4% decrease in manual logins (SMS OTP, passwords, social login).
  • A 1.2% reduction in expenses related to SMS OTP logins.
  • A 0.575% increase in the rate of devices successfully reaching the app’s home screen.
  • A 0.614% rise in devices with completed trips.

Interested in implementing Restore Credentials? You can consult sample code, documentation, a codelab, and validate your integration using new features in Android Studio Otter.

Configure and troubleshoot R8 Keep Rules 🔒

R8 is the primary tool for shrinking and optimizing Android apps. Keep Rules are essential because R8 cannot predict dynamic code (like reflection), which could lead to unintended code removal.

Key takeaways for Keep Rules:

  • Location: Write rules in a proguard-rules.pro file, always using proguard-android-optimize.txt.
  • Best Practice: Write narrow, specific rules, and use annotations or common ancestors for scalability.
  • Avoid: Global options (like -dontoptimize) and overly broad rules, as they negate R8’s performance benefits.
  • Troubleshooting: Use -printconfiguration to see all merged rules and -whyareyoukeeping to understand why a class is being preserved.
  • Goal: Use modern libraries with code generation instead of reflection to reduce the need for Keep Rules entirely.
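A sketch of what a narrow rule plus the two troubleshooting directives could look like in proguard-rules.pro (the model class is hypothetical; scope real rules to exactly the members your reflection touches):

```
# Keep only what reflection needs: the no-arg constructor and fields of
# one (hypothetical) model class, rather than a whole package.
-keep class com.example.model.User {
    <init>();
    <fields>;
}

# Dump the fully merged configuration R8 actually applies
-printconfiguration build/outputs/full-r8-config.txt

# Ask R8 why a class is being preserved
-whyareyoukeeping class com.example.model.User
```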

Gemini 3 is now available for AI assistance in Android Studio 🚀

The Gemini 3 Pro model is now available for AI assistance, providing new coding and agentic features in the latest version of Android Studio Otter.

How Reddit used the R8 optimizer for high impact performance improvements 🚀

Reddit significantly improved its app’s performance by implementing the R8 optimizer in full mode, which took less than two weeks.

Key results from the implementation:

Real-World Metrics (Android Vitals/Crashlytics):

  • 40% faster cold startup time
  • 30% reduction in “Application Not Responding” (ANR) errors
  • 25% improvement in frame rendering
  • 14% decrease in app size

Controlled Testing (Macrobenchmark):

  • 55% faster app startup
  • 18% quicker time for users to begin browsing

You can enable R8 by setting minifyEnabled and shrinkResources to true in your release build type within app/build.gradle.kts. This process should be followed by holistic end-to-end testing, and you may need to define keep rules to prevent R8 from modifying essential parts of your code.
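In the Kotlin DSL those two flags carry an is- prefix; a minimal release block might look like this (the ProGuard file names are the conventional defaults):

```kotlin
// app/build.gradle.kts
android {
    buildTypes {
        release {
            isMinifyEnabled = true     // turns on R8 shrinking and optimization
            isShrinkResources = true   // strips unused resources after shrinking
            proguardFiles(
                getDefaultProguardFile("proguard-android-optimize.txt"),
                "proguard-rules.pro"
            )
        }
    }
}
```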

#WeArePlay: Meet the game creators who entertain, inspire and spark imagination 🎮

The latest #WeArePlay stories highlight game creators who develop for Google Play. These stories feature developers who entertain players, inspire new ideas, and spark imagination through their creations.

You can learn about:

  • Ralf and Matt from Vector Unit, creators of Beach Buggy Racing. Their kart racing game has over 557 million downloads and has earned praise from its community for its console-quality feel on mobile. They continue to update the game and are prototyping new projects.
  • Camilla from Clover-Fi Games, who developed Window Garden. This lofi idle game, which encourages players to care for digital plants and decorate spaces, has surpassed 1 million downloads and received a “Best of 2024” award from Google Play. Camilla aims to expand her studio and collaborate with other creatives.
  • Rodrigo from Kolb Apps, the founder behind Real Drum. This virtual drum set app offers a realistic experience, allowing users to play drums and cymbals. It has accumulated over 437 million downloads, making music accessible to many, and Rodrigo plans to release new apps for children.

#WeArePlay: Meet the people making apps & games to improve your health ❤️

This week’s #WeArePlay series highlights applications and games on Google Play that focus on health and wellness. You can learn about:

  • Alarmy (Delightroom, Seoul), an app for heavy sleepers that uses challenge-based alarms, including math problems and photo missions, and is expanding into sleep tracking and general wellness.
  • Betwixt (Mind Monsters Games, Cambridge, UK), an interactive adventure game designed to reduce anxiety by combining storytelling with evidence-based techniques.
  • MapMyFitness (MapMyFitness, Boulder, CO, U.S.), an app for runners and cyclists to map routes and track training, offering features like adaptive training plans, guided workouts, and live safety tracking.

Android developer verification: Early access starts now as we continue to build with your feedback 🛡️

Android developer verification has begun its early access phase. This initiative introduces verification requirements as an additional layer of security to protect Android users from scams and digital fraud, particularly with sideloaded apps. The system aims to deter malicious app distribution by linking apps to verified identities.

In response to community feedback, changes address specific developer needs:

  • Students and hobbyists will have a dedicated account type, enabling distribution to a limited number of devices without full verification.
  • For experienced users, a new advanced flow is being developed to permit the installation of unverified apps. This flow will include clear warnings about risks and is designed to resist coercion.

You can find a video walkthrough and detailed guides for the new Android Developer Console experience.

Raising the bar on battery performance: excessive partial wake locks metric is now out of beta 🔋

The “excessive partial wake locks” metric has moved out of beta and is now generally available as a new core vitals metric in Android vitals. This metric, co-developed with Samsung, identifies user sessions where an app holds more than two cumulative hours of non-exempt wake locks within a 24-hour period.

If your app surpasses a bad behavior threshold — 5% of user sessions being excessive over 28 days — it may be excluded from prominent Google Play discovery surfaces and a warning may appear on its store listing, starting March 1, 2026.

Android vitals now features a wake lock names table to help you pinpoint excessive wake locks by name and duration, particularly those with P90 or P99 durations over 60 minutes. You are encouraged to review your app’s performance in Android vitals and consult the updated documentation for best practices.
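The stated thresholds reduce to simple arithmetic; a hypothetical helper (not part of any Android or Play Console API) that classifies sessions against them:

```kotlin
// Illustrative only — these helpers mirror the thresholds described in the
// announcement; they are not an Android or Play Console API.

// A session is excessive if it holds more than 2 cumulative hours
// (120 minutes) of non-exempt partial wake locks within 24 hours.
fun isExcessiveSession(wakeLockMinutes: Int): Boolean = wakeLockMinutes > 120

// The bad-behavior threshold is crossed when more than 5% of user sessions
// in the 28-day window are excessive.
fun exceedsBadBehaviorThreshold(excessive: Int, total: Int): Boolean =
    total > 0 && excessive * 100.0 / total > 5.0
```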

redBus uses Gemini Flash via Firebase AI Logic to boost the length of customer reviews by 57% 🗣️✨

redBus utilized Gemini Flash via Firebase AI Logic to revamp its customer review system, resulting in a 57% increase in review length. The company’s previous text-based review process presented challenges such as language barriers and a lack of detailed feedback.

To address this, redBus implemented a voice-first approach, enabling users to submit reviews in their native language. Gemini Flash transcribes and translates speech, performs sentiment analysis, and generates star ratings, relevant tags, and summaries from these voice inputs. Firebase AI Logic facilitated the frontend team’s independent development and launch of this feature within 30 days, removing the need for complex backend implementation. The solution employs structured output to ensure well-formed JSON responses from the AI model. redBus plans to continue exploring on-device generative AI and will use Google AI Studio for prompt iteration.

New tools and programs to accelerate your success on Google Play 🚀

Google Play has released new tools and programs designed to streamline your development and accelerate your app’s growth. You can now validate deep links directly within Play Console using a built-in emulator. A new Gemini-powered localization service offers no-cost translations for app strings, automatically translating new app bundles into selected languages while allowing you to preview, edit, or disable them.

On the Statistics page, a new Gemini-powered feature generates automated chart summaries to help you understand data trends and makes reporting accessible to screen reader users. The Play Console now includes a “Grow users” overview page, offering a tailored view to acquire new users and expand your reach. A new “You” tab on the Play Store is available for re-engagement; you can integrate with Engage SDK to help users resume content or get personalized recommendations. Game developers can use this tab to showcase in-game events, content updates, and offers, with promotional content, YouTube video listings, and Play Points coupons available.

For monetization, you can now configure one-time products with more flexibility, including limited-time rentals and pre-orders through an early access program, and manage your catalog more efficiently with a new taxonomy. A new Play Points page in Play Console provides reporting on the revenue, buyers, and acquisitions generated by both your developer-created and Google-funded Play Points promotions.

How Calm Reimagined Mindfulness for Android XR 🌌

Calm has brought its mindfulness content to Android XR. Its engineering team developed functional XR orbiter menus in one day and a core XR experience within two weeks. This involved extending existing Android development, including leveraging Jetpack Compose and reusing codebase components such as backend and media playback.

The team utilized Android XR design guides and evolved features like the “Immersive Breathe Bubble” for 3D breathwork and “Immersive Scene Experiences” for ambient environments. The creative workflow involved concept art, 3D models with human-scale reference, and in-headset testing, with the Android XR emulator available as a testing option.

To build for XR, you can integrate Jetpack XR APIs into existing Android apps and reuse code to create prototypes quickly. Resources for building on the Android XR platform are available at developer.android.com/xr.

Introducing Cahier: A new Android GitHub sample for large screen productivity and creativity ✍️

Android Developers has introduced Cahier, a new GitHub sample application designed to showcase best practices for building productivity and creativity apps optimized for large screens.

Cahier demonstrates how you can develop versatile note-taking applications that combine text, freeform drawings using the Ink API (now in beta), and image attachments. Key features include fluid content integration with drag and drop for importing and sharing, and note organization capabilities.

The sample utilizes an offline-first architecture with Room and supports multi-window and multi-instance capabilities, including desktop windowing. Its user interface adapts to various screen sizes and orientations, including phones, tablets, and foldable devices, by employing ListDetailPaneScaffold and NavigationSuiteScaffold from the material3-adaptive library.

Cahier also illustrates deep system integration, showing you how to enable your app to become the default note-taking app on Android 14 and higher by responding to Notes intents. Lenovo has enabled Notes Role support on its tablets running Android 15 and above, allowing note-taking apps to be set as default on these devices. The sample provides comprehensive input support, including stylus, keyboard shortcuts, and mouse/trackpad interactions.

Material 3 Adaptive 1.2.0 is stable 📐

Material 3 Adaptive 1.2.0 is now stable, building on previous versions with expanded support for window size classes and new strategies for display pane placement.

The release introduces support for Large (L) and Extra-large (XL) breakpoints for width window size classes, enabled by setting supportLargeAndXLargeWidth = true in your currentWindowAdaptiveInfo() call.
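A minimal sketch of the opt-in, using the parameter named in the release notes; the pane-count logic and breakpoint check are illustrative assumptions, not part of the announcement:

```kotlin
import androidx.compose.material3.adaptive.currentWindowAdaptiveInfo
import androidx.compose.runtime.Composable
import androidx.window.core.layout.WindowSizeClass

// Opt in to the Large and Extra-large width breakpoints when reading the
// current window size class, then branch layout on the result.
@Composable
fun paneCount(): Int {
    val sizeClass = currentWindowAdaptiveInfo(
        supportLargeAndXLargeWidth = true
    ).windowSizeClass
    // Hypothetical policy: two panes once the width reaches the expanded bound.
    return if (
        sizeClass.isWidthAtLeastBreakpoint(WindowSizeClass.WIDTH_DP_EXPANDED_LOWER_BOUND)
    ) 2 else 1
}
```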

New adaptive strategies, reflow and levitate, are available for ListDetailPaneScaffold and SupportingPaneScaffold. The reflow strategy rearranges panes based on window size or aspect ratio, moving a second pane to the side or underneath. The levitate strategy docks content and offers customization for draggability, resizability, and background scrim. Both strategies can be declared in the Navigator constructor using the adaptStrategies parameter.

5 things you need to know about publishing and distributing your app for Android XR ⚙️

When publishing and distributing your app for Android XR, consider five key areas:

  1. Uphold quality with Android XR app quality guidelines. Ensure your app delivers a safe, comfortable, and performant user experience by following guidelines that cover camera movement, frame rates, visual elements (like strobing), performance metrics, and recommended minimum interactive target sizes for eye-tracking and hand-tracking inputs.
  2. Configure your app manifest correctly. In your AndroidManifest.xml, specify android.software.xr.api.spatial for apps using the Jetpack XR SDK or android.software.xr.api.openxr for apps using OpenXR or Unity. Set android:required="true" for dedicated XR tracks or "false" for mobile tracks. Also, set android.window.PROPERTY_XR_ACTIVITY_START_MODE on your main activity to define the default user environment (Home Space, Full Space Managed, or Full Space Unmanaged). Check for optional hardware features dynamically at runtime using PackageManager.hasSystemFeature() instead of setting them as required in the manifest, to avoid limiting your audience.
  3. Use Play Asset Delivery (PAD) to deliver large assets. For immersive apps with large assets, use PAD’s install-time, fast follow, or on-demand delivery modes. Android XR apps have an increased cumulative asset pack limit of 30 GB. Unity developers can integrate Unity Addressables with PAD.
  4. Showcase your app with spatial video previews. Provide a 180°, 360°, or stereoscopic video asset to offer an immersive 3D preview on the Play Store for users browsing on XR headsets.
  5. Choose your Google Play release track. You can publish to the mobile release track if you are adding spatial XR features to an existing mobile app and can bundle XR features into your existing Android App Bundle (AAB). Alternatively, you can publish to the dedicated Android XR release track for new XR apps or XR versions that are functionally distinct, which restricts visibility to Android XR devices supporting spatial or OpenXR features.
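Points 2 and 5 come together in the manifest. A sketch of a Jetpack XR app shipped on the mobile track (android:required="false" keeps it installable on phones); the start-mode value string is illustrative, so check the Android XR publishing docs for the exact constant:

```xml
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
    <!-- Declare spatial XR support without requiring it, so one AAB can
         serve both mobile and XR users. -->
    <uses-feature
        android:name="android.software.xr.api.spatial"
        android:required="false" />
    <application>
        <activity android:name=".MainActivity">
            <!-- Default the activity to a Full Space environment on XR
                 devices; value name is approximate. -->
            <property
                android:name="android.window.PROPERTY_XR_ACTIVITY_START_MODE"
                android:value="XR_ACTIVITY_START_MODE_FULL_SPACE_MANAGED" />
        </activity>
    </application>
</manifest>
```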

Bringing Androidify to XR with the Jetpack XR SDK 🥽

The Android Developers Blog details how the Androidify app was adapted for Extended Reality (XR) using the Jetpack XR SDK, coinciding with the launch of Samsung Galaxy XR powered by Android XR.

Originally designed with adaptive layouts for phones, foldables, and tablets, Androidify is compatible with Android XR without modifications. For a differentiated XR experience, developers created specific spatial layouts.

Key XR concepts include Home Space, which allows multitasking with multiple app windows in a virtual environment, and Full Space, where an app uses the full spatial features of Android XR. You are advised to support both modes.

Designing for XR involved organizing UI elements using containment, embracing spatial UI elements that adjust to the user, and adapting camera layouts for headsets. Design tips for spatial UI include allowing uncontained elements, removing background surfaces, motivating with motion, and choosing an anchor element for content.

For development, the Jetpack XR Compose dependency was added. You can transition to Full Space by checking for XR spatial features using LocalSpatialConfiguration.current.hasXrSpatialFeature and !LocalSpatialCapabilities.current.isSpatialUiEnabled. Spatial UI elements like SpatialPanel, SubspaceModifier, and Orbiter enable the creation of XR layouts with existing 2D content. SpatialPanels can incorporate ResizePolicy and MovePolicy for user interaction, and hierarchical relationships allow grouped movement.
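Putting those pieces together, a spatial layout can wrap existing 2D composables in a panel with an orbiter for controls. This is a sketch: package names, Orbiter's position parameter, and the NavigationControls/MainContent composables are approximations of the Jetpack XR Compose API rather than verbatim Androidify code:

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.Orbiter
import androidx.xr.compose.spatial.OrbiterEdge
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.Subspace
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.width

// Host existing 2D content inside a spatial panel, with navigation floating
// in an orbiter below it. ResizePolicy/MovePolicy parameters (omitted here)
// would let users resize and reposition the panel.
@Composable
fun SpatialMainScreen() {
    Subspace {
        SpatialPanel(modifier = SubspaceModifier.width(1280.dp).height(800.dp)) {
            Orbiter(position = OrbiterEdge.Bottom) {
                NavigationControls() // hypothetical existing 2D composable
            }
            MainContent() // hypothetical existing 2D composable
        }
    }
}
```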

To publish, include <uses-feature android:name="android.software.xr.api.spatial" android:required="false" /> in your AndroidManifest.xml to signify XR-differentiated features. The same app binary can be distributed to both mobile and XR users, with options to add XR-specific screenshots or spatial video assets for immersive previews on the Play Store.

Videos 📹

#WeArePlay: Miksapix Interactive — bringing ancient Sámi mythology and culture to gamers worldwide

Miksapix Interactive launched their game “Raanaa” on Google Play, leveraging Sámi mythology to preserve and share indigenous culture. This demonstrates a successful approach to niche content development and localization on the platform, with the game being translated into various Sámi languages.

Building intelligent Android apps with Gemini

Google is empowering you to build intelligent apps using Gemini AI, offering a comprehensive end-to-end AI stack. Key takeaways include:

  • Tools: AI Studio for prototyping, ML Kit GenAI APIs (Beta) for on-device inference (summarization, proofreading, image description, custom prompt API), and Firebase AI Logic SDK for cloud inference with production features (App Check, Remote Config, monitoring).
  • Models: On-device options like Gemini Nano and Gemma 3n; cloud models like Gemini Pro, Flash, and Flash-Lite. Specialized models include Nano Banana and Imagen for image generation.
  • New APIs: The Gemini Live API (Preview) enables real-time voice/video interactions and features “function calling” for Gemini to invoke custom Kotlin functions within apps.
  • Focus: You can choose between on-device (offline, private, no cost) and cloud (powerful, broad availability) AI approaches based on your app’s needs.
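As one concrete on-device path, the ML Kit GenAI Summarization API (Beta) mentioned above can be sketched roughly as follows. Class, builder, and option names are approximations from the beta documentation and may change; treat this as orientation, not a drop-in implementation:

```kotlin
import android.content.Context
import com.google.mlkit.genai.summarization.Summarization
import com.google.mlkit.genai.summarization.SummarizerOptions

// Configure an on-device summarizer (runs on Gemini Nano, no network or
// per-call cost) and obtain a client. Inference itself streams results;
// see the ML Kit GenAI docs for the exact request/callback shape.
fun buildSummarizer(context: Context) =
    Summarization.getClient(
        SummarizerOptions.builder(context)
            .setInputType(SummarizerOptions.InputType.ARTICLE)
            .setOutputType(SummarizerOptions.OutputType.ONE_BULLET)
            .setLanguage(SummarizerOptions.Language.ENGLISH)
            .build()
    )
```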

Building adaptive apps for Android

It’s time to build adaptive apps that optimally scale across diverse form factors (tablets, foldables, Chromebooks, etc.).

Key takeaways:

  • Incentives: Play Store will prioritize adaptive apps in search/features. By 2026, “quality badging” and form-factor-specific ratings will be introduced.
  • Platform Changes: Android 16 (API level 36) will remove orientation, resize, and aspect ratio constraints on large screens, aiming to make 75% of top apps automatically adaptive in landscape.
  • Tools & Resources: Leverage new/improved Android Studio tools, including dedicated layout libraries (e.g., SlidingPaneLayout, ActivityEmbedding), enhanced emulators, design guidelines, and Window Size Classes for streamlined layout adaptation.

More customization in Material 3: the path to expressive apps

Material 3 Expressive is now available, offering new capabilities to build more premium, engaging, and expressive UIs.

Key updates include:

  • Enhanced Components: Flexible app bars, buttons with shape-morphing motion, a new FAB Menu (“speed dial”), new loading and progress indicators, and revamped menu/list/slider components.
  • Adaptive UI: New Adaptive Navigation Bar and Rail seamlessly adapt to various window sizes and form factors, including foldables.
  • Style Enhancements: An expanded shape library (35 unique shapes), a physics-based motion system, richer dynamic colors, and emphasized typography with variable font support.

Crucially, these new features are available today for both Jetpack Compose and Android Views, ensuring bidirectional compatibility with existing Material 3 implementations. The update aims to improve clarity, usability, and user delight, as validated by extensive user research.

Building Androidify: an AI-powered Android experience

Androidify has been re-released as an open-source app, built with Jetpack Compose. It offers you a practical example of integrating Firebase AI Logic SDK, Gemini, and a fine-tuned Imagen model for AI features like image validation, captioning, and bot generation.

Key takeaways include:

  • Using ML Kit Subject Segmentation for features like sticker creation.
  • Implementing modern UI/UX with SharedTransitionLayout for smooth transitions.
  • Integrating predictive back support.
  • Building a fully adaptive UI for phones, tablets, foldables, and Chromebooks from a single codebase.

Building for TV and cars with Compose

Android apps using Jetpack Compose will benefit from significant performance improvements, including 21% faster Time to First Frame and a 76% reduction in jank.

New resources are available for TV and car app development:

  • TV: Leverage the dedicated Compose for TV library, with design guidance emphasizing clear focus indicators and a focus management codelab.
  • Cars: A new “Design for cars” guide differentiates Android Auto and Automotive OS, outlines driving restrictions, and defines app quality tiers, including making existing apps “Car ready” via a Google Play opt-in.

Testing is also enhanced with new Android Automotive OS emulators, an early access program for Firebase Test Lab offering direct device access, and an AAOS image for the Pixel Tablet. Google champions adaptive app development with Compose for extensive code reuse across all Android form factors.

Google Play PolicyBytes — October 2025 policy updates

Google Play’s October 2025 policy updates bring several key changes for developers:

  • Age-Restricted Content: Apps facilitating dating, gambling, or real-money games must now use the “Restrict Minor Access” feature to block minors.
  • Personal Loans (India): Apps must be on the Indian government’s approved digital lending list.
  • Health & Medical Apps: EU medical device apps require regulatory info and will get a “Medical Device” label. Other health/medical apps must include a disclaimer stating they are not medical devices.
  • Subscriptions: Policy clarification emphasizes clear free trial cancellation and prominent display of total charges to avoid violations.
  • Appeals Process: A new 180-day appeal window is being introduced for account terminations.
  • Compliance Deadline: January 2026 for these and other related policy updates.

Google Play Console: Streamlining workflows, from testing to growth

The redesigned Google Play Console introduces key new features for Android developers:

Pre-launch Deep Link Testing: A new built-in emulator on the Deep links page allows developers to test deep links and visualize user experience before launch.

Enhanced Monitoring: The “Monitor and improve” section provides actionable recommendations to address issues like ANR rates and slow warm-start times.

Gemini AI Integration:

  1. Automatically summarizes app metric trends, highlighting performance changes.
  2. Offers high-quality, automated localization of app strings for global markets, improving upon traditional machine translation.

Android Developer Story: Pocket FM cuts 50% in development time with Gemini in Android Studio

Pocket FM significantly cut Android development time (50% for new features, 30% for existing) by integrating Gemini in Android Studio. For developers, this highlights Gemini’s practical utility in generating code (like impression tracking), resolving complex issues (e.g., Media3 errors), and streamlining SDK upgrades by identifying dependencies, enabling engineers to focus on more complex development.

AndroidX Releases 🚀

Here’s a summary of the AndroidX changes, many of which have been covered earlier in the post:

Compose UI & Foundation (1.11.0-alpha01)

New UI Modifiers:

  • Modifier.scrollIndicator: A new API to allow developers to add custom scroll indicators to scrollable containers, offering more control over the scroll UI.
  • Modifier.visible(): Introduced to skip drawing a Composable’s content without affecting the space it occupies in the layout. This is useful for conditional visibility when you want to maintain layout structure.
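A quick sketch of Modifier.visible() from this alpha release; the composable around it is a hypothetical example, and the signature may still change before stable:

```kotlin
import androidx.compose.foundation.layout.Row
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.visible

// The price text stops drawing when `revealed` is false, but its layout
// slot is preserved, so neighboring content does not shift.
@Composable
fun PriceRow(price: String, revealed: Boolean) {
    Row {
        Text("Price: ")
        Text(price, modifier = Modifier.visible(revealed))
    }
}
```

Contrast this with conditionally removing the composable from composition, which would collapse the space it occupied.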

Important Deprecation:

  • Modifier.onFirstVisible() is now deprecated. Its behavior was often misleading (e.g., triggering on every scroll for lazy lists). Developers are advised to use Modifier.onVisibilityChanged() and manually track visibility state based on their specific use case.
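A sketch of the suggested migration, tracking "first visibility" manually on top of onVisibilityChanged(). Parameter names for this alpha API are approximate, and logAnalyticsImpression() is a hypothetical callback:

```kotlin
import androidx.compose.foundation.layout.Box
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.remember
import androidx.compose.runtime.setValue
import androidx.compose.ui.Modifier
import androidx.compose.ui.layout.onVisibilityChanged

// Replacement for the deprecated onFirstVisible(): the app owns the
// "already fired" state, so the callback runs once per item lifetime
// rather than on every lazy-list scroll.
@Composable
fun ImpressionLogger(modifier: Modifier = Modifier) {
    var logged by remember { mutableStateOf(false) }
    Box(
        modifier.onVisibilityChanged(minFractionVisible = 0.5f) { visible ->
            if (visible && !logged) {
                logged = true
                logAnalyticsImpression() // hypothetical analytics hook
            }
        }
    )
}
```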

Default Behavior Changes:

  • TextField DPAD navigation and semantic autofill are now enabled by default, removing previous configuration flags.

Advanced Layouts:

  • MeasuredSizeAwareModifierNode: A new, more specific interface for obtaining onRemeasured() callbacks, recommended for custom layout nodes needing only measurement-related events.

Navigation3 (1.1.0-alpha01)

  • Shared Element Transitions for Scenes: Navigation 3 now supports shared element transitions between scenes, enabling smooth, coordinated transitions between composables as destinations change. You enable this by passing a SharedTransitionScope to NavDisplay or rememberSceneState.

DataStore (1.3.0-alpha01)

  • KMP Web Support: Introduces experimental Kotlin Multiplatform Web support for DataStore, leveraging the browser’s sessionStorage API for temporary data persistence within a single browser tab.

SwipeRefreshLayout (1.2.0)

  • Addresses issues with the refresh icon’s retraction and position reset, ensuring it behaves correctly after being shown and hidden.
  • Corrects requestDisallowInterceptTouchEvent(boolean) behavior, now honoring the request like other ViewGroups (though developers can opt out of this new behavior if necessary).

Window (1.6.0-alpha01)

  • Adaptive UI Helpers: Adds helper methods to construct WindowSizeClassSets in a grid format, simplifying the creation of responsive layouts for different screen sizes and folding states.

Other Noteworthy Releases

  • androidx.webgpu:webgpu:1.0.0-alpha01: Initial alpha release of a new library bringing WebGPU capabilities to Android applications. This is a developer preview aimed at specialized graphics use cases.
  • androidx.xr.glimmer:glimmer:1.0.0-alpha01: Initial alpha release of Jetpack Glimmer, a new design language and UI component library specifically for building Android XR (Extended Reality) experiences.
  • Compose Animation (1.11.0-alpha01): Includes a bug fix ensuring position is acquired for shared elements only when SharedTransitionLayout is attached.
  • Compose Runtime (1.11.0-alpha01): Minor API change with RetainedValuesStore.getExitedValueOrDefault renamed to consumeExitedValueOrDefault, and the experimental concurrent recomposition API has been removed.

Signing off

That’s it for now, with Android XR, the Android Studio Otter 2 feature drop with Gemini 3, the release of Android 16 QPR2, Compose updates including the stable release of Jetpack Navigation 3, highlights from Performance Spotlight Week, and much, much, much more.

See you all in the new year for more updates from the Android developer ecosystem!


Now In Android #123 was originally published in Android Developers on Medium, where people are continuing the conversation by highlighting and responding to this story.