
Mobile App Testing: A Complete Guide for iOS and Android

By TestDel Mobile Team

The short answer: Mobile testing is categorically harder than web testing. The combination of hardware fragmentation, OS version diversity, network variability, and platform-specific behaviours creates a testing surface that most desktop-focused QA teams underestimate until their first major mobile release. According to Statista, there are over 24,000 distinct Android device models in active use. Even limiting to the top 20 devices by market share, the combinations of screen size, manufacturer skin, OS version, and chipset create a testing matrix that no team can cover exhaustively. The answer is prioritisation, not completeness.

What Makes Mobile Testing Different

Device and OS Fragmentation

iOS: The platform is controlled by Apple, which means hardware variation is limited. However, OS version fragmentation is real — Apple's most recent iOS release is not universally adopted, and apps typically need to support the current version minus two (iOS 16, 17, and 18 at the time of writing). Device size variation is meaningful: an iPhone 16 Pro Max and an iPhone SE have very different screen real estate.

Android: Fragmentation is severe. Manufacturers apply their own UI skins (Samsung One UI, OnePlus OxygenOS, Xiaomi MIUI), which modify system behaviour in ways that affect apps. Memory management policies vary significantly between manufacturers — particularly background process handling, which affects how apps behave when returning from the background. Battery optimisation features on Huawei, Xiaomi, and OnePlus devices aggressively kill background processes in ways that can break push notifications and background sync.

Network Condition Variability

Mobile users experience a far wider range of network conditions than desktop users. Testing exclusively on a strong Wi-Fi connection in the office is one of the most common mobile testing mistakes. Key network scenarios to cover:

  • Network transitions: Moving from Wi-Fi to mobile data (4G/5G) and back — does the app handle the transition without losing state or crashing?
  • Low-bandwidth conditions: 2G and poor 4G coverage — does the app degrade gracefully, show appropriate loading states, and not time out silently?
  • Complete offline: Does the app have meaningful offline functionality? Does it sync correctly when connectivity returns?
  • High latency: 500ms+ round-trip latency — do API calls time out correctly, or do they hang indefinitely?
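The timeout and retry behaviour behind several of these scenarios can be sketched in a few lines. This is an illustrative simulation, not a real networking library: `fetch_with_timeout`, `RequestTimeout`, and the flaky endpoint are all hypothetical names, and the point is simply that a request should fail visibly after a bounded number of attempts rather than hang indefinitely.

```python
import time

class RequestTimeout(Exception):
    """Raised when a simulated request exceeds its deadline."""

def fetch_with_timeout(send, retries=2, backoff_s=0.0):
    """Call `send()` with bounded retries and exponential backoff.

    `send` is a stand-in for a real network call that raises
    RequestTimeout on failure. Returning None after the final retry
    lets the UI show an explicit offline/error state instead of
    hanging — the silent-timeout failure mode described above.
    """
    for attempt in range(retries + 1):
        try:
            return send()
        except RequestTimeout:
            if attempt == retries:
                return None  # surface a visible error state, don't hang
            time.sleep(backoff_s * (2 ** attempt))  # exponential backoff

# Simulated flaky endpoint: fails twice (e.g. poor 4G), then succeeds.
calls = {"n": 0}
def flaky_send():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RequestTimeout()
    return {"status": 200}

print(fetch_with_timeout(flaky_send))  # → {'status': 200}
```

A good network-condition test suite exercises both branches: the recovery path (transient failure followed by success) and the exhausted-retries path, where the app must land in a deliberate error state.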

Platform-Specific Behaviours

Apps are suspended and terminated by the OS to manage memory. This creates failure modes that don't exist on the web:

App lifecycle testing: What happens when the user receives a phone call during a critical flow? When the screen locks mid-transaction? When the app is backgrounded and then foregrounded after 10 minutes? When the OS terminates the app and the user re-opens it — does it restore state correctly or restart from scratch?
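The save-and-restore cycle those questions probe can be modelled directly. The sketch below is a toy: on a real platform this maps to `onSaveInstanceState` on Android or state restoration APIs on iOS, and here the "disk" is just a JSON string. The `CheckoutFlow` class and its fields are illustrative.

```python
import json

class CheckoutFlow:
    """Minimal model of preserving flow state across OS termination."""

    def __init__(self, cart=None, step="cart"):
        self.cart = cart or []
        self.step = step

    def save_state(self):
        # Called when the app is backgrounded: persist everything needed
        # to resume mid-flow, because the OS may terminate the process.
        return json.dumps({"cart": self.cart, "step": self.step})

    @classmethod
    def restore_state(cls, blob):
        # Called on relaunch: rebuild the flow instead of restarting it.
        data = json.loads(blob)
        return cls(cart=data["cart"], step=data["step"])

flow = CheckoutFlow(cart=["sku-123"], step="payment")
saved = flow.save_state()                     # screen locks mid-transaction
restored = CheckoutFlow.restore_state(saved)  # user re-opens the app later
print(restored.step)  # → payment
```

A lifecycle test then asserts exactly what the prose asks: after termination and relaunch, the user is back at the payment step with the cart intact, not at an empty home screen.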

Deep link and push notification testing: Both iOS and Android support deep links that launch the app into a specific state. Test deep links from multiple entry points: cold start, warm start, already-open app in a different section. Push notification routing — particularly when the app is in the background versus fully terminated — is a common source of navigation failures.
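The cold-start versus warm-start distinction is worth making concrete, because the classic notification-tap crash is navigating before the app has initialised. The following is an assumed routing sketch — `route_deep_link`, the state names, and the step strings are all hypothetical, not a real SDK API.

```python
from urllib.parse import urlparse

def route_deep_link(url, app_state):
    """Resolve a deep link to an ordered list of routing steps.

    `app_state` is "cold" (terminated), "background", or "foreground".
    A cold start must initialise the app (session, config, auth)
    before navigating; skipping that step is the usual source of
    crashes when a notification is tapped on a terminated app.
    """
    parts = urlparse(url)
    # Custom schemes put the first segment in netloc, so rejoin it.
    target = (parts.netloc + parts.path).strip("/")
    steps = []
    if app_state == "cold":
        steps.append("init_app")
    steps.append(f"navigate:{target or 'home'}")
    return steps

print(route_deep_link("myapp://orders/42", "cold"))
# → ['init_app', 'navigate:orders/42']
```

A deep-link test matrix then enumerates the same link across all three app states and asserts the routing steps differ only in initialisation, never in destination.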

Permission flows: iOS and Android both require explicit user permission for camera, microphone, location, notifications, and contacts. Test the permission request flows, denial states (does the app degrade gracefully when permission is denied?), and the app's behaviour when permissions are revoked after initial grant.
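One way to make the denial and revocation states testable is to enumerate them explicitly, so every state maps to a deliberate UX rather than a crash. The helper below is hypothetical — the screen names are placeholders — but the shape (a total mapping over permission states) is the thing a test suite should verify.

```python
from enum import Enum

class Permission(Enum):
    NOT_ASKED = "not_asked"
    GRANTED = "granted"
    DENIED = "denied"   # also covers revocation after an initial grant

def camera_feature(state):
    """Map every permission state to a deliberate camera-screen UX.

    The point of the test is totality: there is no permission state
    for which the app has undefined behaviour.
    """
    if state is Permission.NOT_ASKED:
        return "show_permission_rationale"  # explain before the system prompt
    if state is Permission.GRANTED:
        return "open_camera"
    return "show_settings_fallback"         # degrade gracefully, link to Settings

print(camera_feature(Permission.DENIED))  # → show_settings_fallback
```

The revocation case deserves a dedicated test: grant the permission, background the app, revoke it in system settings, foreground the app, and assert the fallback state appears rather than a stale camera view.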

The Mobile Testing Matrix

Given the impossibility of exhaustive device coverage, define a priority matrix based on your user analytics:

Tier 1 (must pass): The top 5 devices by your actual user base, current OS version, both iOS and Android. These devices must work correctly for every release.

Tier 2 (should pass): Top 10–15 devices, previous major OS version. Test before each release; failures here are high-priority.

Tier 3 (smoke test): A representative sample of older devices and OS versions covering your long tail. Smoke test on major releases; known limitations documented.
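Deriving the tiers from analytics can be as simple as ranking devices by session share and cutting at the tier boundaries. The sketch below uses made-up share numbers purely for illustration; in practice the input comes from your analytics platform.

```python
def build_device_tiers(usage_share, tier1_size=5, tier2_size=15):
    """Assign devices to test tiers from analytics data.

    `usage_share` maps device model -> fraction of sessions.
    Tier 1 is the top `tier1_size` devices, Tier 2 the next band up
    to `tier2_size`, Tier 3 the remaining long tail.
    """
    ranked = sorted(usage_share, key=usage_share.get, reverse=True)
    return {
        "tier1": ranked[:tier1_size],
        "tier2": ranked[tier1_size:tier2_size],
        "tier3": ranked[tier2_size:],
    }

# Illustrative numbers, not real market data.
share = {"Pixel 8": 0.18, "iPhone 15": 0.25, "Galaxy S24": 0.20,
         "iPhone SE": 0.08, "Redmi Note 12": 0.06, "Moto G": 0.03}
tiers = build_device_tiers(share, tier1_size=3, tier2_size=5)
print(tiers["tier1"])  # → ['iPhone 15', 'Galaxy S24', 'Pixel 8']
```

Rerunning this against fresh analytics each quarter keeps the matrix honest: devices migrate between tiers as your user base shifts, rather than the matrix fossilising around launch-day assumptions.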

For device cloud testing, BrowserStack and Sauce Labs provide access to real physical devices — not emulators or simulators — for testing against the long tail of your matrix without maintaining hardware.

iOS-Specific Considerations

App Store review compliance: Apple's review process rejects apps for a range of policy violations that are not functional bugs — use of private APIs, insufficient crash handling, missing privacy declarations for data usage. Build App Store compliance into your testing checklist, not as an afterthought when your release is rejected.

TestFlight distribution: Internal and external TestFlight testing has specific behaviours — TestFlight builds have slightly different entitlements than production App Store builds. Test using both TestFlight and production builds before a major release.

VoiceOver and accessibility: Apple's accessibility standards are enforced in the review process. Apps with broken VoiceOver support have been rejected or demoted in App Store rankings.

Haptic feedback and Dynamic Island interactions: On newer iPhone models, haptic feedback patterns and Dynamic Island live activities are expected by users. Test that these work correctly and that their absence on older devices is handled gracefully.

Android-Specific Considerations

Manufacturer skin testing: Do not assume that behaviour on a stock Android emulator reflects behaviour on Samsung, Xiaomi, or OnePlus devices. Test on physical devices from your target manufacturers, particularly for:

  • Background process management (critical for push notifications and background sync)
  • System-level permissions handling (varies between manufacturers)
  • Dark mode implementation (manufacturer skins implement dark mode inconsistently)

Back button behaviour: Android's back navigation is more complex than iOS and a frequent source of navigation bugs. Test back button behaviour exhaustively across all major user journeys, particularly modal dialogs and deep link entry points.
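The two cases called out above — modals and deep-link entry points — can be exercised against a toy back-stack model. This is a simulation for reasoning about test cases, not Android's actual navigation component; the screen names and the `open_deep_link` helper are illustrative.

```python
class BackStack:
    """Toy model of Android back navigation for journey-level tests."""

    def __init__(self, root="home"):
        self.stack = [root]

    def open(self, screen):
        self.stack.append(screen)

    def open_deep_link(self, screen, parent):
        # Synthesise a parent for the deep-linked screen so that
        # pressing back goes somewhere sane instead of exiting the app.
        self.stack = [self.stack[0], parent, screen]

    def back(self):
        if len(self.stack) > 1:   # never pop the root screen
            self.stack.pop()
        return self.stack[-1]

nav = BackStack()
nav.open_deep_link("order_detail", parent="orders")
print(nav.back())  # → orders, not an app exit
```

A journey-level back-navigation suite asserts, for every entry point, the full sequence of screens that repeated back presses produce — including that back from a deep-linked screen lands on its logical parent and that back from the root does not crash.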

Google Play requirements: Google Play now enforces target SDK level requirements (apps must target recent Android API levels) and has specific requirements around sensitive permissions. Play Store policy violations discovered post-submission cause delays equivalent to App Store rejections.

Test Automation on Mobile

Appium: The industry standard for cross-platform mobile automation. Tests written against the WebDriver protocol run on both iOS and Android. The trade-off is execution speed — Appium tests are slower than the native frameworks.

XCUITest (iOS): Apple's native UI testing framework. Significantly faster and more reliable than Appium for iOS-only teams. Integrates directly into Xcode.

Espresso (Android): Google's native UI testing framework for Android. Fast, reliable, and well-integrated with Android Studio and CI.

Our recommendation: Use XCUITest and Espresso for unit and integration-level UI testing in your CI pipeline. Use Appium for cross-platform end-to-end tests that run on real device clouds. The speed trade-off is worth accepting for comprehensive real-device coverage.

Key Takeaways

  • Device fragmentation on Android is severe — prioritise by your actual user analytics, not theoretical coverage
  • Network condition testing (transitions, low bandwidth, offline) is consistently under-done
  • App lifecycle testing (backgrounding, OS termination, permission revocation) finds a class of failures unique to mobile
  • Manufacturer skin testing on Android is not optional — Samsung, Xiaomi, and OnePlus devices behave differently from stock Android in ways that break apps
  • Use XCUITest/Espresso for CI speed, Appium for cross-platform real-device cloud coverage