Step-by-Step Guide: Running the Microsoft Platform Ready Test Tool on Your App
This guide walks you through preparing, running, and interpreting results from the Microsoft Platform Ready Test Tool so your app meets platform compatibility requirements.
Before you start — prerequisites
- Windows SDK / developer tools: Ensure Visual Studio and the Windows SDK matching your target OS are installed.
- App package: Have your app packaged (MSI, APPX, MSIX, or another supported installer format) and signed if your distribution channel requires it.
- Test environment: Use a clean test machine or virtual machine that mirrors your target OS/version and has no conflicting developer tools or test artifacts.
- Administrator access: Required for some tests and log collection.
1. Obtain the Platform Ready Test Tool
- Download the latest Platform Ready Test Tool from Microsoft’s official distribution (Microsoft Partner Center or Microsoft Docs).
- Extract and place the tool on the test machine in a folder with sufficient disk space.
2. Review the applicable test checklist
- Open the tool’s documentation and identify which test suites and rules apply to your app type (desktop, UWP, MSIX, or device drivers).
- Note any mandatory tests (install/uninstall, digital signature validation, API compatibility, performance, reliability, security checks).
3. Prepare your app package and test inputs
- Build and sign your package for the target architecture (x86/x64/ARM).
- Include any required configuration files, test data, and command-line arguments the app needs to run under automated conditions.
- Create a test account/profile if the app requires authentication.
4. Configure the test tool
- Launch the Platform Ready Test Tool with Administrator rights.
- Create a new test session/project and point it to your app package or installer.
- Select the suites to run. Typical selections:
- Installation/uninstallation
- Runtime validation (API, manifest, capabilities)
- Compatibility and interoperability
- Performance and reliability
- Security and privacy checks
- Set timeouts and retry policies for flaky tests.
- Configure logging level (Normal/Verbose) and output folder for reports and artifacts.
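The exact configuration format depends on the tool version, so check its documentation before scripting anything. As an illustration only, the session settings above could be captured in a small JSON file; the file name, keys, and suite identifiers below are hypothetical, not the tool's actual schema:

```python
import json

# Hypothetical session settings -- key names and suite IDs are illustrative,
# not the tool's real configuration schema.
session = {
    "package": r"C:\TestDrop\MyApp.msix",
    "suites": [
        "install-uninstall",
        "runtime-validation",
        "compatibility",
        "performance-reliability",
        "security-privacy",
    ],
    "timeout_seconds": 1800,   # per-test timeout
    "retries": 2,              # retry policy for flaky tests
    "logging": "Verbose",      # or "Normal" for routine runs
    "output_dir": r"C:\TestDrop\Reports",
}

# Persist the session so every run uses identical settings.
with open("session.json", "w") as f:
    json.dump(session, f, indent=2)
```

Keeping the configuration in a checked-in file makes runs reproducible and lets you diff settings between iterations.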
5. Run a smoke test
- Execute a quick subset of core tests (install/uninstall, launch, basic functional check) to verify the app and environment are ready.
- Inspect logs and fix any immediate blocking issues (missing dependencies, required privileges, incorrect installer behavior).
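Scripting the invocation keeps the smoke pass repeatable. The tool's actual command-line interface is not documented here, so the executable name and flags below are placeholders; the point is composing a minimal suite list you can re-use for the full run:

```python
def build_test_command(package, suites, output_dir):
    """Compose an argument list for a (hypothetical) test-runner CLI.

    The executable name and flag spellings are placeholders -- substitute
    the real tool's documented options.
    """
    cmd = ["PlatformReadyTestTool.exe", "/package", package, "/out", output_dir]
    for suite in suites:
        cmd += ["/suite", suite]
    return cmd

# Smoke subset: install/uninstall, launch, and a basic functional check.
smoke = build_test_command(
    r"C:\TestDrop\MyApp.msix",
    ["install-uninstall", "launch", "basic-functional"],
    r"C:\TestDrop\SmokeReports",
)
print(" ".join(smoke))
```

The same builder can later produce the full-suite command by passing the complete suite list.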
6. Execute the full test suite
- Start the full run. Monitor progress, keep the test machine powered on, and ensure network connectivity stays stable if any tests require it.
- Avoid other heavy processes that could skew performance or reliability results.
7. Collect and review results
- After completion, open the consolidated report (HTML, XML, or CSV, depending on the tool version).
- Key items to check:
- Pass/fail summary for each suite and test case
- Error and failure traces, stack traces, and screenshots for UI failures
- Installer logs and system event logs
- Performance metrics and any exceeded thresholds
- Export artifacts to a developer-accessible location for debugging.
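Report formats vary by tool version. Assuming an XML report along the lines below (the element and attribute names are invented for illustration), a short script can tally pass/fail counts per suite before you start triage:

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Sample report -- this structure is hypothetical, not the tool's real schema.
SAMPLE = """\
<report>
  <suite name="install-uninstall">
    <test name="clean-install" result="Pass"/>
    <test name="clean-uninstall" result="Pass"/>
  </suite>
  <suite name="runtime-validation">
    <test name="manifest-check" result="Fail"/>
    <test name="api-usage" result="Pass"/>
  </suite>
</report>
"""

def summarize(xml_text):
    """Return {suite name: Counter of 'Pass'/'Fail' results}."""
    root = ET.fromstring(xml_text)
    summary = {}
    for suite in root.findall("suite"):
        counts = Counter(t.get("result") for t in suite.findall("test"))
        summary[suite.get("name")] = counts
    return summary

for name, counts in summarize(SAMPLE).items():
    print(f"{name}: {counts['Pass']} passed, {counts['Fail']} failed")
```

Adapt the element and attribute names to whatever the real report emits; the tallying logic stays the same.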
8. Triage and fix failures
- Prioritize fixes by severity: blocking install/run failures first, then functional, then performance/security warnings.
- Reproduce failures locally and use logs to identify root causes (missing manifests, incorrect API usage, permission issues).
- Implement fixes, update manifests/capabilities, and re-sign packages if necessary.
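The severity ordering above can be applied mechanically when planning the fix list. A minimal sketch, using made-up failure records and category names:

```python
# Severity buckets in fix-first order, matching the guidance above:
# blocking install/run failures first, then functional, then
# performance/security warnings.
PRIORITY = {"install": 0, "runtime": 0, "functional": 1,
            "performance": 2, "security": 2}

failures = [  # hypothetical triage records
    {"test": "perf-cold-start", "category": "performance"},
    {"test": "clean-install", "category": "install"},
    {"test": "save-dialog", "category": "functional"},
]

# Unknown categories sort last rather than raising.
triaged = sorted(failures, key=lambda f: PRIORITY.get(f["category"], 3))
for f in triaged:
    print(f["category"], f["test"])
```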
9. Re-run targeted tests
- Re-run only the affected test cases or suites to validate fixes before running the full suite again.
- Keep iterations small and document changes between runs.
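Given the previous run's results, the suites worth re-running are exactly the ones containing failures. A minimal sketch, assuming results are available as (suite, test, result) tuples:

```python
results = [  # hypothetical results from the previous run
    ("install-uninstall", "clean-install", "Pass"),
    ("runtime-validation", "manifest-check", "Fail"),
    ("runtime-validation", "api-usage", "Pass"),
    ("performance", "cold-start", "Fail"),
]

# Suites with at least one failure -- only these need a targeted re-run.
rerun = sorted({suite for suite, _, result in results if result == "Fail"})
print(rerun)  # ['performance', 'runtime-validation']
```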
10. Final full validation and submission
- When all mandatory tests pass, run a final full-suite validation to confirm stability.
- Save final reports and include them with your submission package if required by Microsoft Partner Center or your distribution channel.
Tips and best practices
- Use virtual machine snapshots to revert to a clean baseline quickly between runs.
- Automate test runs in CI for regression checks on each build.
- Keep detailed change logs mapping code changes to test outcomes.
- Use verbose logging only when diagnosing; normal logging reduces noise during routine runs.
- If encountering ambiguous failures, consult Microsoft Docs and community forums for similar issues and official guidance.
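For the CI tip above, a small gate script can fail the build whenever the report shows failures. Assuming the tool prints (or the report contains) a one-line summary such as "passed=N failed=M" (a hypothetical format; adapt the regex to the real output), the exit-code logic is:

```python
import re

def parse_summary(line):
    """Extract pass/fail counts from a 'passed=N failed=M' summary line.

    The summary format is an assumption -- adjust the pattern to match the
    real report.
    """
    m = re.search(r"passed=(\d+)\s+failed=(\d+)", line)
    return int(m.group(1)), int(m.group(2))

def ci_exit_code(line):
    """Exit code for a CI step: 0 when clean, 1 when any test failed."""
    _, failed = parse_summary(line)
    return 0 if failed == 0 else 1

print(ci_exit_code("passed=42 failed=0"))  # 0: gate passes
print(ci_exit_code("passed=40 failed=2"))  # 1: build should fail
```

Wiring this into Azure Pipelines, GitHub Actions, or Jenkins is then just a matter of running the script as a step and letting its exit code break the build.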
Common troubleshooting checklist
- Installer fails: check digital signature, installer command-line options, and prerequisites.
- App crashes on launch: collect crash dumps, check dependencies and runtime frameworks.
- Permission or capability errors: verify app manifest and declared capabilities.
- Performance flakiness: repeat tests, isolate background services, and increase measurement runs.
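On the performance-flakiness point, repeating the measurement and reporting a robust statistic (the median rather than a single sample) is the usual remedy. A sketch with a stand-in measurement function; in practice you would launch the app and time it, but fixed samples keep the example deterministic:

```python
import statistics

def collect_launch_samples():
    """Stand-in for repeated real timing runs (e.g., app cold-start in ms)."""
    # Hypothetical samples; the outlier simulates a busy test machine.
    return [412, 398, 405, 1220, 401]

samples = collect_launch_samples()
median = statistics.median(samples)
print(f"median launch: {median} ms "
      f"(mean {statistics.mean(samples):.0f} ms is skewed by the outlier)")
```

Comparing the median against your threshold, instead of a single run, makes the pass/fail decision far less sensitive to background noise.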