The first packet we captured after launching Mobile Number Tracker Pro revealed the device’s IMEI sent in a plain HTTP GET request to an analytics domain. No encryption, no warning. That single beacon set the tone for a 72-hour investigation into how this app secures—or ignores—the data it pulls from phones.
Raw Data Collection: What This Tracker Harvests
We installed Mobile Number Tracker Pro (version 4.3.2) on a clean Google Pixel 6 running Android 14. During setup the app requested a list of permissions that went far beyond basic call identification:
- android.permission.READ_CALL_LOG – access to incoming/outgoing call details
- android.permission.READ_CONTACTS
- android.permission.READ_SMS
- android.permission.ACCESS_FINE_LOCATION
- android.permission.RECORD_AUDIO (optional “ambient listening” feature)
On the device, every contact entry was synced to a local SQLite database. The app also registered a notification listener to intercept messages even when SMS permissions were partially restricted, sidestepping the tighter SMS-access rules Android has enforced since version 10. In total, the collector pulled call timestamps, contact names, message bodies, and GPS coordinates, all funneled toward a remote server without a single client-side encryption step.
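You can verify a permission grab like this yourself on a test device. The sketch below parses an excerpt shaped like the output of `adb shell dumpsys package <pkg>`; the excerpt text and formatting here are illustrative, not a literal capture from our run.

```python
import re

# Illustrative excerpt from `adb shell dumpsys package <pkg>` output;
# the exact formatting is hypothetical but mirrors the real structure.
dumpsys_excerpt = """\
    requested permissions:
      android.permission.READ_CALL_LOG
      android.permission.READ_CONTACTS
      android.permission.READ_SMS
      android.permission.ACCESS_FINE_LOCATION
      android.permission.RECORD_AUDIO
    install permissions:
      android.permission.INTERNET: granted=true
"""

# Pull out every bare "requested permission" line.
requested = re.findall(r"^\s+(android\.permission\.\w+)$", dumpsys_excerpt, re.M)
print(requested)
```

Any entry from the dangerous-permission group (call log, SMS, fine location, microphone) appearing together in one list is a strong signal to inspect the app more closely.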
Call and SMS Interception
Instead of relying solely on the Telephony API, the tool installs a notification listener service explicitly pitched as a “battery optimisation”. Once granted, it silently reads incoming notifications from WhatsApp, Telegram, and Signal previews—data that gets shipped off as plaintext JSON in the next sync cycle.
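Auditing which apps hold notification access on a device is straightforward: `adb shell settings get secure enabled_notification_listeners` returns a colon-separated list of components. The snippet below parses a sample of that output; the component names are illustrative, not the tracker's real identifiers.

```python
# Sample output of:
#   adb shell settings get secure enabled_notification_listeners
# Package/component names below are illustrative.
raw = (
    "com.example.tracker/com.example.tracker.SyncListenerService:"
    "com.android.systemui/.media.MediaListener"
)

# Each entry is "<package>/<service class>"; keep just the package names.
listener_packages = [entry.split("/")[0] for entry in raw.split(":") if entry]
print(listener_packages)
```

Any unfamiliar package in that list can read every notification preview on the device, including messages from end-to-end encrypted apps.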
Data in Transit: A Packet-Level Inspection
We proxied all device traffic through mitmproxy 10.1.6 after loading a custom CA certificate. The app connected to two main endpoints: api.mobilenumbertrackerpro.com and cdn.stats-track.com. The initial TLS handshake negotiated TLS 1.2 with the cipher suite TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256. No certificate pinning was implemented: once the user-added CA was trusted, the proxy decrypted every request without resistance, falling short of OWASP MSTG‑NETWORK‑4.
Inside the decrypted JSON payload we found phone numbers, contact names, full SMS bodies, and real‑time GPS coordinates. One analytics beacon, sent during the first launch, used unencrypted HTTP to transfer the IMEI, Google Advertising ID, and model number as query‑string parameters. That falls well short of the industry expectation of TLS 1.3 with certificate transparency for any app handling personal communication data.
| Data Type | Transmission Protocol | Encryption Detail |
|---|---|---|
| Call logs, SMS, location | HTTPS (custom API) | TLS 1.2, AES‑128‑GCM, no pinning |
| IMEI, Ad ID, device model | HTTP (analytics beacon) | None |
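To make the HTTP beacon concrete: anyone on the same network path can read those query strings with nothing more than a URL parser. The URL below mirrors the shape of the captured beacon, but the path and all identifier values are fabricated for illustration.

```python
from urllib.parse import urlsplit, parse_qs

# A beacon shaped like the one captured on first launch;
# the /v1/beacon path and all values are fabricated.
beacon = (
    "http://cdn.stats-track.com/v1/beacon"
    "?imei=356938035643809"
    "&gaid=38400000-8cf0-11bd-b23e-10b96e40000d"
    "&model=Pixel+6"
)

# Flatten the single-valued query parameters into a plain dict.
params = {k: v[0] for k, v in parse_qs(urlsplit(beacon).query).items()}
print(params)
```

No decryption step is involved anywhere: the identifiers ride in the clear, which is exactly why hardware IDs over plain HTTP are treated as an automatic finding in any mobile audit.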
Data at Rest on the Device: A Forensic Goldmine
We extracted the app’s data directory using adb backup -f tracker.ab and converted it with Android Backup Extractor. Inside, the file tracker.db stored synced SMS messages and call logs in completely unencrypted SQLite tables. The message_body column was readable with any off‑the‑shelf database browser.
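"Readable with any off-the-shelf database browser" is no exaggeration: an unencrypted SQLite file yields its rows to a plain SELECT. The sketch below rebuilds a simplified version of the layout in memory; message_body is the column name from our extraction, while the table name and other columns are simplified assumptions.

```python
import sqlite3

# Recreate a simplified, unencrypted layout like tracker.db's.
# Only message_body is a column name taken from the actual extraction;
# the table name and sender column are assumptions for the demo.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE messages (sender TEXT, message_body TEXT)")
db.execute("INSERT INTO messages VALUES (?, ?)", ("+15550001111", "meet at 6pm"))

# No key, no passphrase: a plain SELECT recovers every message.
rows = db.execute("SELECT sender, message_body FROM messages").fetchall()
print(rows)
```

With SQLCipher or an Android Keystore-wrapped key, that same query against the raw file would return ciphertext; the app opted for neither.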
A separate SharedPreferences XML contained the user’s login password under the key "pass" in plain text. The app made no use of Android Keystore or the AES‑256‑GCM hardware‑backed encryption recommended by the OWASP Mobile Security Testing Guide (MSTG‑STORAGE‑1). A lost or stolen phone, even one with screen lock enabled, would leak all tracked data to anyone who can pull a backup or access the file system via a mobile forensic tool.
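SharedPreferences files are ordinary XML, so recovering that password takes a few lines. The file below reproduces the shape of what we extracted; the email key and values are illustrative, while the plain-text "pass" entry matches the finding.

```python
import xml.etree.ElementTree as ET

# Shape of the extracted SharedPreferences file; surrounding keys and
# values are illustrative, the plain-text "pass" key matches the finding.
prefs_xml = """<?xml version="1.0" encoding="utf-8"?>
<map>
    <string name="user_email">victim@example.com</string>
    <string name="pass">hunter2</string>
</map>"""

root = ET.fromstring(prefs_xml)
password = next(e.text for e in root.iter("string") if e.get("name") == "pass")
print(password)
```

The correct pattern is to keep only a Keystore-wrapped ciphertext in preferences (or use EncryptedSharedPreferences), so a file pulled from a backup decodes to nothing useful.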
Server‑Side Storage and Jurisdiction
Response headers indicated the backend runs on AWS in the US‑East‑2 (Ohio) region. The privacy policy (snapshot from 2024‑09‑15) says data is held “as long as necessary to provide the service” but nowhere defines a maximum retention period. When we requested account deletion, the profile was deactivated immediately; however, the same data remained accessible through the API for 30 days, returning results to authenticated calls. No hard‑deletion confirmation was provided, and no self‑service export or purge button exists.
Server‑side encryption uses AES‑256‑GCM, but the master key lives inside the provider’s AWS Key Management Service, accessible by the application logic. This is not a zero‑access architecture—the company can decrypt every piece of stored data on demand. Because the servers are located in the United States, the CLOUD Act applies: law enforcement can obtain stored communications with a simple subpoena or court order, and the provider holds the keys.
Third‑Party Sharing: The “No Sharing” Claim
During live traffic analysis we observed data dispatched to graph.facebook.com (Facebook App Events), firebaseinstallations.googleapis.com, and app-measurement.com (Google Analytics for Firebase). The payloads contained device timestamps, model identifiers, and hashed phone numbers: a bare SHA‑256 of the target number, trivially reversible because the phone‑number keyspace is small enough to enumerate or precompute.
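Why an unsalted hash of a phone number is no protection at all: the candidate space is tiny, so an attacker just hashes numbers until one matches. The demo below scans a deliberately narrow range to stay fast; the "captured" number is fabricated.

```python
import hashlib

# The "hashed" number as it appeared in a payload (value fabricated here).
captured = hashlib.sha256(b"+15551234567").hexdigest()

# Phone numbers occupy a tiny keyspace, so brute enumeration inverts the
# hash. We scan a narrow range around the target to keep the demo quick;
# a real attacker would sweep whole area codes.
recovered = None
for n in range(1_230_000, 1_240_000):
    candidate = f"+1555{n:07d}".encode()
    if hashlib.sha256(candidate).hexdigest() == captured:
        recovered = candidate.decode()
        break
print(recovered)
```

Even a full ten-digit sweep is on the order of 10^10 SHA-256 calls, which is hours of work on commodity GPUs; "hashed" here means obfuscated, not anonymized.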
The privacy policy mentions sharing with “trusted partners” for analytics and advertising. That directly contradicts the marketing phrase “we never share your data” plastered on the app’s landing page. The Terms of Service attempt to soften it with “aggregated data,” but what we captured were per‑user events tied to a unique installation ID.
Account Security: A Padlock Made of Paper
Creating an account required only an email and a password. There was no two‑factor authentication, no biometric fallback, and no password complexity check—“123456” was accepted. Upon login, the server returned a JSON Web Token (JWT) with a 30‑day expiry, stored in plaintext right next to the password mentioned earlier. No email notification alerted us when logging in from a new device.
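A JWT's claims are only base64-encoded, so anyone who lifts the stored token from disk can read its subject and expiry with no key at all. The token below is fabricated but has the same three-part structure as the one the app stored.

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode a JWT's payload segment WITHOUT verifying the signature."""
    body = token.split(".")[1]
    body += "=" * (-len(body) % 4)  # restore the stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(body))

# A fabricated token with the same structure as the one stored on disk.
payload = base64.urlsafe_b64encode(
    json.dumps({"sub": "user-42", "exp": 1735689600}).encode()
).rstrip(b"=").decode()
token = "eyJhbGciOiJIUzI1NiJ9." + payload + ".fake-signature"

print(jwt_claims(token))
```

A 30-day bearer token sitting in plaintext next to the account password means one file read yields a month of unattended API access.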
We ran a credential‑stuffing simulation: the API rate‑limited requests to 5 attempts per minute per IP address, but it never locked the account after repeated failures. Combined with the absence of 2FA, a leaked password list would give an attacker full access to the dashboard and all tracked communications. OWASP MASVS authentication requirements call for short‑lived tokens and secure token storage, neither of which we observed.
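The per-IP rate limit sounds protective until you do the arithmetic: without an account lockout, the limit scales linearly with the attacker's proxy pool. The pool size below is an assumption for illustration.

```python
# Back-of-envelope throughput under the observed limits:
# 5 attempts/minute/IP, no account lockout.
attempts_per_ip_per_day = 5 * 60 * 24   # per-IP daily budget

# The proxy-pool size is an assumption; residential proxy services
# rent far larger pools than this.
proxy_ips = 100
total_per_day = attempts_per_ip_per_day * proxy_ips
print(attempts_per_ip_per_day, total_per_day)
```

At 720,000 guesses per day against a "123456"-tolerant password policy, most leaked-credential lists would be exhausted within hours.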
Risk Assessment: When the “Pro” Fails You
The most dangerous scenario exploits all these gaps at once. One partner in a divorce installs the app on the other's phone, assuming it's secure. An attacker cracks the weak password via credential stuffing, recovers a still‑valid 30‑day JWT from an old backup, and polls the API for real‑time location and messages. Meanwhile, the tracked person's data sits on a US‑based server where a subpoena can expose it without their knowledge. On the device itself, the plaintext database becomes a souvenir for anyone who handles the phone.
No end‑to‑end encryption, no pinning, no adherence to Android Keystore best practices, and a privacy policy that permits third‑party analytics while claiming otherwise—these aren’t edge cases. They represent the default operational mode of Mobile Number Tracker Pro. For any compliance‑conscious organization or individual, relying on this tool for sensitive monitoring is equivalent to storing confidential files in a glass box with a sticky note for the password.