Most IPTV review sites run a single-afternoon test, then publish copy-paste reviews built from provider marketing materials. That's not testing; that's rewriting. We don't work that way, and we want you to know exactly how our process differs.
The ninety-day rule
No provider appears in our rankings until it has been under active test for at least 90 consecutive days. This is non-negotiable. A service that works brilliantly for a week often falls apart in the second month as server load grows or the provider takes on too many new subscribers. We've watched it happen dozens of times.
The seven-criteria scoring system
Every provider is scored against seven criteria, each with a fixed weight. The final score is the weighted sum of the criterion scores, displayed as X.X out of 10; a minimal sketch of the arithmetic follows the table.
| Criterion | Weight | How We Measure It |
|---|---|---|
| Stream Reliability | 25% | Continuous uptime probe every 5 min for 90 days. Downtime tracked in seconds. |
| Channel Count & Quality | 20% | Automated enumeration of advertised channels. Live connection test on each. |
| Picture Quality | 15% | Measured bitrate at peak hours. Spot checks for upscaling. |
| Device Compatibility | 15% | Functional testing on Fire Stick, Nvidia Shield, Smart TVs, iOS, Windows. |
| Customer Support | 10% | Six tickets per provider. Response time and resolution quality scored. |
| Value for Money | 10% | Price normalized to per-channel and per-feature metrics. |
| EPG & Interface | 5% | Program guide accuracy over 7 days. Interface navigability scored by two testers independently. |
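To make that concrete, here is a minimal sketch of the weighted-sum arithmetic. The weights come from the table above; the per-criterion scores in the example are hypothetical.

```python
# Minimal sketch of the weighted scoring arithmetic described above.
# Weights are from the table; the example scores are hypothetical.

WEIGHTS = {
    "stream_reliability": 0.25,
    "channel_count_quality": 0.20,
    "picture_quality": 0.15,
    "device_compatibility": 0.15,
    "customer_support": 0.10,
    "value_for_money": 0.10,
    "epg_interface": 0.05,
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine per-criterion scores (each 0-10) into a final X.X out of 10."""
    assert set(scores) == set(WEIGHTS), "every criterion must be scored"
    total = sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)
    return round(total, 1)

# Hypothetical provider: strong reliability, middling support.
example = {
    "stream_reliability": 9.2,
    "channel_count_quality": 8.5,
    "picture_quality": 8.0,
    "device_compatibility": 9.0,
    "customer_support": 7.0,
    "value_for_money": 8.0,
    "epg_interface": 7.0,
}
print(weighted_score(example))  # -> 8.4
```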
The test setup
We maintain a dedicated testing rig in a residential setting with average household network conditions. This matters — a provider that works on a commercial-grade 1 Gbps connection might collapse on a typical 50 Mbps home connection.
- Primary device: Amazon Fire TV Stick 4K Max (the device most users will have)
- Performance device: Nvidia Shield Pro (to confirm whether a problem is the provider's fault or a bottleneck on the lower-powered Fire Stick)
- Smart TV: Samsung QLED with Tizen OS (tests built-in IPTV app compatibility)
- Mobile: iPhone 15 with GSE Smart IPTV
- PC: Windows 11 workstation with VLC and TiviMate
- Internet: 500 Mbps residential fiber (representative of a decent but not excessive home connection)
Uptime measurement
This is where most "testing" sites fudge their numbers. We wrote a simple Python probe that every 5 minutes (288 times per day) attempts to fetch a 2-second video segment from a representative channel on each provider. A successful fetch means the service is up. A failed fetch (timeout, 404, or garbage data) means it's down.
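Our production probe has more plumbing, but a stripped-down sketch of the idea looks like this. The segment URL and log path are placeholders, and the minimum-size check stands in for a real garbage-data test.

```python
# Sketch of the 5-minute uptime probe, simplified from the process
# described above. SEGMENT_URL is a placeholder, not a real channel.
import csv
import time
from datetime import datetime, timezone

import requests  # assumed dependency; any HTTP client would do

SEGMENT_URL = "https://provider.example/live/channel42/segment.ts"  # placeholder
INTERVAL_SECONDS = 300  # every 5 minutes -> 288 probes per day
MIN_BYTES = 10_000      # heuristic: a real 2-second segment is not tiny

def probe_once(url: str) -> bool:
    """Return True if the provider served a plausible video segment."""
    try:
        resp = requests.get(url, timeout=10)
        return resp.status_code == 200 and len(resp.content) >= MIN_BYTES
    except requests.RequestException:  # timeout, DNS failure, reset, ...
        return False

with open("uptime_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        ok = probe_once(SEGMENT_URL)
        writer.writerow([datetime.now(timezone.utc).isoformat(), int(ok)])
        f.flush()
        time.sleep(INTERVAL_SECONDS)
```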
Over 90 days that's 25,920 probe attempts per provider. We record the percentage of successful fetches and display it as measured uptime. If a provider claims 99.9% uptime and our probe measures 97.2%, we publish our number. The discrepancy usually comes from load-balancing failures that affect some subscribers but not others.
We also track the distribution of outages — one 4-hour outage is better than eight 30-minute outages at the same total downtime, because the second pattern suggests ongoing instability.
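Turning 25,920 rows of probe results into the numbers we publish is simple bookkeeping. A sketch, assuming the CSV log format from the probe above:

```python
# Sketch: compute measured uptime and outage runs from the probe log.
import csv

def summarize(log_path: str, interval_minutes: int = 5):
    results = []
    with open(log_path, newline="") as f:
        for _ts, ok in csv.reader(f):
            results.append(ok == "1")

    uptime_pct = 100.0 * sum(results) / len(results)

    # Group consecutive failures into outage events, so that one 4-hour
    # outage and eight 30-minute outages are distinguishable even though
    # both total 4 hours of downtime.
    outages = []
    run = 0
    for ok in results:
        if not ok:
            run += 1
        elif run:
            outages.append(run * interval_minutes)
            run = 0
    if run:
        outages.append(run * interval_minutes)

    return uptime_pct, outages

uptime, outages = summarize("uptime_log.csv")
print(f"measured uptime: {uptime:.1f}%  outage events: {len(outages)}")
```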
Channel count verification
"40,000 channels" is meaningless if a third of them are dead. We enumerate the full channel list, make a live connection attempt to each, and report the percentage that actually delivers video. We also sample the picture quality of a representative 50 channels per provider to catch services that advertise HD but deliver 480p.
Customer support testing
We open six support tickets per provider during testing, covering:
- Two genuine issues we encounter during testing
- Two synthetic beginner questions ("how do I install the app?")
- One cancellation inquiry
- One technical question about server regions or protocol support
We score response time, accuracy, and whether the agent resolves the issue or escalates to a black hole.
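The bookkeeping behind that score is straightforward. Here is a sketch of how six tickets might roll up into a single support number; the field names and point split (speed 0-4, accuracy 0-3, resolution 0-3) are illustrative, not our exact internal rubric.

```python
# Sketch: roll support tickets up into a single 0-10 support score.
# Field names and the point split are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Ticket:
    hours_to_first_reply: float
    accurate: bool  # did the answer actually address the question?
    resolved: bool  # fixed, or escalated into a black hole?

def support_score(tickets: list[Ticket]) -> float:
    """Average per-ticket score: speed (0-4) + accuracy (0-3) + resolution (0-3)."""
    per_ticket = []
    for t in tickets:
        speed = (4.0 if t.hours_to_first_reply <= 1
                 else 2.0 if t.hours_to_first_reply <= 24
                 else 0.0)
        per_ticket.append(speed + 3.0 * t.accurate + 3.0 * t.resolved)
    return round(sum(per_ticket) / len(per_ticket), 1)

tickets = [
    Ticket(0.5, True, True),    # quick, correct, resolved
    Ticket(30.0, True, False),  # slow, correct, escalated and dropped
    # a real test run scores all six tickets
]
print(support_score(tickets))  # -> 6.5
```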
What we don't do
- We don't accept press packs or provider-supplied accounts. We pay for our own subscriptions with our own money.
- We don't accept payment for placement. Several providers have offered. We've declined every one.
- We don't let providers pre-review our reviews. They see the review when it's published, same as you.
- We don't review services with less than 24 months of operating history. Too risky for readers.
Re-testing cadence
Every provider in our rankings gets re-tested quarterly. A service that slips in performance gets re-ranked within 14 days of a confirmed regression. A service that improves also gets a re-rank. Rankings are not static — the market moves fast and we try to move with it.
Raw data availability
We publish the uptime logs and channel count data for each provider alongside their full review. If you want to verify our numbers, the data is right there. This is unusual for review sites — most don't publish raw data because they don't have any.
Questions?
If you have questions about our methodology, or you want us to re-test a specific provider you think we've misjudged, email editors@streamreviewhq.com. We read every email.
Start with our current rankings to see the methodology applied.