Every product reviewed on The Tested Hub is independently purchased, tested for a minimum of 30 days, and graded against measurable criteria. We accept no payment for reviews and no review samples from manufacturers. Our income comes solely from affiliate commissions, fully disclosed on every page.
1. Research & Shortlist
Before we touch a single product, we identify the candidates. For each category, we analyze 50+ products on Amazon, narrow to the top 10 contenders based on user reviews, professional press coverage, and reported issues, and then narrow again to the 3–5 we'll actually test.
Our shortlists are biased toward what people actually buy — not what manufacturers want us to feature. If a $79 contender has 12,000 reviews and a $399 flagship has 240, we test both.
2. Buy Like a Customer
We purchase every product at retail, from Amazon, with the same checkout flow you'd use. We never accept manufacturer-provided units, "review samples," or PR loaners. This means we pay full retail, and if a product breaks during testing, that's our problem (and a useful data point).
This policy exists because review samples are typically hand-picked by manufacturers from the best units off the line. A retail unit is far more representative of typical production quality, flaws included. We want to test what you'll get.
3. Lab + Real-World Testing
Each product spends a minimum of 30 days in our testing rotation. For headphones and audio gear, that includes:
- Standardized lab tests on calibrated equipment (B&K Type 4128-C HATS for ANC and frequency response measurements, calibrated dB meter for noise attenuation).
- Real-world wear tests across daily commutes, gym sessions, flights, and home use.
- Battery measurement under controlled conditions: 50% volume, ANC on, AAC codec, with each run repeated 3 times to confirm consistency.
- Comparison testing against the top 3 alternatives, switching between products on the same source material.
For other categories (vacuums, kitchen appliances, smart home devices), the protocol is adapted but the principle is the same: standardized + real-world, minimum 30 days.
4. Cross-Editor Review
Two editors independently rate every product before publication. If their scores diverge by more than 0.5 (on our 5-point scale), a third editor breaks the tie. This catches biases — both positive and negative — that any single reviewer can develop after weeks with a product.
Section ratings (sound quality, comfort, battery, etc.) are similarly reconciled. This is why our scores rarely cluster at the high end: real disagreement keeps the bar honest.
5. Long-Term Updates
A review isn't done at publication. We re-test every product on our recommended list at least once per year, and we update reviews immediately when:
- A new generation is released (we test the upgrade and update the comparison).
- A firmware update changes performance (we re-test the affected metrics).
- A recall, defect, or durability issue emerges in our long-term unit.
- The retail price changes meaningfully.
Every review carries a visible "Updated" date and an update log at the bottom. If you're reading a 2024 review in 2026, you'll know exactly what changed and when.
What we don't do
Some practices common in our industry that you won't find here:
- Sponsored reviews. Period. We've turned down meaningful money to keep this rule.
- Affiliate-link-driven verdicts. Our writers don't see commission data while writing. We have a written editorial firewall.
- Manufacturer review samples. See Step 2 above.
- "Best of" lists where everything is recommended. If we can't honestly recommend at least one product in a category, we don't publish a "best of" list for it.
- Stock product images. Every product photo on this site is shot by our team, in-house. Look for the "in-house photo" caption.
Questions about a specific review?
Email editorial@thetestedhub.com. We read everything and reply within 5 business days.