How is HostScore Calculated?
HostScore™ is a proprietary scoring system we devised for a more holistic approach to web hosting reviews. It is designed to give users a reliable, easy-reference score based on consistent results rather than single-point-in-time snapshots.
The final HostScore is a weighted sum of factors that heavily influence the quality of a web host: Uptime Score, Speed Score, Editor’s Score, and User’s Score.
By running ongoing automated tests, HostScore factors in real data collected over time, giving a more accurate picture of how well a hosting service actually performs.
Each of the four core components influences the final tally differently. Based on our experience in the industry, we have adjusted the weight of each scoring factor to emphasize what an ideal host should deliver.
As a formula, this would be shown as:
HostScore = (0.40 * A) + (0.30 * B) + (0.20 * C) + (0.10 * D)
where the components are:
- Uptime Score (A) – 40% weight
- Speed Score (B) – 30% weight
- Editor’s Score (C) – 20% weight
- User’s Score (D) – 10% weight
Note: The HostScore model was updated in January 2020 to give more weight to the Uptime Score.
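As a rough sketch, the weighted sum above can be computed like this in Python (the assumption that all four components share a common 0–10 scale is ours; the methodology does not state the scale):

```python
def host_score(uptime: float, speed: float, editor: float, user: float) -> float:
    """Weighted sum of the four HostScore components.

    Weights follow the published formula: uptime 40%, speed 30%,
    editor 20%, user 10%. The 0-10 scale is an assumption.
    """
    return 0.40 * uptime + 0.30 * speed + 0.20 * editor + 0.10 * user

# A host with strong uptime but middling user feedback:
print(round(host_score(9.5, 8.0, 7.0, 6.0), 2))  # 8.2
```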
1. Uptime Score
- “Uptime” is the measurement of service availability.
- 0.1% of downtime is equivalent to approximately 8 hours and 45 minutes a year.
- The primary uptime monitoring location is in the United States. If a test from that location fails, the status will be verified from other locations.
“Uptime” is one of the most vital contributors to the quality of a web hosting service. It is the proportion of time during which a web hosting service is accessible. Having the fastest server in the world means little unless the server can be reached.
The primary uptime monitoring location is in the United States. If a test from that location fails, the status is verified from other locations, including Germany, the United Kingdom, Singapore, Japan, the Netherlands, Australia and Brazil.
Website owners should be aware that uptime percentages always sound better than the underlying figure. For example, a 99% uptime guarantee still allows as much as 7.2 hours of downtime each month.
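To make the conversion concrete, here is a minimal sketch of the percentage-to-downtime arithmetic (the 720-hour month and 8,760-hour year used below are the usual conventions, not figures from the methodology):

```python
def downtime_hours(uptime_percent: float, period_hours: float) -> float:
    """Hours of downtime permitted by an uptime percentage over a period."""
    return (100.0 - uptime_percent) / 100.0 * period_hours

# 99% uptime over a 720-hour (30-day) month -> 7.2 hours of downtime
print(round(downtime_hours(99.0, 720), 2))   # 7.2
# 99.9% uptime over an 8,760-hour year -> 8.76 hours (about 8h 45m)
print(round(downtime_hours(99.9, 8760), 2))  # 8.76
```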
With this in mind, HostScore puts increased emphasis on uptime in the final score calculation. Uptime scores are based on monitoring data, with final internal adjustments for accuracy.
2. Speed Score
- “Speed” is the measurement of the time a web host takes to respond.
- We measure speed in milliseconds (ms) – the lower the number, the faster the website.
- Speed is tested every 4 hours from 10 locations around the world.
- The speed published in the table is the average of the fastest and slowest server response times.
- Different speed benchmarks are used for Shared and VPS hosting. We expect websites hosted on VPS plans and Website Builders to load more quickly.
Important note on server speed: Server response speed is the time taken to reply to a request and should not be confused with website speed.
“Speed” is the measurement of how long it takes a host to acknowledge a TCP connection. We measure this in milliseconds (ms), a number which ideally should be as low as possible.
Since DNS resolution and SSL negotiation are part of the TCP acknowledgement time, all our test sites are served over HTTPS (certificates issued by Zero SSL or Let’s Encrypt) and use their respective host’s name servers.
Speed is measured three times in every four-hour cycle, and the fastest reading from each of the ten locations is used. Detailed speed information for each country is available on individual hosting review pages.
We expect servers on VPS plans to perform better than those on Shared hosting plans due to the discrepancy in pricing and equipment.
Based on our calculations, adjustments are made automatically in order to more accurately represent the performance expectations of shared and VPS servers.
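As an illustration of how the published table figure comes together, here is a sketch that averages the fastest and slowest readings (the location labels and millisecond values below are invented for the example, not real measurements):

```python
# Hypothetical fastest readings (ms) from ten test locations;
# the labels and values here are illustrative only.
readings = {"US": 85, "DE": 190, "UK": 175, "SG": 310, "JP": 295,
            "NL": 180, "AU": 350, "BR": 330, "CA": 120, "IN": 340}

fastest = min(readings.values())
slowest = max(readings.values())
published_speed = (fastest + slowest) / 2  # fastest/slowest average, per the table rule
print(published_speed)  # 217.5
```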
Servers that respond in under 220ms are considered good, while anything between 221ms and 600ms is reasonable. Any server response above 600ms will result in a failed Google PageSpeed audit.
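The thresholds above can be sketched as a simple bucketing function (the labels "good", "reasonable" and "poor" are our shorthand for the three bands, not official categories):

```python
def speed_rating(response_ms: float) -> str:
    """Bucket a server response time using the thresholds above."""
    if response_ms <= 220:
        return "good"
    if response_ms <= 600:
        return "reasonable"
    return "poor"  # above 600 ms fails a Google PageSpeed audit

print(speed_rating(180))  # good
print(speed_rating(450))  # reasonable
print(speed_rating(750))  # poor
```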
3. Editor’s Score
- “Editor’s Score” is based on other elements evaluated during the review process.
- Editor’s score considers features, customer support, value, and company reputation.
There is more to web hosting than uptime and speed, key as those are. We therefore also factor in the findings of the individual reviewers who assess each web host against other criteria.
There are a range of factors which influence an editor’s scoring decision. This includes items such as how streamlined the onboarding experience is, whether customer support is quick and useful, what features might be present on certain hosts, or even the finer details of the company’s terms of service.
The Editor’s Score contributes 20% to the final total of HostScore.
4. User’s Score
- “User’s Score” is based on reviews and feedback submitted to HostScore.net.
- User’s Score is sorted by the Wilson Score Interval – the more user input we have, the more accurate the sorting becomes.
To include the opinions of users in our scoring system, we give you the chance to weigh in on each host with a simple yes/no rating, representing either a positive or a negative view of the host.
The User’s Score is sorted by the Wilson Score Interval, using the lower bound of a 95% confidence interval. The method is proven in the upvote/downvote systems that abound on the Internet. Your feedback is vital to the integrity of our scoring system, so please rate the web hosts!
How Frequently is HostScore Updated?
- Speed and uptime data are updated daily at 00:00 UTC.
- HostScore is calculated weekly and updated every Sunday at 00:00 UTC.
- Monthly HostScore is updated and published on the first day of the month at 00:00 UTC.
Known Limitations of Our Methodology
- We acknowledge that a truly stable scoring system would require a larger number of user reviews.
- We do not own all the websites used in our tests, which limits our control over the test environment. Differences in variables such as SSL providers may have a slight effect on test results.
- Current speed tests are from 10 locations only and do not include any from the African continent.
- Speed test results include unavoidable network latency. Even with tests from 10 locations, some servers will have a small advantage due to greater proximity to particular test nodes.
- If a monitored site implements changes, up to 30 days of test data may be slightly inaccurate.
- Speed tests monitor TCP connection times, not web page load times.