A PC Benchmark
Measure the genuine performance of your system with professional quality.
Reliable Time Measurement
Benchmarking is in a difficult spot right now.
If you have seen a PC benchmark world record, advertised performance numbers or casual benchmark results on the internet, chances are you saw just a single number, a chart or a screenshot. Is the number correct? Which Windows version was used? What was the CPU temperature? Was the time measured correctly? Even if some additional information is provided, you have to trust the source for the validity of the result, because you simply do not have enough data to understand, validate and reproduce it yourself.
A big improvement came with the advent of automated result uploads to online platforms. The time-consuming and error-prone form submission could finally be done with a single click and an internet connection. But to this day only a few benchmarks (like our own GPUPI) provide this functionality. The implementation also depends heavily on the knowledge and maintenance of the benchmark developer and therefore varies in security, reliability and, ultimately, quality. In addition, all benchmarks need to be updated regularly to provide up-to-date hardware detection for proper submission. While this was an important step for benchmarking, it still does not provide a common denominator for comparing results from different online platforms and rankings.
BenchMate is a common result standard.
It defines a well-considered minimum of security and timer reliability that applies to all benchmarks, yet is far more advanced than anything currently available in this industry. Benchmarks are guarded by a combination of a custom driver, two background services, the BenchMate launcher and the client to protect against malicious access as well as unknowingly skewed results.
BenchMate monitors the benchmark’s process closely to enable precise time measurement through low-level access to the system’s timer hardware. Every run is measured with up to three independent timers, enforcing a maximum deviation of only 0.01% and less than 10 milliseconds. We also validate all files and platform parameters for the benchmark and its workloads automatically to prevent tampering with executables, libraries and resource files. The benchmark executable, its executed machine code and the application’s logic remain untouched throughout the whole process.
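The cross-check between independent timers can be illustrated with a short sketch. Note that this is only an illustration of the principle, not BenchMate's implementation: the real software reads hardware timers directly, while Python's clock functions stand in for them here, and the function name `measure_with_cross_check` is our own.

```python
import time

def measure_with_cross_check(workload, max_rel_deviation):
    """Time `workload` with three independent clocks and cross-check them.

    Returns the elapsed time from the highest-resolution clock plus the
    worst relative disagreement between the clocks. A real implementation
    would read hardware timers (TSC, HPET, ACPI PM timer) directly;
    Python's clock functions merely illustrate the principle.
    """
    starts = (time.perf_counter(), time.monotonic(), time.time())
    workload()
    ends = (time.perf_counter(), time.monotonic(), time.time())

    elapsed = [end - start for start, end in zip(starts, ends)]
    reference = elapsed[0]  # perf_counter: highest resolution
    deviation = max(abs(t - reference) / reference for t in elapsed)

    if deviation > max_rel_deviation:
        raise RuntimeError(
            f"timer disagreement of {deviation:.4%} exceeds the "
            f"allowed {max_rel_deviation:.4%} -- run rejected"
        )
    return reference, deviation

# BenchMate's hardware timers allow a 0.01% threshold; Python's
# clocks are coarser, so this demo uses a looser 1% bound.
elapsed, deviation = measure_with_cross_check(lambda: time.sleep(0.2), 0.01)
print(f"elapsed: {elapsed:.4f} s, worst deviation: {deviation:.4%}")
```

The key idea is that a run is rejected outright when the clocks disagree beyond the threshold, rather than silently trusting a single timer that may have been skewed.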
Last but not least, BenchMate captures the result automatically before it is displayed by the benchmark and stores it in a secure location. Hence we can provide an easy and unified workflow even for unmaintained legacy benchmarks.
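The file validation mentioned above boils down to comparing cryptographic hashes against a known-good manifest. Here is a minimal Python sketch of that idea; the manifest format and the `bench.dll` file name are our own illustration, not BenchMate's actual mechanism.

```python
import hashlib
from pathlib import Path

def sha256_of(path):
    """Stream a file through SHA-256 so large binaries fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def validate_files(manifest):
    """Return the files that are missing or differ from their known-good hash.

    `manifest` maps file paths to expected SHA-256 digests, e.g. built
    once from a pristine benchmark installation.
    """
    return [
        path
        for path, expected in manifest.items()
        if not Path(path).is_file() or sha256_of(path) != expected
    ]

# Hypothetical example: "bench.dll" stands in for a benchmark library.
Path("bench.dll").write_bytes(b"trusted benchmark code")
manifest = {"bench.dll": sha256_of("bench.dll")}
print(validate_files(manifest))   # untouched file: []
Path("bench.dll").write_bytes(b"patched benchmark code")
print(validate_files(manifest))   # tampered file: ['bench.dll']
```

Checking every executable, library and resource file before a run means a patched binary is flagged before it can produce a skewed result.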
Our primary goal is to regain trust in benchmark results.
HWBOT, the world’s biggest result database, has tackled these problems for many years to provide high-quality ranked results from various, technically diverse benchmarks. With an extensive, up-to-date rule book and an elaborate manual validation process, the moderators of HWBOT invest a lot of time to provide the overclocking community with a fair playground for competitive benchmarking. Since June 2019 we have joined the fight and developed a new benchmarking workflow that directly integrates result submission to HWBOT.
But this is not only a problem of competitive benchmarking. It also concerns advertised results from vendors and PC builders, data shown in reviews and tech videos, and results posted casually around the globe. Shouldn’t they be reliable, comparable, available for analysis and repeatable to be trusted?
The next steps will integrate more benchmarks, including 3D game benchmarks, and introduce a custom-made validation platform with lots of data to analyze. Stay tuned!
This software will always be free for the benchmarking community around the globe. Therefore we rely on crowdfunding to cover ongoing certificate and licensing costs and to allow us to dedicate more time to bug fixes and the development of brand-new features. Supporters will get access to all alpha and beta versions!
With your help we can define a future-proof validation standard for benchmarking!