How to define a minspec

So, it’s time to think about properly defining hardware specs once again, and I realize that over the years this has become increasingly difficult to do. The multiplicity of platforms complicates things a lot, as does the ever-moving target of latest-and-greatest hardware. Compounding this is the very long lifespan of CPUs – or, to be more exact, of CPU names: the first Intel i3 was released 13 years ago, but chips are still being sold under that name, so a modern i3-12100 could be more than three times faster than an older i3-4005.

So, what I’m leaning towards is picking some kind of benchmark, ideally from a site that provides lots and lots of scores. Instead of specifying a minspec model, you’d specify a minspec CPU and GPU benchmark score and then see if your physical test hardware is in the right ballpark.
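
To make that concrete, here’s a minimal sketch (in Python) of what a score-based minspec check might look like. The score fields, threshold numbers, and tolerance band are all hypothetical – the actual scores would come from whichever benchmark site you settle on.

```python
# Minimal sketch of a benchmark-score-based minspec check.
# All numbers here are made up; real scores would come from
# whatever benchmark database you pick.

from dataclasses import dataclass


@dataclass
class MinSpec:
    cpu_score: int  # minimum acceptable CPU benchmark score
    gpu_score: int  # minimum acceptable GPU benchmark score


def in_ballpark(measured: int, target: int, tolerance: float = 0.15) -> bool:
    """True if a measured score is within `tolerance` below the target,
    so test hardware slightly under the line still counts as representative."""
    return measured >= target * (1.0 - tolerance)


def machine_matches(minspec: MinSpec, cpu: int, gpu: int) -> bool:
    """Check whether a physical test machine roughly represents the minspec."""
    return in_ballpark(cpu, minspec.cpu_score) and in_ballpark(gpu, minspec.gpu_score)


# Declare the minspec as scores rather than model names (hypothetical values).
minspec = MinSpec(cpu_score=6000, gpu_score=4000)
print(machine_matches(minspec, cpu=5800, gpu=4100))  # True: within the 15% band
print(machine_matches(minspec, cpu=3500, gpu=4100))  # False: CPU is way under
```

The point of the tolerance band is that you’ll almost never find test hardware that exactly matches a declared score, so “in the right ballpark” has to be an explicit, agreed-upon range rather than a judgment call.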

This is not ideal – most benchmarks are kinda bogus – but with the incredible diversity of hardware out there, it feels like a more reasonable method than what usually happens in practice: going entirely off intuition (“my old laptop is like a low-end PC, right?”) or picking a popular model off a market-share chart without any real idea how it performs.

Thoughts? Suggestions? Has anybody seen this done well?


I’ve not seen this done particularly well, but I’m eagerly awaiting the results of your efforts.

I asked some engineers, and the main takeaway was not to trust PassMark – which is really annoying, since PassMark has great platform coverage, but a couple of the engineers I spoke to said it produces misleading results.