Recently the Google Pixel 2 came out with the highest-ever DxOMark rating for a smartphone camera, again. Google tweeted it, Google's CEO tweeted it, and a whole bunch of websites put it in their headlines. They showed it off like a badge of honor every time it happened. But why? We've talked about smartphone cameras in the past, but certainly in the past two or three years, the DxOMark rating has become the holy grail of measuring smartphone camera quality.
So DxO Labs measures and rates all sorts of cameras: some DSLRs, some Sony glass, Nikon glass, etc., and they have an entire section dedicated to smartphone cameras. On their website, if you click the "Mobile" section at the top, you'll see a hierarchy of smartphone cameras, each with a big number rating. That's somewhat accurate, but also a bit misleading: using one single number to encompass all the complexity that goes into a smartphone camera is, in my opinion, a bit too broad. So how do they do it? DxO breaks the score down into photo and video, and then breaks each of those into numerous subcategories. If you scroll down in any DxO camera review, underneath the rating you'll see that breakdown of the total score into photo and video.
The Pixel 2, for example, got a 99 for photo and a 96 for video. But you'll notice these numbers are clearly built up from subscores. Basically, they have a rating system, essentially their own algorithm, for combining all the subscores into the overall score that gets published all over the internet and all over your Twitter feed. Most of these articles, in fact probably all of them, ignore the subscores and how well the phone did in each category. You'd have to actually go to DxOMark's site, scroll all the way down, and read the review for that. If you did, you'd find that DxO does some legit, pretty solid testing, and they try to be scientific and objective wherever they possibly can. They have indoor sets for testing things like noise performance, sharpness, color reproduction, and detail in the exact same situation for every phone, so it's repeatable every time. For testing bokeh with portrait mode, they have a foreground subject and a background a set distance away; they measure autofocus speed and shutter lag with moving subjects; and they also have several slightly "less exact" but still pretty telling outdoor tests for things like HDR and color contrast.
Furthermore, according to the website, they have "more than 50 challenging and realistic indoor and outdoor scenes," and they turn those photo samples into raw ratings for each category using a panel of experts (allegedly, according to them). Essentially, they take the new image from the smartphone camera and match it with the closest one on a reference scale, which is called an image quality ruler. They have an existing scale with many different levels of noise, for example, and they match the new sample with where it falls on that scale. So basically they turn a qualitative judgment into a quantitative value. But the websites reporting on these new phones when they come out don't really explain all that; it's a little too much for a headline.
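To make the ruler idea concrete, here's a minimal sketch of matching a measurement to the nearest level on a pre-graded reference scale. The reference levels, noise values, and scores below are all invented for illustration; DxO doesn't publish its actual ruler levels.

```python
# Hypothetical "image quality ruler": match a new sample's measured
# attribute (say, a noise metric where lower is cleaner) to the closest
# reference level on a pre-graded scale, turning a qualitative judgment
# into a number. All values here are invented for illustration.

# Reference scale: (quality score, measured noise level) pairs,
# from "very noisy" to "very clean".
RULER = [(20, 9.0), (40, 6.5), (60, 4.0), (80, 2.0), (95, 0.8)]

def rate_on_ruler(measured_noise: float) -> int:
    """Return the score of the ruler level closest to the measurement."""
    score, _ = min(RULER, key=lambda level: abs(level[1] - measured_noise))
    return score

print(rate_on_ruler(2.3))  # closest to the 2.0 reference level -> 80
```

The point is just that the expert panel isn't scoring in a vacuum: every new sample is anchored against the same fixed set of references, which is what makes the process repeatable.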
So we can say DxO is pretty astute, but they also make it a little too easy to write the headline, because they summarize everything into one big score which, like I mentioned, is not a simple average of the subscores. We can assume the subscores are weighted roughly in the order they're listed, with exposure and contrast being the most important, and autofocus and color being almost as important. That weighting decision is subjective: the order they rank these categories in, and the amount of weight they put on each one, is entirely up to them. That's why you shouldn't base your whole purchase decision on the overall score; you should look past it, because compiling everything into one number can be very misleading. Different characteristics matter to different people: some people take a lot of low-light photos, or a lot of selfies, or a lot of landscape shots. Here's a perfect example. The Galaxy Note 8 got an overall score of 94; the Pixel 2 got a 98. If you take a lot of portrait photos, which one are you going to pick?
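Here's a minimal sketch of how a weighted combination like this behaves, and why strong subscores in lightly weighted categories barely move the overall number. The category names and weights are invented for illustration; DxO does not publish its actual weighting algorithm.

```python
# Hypothetical weighted scoring: every name and weight below is invented;
# this only illustrates the mechanics, not DxO's real algorithm.
WEIGHTS = {
    "exposure_contrast": 0.25,  # assumed heaviest
    "autofocus": 0.20,
    "color": 0.20,
    "noise": 0.15,
    "texture": 0.10,
    "zoom": 0.05,               # newer categories assumed lightest
    "bokeh": 0.05,
}

def overall(subscores: dict) -> float:
    """Weighted combination of subscores (not a plain average)."""
    return sum(WEIGHTS[k] * v for k, v in subscores.items())

# A phone with much stronger zoom/bokeh barely moves the overall score,
# because those categories carry so little weight:
base = dict(exposure_contrast=90, autofocus=88, color=89,
            noise=85, texture=84, zoom=40, bokeh=45)
better_portrait = dict(base, zoom=66, bokeh=60)
print(overall(base), overall(better_portrait))
```

Under this sketch, jumping zoom from 40 to 66 and bokeh from 45 to 60 adds only about two points overall. Note also that nothing here inherently caps the result at 100: if the subscores themselves keep rising, the weighted total rises with them.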
Well, it's obvious: you're going to pick the Galaxy Note 8. You're going to throw the "overall score" out the window and pick the one with a dedicated camera for portrait photos. If you look at the photo subscores, the Note 8 has a much higher score for Zoom (66) and Bokeh (45). But those are pretty new categories, so they aren't weighted heavily and don't have nearly as much of an effect on the overall score. As a result, the phone you're going to pick, because it's better for what you like, has a lower overall score on the DxOMark scale. You follow my drift?
Most smartphone cameras nowadays are great when you talk about contrast, color, and dynamic range, to varying degrees, but the top ones are all pretty good. The bigger difference comes with the different modes: things like portrait mode, long exposure, panorama stitching, and so on. Those make a bigger difference to the user experience, especially when you see a phone like the Pixel 2 doing portrait mode mostly with software versus a second camera. Now here's another reason you might want to look past the overall scores.
DxO Labs is a consulting company as well as a testing company. For a fee, they will work with smartphone manufacturers before a phone comes out to help create a better camera. They provide their testing software, their hardware, and their testing methodology so that you can calibrate your sensor and your image processing pipeline to produce better photos, but also to just do better on their tests.
It's pretty easy to look at that and think they're just working with you to get you a higher score. Now, I'm not saying smartphone manufacturers are specifically tuning their cameras to do better on DxOMark instead of doing what's actually better for the consumer, but there's definitely a blurry line there. It's kind of similar to what we see with benchmarks: most of the things that score better on DxOMark are also just better for the camera, but you're still tuning against their exact methodology to see what scores better. And basically, if you don't work with them on tuning your camera, you'll probably get a lower score; and if you don't have that DxOMark money, you're kind of left out in the cold, and they might not even test your phone (LG V30). Ahem.
So take all of that with a grain of salt. Obviously, I don't think DxOMark is changing their test results based on who paid them.
Their reputation is built on objectivity and scientific accuracy, so they wouldn't want to risk that. But the simple fact that they partner with certain manufacturers and work on certain devices, and not others, is kind of an awkward relationship. Imagine if Geekbench, the benchmarking suite, partnered with certain manufacturers on improving overall device performance, and then also made a big deal about which phones scored better in Geekbench. You see how weird that would be? Anyway, moving on to another quick point: with the Pixel 2 getting a 98, you're probably wondering what the best score they've ever given to any camera is. That would be a 108, and it was the sensor score for the RED Helium 8K with the Super 35 sensor.
Now if you're thinking, "Wait a minute, if they gave a sensor a 108 in a different test, how do you get a score over a hundred?", here's the thing. According to Engadget, the well-known tech blog, a DxOMark score is given out of a total of 100. No, that's not what's happening. DxO scores are not out of 100. The Pixel 2 did not get a 98 out of 100; it's not a 98%. And they're not the only ones: I see many tech news outlets reporting DxO ratings as if they're close to a perfect score, because they think they're out of 100. They're not.
The fact that the top scores are close to 100 right now is pure coincidence. Cameras are going to keep getting better on this linear scale, and they're going to reach and pass 100. There's going to be a phone that hits exactly 100, and they'll make a huge deal about that; you'll see it in every headline. Then there will be the first smartphone to pass 100, and that will make a lot of headlines too. But there is no maximum or perfect score. The scores being near 100 right now is, like I said, happenstance, a result of their weighting system pushing those numbers up near 100, and that's part of why they got reported so heavily this past year: they look like near-perfect scores. But they'll keep going above 100; that's just the way it works.
So the whole thing about every new smartphone camera being the new highest-rated ever on DxOMark is kind of annoying right now. That's supposed to happen; the problem is that it just keeps getting reported. It's like when Apple says in a keynote, "This is the fastest iPhone we have ever made." Well, I'd hope it's the fastest iPhone you've ever made; you shouldn't be making phones slower than last year's.
But you don't see anyone like Geekbench rating these chips on a scale and handing out a new crown for the highest rating ever, every single time; it just doesn't work that way. My problem isn't really with DxOMark, it's more with the press. They have a habit of parroting the new highest overall score without looking any further into it or putting much effort into the reporting. It's nice when they do, but for the most part they're just chasing clicks (and yes, it's hypocritical that we've done the same; consider this an epiphany). So if you're on the other side of this and you're actually reading these reviews and ratings, you'll just have to look a little deeper: actually read the reviews if you're thinking about making a purchase.
So keep doing your thing, DxOMark. If you want to bundle everything into one overall score that's representative of the phone, that's cool; just try to surface a little more information a little more easily. And if you're in the tech press like me, and you write a new article every time there's a new DxOMark score but never when there's a new highest Geekbench score, think about what you're doing.
To summarize: DxOMark does real testing, yes, but one overall number to describe everything that goes into a camera is a little too broad. DxO also does paid consulting to improve cameras based on their own testing and algorithms, so maybe take those numbers with a grain of salt. And the scores just happen to be around 100 right now, but that's really a coincidence; they're not out of 100. So ratings are a nice, clean number to point to when saying one camera is better than another, but if you actually want to make a purchase decision, you have to look a little deeper than the DxOMark rating. A lot of these cameras are really incredible anyway.