Why design it this way?
I've been mulling this over, wondering what the engineering team could have been thinking when they designed the electronic oil gauge/display and removed the dipstick. Since no explanation seems to be forthcoming from Volvo, I guess we're free to speculate.
In the old setup, the dipstick showed a range which was "OK" (the crosshatched area between MIN and MAX). A range was used because the actual level varied due to a number of factors (temperature, incline of the vehicle, amount of oil still up in the engine or filter rather than drained into the pan), but wherever the level fell, it was still OK if it was within the crosshatched range. The instructions were to check it cold, on a level surface, or at least to let the oil drain back into the pan for a while after a recent run to get a more accurate reading. For people who actually checked the dipstick, the indicated level could vary from time to time based on these factors. A conscientious owner might improve the consistency of the results by always checking in the driveway before the car was started in the morning, or always checking at the gas station close to home after filling the tank and letting the oil settle into the pan for a few minutes. But it seems obvious that the correct level is not an absolute level (such as 40 mm) unless the readings are taken under controlled conditions which are probably beyond the capability, or at least the attention span, of most owners.
It is also worth pointing out that the dipstick, for all its two feet of length, provides only a tiny indicator range at the tip to measure oil level. It's not like you're getting a full-range measurement as if you had poured the oil into a graduated cylinder; you only see a small window around the expected full mark. And since the oil pan is a shallow, wide container, you will not get the consistency of readings that you would get with a tall, narrow container. You measure your flour in a measuring cup, not in a cake pan. Since the dipstick is inverted with the tip facing down into the oil pan, the level passes the end of the stick a bit below the MIN mark as it drops, so past that point you have no idea how much oil is in the pan, just that it's too low to register. And I'm fairly confident that even when the level has dropped past the MIN mark (or fails to register at all because it is below the tip of the stick), there must be a reasonable safety margin between a low level indicated on the dipstick and the level where the low-pressure warning light would come on and engine damage would become imminent.
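To put rough numbers on the measuring-cup point (the dimensions below are invented, just to show the scale of the effect): the wider the container's footprint, the less the level moves for a given amount of oil.

```python
# Why a shallow, wide pan gives poor level resolution: the same litre of oil
# raises the level far less than it would in a tall, narrow container.
# All dimensions here are invented for illustration, not actual Volvo specs.

def mm_per_litre(footprint_cm2: float) -> float:
    """Level rise in mm when one litre (1000 cm^3) is added."""
    return 1000.0 / footprint_cm2 * 10.0  # cm of rise, converted to mm

pan_footprint = 30 * 40    # hypothetical shallow oil pan, cm^2
cylinder_footprint = 50    # hypothetical tall graduated cylinder, cm^2

print(f"oil pan:  {mm_per_litre(pan_footprint):.1f} mm per litre")       # ~8.3 mm
print(f"cylinder: {mm_per_litre(cylinder_footprint):.1f} mm per litre")  # 200.0 mm
```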
So, the dipstick has its own set of limitations.
In replacing the dipstick with an electronic sensor, which approach would be chosen?
They could provide a binary sensor and indicator, but since the correct level is actually a range and not an absolute value, two sensors would be required to define the MIN and MAX of the OK range. The corresponding display would then just read OK or NOT OK.
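Something like this, in sketch form (the switch arrangement and wording are my invention, not anything Volvo has described):

```python
# Hypothetical two-switch version: a float switch at the MIN height and one
# at the MAX height, each reading True when the oil reaches its mounting point.

def oil_status(at_min: bool, at_max: bool) -> str:
    if at_max:
        return "NOT OK (overfilled)"  # level has reached the MAX switch
    if at_min:
        return "OK"                   # somewhere in the MIN-MAX band
    return "NOT OK (low)"             # below the MIN switch

print(oil_status(at_min=True, at_max=False))   # OK
print(oil_status(at_min=False, at_max=False))  # NOT OK (low)
```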
They could instead provide an absolute oil level sensor (calibrated in mm) which would correspond to the dipstick. This sensor would still be subject to the same variation as the dipstick (assuming it's in the same location), yet its numeric nature would suggest more accuracy than actually exists. Under perfect conditions a reading of 40 mm might be considered correct, but due to the known variations there would still be a range which was OK, perhaps 35-45 mm. From the VIDA readout it appears that this sensor does actually exist, and some feel that, since it does, this information should be displayed to the owner on the dashboard. The drawback is that an owner might take normal variations in the absolute reading for a problem, when in fact they are just normal variations. The 1 mm resolution of the sensor is finer than anything detectable by eye on a dipstick, which is not calibrated beyond the crosshatching, so the readout would show a lot more variation than a visual inspection ever could. A technician might use this information, but not many owners would know how to interpret it.
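To illustrate the interpretation problem, here is a sketch using the 35-45 mm band I speculated about above (not a published spec): readings taken on different days all differ, yet every one of them is perfectly normal.

```python
# Hypothetical absolute-level readout with an OK band. The 35-45 mm band is
# the range speculated above, not a real Volvo figure.

OK_MIN_MM, OK_MAX_MM = 35, 45

def interpret(reading_mm: int) -> str:
    if OK_MIN_MM <= reading_mm <= OK_MAX_MM:
        return f"{reading_mm} mm: normal variation, nothing to report"
    return f"{reading_mm} mm: outside the {OK_MIN_MM}-{OK_MAX_MM} mm band, investigate"

# Day-to-day readings that might worry an owner watching raw numbers,
# even though all of them are fine:
for reading in (38, 44, 41):
    print(interpret(reading))
```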
So there is a second approach to the display of this information, which seems to have been chosen by Volvo, which instead presents an analogue to the OK range of the dipstick. The normal variations do not appear on the display, just a graphic showing that the level is in the OK range and a message that it is in fact OK. This prevents unnecessary problem reports due to normal variation, but also provides the very real benefit of automatically checking the oil level frequently and displaying a warning message if the level drops below the OK range. Presumably this warning comes long before the level reaches the critical point where engine damage is imminent, functioning in the same way as the dipstick would, assuming it were checked regularly. If the car is taken to the dealership, the actual oil level history can still be extracted using VIDA and used for diagnosis. What the owner loses is the ability to detect a trend by carefully controlling his observations and reducing the variability due to the factors noted above. We're talking about a small percentage of owners here, but it's a real shortcoming.
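My guess at the shape of that logic, as a sketch (the band and messages are invented; only the behaviour is taken from what's described above): raw readings accumulate for the technician, while the dash sees only OK or a warning.

```python
# Sketch of "coarse display, detailed history": my guess at the design, not
# Volvo's actual software. Raw mm readings accumulate for VIDA-style diagnosis;
# the owner-facing display collapses them to OK or a low-level warning.

OK_MIN_MM = 35   # speculated lower edge of the OK band, as above

class OilLevelMonitor:
    def __init__(self):
        self.history = []            # raw readings a technician could pull

    def sample(self, reading_mm: int) -> str:
        self.history.append(reading_mm)
        if reading_mm < OK_MIN_MM:
            return "OIL LEVEL LOW - ADD OIL"   # well before the pressure light
        return "OIL LEVEL OK"                  # normal variation stays hidden

monitor = OilLevelMonitor()
for reading in (42, 40, 37, 33):
    print(monitor.sample(reading))
print("history for the technician:", monitor.history)
```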
For the vast majority of owners, the new electronic implementation is beneficial, because it provides automatic oil level checks and warnings. For them it is better than the old dipstick. For the minority who actually pay attention to oil level, it provides the same benefit, but could be improved by leaving the dipstick in place as a backup and/or by making it clear what the electronic display is actually showing. The segmented design implies more accuracy than there really is. In fact the resolution seems to be the same as the crosshatched area on the dipstick: the level is either OK or not, and if not, the owner should add oil until it is OK again.
The diesel version seems to work differently, with the segmented level meter actually reflecting the absolute reading from the sensor, at least in segments corresponding to what could be discerned by visual inspection of a dipstick. So, it's possible that this may change in the future, and perhaps the current display is simply a victim of circumstance. Maybe the required sensor was unavailable or not up to spec, or the engineering team simply ran out of time to implement the graduated display. Maybe it does exist but the software wasn't ready in time and it will magically appear with an update. It would not surprise me if someone decided to just have it read MAX all the time until it was down below MIN and thought that the omission would go unnoticed. I bet that it will be fixed to match the diesel implementation at some point.
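If I had to guess at the difference in software terms (the band, the four-segment meter, and both behaviours are assumptions for illustration, not anything confirmed):

```python
# Contrast of the two display styles as I understand them. The 35-45 mm band
# and the four-segment meter are assumed for illustration only.

OK_MIN_MM, OK_MAX_MM, SEGMENTS = 35, 45, 4

def diesel_style(reading_mm: int) -> int:
    """Graduated: lit segments track the reading, roughly dipstick-eyeball coarse."""
    if reading_mm <= OK_MIN_MM:
        return 0
    if reading_mm >= OK_MAX_MM:
        return SEGMENTS
    segment_span = (OK_MAX_MM - OK_MIN_MM) / SEGMENTS
    return int((reading_mm - OK_MIN_MM) / segment_span) + 1

def petrol_style(reading_mm: int) -> int:
    """As described above: pinned at MAX until the level drops out of the band."""
    return SEGMENTS if reading_mm >= OK_MIN_MM else 0

for reading in (46, 43, 40, 37, 34):
    print(f"{reading} mm -> diesel {diesel_style(reading)}/4, petrol {petrol_style(reading)}/4")
```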
This is all just my opinion and conjecture of course, and I would personally prefer an old-fashioned dipstick as a backup.
Anyone care to poke holes in this?