Originally Posted by
Z28ricer
Ever actually set up a gauge that you believe to be correct, say to a tank and the Autometer, and pressurize it to see if it really is as inaccurate as you're making it seem?
Now we're getting into engineering, and I refuse to argue that with a bunch of hardass posters and shadetree mechanics. The following is a statement. Don't expect a reply to "yeah, but..." comments.
Here's what I have done. I have a Greddy mechanical gauge that works exactly like the Autometer mechanical gauge. A brass arm that changes with pressure, some brass gears, etc. That's all fine and dandy until the meter gets bumped around. The brass arm has mass, and when you drop it, it will deform. I did this. Picked it up off the ground and it read -300 mmHg. I also opened it back up, bent it back, and double-checked against some other references. This type of gauge may have nice relative precision, but it is not necessarily accurate.
By the way, precision is "closeness within a group", like shooting a guy in the chest 10 times inside a 2-square-inch group. Accuracy is closeness to the right answer: those 10 shots were precise, but inaccurate, because you were aiming for his face.
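If you want that in numbers instead of bullet wounds, here's a quick Python sketch. The gauge readings are made up for illustration, not from any real gauge:

```python
# Precision vs. accuracy with made-up gauge readings (psi).
# True pressure is 10.0 psi in both cases.
true_psi = 10.0

# Precise but inaccurate: tight cluster, wrong center (think bent-arm gauge).
bent_gauge = [12.1, 12.0, 12.2, 11.9, 12.1]

# Accurate but less precise: scattered around the right answer.
loose_gauge = [9.2, 10.9, 10.1, 9.6, 10.4]

def mean(xs):
    return sum(xs) / len(xs)

def spread(xs):
    # Max minus min: a crude stand-in for precision.
    return max(xs) - min(xs)

for name, xs in [("bent", bent_gauge), ("loose", loose_gauge)]:
    bias = mean(xs) - true_psi          # how far off the center is (accuracy)
    print(name, "bias:", round(bias, 2), "spread:", round(spread(xs), 2))
```

The bent gauge has the tighter spread (better precision) but a 2 psi bias (worse accuracy), which is exactly the failure mode I'm describing.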
If you increase pressure 5psi, the Autometer will probably show a 5psi increase. But if it's READING 5psi, who knows what the real pressure is.
If you think that "0" box is there to account for atmospheric pressure, you need to go back to school. That box is 2psi across. Assume the middle of the box is standard pressure at sea level. -1psi below that is roughly 944mbar, hurricane-eye territory (Andrew bottomed out around 922mbar). Yeah, I don't think Autometer had that in their design criteria. That wide box is to account for jostling during shipping. The same jostling that will affect its accuracy.
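Run the numbers yourself; subtracting 1 psi from a standard atmosphere lands you around 944mbar, which is still deep in hurricane territory:

```python
# Sanity check: what does 1 psi below standard sea-level pressure look
# like in millibars? Constants are standard definitions, not gauge data.
PSI_TO_MBAR = 68.948      # 1 psi in millibars
STD_ATM_MBAR = 1013.25    # standard atmosphere in millibars

reading_mbar = STD_ATM_MBAR - 1 * PSI_TO_MBAR
print(round(reading_mbar, 1))  # -> 944.3
```

No Autometer street gauge is designed around barometric swings that big, so the box can't be about weather.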
Don't think that electronic gauges/sensors are any better. The venerable GM 3-bar sensor has a ton of wiggle room listed in its own datasheet. Is 1.6V 0psi? Who knows until you calibrate it. My last Defi read a little high compared to my MS-iBC. This is why real instruments are calibrated against knowns.
I did do a little testing. I have a 1" gauge on my Motive power bleeder. Once I got the Greddy to zero out correctly, I jammed it up to 15psi. They both read more or less right on. I T'd off to the GM MAP sensor and checked the voltage against the datasheet trend. It fell within the tolerances. Once the Greddy and Motive agreed with each other, I took 3 voltage measurements at different pressures on the GM 3-bar. That's how I determined the trendline for my particular sensor, so I could accurately convert the MAP voltage to pressure. That doesn't necessarily mean my data is accurate, but I do have a consensus between 3 sensors, which says something about standard deviations that I slept through in statistics.
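For anyone curious what "determining the trendline" from a few points looks like, here's a Python sketch of a least-squares line fit. The (volts, psi) pairs below are hypothetical, not real GM 3-bar datasheet values; the 1.6V-at-0psi point is just the example from above:

```python
# Fit psi = slope * volts + intercept from a handful of calibration points.
def fit_line(points):
    """Ordinary least-squares fit over (volts, psi) pairs."""
    n = len(points)
    sx = sum(v for v, _ in points)
    sy = sum(p for _, p in points)
    sxx = sum(v * v for v, _ in points)
    sxy = sum(v * p for v, p in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

# Three hypothetical calibration measurements: (sensor volts, reference psi).
cal_points = [(1.6, 0.0), (2.8, 10.0), (4.0, 20.0)]
slope, intercept = fit_line(cal_points)

def volts_to_psi(v):
    """Convert a MAP voltage to pressure using the fitted trendline."""
    return slope * v + intercept

print(round(volts_to_psi(3.4), 2))  # -> 15.0
```

With only three points a straight ruler would do, but the least-squares version keeps working if you take more measurements or if they don't fall exactly on a line.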