HCHTech
Ok, I'm old. I could forgive the 1960s cars I had when I first started driving for having wildly inaccurate fuel gauges. They used mechanical floats that could read differently depending on the slope of the road or driveway, were subject to corrosion and breakage, and the floats themselves could leak or become saturated with gas; the list of potential problems was fairly long.
I had a car in the 80s, a Pontiac Bonneville. The gauge would drop at what seemed like a normal rate until it got to empty, but if you filled it up at that point, it only took 3/4 of a tank. You could drive forever on empty in that car.
It seems that each car had its own idiosyncratic gauge. I had a car in the 90s, I think it was a Mazda 626 - anyway, the gauge would go down faster than seemed right until it got to half-full. There it would stay for 100 miles or so, then it would drop very quickly to empty over the next maybe 50 miles. For that car, you had to remember how many miles it was stuck at half to know when you'd better be looking for a gas station. If it read empty, you had better be coasting up to the pump.
Thinking back on the dozen or so cars I've owned in my lifetime, I don't think accuracy has improved at all. My current car, for example, has a digital gauge with little pips that make a kind of progress meter. There are 12 pips that light up when the tank is full. While the range of the car varies depending on gas mileage, it's fairly typical for me to get 400 miles per tank in the summertime. It can take anywhere between 80 and 115 miles for that first pip to go dark, and the miles per pip as the tank goes toward empty are clearly non-linear. I can be at 300 miles on the current tank of gas and the gauge will still read half-full. The last couple of pips can go dark at maybe 20 miles per pip. In my mind, it shouldn't be unreasonable to expect the average "miles per pip" to be about equal to average miles per tank divided by the number of pips - you know, linear.
Similarly, for needle-type gauges, it shouldn't be unreasonable to expect the gauge to read 1/2 full when you have driven half of the miles that tank of gas will allow.
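For what it's worth, here's a quick back-of-the-envelope sketch of that linear expectation, using the rough numbers above (the ~400-mile summer tank and 12 pips are just my figures, not measurements of anything):

```python
# Back-of-the-envelope: what a *linear* pip gauge would look like,
# using the rough numbers from this post (assumptions, not measurements).
miles_per_tank = 400   # typical summer range mentioned above
pips = 12              # segments on the digital gauge

linear_miles_per_pip = miles_per_tank / pips
print(f"Linear expectation: ~{linear_miles_per_pip:.0f} miles per pip")
# -> roughly 33 miles per pip, every pip

# What actually seems to happen (rough observations from the post):
# the first pip lasts 80-115 miles, the gauge still reads half-full
# at 300 miles, and the last couple of pips go dark at ~20 miles each.
```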
It seems the only accurate point is when you finish filling up the tank and the gauge reads "full". Other than that, it's a crap-shoot.
Before you bring driving style into this argument, I should say that I am a fairly consistent driver. I drive a lot of miles, and gave up road rage and worrying about being 5 minutes late a long time ago. My car displays the instant gas mileage as well as the running gas mileage for the current tank - those figures are usually very steady over the course of a tank of gas.
My only conclusion is that it must be an incredibly difficult task to make an accurate fuel gauge (probably only slightly more complicated than making an accurate progress bar for program installations) if in 40 years of automotive development the damned gauge still can't accurately tell you how much $%#& gas you have left in the tank. Just sayin'