I'm not sure I understand changing polarity. From what I can tell the OP gauge is just a voltmeter. +12V is fed to the gauge all the time, then when OP is high enough the sender closes and completes the circuit to ground. Since there is (supposed to be) a resistor inline it reduces the voltage to whatever and the gauge reads midscale instead of being pegged. That's what I gathered anyhow. If you inverted the signal, wouldn't the gauge needle move backwards?
You have shaken some of the cobwebs and I am remembering a bit more.
The gauge can be a voltmeter or a current meter. I had originally been thinking current, with the resistor setting the appropriate current for mid-scale.
The problem with that theory is that 20 ohms would be an awfully low resistance and current would be high, unless a voltage regulator was also used to drop the voltage very low to start with. 20K ohms would be more appropriate if it were a current meter.
If it is a voltmeter, then the series resistance doesn't really matter because it is sensing voltage. So if one side of the gauge is tied to +12V and the other end is connected to ground, 20 ohms or not, the gauge sees 12V. The resistance could be 500 ohms and the gauge would still see 12V because there is virtually no current flow.
I looked up the transducer you are using and keep seeing it referred to as an oil pressure switch or gauge switch without much info, but I found one listing that said 8 psi = 24-36 ohms, 90 psi = 8.5-17.5 ohms. That seems very sloppy to me, but it does imply a resistance change.
What I think is happening to you is that the transducer resistance is adding to the 20 ohms, but as I was saying, if the gauge is configured as a voltmeter it really doesn't matter what resistance you have in series. Depending on what kind of bridge the meter uses, the total resistance won't affect the reading until at least several hundred K ohms, very likely into the megohms.
So instead of 20 ohms connecting the meter negative (positive tied to +12V), you have maybe 36 ohms at 8 PSI plus the 20 ohms for 56 ohms total; the meter still sees 12V and thus reads half scale.
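To put some numbers on that, here is a quick back-of-the-envelope voltage divider calc. The 1 megohm gauge input impedance is my assumption for illustration, not a measured value for this gauge:

```python
# Voltage divider: series resistance (sender + inline resistor) vs. gauge input
# impedance. The 1 Mohm gauge impedance is hypothetical -- real dash gauges vary.
def gauge_voltage(supply_v, series_ohms, gauge_ohms=1_000_000):
    """Voltage actually seen across the gauge terminals."""
    return supply_v * gauge_ohms / (gauge_ohms + series_ohms)

print(gauge_voltage(12.0, 20))       # effectively 12 V
print(gauge_voltage(12.0, 56))       # 36 ohm sender + 20 ohm resistor: still ~12 V
print(gauge_voltage(12.0, 500_000))  # only at hundreds of K ohms does it sag noticeably
```

With 20 or 56 ohms in series the gauge still sees essentially the full 12V, which is why the extra sender resistance doesn't move the needle off half scale.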
I wonder what the resistance is at 0 PSI? Does it go fully open, or just to a high resistance? Even if it went to multiple K ohms I expect it would still read half scale. But maybe, just speculating, it has both transducer and switch elements in it, so at 0 PSI the switch part is open? Again I am curious: if you just turn on the ignition but don't start the engine, what does the gauge show?
As to the purpose of the negative voltage, let me explain what is happening. The gauge is always connected on the positive side. When a pressure switch is used and the switch closes, the other side connects to return. This yields a half-scale reading at roughly 13.5V (because the vehicle is running). That means the "voltmeter" used for this gauge needs roughly 27V to read full scale.
Since the gauge positive is connected to the vehicle positive, to get more than 13.5V across the gauge you need a negative supply relative to ground/vehicle chassis. For a simple example, we can take an extra 12V battery and connect its (+) terminal to the chassis. The (-) terminal of that battery will now be -12V. If we were to measure between the +13.5V and the -12V with a voltmeter we would get 25.5V. See where I am going with this? The negative supply is needed so that the gauge negative terminal can be pulled lower than ground, giving a larger absolute voltage across the gauge so it can read above half scale. The polarity sent to the gauge was never reversed, because the positive of the gauge remains connected to the most positive available voltage.
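The extra-battery example works out like this (just the arithmetic from above, nothing assumed beyond the numbers already given):

```python
# Gauge positive is pinned at vehicle supply (~13.5 V running); to exceed half
# scale, the gauge negative terminal must be pulled below chassis ground.
vehicle_v = 13.5
neg_supply = -12.0           # extra battery with its (+) terminal tied to chassis

across_gauge = vehicle_v - neg_supply
print(across_gauge)          # 25.5 V -- approaching the ~27 V needed for full scale
```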
Making this work, from what I had (and my transducer had low-ohm resistances like yours), was going to require not just a negative supply, which could be done with a small DC-DC converter chip like the one a31ford mentioned, but also some additional circuitry. That circuitry would put a small amount of current through the pressure transducer (like 10mA or 20mA) to get a small voltage out of it, then scale that voltage up so that 5 psi would output something like +8V on the wire going to the gauge (bottom of the normal range) and 60 psi would generate -11V (top of the normal range).
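A rough sketch of that scaling, using the example endpoints above (5 psi to +8V, 60 psi to -11V). The linear mapping and the 10mA sense current are my illustrative assumptions, not a worked-out circuit design:

```python
# Raw signal: a small sense current through the sender turns its resistance
# into a small voltage (V = I * R). 10 mA through 36 ohms is only ~0.36 V.
sense_a = 0.010
v_raw_8psi = sense_a * 36    # about 0.36 V -- why a scaling stage is needed

# Hypothetical scaling stage: linear map from pressure to gauge drive voltage,
# using the endpoints from the text (5 psi -> +8 V, 60 psi -> -11 V).
def drive_voltage(psi, lo=(5, 8.0), hi=(60, -11.0)):
    slope = (hi[1] - lo[1]) / (hi[0] - lo[0])
    return lo[1] + slope * (psi - lo[0])

print(drive_voltage(5))      # 8.0  (bottom of normal range)
print(drive_voltage(60))     # about -11 (top of normal range)
print(drive_voltage(32.5))   # about -1.5, midway between the two
```

The actual circuit would do this with an op-amp stage rather than software, but the transfer function is the same idea.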
While the circuit to do that wouldn't be terribly complex, it was going to be a pain to build. Then there would be the issue of where to locate it. Close to the sensor there is less chance of picking up noise, but then there is all the environmental stuff requiring it to be well sealed and/or potted. Putting it up in the dash probably makes more sense, but then two wires still need to go out to the sensor (a chassis return may affect the reading) plus one wire going to the gauge. That gauge wire needs to be found in the instrument cluster, or a third wire run out to the sensor and connected to the existing wire (which may be the easiest).
I just didn't want to mess with all that.
I don't know if there is some other sensor that simplifies this, but it seems to me that no matter what, there is the problem of getting the gauge to read more than half scale without additional voltage from a negative supply, unless other mods are done in the instrument cluster.
I don't know what others who have done this actually did. When I was looking into it I found people in the process of trying, people repeating what they had heard, and even mentions of knowing someone who did it, but I never came across anyone who had done it and described what they did.
Maybe there would be demand for a small circuit board that would convert this type of transducer into a signal that would work with the dash pressure gauge?