Years ago I built a power supply with two adjustable outputs. It works for most of my needs, but it's still a work in progress. I built in an old digital multimeter with an *incandescent display* (no, not LCD, not LED, not even vacuum fluorescent). Actually, that's the main reason I built the supply; I had to do *something* with that unusual display.
Functionally:
The two channels are completely independent. I could use it as a +/- supply by connecting CH1's positive output and CH2's negative output to GND. Or I could have two separate positive voltage supplies by connecting both CH1's and CH2's negative terminals to GND.
The multimeter is connected internally to read out voltage or current from either channel, and it also has a third voltage-measurement input (the middle terminals); the measurement can be selected via the pushbuttons between the voltage-setting dials.
Electrically:
It has two 24V 4A switching power supplies mounted on top; I think they were from a laptop. I used a simple adjustable-voltage-regulator circuit for each output; each output can vary from roughly 2V to 22V, and the regulators are rated for 5A.
The current readout is a little bit flawed... I designed it to measure the voltage across a tiny-valued resistor in series with the output. I wanted the output voltage to be affected as little as possible over a wide range of loads. Unfortunately, that means the multimeter displays only a small fraction of the actual current (I think it's 1/100th), which is easy to deal with if you're using it all the time, but kinda confusing if you only turn it on once or twice a year.
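To put numbers on it (the 10 mΩ shunt value here is a guess picked to match that 1/100 factor, not something I've gone back and measured), the meter is really just displaying the shunt's voltage drop:

```python
# Rough sketch of the current readout, assuming a hypothetical 10 milliohm shunt
# (picked because it matches the ~1/100 factor -- I haven't re-measured the real one).
# The meter just displays the voltage across the shunt.

R_SHUNT = 0.010  # ohms (assumed)

for load_current in (0.1, 0.5, 1.0, 3.0):  # amps
    reading = load_current * R_SHUNT       # volts shown on the multimeter
    print(f"{load_current:.1f} A actually flowing -> meter shows {reading:.4f} V")
```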
So, one of the things I've been planning to do is add a simple op-amp circuit to amplify the voltage across the resistor to something more logical (1mA = 1mV would make sense!). This is more difficult than it sounds. For one thing, I'm not particularly good at op-amp circuits... It might well be that all the old op-amps I inherited from the '70s are burnt out, since I didn't have nearly as much trouble with those same circuits in school.
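The gain math itself is simple enough on paper. Here's a sketch assuming a plain non-inverting stage and that same guessed-at 10 mΩ shunt; none of these resistor values came off the actual supply:

```python
# Sketch of the scaling I'd want: 1 mA of load current -> 1 mV at the meter,
# i.e. 1 V per A overall, using a standard non-inverting op-amp stage.
# Shunt and resistor values are assumptions for illustration only.

R_SHUNT = 0.010           # ohms (assumed, as above)
TARGET_V_PER_A = 1.0      # 1 mV per mA is the same as 1 V per A

gain_needed = TARGET_V_PER_A / R_SHUNT    # -> 100

# Non-inverting amplifier: gain = 1 + Rf / Rg
Rg = 1_000                                # ohms
Rf = (gain_needed - 1) * Rg               # -> 99k (not a standard value; a trimmer would do)

print(f"gain needed: {gain_needed:.0f}")
print(f"Rg = {Rg} ohms, Rf = {Rf:.0f} ohms -> gain = {1 + Rf / Rg:.1f}")
```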
For another thing, I have to figure out which power source to run it off of! I might have to build two identical amplifier circuits, one for each channel, running off that channel's power supply. I'm not sure how well this will work if the voltages going into the op-amp are right near the rail. Another option would be to run it off the multimeter's power supply. Either way, there are limitations. None of the internal supplies are bipolar. Also, I have no idea what would happen to the multimeter if it shared a voltage source with the measurement. Its circuitry is way beyond me (what's this piece of glass with wires coming out of it?! And this welded-shut aluminum box?).
Tonight I had the brilliant idea that the circuit might work off a single supply... how those single-supply op-amps work is magic to me... but they have example circuits in the datasheets. Anyways, I came to the conclusion it might be worthwhile to explore...
Part of that exploration came down to checking whether there'd ever be a case where the output current would be negative. So I thought, sure, what if I had one supply set to 12V and the other to 5V, sharing grounds, and I wanted to make use of the 7V in between...
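Spelling that out with a made-up load resistor, the lower-voltage channel would have to *sink* current:

```python
# Quick check on current direction for the "use the 7V in between" idea.
# The load resistor value is made up, just for illustration.

V_CH1 = 12.0    # volts, channel 1 output
V_CH2 = 5.0     # volts, channel 2 output (channels share ground)
R_LOAD = 100.0  # ohms, connected between the two positive outputs

I = (V_CH1 - V_CH2) / R_LOAD  # current flowing from CH1's output into CH2's output

print(f"{I * 1000:.0f} mA flows OUT of the 12V output and INTO the 5V output")
# From channel 2's point of view, its load current is negative: it's being asked
# to sink current rather than source it.
```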
Yeah, enter Voltage Regulator 101:
I hooked up a resistor between 24V on one supply (long story, one of my regulators blew a long time ago and I don't have a replacement) and 20V on the other, and was surprised to measure... no current. Zero (down to 0.0001V across the resistor). I measured the 20V output and saw it sitting at 24V (higher than the regulator can output!). I tried a lower voltage setting, measuring the output *with* the load resistor still connected to 24V on the other side... I got it reading 20V and it looked like it was working. But when I removed the load, it dropped down to 8V!
Turns out, these adjustable voltage regulators *require* a minimum load. That's usually handled by the adjustment circuitry. But by driving current *into* the regulator, that minimum load wasn't being met. Sure enough, the datasheet says: "all quiescent operating current is returned to the output establishing a minimum load current requirement. If there is insufficient load on the output, the output will rise."
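To put rough numbers on that (assuming an LM317-style part, which may or may not be what's actually in there):

```python
# Why the adjustment divider normally covers the minimum load, assuming an
# LM317-style adjustable regulator (I'm not certain that's the exact part in my
# supply, so treat these numbers as illustrative).

V_REF = 1.25        # volts between OUT and ADJ on LM317-style parts
R1 = 240.0          # ohms, the usual OUT-to-ADJ programming resistor
I_MIN_TYP = 0.0035  # amps, ballpark minimum load current from the datasheet

I_divider = V_REF / R1  # current the divider normally pulls out of the output

print(f"divider draws {I_divider * 1000:.1f} mA, above the ~{I_MIN_TYP * 1000:.1f} mA minimum")
# That only counts as load if the regulator itself is sourcing that current.
# When an external source pushes current INTO the output node instead, the
# regulator ends up sourcing nothing -- it can't sink current -- and the output
# floats up toward the external voltage, which is exactly what I measured.
```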
Anyways, this entire self-indulgent post is just to point out this little tidbit, because people use adjustable regulators for power supplies all the time. They work great as long as your load current *is positive*!
Realistically, there probably aren't many cases where this is a problem... Most circuitry returns the current back to ground anyhow, as well it probably should. But part of the point of having an adjustable power supply is for experimentation... and this is one thing that could be quite confusing. And I have thought of a case where it might be handy to use the supplies as I just attempted... The minimum output is ~2V; if I needed 1V (and the regulators worked as I imagined before), I would have set one to 2V and the other to 3V and used the difference. But they don't work that way, so I need to put a note on my power supply... maybe add a warning indicator if the regulators are dropping out.
In the meantime, I guess this makes my amplifier circuitry easier... I don't have to worry about negative-current cases. (Then again, it *did* work when I set the voltage *much* lower... I wonder if that's hard on the regulator circuitry.)