The idea came out of my need for a reliable, well-performing bench power supply. Adjustable DC bench supplies abound on the internet, but I couldn’t find any that met my requirements for a reasonable cost. Plus, building one for yourself is a great learning opportunity in addition to knowing exactly what performance you’re going to get out of your power supply.
The design is for a low-noise adjustable bench DC power supply with the following specs:
-Output voltage 0-18V each channel
-Output current 0-3A (per channel) upgradable to 5A
-LCD output of each channel’s voltage and current
-USB programmable outputs
-Crowbar circuit for overcurrent
-No exact spec on noise, transient response, load/line regulation except to have the best performance that is reasonably possible.
-I set a budget of ~$200, which places it on par with, or slightly cheaper than, comparable units found online.
Architecture
The design is based around a 3-stage converter, with an AC/DC forward converter feeding a buck converter, which feeds an LDO. For the AC/DC converter, I chose a Lenovo laptop power supply. I have experience designing AC/DC supplies and am comfortable doing it, but I'm living in a big apartment complex where technically I'm not supposed to be doing any hardware hackery, so I decided not to take chances. I took a peek inside and it appears to be well built (as you'd expect from Lenovo); I wouldn't take chances on a cheap ebay power brick. The converter is specced for 20V, 8.5A. This should cover even my worst case of 2 channels at 18V, 3A each, plus overhead in the control circuitry. I'm sure the brick has some form of fault protection, which should serve as an extra layer on top of what I will add. Measuring the Lenovo brick, it outputs 20.75V under no load, with ~50mVpp of 60Hz noise:
This is a good starting point for simulations.
The DC portion of the supply was implemented as a two-stage converter, with a buck converter feeding an LDO. The buck converter drops the bulk of the voltage from 20V to the output voltage, while the high gain of the LDO cleans up the low frequency noise. Having an LDO on the output also allows me to place a second (or even more) LC filter on the buck converter’s output to kill high frequency noise, without hurting load regulation. I realized later that load regulation and DC output impedance aren’t a primary design factor here because there will be leads with tens or hundreds of milliohms connected externally, but it’s still good practice to minimize output impedance of the converter.
Here’s the architecture which I used as a starting point, using the TPS54335 buck converter and LT1764A output LDO:
In this design, a potentiometer acting as the lower resistor in the buck’s feedback divider would serve to vary the buck regulator’s output. An op-amp subtractor would servo the feedback loop of the output LDO so that its output is always two volts below the buck’s. Something akin to this:
LTSPICE schematics available here.
The 2V headroom on the LDO allows for improved PSRR and overall performance, while remaining within LDO thermal requirements. Although I was unable to do a paper design on the feedback loop due to LT selfishly hoarding their controller specs, I was able to do the next best thing by simulating the part in LTSPICE (hopefully LT uses decent models for their own simulation software). Here are the results of feedforward cap optimization both for transient response and gain/phase analysis, when stepping the feedforward cap between 10pF and 300pF:
There were a few things I didn’t like about this design. For one, both power converters would be operating at redline (3A) with no opportunity to increase power output without a complete redesign. Also, operating the LT1764 below 1.21V technically should work, but it’s not how the part is designed, so it’s somewhat of a risk. Low frequency ripple rejection was also not as good as I was hoping for. Worst of all, using the subtractor to servo the LDO from the buck converter’s output voltage has a glaring flaw: any low-frequency noise on the buck output shows up at the output of the subtractor, then at the feedback node of the LDO, and thus at the LDO output. It essentially bypasses the LDO’s PSRR and noise benefits!
Because of these issues, I redid the architecture. I went through a few designs based on various ICs (and briefly considering a linear regulator with external FET), but I settled on the LT8612 buck converter and 3x of the LT3080-1 LDOs:
This design has a few advantages. The LT8612 has 6A output capability, allowing me to extend power output in the future; I also like that the part is not redlining at maximum current. In addition, the LT8612 has a TR pin. This pin is intended for supplying DSPs or SoCs which need all of their rails to come up at the same time. Essentially, if the voltage on this pin is less than 0.8V, that voltage becomes the reference value to which the output is regulated (the noninverting input on the error amplifier); if the voltage on this pin is over 0.8V, the regulator uses its internal 0.8V reference. By using the TR pin, the buck regulator will track the LDO’s output, and by slaving the buck to the LDO the issues with PSRR are avoided. The LT3080-1 is a nifty part that actually has no ground pin. It is capable of being paralleled with other LT3080-1s for increased current output, at 1.1A per LDO. This is perfect for having 3A output extendable to 5A. There is a 25mohm ballast resistor on the output which hurts load regulation performance, but this is mitigated by paralleling multiple 3080-1s. Plus, as mentioned before, load regulation isn’t a huge concern. The 3080-1 is adjustable: a 10uA current sourced from the SET pin allows an adjustable output all the way down to 0V with the use of a potentiometer. An op amp adder takes the LDO’s output, adds 2V, and scales it properly to feed the buck’s TR pin so that the LDOs always have 2V across them.
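The TR-pin behavior can be summarized in a tiny model (per the description above; the clamp value is the part's 0.8V internal reference, and the function name is mine):

```python
# Model of the LT8612's TR ("tracking") pin: the error amplifier's reference
# is the TR-pin voltage, clamped at the 0.8 V internal reference.
INTERNAL_REF = 0.8  # volts

def effective_reference(v_tr):
    """Reference seen by the buck's error amplifier for a given TR voltage."""
    return min(v_tr, INTERNAL_REF)
```

Below 0.8V the buck's output simply follows whatever the TR pin is driven to (scaled by the feedback divider), which is what makes the slaving scheme work.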
A four-channel 16-bit ADC (ADS1115) measures output voltages and currents. A sense resistor with instrumentation amplifier (INA214) provides the current measurement to the ADC. The ADC communicates via I2C with a microcontroller (ATMEGA328P), which aggregates the information and displays it on an LCD display. The uC also has I2C communication with two 12-bit DACs (DAC121C085), allowing output voltage to be controlled by a PC through a serial connection. The ADC incorporates a digital comparator feeding an external FAULT pin. During an overcurrent or overvoltage event, the FAULT signal disables the buck regulators and asserts a crowbar circuit on the LDO output. Two auxiliary LDOs (ADP7142) feed 5V and 19V to the control circuitry.
Full schematic PDF available here.
Buck Converter and feedback
As mentioned before, the buck converter’s TR pin was used to slave the buck converter’s output voltage to that of the output LDO stage. In this way there is always a set voltage drop across the LDOs, so that LDO power consumption is solely a function of output current and not output voltage. This was implemented as an op-amp adder:
The op amp adds 2V to the LDO’s output voltage and properly scales it from (0-18V) to (0-0.9V). 0.9V is the maximum voltage of the TR pin.
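As a sketch, the adder's transfer function works out as follows (the function name is mine; the constants come straight from the description above):

```python
# Op-amp adder transfer function: add the 2 V LDO headroom to the LDO output,
# then scale the resulting 0-20 V range down to the TR pin's 0-0.9 V range.
HEADROOM = 2.0      # volts always dropped across the LDO stage
SCALE = 0.9 / 20.0  # maps 0-20 V onto 0-0.9 V

def tr_voltage(v_ldo):
    """TR-pin voltage commanded for a given LDO output voltage."""
    return (v_ldo + HEADROOM) * SCALE
```

At 18V out the TR pin sits at its 0.9V maximum; at 0V out it sits at 90mV, keeping the buck 2V above the LDO across the whole range.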
In retrospect, this was not the best way to implement this. I don’t think having the two voltages linked in this way is the best design, as noise and perturbations can feed back between the two. For example, a large positive load step will cause the LDO output voltage to dip. This dip flows through the feedback adder and appears at the buck converter’s TR pin, on the noninverting input of the buck’s error amplifier, causing the buck regulator to also dip in output voltage. The opposite happens for any upwards deviation on output voltage. The result is an oscillation between the LDO and buck output voltages, with frequency determined by the buck regulator’s LC output filters, and damping determined by the LC filters as well as the PSRR of the LDO. Fortunately the oscillation is at low frequencies (a few kHz) where the LDO has high gain and high PSRR, so it is very heavily damped. Still though, it is not the best design and could have caused larger problems. One thing I did add here was a capacitor between the LDO output and the inverting input of the op-amp adder. The idea is that high frequency content from a deviation on the LDO output voltage will perturb the buck regulator’s output in the opposite direction, helping to mitigate the oscillatory effect. I’ll be playing with this when I get the real part.
I think a better way to implement this would have been a single control voltage which drives both the SET pin on the LDOs and the TR pin on the buck regulator, properly scaled of course. This would have avoided the oscillatory problems. I realized this late in the schematic design phase, but the project already had a mad case of feature creep so I decided to leave it as-is.
The op-amp adder’s 2V input comes from the 5V rail divided by low value resistors to provide a (relatively) low output impedance. The exact value of this 2V reference isn’t critical, so I was okay doing this to save parts. The extra 1mA or so of power isn’t a concern. The op amp chosen was the LT6015. The part seems pretty bulletproof and can really take a beating. It meets my supply, noise and bandwidth requirements, and I like the SOT23 package. Having the voltage divider on the output conveniently allows me to place an output capacitor on the TR pin to help with noise issues, without risking oscillations on the op amp.
The feedback divider was chosen so that when V(TR)=0.9V, the buck regulator regulates to (18+2)=20V to meet the 18V maximum output spec with 2V lost on the LDO. For the values themselves, I used the datasheet as a starting point and used LTSPICE to empirically select values for best transient response, phase margin, etc. As before, I was disappointed by LT’s lack of information regarding the regulator’s error amplifier so I wasn’t able to do any design work on paper here. To mitigate risk, a footprint was added for a phase lead capacitor in the feedback network. One more interesting point here is that the TR pin is grounded internally by the LT8612 during shutdown. It is not an issue because the op amp output is fed through a voltage divider and is thus high impedance, so there is no risk of damage to the op amp.
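Taking the divider numbers above at face value, the ratio falls out directly (the resistor values in the example are illustrative, not the ones on the board):

```python
# Feedback divider sizing sketch: pick the ratio so the FB node sits at the
# TR pin's 0.9 V maximum when the buck output is at its 20 V maximum.
V_OUT_MAX = 20.0  # 18 V max output + 2 V LDO headroom
V_FB = 0.9        # FB voltage at maximum TR voltage

def top_resistor(r_bottom):
    """Upper divider resistor for a chosen lower resistor."""
    return r_bottom * (V_OUT_MAX / V_FB - 1.0)
```

For example, a 10k lower resistor calls for roughly 212k on top; the actual values also interact with loop dynamics, which is why I tuned them in LTSPICE.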
For the output filter, the same holds true as for the feedback resistors. I was disappointed by the lack of info from LT regarding phase margin, etc., so I used the datasheet as a starting point and used LTSPICE simulations to optimize from there. Ceramic output capacitors allow for very low ESL and enhanced performance. I added some capacitance here to account for DC bias derating. One interesting issue here: the capacitors derate differently over voltage, and since this is an arbitrarily adjustable power supply, the capacitance — and thus transient response and phase margin — will change with output voltage. This effect should be minor, and the design has enough margin to be robust in any case. Choosing capacitor voltage ratings as high as possible helps mitigate this effect.
A second LC filter on the buck regulator’s output helps to kill high frequency switching ripple, which is well out of the high gain region of the LDOs. Placing this second LC filter upstream of the LDOs allows for the filtering without negatively affecting load regulation of the LDOs. The inductors were chosen with a maximum current capability of over 10A, to account for 5A DC current plus ripple and transient current during steady state and startup. Here are the two inductors chosen:
For the second LC filter I chose a 120uF polymer capacitor. These are a bit pricey, but I like their very low ESR, high ripple capability, and that they have no DC bias derating. The same capacitor was used as bulk capacitance on each buck converter input, as well as LDO output. Each buck converter also uses a 22uF ceramic capacitor (same as on output filter of buck converter) as the primary input capacitor. Ceramic was used here for the small package size and low ESL for minimized noise and EMI effects due to the large high frequency currents being drawn through it.
The choice of buck regulator switching frequency was a trade-off between noise (high switching frequency enhances output filter effectiveness) and output voltage range (minimum FET on and off times limit duty cycle, and thus maximum and minimum output voltage, for a given switching frequency). My selection of 650kHz allows for a duty cycle range of 3.5%-92%, or an output voltage of roughly 0.8V to 20.2V, given 22V input, a worst-case 55ns minimum on time, and 120ns minimum off time. The main concern was voltage headroom on the top, which constrains my maximum output voltage. Setting 650kHz allows me to hit 18V output with 2V headroom on the LDO stage. 650kHz also gives me ~50dB of ripple attenuation per LC stage, or 100dB total. The LCs should pretty readily handle noise at high frequencies, while the buck and LDO stages handle ripple at low frequencies.
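Those numbers come straight from the minimum on/off times and the switching period; here's the arithmetic as a quick sanity check:

```python
# Duty-cycle and output-range limits imposed by minimum on/off times
# at the chosen 650 kHz switching frequency.
F_SW = 650e3        # switching frequency, Hz
T_ON_MIN = 55e-9    # worst-case minimum on-time, s
T_OFF_MIN = 120e-9  # worst-case minimum off-time, s
V_IN = 22.0         # input voltage, V

period = 1.0 / F_SW
d_min = T_ON_MIN / period          # ~3.6 %
d_max = 1.0 - T_OFF_MIN / period   # ~92 %
v_out_min = d_min * V_IN           # ~0.8 V
v_out_max = d_max * V_IN           # ~20.3 V
```

Pushing the frequency higher would improve filtering but squeeze d_max, eating into the headroom needed for 18V out plus 2V across the LDOs.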
The SYNC pin was tied high to force the converter into pulse-skip mode at light loads, rather than burst mode. It’s my belief that pulse-skip mode will have lower noise at the cost of increased power consumption, and I don’t care about light load efficiency. I would have preferred a converter which allowed for full PWM at light loads, but I couldn’t find such a converter. A 0 ohm resistor was used for tying the SYNC pin high so that I can play with it on real hardware to see how the performance compares.
Here is the LTSPICE design file. I could have done a better job with Monte Carlo tolerance and temperature analysis as is standard, but it’s not bad for a first pass.
I spent a long time searching for a suitable output LDO. The primary design considerations were:
-Output current >3A
-Output voltage over 18V
-Adjustable output with tracking ability preferred
-High bandwidth and ripple rejection
-Overvoltage, overcurrent, and thermal protection preferred
-Low dropout (determines maximum output voltage)
My initial choice was the LT1764. It’s pretty solid on these specs and has an excellent dropout of 340mV at 3A, and low noise. I ended up going a different route because PSRR was only okay, and the converter has a maximum current of 3A, so there’s no flexibility to extend the current output spec in the future. In addition, being adjustable via the FB pin would require the op-amp servo method described before, which I feel is a bit hacky. Plus, altering the impedance of the feedback node changes the converter’s dynamic characteristics, which is risky. Finally, the converter could only regulate down to 1.2V without some hackery in the servo feedback, and while it worked in SPICE, I had no idea if it would really work.
It turns out that finding an LDO that can handle up to 18V at 3A with high performance is pretty tricky. I briefly considered implementing a discrete LDO (or at least an external FET to give me more flexibility) but ended up going with the LT3080-1. This is a pretty nifty part in that it doesn’t have a ground pin — all ground current flows into the output. I’m still not sure exactly how it works, but I’m going to look into this once I have a chance. The part has two big advantages: first, an internal ballast resistor, allowing multiples of this part to be paralleled for increased output current; second, the noninverting input of the LDO’s error amplifier is exposed via the SET pin, allowing you to arbitrarily set the output voltage.
Not much design work to be done here, as again I had no information or control over the feedback loop (booo LT). Small (50k) and large (1meg) pots were used for fine and coarse control of the output voltage. One concern here is output noise, with any noise in the reference showing up in the output. It’s TBD how big of an effect having noisy pots and the inductance of the potentiometer wires has on output noise. I added a ferrite bead and capacitor to hopefully mitigate this.
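The SET-pin arithmetic is about as simple as regulators get. A quick sketch, using the 3080's nominal 10uA SET current (the resistance in the usage example is illustrative, not the board's actual pot string):

```python
# LT3080-1 output setting: the part sources a nominal 10 uA out of the SET
# pin, so the output voltage is simply that current times the SET resistance.
I_SET = 10e-6  # amps, nominal SET-pin current

def output_voltage(r_set_ohms):
    """Regulated output voltage for a given SET-pin resistance."""
    return I_SET * r_set_ohms
```

A pot dialed to 1.8M would command the full 18V, and 0 ohms takes the output all the way to 0V, which is what makes this part so convenient for a bench supply.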
One additional feature I added here is the ability to program the output voltage from a PC, for automated testing or similar. This was implemented via a 12-bit I2C DAC, the DAC121C085. This connects over I2C to the MCU, which can receive USB commands via an external FTDI USB-serial chip. An op amp scales the DAC’s output from 0-5V to 0-18V. This feature can be enabled by loading R64 and removing the ferrite bead, allowing the DAC’s op amp to drive the LDO’s SET pin. 12 bits provides ~4.4mV of resolution when programming an output.
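That resolution figure comes from spreading the DAC's 2^12 codes over the scaled 0-18V range:

```python
# Programming resolution of the 12-bit DAC after its 0-5 V output
# is scaled up to the 0-18 V output range.
DAC_BITS = 12
V_FULL_SCALE = 18.0

lsb = V_FULL_SCALE / (2 ** DAC_BITS)  # volts per code, ~4.4 mV
```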
The LT3080-1 has a dedicated supply pin for control circuitry, V_CONTROL. This can be connected to the input for “3-terminal mode” and a minimized BOM, at the expense of dropout voltage increasing to 1.35V (from ~350mV) and losing a few dB of PSRR. I didn’t expect to ever need less than 1.35V of dropout, as the design always has 2V across the LDO. However, voltage headroom was going to be tight to hit 18V maximum output, so I opted for an external rail for V_CONTROL to buy myself extra margin in output voltage. The constraint here is that maximum output voltage equals V_CONTROL minus 1.35V, so I chose to set V_CONTROL as high as possible. I briefly considered connecting V_CONTROL directly to the DC input from the laptop charger; however, the DC input is quite noisy and I was concerned about noise coupling into the output through V_CONTROL. The part does specify ~90dB of PSRR at the V_CONTROL pin which, with ~50mVpp on the DC input, would contribute a couple of uVpp of noise at the output, well below the noise floor of the part. However, to get as much performance as possible out of the part, I added an LDO between the DC input and the V_CONTROL rail, for another ~60dB of PSRR. That part will be discussed later, but the rail is a nominal 19.5V (to be increased if possible).
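As a sanity check on that noise budget, converting PSRR from dB back to a linear attenuation is a one-liner (the exact result depends on where you read the PSRR curves, so treat these as order-of-magnitude numbers):

```python
# Estimate residual ripple after PSRR attenuation: the V_CONTROL pin's
# ~90 dB, then the extra ~60 dB from the upstream ADP7142.
def attenuate(v_pp, psrr_db):
    """Peak-to-peak ripple remaining after a given dB of rejection."""
    return v_pp * 10 ** (-psrr_db / 20.0)

ripple_in = 50e-3                                 # ~50 mVpp on the DC input
after_vcontrol = attenuate(ripple_in, 90)         # ~1.6 uVpp
after_extra_ldo = attenuate(after_vcontrol, 60)   # down in the nVpp range
```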
Each LDO has a 4.7uF input capacitor, plus a 0.1uF decoupling capacitor on V_CONTROL. In addition, the 120uF polymer capacitor on the second LC output filter of the buck regulator provides bulk capacitance. Each LDO also has a 1uF and a 47uF ceramic capacitor on its output. The output also has a single 120uF polymer capacitor for bulk capacitance.
One negative point about these LDOs is the 25mohm output impedance brought about by the ballast resistor. This hinders load regulation as it’s outside of the feedback loop. The effect is mitigated by paralleling LDOs. In addition, any external wiring connected to the power supply is likely to have more series resistance, so extremely tight load regulation isn’t critical. In my implementation, I paralleled 3 of the LDOs for 3A maximum output capability, with unpopulated footprints for two more to extend capability up to 5A. A quick thermal analysis shows that even in worst case conditions, the ICs should be within thermal spec.
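The benefit of paralleling is easy to quantify: the ballast resistance divides by the number of parts, and the fixed 2V of headroom splits the heat evenly. A quick sketch:

```python
# Paralleled LT3080-1s: effective output resistance and per-part dissipation.
R_BALLAST = 0.025  # ohms, internal ballast resistor per LDO
HEADROOM = 2.0     # volts always dropped across the LDO stage

def output_resistance(n_ldos):
    """Effective ballast resistance with n parts in parallel."""
    return R_BALLAST / n_ldos

def dissipation_per_ldo(i_out_total, n_ldos):
    """Watts burned in each LDO, assuming equal current sharing."""
    return HEADROOM * i_out_total / n_ldos
```

With 3 parts at the full 3A, each LDO sits at about 2W and the effective ballast drops to ~8.3mohm; adding the two spare parts for 5A keeps the per-part dissipation at the same 2W.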
Current draw devices were added to the LDO output to meet minimum output current requirements. A simple resistor is insufficient here because the output voltage is unknown. I opted for the NSI45015WT1G, an in-line current regulator that maintains a constant current over an admirably wide voltage range. It’s intended for the LED driving market but works well here. The constant current sources have the additional advantage of discharging the output upon shutdown, at least to their dropout voltage.
One highly desired feature which is standard on most bench power supplies is a digital readout of output voltage and current. I briefly considered using the MCU’s built-in ADC, however at only 10 bits it would only have a resolution of 17mV or 2.9mA. I opted for a 16-bit, four channel I2C ADC, the ADS1115. 16 bits of resolution allows for 0.27mV or 0.04mA of resolution. Using the ADC in single-ended mode reduces the output range to 15 bits because of the two’s complement output, reducing resolution to 0.54mV or 0.08mA. I would have preferred an extra few bits to throw away to noise, but being a DC power supply, the noise can be filtered out digitally. An INA210 current sense amplifier was used with a precision 10mohm resistor for current measurement. The use of a current sense resistor causes a hit in load regulation because the resistor is outside of the feedback loop. This could have been avoided by placing the resistor at the output of the buck regulator or opting for a 1mohm hall effect sensor. However, as mentioned before, external leads far exceed 10mohm so its contribution is negligible.
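The resolution figures above fall out of a one-line calculation over the 18V/3A full-scale ranges:

```python
# Measurement resolution: MCU's 10-bit ADC vs. the ADS1115 used single-ended
# (which wastes the sign bit, leaving 15 usable bits).
V_RANGE, I_RANGE = 18.0, 3.0  # full-scale voltage and current ranges

def lsb(full_scale, bits):
    """Smallest resolvable step for a given full scale and bit depth."""
    return full_scale / (2 ** bits)

mcu_v, mcu_i = lsb(V_RANGE, 10), lsb(I_RANGE, 10)  # ~17.6 mV, ~2.9 mA
ads_v, ads_i = lsb(V_RANGE, 15), lsb(I_RANGE, 15)  # ~0.55 mV, ~0.09 mA
```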
One feature I was unwilling to compromise on was safety. The AC/DC forward converter is certainly the riskiest part. I mitigated this by choosing a well respected supplier, and also taking a peek inside to make sure the proper precautions were taken. I’ve designed such a converter myself, but didn’t want to take any chances here. On the DC/DC side, each of the buck and linear regulators have their own thermal shutdown and overcurrent protections. I added my own as an extra layer of safety. My requirement was that this be implemented in hardware, not firmware so that it would be bulletproof and not subject to firmware bugs, microcontroller lockup, etc. To this end, I chose an ADC (see above) containing a built-in comparator feature. Upon over-current or over-voltage, the ADC asserts the fault pin, disabling the buck regulators. The fault signal also drives a crowbar circuit, clamping the dual outputs to ground with a time constant of 75ms, with some hefty discharge resistors and NFETs rated to handle the worst-case 1.9W dissipation. The fault signal also drives a fault LED for user indication. Finally, the fault signal toggles a pin on the microcontroller, which in turn updates the LCD screen to politely inform the user that they should re-evaluate their life choices.
The primary piece of user feedback is a 4×20 backlit display from Newport LCD, which has just enough characters to display current and voltage for two channels, as well as some snarky messages when the user triggers a fault. This is driven over the standard Hitachi LCD interface from an Atmel ATMEGA328P, my favorite microcontroller workhorse. The 328P allows me to tap into standard arduino libraries, while cutting out the arduino bootloader for some extra space/performance. In my experience the 328P is perfectly happy running on its internal RC oscillator, even with I2C (although I always leave hooks for an external crystal, just in case). It’s programmed with Atmel’s AVRISP. The microcontroller’s primary purpose is to read voltage/current from the I2C ADC and display it on the LCD screen. Combined with an external FTDI USB-serial chip, the microcontroller can also program output voltages, for automated tests and such. The microcontroller is supplied by a 5V LDO, fed from the AC/DC converter’s messy output. The LDO is the ADP7142, which is also used for the 19V rail. The two LDOs are fed through an LC filter, to provide extra isolation from the noisy AC/DC converter. Nothing super exciting here. One point to note is that for the 19V rail, I added an RC circuit in parallel with the upper resistor in the feedback divider to kill loop gain at higher frequencies. Because the part has a 1.2V CMOS reference, scaling all the way to 19V scales noise proportionately, which is mitigated by reducing loop gain at higher frequencies.
Layout files and Gerbers available upon request. This is implemented on a 4-layer PCB designed in Kicad, following standard DFM, EMC and general analog design practices. I’ll point out a few things I paid particular attention to. Layer 1 was used for power planes, and a few DC digital signals (button, LED, etc). Layer 2 served as the power ground. Layer 3 served as the clean analog ground, while layer 4 was reserved for clean internal rails, regulator feedback and set point signals and associated op amps, as well as clean ADC/DAC signals. Extra care was applied to solid ground returns for the sensitive analog signals (especially the high-impedance regulator feedback), with a single connection point (at the source) for the analog and power grounds.
The power path generally flows from left to right, with the AC/DC input feeding the bucks then the LDOs on the top side, with support circuitry on the back. Testpoints on important nodes allowed for easy probing and blue-wiring. Here’s a floorplan:
And the copper for the four layers:
From a decoupling standpoint, I used standard practice with vias placed near decoupling caps. Extra vias tie the ground plane together near thermal loads, especially the buck and linear regulators. I’m not a huge fan of thermal relief on pads (especially for high frequency decoupling, due to added inductance) but deemed it necessary since the board would be hand-soldered. Copper for the two interesting layers:
See above (control section) for general microcontroller function. Software was programmed in the arduino environment using standard arduino libraries. Code is available here.
Nothing super interesting here, although I did assemble the board entirely by hand, giving me some more valuable experience with SMD soldering and the heat gun. Soldering is fun! Despite the use of thermal reliefs, some of the larger components were a massive pain to solder, especially the inductors and their massive power planes. I used rubber standoffs as mechanical supports for the board. I purchased a case from Vetco Electronics (Bellevue, WA) and other mechanical parts from Digikey. The case was crudely machined and filed out by hand to fit the parts in — I’ll learn SolidWorks and 3D printing one of these days.
Noise and transient analysis still to be completed, however I have pushed the supply to its DC limits and verified the crowbar and fault circuitry, and thermal handling at maximum spec power output. Will update!
Post-Mortem and Lessons
As usual, I did learn a ton on this project. A few things I would have done differently:
-I would have definitely added current control, both from a manual standpoint and an automated software-control standpoint.
-I spent a long time struggling with Kicad and specifying a power/ground plane, only to figure it out at the very end once I was done. There’s an atypical priority-setting scheme, gotta watch out!
-I do wish I had more control over regulator feedback, and tuning the control loop for transient response and having more confidence in stability. This was entirely out of my control, as the vendors of these parts I used are abysmal at publishing information on their control loops.
-This is a silly one- I specified the PCB way too thin. I went with the minimum thickness of 1500um. The flimsy board made me constantly worry about it snapping, when really there’s no advantage to having a thin board.
-The project revealed my desperate need to improve my mechanical design chops. It’s almost an industrial design problem — the electronics are kickass but the thing looks like it’s held together with duct tape. Goal for next project: design and 3D print its enclosure.
-A repeated mistake (the worst kind of mistake)- pins need to be labeled in silkscreen! This makes things way easier during debug and assembly.
Here’s a list of hardware bugs in V0.1, each of which required a blue-wire rework.
-This is a silly one. When deciding on I2C addresses I checked the datasheets to be sure there were no conflicts. Unfortunately I neglected to notice that the DACs have an “all-call” address which they respond to in addition to their resistor-strapped address. And of course, the all-call address conflicts with the address I chose for the ADCs. A quick trace cut and blue wire rework changed the ADC address to 100 1001, resolving the conflict.
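The habit this bug taught me generalizes: when assigning I2C addresses, check every address each part responds to, all-call/broadcast addresses included. A sketch of that check (the addresses below are made up for illustration, not this board's actual map):

```python
# Check an I2C address map for conflicts, where each device lists *all*
# addresses it responds to (strapped address plus any all-call/broadcast).
def find_conflicts(device_addresses):
    """Return the set of 7-bit addresses claimed by more than one device."""
    seen, conflicts = set(), set()
    for addrs in device_addresses.values():
        for a in addrs:
            (conflicts if a in seen else seen).add(a)
    return conflicts

# Hypothetical example: a DAC that also answers an all-call address 0x48,
# colliding with an ADC strapped to 0x48.
bus = {"dac": {0x4C, 0x48}, "adc": {0x48}}
```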
-Fault override issue. In the initial design, the microcontroller had no knowledge of whether a fault had occurred. This was a mistake, as I had neglected to notice in the ADC datasheet that a fault condition on its digital comparator is cleared by a subsequent I2C read of the ADC output register. Because the microcontroller continuously reads the ADC output to update the LCD screen, any fault would be almost instantly cleared. This was solved by reworking the fault signal to also drive a spare pin on the micro, so that the micro knows a fault has occurred and will not query the ADC output register. I took this opportunity to add a feature such that the LCD screen is updated with a message upon a fault condition, something that wasn’t possible without the rework. Always try to turn bugs into features!
-Finally, an unexpected issue arose with the output LDOs in a fault crowbar condition. Unfortunately I can’t remember the exact issue (another lesson — don’t wait 6 months before doing the writeup!). The output LDOs are supplied by a 19V rail driving V_CONTROL (the drive circuitry for the LDO). The 19V supply was permanently enabled, with its EN pin hard-wired high, which caused a problem when the output was crowbarred low. This was solved by cutting the trace hardwiring the EN pin, and wiring it to the fault signal such that the 19V LDOs are disabled upon a fault condition.
All in all, this project was a success. I was able to make a power supply similar in performance and feature set to commercial supplies 10x more expensive, and was able to learn something at the same time. Kicad has some quirks but I’ve gained some confidence in its utility for future commercial and hobby designs. Excited to apply the lessons from this project to future work!