Tale of Two Power Systems – UPS Edition

This one nearly fooled us; we recalled the “two power systems” nature of a recent site, and so when a second data set came in with somewhat similar characteristics, we thought it might be more data from the same facility. But this is a completely different site, and a completely different problem!

Looking over a power monitoring data set recently, we came across a site with a dual personality. The site in question had a marginally high total harmonic distortion (THD ~ 5.5%) from 5/15/17 up until 5/24/17 (specifically, at 3:50 pm). After that, the THD trended much higher, rising as high as 12% (with a large amount of visible high frequency noise).
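
As a quick refresher, THD is the RMS sum of the harmonic components relative to the fundamental. A minimal sketch of the arithmetic, using made-up harmonic magnitudes on an assumed 277 V fundamental (numbers chosen to land near the logged values):

    import math

    def thd_percent(fundamental, harmonics):
        """THD = RMS of the harmonic components / fundamental, in percent."""
        return 100.0 * math.sqrt(sum(h * h for h in harmonics)) / fundamental

    # Hypothetical per-harmonic voltage magnitudes (volts), 2nd through 9th:
    before = [1.2, 12.0, 0.8, 8.5, 0.5, 6.0, 0.4, 3.0]     # mostly 3rd / 5th / 7th
    after  = [5.0, 18.0, 6.0, 15.0, 7.0, 12.0, 8.0, 10.0]  # energy across the spectrum

    print(thd_percent(277.0, before))  # ~5.9% -- marginally high
    print(thd_percent(277.0, after))   # ~11.2% -- clearly abnormal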

A “before / after” look at the voltage and current waveforms provides more evidence, with visible notching on the voltage waveform “before,” and very high levels of broad-spectrum high frequency noise “after.”

[Figures: waveforms, before 5/24/17 | after 5/24/17]

Individual harmonics similarly supported the findings of the THD log, with harmonics under 3% before 5/24/17, and very high harmonics across the spectrum after 5/24/17.

[Figures: harmonic spectra, before 5/24/17 | after 5/24/17]


Finally, the “before / after” effect was also seen in the Neutral-Ground voltage – with severely high NG voltages after 5/24/17, consisting mainly of high frequency components.

[Figures: Neutral-Ground voltage, before 5/24/17 | after 5/24/17]

The clue to understanding this puzzle is that the RMS voltage of the device under test was very stable and well regulated (probably a UPS or power conditioner output) before 5/24/17, and higher and less well regulated after that date.

And here is the “moment of truth” when the voltage changes from moderately distorted / notched to severely distorted with high frequency noise.

[Figure: Moment of Truth]

So what’s the scoop? We’re not on site, but here’s our bet: there is a UPS supplying 480Y/277 VAC power to the load, but it is itself being fed 480 VAC Delta (ungrounded and/or no neutral). During Inverter operation, the UPS works fairly well (although we bet the notching and higher THD are not normal for this device). But when switched to Bypass, the load loses the neutral reference and picks up noise from the UPS rectifier and/or inverter circuitry.

The trip report notes “No power problem suspected, multiple tube failures, want to eliminate power as an issue.” They probably assume UPS installed = no power issues (and they are probably not experiencing sags / swells / etc. on other facility loads). But if they are operating a system requiring 480Y/277 VAC from 480 VAC Delta, and relying on the UPS to provide a neutral connection point, they are probably having some serious grounding, reference, and noise issues!

Tale of Two Power Systems

Looking over a power monitoring dataset recently, we came across a site with a dual personality. The site in question had low total harmonic distortion (THD ~ 1.5%) from 4/1/17 up until 4/10/17 (specifically, at 2:00 pm). After that, the THD fluctuated much higher, rising as high as 5.4% (outside of manufacturer requirements for medical imaging equipment).

[Figure: Tale2Power THD]

A closer “before / after” look at the voltage and current waveforms provides more evidence, with visible notching on the voltage waveform “after”; the monitored current showed some noise but remained similarly low.

[Figures: Tale2Power waveforms, before 4/10/17 | after 4/10/17]

Individual harmonics similarly supported the findings of the THD log, with all harmonics higher, and 5th harmonics exceeding 3%.

[Figures: Tale2Power harmonics, before 4/10/17 | after 4/10/17]

[Figure: Tale2Power NG RMS]
Finally, the “before / after” effect was also seen in the Neutral-Ground voltage – with noise voltages evident, although the lower frequency voltages were not much higher.

[Figures: Tale2Power NG, before 4/10/17 | after 4/10/17]

The funny thing is that the RMS voltage and current of the device under test were not significantly different before and after the 4/10/17 date, either in RMS level or in stability (sags, swells, or fluctuations).
[Figure: Tale2Power RMS]

So what’s the scoop? We’re not on site, but odds are good that some facility load (we’re betting air conditioning, but it could be other facility loads) got switched on at this time. Or perhaps the facility transitioned to an alternative power source. But whatever the reason, this is clearly a tale of two power systems, and we’re curious about it!

UPS Overload and Bypass: CT Scanner Load

A quick consulting project came over the transom this week. A 150 KVA UPS, protecting a CT scanner, was occasionally overloading and transferring to bypass.

[Figure: UPS Bypass 02]

Here, the transition to Bypass is evident in the step change in voltage from a rock solid 480 VAC (UPS Inverter) to a very high 515 VAC (Bypass).

[Figure: UPS Bypass 01]

Drilling in a bit more, we see the CT Scanner switch on (point “A”) with a maximum current of 245 Amps and a resultant collapse of the UPS output, a short period where the CT current drops and the UPS output stabilizes, then a transition to Bypass (point “B”). Note the increase in voltage while operating on Bypass.

At the end of the CT scan (point “C”) the voltage rises, as the load current no longer drops across the source impedance. And the UPS stays in Bypass for an extended period (point “D”), needing to be manually reset.

[Figure: UPS Bypass 03]

A close-up of the “start of scan” waveform shows the nature of the inrush current (higher for just one cycle) – although the UPS voltage drops more than usual, it does not really fold or collapse.

Nothing really unusual here – some finger pointing at impedance (not really an issue; the voltage drop on the unregulated bypass was just 2.7% at full load) and voltage distortion (under 3% on the UPS input) – neither of which is a problem. The UPS got sized based on power monitoring, which apparently did not capture the peak load condition.

I suggested that the higher voltage on Bypass (515 VAC, about 7% above nominal) would mean lower observed current, although that did not factor into the calculations (they were monitoring further upstream on a 208 VAC source). The UPS vendor is going to see if they can tweak the protection circuitry a bit to be able to survive and supply this short overload without a bypass transition.
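
Back-of-the-envelope on that point, assuming the scanner draws roughly constant power (an assumption on our part, with the 150 KVA rating standing in for the peak load):

    import math

    def line_current(kva, v_line_to_line):
        """Three-phase line current for a given apparent power."""
        return kva * 1000.0 / (math.sqrt(3) * v_line_to_line)

    peak_kva = 150.0                      # stand-in: the UPS nameplate rating
    print(line_current(peak_kva, 480.0))  # ~180 A on Inverter (480 VAC nominal)
    print(line_current(peak_kva, 515.0))  # ~168 A on Bypass -- about 7% lower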

Remembering Joe Briere

I found out this morning that Joe Briere of Computer Power Northeast passed away back in March – obituary.

Joe and I go back a long way. When I worked at Philips Medical in Shelton (pre-1995) he stopped in a few times to see if there was any business there; he was plying his trade as a power quality consultant in the medical imaging field, doing a lot of work with Siemens Medical, so it was natural that our paths would cross. He was one of those “I’ll help you with your power quality issues and maybe sell a transformer, voltage regulator, power conditioner, Uninterruptible Power Supply, or Surge Protective Device along the way” kind of consultants who inevitably made a lot more money than me (I chose not to sell or rep products, just provide technical services). He was always a straight shooter – I never found him overselling or over-promising – and he was a hands-on guy who knew his way around grounding, the electrical code, isolation transformers, etc. We did not agree on everything, but I always respected his opinions and experience.

For a few years there (2001 – 2003) Joe and I would often find ourselves meeting at Siemens trouble sites across the country as the service organization tried to get a handle on power and grounding issues, so I got to know him pretty well, poking around hospital electrical systems and sharing a beer and a meal afterwards.

Have not been in contact for many years (my last contact with Joe was 2003, and with Computer Power Northeast was 2009) but I’d pop over to the website now and then to see if they were still in business. Joe was probably retirement age when I started working closely with him, and reportedly kept busy well into his 80’s. His business partner called for a little consulting project this morning and shared the news.

Was a long time ago in a galaxy far, far away. Power Quality Consulting was kind of the wild west back then and the folks who knew what they were doing, were not afraid to open an electrical panel and make some measurements, and could sort out technical issues that left others scratching their heads were a rare breed. RIP, old friend.

Support Your Independent Power Quality Consultant (or Pay a Lot More)

Came across this one on a LinkedIn power quality board.

*************************************************
Skiers blowing up UV light bulbs

I am looking for a small three phase voltage regulator / power conditioner to fix a utility voltage fluctuation problem that is blowing expensive UV lamps in the water purification system in a small town. You see, the water filtration facility (or shack) is connected to the grid at the end of a long utility line, and its closest neighbor is a ski hill. When the ski lifts or the pumps for the snow making machines start / stop, the 600V gets high voltage spikes of up to 640V. This is blowing up the UV bulbs.

I need to replace the existing autotransformer with a 600V, 3-wire input and 277V, 3-wire output power conditioner / voltage regulator. Probably will need something in the 10-15 kVA range.

[Figure: Open Delta Autotransformer]

Any ideas for good quality US or Canadian manufacturer?

*************************************************

Spoiler Alert: odds are pretty good that this site does not need a “power conditioner/ voltage regulator”. The culprit here is almost certainly the transformer used to convert 600 VAC (ph-ph) to 277 VAC (ph-n) for the lighting circuits. They are using autotransformers hooked up in an Open Delta configuration – no doubt something like this:

[Figure: Open Delta Wiring]

Problems:

  1. The voltages from X1 / X2 / X3 to neutral are not going to be balanced or particularly stable – they will change with load. X3-N will in fact be 347 VAC, and the other phases will be all over the place, fluctuating with load (see the sketch after this list).
  2. The voltages from X1-X3 and X2-X3 will be fairly stable (direct magnetic coupling), but X1-X2 (the open delta phase) will fluctuate.
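
To put numbers on point 1 – a sketch of the arithmetic only, since the exact phasor geometry depends on how the autotransformers in the figure are tapped:

    import math

    def phase_to_neutral(v_line_to_line):
        """Line-to-neutral voltage of a balanced three-phase system."""
        return v_line_to_line / math.sqrt(3)

    print(phase_to_neutral(600.0))  # ~347 V -- where X3-N lands on the 600 V system
    print(phase_to_neutral(480.0))  # ~277 V -- what the lighting circuits actually want
    # The "neutral" derived from the open delta is not the balanced system
    # neutral, so X1-N and X2-N wander with load instead of sitting at 277 V.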

I’d probably recommend measuring the X1 / X2 / X3 voltages (Ph-Ph, Ph-N, and Ph-G) and when they come up unbalanced, I’d know where the problem was.

Now a dozen or so power quality experts (with a product to sell and a commission to accrue) had lots of suggestions for power conditioning or voltage regulation solutions. These might fix the problem, but at much higher cost and complexity.

The best solution is most likely a simple Delta-Wye isolation transformer, 600 VAC primary, 480Y/277 VAC secondary.

[Figure: Delta Wye Wiring]

The problem is – nobody gets much of a commission on a 15 KVA isolation transformer (maybe a $1500 box plus installation, breakers, etc.) so who wants to recommend THAT?

Your friendly neighborhood independent power quality consultant, that’s who . . .

Ground Resistance Testing

I just sent a client a document from a seminar that I created and led in 1996. (The seminar client is long out of business).

It’s nice to be (a) the old dog who was around back in the day, and (b) a bit of a digital pack rat. Also interesting that the technical issues of 2017 are not so different from the technical issues of 1996.

Here’s a snapshot of that document (pretty slick for 1996, no?) – and no guarantees that the IEC / UL references or requirements are still valid. But the concept – that measuring ground resistance with a low current ohmmeter is going to give you sketchy results – remains valid.

[Figure: Ground Resistance Testing]
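
For the curious – the classic alternative to a low current ohmmeter is a fall-of-potential test with a dedicated tester: inject a known current into the electrode, measure the voltage to a probe at several spacings, and look for the plateau in the R = V/I curve. A minimal sketch, with hypothetical readings (illustrative numbers only):

    # Fall-of-potential sketch: hypothetical readings, illustrative only.
    test_current = 0.5  # amps injected by the ground tester

    # (probe distance in feet, measured volts to the potential probe)
    readings = [(10, 0.90), (20, 1.15), (30, 1.24), (40, 1.26), (50, 1.27), (60, 1.45)]

    for distance, volts in readings:
        resistance = volts / test_current  # Ohm's law: apparent ground resistance
        print(f"{distance:3d} ft: {resistance:.2f} ohms")

    # The flat region (~2.5 ohms here) is the electrode resistance. A low
    # current ohmmeter never develops enough signal to swamp contact
    # resistance, galvanic potentials, and stray noise -- hence "sketchy."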

Reading the Tea Leaves

Sometimes when reviewing power monitoring data, key information is left out of the problem statement. But the astute power quality engineer can “read the tea leaves” and pick up information about the installation, equipment, and technical issues.

A set of data from a Magnetic Resonance Imaging system was presented for analysis with the following notes:

System has had several intermittent issues that have caused the system to be down, but functional upon arrival. System issues have been in the RF section. No issues have been reported since installation of the power recorder.

RMS Voltage and Current

[Figure: Chiller RMS Logs]

First clues come from looking at the RMS logs of the voltage and current. The voltage is suspiciously well regulated – and is probably a UPS or power conditioner rather than a normal utility source (which will tend to fluctuate over a 24 hour period). A second clue is the small voltage increase or swell related to load switch-off – typical of an active source, not typical of a passive source.
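
One way to quantify that “suspiciously well regulated” hunch is to compare the spread of the RMS voltage log against a normal utility feed. A sketch with hypothetical 24-hour logs:

    import statistics

    def regulation_percent(rms_log):
        """Peak-to-peak spread of an RMS voltage log, as a percent of the mean."""
        return 100.0 * (max(rms_log) - min(rms_log)) / statistics.mean(rms_log)

    # Hypothetical 24-hour RMS voltage samples (volts):
    utility_feed = [277, 279, 281, 275, 273, 278, 282, 274]    # daily swing
    conditioned  = [277.0, 277.2, 276.9, 277.1, 277.0, 276.8]  # barely moves

    print(regulation_percent(utility_feed))  # ~3% -- normal utility behavior
    print(regulation_percent(conditioned))   # ~0.1% -- smells like a UPS output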

Second, this appears to be a regularly cycling load – a pump or compressor. MRI systems typically have a chiller or cryogen cooler associated with them – so odds are good this was monitored on this load, and not on the MRI system itself.

Chiller or Cryogen Cooler Load

[Figure: Chiller Cycling RMS]
More evidence supporting the chiller or cryogen cooler load – a regular (practically clockwork) cycling load, with a marginally higher operating current (~30 Amps) but a very high inrush current (~180 Amps).
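
Flagging those cycles in a current log is straightforward – anything far above the ~30 Amp running current is a start event. A sketch, with hypothetical one-second RMS samples:

    def find_inrush(current_log, running_amps=30.0, factor=3.0):
        """Return indices where RMS current jumps well above the running level."""
        threshold = running_amps * factor
        return [i for i, amps in enumerate(current_log) if amps > threshold]

    # Hypothetical RMS current samples around a compressor start (amps):
    log = [2, 2, 180, 35, 31, 30, 30, 29, 2, 2]
    print(find_inrush(log))  # [2] -- the ~180 A inrush at switch-on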

[Figure: Chiller Transient]

Normal chiller or cryogen cooler inrush is seen here. A minor (~5%) voltage sag was captured during each inrush current, as well as minor associated transients (probably relay or contactor switch bounce).

Abnormally High Inrush Currents

[Figure: Chiller Sag]

In addition to the regular inrush currents associated with chiller cycling, six instances of very high inrush current were captured. These were seen both as voltage sag events and as current-triggered events. We’re concerned that this high inrush current may be causing an overcurrent condition on the UPS / power conditioner – which may be throwing a fault or error, or perhaps switching to Bypass.

[Figure: Chiller Swell RMS]

Looking at the RMS logs of the high current swell / voltage sag event, we see that it precedes a period of extended chiller / compressor operation. It is unknown whether this is normal operation for the system / device or indicates an error or fault of some sort.

Summary

Although the accompanying technical information was thin, we’ve “read the tea leaves” and provided the following analysis bullets:

  • System appears to be powered by a UPS or power conditioner. Service personnel may not have known this.
  • System appears to be a chiller, compressor, or similar device (not the medical imaging system itself).
  • Occasional high current swells were seen; these may be normal or may point to system issues.
  • Voltage sags and collapse during these high current swells may indicate that the UPS or power conditioner is overloaded, and may be experiencing faults or alarms that impact system operation / uptime.

The Curious Case of the UPS Loading

We recently got to review input and output monitoring data from a UPS system (make and model not specified) feeding a medical imaging system. The monitoring was done as a precaution, but we noticed something unusual.

First, take a look at the RMS voltage and current logging of the UPS input and output. Phase A voltage, Phase B current shown for clarity, but all voltage and current phases are balanced and similar.

[Figure: UPS Compare Input RMS]

UPS Input – Normal facility RMS voltage (daily fluctuations, with occasional sags) and RMS current peaks at approximately 80 Amps.

[Figure: UPS Compare Output RMS]

UPS Output – Highly regulated RMS voltage (with small load related fluctuations) and RMS current peaks at approximately 170 Amps.

The discrepancy between the input current and output current is unusual. It would be typical for input current to be marginally higher than output current (due to device efficiencies), but not lower. Our guess – the UPS DC bus (and probably the battery string) is being called on to support the peak output load.
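
A rough power balance makes the point – assuming 480 VAC line-to-line on both sides and similar power factors (neither is stated in the data set):

    import math

    def apparent_power_kva(v_line_to_line, amps):
        """Three-phase apparent power in kVA."""
        return math.sqrt(3) * v_line_to_line * amps / 1000.0

    v = 480.0                             # assumed nominal, input and output
    p_in  = apparent_power_kva(v, 80.0)   # ~66 kVA drawn from the facility at peak
    p_out = apparent_power_kva(v, 170.0)  # ~141 kVA delivered to the load at peak

    print(p_out - p_in)  # ~75 kVA shortfall -- DC bus / battery territory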

[Figure: UPS Compare Output Highest Load]

UPS Output – Step change in load current and nonlinear load is typical of medical imaging system. Very small fluctuation in output voltage related to load changes, and small increase in voltage distortion related to nonlinear load current.

[Figure: UPS Compare Input Highest Load]

UPS Input – Even at the highest levels, the current is linear; the UPS must have a unity power factor front end / rectifier. However, the lower current level is unusual, and indicates that the UPS battery is probably being called on to supply the peak medical imaging load.

There’s really no immediate problem here – the UPS is doing a great job of correcting input power issues, as well as supplying the complex loads (step change, pulsing currents, nonlinear power factor) of the medical imaging system.

However, it’s pretty clear that the UPS batteries are getting discharged during the highest-current imaging system operations – not really their intended purpose, which is to ride through far less frequent utility sags and outages. So it’s possible that the UPS batteries are being stressed and may degrade or fail prematurely, and need replacement. We’ve referred this to the UPS manufacturer / supplier for attention.

As a quick “in the field” test (we’re doing this analysis remotely, not on site) we might suggest disconnecting the battery string temporarily and seeing how the UPS performs without the battery, relying on the DC bus alone. We’re guessing the UPS might start to collapse or struggle to supply the medical imaging load – it may be undersized for the application without the battery string connected.

We’ve seen situations where a UPS that has heretofore worked well for years stops working quite so well, because the batteries started to wear out, and the unit was no longer able to supply the peak loads required by the imaging system.

The Guru’s Cat

When the guru sat down to worship each evening, the ashram cat would get in the way and distract the worshipers. So he ordered that the cat be tied during evening worship.

After the guru died the cat continued to be tied during evening worship. And when the cat died, another cat was brought to the ashram so that it could be duly tied during evening worship.

Centuries later learned treatises were written by the guru’s disciples on the religious and liturgical significance of tying up a cat while worship is performed.

– Anthony De Mello, The Song of the Bird

A music festival where I’ve been a volunteer for nearly 25 years is doing their annual mid-winter pre-fest sales – selling a limited number of tickets at a reduced price. It’s a good way to carry the festival organizers over the winter, and to give regular festies a price break.

What’s NOT so good is how they do it – phone only, with a limited staff processing orders manually over a three day period. I’ve seen some posts on social media:

“After 111 attempts to get through, over a span of twelve minutes, tickets have been procured!”

Mine took longer than usual — 179 calls and 48 minutes (you re-dial faster than I do) . . .

and from one of the folks on the other end of the line:

FYI y’all S & A are working fingers off to accommodate youralls calls for tix & are very grateful for your wonderful patience

See, the thing is, this could be done online through eCommerce – put a limited number of tickets up for sale (so you do not oversell), for a limited time. Yes, there’s a service fee (but probably not all that much higher than the credit card fees), and I suspect most customers would pony up an additional $5 or $10 per ticket to cover an eCommerce solution (and not have to dial in 100+ times). You could sell out your winter pre-fest stock without having to tie up customers, your staff, etc.

But… it’s been done this way for 25+ years and will probably always be done this way. Like the Guru’s Cat, sometimes we do things out of habit or tradition or inertia without stepping back and considering other options.