Curious Voltage Regulation

Came across an unusual case of voltage regulation for a CT Scanner site recently.

  • Site voltage appears to be well (and artificially) regulated, with none of the daily voltage fluctuations typical of utility power, but with small load-related voltage drops. The site had no significant outages, sags, swells, or transients over two weeks of monitoring.
  • The voltage harmonic signature supports some sort of static, inverter-based power source, with broadband harmonics. The elevated 5th, 7th, 11th, and 13th harmonics are curious.
  • No apparent voltage regulation related to step changes in load current – unusual for a static UPS or power conditioning device feeding a medical imaging load.
  • A moderate increase in voltage distortion related to nonlinear load current – again, not typical for a medical imaging device powered directly from a properly sized UPS or power conditioner.
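For what it's worth, those elevated orders match the characteristic h = kp ± 1 harmonics of an ideal p-pulse converter, which is consistent with a static, rectifier-fed source. A quick sketch (the function name is mine):

```python
# Characteristic harmonic orders for an ideal p-pulse converter: h = k*p +/- 1.
# A six-pulse rectifier front end produces the 5th, 7th, 11th, 13th, ... --
# exactly the elevated orders seen here. Function name is illustrative.
def characteristic_harmonics(pulses: int, max_order: int = 25) -> list[int]:
    orders = []
    k = 1
    while k * pulses - 1 <= max_order:
        orders.append(k * pulses - 1)
        if k * pulses + 1 <= max_order:
            orders.append(k * pulses + 1)
        k += 1
    return orders

print(characteristic_harmonics(6))    # six-pulse:    [5, 7, 11, 13, 17, 19, 23, 25]
print(characteristic_harmonics(12))   # twelve-pulse: [11, 13, 23, 25]
```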

So, what’s going on?

My first guess was a slow, electro-mechanical voltage regulator (such as a motor-driven Variac or Powerstat), which shows up occasionally for medical imaging or other sensitive loads. However, that sort of regulator would not eliminate short utility sags or drop-outs, nor would it affect the voltage harmonics or THD.

My next theory is that this site has a large UPS or static power conditioning device feeding an entire medical imaging department. The voltage drop and increased voltage distortion under load would be the result of voltage drops in distribution (impedance calculated to be roughly 60 mΩ at 480 VAC) – so perhaps the power conditioning device is located in the basement, with normal distribution voltage drops (primarily wiring, perhaps a transformer) from there to the equipment.
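That impedance figure comes from the usual back-of-envelope estimate, Z ≈ ΔV / ΔI across a load step. A minimal sketch, with illustrative numbers rather than the actual site data:

```python
# Back-of-envelope source impedance from a load-related voltage drop:
# Z ~= delta_V / delta_I across a load step. The numbers below are
# illustrative, not the monitored site's data.
def source_impedance_ohms(delta_v: float, delta_i: float) -> float:
    if delta_i == 0:
        raise ValueError("need a nonzero load current step")
    return delta_v / delta_i

# e.g. a 6 V dip on a 100 A load step implies about 60 milliohms
z = source_impedance_ohms(delta_v=6.0, delta_i=100.0)
print(f"{z * 1000:.0f} mohm")  # -> 60 mohm
```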

No significant power issues found – I just like to understand what I'm seeing in the data!

Won’t Get Fooled Again . . .

No critique or judgement implied in the title of this post – the lyrics of The Who's song ("…meet the new boss…same as the old boss…") just percolated to the surface as I came across this press release recently:

MTE Relaunches Prestigious TEAL® Brand for Precision Power (March 30, 2022)

Teal Electronics Corporation was my first and best client for many years. I worked closely with them developing and managing the PDU (Power Distribution Unit) for medical imaging systems with my then-employer Philips Medical Systems. I consulted with them for ages – training, application notes, customer support (remote and in the field).

I'm particularly fond of an Excel-based viewer for their TEALwatch product – you can see a bit of its GUI here – Dedicated Line, Back to the Service Entrance

Times change, principals and technical contacts move on, we dropped the retainer in 2011, and my last invoice to them was 2015. They were purchased by MTE Corp in 2016, and despite a few fleeting contacts, my relationship with the company effectively ended. My guess is that the TEAL business unit itself got a bit lost in the shuffle.

They do seem to have tweaked the logo a bit. So that’s something . . .

I'm only slightly amused / disappointed that the sum total of the MTE / TEAL application library consists of three application notes, the last survivors of a series I wrote for TEAL back in 1995…


Sneaky Voltage Swell Events

Long time, no post! A pandemic will do that to you . . . suffice it to say I am healthy, relatively happy, and staying busy with both my engineering work (mostly stable through the troubles) and my yoga studio work (very different these days but busier than ever with online classes and the need to provide appropriate technology).

Today's engineering bon-bon involves a trio of voltage swell events, seemingly caused by a short circuit (another facility load fault, perhaps utility lightning arrestors) that produces a 1/4 cycle drop-out on one phase and a resultant voltage swell immediately following. The swell carries a serious overvoltage, likely to cause problems for many types of equipment / power supplies.

  • This event was captured as a minor RMS voltage swell (5.1% above nominal) – but the peak voltage is very high (540 V vs 395 V normal). Note the ~30 Amp peak current swell.
  • This event was barely detectable as an RMS voltage swell (4.8% above nominal) – once I recognized the nature of these events, I went looking for more and came up with just this one. Almost no resultant current swell.
  • This is the big smoking gun event. Even though it did not register as a sag / swell event at all (the voltage drop-out being balanced by the following swell, just 3.25% over nominal), it was captured as a current swell (the highest current event over 6 weeks of data capture) – 650 Amps waveform peak. Something within the client load saw this event as a problem and drew a slug of current.
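Why does a serious peak overvoltage barely register as an RMS swell? Full-cycle RMS averages the whole cycle, so a boosted quarter-cycle moves it only modestly – and in these events the preceding drop-out cancelled much of even that. A synthetic illustration (the numbers are mine, not the site waveform):

```python
import numpy as np

# Why a serious peak overvoltage can hide from RMS-based swell detection:
# boost one quarter-cycle of a sine wave and compare the peak rise against
# the full-cycle RMS rise. Numbers are illustrative, not the site waveform.
fs, f = 7680, 60                       # sample rate (Hz), fundamental (Hz)
n = fs // f                            # 128 samples in one cycle
t = np.arange(n) / fs
v = 277 * np.sqrt(2) * np.sin(2 * np.pi * f * t)   # nominal 277 V phase voltage

swell = v.copy()
swell[: n // 4] *= 1.35                # 35% boost on the first quarter-cycle

peak_rise = swell.max() / v.max() - 1              # ~ +35% on the waveform peak
rms_rise = np.sqrt((swell ** 2).mean()) / 277 - 1  # only ~ +9.5% on full-cycle RMS
print(f"peak +{peak_rise:.0%}, RMS +{rms_rise:.1%}")
```

With a quarter-cycle drop-out on top of the swell (as in these events), the RMS figure shrinks further still, which is exactly how the smoking-gun event escaped the sag / swell detector.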

I'm just the hired gun reviewing the data, pulling out events and issues, writing a report. So I won't be following this to a resolution or further troubleshooting on site. But this is a good example of why automated report writing is not always sufficient – the Fluke 1750 analysis tools would see these events as minor voltage swells (if that). It takes a human being with some experience (I've reviewed over 5,000 power quality data sets since 2003) to see something unusual, scratch one's head, and dig a bit deeper.

Cause and Effect: Arcing Transients and Equipment Faults

We recently reviewed some power monitor data for a client. Problem statement:

Breaker Q1 in the WCS electronic box trips on a sporadic basis. The breaker is the M4 and M5 fan motor overcurrent protection. We have replaced the breaker multiple times.

First pass, we noticed three very high current swells in the ground current data:

[Figure: Ground RMS]

We also saw 100s of very serious arcing voltage transients, not related to load current changes or other voltage events. The transients showed up on Phase-Neutral and Neutral-Ground, but the NG transients were much lower amplitude (secondary, not the primary issue).
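One simple way to pull transients like these out of the steady 60 Hz fundamental is cycle-to-cycle differencing: subtract the waveform one full cycle earlier, and everything periodic cancels, leaving the impulses. A minimal sketch on synthetic data (sample rate, threshold, and spike location are all illustrative):

```python
import numpy as np

# Separate impulsive transients from the steady 60 Hz fundamental by
# cycle-to-cycle differencing: subtract the waveform one full cycle earlier,
# and everything periodic cancels, leaving only the sudden deviations.
# Synthetic data; sample rate, threshold, and spike are illustrative.
fs, f = 15360, 60
n_cycle = fs // f                        # 256 samples per cycle
t = np.arange(10 * n_cycle) / fs         # ten cycles of data
v = 120 * np.sqrt(2) * np.sin(2 * np.pi * f * t)
v[1400] += 300                           # inject one arcing-style spike

residual = v[n_cycle:] - v[:-n_cycle]    # cycle-to-cycle difference
hits = np.flatnonzero(np.abs(residual) > 100) + n_cycle
print(hits)  # the spike is flagged twice: entering and leaving the window
```

The double flag (once when the spike enters the one-cycle comparison window, once when it leaves) is a known quirk of this technique; real analyzers typically de-duplicate events that close together.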

[Figures: Transient Ph-N | Transient N-G]

Finally, we captured three current swell events that clearly show equipment faults to ground (notice the elevated ground current) immediately following voltage transients – cause and effect.

[Figures: Fault 1 | Fault 2 | Fault 3]

Figuring out the cause of the transients is an exercise for the local service engineers or an onsite power quality engineer. But we’ve got a pretty clear linkage here between transients and equipment faults. Most of the time, power quality problems are a lot less concrete and clear.

Audio Project: Buying Local vs. Amazon

I'm in the middle of a small audio project – converting 11 digital micro-cassettes (the sort used in old answering machines and personal recorders) to digital MP3 files for a client's book / memoir project. It's potentially slogging work (each cassette holds up to 90 minutes of content, so that's up to 16.5 hours of recording), but I've got it set up to run in the background while I do other things. I'm sort of embarrassed to admit that I started doing this on my computer (using the Audacity sound recorder / editor) before realizing I have a perfectly good TASCAM DR-07 digital recorder (which I use for live recording) that is actually designed for this sort of thing. Using the computer would have been exceptionally onerous; the digital recorder makes it almost trivial. The only potential problem is battery life on the micro-cassette player – I picked up a DC power supply, but it introduced severe hum into the signal, so I'm back to AAA batteries. (I have a stack of those, but I imagine the batteries will die mid-recording a few times during the process.)

I'm probably over-killing the set-up here – running the 2.5mm mono output of the micro-cassette player through adapters (2.5mm -> 3.5mm -> 1/4″), then through a DI to get XLR out, into a Mackie mixer to tweak the sound a bit (cutting some of the highs and lows, optimizing the level), and finally into the TASCAM. I've got a set of headphones to listen in now and then. Next time I'm out I'm going to pick up my spare monitor speakers from my storage locker so I can have the audio going low in the background and lose the headphones.

Getting the adapters was half the battle here – the 2.5mm mono out is super non-standard. I went to the local Cables & Connectors store, which could only supply a 2.5mm stereo plug; it kind of sort of worked, but was a little flaky. I ended up (as always) finding exactly what I needed at Amazon (2.5mm mono to 3.5mm stereo), and while I was at it, picked up a couple of 3.5mm stereo to 1/4″ breakout cables, which I seem to want every other time I pull out the mixer these days (record out to the TASCAM, line out from phones and tablets). I have a bunch of 3.5mm to RCA breakouts, but 1/4″ is a lot more mixer-friendly.

I like to support local retail when I can, but I'm invariably looking for something a little weird or left of center, and rather than drive around all day for something that almost works, I just pull up the exact right item on Amazon, order via Prime, and it shows up 1–2 days later. Probably better for the environment as well (considering the gas I'd burn driving around).

Emergency Lighting & Exit Signs: Sleazy, Sloppy and Unethical

Let’s say you pull into one of those quick lube places to get an oil change. You’re not really paying attention, so you do not realize that your spouse was in there a few weeks earlier, and had gotten the oil changed. The attendant knows this (they pull up the vehicle data from the license plate or they check the little sticker, it’s been just a few hundred miles) but they say nothing – simply change the oil and send you on your way.

Would you be pissed off? I suspect the answer is yes.

Happened to my employer this week. I've been doing their exit sign and emergency lighting testing for a few years now – a monthly quick test (push the button, confirm it works for 30 seconds, pass / fail) as well as a more involved annual test (kill the building power for 90 minutes and make sure the lighting batteries hold up). I replace the batteries and/or lighting units as needed. I last did the big annual test in July 2018, and dutifully punched out the little stickers that I purchased to make it all official.

[Figure: Exit Sign Sticker]

This week, I noticed that one of those big industrial service companies was in to inspect the fire extinguishers, and somehow convinced the facility manager that she needed an annual electrical inspection (they may have actually sold her undercoating and floor mats while they were at it). So all of my little stickers were removed and replaced with theirs, even though my stickers clearly indicated that an annual test had been done in July 2018, just two months prior.

Unethical = check. Sleazy = check. And to add insult to injury, there is no way in hell they did a full 90-minute "annual test" in the time they were on site – I suspect they did a quick push-button test. Potential criminal liability there. The funny thing is, the only reason I noticed was that they slapped the stickers on the front panel of the devices (rather than discreetly along the side, where my stickers were), so I spotted them as soon as I walked into the facility. So add "sloppy and lazy" to the descriptors.

We'll be going back to them once the invoice comes in to get the charges reversed. And if that is ineffective, we'll be sending a letter of complaint to the town fire inspector, as well as the state department of consumer protection.

Vivatech Service Cart – Circa 1997

This one goes back 20 years, to the earliest days of PowerLines; prompted by an offhand recollection at a customer site this morning. I've always been willing to jump in with both feet, and to pick up work however it showed up, especially in the early days.

Vivatech (Teterboro, NJ) was a start-up intent on developing tools and processes to prolong the life of an X-ray tube through replacement and processing of the tube cooling oil. As background, an X-ray tube is a fairly high-dollar expendable for a CT scanner, with a lifespan measured in "slices". The principal of Vivatech was a doctor who owned an imaging center and, as far as I can tell, was simply pissed off at having to purchase replacement tubes. So he came up with the concept of periodically replacing and/or filtering the X-ray tube cooling oil as a way to extend tube life, and obtained a patent – US 5440608 A: Method and system for extending the service life of an x-ray tube

If you have seen the carts that the quick lube places use to process / replace automobile coolant, you get the idea.

To do this, he fitted his CT scanner with connectors that permitted the cooling lines to be opened, built a service cart with pumps, reservoirs, filters, and valves that inserted into the oil cooling system, and developed a fairly complicated process for filtering the oil, replacing the oil, and then working the air bubbles out of the system. The claim: "The company has developed a tube maintenance program that it says can help tubes last for up to 300,000 slices, far more than the 75,000 slices that are the industry average." (Diagnostic Imaging, July 1996). That's pretty much the only online evidence I can find that the company existed – the Vivatech name has been picked up by a variety of businesses and conferences / events in the ensuing years.

I got sucked in through the side door: I had worked with one of the principals on power quality issues at NJ-area clinics, and ended up being brought in to consult in a bunch of areas:

  • Develop the controls for a production quality cart (PLC vs. relays)
  • Advanced diagnostics and user interface
  • User / Service Manual
  • Marketing support (PowerPoint, etc.)

I'd have to dig a bit for some photos, but I did find the Operator's Manual for the Vivatech Service Cart (PDF) in my archives. Kind of a fun look back (for me, anyway). Put together in MS Word, all the graphics in RFFlow (if I recall correctly). Interesting that we were working in B/W documentation back then – plenty of places where color would have been useful / advised, the cautions and warnings in particular.

The front panel (Page 3-2) was all me – we had transitioned from a hand-wired panel using 24 VAC relay logic and incandescent bulbs to a PC board with mounted square LED lamps (in one of three colors, red / amber / green, before blue LEDs came along).

The other fun part was adding some user diagnostics / indicators – the prototype unit simply had lights that turned on and off. I added all sorts of "value added" indications, using the PLC to make the same sets of lights flash during operation (slow flash = normal / wait, fast flash = error / fault).
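The flash coding is simple to express: the lamp output becomes a pure function of the machine state and a free-running scan counter. A sketch in Python rather than ladder logic (state names and periods are mine, not the original PLC program):

```python
# PLC-style flash coding on a single lamp output: steady on, slow flash for
# normal / wait, fast flash for error / fault. Scan-based: the lamp is a pure
# function of machine state and a free-running tick counter. State names and
# periods are illustrative, not the original ladder logic.
SLOW_HALF_PERIOD = 10   # ticks per half-cycle (e.g. 1 s at a 100 ms scan)
FAST_HALF_PERIOD = 2    # five times faster

def lamp_output(state: str, tick: int) -> bool:
    if state == "on":
        return True
    if state == "wait":     # slow flash = normal / wait
        return (tick // SLOW_HALF_PERIOD) % 2 == 0
    if state == "fault":    # fast flash = error / fault
        return (tick // FAST_HALF_PERIOD) % 2 == 0
    return False            # "off" or anything unknown

print([lamp_output("fault", t) for t in range(6)])
# -> [True, True, False, False, True, True]
```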

I also recall an intricately machined manifold that all the various valves, motors, filters, and ports were mounted on, and a large plexiglass reservoir to hold the in process oil.

Some of the wild and crazy engineering things I got involved with, back in the day. I’ll see if I can dig up some photos and add those.

Edit: No dice on the photos. I did find a box of photos that goes back that far, and there was a divider in the box labelled "Vivatech", but no photos. I recall putting everything to do with that project (hard copies, manuals, PLC hardware, spare PC boards, photos, etc.) in a box, and years later donating all the tech to a local makerspace; I imagine I tossed the rest at that time. It was back when you developed a roll of film and carried the photos around, and if you needed something electronically, you scanned it (180 degrees away from today, when a hard-copy photo is the exception). I could be wrong – things sometimes turn up in odd boxes or shelves – but for the moment, I think I'm out of luck.

Om Street 2017: Time Lapse Video

“On July 22, 2017, over 2500 people gathered on LaSalle Road in West Hartford, CT to “get their asana in the street!” The weather was perfect and the energy was incredible, making the 7th annual OM Street a huge success. The 75 minute all-levels yoga class was led by Barbara Ruzansky of WHY (http://www.westhartfordyoga.com) and included assistants from 40 area studios and businesses. Every studio that gathered their tribe, and every individual that put a mat or a chair on LaSalle represented a community, a coming together of like-minded spirits into something like a neighborhood, like a family.”

Remarkably, I made it into the video frame this year (directly behind and to the left of the tree at the lower right corner of the shot, blue shirt), standing at the audio table, keeping an eye on the sound. And at the end (around 2:50) I take the front speakers down…

It’s kind of amazing to be part of this each year, to be the adult in charge of the sound system, and that, seven years running, we’ve never had a significant technical issue!

Om Street 2017: Better, Faster, Stronger

We survived the 2017 edition of Om Street: Yoga on LaSalle Road, my yoga studio's annual "yoga in the streets" event, which I've been audio engineering since its inception in 2011.

Some history via the blogosphere:

Just finished cleaning up the audio equipment from the event this morning; three bins and one large duffel bag loaded with equipment that needed to get sorted, re-wrapped as necessary, and restowed for the next gig.

It was a pretty quick and pain-free recovery this year, owing to a few factors:

  • I ended up wrapping the bulk of the long cables myself – 4 x 100′ XLR (audio) cables, 4 x 100′ extension (AC power) cords, and all of the 1/4″ and speakon speaker cable (several hundred feet worth). There were just 2 x 100′ AC power cords that someone else wrapped and needed to be re-wrapped. In past years the cables were a bit of a hot mess.
  • I had a great assistant (yay, Steve!) who was dedicated to the band, so when I unpacked the storage bin from the band, cables were nicely wrapped, mic stands nicely folded, mics and headphones all bagged up neatly. He also did a yeoman's job of setting the band up (mics, direct boxes, and monitor headphones) so I could focus on other things.

Honestly, the only bin that was a bit of a hot mess was the main audio station bin (that I packed, last) because it was kind of the catch-all for everything lying around.

It’s a huge outdoor yoga event:

[Figure: Om Street 2017 Wide Shot]

Om Street 2017 (Breck Macnab Photography / West Hartford Yoga)

And it’s pretty much me handling the audio all by myself. Some things that helped out this year:

  • I added a pair of low-cost wireless stick mics for emcee / stage duty this year. In previous years I used the studio's wired mics, but the wireless units mean a couple of fairly long XLR cables I no longer need to worry about setting up or striking. I picked them up for Q&A at some larger workshops we host, but they proved useful for Om Street as well.
  • I did some more work on the band monitor setup, adding a small mixer, buying some dedicated cables / adapters, buying a gamer headset for myself (with a small headset mic on a boom), and setting it all up ahead of time to get levels. So the band had great monitors in both ears and we had a talk-back channel, with one band mic and my headset mic going only to the monitor channel, so we could communicate during the practice.
  • I took the time to kit out the audio equipment. Typically I show up with audio stuff in bins sorted mostly by who owns the equipment (the yoga studio, or me) and general function (audio cables, power, speaker cables, mic stands, etc.), and I would end up running around a lot during set-up, getting things to the right place. This year I put everything for the band in one bin (power strip, mics, stands, direct boxes, cables, monitor headphones and amplifier) and everything for the main audio station in a second bin. I also got some of those big family zip-lock bags and kitted together all the cables and adapters for the satellite PA systems (200′ and 400′ down the road), so I could unload those with the speakers, stands, and amplifier, and not have to walk back and forth so much.

This year I took the time to sketch out the audio schematic:

[Figure: Om Street 2017 audio schematic (RFFlow)]

I’ll add some links to “sure to be posted” video of the event, but here are a few news articles that have already made it to press:

Tale of Two Power Systems

Looking over a power monitoring dataset recently, we came across a site with a dual personality. The site in question had low total harmonic distortion (THD ~1.5%) from 4/1/17 up until 4/10/17 (specifically, at 2:00 pm). After that, the THD fluctuated much higher, rising as high as 5.4% – outside of manufacturer requirements for medical imaging equipment.
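For reference, THD is computed from the individual harmonic magnitudes: THD = sqrt(sum of Vh², h ≥ 2) / V1. A minimal sketch, with each harmonic expressed as a percentage of the fundamental (the values are illustrative, not the site data):

```python
import math

# Voltage THD from individual harmonic magnitudes, each expressed as a
# percentage of the fundamental: THD(%) = sqrt(sum of Vh^2) for h >= 2.
# The harmonic values below are illustrative, not the monitored site's data.
def thd_percent(harmonics_pct: dict[int, float]) -> float:
    return math.sqrt(sum(v * v for h, v in harmonics_pct.items() if h >= 2))

print(thd_percent({5: 1.2, 7: 0.8, 11: 0.3}))            # quiet bus, ~1.5%
print(thd_percent({5: 3.2, 7: 2.1, 11: 1.0, 13: 0.8}))   # noisy bus, ~4.0%
```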

[Figure: Tale2Power THD]

A closer "before / after" look at the voltage and current waveforms provides more evidence, with visible notching on the voltage waveform "after"; the monitored current showed some noise, but was similar before and after.

[Figures: Tale2Power Waveforms – Before 4/10/17 | After 4/10/17]

The individual harmonics similarly supported the findings of the THD log, with all harmonics higher and the 5th harmonic exceeding 3%.

[Figures: Tale2Power Harmonics – Before 4/10/17 | After 4/10/17]

Finally, the "before / after" effect was also seen in the Neutral-Ground voltage, with noise voltages evident even though the lower-frequency voltages were not much higher.

[Figure: Tale2Power NG RMS]

[Figures: Tale2Power NG – Before 4/10/17 | After 4/10/17]

The funny thing is that the RMS voltage and current of the device under test were not significantly different before and after 4/10/17, in terms of either RMS level or stability (sags, swells, or fluctuations).
[Figure: Tale2Power RMS]

So what's the scoop? We're not on site, but odds are good that some facility load (we're betting air conditioning, but it could be something else) got switched on at this time. Alternatively, perhaps the facility transitioned to an alternate power source. Whatever the reason, this is clearly a tale of two power systems, and we're curious about it!