The Shoemaker’s Children…

….go barefoot. It’s an old proverb about those with particular skills and aptitudes taking care of their own stuff last. And usually the case with me!

I had a couple of basement outlets / lights go intermittent. I tracked it down to a particular outlet and, so far as I could tell, a bad neutral (because the voltage tester was chirping on a “dead” outlet).

This noontime, I pulled a shelf away from the wall, pulled the outlet, and heard the familiar (and unsettling) crackle-crackle of arcing as a nearby light on the circuit blinked.

The sparky-monkey who wired the basement (no doubt the former owner) used those stupid / dangerous push-in holes on the back of the outlets to make connections (called “back-stabbing”). One neutral connection was high resistance, getting warm, and thankfully broke connection before anything caught fire.

I cut out the outlet, replaced it with one properly wired (wires wrapped around the terminals and screwed down), and all is better.

Best of all, since the circuit in question also powered my desktop computer, I got to listen to my UPS chirp while I made the swap (NASCAR-like) and returned the circuit to service with 12 minutes to spare (my UPS is oversized for the load, so lots of battery time).

Sneaky Voltage Swell Events

Long time, no post! A pandemic will do that to you . . . suffice it to say I am healthy, relatively happy, and staying busy with both my engineering work (mostly stable through the troubles) and my yoga studio work (very different these days but busier than ever with online classes and the need to provide appropriate technology).

Today’s engineering bon-bon involves a trio of voltage swell events, seemingly caused by a short circuit (another facility load fault, or perhaps utility lightning arresters) that produces a 1/4 cycle drop-out on one phase and a resultant voltage swell immediately following. The swell carries a serious overvoltage, likely to cause problems for many types of equipment / power supplies.

This event was captured as a minor RMS voltage swell (5.1% above nominal) – but the peak voltage is very high (540 V vs 395 V normal). Note the ~30 Amp peak current swell.
This event was barely detectable as an RMS voltage swell (4.8% above nominal) – once I recognized the nature of these events, I went looking for more and came up with just this one. Almost no resultant current swell.
This is the big smoking gun event. Even though it did not register as a sag / swell event at all (the voltage drop-out being balanced by the following swell, just 3.25% over nominal), it was captured as a current swell (the highest current event over 6 weeks of data capture) – 650 Amps waveform peak. Something within the client load saw this event as a problem and drew a slug of current.
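
To see why events like these can hide from an RMS-based swell threshold, here’s a minimal Python sketch (the waveform, sample rate, and numbers are purely illustrative, not pulled from the actual captures): a quarter-cycle drop-out followed by a brief overshoot barely moves the one-cycle RMS, while the instantaneous peak jumps well above normal.

```python
import numpy as np

def rms(samples):
    """RMS value of a waveform segment."""
    return np.sqrt(np.mean(np.square(samples)))

# One cycle of a nominal 277 V RMS phase (peak ~392 V), 128 samples per 60 Hz cycle.
t = np.arange(128) / (128 * 60.0)
nominal = 277 * np.sqrt(2) * np.sin(2 * np.pi * 60 * t)

# Toy event: ~1/4 cycle drop-out followed by a short overshoot on recovery.
event = nominal.copy()
event[:32] = 0.0       # quarter-cycle drop-out
event[32:48] *= 1.4    # brief swell right after the drop-out

print(f"RMS:  {rms(event):.0f} V  (nominal {rms(nominal):.0f} V)")              # barely changes
print(f"Peak: {np.abs(event).max():.0f} V  (nominal {np.abs(nominal).max():.0f} V)")  # way up
```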

I’m just the hired gun reviewing the data, pulling out events and issues, and writing a report, so I won’t be following this to a resolution or doing further troubleshooting on site. But this is a good example of why automated report writing is not always sufficient – the Fluke 1750 analysis tools would see these events as minor voltage swells (if that). It takes a human being with some experience (I’ve reviewed over 5000 power quality data sets since 2003) to see something unusual, scratch one’s head, and dig in a bit deeper.

Pandemic Crisis: Danger / Opportunity

Things have been very interesting in the yoga world since the COVID-19 pandemic forced the studio to close in March 2020. I taught my last in-person class on March 11th; the studio’s last class was at noon on March 13th.

After a weekend of stunned silence, we were back the following Monday. We started with free classes, live-streamed via Facebook Live and archived there and on YouTube. We started simple – an iPad on a tripod – and slowly added lights and sound (headsets and ambient microphones) as we realized this was not going away quickly. Unable at that time to monetize classes, we set up a couple of membership levels for those wishing to support the studio (whose revenues went to zero almost immediately). Lots of work for the studio technologist (me), both in terms of physical hardware and online work, as we transitioned away from in-person classes.

Eventually, we followed our peers, teaching virtual classes via the Zoom platform. More tech work for me – setting up a Zoom station at the studio as well as at home.

My home yoga room Zoom set-up (left) is also handy for watching online concerts.
The studio Zoom cart (right) is perhaps the 3rd iteration of the tech – with a wifi extender, on-board lights, multiple cameras, and a USB microphone. We can roll it anywhere!

Once the weather warmed up, we took our show on the road, teaching 7-8 outdoor classes each week. We’ve been doing free WHY in the Parks classes for many years through the local town Parks & Rec – now we ratcheted it up, with mandatory registration (to limit capacity and facilitate contact tracing) and provisions for donations and paid classes. We’d get up to 100 people at some classes. Opportunities for the studio techie to set up a second battery-powered speaker, to figure out a way to daisy-chain speakers (left / right for larger classes), and to manage multiple wireless headsets. Oh yeah, and haul out the trusty iPad for some time-lapse videos!

Long term, we knew that we’d be back teaching live classes, albeit capacity-limited and socially distanced, so I started to build up some technology competence. I found a low-cost video switcher and some low-cost HDMI cameras (think glorified security cams) and cobbled together a four-camera setup that we used first for in-studio Zoom classes and eventually to capture live classes.

Left: iPad confidence / stream monitor (what the end user sees)
Center: Laptop Zoom computer
Right: Preview monitor and Blackmagic ATEM Mini video switcher
AIDA HD-100A cameras with Arducam lenses – top is the wide shot, bottom is the close shot. We’ve got two other cameras off to the side. HDMI cables + 12V power run into the studio office / control area.

We started with a temporary setup to prove concept and technology (HDMI cables on the floor, cameras on tripods or clamped to carts) – eventually moving to a permanent, four camera installation. You can see the end result on the studio Vimeo page here – https://vimeo.com/westhartfordyoga

Kicking It Old School

My Christmas present to myself this year – a 1974 Gottlieb Duotron pinball machine, courtesy of a local shop that once upon a time rented, sold, and serviced pool tables, juke boxes, and pinball / arcade games, but has transitioned to a warehouse full of relics and a small business in refurbing and selling old games.

Truth be told, I’ve been virtually tire-kicking games for a while now – via Facebook Marketplace and Craigslist. I knew I was serious when I took down a folding table that had become a catch-all rather than a work space, replacing it with two shelving units – leaving plenty of space in my basement office for a game. And while I was a bit concerned with disassembly, transport, and reassembly, the shop offered delivery and set-up. Sold….

The game was mostly working, with some small issues, so I’ve spent the last few weeks cleaning, tweaking, and updating: burned-out bulbs, a few functions not working properly, replacing the rubber rings / elastics and some faded plastic bits, etc. It’s all up and running!

There is a significant industry out there for information and supplies. I was able to purchase a schematic and manual from an authorized source. I bought a kit of rings / rubbers curated for this machine, as well as pop bumper caps. I purchased a spares kit (bulbs, fuses, a new pinball, cleaner and polisher) and some LED lamps (to judiciously replace the incandescent bulbs, especially in the back box, where the lamps tend to heat / damage the backglass artwork). I even found a site that recreates the instructions and replay / award cards so familiar to those who have played.

And, as an old school engineer who has been around long enough to have played this game when it was new, and who knows my way around a schematic but never did much work with relay / ladder logic, the game is an opportunity to break out a multimeter and jumper cables and do some serious troubleshooting.

The ingenuity and complexity of these beasts is amazing – stepper units to add bonus points, count players, and track first and last balls; scoring reels; a motor with cams to trigger relays periodically. The tilt devices alone are a marvel.

So yeah, Merry Christmas to me. Something for an electrical engineer who does far too little engineering or troubleshooting work these days to mess around with!

The Shoemaker’s Children Go Barefoot (no more)

For someone who makes the bulk of her income working with power quality, my own computer systems have been fairly under-protected for many years.

I picked up a stand-by UPS (APC Model ES550) many years ago (maybe 10? hard to say; it might have been my second device), and it has served me reasonably well. Even though I’m well aware of the nature of a stand-by UPS (time delay before the inverter switches on, step-wave inverter output), it’s done a pretty solid job of keeping my computer up and running.

A few days ago, my home office lost power for a bit – clocks were reset, the computer switched off – and I realized it was time to upgrade the office UPS. I picked up another APC – a line interactive, sine wave output model RS 1000MS – rated for 1000VA / 600W.

It’s got plenty of juice for my needs – sitting at about 20% of load / 37 minutes of battery time with my desktop, monitor, cable modem, and a small backup server and peripheral hard drive. I’m much enamored with the front panel LED screen and the PowerChute software. And while I have not set my computer up to hibernate at the command of the UPS, that’s a possibility.
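
For what it’s worth, a quick back-of-the-envelope check on what those front panel numbers imply (all figures rounded, and assuming the 20% is relative to the 600 W rating):

```python
# Rough check of the PowerChute / front panel readout (all numbers approximate).
rated_output_w = 600      # RS 1000MS rated output
load_fraction = 0.20      # ~20% load reported
runtime_min = 37          # ~37 minutes of runtime reported

load_w = rated_output_w * load_fraction    # ~120 W actual draw
usable_wh = load_w * runtime_min / 60.0    # ~74 Wh of usable battery energy

print(f"Estimated load: {load_w:.0f} W")
print(f"Implied usable battery energy: {usable_wh:.0f} Wh")
```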

I go back a long ways – when a buck a watt was perhaps a reasonable price to pay for a small UPS. So to get all this for about $150 – well, I’m not complaining.

And I took the time to run my house cable through the internal TVSS and the Ethernet from the cable modem back to the computer through the UPS – so I’ve got a better chance of surviving nearby lightning strikes / transients – related to both transient voltages and ground potential issues. I’m not at the point of driving a ground rod and connecting an external ground though. I’m down in a basement and close to the residence service panel, so not super worried about ground issues.

And I’ve also spent some time separating critical loads (computer, monitor, cable modem, external drives / servers) from less critical loads (printers, speakers), plugging the latter into the TVSS-only outlets. And while I was down there with the system off, I spent some time untangling the cable spaghetti, wrapping and tying off cables, and neatening things up.

Saying Good-bye to an Old & Loyal Friend

Not a long, emotional post (I’ve saved those for my personal blog and for social media) but I want to note the passing of my beloved canine buddy, Elo. He lived a good long doggie life of 16 years. Some clients may have heard him in the background during phone calls – he was my shadow and wherever I was in the house / office, he followed.

Elo came into my life way back in 2004; he moved with me to my present residence in 2010, and we’ve been a bonded pair ever since. He never had a lot of dog friends (being part Aussie Cattle Dog / Heeler, he was super bossy and would want to herd them) but was a great people dog. I was a little concerned how he’d transition to condo life (having come from a single family house with a fenced in yard) but he did great.

I miss the little f*cker (a pet name, well earned) a lot, although I’m happy to finally have the opportunity to get ahead of the dog hair and shedding.

Cause and Effect: Arcing Transients and Equipment Faults

We recently reviewed some power monitor data for a client. Problem statement:

Breaker Q1 in the WCS electronic box trips on a sporadic basis. The breaker is the M4 and M5 fan motor overcurrent protection. We have replaced the breaker multiple times.

First pass, we noticed three very high current swells in the ground current data:

Ground RMS

We also saw hundreds of very serious arcing voltage transients, not related to load current changes or other voltage events. The transients showed up on Phase-Neutral and Neutral-Ground, but the N-G transients were much lower amplitude (secondary, not the primary issue).

Transient Ph-N / Transient N-G

Finally, we captured three current swell events that clearly show equipment faults to ground (notice the elevated ground current) immediately following voltage transients – cause and effect.

Fault 1 / Fault 2 / Fault 3

Figuring out the cause of the transients is an exercise for the local service engineers or an onsite power quality engineer. But we’ve got a pretty clear linkage here between transients and equipment faults. Most of the time, power quality problems are a lot less concrete and clear.
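
For what it’s worth, the pairing we did by eye amounts to a simple timestamp correlation. A rough Python sketch of the idea (the event lists, timestamps, and 50 msec window here are hypothetical – the real events come out of the monitor’s exported event log, not this format):

```python
from datetime import datetime, timedelta

# Hypothetical event timestamps, for illustration only.
voltage_transients = [
    datetime(2021, 3, 4, 10, 15, 2, 120000),
    datetime(2021, 3, 5, 14, 2, 44, 530000),
]
ground_current_swells = [
    datetime(2021, 3, 4, 10, 15, 2, 135000),   # follows the first transient
    datetime(2021, 3, 6, 9, 30, 0),            # unrelated
]

def paired_events(transients, swells, window_ms=50):
    """Return (transient, swell) pairs where a ground-current swell follows a
    voltage transient within window_ms - the cause-and-effect pattern above."""
    window = timedelta(milliseconds=window_ms)
    return [(t, s) for t in transients for s in swells if t <= s <= t + window]

for t, s in paired_events(voltage_transients, ground_current_swells):
    print(f"transient at {t} -> ground current swell at {s}")
```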

Sneaky Low Frequency Transients

Low frequency transients, sometimes called Utility Switching Transients or Power Factor Correction Capacitor Switching Transients, can be pretty hard to identify. Traditional power monitoring equipment has never done a particularly good job at spotting these – folks of a certain age will recall that the BMI-4800 power monitor would throw a frequency error (either 61.9 Hz or 64.0 Hz) if the transient caused an extra zero-crossing – sometimes that was the only way to detect the transient, and savvy engineers would use these frequency faults as a diagnostic tool.

Looking through a lot of Fluke 1750 data sets over the years (we’re looking at Site #4472 this week), we’ve gotten pretty good at pulling these transients out of the hundreds or thousands of transient events captured. Some detection tools:

  • Some transients do indeed trigger a voltage transient event, but these need to be carefully reviewed because the reported magnitude is often that of the higher frequency leading edge.
  • Many transients are accompanied by a rise in RMS voltage, so carefully adjusting the voltage swell threshold can often help to spot these.
  • In Wye systems, many transients cause a Neutral-Ground swell event, which can often be spotted.

In a recent data set, none of these indicators worked out. We were very fortunate that the first current event captured (with a current swell threshold set to 10 Amps, a typical threshold for our reports) was a transient event – so we happened to notice it.

Low Frequency Transient

Current Triggered Event #1

Then, by identifying the duration of the current swell event (~17 msec, much shorter than normal equipment loading) and the amplitude (between 15 and 20 Arms; normal equipment current swells were much higher), we were able to sort through hundreds of current triggered events to find nine (9) low frequency transients in the data.
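
In code terms, that sort is just a filter over the event summaries. A minimal sketch (the record format and field names are invented for illustration – this is not the Fluke 1750 export format – with the duration and amplitude cut-offs taken from the event above):

```python
# Hypothetical per-event summaries; only id, duration, and RMS current amplitude matter here.
events = [
    {"id": 1,   "duration_ms": 17.2,  "i_rms": 18.0},   # low frequency transient
    {"id": 37,  "duration_ms": 240.0, "i_rms": 85.0},   # normal equipment loading
    {"id": 646, "duration_ms": 16.8,  "i_rms": 16.5},   # low frequency transient
]

def looks_like_lf_transient(ev, max_ms=25.0, i_min=15.0, i_max=20.0):
    """Flag events matching the low frequency transient signature: very short
    duration (~1 cycle) and a modest 15-20 Arms swell, unlike the longer,
    larger swells caused by normal equipment loading."""
    return ev["duration_ms"] <= max_ms and i_min <= ev["i_rms"] <= i_max

print([ev["id"] for ev in events if looks_like_lf_transient(ev)])   # -> [1, 646]
```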

Low Frequency Table

Normally, we would not be too concerned about these transients, which are comparatively minor – simply looking at the voltage waveforms, there is no significant overvoltage and no multiple voltage zero-crossings. However, the associated current swell (70-100 A peak) indicates that something in the equipment under test is sensitive to, or reacting to, these transients and drawing a slug of current. So they are worth looking into….

Low Frequency Transient 2

Current Triggered Event #646

The Road to Perdition…..

….is paved with good intentions. And the road to poor power quality is paved with power conditioning devices. Three examples:

1) Reviewing a power quality study, wherein the complaint is a chiller shutting down due to excessive voltage imbalance. The voltage imbalance chart showed unusual step changes in voltage balance, often exceeding 2%. The problem: the site had a tap-switch voltage regulator installed, and as individual phases were regulated, the voltage balance jumped from relatively balanced to relatively imbalanced (see the quick unbalance calculation sketched after this list). The solution: bypass the voltage regulator.

2) A second study, with imaging artifacts affecting an MRI system after several years of satisfactory service. The system is protected by a UPS and shows a higher than typical number of load-related events. The problem: the UPS requires the battery string to supply the maximum imaging system load, and as the batteries have aged, they have become unable to support the peak load. The solution: service the UPS / battery string; in all likelihood the battery string needs to be replaced. A different UPS, one that did not rely on the batteries to supply peak power, would not have had this problem.

3) Finally, a site with severe low frequency transients repeating roughly every 10 minutes. A facility-based capacitor bank, intended to correct voltage and power factor, is malfunctioning – the capacitors pull in, then shut down 45 seconds later (for so far unknown reasons, probably internal fault or error indications). Ten minutes later, the caps try again. A CT scanner has been damaged repeatedly by this problem.
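
For reference, here is the quick unbalance calculation mentioned in example 1, using the common NEMA-style definition (maximum deviation from the average phase voltage, as a percent of the average). The voltages below are illustrative, not from the study:

```python
def voltage_unbalance_pct(va, vb, vc):
    """NEMA-style voltage unbalance: maximum deviation from the average
    phase voltage, expressed as a percent of the average."""
    avg = (va + vb + vc) / 3.0
    max_dev = max(abs(v - avg) for v in (va, vb, vc))
    return 100.0 * max_dev / avg

# A regulator stepping one phase up by a handful of volts is enough to push
# the unbalance past the ~2% level that upsets the chiller.
print(f"{voltage_unbalance_pct(480, 478, 482):.2f} %")   # ~0.4% - balanced
print(f"{voltage_unbalance_pct(480, 478, 494):.2f} %")   # ~2.1% - over the limit
```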