Investigating a Significant Discrepancy in Modern Bulk Cable Injection Test Methods
In the 1990s, bulk cable injection (BCI) techniques were still relatively new and controversial. BCI in support of high intensity radiated field (HIRF) certification had been incorporated in RTCA/DO-160C section 20 in 1989. In 1993, MIL-STD-461D added (for the first time) BCI type CS114, CS115, and CS116 requirements. Long after adoption in these standards and others, controversy raged in some quarters over their legitimacy. Figure 1 shows representative BCI-type limits.
In this charged atmosphere, the paper “On Field-To-Wire Coupling Versus Conducted Injection Techniques” (Javor 1997) was presented and published, then publicly debated in front of audiences. It provided the canonical basis for bulk cable injection (BCI) requirements.1 In addition to the physical analysis, various practical objections were listed and discussed.
But one item escaped perusal by both pro and con factions; hence the present effort and title. The MIL-STD-461 CS114 and CS116 test methods did something different from the BCI test methods in RTCA/DO-160C/D. All BCI test methods require the same pre-calibration of forward power in a calibration fixture, but RTCA/DO-160C/D, SAE ARP-1972, and DEF STAN 59-41 all use that recorded forward power to inject on the cable-under-test (CUT), subject only to an over-current limit that is a set value (roughly 10 dB in RTCA/DO-160) above the maximum level in the appropriate curve. So, for instance, in Figure 1a the over-current limit for the green curve according to RTCA/DO-160C/D is 1 Amp, at any frequency from 10 kHz to 400 MHz. For the blue curve, the over-current limit would be 5 mA over the same frequency range. But MIL-STD-462D (1993), the test procedures for MIL-STD-461D, made the over-current limit 6 dB above the appropriate curve in Figure 1b, at the frequency of interest. RTCA/DO-160E (2004) partially followed suit for some categories but retained the 10 dB over-current limit for others. RTCA/DO-160F went the MIL-STD-461 route completely in 2007, and that was retained in RTCA/DO-160G.
The purpose of this investigation is to compare and contrast the original and modern methods – the differences are stark. Relative to the original technique, and to how electromagnetic fields actually couple to wires, the modern technique can under-test shielded cables at low frequencies by up to 40 dB.
(Javor 1997) explains the physics of electromagnetic field-to-wire (FTW) coupling and presents experimental test results validating the analytical treatment (Figure 2). Faraday’s Law is sufficient to explain the Figure 1 limits: the limit rises with frequency at low frequencies and flattens above the frequency at which the cable’s physical length becomes one-half wavelength electrically, the boundary condition for maximum coupling.
Both CS114 and Section 20 (-160F/G) present a family of limits (expressed as induced currents vs. frequency) that are initially calibrated in a standardized fixture. The power required into the injection clamp to induce those current levels is recorded. Then the injection clamp is placed around the cable-under-test (CUT), along with a current probe to monitor injected current. Probe power is increased until either the desired current level from the standard limit (plus 6 dB) is induced, or the power limit is reached, whichever comes first.
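The modern leveling loop described above can be sketched in a few lines. This is an illustrative model only, not any standard's actual control software; the `measure_current_dBuA` callback, the step size, and the toy cable transfer functions below are all assumptions.

```python
# Illustrative model of the modern CS114 / Section 20 leveling loop: raise
# forward power until either the limit current + 6 dB is induced on the
# cable-under-test, or the pre-calibrated forward power is reached,
# whichever comes first.

def level_injection(target_dBuA, precal_power_dBm, measure_current_dBuA,
                    start_dBm=-40.0, step_dB=0.5):
    """Return (final forward power in dBm, final induced current in dBuA)."""
    goal = target_dBuA + 6.0                        # test to limit + 6 dB
    p = start_dBm
    while p < precal_power_dBm:
        if measure_current_dBuA(p) >= goal:
            return p, measure_current_dBuA(p)       # current limit hit first
        p += step_dB
    # power limit reached first: inject at the pre-calibrated forward power
    return precal_power_dBm, measure_current_dBuA(precal_power_dBm)

# Toy linear cables: 1 dB more power yields 1 dB more current.
soft_cable = lambda p_dBm: p_dBm + 60.0   # easily driven (low impedance)
stiff_cable = lambda p_dBm: p_dBm + 30.0  # hard to drive (high impedance)

power, current = level_injection(20.0, -13.0, soft_cable)   # stops on current
p2, c2 = level_injection(20.0, -13.0, stiff_cable)          # stops on power
```

On the soft cable the loop stops when limit + 6 dB is induced; on the stiff cable the pre-calibrated power cap is reached first and the cable sees less than the goal current.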
Electromagnetic Field-to-Wire Coupling – Results Depend on Wire Load Impedance
But that isn’t how the technique was developed – it was much simpler to begin with, and the point of this investigation is to show the original technique provided a better simulation of electromagnetic FTW coupling.
Originally, once the power required to drive current in the calibration fixture had been recorded, that power was used to drive the clamp placed on the CUT, irrespective of the actual current induced on the CUT. A current probe was used to monitor and record the induced current, but it was for information only, not part of the control/leveling loop during the test (except for the really stringent frequency-independent over-current limit cited in the introduction). Anyone familiar with these tests can see the workload is much lower, and in fact commensurate with manual operation, whereas the modern test procedure practically demands automation, and was in many cases one of the first susceptibility tests to be automated, due to the workload.
The discrepancy between the two test methods is most noticeable on a low impedance cable at low frequencies, i.e., a shielded cable with good terminations to continuous metallic structure at both ends, and at frequencies where the cable is electrically short. Under these conditions, the impedance presented by the CUT is at greatest variance from that of the calibration fixture, which provides a 100 Ohm loop. One might expect more current in the lower impedance loop, but the physics is more complex than that. (Javor 1997) uses Faraday’s Law to compute FTW coupling, and demonstrates that the coupling will increase monotonically with increasing frequency when the cable is electrically short, and then flatten out when the cable approaches and exceeds a half-wavelength in length, which is the characteristic of the limits shown in Figure 1. But, and here is the crux of the problem discussed herein, Faraday’s Law calculates the coupled potential, while these limits control coupled current. The relationship between coupled current and potential is only constant when the circuit has resistive terminations whose value dominates the cable impedance. That is not true when a cable is shielded and the shield is well-terminated, as is the case for safety-critical flight and engine controls. Under such conditions the cable impedance is inductive from the low end of the requirement frequency range (10 kHz) until the cable is at least a tenth wavelength long. Over that frequency range, the impedance of the cable increases monotonically with increasing frequency, just as does the Faraday’s Law coupled potential. Therefore the coupled current, expressed as the ratio of coupled potential to cable impedance, should be constant as a function of frequency. This is hardly a new observation; it was presented by A.A. Smith in 1977 in his seminal book on the topic, Coupling of External Electromagnetic Fields to Transmission Lines (Smith 1977).
Figure 2-4 on page 25 of the second edition, reproduced here as Figure 3, displays the results of a numerical analysis showing the significant difference between the flat current vs. frequency profile for a low impedance cable vs. the increasing current on a resistively loaded high impedance cable.
Upwards of 40 dB difference is evident between the high and low impedance cases.
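The flat-current behavior can be summarized in one line of algebra. For an electrically short line of length ℓ at height h in a field E0, with an assumed per-unit-length loop inductance L′, both the Faraday's Law coupled potential and the loop impedance rise at 20 dB/decade, so their ratio is frequency-independent:

```latex
V_i(f) = \frac{2\pi \ell h E_0}{\lambda} = \frac{2\pi \ell h E_0 f}{c},
\qquad
|Z(f)| \approx 2\pi f L' \ell,
\qquad
|I(f)| = \frac{|V_i(f)|}{|Z(f)|} = \frac{h E_0}{c\,L'}
```

With resistive terminations that dominate the loop impedance, the denominator is instead constant and the current inherits the numerator's 20 dB/decade rise, reproducing the two profiles of Figure 3.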
This report presents measurements of electromagnetic field coupling to high and low impedance electrically short transmission lines, and the correspondence to the two different BCI test methods.
For the FTW coupling measurements, the same parallel plate was used as in (Javor 1997). The plate, shown in Figure 4a, is 12” wide and 6” tall (one-tenth wavelength at 200 MHz), yielding a 90 Ohm characteristic impedance, with 50-to-90 Ohm and 90-to-50 Ohm matching networks used at each end to match 50 Ohm test equipment. As shown in Figure 4b, plate performance is lossless over the test frequency range (up to 30 MHz). The transmission line exposed beneath it was one meter long, suspended five centimeters above ground. As such, it was one half-wavelength long at 150 MHz and a tenth wavelength long at 30 MHz. Over the investigation range of 10 kHz to 30 MHz, the cable is electrically very short and thus coupling is inefficient. There was no attempt to make it a matched transmission line; the wire was terminated in 50 Ohms at each end to aid in direct comparison to BCI testing in the second half of the investigation. At a tenth wavelength and smaller, the resultant mismatch causes no measurement inaccuracy.
Figure 5 shows the 1 Volt/Amp Pearson Electronics Model 2877 current monitor used for these measurements. The high and flat transfer impedance (1 Ohm, 1 kHz to >100 MHz) of this small monitor (0.25” window diameter) was necessary to measure extremely small currents in the presence of a 4.6 V/m field intensity. Ordinary-sized hinged current probes did not provide sufficient rejection of the electric field. Since the probe is designed to have a 50 Ohm output impedance, and the 1 V/A transducer factor only applies into a high impedance, a Stoddart 95010-1 rod antenna base was used as a matching network. It was specifically designed for this very purpose: the amplifier has 0 dB (voltage) gain from 10 kHz to 40 MHz. The dBuV legends on the plots in Figures 6, 7b, and 9 should therefore be read as dBuA once the 1 V/A transfer impedance and other pertinent conversion factors are applied to the measurements.
An HP 4195A network/spectrum analyzer was configured as a spectrum analyzer, using its maximum source output of 15 dBm to drive the plate. This results in a 4.6 V/m field intensity, due to losses in the matching network and the 15 cm plate height. This can be computed from Figure 4b and a knowledge of the matching networks, which provide a 75 Ohm shunt resistor facing the 50 Ohm side and a 60 Ohm series resistor facing the 90 Ohm side. The loss on the load end is 9.5 dB, so the 107.25 dBuV reading is adjusted upwards 9.5 dB and then further adjusted for the 15 cm height to reach a field intensity of
107.25 dBuV + 9.5 dB – 20 log(0.15 m) = 133.25 dBuV/m = 4.6 V/m
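As a cross-check, that arithmetic can be reproduced directly; a short sketch using only the values stated in the text:

```python
import math

# Reproduce the field-intensity arithmetic: 107.25 dBuV analyzer reading,
# 9.5 dB matching-network loss, 0.15 m plate height.
reading_dBuV = 107.25
matching_loss_dB = 9.5
plate_height_m = 0.15

# dBuV -> dBuV/m: add the loss back, then account for the plate height
field_dBuV_per_m = reading_dBuV + matching_loss_dB - 20 * math.log10(plate_height_m)
field_V_per_m = 10 ** (field_dBuV_per_m / 20) * 1e-6   # ~4.6 V/m
```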
Figures 6a and b show coupled current from this 4.6 V/m field to the previously described transmission line beneath it from 10 kHz to 30 MHz (where the cable is one-tenth wavelength long). Figure 6a is coupled current to the line terminated in 50 Ohms. Figure 6b is the same as 6a, but the line has both ends shorted to ground.
Analysis based on (Javor 1997) computes the induced potential on an electrically short line as
Vi = 2π l h Eo / λ
where l is the line length, h its height above ground, Eo the incident field intensity, and λ the wavelength.
Using the one-meter length, wire height of 5 cm, 4.6 V/m illumination, and the stop frequency of 30 MHz (10 meter wavelength), the coupled potential is 0.145 Volt. That will induce 57 dBuA into a transmission line terminated in 50 Ohms at both ends (100 Ohms of terminations in series with ~188 Ohms of loop inductive reactance). The measured value in Figure 6a is 57 dBuA.
Also note that while the coupling has begun to flatten out near 20 MHz, the slope from 100 kHz to 10 MHz is 20 dB/decade as predicted by theory – below 100 kHz the data is noise floor limited. The flattening at the high frequency end is due to the selection of 50 Ohm loads on the ~300 Ohm transmission line; line inductive reactance is a significant fraction of the termination resistance on this mismatched line.
The same analysis computes a coupled current of 58 dBuA in a shorted line at 30 MHz, based on an inductance of 1 uH/m over the one-meter length (~188 Ohms inductive reactance at 30 MHz). The measured value is 59 dBuA. The important property to be noted is the flat current vs. frequency profile all the way down to almost 10 kHz, as compared to the current vs. frequency profile in the matched transmission line. At 10 kHz, there is a 50 dB difference between the currents in the two lines, and that is dynamic range-limited: Figure 6a is showing noise floor below 100 kHz.
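A sketch verifying the figures above, assuming the 1 uH/m loop inductance used for the shorted-line estimate:

```python
import math

# Spot-check: one-meter line, 5 cm height, 4.6 V/m field, 30 MHz.
l, h, E0 = 1.0, 0.05, 4.6
f = 30e6
lam = 3e8 / f                            # 10 m wavelength
Vi = 2 * math.pi * l * h * E0 / lam      # coupled potential, ~0.145 V
XL = 2 * math.pi * f * 1e-6 * l          # loop inductive reactance, ~188 Ohms

I_matched = Vi / math.hypot(100.0, XL)   # 50 Ohms at each end -> ~57 dBuA
I_shorted = Vi / XL                      # both ends shorted   -> ~58 dBuA

dBuA = lambda amps: 20 * math.log10(amps / 1e-6)
```

Note that in the shorted case Vi and XL both scale linearly with frequency, so the ~58 dBuA result holds at any frequency where the line is electrically short.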
What this means is that if we limit BCI current on a shielded cable to that induced in a 100 Ohm calibration jig circuit, or even 6 dB above that, we are vastly under-testing relative to what is induced by electromagnetic field illumination, which is checked when an aircraft is HIRF-qualified. During a low-level swept cw (LLSCW) illumination of an aircraft undergoing HIRF certification, we expect to measure more current on an aircraft-installed shielded cable at low frequencies than what it was subjected to during Section 20 rf conducted susceptibility testing, assuming the illumination and curve categories line up. In turn, this would force requalification of the installed system to the value predicted by the LLSCW scan.
Bulk Cable Injection Test Results
Figure 7a shows the set-up for pre-calibrating the forward power required to inject Figure 6a currents in the standard 100 Ohm calibration fixture per aircraft, automotive, and defense BCI test procedures. Figure 7b shows very close agreement between the electromagnetic field-to-wire coupling in Figure 6a and the current in the calibration fixture when the Tegam Model 95242-1 clamp is driven by -13 dBm from 10 kHz to 30 MHz. The Model 95242-1 has a nominal frequency range of 2 – 400 MHz, but note that for a cable of this length, its 20 dB/decade insertion loss roll-off at lower frequencies works very well to model FTW coupling.
Figures 8a and b show the set-up for injecting current in the same transmission line that was formerly under the parallel plate shown in Figure 4a. The wire has not been disturbed, only the top plate has been removed. Figure 8a shows injection on the wire terminated in 50 Ohms at each end, while Figure 8b shows the injection on the same wire terminated in a short to ground. The only change is which banana jack is selected.
Figures 9a and b show current coupled to the 50 Ohm and short-circuited wire, respectively. Not surprisingly, Figure 9a is identical to Figure 7b, because the only difference between the calibration fixture and the cable is conductor length, and both are electrically short in this investigation.
Comparing Figure 9b to Figure 6b is the payoff. Although the BCI calibration was performed in a 100 Ohm fixture, using the pre-calibrated drive value on the short-circuited wire yields the same current as when the short-circuited wire was exposed to an electromagnetic field. The roll-off at the high end of Figure 9b is due to the insertion loss of the 95242-1 flattening out in a manner not compensated for by the simple single-value pre-calibration performed in Figure 7. Had a true frequency-by-frequency pre-calibration been performed, Figures 6b and 9b would have been identical. Also note that the short-circuit BCI results are due to the high insertion loss of the clamp at the lower frequencies, which is carefully controlled by identical Bode plot limits in both MIL-STD-461 CS114 and RTCA/DO-160 Section 20. Insertion loss at lower frequencies is due to magnetizing inductance, which is a low impedance in shunt with the 50 Ohms driving the clamp. If it weren’t for the magnetizing inductance, the clamp would act as a near-ideal transformer with either a 1:1 or 2:1 step-down ratio, depending on model, which in turn would insert either a 50 Ohm or 12.5 Ohm impedance in series with the CUT, severely limiting short-circuit current. These insertion loss limits are archaic relics in the present versions of these standards, since it is injected current that is controlled, with a power limit protecting against too much potential in a high impedance circuit. But with a change back to injecting a pre-calibrated power, these insertion loss controls become critical to achieving correct low impedance cable current.
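The transformer argument can be illustrated numerically. This is a simplified sketch with an assumed 10 uH magnetizing inductance (a hypothetical value, not a measured 95242-1 parameter): the impedance the clamp inserts in series with the CUT is the reflected 50 Ohm drive in parallel with the magnetizing reactance.

```python
import math

# Simplified clamp model: ideal 1:1 transformer with magnetizing inductance
# Lm shunting the 50 Ohm drive. At low frequency the magnetizing reactance
# dominates the parallel combination, so the clamp inserts far less than
# 50 Ohms in series with the CUT and short-circuit current is not choked off.
def inserted_impedance_ohms(f_hz, lm_henries=10e-6, r_drive=50.0):
    zm = 1j * 2 * math.pi * f_hz * lm_henries   # magnetizing reactance
    return abs(zm * r_drive / (zm + r_drive))   # parallel combination

z_lo = inserted_impedance_ohms(10e3)   # ~0.63 Ohm at 10 kHz
z_hi = inserted_impedance_ohms(30e6)   # ~50 Ohm at 30 MHz
```

The same shunt reactance is what produces the 20 dB/decade low-frequency insertion loss roll-off controlled by the Bode plot limits.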
It is clear that the original, simple method of injecting a pre-calibrated power level and “letting the chips fall where they may” does a better job of simulating electromagnetic field-to-wire coupling at frequencies where the cable is electrically short. That is precisely the frequency range where the BCI-type requirement and test method is needed; BCI requirements allow proper stressing of such a cable when it is impossible to electromagnetically illuminate the actual length.
However, just blindly using the limits shown in Figure 1 on platforms smaller than battleships will result in a massive over-test instead of the present under-testing. This is because the 500 kHz/1 MHz breakpoints in the Figure 1 limits correspond to platforms respectively 300 meters and 150 meters long. While MIL-STD-461 contains instructions to tailor the limit for platform size, in the author’s experience this is rarely done. While RTCA/DO-160 is not tailored, the application to HIRF certification recognizes that the low frequency breakpoint shifts upwards for smaller aircraft (FAA 2014). Applying the original BCI test technique on a shielded cable using these untailored limits applies the flat portion of the limit all the way down to 10 kHz. If a platform is instead 15 meters in length, the breakpoint frequency for the limit would be 10 MHz, which means that the pre-calibrated drive levels below 1 MHz would be 20 dB lower than for the untailored limit.2 See Figure 10.
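The breakpoint scaling follows from the half-wavelength resonance of the platform; a minimal sketch:

```python
# The limit breakpoint sits where the platform is one half-wavelength long:
# f_break = c / (2 * platform length).
C_M_PER_S = 3e8

def breakpoint_hz(platform_length_m):
    return C_M_PER_S / (2 * platform_length_m)

f_300m = breakpoint_hz(300.0)   # 500 kHz breakpoint <-> 300 m platform
f_150m = breakpoint_hz(150.0)   # 1 MHz breakpoint <-> 150 m platform
f_15m = breakpoint_hz(15.0)     # 10 MHz for a fifteen-meter platform
```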
From the above discussion, we can make the following inferences and draw some conclusions.
Assume the tailored green curve of Figure 10 is correct and the proper baseline for a fifteen-meter platform.
The drive level relative to the untailored limit is 20 dB lower below 1 MHz, so the short-circuit current will be 20 dB lower than for the untailored limit. Figure 11 shows actual (scaled) test results using a Tegam 95236-1 injection clamp.3 Figure 11 data was taken at specific frequencies along the continuous curves of Figure 10. The dashed lines are there to make trends clear, not to represent actual data.
Green data points are current in a low impedance cable when subjected to the MIL-STD-461F Curve 5 limit tailored for a 15 meter long platform as shown by the green curve in Figure 10. The forward power drive level into the 95236-1 was recorded when the green curve of Figure 10 was induced in the 100 Ohm calibration fixture.
Black data points in Figure 11 are the currents on a low impedance cable tested to MIL-STD-461F Curve 5. The levels are 6 dB above those shown in the black curve of Figure 10 because MIL-STD-461F CS114 requires testing to 6 dB over the limit if the precalibrated power limit is not exceeded before achieving the +6 dB level.
The red data points are the currents in a low impedance cable if the MIL-STD-461F CS114 limit is not tailored and the precalibrated forward power is applied without the modern limit as for the black data points.
Using the green data points as a baseline best approximation to the reality of electromagnetic field-to-low impedance wire coupling, it is clear that the presently used method of MIL-STD-461F (and similarly, RTCA/DO-160E/F/G) results in under-testing up to 20 dB at 10 kHz. At the same time, applying the original testing approach without tailoring the limit for a fifteen-meter platform results in over-testing for the smaller platform by up to 20 dB.
It is important to note that if the original test method were re-adopted but the present limits not tailored, the increase in injected current would be up to 40 dB at 10 kHz. For field-to-wire coupling, this would only be appropriate on a very large platform and, on the basis of field-to-wire coupling alone, would result in massive over-testing for the majority of platforms fifteen meters in extent or smaller. But airborne and ground vehicles tend to use structure for primary power return, so such large currents are in fact present on structure, and cables with shields terminated to structure at both ends will have such currents induced on them, possibly causing ground plane interference (GPI) to poorly designed circuits. This is one more reason for adopting the original BCI test technique.
The author wishes to thank Mr. Mark Nave of Mark Nave Consultants and Mr. David Brumbaugh (Boeing) for thorough and thoughtful reviews. Any errors are the sole responsibility of the author.
- RTCA/DO-160C – G Environmental Conditions for Aircraft (Section 20)
- MIL-STD-461D-F Requirements for the Control of Electromagnetic Interference Characteristics of Subsystems and Equipment (Requirements CS114 & CS116)
- MIL-STD-462D Measurement of Electromagnetic Interference Characteristics (Methods CS114/CS116), 11 January 1993.
- Javor, K. On Field-To-Wire Coupling Versus Conducted Injection Techniques, 1997 IEEE EMC Symposium Record. Austin, Texas.
- SAE ARP-1972 Recommended Measurement Practices and Procedures for EMC Testing (Section F), January 1986.
- DEF STAN 59-41 Electromagnetic Compatibility Part 3: Technical Requirements, Test Methods and Limits, Section 3, LRU and Sub Systems (Including Land Vehicle Installed Antenna Test), Requirement DCS02.3
- Smith, Albert A. 1977, 1987. Coupling of External Electromagnetic Fields to Transmission Lines. Interference Control Technologies.
- FAA AC 20-158A The Certification of Aircraft Electrical and Electronic Systems for Operation in the High-intensity Radiated Fields (HIRF) Environment, 30 May 2014.
- Canonical meaning rule-based: induced current of 1.5 mA per Volt per meter when the cable is at least one-half wavelength long; 20 dB per decade roll-off when the cable is shorter than that. As opposed to the heuristic approach, where cable currents are measured on a variety of aircraft in various locations and some sort of statistical average is used as a limit. The reader should be aware that this article presents a canonical school of thought on this matter, and there are those who disagree and feel that BCI limits should be solely based on heuristics – DEF STAN 59-41 and 59-411 use heuristically-based limits. In effect, the commercial aircraft HIRF certification process described herein is a heuristic process, but the BCI limits in MIL-STD-461 and RTCA/DO-160 are canonical. The heuristic school of thought considers such limits a convenient simplification. If currents measured on various platform cables are the sole basis for limits, as in DEF STAN 59-411, then the present method of leveling on injected current with a power limit for high impedance cables does suffice.
- Fifteen meters (~50 feet) was arbitrarily chosen to cover the vast majority of vehicles, air and ground. Clearly ships and large transport aircraft are a different story.
- The (Eaton) 95236-1 and the 95242-1 are the original clamps around which the test method centered back in the 1980s when it was first developed. The 95236-1 is well-suited to the low frequency portion of both the MIL-STD-461 and RTCA/DO-160 limits. The 95242-1 used when tailoring the limit for the short wire used in this investigation was developed for use at higher frequencies, and is generally used above 10 MHz, where it is more efficient than the 95236-1, which is more efficient below 10 MHz.
Measuring minute low-frequency currents in the presence of a strong external electric field proved challenging. The actual coupling to the 50 Ohm terminated line at 10 kHz was below the noise floor, at lower than 10 dBuA. Even with a transfer impedance as high as 1 Ohm, that means any electric field coupling must be less than 10 dBuV. Given the 4.6 V/m incident field, that means the current probe’s “antenna factor” had to be larger than 123 dB/m. Ordinary hinged current probes used in EMI testing proved unequal to the task. See Figure 1 for typical results from a 1 Ohm transfer impedance probe.
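The required rejection figure follows directly from the field intensity and the noise floor; a short sketch of that arithmetic:

```python
import math

# Required electric-field rejection: pickup must stay below the ~10 dBuV
# (10 dBuA through the 1 Ohm transfer impedance) noise floor while the
# monitor sits in a 4.6 V/m field.
field_dBuV_per_m = 20 * math.log10(4.6 / 1e-6)             # ~133.25 dBuV/m
max_pickup_dBuV = 10.0
required_af_dB_per_m = field_dBuV_per_m - max_pickup_dBuV  # ~123 dB/m
```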
A current monitor works according to the same principle as a current probe, but lacks the ability to be opened and closed. This makes it less appealing as test equipment, but when maximum rejection of electric field is a necessity, Figure 2 shows that the current monitor provides the necessary performance.
Prior to resorting to the use of a small current monitor, an attempt was made to seal up all the possible leaks in a conventional probe. The effort was fruitless, even though Figure 3 shows comprehensive extra shielding.