October 1st 06, 07:54 PM, posted to rec.radio.amateur.homebrew
LenAnderson@ieee.org
Subject: Help calibrating a noise source


Roy Lewallen wrote:
wrote:
. . .
Random noise voltage or current can ONLY be calibrated by
a TRUE Root Mean Square measuring instrument. By "true" I
mean a thermal type such as an RF power meter (thermistor,
bolometer, etc. sensor). Few voltmeters on the market have
TRUE RMS measuring capability; those that do are specified
as such and are rather on the expensive side.
. . .


Not too long ago, I got an HP 3400A (10 MHz bandwidth) on eBay for the
purpose of measuring noise. Don't recall what I paid, but it was very
reasonable. Since truly random noise also has low-frequency
fluctuations, an RMS meter reading wanders quite a bit while measuring
noise. I find it easier to eyeball-interpolate an average reading with
the analog meter of the HP 3400A than with a digital meter.


I agree there on analog display versus digital.
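
On the true-RMS point quoted above, here's a quick numeric sketch
(Python with numpy; the sample count and seed are arbitrary) of why
a sine-calibrated average-responding voltmeter misreads Gaussian
noise by about 1 dB:

import numpy as np

rng = np.random.default_rng(0)
v = rng.normal(0.0, 1.0, 1_000_000)  # Gaussian noise, 1 V RMS by construction

true_rms = np.sqrt(np.mean(v ** 2))

# An average-responding meter rectifies and averages, then applies the
# sine-wave form factor pi/(2*sqrt(2)) ~ 1.1107 to display "RMS".
sine_form_factor = np.pi / (2.0 * np.sqrt(2.0))
avg_responding = sine_form_factor * np.mean(np.abs(v))

print(f"true RMS:       {true_rms:.4f} V")
print(f"avg-responding: {avg_responding:.4f} V")
print(f"error:          {20*np.log10(avg_responding/true_rms):+.2f} dB")

For Gaussian noise the average-responding reading comes out roughly
1 dB low (about 0.886 of the true RMS), which is why the "true RMS"
distinction matters so much for noise work.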

Dug out my CD of old "saves" and found the Jim Williams article
in the 11 May 2000 edition of EDN (www.ednmag.com). In it he
lists the following meters as having zero error when compared to
the Linear Technology true-RMS IC: HP 3400, 3403C, 3478; Fluke 8920A.

By the bye, the 1967-era HP pseudorandom noise generator has
the following "calibration" procedure to check RMS output voltage:
Connect it to an HP digital voltmeter-recorder, set it to slow
speed, and record 1023 voltage readings (one at each of the
slow-speed 'flat' pseudorandom waveform steps), then take the
mean of the squares of all those readings and find the square root
of that. Heh heh heh...I can just see a calibration technician doing
all that work with a mechanical calculator in the 60s...with the QC
chief storming in after an hour or so asking "You STILL working
on that?!?" :-)
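
For anyone repeating that exercise today, the arithmetic the
technician ground out by hand is a few lines (a minimal Python
sketch; "readings.txt" is a hypothetical file holding the 1023
recorded voltages):

import numpy as np

readings = np.loadtxt("readings.txt")  # the 1023 recorded voltages
rms = np.sqrt(np.mean(readings ** 2))  # root of the mean of the squares
print(f"RMS output: {rms:.4f} V")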