#41
Owen Duffy wrote:
> Jim Lux wrote in :
>
> Whilst it might be reasonable to assume that the combined error in measurement of a high-S/N sine wave voltage might be normally distributed, and that might also be true of measurement of noise voltage in some circumstances, I propose that measurement of noise power in a narrow bandwidth with short integration times is distributed as chi-squared, and the number of samples becomes relevant in determining the number of degrees of freedom for the distribution. For this reason, I have talked about a confidence level rather than sigma (which is more applicable to normally distributed data).

I would agree. My question would be whether the original measurement (before averaging) is normally distributed. I suspect it is, being essentially integrated noise.
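The chi-squared behaviour described above is easy to check numerically. Below is a minimal simulation sketch (Python with numpy; the sample count and noise level are illustrative assumptions, not figures from the thread): averaging the squared voltage of n independent Gaussian samples gives a power estimate distributed as a scaled chi-squared with n degrees of freedom, and for small n its 95% interval is visibly asymmetric, which is why a confidence level is a better description than a symmetric sigma.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate repeated measurements of noise power: each measurement
# averages the squared voltage of n independent Gaussian samples.
# The sum of n squared unit-normal samples is chi-squared with n
# degrees of freedom, so the power estimate is (sigma^2 / n) * chi2_n.
sigma = 1.0          # true RMS noise voltage (assumed value)
n = 8                # samples per measurement (short integration)
trials = 200_000     # number of repeated measurements

v = rng.normal(0.0, sigma, size=(trials, n))
p_hat = (v ** 2).mean(axis=1)      # per-measurement power estimate

# The estimator's mean is the true power sigma^2, but its spread
# depends on n: variance = 2*sigma^4/n under the chi-squared model.
print(p_hat.mean())                # close to 1.0
print(p_hat.var())                 # close to 2/n = 0.25

# The empirical 95% interval is asymmetric about the mean, unlike
# the symmetric +/- 2 sigma rule for normally distributed data.
lo, hi = np.quantile(p_hat, [0.025, 0.975])
print(lo, hi)
```

With more samples per measurement (larger n) the interval tightens and becomes more symmetric, which is the averaging effect discussed in the reply.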
#42
On Thu, 30 Aug 2007 22:46:46 GMT, Owen Duffy wrote:
> It is interesting in marketing hype that reference is made to 2 digit and 3 digit instruments, which implies a log based metric (10*log(MaxReading)) when you assume a 'full count', and the same hype refers to the upper digit, if it can only have values of 0 or 1, as half a digit, whereas it probably has a weight of log(0.5) or 0.3... so in utility terms, a 2 1/2 digit instrument is really a 2.3 digit instrument.

Hi Owen,

There are also 3000 count meters.

> In my case, I was making the measurements straddling 200mV, so I needed a bit of headroom for outliers, say 1dB or 225mV fsd, so it was effectively a 2.35 digit instrument if you followed that argument.

Certainly, but I abandoned multimeters for general utility long ago and went straight to my own designs for known precision. The common sound card will give you 65000 count readings, and there is a world of higher-resolution ADCs, up to at least 16 million count readings.

> Nevertheless, the error introduced by the resolution issue and instrument accuracy does not explain the experimental results... something else is happening, and one needs to look beyond the instrument itself to form a realistic view of measurement uncertainty when measuring narrowband noise.

You get non-monotonicity, quantization error, sample-hold time errors, codec error, and conversion errors through flash, successive approximation, or single/dual-slope methods. It would be simpler to handle the noise power in the linear domain and do the conversion to digital late in the chain (if at all). Getting out to hundredths-of-a-dB resolution (outside of the standard 1 kHz product lines) drives you into building your own solution. Linear circuits in the AF arena have long managed 6, 7 and sometimes 8 place resolution. You might have to twist as many knobs to get the reading, but you also control the variables.

Bolometry solves a lot of complexities (but that is where this topic started, after a fashion; and in that regard, optical pyrometry might be summoned up).

73's
Richard Clark, KB7QHC
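The "effective digits" arithmetic in the exchange above amounts to taking log10 of the full-scale count. A small sketch (Python; the count values are the ones mentioned in the thread):

```python
import math

def effective_digits(full_scale_counts: int) -> float:
    """Effective resolution in 'digits' as log10 of the full-scale count."""
    return math.log10(full_scale_counts)

# A "2 1/2 digit" meter counts to 199 (call it 200), so its utility is
# about 2.3 digits; a 3000 count meter is about 3.48 digits; Owen's
# 225 mV fsd case works out to about 2.35 digits, as he says.
print(round(effective_digits(200), 2))    # 2.3
print(round(effective_digits(3000), 2))   # 3.48
print(round(effective_digits(225), 2))    # 2.35
```

On the same scale, a sound card's 65000 counts would be about 4.8 effective digits, which is the comparison being drawn in the reply.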
#43
Dear Crew: I am delighted with the content, care, and thought contained in this thread.

The continual issue of all measurements comprising at least two numbers (an estimate of the quantity measured and an estimate of the uncertainty of the former) always needs to be dealt with. While the law of large numbers suggests that a normal distribution is a good assumption to start with, experience shows that sometimes normal is not normal. The famous paper by Costas (Dec. 1959, Proc. of the (wonderful) IRE) about communication in the presence of noise and other signals, "Poisson, Shannon, and the Radio Amateur," notes that a Poisson distribution is the appropriate distribution.

I had the opportunity at Ohio State to craft a system that measured very wide BW noise that changed slowly, and to add statistical measures to what was measured. Today, the task would be trivial: a sound card would run circles around what I did with a voltage-to-frequency converter, accumulator, counter-made-into-a-sidereal-clock, punched paper tape, and an IBM 1620.

It is not enough just to put a number on something. We must remember the early speed-of-light measurements whose mean turned out to be outside a later measurement's uncertainty band. An investigation of the old log books found that not all of the data had been used! When all of the data was used, the mean was within the more modern measurement's span.

73, Mac N8TT
--
J. Mc Laughlin; Michigan U.S.A.
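The "two numbers" point above, that every measurement is an estimate plus an uncertainty, can be sketched with the standard error of a sample mean (Python standard library only; the readings are illustrative values, not data from the thread):

```python
import statistics

# A measurement is reported as (estimate, uncertainty): here the
# sample mean and its standard error from repeated readings.
# These readings are made-up illustrative numbers.
readings = [10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7]

mean = statistics.mean(readings)
sdev = statistics.stdev(readings)        # sample standard deviation
sem = sdev / len(readings) ** 0.5        # standard error of the mean
print(f"{mean:.2f} +/- {sem:.2f}")       # prints "10.05 +/- 0.09"
```

The speed-of-light anecdote is the cautionary case: quoting the estimate without an honest uncertainty (or with selectively discarded data) makes later disagreement look like a contradiction rather than a predictable statistical spread.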
#44
Owen Duffy wrote in :

> ... Graphically, the distributions are shown at http://www.vk1od.net/nfm/temp.gif .

The graphic has moved to a different URL, and is now incorporated in a write-up of the experiment; a draft is at http://www.vk1od.net/nfm/multimeter.htm .

Owen
#45
On Fri, 31 Aug 2007 20:43:42 -0400, "J. Mc Laughlin" wrote:

> I had the opportunity at Ohio State to craft a system that measured very wide BW noise that changed slowly and to add statistical measures to what was measured. Today, the task would be trivial - a sound card would run circles around what I did with a voltage to frequency converter, accumulator, counter-made-into-a-sidereal-clock, punched paper tape, and an IBM 1620.

Hi Mac,

Last night at dinner, I had a conversation with a former HP exec, and we rambled on over glasses of Burgundy about how kids had lost access to "flipping bits" on the computer, and instead played on them.

What this has to do with the quote above is that newer technology may have made everything simpler, but the laborious route you took drew together many issues and gave you a visceral connection to the process, building an instinct, so to speak. For instance, your allusion to a counter-made-into-a-sidereal-clock may not be fully appreciated for its "sidereal" quality, a species of time that slips continuously against civil time. This would be a source of constant irritation for anyone being tugged away from their Cesium Beam Standard (at a rate of roughly 4 minutes a day).

So in some sense the solution becoming "trivial" removes intuition from the problem.

73's
Richard Clark, KB7QHC
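The "roughly 4 minutes a day" figure above is easy to confirm: a sidereal day is about 23 h 56 m 4.1 s, so a sidereal clock gains on civil time by just under 4 minutes per day. A quick check (Python):

```python
# Civil (mean solar) day versus sidereal day, in seconds.
civil_day_s = 24 * 3600                       # 86400 s
sidereal_day_s = 23 * 3600 + 56 * 60 + 4.0905 # ~86164.1 s

# Daily slip of a sidereal clock against civil time.
slip_s = civil_day_s - sidereal_day_s
print(round(slip_s, 1))       # prints 235.9 (seconds per day)
print(round(slip_s / 60, 2))  # prints 3.93 (minutes per day)
```

Over a year the slip accumulates to one full extra rotation, which is the usual way of remembering where the number comes from.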
#46
On Sat, 1 Sep 2007 22:50:09 -0400, "J. Mc Laughlin" wrote:

> The scheme of working half time and going to school half time is one way to overcome those limitations.

Hi Mac,

Sounds like a good plan - you've got to avoid sudden transitions or spurs will fill the spectrum.

73's
Richard Clark, KB7QHC