#51
Ian Jackson wrote:
>> Here they went from 1 megawatt to about 50 kilowatt (ERP). And then there
>> are several programmes on one transponder, instead of one analog programme.
>> This gives significant savings in power.

> That's quite a drop in power. In the UK, it seems that the digitals are
> being run at 1/5th of what the analogues were. Certainly the main
> transmitter for London, Crystal Palace, was 1MW erp, but is now 200kW on
> the main six digital muxes. [There are also a couple more running around
> 10dB less.]

When received with a similar quality setup as was required for longer
distance analog reception, the power is adequate. Of course it does not
allow indoor reception at 50km distance, but in the areas where indoor
reception is advertised there are local transmitters. "The countryside"
still needs a roof-mounted yagi, but they always did. (I think the spec was
a yagi at least 1.5m above the roof and 12m above the ground.)

Of course the 1MW was peak envelope power (at the sync pulses), with a mean
power a lot less than that (for typical content).
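For anyone who wants to sanity-check those ratios, a minimal Python sketch; the only inputs are the wattages quoted above, everything else is just the standard decibel formula:

    import math

    def db_ratio(p_new_watts, p_old_watts):
        """Power ratio expressed in decibels (negative means a reduction)."""
        return 10 * math.log10(p_new_watts / p_old_watts)

    # Figures quoted in the post:
    print(db_ratio(50_000, 1_000_000))    # 50 kW vs 1 MW  -> about -13 dB
    print(db_ratio(200_000, 1_000_000))   # 200 kW vs 1 MW -> about -7 dB (the "1/5th" figure)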
#52
In message , Rob writes

> Ian Jackson wrote:
>>> Here they went from 1 megawatt to about 50 kilowatt (ERP). And then
>>> there are several programmes on one transponder, instead of one analog
>>> programme. This gives significant savings in power.

>> That's quite a drop in power. In the UK, it seems that the digitals are
>> being run at 1/5th of what the analogues were. Certainly the main
>> transmitter for London, Crystal Palace, was 1MW erp, but is now 200kW on
>> the main six digital muxes. [There are also a couple more running around
>> 10dB less.]

> When received with a similar quality setup as was required for longer
> distance analog reception, the power is adequate. Of course it does not
> allow indoor reception at 50km distance, but in the areas where indoor
> reception is advertised there are local transmitters. "The countryside"
> still needs a roof-mounted yagi, but they always did. (I think the spec
> was a yagi at least 1.5m above the roof and 12m above the ground.)

> Of course the 1MW was peak envelope power (at the sync pulses), with a
> mean power a lot less than that (for typical content).

That is indeed true. The UK black level (which is when the highest average
power is being transmitted) is 2.4dB below sync - and peak white (minimum
power) is 14dB below sync. Even allowing for the relatively high average
power during the vertical interval, it's obvious that the average TV
programme will consume a lot less power than if the transmitter was pumping
out full envelope power all the time.

Of course, the 1MW is erp, and as the transmitting antenna gains can be
considerable, the transmitter won't be putting out 1MW. But again, you've
got combiner losses and feeder losses .......
--
Ian
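Those sync/black/white figures translate directly into power fractions. A small sketch using only the numbers quoted above (2.4dB below sync for black, 14dB below sync for peak white); no particular picture content is assumed:

    def fraction_of_peak(db_below_sync):
        """Carrier power as a fraction of peak (sync) power, given dB below sync."""
        return 10 ** (-db_below_sync / 10)

    peak_sync = 1.0                        # reference: 100% of peak envelope power
    black = fraction_of_peak(2.4)          # ~0.58 of peak
    white = fraction_of_peak(14.0)         # ~0.04 of peak

    print(f"black level: {black:.2f} x peak envelope power")
    print(f"peak white : {white:.3f} x peak envelope power")
    # A picture sitting somewhere between black and white therefore averages
    # well under half of the peak envelope power, which is the point being made.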
#53
On 3/17/2014 12:15 PM, Ian Jackson wrote:
In message , Jerry Stuckle writes
On 3/17/2014 11:32 AM, Ian Jackson wrote:
In message , Jerry Stuckle writes
On 3/17/2014 3:45 AM, Jeff wrote:

7dBm is an absolutely colossal signal for a TV set. Even 0dBm is an
absolutely colossal signal!

Not in the United States. It was the minimum that the cable industry
provides to the TV set. We are talking a 4.25MHz-wide signal, not SSB or CW.

dBm is not a bandwidth-dependent measurement such as CNR, which is. Putting
+7dBm into a TV receiver is madness, it would cause severe overload and
intermods. +7dBm is 50mW and that equates to about 61mV in a 75 ohm system,
which is an enormous signal.

Jeff

Wrong. TVs are made to handle at least 20 dbm. And cable tv companies must
deliver at least 10 dbm to the premises.

You do realise that 20dBm (appx 68dBmV) is a massive 100mW? With a modest
50 channel analogue cable TV system, that would be a total input power of
5W - which would have a TV set or set-top box sagging at the knees - if not
even beginning to smoke!

TV signals (at least in the U.S.) are not measured by CNR

Well of course they aren't. CNR is a ratio - not a level.

- they are measured by dbm.

No. The US and UK cable TV industry definitely uses dBmV.

Which is generally shortened to dbm here.

I must emphasise that you are simply WRONG. None of the professional cable
TV engineers I've ever been associated with (both in the UK and the USA)
have ever used the term 'dBm' when they mean 'dBmV'. Can you think of a
reason why? [Clue - There's 48dB difference between the two units.]

We aren't talking professional cable TV engineers here. We are talking
installers and cable pullers (a much larger group, BTW). They barely know
what a volt is - much less the difference between dBmW and dBmV. TV
technicians at least know what a volt is. But most of them don't know the
difference between dBmV and dBmW.

What you are talking about is dBmW - which, unfortunately, is also often
shortened to dBm. But most people on this side of the pond who are in the
business understand that.

I can live with that. The incorrect use of 'dBm' to mean 'dBmW' is a de
facto industry standard - and I'm not going to try and change the world by
pretending that I don't understand the incorrect 'dBm'.

It depends on the industry you are in.

0dBmV is 1mV - a reasonable signal to feed to a TV set (especially directly
from an antenna). 0dBm is appx 48dBmV (250mV) - and that's one hell of a TV
signal! With a 75 ohm source impedance (antenna and coax) - and no
significant levels of outside noise-like interference, a 0dBmV (1mV)
analogue NTSC signal, direct from an antenna, will have a CNR of around
57dB. A TV set with a decent tuner noise figure (5dB?) or a set-top box
(8dB) will produce essentially noise-free pictures. However, with an
analogue TV signal from a large cable TV system, the signal CNR will be
much worse than 57dB (regardless of its level). If I recall correctly, the
NCTA (National Cable Television Association) minimum spec is a CNR of 43dB
(UK is 6B). At this ratio, it is judged that picture noise is just
beginning to become visible.

CNR is not important because the bandwidth does not change.

You're havin' a laff - surely?!

Nope.

OK. Are you by any chance related to John McEnroe?
http://www.youtube.com/watch?v=ekQ_Ja02gTY

Not everyone works the same way. Your insistence on using CNR shows you
know nothing about how the industry measures signal strength.

I'm not insisting on anything.

However, an analogue with a poor CNR will produce noisy pictures -
regardless of the signal level. Similarly, a digital signal with too poor
an SNR/MER will fail to decode - regardless of the signal level. I think
the UK cable TV spec for digital signals is 25dB (although a good set-top
box will decode down to the mid-teens).

External noise is somewhat consistent. Front ends are pretty much
comparable in their S/N ratio. The only problems with noise are generally
if you have something generating noise locally. But that is not a problem
with the signal nor the receiver. That is why the real world uses signal
strength to determine proper signal levels. CNR in TV is not used nor is it
required when the other parameters are known.

So pray tell me why, in my many years in the cable TV industry, I spent so
many pointless hours measuring (among all the other parameters) CNR?

Maybe because you're talking to people who design front ends, etc. They are
only a small group in the entire industry.

--
==================
Remove the "x" from my email address
Jerry Stuckle
==================
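The dBmV-versus-dBm offset and the ~57dB CNR figure above both fall out of the 75 ohm impedance. A minimal sketch, assuming the 75 ohm system quoted in the thread and a roughly 4MHz noise bandwidth for the NTSC carrier-to-noise estimate (the exact noise bandwidth used in practice varies, so the last figure is only approximate):

    import math

    Z = 75.0  # ohms - the system impedance assumed throughout this thread

    def dbm_to_dbmv(dbm, z=Z):
        """Convert dB(milliwatt) to dB(millivolt) for a given impedance."""
        volts = math.sqrt((10 ** (dbm / 10)) * 1e-3 * z)   # P = V^2 / Z
        return 20 * math.log10(volts * 1e3)                # relative to 1 mV

    print(dbm_to_dbmv(0))   # ~48.8 dBmV (about 0.27 V) - the "48dB difference" mentioned above

    # Thermal noise floor in an assumed 4 MHz noise bandwidth, to see roughly
    # where a ~57 dB CNR for a 0 dBmV off-air signal comes from:
    k, T, B = 1.38e-23, 290.0, 4.0e6                 # Boltzmann constant, kelvin, hertz
    noise_dbm = 10 * math.log10(k * T * B / 1e-3)    # about -108 dBm
    noise_dbmv = dbm_to_dbmv(noise_dbm)              # about -59 dBmV
    print(noise_dbm, noise_dbmv)
    # A 0 dBmV carrier therefore sits ~59 dB above kTB; subtract a few dB of
    # tuner noise figure and you land in the mid-50s, consistent with the post.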
#54
On 3/17/2014 12:21 PM, Jeff wrote:
> On 17/03/2014 16:01, Jerry Stuckle wrote:
>> On 3/17/2014 11:58 AM, Jerry Stuckle wrote:
>>> Then why, pray tell, does the several $K Sencore signal analyzer sitting
>>> on the back shelf (because it's now pretty much obsolete) say "dbm"? It
>>> has been that way since I first started with MATV systems back in the
>>> early 70's. It's so common many cable techs wouldn't know there even is

> I can't comment on your Sencore signal analyzer as I have never used one,
> BUT every other signal generator and spectrum analyser I have come across
> and used, from HP/Agilent, R&S, MI etc etc, when labelled dBm means dB
> relative to a milliwatt. Also every other RF engineer I have come across
> universally understands dBm to mean dB relative to a milliwatt, NOT dBmV.
> Just check the specs of any RF test gear and you will see that they refer
> to dBm meaning dB relative to a milliwatt. Even Sencore's website with the
> specs of their latest equipment: where they mean dBuV or dBmV, they say so.

> Jeff

Remember - these are NOT RF engineers - they are only a small subset of the
entire industry. These are cable installers, TV technicians, and the like.
Even the TV signal generators I used in the 70's and early 80's when I did
some TV work were listed as dbm. And these guys don't look at websites to
use the equipment. They are given a spec to meet and meet it. They don't
know and don't care if it's dBmV or dBmW.

--
==================
Remove the "x" from my email address
Jerry Stuckle
==================
#55
Ian Jackson wrote:
> Of course, the 1MW is erp, and as the transmitting antenna gains can be
> considerable, the transmitter won't be putting out 1MW. But again, you've
> got combiner losses and feeder losses ......

The transmitters feeding the old analog 1MW ERP system were running 40kW
output per vision carrier. So antenna gain minus feedline and combiner
losses was 14dB. The feedline was about 300m. Not RG6, of course :-)
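That 14dB figure is just the decibel form of the ratio between the 1MW ERP and the 40kW per-carrier transmitter output. A quick sketch using only those two quoted numbers (how the 14dB splits between actual antenna gain and the feedline/combiner losses is not stated in the post):

    import math

    erp_watts = 1_000_000   # 1 MW ERP, from the post
    tx_watts = 40_000       # 40 kW per vision carrier, from the post

    net_gain_db = 10 * math.log10(erp_watts / tx_watts)
    print(f"antenna gain minus feeder/combiner losses: {net_gain_db:.1f} dB")
    # -> about 14.0 dB, matching the figure in the post.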
#56
On 3/17/2014 12:09 PM, Rob wrote:
> Jerry Stuckle wrote:
>> On 3/17/2014 10:45 AM, Rob wrote:
>>> Jerry Stuckle wrote:
>>>> On 3/16/2014 11:42 AM, Ian Jackson wrote:
>>>>> In message , Jerry Stuckle writes
>>>>>> HDTV requires a stronger signal than the old NTSC.

>>>>> It really depends on how good your old analogue NTSC was. For a
>>>>> noiseless picture, you would need around 43dB CNR, but pictures were
>>>>> still more-than-watchable at 25dB, and the picture was often still
>>>>> lockable at ridiculously low CNRs (when you certainly wouldn't bother
>>>>> watching it). Digital signals can work at SNRs down to around 15dB for
>>>>> 64QAM and 20dB for 256QAM (although if it's a little below this, you
>>>>> will suddenly get nothing).

>>>> That has not been our experience. We had a number of customers here in
>>>> the DC area who had great pictures on NTSC sets, but got either heavy
>>>> pixelation or no picture at all when the switchover occurred. We sent
>>>> them to a company which does TV antenna installations (we do a lot of
>>>> low voltage, including TV - but not antennas). In every case, installing
>>>> a better outdoor antenna solved the problem.

>>> Most likely the company reduced the transmitted power by a factor of 10
>>> at the time of the switchover, to put the added link margin in their own
>>> pockets. (Transmitting a megawatt of ERP as was regular in the analog
>>> days puts a serious dent in your electricity bill, even when you have a
>>> lot of antenna gain.)

>> Not at all. If anything, they raised their power.

> Here they went from 1 megawatt to about 50 kilowatt (ERP). And then there
> are several programmes on one transponder, instead of one analog
> programme. This gives significant savings in power.

OK, you mean absolute power. Yes, they can lower the ERP - but that does
not necessarily lower the power for the signal. Remember at 1MW the power
was spread over 4.25 MHz (assuming video only, of course). Digital requires
much less bandwidth, so they don't need as much power to get the same
effective signal.

However, digital still requires a stronger signal than analog, in the
bandwidth provided. You need quite a bit of noise before it becomes visible
in analog. Digital, a single noise pulse can cause the loss of several bits
of information. Because of the compression involved, this is more than one
or two pixels.

--
==================
Remove the "x" from my email address
Jerry Stuckle
==================
#57
This is what I love about USENET. Ask a question and sit back and watch while
the majority of respondents argue the minutia. Thanks to those who answered on-topic. Much appreciated.
#58
Jerry Stuckle wrote:
>>>> Most likely the company reduced the transmitted power by a factor of 10
>>>> at the time of the switchover, to put the added link margin in their
>>>> own pockets. (Transmitting a megawatt of ERP as was regular in the
>>>> analog days puts a serious dent in your electricity bill, even when you
>>>> have a lot of antenna gain.)

>>> Not at all. If anything, they raised their power.

>> Here they went from 1 megawatt to about 50 kilowatt (ERP). And then there
>> are several programmes on one transponder, instead of one analog
>> programme. This gives significant savings in power.

> OK, you mean absolute power. Yes, they can lower the ERP - but that does
> not necessarily lower the power for the signal. Remember at 1MW the power
> was spread over 4.25 MHz (assuming video only, of course). Digital
> requires much less bandwidth, so they don't need as much power to get the
> same effective signal.

> However, digital still requires a stronger signal than analog, in the
> bandwidth provided. You need quite a bit of noise before it becomes
> visible in analog. Digital, a single noise pulse can cause the loss of
> several bits of information. Because of the compression involved, this is
> more than one or two pixels.

I think not much of that is correct. The systems differ a bit between US
and elsewhere, but over here the channel spacing of digital and analog is
the same, and the bandwidth is similar (a bit more for digital if
anything). Also there is no discussion of "spreading", we are just
discussing peak envelope ERP.

You could argue that a single digital stream sending 5 programmes means
that 1 programme is transmitted at 1/5 the power, but that is not what I
mean. The total ERP for 1 transmitter has been lowered, and it transmits
multiple programmes to boot.

Digital requires less power because it requires less signal-to-noise ratio
at the receiver.
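That last point can be put roughly into numbers using the carrier-to-noise figures already quoted earlier in the thread (about 43dB for a noise-free analogue picture, and 15-20dB decode thresholds for digital). Those thresholds were quoted in a cable context and terrestrial modes have their own values, so this is only an order-of-magnitude sketch, not a planning calculation:

    # Rough link-margin comparison, using CNR/SNR figures quoted earlier in the thread.
    analog_cnr_db = 43    # "noiseless picture" analogue figure quoted above
    digital_snr_db = 20   # upper end of the 15-20 dB digital thresholds quoted above

    margin_db = analog_cnr_db - digital_snr_db
    ratio = 10 ** (margin_db / 10)
    print(f"difference in required carrier-to-noise: {margin_db} dB "
          f"(a factor of about {ratio:.0f} in power)")
    # ~23 dB is roughly a factor of 200; the observed ERP drop in the thread
    # (1 MW -> 50 kW, about 13 dB) is smaller than that, leaving margin in hand.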
#59
In message , Bob E. writes

> This is what I love about USENET. Ask a question and sit back and watch
> while the majority of respondents argue the minutia. Thanks to those who
> answered on-topic. Much appreciated.

You're welcome, Bob. At least it got the buggers stirred up!

Among all the smoke and dust, did your query get adequately answered? Are
you going to go ahead, and suck it and see?
--
Ian
#60
On 3/17/2014 3:15 PM, Rob wrote:
> Jerry Stuckle wrote:
>>>>> Most likely the company reduced the transmitted power by a factor of
>>>>> 10 at the time of the switchover, to put the added link margin in
>>>>> their own pockets. (Transmitting a megawatt of ERP as was regular in
>>>>> the analog days puts a serious dent in your electricity bill, even
>>>>> when you have a lot of antenna gain.)

>>>> Not at all. If anything, they raised their power.

>>> Here they went from 1 megawatt to about 50 kilowatt (ERP). And then
>>> there are several programmes on one transponder, instead of one analog
>>> programme. This gives significant savings in power.

>> OK, you mean absolute power. Yes, they can lower the ERP - but that does
>> not necessarily lower the power for the signal. Remember at 1MW the power
>> was spread over 4.25 MHz (assuming video only, of course). Digital
>> requires much less bandwidth, so they don't need as much power to get the
>> same effective signal.

>> However, digital still requires a stronger signal than analog, in the
>> bandwidth provided. You need quite a bit of noise before it becomes
>> visible in analog. Digital, a single noise pulse can cause the loss of
>> several bits of information. Because of the compression involved, this is
>> more than one or two pixels.

> I think not much of that is correct. The systems differ a bit between US
> and elsewhere, but over here the channel spacing of digital and analog is
> the same, and the bandwidth is similar (a bit more for digital if
> anything). Also there is no discussion of "spreading", we are just
> discussing peak envelope ERP.

> You could argue that a single digital stream sending 5 programmes means
> that 1 programme is transmitted at 1/5 the power, but that is not what I
> mean. The total ERP for 1 transmitter has been lowered, and it transmits
> multiple programmes to boot.

> Digital requires less power because it requires less signal-to-noise ratio
> at the receiver.

There are major differences between Europe's PAL and the U.S.'s NTSC. But
the digital signal has much LESS bandwidth than the old analog one. That
was the major impetus over here to switch to digital - to free up major
bandspace in the VHF and UHF spectrums. We now have as many (or, in some
areas, more) stations in a much smaller band than before.

Digital requires less power because the bandwidth is much lower.

--
==================
Remove the "x" from my email address
Jerry Stuckle
==================