You've gotten some good advice from some others. I'll just add that most Part 
15 devices are specified in terms of field strength at some distance 
from the antenna, depending on frequency, and not in terms of power or 
ERP. There might be some sections with other criteria, but if there are, 
field strength specification is by far the most common. The FCC does cut 
some slack in testing for home-built devices (not marketed, not 
constructed from a kit, and built in quantities of five or fewer for 
personal use) in section 15.23. My copy is nearly ten years old now, so 
I suggest checking a newer copy of Part 15. It's likely on the Web these 
days. 
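Since the limit is field strength at a distance rather than power, what matters is that antenna gain raises the field strength in the favored direction without adding total power. A rough free-space sketch (the standard formula E = sqrt(30 * P * G) / d, with illustrative numbers, not actual FCC limits):

```python
# Far-field free-space field strength, a rough sketch.
# E (V/m) = sqrt(30 * P * G) / d, where P is power delivered to the
# antenna in watts, G is numeric gain, and d is distance in meters.
# The 30 mW figure and 3 m distance below are illustrative only.
import math

def field_strength_uV_per_m(power_w, gain_numeric, distance_m):
    """Free-space field strength in microvolts per meter."""
    e_v_per_m = math.sqrt(30.0 * power_w * gain_numeric) / distance_m
    return e_v_per_m * 1e6

# 30 mW into an isotropic radiator (gain 1), measured at 3 m:
e1 = field_strength_uV_per_m(0.030, 1.0, 3.0)
# Same 30 mW into an antenna with 3 dB gain (numeric gain 2):
e2 = field_strength_uV_per_m(0.030, 2.0, 3.0)
print(round(e1), round(e2))
```

Note that doubling the numeric gain raises the field strength by only sqrt(2), and only in the direction the antenna favors; the total radiated power never exceeds what the transmitter delivers.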
 
Roy Lewallen, W7EL 
 
Liam Ness wrote: 
 I've been homebrewing some simple Part 15 transmitters and have always 
 thought that I was safely within Part 15 by controlling the RF output. 
 I use a SPICE program to estimate my output levels.  I just read a web 
 page that suggests an antenna can increase the RF output power and I 
 wanted advice on whether that is true.  It was suggested that output 
 could be increased from 30 milliwatts to 60 milliwatts by using this 
 antenna.  I understand how you could increase voltage with a decrease 
 in amperage and vice versa, but I was under the assumption that you 
 couldn't increase total power without adding more power.  I thought it 
 would violate one of the laws of thermodynamics otherwise.  They didn't 
 seem to be talking about more efficiently radiating the transmitter's 
 power, but actually increasing it above what is present at the antenna 
 port. 
 
 Could someone confirm whether it is possible to increase the power 
 output of an RF transmitter above the total presented to the antenna? 
 If it is, I'd appreciate any pointers to information about this.  I 
 don't want to put myself out of Part 15 by a poor antenna choice. (Even 
 though I still can't believe that it is possible; it sounds too much 
 like perpetual motion.) 
 
 TIA 
 