Does anybody have any idea how to calculate the phase and magnitude,
at my receiving antenna, of a signal radiating from a source located
at (x, y, z)? The following example shows my approach to solving this
problem. Is this assumption correct? If so, how do I model the
receiving antenna and specify the source location?
For example, let the receiving antenna be the dipole (as in the
example file), operating at 300 MHz and centered at (0, 0, 0), with
the source located at (-10, 2.25, 0). If I move my receiving antenna
to (0, 2.25, 0), I should be able to detect a phase change of 90
degrees between the two points: the path lengths are 10.25 m and
10.00 m, and the 0.25 m difference is a quarter of the 1 m wavelength.
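As a sanity check on the geometry (this is only a free-space,
point-source sketch in Python, not a model of the dipole itself or of
any particular solver's API), the expected phase shift and amplitude
ratio can be computed directly from the two path lengths:

    import math

    C = 299_792_458.0        # speed of light, m/s
    FREQ = 300e6             # 300 MHz, as in the example
    WAVELENGTH = C / FREQ    # roughly 1 m

    def path_phase_deg(source, receiver):
        """Phase delay in degrees accumulated along the straight-line
        path from source to receiver: 360 * d / lambda."""
        return 360.0 * math.dist(source, receiver) / WAVELENGTH

    source    = (-10.0, 2.25, 0.0)
    rx_origin = (0.0, 0.0, 0.0)    # receiving dipole centered at the origin
    rx_moved  = (0.0, 2.25, 0.0)   # receiving dipole moved to (0, 2.25, 0)

    # Path lengths are 10.25 m and 10.00 m; the 0.25 m difference is a
    # quarter wavelength, hence the expected ~90-degree shift.
    shift = path_phase_deg(source, rx_origin) - path_phase_deg(source, rx_moved)
    print(f"phase shift between the two positions: {shift:.2f} degrees")

    # Far-field magnitude falls off as 1/d (ignoring the antenna patterns),
    # so moving 0.25 m closer raises the amplitude by about 2.5 percent.
    amp_ratio = math.dist(source, rx_origin) / math.dist(source, rx_moved)
    print(f"relative amplitude (moved / origin): {amp_ratio:.4f}")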
Any help or hints would be greatly appreciated.
Haigan Chea
Received on Wed Dec 10 1997 - 12:04:09 EST