If the antenna gain increases, its beamwidth decreases (i.e. the beamwidth and gain of an antenna are inversely proportional to each other).
But what I have read is that by increasing the gain, the pattern of an antenna should flatten. If it flattens, then the beamwidth should increase.
Because of this flattening, does the transmission range really increase?
The unit of antenna gain is dBi. The "i" stands for "isotropic": a perfect point source that radiates equally in all directions, in a spherical pattern. A perfect dipole, by contrast, radiates with a donut pattern, broadside to the long dimension of the dipole. Gain in dBi is a measurement relative to that ideal isotropic radiator.
To achieve higher gains, antennas are constructed such that they radiate more in one direction than another. An omni directional antenna radiates uniformly in the horizontal plane and radiates very little up or down. This is how higher gain omni-directional antennas get that gain, by flattening the vertical dispersion zone (aka squishing a balloon or adding reflectors to divert water in a more horizontal plane).
Panel, sector, yagi, and parabolic grid antennas radiate in cones of various widths. The higher the gain, the smaller the horizontal AND vertical angles. Using the water analogy again, for a given amount of water, the distance the water shoots can be increased by focusing the spray; for a given amount of microwave energy, distance can be increased by focusing the beam.
Antenna beamwidths are specified by their half-power points (3 dB less than the specified maximum output).
For example, one '14 dBi' directional antenna has 14dBi gain straight ahead but only 11 dBi gain 32 degrees horizontally and 31 degrees vertically; one '24 dBi' parabolic grid also has 24 dBi gain straight ahead but only 21 dBi gain 6.5 degrees horizontally and 10 degrees vertically.
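Those figures illustrate the gain/beamwidth trade-off. As a very rough sanity check, directivity can be estimated from the two half-power beamwidths with the Kraus approximation, 41253 / (HPBW_h × HPBW_v). Here is a sketch, assuming the quoted angles are measured off boresight so that the full half-power beamwidths are twice those values; this is only an order-of-magnitude estimate and will undershoot a real antenna's rated gain:

```python
import math

def kraus_gain_dbi(hpbw_h_deg, hpbw_v_deg):
    """Rough directivity estimate from the Kraus approximation:
    D ~ 41253 / (HPBW_h * HPBW_v), with beamwidths in degrees."""
    directivity = 41253 / (hpbw_h_deg * hpbw_v_deg)
    return 10 * math.log10(directivity)

# '14 dBi' panel: half-power points at 32 deg H and 31 deg V off boresight,
# so full beamwidths of roughly 64 x 62 degrees.
print(round(kraus_gain_dbi(64, 62), 1))   # a few dB below the rated 14 dBi

# '24 dBi' grid: 6.5 deg H and 10 deg V off boresight -> 13 x 20 degrees.
print(round(kraus_gain_dbi(13, 20), 1))   # close to the rated 24 dBi
```

The approximation assumes all the energy goes into the main lobe, so real antennas with side lobes and losses land a few dB off; it still shows clearly that narrower beams mean higher gain.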
So, I think the point of confusion was that higher gain antennas shape the beam in both planes.
Also, higher gain antennas are a big deal because they are a one-time investment that gets huge increases in signal quality, whereas getting an amplifier, for example, has ongoing costs as well as technical issues. IMO, shaping the RF coverage area would also go a long way toward decreasing the RF pollution/congestion that we are starting to see.
If we say that we get higher range with increased antenna gain, how much range can we expect per 1 dBi of extra antenna gain? E.g., if I replace a 10 dBi antenna with an 11 dBi one, how much more range will I get? (Consider free space for signal propagation, with no obstruction between transmitter and receiver.)
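In free space there is a clean answer to this: received power falls off as the square of distance, so each extra dB of gain multiplies the workable range by 10^(dB/20), roughly 12% per dB. A quick sketch:

```python
def range_factor(extra_gain_db):
    """In free space, received power falls as 1/d^2, so every extra
    dB of gain stretches the workable distance by 10^(dB/20)."""
    return 10 ** (extra_gain_db / 20)

print(range_factor(1))   # ~1.12: one extra dBi buys roughly 12% more range
print(range_factor(6))   # ~2.0: 6 dB doubles the free-space range
```

So going from 10 dBi to 11 dBi gives about 12% more range in free space; indoors, with walls and multipath, the real-world gain is usually less.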
Hope I get to all of your questions. The first one about testing antennas can and will get very involved. That typically requires some rather intense and costly equipment. But you can get an idea with a notebook and an application like NetStumbler. The first link talks about what is required for antenna testing.
Antenna polarization is not a real issue at these frequencies until you consider more than one device. It is extremely critical that all of the antennas trying to communicate have the same polarization. Whether that should be vertical or horizontal polarization is debatable. I prefer vertical polarization, as it seems to cancel out man-made and naturally occurring interference.
As for antennas and frequency, that again is a very important issue. To get the highest efficiency out of the antenna, each antenna is designed and built to operate in a specific frequency range. It may work outside of that range, but at reduced efficiency. Using the wrong antenna can also harm the transmitter portion of the radio due to an impedance mismatch, though that typically only comes into play at much higher output power. The Wiki link is an excellent study of this and other questions; read it and you will have a pretty good understanding of what is going on.
Why is receiver sensitivity expressed as a negative number? Many times I have seen the receiver power sensitivity quoted as -95 dBm. Why is it negative?
What is the meaning of the negative threshold value? If it were expressed as 0 dBm, would that mean there is no signal level?
dBm is a ratio referenced to 1 milliwatt, so for example +3 dBm equals 2 milliwatts, +10 dBm equals 10 milliwatts, +20 dBm equals 100 milliwatts, etc.
Where +dBm indicates "more than", -dBm simply indicates "less than" so a value of -3dBm means 3dB less than 1 milliwatt ... or 0.5 milliwatts. Similarly, -10dBm is 1/10 of a milliwatt or 0.1 milliwatts and -20dBm is 1/100 of a milliwatt or 0.01 milliwatt.
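The arithmetic behind all of those examples is just powers of ten. A minimal sketch:

```python
import math

def dbm_to_mw(dbm):
    """Convert a dBm value to milliwatts (0 dBm = 1 mW)."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Convert milliwatts to dBm."""
    return 10 * math.log10(mw)

print(dbm_to_mw(3))     # ~2 mW
print(dbm_to_mw(-20))   # 0.01 mW
print(mw_to_dbm(100))   # 20.0 dBm
```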
In the case of a receiver having a sensitivity of -20dBm, it would require a power at the receiver antenna input of 1/100 (0.01) milliwatts before it could hear the signal. If the receiver had a sensitivity of -82dBm it would need a power at the receive antenna of 82dB below 1 milliwatt ... a very small number! The lower the number, the better the sensitivity ... a receiver with -82dBm is much more sensitive than one with a sensitivity of -20dBm.
In regard to signal to noise ratio, you shouldn't really see a negative value and it would normally be quoted in dB, not dBm. For example, a s/n ratio of 3dB means that the signal is twice that of the noise. A s/n ratio of 10dB means the signal is 10 times the noise. If you were to see a s/n ratio of -10dB it would mean that the signal was one tenth of the noise ... meaning that you wouldn't hear the signal.
Signal to noise ratio is expressed in dB, not in dBm. Signal to noise ratio is just that ... the ratio of the signal to the noise. If you are seeing dBm in this regard it means the absolute value of the signal and/or the noise. For example, the noise might have an absolute value of -50dBm and the signal might have an absolute value of -40dBm in which case the signal to noise ratio (the difference between the two) would be +10dB ... NOT 10dBm.
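In code, that last example is just a subtraction of two absolute levels:

```python
def snr_db(signal_dbm, noise_dbm):
    """SNR in dB is the difference between two absolute dBm levels:
    subtracting logarithms is dividing the underlying powers."""
    return signal_dbm - noise_dbm

print(snr_db(-40, -50))   # 10 dB: the signal power is 10x the noise power
```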
This link goes into the real details of the calculation if you are interested.
One more question I would like to ask, regarding the frequency of the application and the range of the application. One formula I came across while reading is:
log R (km) = [ Loss (dB) - 32.4 - 20 log (f in MHz) ] / 20
With this relation, if my loss is constant, the range R is directly proportional to the frequency band.
Is it correct that with a higher frequency we will get greater range? In short, if I am using a sensor application working at 900 MHz and Wi-Fi at 2.4 GHz, will I get more range with the Wi-Fi application?
I have to admit that I am not familiar with that formula. But even if you work through it, you will realize that 900 MHz has a great deal more range than 2400 MHz.
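For what it's worth, plugging numbers into the formula quoted above (which looks like the inverted free-space path-loss equation) bears this out: for the same allowable loss, the higher frequency comes out with less range, not more, because the 20 log f term is subtracted. A quick sketch, with 100 dB as an assumed loss budget:

```python
import math

def range_km(loss_db, f_mhz):
    """Invert the free-space path-loss formula quoted above:
    loss(dB) = 32.4 + 20*log10(f in MHz) + 20*log10(d in km)."""
    return 10 ** ((loss_db - 32.4 - 20 * math.log10(f_mhz)) / 20)

budget = 100  # an assumed allowable path loss, in dB
print(round(range_km(budget, 900), 2))    # ~2.67 km at 900 MHz
print(round(range_km(budget, 2400), 2))   # ~1.0 km at 2.4 GHz
```

For a fixed loss budget the range ratio is just the frequency ratio (2400/900 ≈ 2.7), so 900 MHz reaches noticeably farther even before tree and wall penetration is considered.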
Once more from my notes:
""The physics of radio signals typically place two primary constraints on spectrum. To generalize, the higher the spectrum frequency, the greater the amount of bandwidth that can be transported; lower frequencies transport less bandwidth.
Secondly, the lower the frequency, the greater the carrying range and penetration of a signal. For example: a 900 MHz license-free radio will travel farther and penetrate some tree cover fairly easily at ranges up to one to two miles, but it can carry much less bandwidth than a 2.4 GHz signal, which cannot penetrate any tree cover whatsoever but can deliver a lot more data.
The caveat that can somewhat alter this equation is power. Licensed-band spectrum, by virtue of being dedicated to one user, is allotted significantly higher power levels, which aids in tree and building wall penetration.""
Some of the actual reasons: first, the higher the frequency, the more energy is required to create the RF. Second, the higher the frequency, the more strongly it interacts with, and is affected by, its surroundings. Microwave ovens use 2.4 GHz, not 900 MHz.
Also, Spider, you are correct. The higher the frequency, the smaller the wavelength. As an example, 2.4 GHz has a wavelength of approximately 5 inches in free space, whereas 7 MHz has a wavelength of approximately 140 feet.
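Wavelength is just the speed of light divided by frequency; a quick check:

```python
def wavelength_m(freq_hz):
    """Free-space wavelength: c / f, in metres."""
    c = 299_792_458  # speed of light, m/s
    return c / freq_hz

print(wavelength_m(2.4e9))            # ~0.125 m
print(wavelength_m(2.4e9) / 0.0254)   # ~4.9 inches
print(wavelength_m(7e6) / 0.3048)     # ~140 feet
```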
This question is regarding the signal-to-noise ratio.
Suppose I have configured a wireless network with a total of 4 access points. We are not transferring any data through these APs; the APs are only sending beacons (or other control frames) to each other. In this case, will the signal-to-noise ratio change or not?
Basically, I want to ask: if my network is on but no data transfer is going on, will the signal-to-noise ratio change?
I wish to calculate the distance between transmitter and receiver whenever I deploy a wireless network.
Both transmitter and receiver antenna gain = 2 dBi
Losses because of transmitter and receiver cables = 2 dB
Indoor losses because of walls = 94 dB
Transmitter power = 200 mW
Receiver sensitivity = -95 dBm
Please let me know: with this much information, how can I calculate the distance between transmitter and receiver?
I would appreciate it if someone could give the formula for calculating the distance between transmitter and receiver.
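There is no exact formula once walls are involved, but as a rough free-space sketch with the numbers above (assuming the link runs at 2.4 GHz, which was not stated, and treating the 2 dB cable figure as a combined total):

```python
import math

# Link budget with the figures given above (frequency is not stated,
# so 2.4 GHz is assumed here purely for illustration).
tx_dbm   = 10 * math.log10(200)  # 200 mW -> ~23 dBm
gains_db = 2 + 2                 # tx + rx antenna gain, dBi
cable_db = 2                     # cable losses (assumed total, not per side)
walls_db = 94                    # indoor wall losses
sens_dbm = -95                   # receiver sensitivity

# Path loss the link can still absorb after everything else:
allowed_fspl = tx_dbm + gains_db - cable_db - walls_db - sens_dbm
print(round(allowed_fspl, 1))    # ~26 dB left over for free-space loss

# Invert the free-space path-loss formula (f in MHz, d in km):
f_mhz = 2400
d_km = 10 ** ((allowed_fspl - 32.44 - 20 * math.log10(f_mhz)) / 20)
print(round(d_km * 1000, 2))     # well under a metre
```

The striking result is that the 94 dB of wall loss consumes almost the whole budget: with these numbers the free-space portion of the link is under a metre at 2.4 GHz. In practice indoor propagation is not free-space at all, so treat this only as a feasibility check, not a prediction.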
To address your first question about signal-to-noise ratio: if I understand you correctly, I do not see any relationship between S/N and other devices. The S/N is a measure of receiver quality, if you will. It is a bit more complex when you have to consider digital versus analog signals, but it is basically the ratio of the received signal to the internal noise level of the receiver's electronics.
As for the next question about calculating distances, you and a great many other people would like to have an accurate calculation like that. The only research that I know of that is even remotely working is described by this paper. Their approach is basically to use the same principle that gives radar the ability to determine distances.
How do you differentiate between noise and interference?
While reading about WLAN networks, I sometimes come across the term interference, while some papers talk about noise. I am not clear on what should be called noise and what should be called interference.
If my network consists of an 802.11b LAN as well as a cordless phone working in the 2.4 GHz band, is whatever performance degradation occurs because of interference or because of noise?