Absolute Humidity, often referred to simply as 'the humidity', is a measure of the actual amount of water vapour in a particular sample of air. It can be expressed as a partial pressure (vapour pressure, in hPa or millibars), a mixing ratio (g of water vapour per kg of dry air), a dew point, etc.
Relative Humidity, commonly expressed as a percentage, is the ratio of the actual amount of water vapour present in a sample (the Absolute Humidity) to the amount that would be needed to saturate that sample at its current temperature.
The two terms are not interchangeable, and confusing them is easy. For example, on a cold, raw winter's day close to the east coast of England, the dew point might be 1 °C with an air temperature of just 2 °C. This gives an RH of 93%, a 'high' Relative Humidity, yet few would describe such conditions as 'humid'. Conversely, on a hot summer's day with a dew point of 18 °C and an afternoon temperature of 30 °C, the RH is only 49%, a 'low' Relative Humidity, but the Absolute Humidity is high.
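The RH figures in the two examples above can be checked numerically. A common way to do this is to compute the saturation vapour pressure at both the dew point and the air temperature and take their ratio; the sketch below uses one widely quoted set of Magnus-formula coefficients (6.112 hPa, 17.62, 243.12 °C), which are an assumption here since the original text does not specify a formula, and published coefficient sets vary slightly.

```python
import math

def saturation_vapour_pressure(t_celsius):
    """Approximate saturation vapour pressure in hPa via the Magnus
    formula. Coefficients (6.112, 17.62, 243.12) are one common choice
    for liquid water; other sources use slightly different values."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def relative_humidity(dew_point, air_temp):
    """RH (%) = actual vapour pressure (at the dew point) divided by
    the saturation vapour pressure at the air temperature."""
    return 100.0 * (saturation_vapour_pressure(dew_point)
                    / saturation_vapour_pressure(air_temp))

# Winter example: dew point 1 °C, air temperature 2 °C -> roughly 93%
print(round(relative_humidity(1, 2)))

# Summer example: dew point 18 °C, air temperature 30 °C -> roughly 49%
print(round(relative_humidity(18, 30)))
```

Note how the summer case has a far higher actual vapour pressure (about 20.6 hPa versus 6.6 hPa), i.e. much more water vapour in the air, despite the lower RH.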

Hi students! Here is your answer. Relative humidity expresses the water vapour present as a percentage of the maximum the air could hold at its current temperature, whereas absolute humidity measures the actual amount of water vapour without reference to that temperature-dependent maximum. Both, however, are measures of water vapour in the air.