Extract from The Rough Guide to Climate Change
Friday 11 March 2011 10.45 GMT Last modified on Thursday 14 January 2016 11.37 GMT
The Kyoto protocol was the first agreement between nations to mandate country-by-country reductions in greenhouse-gas emissions. Kyoto emerged from the UN Framework Convention on Climate Change (UNFCCC), which was signed by nearly all nations at the 1992 mega-meeting popularly known as the Earth Summit. The framework pledges to stabilize greenhouse-gas concentrations "at a level that would prevent dangerous anthropogenic interference with the climate system". To put teeth into that pledge, a new treaty was needed, one with binding targets for greenhouse-gas reductions. That treaty was finalized in Kyoto, Japan, in 1997, after years of negotiations, and it went into force in 2005. Nearly all nations have now ratified the treaty, with the notable exception of the United States. Developing countries, including China and India, weren't mandated to reduce emissions, given that they'd contributed a relatively small share of the current century-plus build-up of CO2.
Under Kyoto, industrialised nations pledged to cut their annual emissions of six greenhouse gases, measured in carbon-equivalent terms, by varying amounts averaging 5.2% by 2012, relative to 1990 levels. That equates to a 29% cut from the emission levels that would otherwise have been expected. However, the protocol didn't become international law until more than halfway through the 1990–2012 period, by which point global emissions had already risen substantially.