Soil thermal resistivity is defined as "the difference in degrees centigrade between opposite faces of a centimetre cube of soil caused by the transference of one watt of heat", and is expressed in thermal ohm cm or °C cm/watt.

When cables are buried in the ground, the heat generated in them passes into the surrounding soil. The ground surface above the cable is a plane isothermal at the ambient temperature, so all the heat generated is ultimately transmitted to the ground surface, which remains at a constant temperature.

In general, soils with a higher moisture content have a lower thermal resistivity and consequently the best heat-dissipating qualities, whereas porous or well-drained soils have a higher thermal resistivity. Soil resistivity also varies over the course of the year: lower resistivity occurs in the rainy season, when the moisture content is high and the ground temperature low, and the converse occurs in summer. In waterlogged areas the soil thermal resistivity tends to be reduced. Soils vary enormously in their thermal resistivity, from as low as g = 30 °C cm/watt in certain waterlogged soils to very unfavourable ground that may approach g = 400 °C cm/watt.

The current rating of a cable should therefore be based on the highest value of soil thermal resistivity observed, and measuring the soil thermal resistivity should be an essential preliminary to determining the correct current rating for a given cable.
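The dependence of current rating on soil resistivity can be sketched numerically. The snippet below uses a simplified deep-burial approximation for the external thermal resistance of the soil, T ≈ (g / 2π) · ln(4L / D), and the elementary thermal-circuit relation Δθ = I² · R_ac · T. The burial depth, cable diameter, temperature rise, and AC resistance used here are hypothetical illustrative values, not values from the text; the formulas are a simplified sketch of the kind of calculation standardized in IEC 60287, not a substitute for it.

```python
import math

def soil_thermal_resistance(g, depth_cm, diameter_cm):
    """External thermal resistance of the soil for a single buried cable,
    in thermal ohm cm (°C·cm/W), using the deep-burial approximation
    T ≈ (g / 2π) · ln(4L / D). g is the soil thermal resistivity in
    °C cm/watt, L the burial depth and D the cable diameter, both in cm."""
    return (g / (2 * math.pi)) * math.log(4 * depth_cm / diameter_cm)

def current_rating(delta_theta, r_ac, t_soil):
    """Permissible current from Δθ = I² · R_ac · T, i.e.
    I = sqrt(Δθ / (R_ac · T)). delta_theta is the permissible conductor
    temperature rise in °C, r_ac the AC resistance in ohm/cm,
    t_soil the soil thermal resistance in °C·cm/W."""
    return math.sqrt(delta_theta / (r_ac * t_soil))

# Hypothetical parameters: burial depth 90 cm, cable diameter 5 cm,
# permissible temperature rise 50 °C, AC resistance 2e-6 ohm/cm (0.2 ohm/km).
for g in (30, 400):  # favourable vs unfavourable soil, °C cm/watt
    t = soil_thermal_resistance(g, 90.0, 5.0)
    i = current_rating(50.0, 2e-6, t)
    print(f"g = {g:3d} °C cm/W -> T = {t:6.1f} thermal ohm cm, I = {i:5.0f} A")
```

Because the rating varies as 1/√g with all other parameters fixed, moving from the most favourable soil (g = 30) to the least favourable (g = 400) cuts the permissible current by a factor of √(400/30) ≈ 3.7, which is why the highest observed resistivity must govern the rating.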