Metal foams have been investigated for both forced and free convection heat transfer applications. However, there appear to be no engineering models for determining the thermal resistance of metal foams used in free convection. This paper presents a new engineering model to fill that gap. The model is developed from first principles of heat transfer and the concept of a thermal resistance network: it identifies the various resistances present inside the foam and the temperature difference associated with each. The few coefficients needed to calibrate the model were determined from experimental data on an actual aluminum foam sample with 20 pores per inch (ppi) and a porosity of 76.4 %. The sample measured 109 mm × 110 mm × 20 mm and was cooled by room air while subjected to three heat fluxes: 884.1 W/m², 1217.7 W/m², and 1884.9 W/m². The model of the foam thermal resistance was then validated by further experiments on three foam samples of different materials and morphological properties; these included aluminum and copper foams with porosities ranging from 74.5 % to 78.0 %. The foam thermal resistances predicted by the model agreed very well with their experimental counterparts, and the heated wall temperature was predicted with reasonable accuracy. The model enables optimization of metal foams in terms of pore density and porosity for free convection, and it shows that no advantage is gained by using copper foam rather than aluminum foam for passive cooling.
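As an illustrative sketch of the thermal-resistance-network concept, and assuming for simplicity a purely series arrangement (the specific resistances and their actual arrangement are defined in the paper, not in this abstract, and the symbols $Q$, $\Delta T_i$, $T_w$, and $T_\infty$ are generic notation introduced here), each resistance and the overall foam resistance follow from

\[
R_i = \frac{\Delta T_i}{Q}, \qquad R_{\mathrm{foam}} = \sum_i R_i = \frac{T_w - T_\infty}{Q},
\]

where $Q$ is the supplied heat rate, $\Delta T_i$ is the temperature difference across the $i$-th resistance, $T_w$ is the heated wall temperature, and $T_\infty$ is the ambient air temperature. In a series network the same heat rate crosses every resistance, so the individual temperature drops sum to the overall wall-to-ambient temperature difference.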