Simulated sample heating from a nanofocused X-ray beam

Recent developments in synchrotron brilliance and X-ray optics are pushing the flux density in nanofocusing experiments to unprecedented levels, increasing the risk of several types of radiation damage. The effect of X-ray-induced sample heating has been investigated using time-resolved and steady-state three-dimensional finite-element modelling of representative nanostructures. Simulations of a semiconductor nanowire indicate that the heat generated by X-ray absorption is transported efficiently within the nanowire, and that the temperature becomes homogeneous after about 5 ns. The most important channel for heat loss is conduction to the substrate, where the heat-transfer coefficient and the interfacial area limit the transport. While convective heat transfer to air is significant, thermal radiation is negligible. At the reference parameters, the steady-state average temperature in the nanowire is 8 K above room temperature. In the absence of heat transfer to the substrate, the temperature increase at the same flux reaches 55 K in air and rises far beyond the melting temperature in vacuum. Reducing the size of the X-ray focus at constant flux increases the maximum temperature only marginally. These results suggest that the key strategy for reducing X-ray-induced heating is to improve heat transfer to the surroundings.
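
To make the competition between the three loss channels concrete, the sketch below works through a lumped steady-state energy balance of the kind the abstract describes: absorbed X-ray power is balanced against conduction to the substrate, convection to air, and (linearized) thermal radiation. All numerical values here (geometry, absorbed power, heat-transfer coefficients, emissivity) are illustrative assumptions, not the reference parameters of the study.

```python
import math

# Hypothetical nanowire geometry: length 3 um, diameter 200 nm.
L = 3e-6                            # m, nanowire length (assumed)
d = 200e-9                          # m, nanowire diameter (assumed)
A_side = math.pi * d * L            # lateral surface area exposed to air
A_contact = 0.1 * A_side            # assumed nanowire-substrate contact area

P_abs = 1e-6                        # W, assumed absorbed X-ray power

# Heat-loss coefficients (rough, assumed values):
h_sub = 1e6                         # W m^-2 K^-1, interfacial conductance to substrate
h_conv = 1e4                        # W m^-2 K^-1, effective convection to air
eps = 0.5                           # emissivity (assumed)
sigma = 5.670e-8                    # W m^-2 K^-4, Stefan-Boltzmann constant
T0 = 293.0                          # K, ambient temperature

# Radiation linearized around ambient: h_rad = 4 * eps * sigma * T0^3.
h_rad = 4.0 * eps * sigma * T0**3

# Steady state: P_abs = G_total * dT, with one conductance per channel.
G_sub = h_sub * A_contact
G_conv = h_conv * A_side
G_rad = h_rad * A_side
G_total = G_sub + G_conv + G_rad

dT = P_abs / G_total

for name, G in [("substrate", G_sub), ("convection", G_conv), ("radiation", G_rad)]:
    print(f"{name:10s}: {100 * G / G_total:6.2f} % of heat loss")
print(f"temperature rise: {dT:.2f} K")
```

With these assumed numbers the substrate carries roughly 90% of the heat, convection to air about 10%, and radiation a negligible fraction, which mirrors the ranking of channels reported in the abstract; setting `G_sub = 0` (no substrate contact) makes the temperature rise jump by an order of magnitude, the same qualitative effect as the 8 K versus 55 K comparison above.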
Source: Journal of Synchrotron Radiation | Category: Physics | Tags: heating, radiation damage, simulation, nanostructures