Roger Penrose proposed that a spatial quantum superposition collapses as a back-reaction from spacetime, which is curved differently by each branch of the superposition; in this sense, one speaks of gravity-related wave function collapse. He also provided a heuristic formula for the decay time of the superposition, similar to one suggested earlier by Lajos Diósi, hence the name Diósi–Penrose model. The collapse depends on the effective size of the mass density of the particles in the superposition, and it is random: this randomness shows up as a diffusion of the particles' motion, which, for charged particles, results in the emission of radiation. Here, we compute the radiation emission rate, which is faint but detectable. We then report the results of a dedicated experiment at the Gran Sasso underground laboratory to measure this radiation emission rate. Our result sets a lower bound on the effective size of the mass density of nuclei that is about three orders of magnitude larger than previous bounds. This rules out the natural parameter-free version of the Diósi–Penrose model.
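The heuristic decay-time formula mentioned above can be sketched numerically. A minimal sketch, assuming the standard order-of-magnitude form of the Diósi–Penrose collapse time, tau ~ hbar / Delta_E_G, with Delta_E_G the gravitational self-energy of the difference between the two branches' mass densities; the single-nucleon scaling Delta_E_G ~ G m^2 / R0 and the atomic-scale choice R0 = 1e-10 m are illustrative assumptions, not the paper's actual computation:

```python
# Order-of-magnitude estimate of the Diósi–Penrose (DP) collapse time
# tau ~ hbar / Delta_E_G. For a single nucleon of mass m held in a
# superposition separated by much more than the effective size R0 of
# its mass density, Delta_E_G ~ G * m**2 / R0 up to an order-one,
# convention-dependent prefactor (illustrative assumption).

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # Newton's gravitational constant, m^3 kg^-1 s^-2
M_NUCLEON = 1.67262e-27  # proton mass, kg

def dp_collapse_time(mass_kg: float, r0_m: float) -> float:
    """Order-of-magnitude DP collapse time for a well-separated superposition."""
    delta_e_g = G * mass_kg**2 / r0_m  # gravitational self-energy scale, J
    return HBAR / delta_e_g            # collapse time, s

# With an atomic-scale R0 (~1e-10 m, illustrative), a single nucleon
# would take far longer than the age of the universe to collapse; DP
# effects become appreciable only for large aggregates of matter.
tau = dp_collapse_time(M_NUCLEON, 1e-10)
print(f"tau ~ {tau:.1e} s")
```

The extreme slowness of single-particle collapse is why the experiment instead looks for the faint radiation emitted by many charged particles undergoing collapse-induced diffusion, rather than for the collapse of a prepared superposition directly.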
Submitted by: NOVA Admin, 10-09-2020 13:11, Type of article: Paper