The total estimated wattage produced by radioactive decay in the Earth is 2e13 watts. The whole problem here is that you are assuming that, in order to produce the 2e13 watts generated by radioactive decay, the interior of the Earth must be more radioactive than the crust of the Earth is on average. This is not the case.

For instance, it is estimated that the crust contains ~2 parts per million of uranium by mass. If we assume this holds for the entirety of the Earth, that gives us ~1.2e19 kg of uranium. At roughly 2.5e24 uranium atoms per kg, this gives us ~3e43 uranium atoms in the Earth total. Given the half-life of U-238, the most abundant isotope, of 4.468e9 years, we get a decay constant of ~4.9e-18/sec. At this decay rate, and starting with 3e43 uranium atoms, ~1.5e26 atoms decay each second. Each decay releases ~6.9e-13 joules of energy, so this equates to ~1e14 joules/sec, or 1e14 watts. This is 5 times the 2e13 watts needed.

In essence, what this means is that the interior of the Earth is, if anything, on average less radioactive than the crust. Of course this is just a rough estimate, but it is close enough in order of magnitude to show that lava expelled by volcanoes does not need to be more radioactive than surface rock in order to account for the estimated heat output from radioactive decay within the Earth.
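For anyone who wants to check the arithmetic, here is a minimal sketch of the same back-of-envelope calculation. The Earth mass (~5.97e24 kg), Avogadro's number, and the U-238 molar mass are standard values I am supplying; the 2 ppm uranium fraction, the 4.468e9 year half-life, and the ~6.9e-13 J per decay come from the estimate above.

```python
import math

# Assumed standard constants (not stated in the answer itself)
EARTH_MASS_KG = 5.97e24          # total mass of the Earth
AVOGADRO = 6.022e23              # atoms per mole
U238_MOLAR_MASS_KG = 0.238       # kg per mole of U-238

# Inputs taken from the estimate above
U_FRACTION = 2e-6                # ~2 ppm uranium by mass
HALF_LIFE_S = 4.468e9 * 3.156e7  # 4.468e9 years converted to seconds
ENERGY_PER_DECAY_J = 6.9e-13     # ~4.3 MeV released per U-238 alpha decay

uranium_mass = EARTH_MASS_KG * U_FRACTION                 # ~1.2e19 kg
atoms = uranium_mass / U238_MOLAR_MASS_KG * AVOGADRO      # ~3e43 atoms
decay_constant = math.log(2) / HALF_LIFE_S                # ~4.9e-18 per second
decays_per_second = atoms * decay_constant                # ~1.5e26 per second
power_watts = decays_per_second * ENERGY_PER_DECAY_J      # ~1e14 W

print(f"uranium mass:      {uranium_mass:.2e} kg")
print(f"uranium atoms:     {atoms:.2e}")
print(f"decay constant:    {decay_constant:.2e} /s")
print(f"decays per second: {decays_per_second:.2e}")
print(f"power:             {power_watts:.2e} W  (vs 2e13 W geothermal)")
```

Running this reproduces the numbers in the text: roughly 1e14 watts, about five times the 2e13 watt geothermal figure, even before counting the rest of the U-238 decay chain or other heat-producing isotopes like thorium and potassium-40.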