We study finite temperature and density effects on beta decay rates to compute their contributions to nucleosynthesis. QED-type corrections to beta decay from the hot and dense background are estimated in terms of the statistical corrections to the self-mass of the electron. For this purpose, we re-examine the hot and dense background contributions to the electron mass and compute their effect on the beta decay rate, the helium yield, the energy density of the universe, and the change in neutrino temperature arising from the first-order contribution to the electron self-mass during these processes. We explicitly show that the thermal contribution to the helium abundance at T = m in a cooling universe (0.045 percent) is higher than the corresponding contribution in a heating universe (0.031 percent), because hot fermions are abundant in the early universe before the onset of nucleosynthesis and absent after it. The thermal contribution to the helium abundance is a simple quadratic function of temperature both before and after nucleosynthesis. However, this quadratic behavior is not the same before neutrino decoupling because of weak interactions; nucleosynthesis therefore could not begin until the universe had cooled to the neutrino decoupling temperature and QED became the dominant theory in the presence of a high concentration of charged fermions. We also show explicitly that the chemical potential in the cores of supermassive and superdense stars affects the beta decay rate and the helium abundance, but the background contributions depend on the ratio of temperature to chemical potential rather than on either quantity alone. We calculate the hot and dense background contributions for m = T = μ. Temperature is found to act as a regulating parameter in extremely dense systems; therefore, for such systems the temperature has to be large enough to yield the expected helium production in stellar cores.
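As an illustration of the quadratic temperature dependence quoted above, the standard low-temperature limit of the thermal correction to the electron self-mass in finite-temperature QED (valid for T much smaller than m and vanishing chemical potential, quoted here from the literature rather than from the full calculation summarized in this abstract) has the form
\[
\frac{\delta m}{m} \;\simeq\; \frac{\alpha\pi}{3}\,\frac{T^{2}}{m^{2}}, \qquad T \ll m,\ \mu = 0 ,
\]
where \(\alpha\) is the fine-structure constant. The corresponding shifts in the beta decay rate and the helium yield inherit this quadratic dependence on T through the self-mass correction; the behavior near T = m and at finite \(\mu\) involves additional terms and is the subject of the calculation described here.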