"So a supernova with a measured redshift z = 0.5 implies the universe was 1/(1+0.5) = 2/3 of its present size when the supernova exploded. In an accelerating universe, the universe was expanding more slowly in the past than it is today, which means it took a longer time to expand from 2/3 to 1.0 times its present size compared to a non-accelerating universe. This results in a larger light-travel time, larger distance and fainter supernovae, which corresponds to the actual observations. Riess found that "the distances of the high-redshift SNe Ia were, on average, 10% to 15% farther than expected in a low mass density \Omega_M = 0.2 universe without a cosmological constant".[12] This means that the measured high-redshift distances were too large, compared to nearby ones, for a decelerating universe.[13]"
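The scale-factor arithmetic in the quoted passage can be sketched in a few lines; this is a hypothetical illustration of the standard relation a = 1/(1+z), not code from the quoted source or the author:

```python
# Minimal sketch (assumed, for illustration): the relation between a measured
# redshift z and the scale factor a = 1/(1+z), i.e. the size of the universe
# at emission as a fraction of its present size.

def scale_factor(z: float) -> float:
    """Return the universe's size at emission relative to today for redshift z."""
    return 1.0 / (1.0 + z)

if __name__ == "__main__":
    z = 0.5  # the supernova redshift used in the quoted example
    a = scale_factor(z)
    print(f"z = {z}: universe was {a:.4f} (i.e. 2/3) of its present size")
```

For z = 0.5 this reproduces the quoted figure of 2/3 of the present size.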

Well, with the large absolute expansion speed that my work has proved exists through aberration, I am only 2.5% above Riess's estimated range. However, when one considers that part of this excess would be absorbed by what is already known about our motion relative to the cosmic microwave background radiation, I would probably fall right within Riess's range!

Furthermore, the much lower speed of light currently accepted - i.e. the relative speed - is conducive to producing the illusory artefact of a present-day acceleration, because the light-years travelled outwards would not appear linear!

In conclusion, one must ask oneself: is it reasonable to embark on the ridiculously incredible when the answer is right in front of us, logically developed from stellar aberration and the Michelson-Morley experiment? I think not!

Frank Pio Russo.