Timeline for answer to In nuclear physics, what length year in seconds is used? by user189728

Current License: CC BY-SA 3.0

Post Revisions

6 events
when | toggle | format | what | by | license | comment
Apr 25, 2018 at 18:53 comment added anon @JEB After $10^9$ years of 20 minutes' error each, you could expect ~40,000 years of error maximum. I think you got it about right. (or possibly three orders of magnitude off; I can't read scientific notation)
Apr 24, 2018 at 21:31 comment added JEB @Walt I was winging it, so let's see. TwitchyKid said he had a 20 minute uncertainty (peak-to-peak) in his year definition, which is 1 part in 3, in 24, in 365, or 1 part in 26,000. Multiply that by 4.5 billion and I get 0.0002 billion. Which is not what I said. I'd better check my .python_history file.
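The arithmetic in the comment above can be sketched in a few lines of Python. This is a reconstruction of the reasoning as stated (20 minutes as a fraction of a 365-day year, scaled to the U-238 half-life); the variable names and the 365-day year are assumptions, not anything from JEB's actual `.python_history`:

```python
# Back-of-envelope check of the "1 part in 26,000" claim:
# 20 minutes is 1/3 of an hour, 1/(3*24) of a day, 1/(3*24*365) of a year.
minutes_uncertainty = 20.0
year_in_minutes = 60.0 * 24.0 * 365.0        # 525,600 min in a 365-day year
frac = minutes_uncertainty / year_in_minutes  # ~1/26,280, i.e. ~3.8e-5

u238_half_life = 4.468e9                      # years, per the value cited below
systematic = frac * u238_half_life            # ~1.7e5 years, i.e. ~0.0002e9

print(f"fractional error: 1 part in {1/frac:,.0f}")
print(f"systematic error: {systematic:.2e} years = {systematic/1e9:.4f} x 10^9 years")
```

This lands at roughly 0.0002 billion years, consistent with the corrected figure in the comment and about an order of magnitude below the quoted ±0.005 × 10⁹ year measurement uncertainty on the half-life.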
Apr 24, 2018 at 19:49 comment added Tin Wizard @JEB Could you share your math for that? My attempts to replicate it aren't quite matching up.
Apr 24, 2018 at 3:43 vote accept RocketTwitch (accept toggled again Apr 24, 2018 at 16:16)
Apr 24, 2018 at 3:40 comment added JEB U-238 is listed as $(4.468 \pm 0.005) \times 10^9$ years, while any systematic error from the definition of years should top out at $\pm 0.0004 \times 10^9$ years.
Apr 24, 2018 at 3:30 history answered user189728 CC BY-SA 3.0