Deepika

Class 11th
Physics
2 years ago

It is claimed that two cesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about 0.02 s. What does this imply for the accuracy of the standard cesium clock in measuring a time-interval of 1 s ?

Abhishek Mishra

2 years ago

Error in 100 years = 0.02 s

Error in 1 s = 0.02 / (100 × 365¼ × 24 × 60 × 60) = 0.02 / (3.156 × 10⁹) ≈ 6.3 × 10⁻¹²

Hence the accuracy of the standard caesium clock in measuring a time interval of 1 s is about 1 part in 10¹¹ (between 10⁻¹¹ and 10⁻¹²).
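A quick way to check this arithmetic is to compute the fractional error directly. The sketch below does so in Python, assuming a Julian year of 365¼ days; the variable names are illustrative, not part of the original solution.

```python
# Sketch: fractional accuracy of a caesium clock that drifts
# at most 0.02 s over 100 years of undisturbed running.

SECONDS_PER_YEAR = 365.25 * 24 * 60 * 60   # Julian year of 365¼ days

drift = 0.02                      # total drift over 100 years, in seconds
interval = 100 * SECONDS_PER_YEAR # 100 years expressed in seconds

error_per_second = drift / interval
print(f"100 years        = {interval:.3e} s")         # ≈ 3.156e+09 s
print(f"error per second = {error_per_second:.1e}")   # ≈ 6.3e-12
```

Running it confirms that the clock is accurate to roughly 6.3 × 10⁻¹² per second, i.e. about 1 part in 10¹¹.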
