The vast majority of Americans believe that vaccines helped rid the world of deadly diseases. Is that really the case? Is it possible that these diseases were already in decline by the time the vaccines were introduced?
Just a little food for thought.
Labels: Health