i've been thinking about how people can draw such different conclusions from the same information, with respect to vaccines, and i think our previously held beliefs explain most of the difference in opinion.
i can't even estimate how many research articles i've read (i try to restrict myself to original sources of info). too many. and i haven't kept a spreadsheet of results or anything, so i have to remember all of my reasoning and conclusions, and i'm sure there's a bias filter going on in my brain, as there is in everyone else's. so when i come to the conclusion that vaccinating is a pretty great thing, for the most part, with only a few i'm interested in skipping or delaying, that has a lot to do with the fact that i believe most scientists are honest (wakefield being one of the exceptions!), and that the CDC isn't on a mass campaign to poison our children and cover it up. my science background (BS in physics, MA in science journalism) certainly plays into it too.
but if i were a person who distrusted government more, and had an initial bias away from science, i'm sure i could easily come to a different conclusion. it's so hard to control our own biases. one thing i always try to do is be on the lookout for how much i "like" the results of a scientific paper--how much they agree with my previously held beliefs. if they're results i like, i make myself pretend they're the opposite of what i want to hear: how hard would i be on the paper then? and likewise, if i don't like the paper, how would i analyze it if i did? i usually find that my first instinct is to go easier on papers whose conclusions agree with my beliefs, and harder on those that don't. forcing myself to do the opposite has been immensely helpful.