While waiting in the doctor's office today, I picked up a Newsweek magazine. Not too sure of the date on it, but when I got to the last page, these words popped up and hit me in the face! The title was "Oral Sex Can Cause Mouth Cancer." I didn't get to read the whole article, but there are at least 3 guys on this board that had better be a little careful. Or is it really like I was taught? Oral sex means just talking about it?