I am from New Orleans and went to school in Mississippi. When I took health, the reproductive system was discussed for maybe two class periods, and we barely spent any time talking about sex. It was basically: “You already know what sex is, don’t do it because you are too young, wait until you get married.” Just what you would expect from the South. We briefly touched on STDs, but the coverage was minimal, and to be honest, I still didn’t know much about STDs until I went off to college and did the research myself. Sex was also framed as something scary and bad. No one ever said, “Sex is great, but here’s what you should know.” So I’d say I definitely came away viewing it as a scary thing and a major deal. (Not that I think you shouldn’t be sure and whatnot, but I felt like they made it seem awful, like it’s the biggest deal in the world and you will regret it.)
As far as how it impacted me: although I grew up in the Deep South, my mom is from the North, so I grew up in a very liberal household. I knew about contraception and was never told that I had to wait until marriage to have sex; instead, I was taught that it should be my decision, that it was very important to use contraception, and so on. Still, in terms of how much I really knew about sex, I did not know a lot of what I probably should have known by the time I went to college. Of course I could have asked my parents, and we are pretty close, but that’s still not a comfortable conversation to have with them when you are in high school. So I definitely believe that proper sex education is important, because I clearly didn’t get it in the South, and I can guarantee it still isn’t happening down there.