Yesterday I watched a documentary on Netflix, and now I feel really bad whenever I look at meat and dairy products. Would it be bad for me to give them up?
If you just give it up without thinking it through, yes.
Of course you have to make sure you keep eating a balanced diet and, if necessary, supplement any missing nutrients.
There is plenty of research out there now if you really want to do it.
No, not at all; if anything, it can even be healthier. If you want to do it long term, I would advise you to talk to a doctor or a nutritionist.
:)