I really liked Hart of Dixie. It had that cozy, "everything is right with the world" feeling, and you really grew attached to the characters. Now I'm looking for a series that's similar. I've already read about "Gilmore Girls", but that's probably more of a show for women.
What can you recommend? Ideally something that's also available on Netflix.
Not sure if it's to your taste, but take a look at this:
Heartland (German title: "Paradies für Pferde")