I don’t know if this counts as a right, but recently I’ve experienced a disparity in medical care. Specifically, I’ve seen diseases that are prevalent in women, and have been known about for over 50 years, yet there has been limited research and investment in finding cures; instead, women are just blamed for getting sick with them.
Also, when I’ve gone to doctors with complaints, many times I’ve been waved off and my concerns ignored, even when I was actually quite sick.
I don’t know if quality medical care is considered a right, but I think health might be.