Yes. According to our Declaration of Independence, our rights don't come from any government. We have INALIENABLE RIGHTS which are granted to us by our Creator. It is the job of governments to protect those rights.
You can argue whether control over our own bodies is one of those inalienable rights. But it should be clear that the government cannot legitimately take rights away. Or at least, that this is the American idea.
This is the Ninth Amendment to the Constitution:
"The enumeration in the Constitution, of certain rights, shall not be construed to deny or disparage others retained by the people."