1. A robot may not, by either direct action or inaction, allow a human to come to harm.
2. A robot must obey any command given to it by a human, unless that command conflicts with the first law.
3. A robot must not allow itself to come to harm, except where the first two laws take precedence.
Laws One and Three are in conflict right now.
I put the "I" in Christ.