Pipixue (皮皮学), free question search
[Short-answer question]
Passage One

Questions 46 to 50 are based on the following passage.

In the beginning of the movie I, Robot, a robot has to decide whom to save after two cars plunge into the water: Del Spooner or a child. Even though Spooner screams "Save her! Save her!" the robot rescues him because it calculates that he has a 45 percent chance of survival compared to Sarah's 11 percent. The robot's decision and its calculated approach raise an important question: would humans make the same choice? And which choice would we want our robotic counterparts to make?

Isaac Asimov evaded the whole notion of morality in devising his three laws of robotics, which hold that: 1. Robots cannot harm humans or allow humans to come to harm; 2. Robots must obey humans, except where the order would conflict with law 1; and 3. Robots must act in self-preservation, unless doing so conflicts with laws 1 or 2. These laws are programmed into Asimov's robots; they don't have to think, judge, or value. They don't have to like humans or believe that hurting them is wrong or bad. They simply don't do it.

The robot who rescues Spooner's life in I, Robot follows Asimov's zeroth law: robots cannot harm humanity (as opposed to individual humans) or allow humanity to come to harm. This is an expansion of the first law that allows robots to determine what's in the greater good. Under the first law, a robot could not harm a dangerous gunman, but under the zeroth law, a robot could kill the gunman to save others.

Whether it's possible to program a robot with safeguards such as Asimov's laws is debatable. A word such as "harm" is vague (what about emotional harm? Is replacing a human employee harm?), and abstract concepts present coding problems. The robots in Asimov's fiction expose complications and loopholes in the three laws, and even when the laws work, robots still have to assess situations.

Assessing situations can be complicated. A robot has to identify the players, conditions, and possible outcomes for various scenarios. It's doubtful that a computer program can do that, at least not without some undesirable results. A roboticist at the Bristol Robotics Laboratory programmed a robot to save human proxies called "H-bots" from danger. When one H-bot headed for danger, the robot successfully pushed it out of the way. But when two H-bots became imperiled, the robot choked 42 percent of the time, unable to decide which to save and letting them both "die." The experiment highlights the importance of morality: without it, how can a robot decide whom to save or what's best for humanity, especially if it can't calculate survival odds?

46. What question does the example in the movie raise?
A) Whether robots can reach better decisions.
B) Whether robots follow Asimov's zeroth law.
C) How robots may make bad judgments.
D) How robots should be programmed.

47. What does the author think of Asimov's three laws of robotics?
A) They are apparently divorced from reality.
B) They did not follow the coding system of robotics.
C) They laid a solid foundation for robotics.
D) They did not take moral issues into consideration.

48. What does the author say about Asimov's robots?
A) They know what is good or bad for human beings.
B) They are programmed not to hurt human beings.
C) They perform duties in their owners' best interest.
D) They stop working when a moral issue is involved.

49. What does the author want to say by mentioning the word "harm" in Asimov's laws?
A) Abstract concepts are hard to program.
B) It is hard for robots to make decisions.
C) Robots may do harm in certain situations.
D) Asimov's laws use too many vague terms.

50. What has the roboticist at the Bristol Robotics Laboratory found in his experiment?
A) Robots can be made as intelligent as human beings some day.
B) Robots can have moral issues encoded into their programs.
C) Robots can have trouble making decisions in complex scenarios.
D) Robots can be programmed to perceive potential perils.
Related questions
[Multiple-answer question] For a patient with impaired liver function, which of the following measures should appropriately be taken when using drugs?
A. Increase the frequency of administration
B. Increase the drug dose
C. Prolong the dosing interval
D. Shorten the dosing interval
E. Avoid drugs metabolized by the liver
[Short-answer question] A general construction contractor undertook a building foundation-pit project with an excavation depth of 12 m; a six-story existing residential building stood 6 m from the south edge of the pit. The foundation-pit work was a specialist subcontract. During construction, the following events occurred. Event 1: to promote its corporate image, the general contractor erected a flagpole flying the company flag on the open ground in front of the site office; the flagpole was welded to embedded parts in its base. Event 2: to ensure construction safety, the general contractor assigned an experienced staff member to the project as safety director. The project management department established a construction safety management organization, staffed with the safety...
[Single-answer question] The nature of the lesion in lobar pneumonia is
A. Fibrinous inflammation
B. Hypersensitivity (allergic) inflammation
C. Suppurative (purulent) inflammation
D. Serous inflammation
[Multiple-answer question] When selecting electrical equipment, it must be checked under short-circuit conditions for ( )
A. Dynamic stability
B. Thermal stability
C. Maximum permissible active power
D. Maximum permissible reactive power
[Multiple-answer question] When applying systems theory to guide their work, nurses should understand that ( ).
A. Systems have no boundaries
B. The human being is a closed system
C. The body's systems interact with one another
D. A system as a whole is the simple sum of its component parts
E. Changes in the external environment can affect a person's overall functioning