No. Patent Title Application No. Filing Date Publication No. Publication Date Inventors
121 ROBOT CONTROL SYSTEM AND A PROGRAM EP16202954.0 2016-12-08 EP3258336A1 2017-12-20 Mizutani, Ryota; Thapliya, Roshan; Komatsuzaki, Kazunari

A robot control system includes: plural robots; a receiving unit that receives a robot dispatch request from a user; a sensor that detects a state of the user who performs the robot dispatch request; and a controller that determines priority on the robot dispatch request based on the state of the user detected by the sensor, selects, in a case where plural robot dispatch requests are received, a robot to be dispatched among the plural robots in order of the priority on the robot dispatch requests, and dispatches the robot.
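The dispatch logic described above can be sketched as a priority queue over pending requests. The state-to-priority mapping, class names, and robot identifiers below are illustrative assumptions, not taken from the patent:

```python
import heapq
import itertools

# Illustrative priority scores for detected user states (assumed, not from the patent).
STATE_PRIORITY = {"emergency": 0, "standing": 1, "seated": 2, "idle": 3}

class DispatchQueue:
    """Orders robot dispatch requests by the priority of the detected user state."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker keeps FIFO order within a priority

    def add_request(self, user_id, user_state):
        priority = STATE_PRIORITY.get(user_state, len(STATE_PRIORITY))
        heapq.heappush(self._heap, (priority, next(self._counter), user_id))

    def dispatch(self, idle_robots):
        """Assign idle robots to the highest-priority pending requests."""
        assignments = []
        while self._heap and idle_robots:
            _, _, user_id = heapq.heappop(self._heap)
            assignments.append((idle_robots.pop(0), user_id))
        return assignments

queue = DispatchQueue()
queue.add_request("alice", "seated")
queue.add_request("bob", "emergency")
print(queue.dispatch(["robot-1"]))  # the emergency request is served first
```

When fewer robots are idle than requests are pending, the remaining requests simply stay queued for the next dispatch round.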

122 ROBOT CONTROL SYSTEM EP16186790.8 2016-09-01 EP3219447A1 2017-09-20 THAPLIYA, Roshan; DUNSTAN, Belinda

A robot control system (10) includes a humanoid conversation robot (210, 230) that has a conversation with a user (50), at least one service execution robot (240, 260) that provides a service to the user, a recognition unit (222) that recognizes a request and an emotion of the user through the conversation between the user and the humanoid conversation robot, and a determination unit (407, 408) that determines a service which is to be provided to the user and a service execution robot which is to execute the service among the at least one service execution robot according to the request and the emotion of the user. The service execution robot determined by the determination unit executes the determined service.
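The determination step, mapping a recognized request and emotion to a service and an executing robot, could be sketched as a simple rule table. All requests, emotions, services, and robot names below are assumed examples for illustration:

```python
# Illustrative rule table: (request, emotion) -> (service, executing robot).
# All keys and values are assumed examples, not taken from the patent.
SERVICE_RULES = {
    ("bring drink", "tired"):     ("serve_coffee", "transport_robot"),
    ("bring drink", "neutral"):   ("serve_water", "transport_robot"),
    ("show document", "hurried"): ("display_summary", "display_robot"),
}

def determine_service(request, emotion):
    """Pick a service and a service-execution robot from the user's request and emotion."""
    default = ("ask_clarification", "conversation_robot")
    return SERVICE_RULES.get((request, emotion), default)

print(determine_service("bring drink", "tired"))  # ('serve_coffee', 'transport_robot')
```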

123 MEDICAL HOLDING ARM WITH RING-SHAPED LED DISPLAY MEANS EP15180826.8 2015-08-12 EP3130305A1 2017-02-15 NOWATSCHIN, Stephan; KRINNINGER, Maximilian; GIERLACH, Dominikus

The invention relates to a holding device (1), in particular a holding arm and/or stand, for medical purposes, having a proximal end (2) for attaching the holding device (1) to a base and a distal end (4) for receiving an attachment device (6); at least a first and a second arm segment (12, 14), the first arm segment (12) being connected to a first joint (13) and the second arm segment (14) to a second joint (15), each joint (13, 15) being releasable and lockable; an operating device (50) for releasing and/or locking the respective joint (13, 15) in order to bring the holding device (1) into a desired pose; and a first display unit (34, 100, 200, 250) arranged on the first joint (13) and a second display unit (36, 100, 200, 252) arranged on the second joint (15). According to the invention, the first and/or second display unit (34, 36, 100, 200, 250, 252) is configured to indicate at least one status of the holding device (1) and/or of an attachment device (6) other than the releasing and/or locking of the respective joint (13, 15). The invention further relates to a corresponding method.

124 CONTROL SYSTEM, METHOD AND DEVICE OF INTELLIGENT ROBOT BASED ON ARTIFICIAL INTELLIGENCE EP15199520.6 2015-12-11 EP3109856A1 2016-12-28 GE, Xingfei; WU, Hua; LI, Jialin; XU, Qian; WANG, Haifeng; JING, Kun; SUN, Wenyu; WU, Tian; GUAN, Daisong

The present disclosure provides a control system, a control method and a control device of an intelligent robot based on artificial intelligence. The system includes: a decision engine, disposed on the intelligent robot, and configured to generate cloud processing information according to a multimodal input signal, and to send the cloud processing information; and a cloud control center, configured to receive the cloud processing information, to obtain a user demand by analyzing the cloud processing information, and to return the user demand, such that the decision engine controls the intelligent robot according to at least one of the user demand and the multimodal input signal. The control system may make full use of the wealth of online information, enhance the intelligent robot's capabilities for storage, calculation and complex decision processing, respond to the user's instructions in a timely, rapid and intelligent manner, and improve the user experience.
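The on-robot decision engine and cloud control center described above exchange one round trip per multimodal input. A minimal sketch of that flow, with all class and field names assumed rather than taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class CloudProcessingInfo:
    """Payload the on-robot decision engine forwards to the cloud (fields assumed)."""
    speech_text: str
    image_labels: list

class CloudControlCenter:
    def analyze(self, info: CloudProcessingInfo) -> str:
        # Stand-in for the cloud-side analysis that extracts the user demand.
        if "weather" in info.speech_text:
            return "report_weather"
        return "unknown_demand"

class DecisionEngine:
    def __init__(self, cloud: CloudControlCenter):
        self.cloud = cloud

    def handle_input(self, speech_text, image_labels):
        info = CloudProcessingInfo(speech_text, image_labels)
        demand = self.cloud.analyze(info)  # round trip to the cloud control center
        return f"executing:{demand}"       # local control decision on the robot

engine = DecisionEngine(CloudControlCenter())
print(engine.handle_input("what is the weather", []))  # executing:report_weather
```

The split mirrors the claim: heavy analysis lives in the cloud, while the engine on the robot stays responsible for the final control decision.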

125 USER INTERFACES FOR ROBOT TRAINING EP13740399.4 2013-06-21 EP2864085B1 2016-11-30 CHEN, Elaine, Y.; BROOKS, Rodney; BLUMBERG, Bruce; DYE, Noelle; CAINE, Michael; SUSSMAN, Michael; LINDER, Natan; LONG, Paula; BUEHLER, Christopher, J.; WILLIAMSON, Matthew, M.; ROMANO, Joseph, M.; GOODWIN, William, A.
126 COMMUNICATION DRAW-IN SYSTEM, COMMUNICATION DRAW-IN METHOD, AND COMMUNICATION DRAW-IN PROGRAM EP12867118 2012-11-15 EP2810748A4 2016-09-07 ISHIGURO SHIN
A communication draw-in system that enables robot-human communication to start smoothly is provided. The system is provided in a robot that communicates with a target human, and includes: a human specifying unit 200 for specifying a position of the target human; a light source control unit 201 for moving light toward the specified position of the target human; a draw-in control unit 203 for instructing the robot to perform a draw-in operation for making the target human recognize a direction of the robot; and a human recognition specifying unit 204 for determining whether or not the target human has recognized the robot, wherein the robot is instructed to start communicating with the target human in the case where the target human is determined to have recognized the robot.
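The four units above run as an ordered sequence ending in a recognition check. The sketch below wires that sequence against a stub robot; every method name on the robot is an assumed API, not taken from the patent:

```python
class StubRobot:
    """Minimal stand-in implementing the four units as methods (assumed API)."""
    def specify_position(self, human): return (1.0, 2.0)
    def move_light_toward(self, pos): pass
    def perform_draw_in_operation(self): pass
    def has_recognized(self, human): return True
    def start_communication(self, human): self.talking_to = human

def draw_in_sequence(robot, target_human):
    """Run the draw-in steps in the order the system describes."""
    position = robot.specify_position(target_human)   # human specifying unit 200
    robot.move_light_toward(position)                 # light source control unit 201
    robot.perform_draw_in_operation()                 # draw-in control unit 203
    if robot.has_recognized(target_human):            # human recognition specifying unit 204
        robot.start_communication(target_human)
        return True
    return False

print(draw_in_sequence(StubRobot(), "guest"))  # True
```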
127 APPARATUS AND METHODS FOR PROVIDING A PERSISTENT COMPANION DEVICE EP14767391.7 2014-03-13 EP2974273A1 2016-01-20 BREAZEAL, Cynthia
A method includes providing a telecommunications enabled robotic device adapted to persist in an environment of a user, receiving an instruction to photograph one or more persons in the environment according to a time parameter and photographing the one or more persons in accordance with the time parameter resulting in one or more photographs.
128 Medical tele-robotic system EP10163079.6 2003-07-25 EP2214111A3 2013-10-30 Wang, Yulun; Laby, Keith; Jordan, Charles; Butner, Steven; Southard, Jonathan

A remote controlled robotic system that is coupled to a broadband network, comprising:

a first remote control station coupled to the network;

a base station coupled to said remote control station through the broadband network; and,

a mobile robot wirelessly coupled to said base station and controlled through said first remote station, said mobile robot having a robot camera and a robot monitor, said mobile robot transmits an existing video image of a patient and a pre-existing image of the patient to said first remote control station, said first remote station displays said existing video image of the patient and said pre-existing image of the patient to make a side-by-side comparison of the patient.

129 HUMANOID ROBOT WITH A NATURAL DIALOGUE INTERFACE, METHOD FOR CONTROLLING THE ROBOT, AND CORRESPONDING PROGRAM EP11730675.3 2011-07-11 EP2596493A1 2013-05-29 MAISONNIER, Bruno; MONCEAUX, Jérôme
The invention relates to a humanoid robot equipped with an interface for natural dialogue with an interlocutor. In the prior art, the methods of dialogue between humanoid robots equipped, moreover, with developed movement functionalities and human beings are limited particularly by the capabilities of voice and visual recognition processing with which said robots can be fitted. The present invention equips said robots with capabilities for removing doubt from several methods of communication for the messages which they receive and for combining these different methods, which allow a great improvement in the quality and the natural character of the dialogues with those with whom the robots are speaking. The invention likewise provides simple and user-friendly means for implementing the programming of the functions which allow the free flow of these dialogues using multiple methods to be ensured.
130 TELEPRESENCE ROBOT, TELEPRESENCE SYSTEM COMPRISING THE SAME AND METHOD FOR CONTROLLING THE SAME EP10847544.3 2010-08-19 EP2544865A1 2013-01-16 CHOI, Mun-Taek; KIM, MunSang; PARK, InJun; KIM, Chang Gu; YOO, Jin Hwan; LEE, YoungHo; HWANG, Juk Kyu; SHINN, Richard H.
A telepresence robot may include a manual navigation unit configured to move the telepresence robot according to navigation information received from a user device; an autonomous navigation unit configured to detect environment of the telepresence robot and control the movement of the telepresence robot using the detected result; a motion control unit comprising a database related to at least one motion, the motion control unit configured to receive selection information on the motion of the database and actuate the telepresence robot according to the selection information; and an output unit configured to receive expression information of a user from the user device and output the expression information. The telepresence robot may be applied to various fields such as language education by a native speaking teacher, medical diagnoses, teleconferences, or remote factory tours.
131 MEDICAL TELE-ROBOTIC SYSTEM EP03771847 2003-07-25 EP1573406A4 2010-07-28 WANG YULUN; LABY KEITH PHILLIP; JORDAN CHARLES S; BUTNER STEVEN EDWARD; SOUTHARD JONATHAN
A remote controlled robotic system that is coupled to a broadband network, comprising: a first remote control station coupled to the network; a base station coupled to said remote control station through the broadband network; and, a mobile robot wirelessly coupled to said base station and controlled through said first remote station, said mobile robot having a robot camera and a robot monitor, said mobile robot transmits an existing video image of a patient and a pre-existing image of the patient to said first remote control station, said first remote station displays said existing video image of the patient and said pre-existing image of the patient to make a side-by-side comparison of the patient.
132 ROBOT APPARATUS, METHOD OF CONTROLLING ROBOT APPARATUS, METHOD OF DISPLAY, AND MEDIUM EP99943284.2 1999-09-10 EP1059147A1 2000-12-13 SABE, Kotaro Sony Corporation; FUJITA, Masahiro Sony Corporation

A CPU 15 determines an output of a feeling model based on signals supplied from a touch sensor 20. The CPU 15 also determines whether or not an output value of the feeling model exceeds a pre-set threshold value. If the CPU finds that the output value exceeds the pre-set threshold value, it verifies whether or not there is any vacant area in a memory card 13. If the CPU finds that there is a vacant area in the memory card 13, it causes the picture data captured from the CCD video camera 11 to be stored in the vacant area in the memory card 13. At this time, the CPU 15 causes the time and date data and the feeling parameter to be stored in the memory card 13 in association with the picture data. The CPU 15 also re-arranges the picture data stored in the memory card 13 in the sequence of the decreasing magnitude of the feeling model output.
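The storage behavior described for the CPU 15 (capture a picture when the feeling output crosses a threshold, store it only if the card has room, and keep pictures sorted by feeling magnitude) can be sketched as follows. The threshold value, card capacity, and record layout are assumed for illustration:

```python
THRESHOLD = 0.7     # assumed pre-set threshold for the feeling-model output
CARD_CAPACITY = 4   # assumed capacity of the memory card, in records

memory_card = []    # each record: (feeling_value, timestamp, picture_data)

def maybe_store_picture(feeling_value, timestamp, picture_data):
    """Store a captured picture when the feeling output exceeds the threshold."""
    if feeling_value <= THRESHOLD:
        return False
    if len(memory_card) >= CARD_CAPACITY:
        return False  # no vacant area on the card
    memory_card.append((feeling_value, timestamp, picture_data))
    # Keep records in decreasing order of feeling-model output.
    memory_card.sort(key=lambda rec: rec[0], reverse=True)
    return True

maybe_store_picture(0.9, "2000-01-01T12:00", b"img1")
maybe_store_picture(0.5, "2000-01-01T12:01", b"img2")  # below threshold, skipped
maybe_store_picture(0.95, "2000-01-01T12:02", b"img3")
print([rec[0] for rec in memory_card])  # [0.95, 0.9]
```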

133 COMMUNICATION DEVICE, COMMUNICATION ROBOT AND COMPUTER-READABLE STORAGE MEDIUM EP18177184.1 2018-06-12 EP3418008A1 2018-12-26 FUNAZUKURI, Mina; YOSHIZAWA, Shintaro; KAKU, Wataru; YAMADA, Hitoshi

A communication device including: an utterance acquisition part (101, 201) configured to acquire an utterance of a user to a character; an information acquisition part (102, 202) configured to acquire information different from the utterance; a voice generation part (209, 210) configured to generate a response voice to be emitted by the character based on a content of the utterance acquired by the utterance acquisition part (101, 201); and an expression generation part (205, 207) configured to generate a response expression to be expressed by a face portion of the character based on the content of the utterance acquired by the utterance acquisition part (101, 201), wherein when the information is acquired from the information acquisition part (102, 202), the expression generation part (205, 207) generates the response expression using the information together with the content of the utterance, the response expression generated when the information is acquired being different from a response expression generated when the information is not acquired.
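The claimed behavior, that the generated response expression differs depending on whether auxiliary information accompanied the utterance, can be sketched as below. The expression labels and keyword rules are invented for illustration:

```python
def generate_expression(utterance, extra_info=None):
    """Pick a face expression from the utterance content, biased by extra information.
    Expression labels and keyword rules are assumed for illustration."""
    base = "smile" if "thank" in utterance else "neutral"
    if extra_info is None:
        return base
    # With extra information (e.g. a sensed user emotion), the expression differs.
    if extra_info.get("user_emotion") == "sad":
        return "concerned"
    return base

print(generate_expression("thank you"))                           # smile
print(generate_expression("thank you", {"user_emotion": "sad"}))  # concerned
```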

134 ELECTRONIC DEVICE AND CRADLE THEREOF EP16875878 2016-08-31 EP3329347A1 2018-06-06 LEE SO HEE; SHIN WON HO; ROH KYUNGSHIK; PARK SOON YONG; PARK JOONG KYUNG; YOON SUK JUNE
An electronic device, capable of being placed on a cradle, may include a camera module configured to acquire an image, a sensor module configured to sense information regarding an orientation of the electronic device, a processor configured to determine a target orientation of the electronic device based on the acquired image and a communication module configured to transmit the information regarding the orientation of the electronic device and information regarding the target orientation of the electronic device to the cradle.
135 ROBOT AND ROBOT SYSTEM EP16824527.2 2016-07-14 EP3323567A1 2018-05-23 HOSOI, Hiroshi; HOSOI, Yoji; TANAKA, Masahide

Provided is a robot in which a cartilage vibration-transmission source is disposed, the robot being configured so that both hands of the robot are laid onto the face of a person or hold the back of the head, or so that one hand touches the cartilage of the ear. The relative positions of both hands touching the cartilage of both ears are maintained, but movement of the face is not restricted. The hands of the robot extend in the line-of-sight direction of the robot. The mouth of the robot is caused to move in conjunction with cartilage transmission audio. A limiter is applied to the touch pressure of the hands. Permission is sought before touching. Safety is confirmed before touching. When a malfunction occurs, touching with the hands is cancelled. The hands are warmed to the temperature of the human body. The robot makes indirect contact with a person who is wearing an accessory which covers a portion of the cartilage of the ear. Information for specifying the wearer is held in the accessory, and the robot reads such information. When the person guides the hands of the robot, the robot does not resist. When vibrations are transmitted to both ears, the hand of the robot tracks the movement of the head. The cartilage vibration-transmission source is provided to the first finger of the robot, and the second finger of the robot supports the head of the person.

136 MEDICAL HOLDING ARM WITH RING-SHAPED LED DISPLAY MEANS EP17184477.2 2015-08-12 EP3269323A1 2018-01-17 Nowatschin, Stephan; Krinninger, Maximilian; Gierlach, Dominikus

The invention relates to a holding device (1), in particular a holding arm and/or stand, for medical purposes, having a proximal end (2) for attaching the holding device (1) to a base and a distal end (4) for receiving an attachment device (6); at least a first and a second arm segment (12, 14), the first arm segment (12) being connected to a first joint (13) and the second arm segment (14) to a second joint (15), each joint (13, 15) being releasable and lockable; an operating device (50) for releasing and/or locking the respective joint (13, 15) in order to bring the holding device (1) into a desired pose; and a first display unit (34, 100, 200, 250) arranged on the first joint (13) and a second display unit (36, 100, 200, 252) arranged on the second joint (15). According to the invention, the first and/or second display unit (34, 36, 100, 200, 250, 252) is configured to indicate at least one status of the holding device (1) and/or of an attachment device (6) other than the releasing and/or locking of the respective joint (13, 15).

137 APPARATUS AND METHODS FOR PROVIDING A PERSISTENT COMPANION DEVICE EP14767391 2014-03-13 EP2974273A4 2018-01-10 BREAZEAL CYNTHIA
A method includes providing a telecommunications enabled robotic device adapted to persist in an environment of a user, receiving an instruction to photograph one or more persons in the environment according to a time parameter and photographing the one or more persons in accordance with the time parameter resulting in one or more photographs.
138 SYSTEM FOR CONTENT RECOGNITION AND RESPONSE ACTION EP17164815.7 2017-04-04 EP3228426A1 2017-10-11 BLAKELY, John; CARROLL, Jon; EARWOOD, Corey; DORRIS, Brandon; WILLIAMS, Adam; RODES, David

A system can use a microphone (130) to detect audio content from a content source (150), and analyze the audio content to determine whether the audio content corresponds to a respective portion of a stored content file. The system can then identify a response action correlated to the respective portion of the content file, and generate control commands to cause a robotic device (100) to perform the response action.
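The recognition-and-response loop could be sketched as matching detected audio against stored content portions and looking up the correlated action. The matching here is naive substring search over transcripts, purely for illustration; the transcripts and action names are assumed:

```python
# Stored content file split into portions, each correlated with a response action.
# Transcripts and action names are assumed examples, not taken from the patent.
CONTENT_PORTIONS = [
    ("once upon a time", "wave_arms"),
    ("the dragon appeared", "play_roar_sound"),
    ("they lived happily", "dance"),
]

def recognize_and_respond(detected_audio_text):
    """Return the response action for the first content portion matching the audio."""
    for transcript, action in CONTENT_PORTIONS:
        if transcript in detected_audio_text:
            return action
    return None  # audio does not correspond to any stored portion

print(recognize_and_respond("and then the dragon appeared over the hill"))  # play_roar_sound
```

A real system would match acoustic fingerprints rather than text, but the portion-to-action lookup would have the same shape.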

139 HUMAN-COMPUTER INTERACTIVE METHOD BASED ON ARTIFICIAL INTELLIGENCE AND TERMINAL DEVICE EP15199221.1 2015-12-10 EP3109800A1 2016-12-28 LI, Jialin; JING, Kun; GE, Xingfei; WU, Hua; XU, Qian; WANG, Haifeng; SUN, Wenyu; WU, Tian; GUAN, Daisong

The present disclosure provides a human-computer interactive method and apparatus based on artificial intelligence, and a terminal device. The human-computer interactive method based on artificial intelligence includes: receiving a multimodal input signal, the multimodal input signal including at least one of a speech signal, an image signal and an environmental sensor signal; determining an intention of a user according to the multimodal input signal; processing the intention of the user to obtain a processing result, and feeding back the processing result to the user.
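A toy version of the multimodal pipeline above, fusing whichever of the speech, image, and sensor signals are present into an intent and feeding a result back, might look like the following. All rules and intent names are assumptions:

```python
def determine_intent(speech=None, image_label=None, sensor=None):
    """Fuse whichever modalities are present into a single user intent (rules assumed)."""
    if sensor is not None and sensor.get("temperature", 20) > 30:
        return "cool_room"
    if speech and "music" in speech:
        return "play_music"
    if image_label == "waving_person":
        return "greet_user"
    return "await_input"

def interact(**signals):
    intent = determine_intent(**signals)
    return f"result for intent '{intent}'"  # processing result fed back to the user

print(interact(speech="play some music"))  # result for intent 'play_music'
```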

140 Methods and systems for managing dialogs of a robot EP14305581.2 2014-04-17 EP2933071A1 2015-10-21 Monceaux, Jérôme; Gate, Gwennael; Barbieri, Gabriele

There is disclosed a computer-implemented method of handling an audio dialog between a robot and a human user, the method comprising: during said audio dialog, receiving audio data and converting said audio data into text data; in response to said text data, determining a dialog topic, said dialog topic comprising a dialog content and a dialog voice skin, wherein a dialog content comprises a plurality of sentences; determining a sentence to be rendered in audio by the robot; and receiving a modification request of said determined dialog sentence. Described developments comprise, for example, different regulation schemes (e.g. open-loop or closed-loop), the use of moderation rules (centralized or distributed) and the use of priority levels and/or parameters depending on the environment perceived by the robot.
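The dialog-topic structure, a content (set of sentences) paired with a voice skin plus a hook for modification requests, might look like the sketch below. The skin transformations are invented stand-ins for real TTS styling:

```python
from dataclasses import dataclass

@dataclass
class DialogTopic:
    """A dialog topic pairs a content (sentences) with a voice skin (all names assumed)."""
    sentences: list
    voice_skin: str = "neutral"

    def render(self, index):
        sentence = self.sentences[index]
        # Toy voice-skin transformation standing in for real TTS styling.
        if self.voice_skin == "excited":
            return sentence.upper() + "!"
        return sentence

    def modify_sentence(self, index, new_text):
        """Apply a modification request to a determined sentence."""
        self.sentences[index] = new_text

topic = DialogTopic(["hello there", "how can I help"], voice_skin="excited")
print(topic.render(0))  # HELLO THERE!
topic.modify_sentence(0, "good morning")
print(topic.render(0))  # GOOD MORNING!
```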
