Home / International Patent Classification Library / Physics / Measuring / Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves / Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems (photogrammetry or videogrammetry G01C 11/00)
No. Patent Title Application No. Filing Date Publication No. Publication Date Inventors
141 Software development kit for LiDAR data US14178812 2014-02-12 US09354825B2 2016-05-31 Mark J. Kozak; Jimmy X. Wu
The present invention relates to a method and system for compressing and retrieving Light Detection and Ranging output data, and, more specifically, to a method and system for compressing Light Detection and Ranging output data by Run Length Encoding Light Detection and Ranging output data and rapidly accessing this compressed data which is filtered by attributes without the need to read or decompress the entire collection of data.
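The abstract names Run Length Encoding as the compression scheme but does not disclose the actual format, so the following is only a minimal sketch: consecutive equal per-point attribute values (e.g. classification codes) collapse into (value, run length) pairs, and an attribute filter can then skip whole non-matching runs without expanding them. All names and the data layout are illustrative assumptions, not the patented format.

```python
# Minimal run-length-encoding sketch over a stream of per-point LiDAR
# attribute values. Illustrative only; not the patent's disclosed format.

def rle_encode(values):
    """Collapse consecutive equal values into (value, run_length) pairs."""
    runs = []
    for v in values:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return [(v, n) for v, n in runs]

def filtered_point_indices(runs, wanted):
    """Yield original point indices whose attribute is in `wanted`,
    skipping non-matching runs without expanding them."""
    index = 0
    for value, length in runs:
        if value in wanted:
            yield from range(index, index + length)
        index += length
```

For example, `rle_encode([2, 2, 2, 5, 5, 2])` gives `[(2, 3), (5, 2), (2, 1)]`, and filtering the runs for attribute `5` yields point indices `[3, 4]` while the run of three `2`s is skipped in one step.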
142 EXCAVATION SYSTEM PROVIDING MACHINE CYCLE TRAINING US14485088 2014-09-12 US20160076225A1 2016-03-17 David C. ATKINSON; Kenneth L. STRATTON
An excavation system is disclosed as having a locating device configured to generate a position signal indicative of a position of an excavation machine, a sensor configured to generate a load signal indicative of a work tool status, and a controller. The controller may be configured to determine a location of a haul truck, to determine when the work tool is filled with at least a threshold amount of material from a pile at the worksite, and to determine the position of the excavation machine. The controller may further be configured, based on operating limitations of the excavation machine, to determine a pivot point between the location of the haul truck and the position of the excavation machine; to plan a travel path for the excavation machine to the haul truck through the pivot point; and to direct an operator of the excavation machine to follow the travel path.
143 Image device including dynamic vision sensor, ambient light sensor and proximity sensor function US14336428 2014-07-21 US09257461B2 2016-02-09 Soonik Cho; Moo-Young Kim; Minseok Oh; Taechan Kim; Jae-Cheol Yun
An image device includes a pixel array and a controller. The pixel array has first pixels and second pixels and corresponding channel drivers. The controller may perform operations of a dynamic vision sensor (DVS), an ambient light sensor (ALS) and a proximity sensor (PS).
144 Obstacle sensor and robot cleaner having the same US13616137 2012-09-14 US09239389B2 2016-01-19 Yeon Kyu Jeong; Shin Kim; Jeong Hun Kim; Jong Owan Kim; Sang Sik Yoon; Dong Hun Lee; Jea Yun So
An obstacle sensor includes a line-light irradiating unit and a reflected-light receiving unit. The line-light irradiating unit includes a light-emitting unit, a light-emitting driving unit to drive it, and a first conical mirror whose apex is disposed towards the light-emitting unit in its light irradiation direction and which converts light emitted from the light-emitting unit into line light irradiated in all directions. The reflected-light receiving unit includes a second conical mirror to condense light that is irradiated from the first conical mirror and then reflected from an obstacle, a lens that is spaced from the apex of the second conical mirror by a predetermined distance and transmits the reflected light, an imaging unit to image the reflected light that passes through the lens, an image processing unit, and an obstacle sensing control unit.
145 Integrated multifunction scope for optical combat identification and other uses US13186058 2011-07-19 US09068798B2 2015-06-30 Tony Maryfield; Mahyar Dadkhah; Thomas Potendyk
Systems and methods for enabling an integrated multifunction scope for optical combat identification and other uses. The functionality of Multiple Integrated Laser Engagement System (MILES) is combined with Optical Combat Identification Systems (OCIDS) or other identification as friend or foe (IFF) systems. This can provide for improved MILES performance through the utilization of a common laser transmission system and/or the use of location information systems, such as global positioning system (GPS) coordinates. According to some embodiments, various additional features may be included for use in training and/or combat environments.
146 IMAGE DEVICE INCLUDING DYNAMIC VISION SENSOR, AMBIENT LIGHT SENSOR AND PROXIMITY SENSOR FUNCTION US14336428 2014-07-21 US20150069218A1 2015-03-12 Soonik CHO; Moo-Young KIM; Minseok OH; Taechan KIM; Jae-Cheol YUN
An image device includes a pixel array and a controller. The pixel array has first pixels and second pixels and corresponding channel drivers. The controller may perform operations of a dynamic vision sensor (DVS), an ambient light sensor (ALS) and a proximity sensor (PS).
147 Method and apparatus for mapping in stereo imaging US13178537 2011-07-08 US08964002B2 2015-02-24 Lawrence A. Ray; Richard A. Simon
A method for registering a first imaging detector to a surface projects a sequence of k images toward the surface, wherein k ≥ 4, wherein each of the k images has a pattern of lines that extend in a direction that is orthogonal to a movement direction. The pattern encodes an ordered sequence of labels, each label having k binary elements, such that, in the movement direction, any portion of the pattern that is k equal increments long encodes one label of the ordered sequence. The method obtains, for at least a first pixel in the first imaging detector, along at least one line that is parallel to the movement direction, a first sequence of k signal values indicative of the k binary elements of a first label from the ordered sequence of labels and correlates the at least the first pixel in the first imaging detector to the surface.
148 OBJECT REMOVAL USING LIDAR-BASED CLASSIFICATION US13918159 2013-06-14 US20140368493A1 2014-12-18 Aaron Matthew Rogan; Benjamin James Kadlec
In scenarios involving the capturing of an environment, it may be desirable to remove temporary objects (e.g., vehicles depicted in captured images of a street) in furtherance of individual privacy and/or an unobstructed rendering of the environment. However, techniques involving the evaluation of visual images to identify and remove objects may be imprecise, e.g., failing to identify and remove some objects while incorrectly omitting portions of the images that do not depict such objects. However, such capturing scenarios often involve capturing a lidar point cloud, which may identify the presence and shapes of objects with higher precision. The lidar data may also enable a movement classification of respective objects differentiating moving and stationary objects, which may facilitate an accurate removal of the objects from the rendering of the environment (e.g., identifying the object in a first image may guide the identification of the object in sequentially adjacent images).
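The movement classification described above can be illustrated with a toy sketch: if each object's lidar centroid is tracked across two frames, an object whose centroid shifts more than some threshold is labelled moving, the rest stationary. The centroid representation and the threshold are assumptions for illustration; the patent does not disclose this particular scheme.

```python
# Toy moving-vs-stationary classifier over lidar object centroids from
# two successive frames. Threshold and data layout are illustrative.
import math

def classify_movement(centroids_t0, centroids_t1, threshold=0.5):
    """centroids_t*: dict mapping object id -> (x, y, z) centroid.
    Returns dict mapping object id -> "moving" or "stationary"."""
    labels = {}
    for obj_id, p0 in centroids_t0.items():
        if obj_id not in centroids_t1:
            continue  # object left the scene between frames; no label
        shift = math.dist(p0, centroids_t1[obj_id])
        labels[obj_id] = "moving" if shift > threshold else "stationary"
    return labels
```

Objects labelled "moving" would then be candidates for removal from the rendered environment, while "stationary" geometry (walls, terrain) is kept.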
149 Target Localization Utilizing Wireless and Camera Sensor Fusion US14064020 2013-10-25 US20140285660A1 2014-09-25 Mark Jamtgaard; Nathan Mueller
According to some implementations, an estimate of a target's location can be calculated by correlating Wi-Fi and video location measurements. This spatio-temporal correlation combines the Wi-Fi and video measurements to determine an identity and location of an object. The accuracy of the video localization and the identity from the Wi-Fi network provide an accurate location of the Wi-Fi identified object.
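One simple way to picture the spatio-temporal correlation described above: each anonymous video track is assigned the Wi-Fi identity whose (coarser) position estimates lie closest to it, averaged over the timestamps the two tracks share. The data layout and the mean-distance cost are illustrative assumptions, not the disclosed implementation.

```python
# Toy identity-to-track association: match each video track to the
# Wi-Fi identity with the lowest mean positional distance over the
# timestamps common to both tracks. Layout is an illustrative assumption.

def associate(video_tracks, wifi_tracks):
    """video_tracks / wifi_tracks: dict id -> {timestamp: (x, y)}.
    Returns dict video_track_id -> best-matching Wi-Fi identity (or None)."""
    result = {}
    for vid, vpos in video_tracks.items():
        best_id, best_cost = None, float("inf")
        for wid, wpos in wifi_tracks.items():
            common = vpos.keys() & wpos.keys()
            if not common:
                continue
            cost = sum(
                ((vpos[t][0] - wpos[t][0]) ** 2 +
                 (vpos[t][1] - wpos[t][1]) ** 2) ** 0.5
                for t in common
            ) / len(common)
            if cost < best_cost:
                best_id, best_cost = wid, cost
        result[vid] = best_id
    return result
```

The matched pair then combines the strengths of both sensors: the video track supplies the fine-grained position, the Wi-Fi association supplies the identity.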
150 Software Development Kit for LIDAR Data US14178812 2014-02-12 US20140229672A1 2014-08-14 Mark J. Kozak; Jimmy X. Wu
The present invention relates to a method and system for compressing and retrieving Light Detection and Ranging output data, and, more specifically, to a method and system for compressing Light Detection and Ranging output data by Run Length Encoding Light Detection and Ranging output data and rapidly accessing this compressed data which is filtered by attributes without the need to read or decompress the entire collection of data.
151 INTEGRATED MULTIFUNCTION SCOPE FOR OPTICAL COMBAT IDENTIFICATION AND OTHER USES US13186058 2011-07-19 US20140109458A1 2014-04-24 Tony Maryfield; Mahyar Dadkhah; Thomas Potendyk
Systems and methods for enabling an integrated multifunction scope for optical combat identification and other uses. The functionality of Multiple Integrated Laser Engagement System (MILES) is combined with Optical Combat Identification Systems (OCIDS) or other identification as friend or foe (IFF) systems. This can provide for improved MILES performance through the utilization of a common laser transmission system and/or the use of location information systems, such as global positioning system (GPS) coordinates. According to some embodiments, various additional features may be included for use in training and/or combat environments.
152 System and method for monitoring eye movement US12199693 2008-08-27 USRE42471E1 2011-06-21 William C. Torch
Apparatus for monitoring movement of a person's eye, e.g., to monitor drowsiness. The system includes a frame that is worn on a person's head, an array of emitters on the frame for directing light towards the person's eye, and an array of sensors on the frame for detecting light from the array of emitters. The sensors detect light that is reflected off of respective portions of the eye or its eyelid, thereby producing output signals indicating when the respective portions of the eye are covered by the eyelid. The emitters project a reference frame towards the eye, and a camera on the frame monitors movement of the eye relative to the reference frame. This movement may be correlated with the signals from the array of sensors and/or with signals from other sensors on the frame to monitor the person's level of drowsiness.
153 Distributed motion prediction network US10369218 2003-02-15 US20040061605A1 2004-04-01 Michael D. Howard
The present invention provides a distributed motion prediction network including a plurality of nodes. The nodes in the network comprise a mechanism for detecting the presence of an object in an area around a node. A node that detects an object is termed a "detecting node." After a node has detected an object, it communicates a signal to local nodes that are local with the detecting node to inform the local nodes of the presence of the object in the area around the detecting node. After the local nodes receive the signal, they further propagate the signal to other nodes (local to the local nodes) so that information regarding the presence of the object is progressively propagated to nodes away from the detecting node and so that as the object moves, the propagated signal is used to predict a direction of travel of the object.
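The hop-by-hop propagation described above can be sketched as a flood from the detecting node: each node forwards the signal to its neighbours exactly once, so every node ends up knowing its hop distance from the detection. The graph representation is an illustrative assumption.

```python
# Toy propagation step for a node network: a detection at one node is
# flooded hop by hop to its neighbours (breadth-first), giving every
# node its hop count from the detecting node. Layout is illustrative.
from collections import deque

def propagate(adjacency, detecting_node):
    """adjacency: dict node -> iterable of neighbouring nodes.
    Returns dict node -> hop count from the detecting node."""
    hops = {detecting_node: 0}
    queue = deque([detecting_node])
    while queue:
        node = queue.popleft()
        for neighbour in adjacency[node]:
            if neighbour not in hops:  # each node forwards only once
                hops[neighbour] = hops[node] + 1
                queue.append(neighbour)
    return hops
```

Comparing the hop maps produced by two successive detections gives each node a crude local estimate of which way the object is travelling, which is the basis of the prediction the abstract describes.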
154 System and method for monitoring eye movement US09740738 2000-12-18 US06542081B2 2003-04-01 William C. Torch
Apparatus for monitoring movement of a person's eye, e.g., to monitor drowsiness. The system includes a frame that is worn on a person's head, an array of emitters on the frame for directing light towards the person's eye, and an array of sensors on the frame for detecting light from the array of emitters. The sensors detect light that is reflected off of respective portions of the eye or its eyelid, thereby producing output signals indicating when the respective portions of the eye are covered by the eyelid. The emitters project a reference frame towards the eye, and a camera on the frame monitors movement of the eye relative to the reference frame. This movement may be correlated with the signals from the array of sensors and/or with signals from other sensors on the frame to monitor the person's level of drowsiness.
155 Method and apparatus for detecting the presence and location of an object in a field US487409 1995-06-07 US5493112A 1996-02-20 David F. Welch
An optical system and method for detecting the presence and location of at least one stationary or moving object in a field. The optical system has at least one light source to generate a beam, which beam is scanned by at least one first reflecting surface to generate a plurality of beams. The beams are overlapped across the field by at least one second reflecting surface and their intensity is measured by at least one detection means.
156 Multi-optical-axis photoelectric sensor KR1020150028936 2015-03-02 KR101630117B1 2016-06-13 Kikuchi Keisaku; Osako Kazunori
[Problem] To provide a multi-optical-axis photoelectric sensor capable of muting processing suited to multiple types of workpieces of different heights, without complicated advance settings for each workpiece type. [Solution] The multi-optical-axis photoelectric sensor (SNS) includes a light projector (1) and a light receiver (2) that forms a plurality of optical axes together with the projector (1). A muting area, in which the result of detecting the blocking of an optical axis is invalidated, is set in at least part of the detection area (LC) defined by the optical axes. While a workpiece (W) passes through the detection area, the sensor system acquires the light-blocking range corresponding to the blocked optical axes and, based on that result, changes the muting area of the sensor (SNS) from a first range to a second range.
157 Inclining test apparatus for shipbuilding KR2020140005725 2014-07-30 KR2020160000462U 2016-02-12 Hee-Keun Lee; Jae-Bin Kim; Chul-Soo Kim; Byung-Hee Park
The present device relates to an inclining test apparatus for shipbuilding. A bidirectional laser generator is installed on a ship and projects lasers in both the port and starboard directions, and the projected laser points are checked with height-measuring staff members, so that the heel angle of the ship can be accurately determined during the inclining test.
158 Vibration measuring apparatus, vibration measuring method, and optical system of vibration measuring apparatus KR1020140026989 2014-03-07 KR101544828B1 2015-08-17 Dong-Kyu Kim; Sung-Eui Hwang; Jun-Hwan Jang; Ki-Hwan Park
A vibration measuring apparatus according to the present invention includes a laser rangefinder that acquires shape information about a vibration measurement point on an object to be measured; a laser scanning vibrometer that irradiates a laser beam onto the vibration measurement point and measures the vibration component at that point; and an optical filter that reflects one of, and transmits the other of, the first laser emitted from the laser rangefinder and the second laser emitted from the laser scanning vibrometer, so that the rangefinder and the vibrometer can operate simultaneously. Because a common optical path is used, path alignment is easy, fabrication is simple, and measurement error from path mismatch is avoided. In addition, shape measurement and vibration measurement can be performed at once, reducing operating time; this resolves the increased operating time that is a problem when a single laser scanning vibrometer is used alone. Furthermore, since the optical system is compact and beam drift is eliminated, problems caused by optical misalignment are resolved. Finally, by simply adding shape-measurement optics to an apparatus that previously provided only a laser scanning vibrometer, the operating performance of the vibration measuring apparatus can be improved.
159 Proximity sensor KR1020147019375 2013-01-11 KR1020140105566A 2014-09-01 Morita, Yosuke
When a touchless motion is performed and a detection object such as a hand or finger moves from bottom to top relative to the proximity sensor (19), light emitted from light-emitting elements (10A, 10B) is reflected by the detection object. A light-receiving element (32) detects the reflected light and outputs detection signals (1, 2). When the detection object then reaches light-emitting element (10C), the light-receiving element (32) detects the reflected light from element (10C) and outputs detection signal (3). The bottom-to-top movement of the detection object is detected from the output pattern of detection signals (1, 2, 3).
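The pattern logic described above reduces to an ordering check: a bottom-to-top swipe is recognised when the lower emitters' reflections (signals 1 and 2) are both detected before the upper emitter's reflection (signal 3). The event encoding below is an illustrative assumption, not the disclosed circuit.

```python
# Toy swipe recogniser: a bottom-to-top gesture is accepted only when
# both lower-emitter signals (1, 2) arrive before the upper-emitter
# signal (3). Event encoding is an illustrative assumption.

def detect_upward_swipe(events):
    """events: list of detection-signal ids in order of arrival,
    e.g. [1, 2, 3]. Returns True for a bottom-to-top pattern."""
    if 3 not in events:
        return False  # object never reached the upper emitter
    first_upper = events.index(3)
    lower_seen = {e for e in events[:first_upper] if e in (1, 2)}
    return lower_seen == {1, 2}
```

With this encoding `[1, 2, 3]` is accepted, while `[3, 1, 2]` (top-to-bottom) and `[1, 3]` (one lower signal missing) are rejected.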
160 Portable terminal and control method thereof KR1020110033170 2011-04-11 KR1020120115701A 2012-10-19 Ji-Ho Yoo
PURPOSE: A portable terminal and a control method thereof are provided to calculate the location of a vehicle, using a signal received from a tag as an initial value, when the portable terminal is located outside the vehicle. CONSTITUTION: A DF (Direction Find) reader unit (182) receives a signal from a tag attached to the inside of a vehicle. A control unit (180) calculates location information by using the signal received through the DF reader unit and displays the calculated location information on a display unit (151). When the portable terminal (100) is located outside the vehicle, the control unit calculates the location information. [Reference numerals] (110) Wireless communication unit; (111) Broadcast receiving unit; (112) Mobile communication unit; (113) Wireless internet module; (114) Near field communication module; (115) Position information module; (120) A/V input unit; (121) Camera; (122) Microphone; (130) User input unit; (140) Sensing unit; (141) Proximity sensor; (150) Output unit; (151) Display unit; (152) Voice output module; (153) Alarm module; (154) Haptic module; (155) Projector module; (160) Memory; (170) Interface unit; (180) Control unit; (181) Multimedia module; (182) DF reader unit; (190) Power supply unit