Character input device, character input method, and character input program
Patent Abstract:
A character input device includes a display that displays a scene image in a virtual space with a virtual character input function, a starting character determination unit that identifies a starting position for the character input in the virtual character input function, an operation direction determination unit that detects a direction from the starting position, an operation detector that determines the character input in accordance with the direction, and a controller that outputs information indicating an operation of the character input to the display.

Publication number: EP3709132A1
Application number: EP20156562.9
Filing date: 2020-02-11
Publication date: 2020-09-16
Inventor: Riki NOMURA
Applicant: Omron Corp; Omron Tateisi Electronics Co
IPC main class: G06F3-00
Patent Description:
[0001] This application claims priority to Japanese Patent Application No. 2019-045469 filed on March 13, 2019, the contents of which are incorporated herein by reference.

FIELD

[0002] The present invention relates to a technique for inputting characters in a virtual space.

BACKGROUND

[0003] An input device described in Patent Literature 1 allows a touchpad operation in a virtual space. A user operates the input device with his or her digit to input characters in the virtual space or to operate, for example, an application.

CITATION LIST

PATENT LITERATURE

[0004] Patent Literature 1: Japanese Unexamined Patent Application Publication No. 2014-154074

SUMMARY

TECHNICAL PROBLEM

[0005] However, the user operates the input device with the structure described in Patent Literature 1 while holding the device in his or her hand, which limits other operations. The user may thus have a lower sense of immersion in the virtual space.

[0006] One or more aspects of the present invention are directed to a technique for enabling an efficient input operation without lowering the user's sense of immersion.

SOLUTION TO PROBLEM

[0007] A character input device includes a display that displays a scene image in a virtual space with a virtual character input function, a starting character determination unit that identifies a starting position for the character input in the virtual character input function, an operation direction determination unit that detects a direction from the starting position, an operation detector that determines the character input in accordance with the direction, and a controller that outputs information indicating an operation of the character input to the display.

[0008] This structure enables the user to input characters efficiently without lowering his or her sense of immersion in the virtual space.

[0009] The operation direction determination unit in the character input device may be a touchpad.

[0010] This structure facilitates determination of the direction of a character input.

[0011] The operation direction determination unit in the character input device may be a cross-shaped key.

[0012] This structure allows reliable determination of the direction of a character input.

[0013] The display in the character input device may display the virtual space in a manner superimposed on a real space.

[0014] This structure is usable for augmented reality.

[0015] The display in the character input device may display a real space in a manner superimposed on the virtual space.

[0016] This structure is usable for mixed reality.

[0017] The display in the character input device may display the virtual space having a time axis in a manner superimposed on a real space.

[0018] This structure is usable for substitutional reality.

ADVANTAGEOUS EFFECTS

[0019] The above aspects of the present invention enable an efficient input operation without lowering a user's sense of immersion.

BRIEF DESCRIPTION OF THE DRAWINGS

[0020] Fig. 1 is a block diagram of a character input device according to a first embodiment of the present invention. Figs. 2A and 2B are schematic diagrams of the character input device according to the first embodiment of the present invention. Fig. 3 is a flowchart showing an operation of the character input device according to the first embodiment of the present invention. Fig. 4 is a schematic diagram of a character input device according to a second embodiment of the present invention. Fig. 5 is a block diagram of a character input device according to a third embodiment of the present invention. Fig. 6 is a schematic diagram of the character input device according to the third embodiment of the present invention.
DETAILED DESCRIPTION

[0021] Embodiments of the present invention will now be described with reference to the drawings.

Example Use

[0022] An example use of the present invention will be described first with reference to Fig. 1. Fig. 1 is a block diagram of a character input device 10 according to a first embodiment of the present invention. The character input device 10 is, for example, a stick hand controller. The character input device 10 detects movement of a user's hand.

[0023] The character input device 10 includes a starting character determination unit 110, an operation direction determination unit 120, an operation detector 130, and a controller 140.

[0024] A user experiences the virtual space by wearing a virtual reality (VR) headset. The VR headset is in the form of goggles and includes a gyroscope, an accelerometer, and a magnetometer. Mounted on the user's head, the VR headset detects forward-backward, left-right, and up-down movements of the head and projects those movements onto the x-, y-, and z-axes in the virtual space.

[0025] A virtual space 20 includes a display 200. In other words, the display 200 is placed as a scene image appearing in the virtual space 20 and is visible to the user wearing the VR headset.

[0026] The display 200 includes a virtual software keyboard that allows character input in the virtual space 20. A string of characters is arranged on the software keyboard, for example, in the same format as a numeric keypad. When a character is input with the character input device 10 in the virtual space 20, a display image based on the input character is output to the display 200. The software keyboard is not limited to the numeric keypad format described above, and may have any structure that allows input of a character string.

[0027] In this embodiment, the user identifies the display 200 in the virtual space 20 and inputs the Japanese hiragana character No. The user identifies, with the starting character determination unit 110 included in the character input device 10, the hiragana N-column of the Japanese syllabary table. The operation detector 130 detects an input for the hiragana N-column.

[0028] The operation detector 130 outputs information indicating the input for the hiragana N-column to the controller 140. The controller 140 outputs information indicating the input for the hiragana N-column to the display 200.

[0029] The display 200 may display a flick input guide for the hiragana N-column. When a key corresponding to a character to be input is selected, the flick input guide displays, as a guide, the characters included in the same hiragana column as the character to be input.

[0030] More specifically, for example, the user selects the character Na. In response, the characters Na, Ni, Nu, Ne, and No appear in another area as a guide. In other words, the flick input guide allows the user to clearly recognize the character to be input (see the sketch after this example use).

[0031] The user selects the character No with the operation direction determination unit 120 included in the character input device 10.

[0032] The operation detector 130 detects the character No selected by the user. The operation detector 130 outputs, to the controller 140, information indicating that the character No is selected. The controller 140 outputs the character No to the display 200.

[0033] In this manner, the user can input characters in the virtual space 20 with an easy operation. The user can use a familiar method for inputting characters, and thus does not have a lower sense of immersion.
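As an illustration of the flick input guide described in paragraphs [0029] and [0030], the following is a minimal, hypothetical sketch (not taken from the patent) that maps a selected key to the characters of the same hiragana column in the standard flick layout: center (tap), left, up, right, down. The names `FLICK_COLUMNS` and `flick_guide` are assumptions for this sketch.

```python
# Hypothetical sketch of the flick input guide in [0029]-[0030].
# For a selected key (e.g., "na"), the guide shows the characters of the
# same hiragana column in the standard flick order:
# center (tap), left, up, right, down.

FLICK_COLUMNS = {
    "na": ["な", "に", "ぬ", "ね", "の"],  # Na, Ni, Nu, Ne, No
    "ka": ["か", "き", "く", "け", "こ"],  # Ka, Ki, Ku, Ke, Ko
    # ... one entry per column of the Japanese syllabary table
}

DIRECTIONS = ["center", "left", "up", "right", "down"]

def flick_guide(column_key: str) -> dict[str, str]:
    """Map each flick direction to the character it would input."""
    return dict(zip(DIRECTIONS, FLICK_COLUMNS[column_key]))

# Selecting the Na key yields: center=な, left=に, up=ぬ, right=ね, down=の,
# so a downward operation from Na selects No, as in the embodiment.
print(flick_guide("na"))
```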
Example Structure 1

[0034] Fig. 1 is a block diagram of the character input device 10 according to the first embodiment of the present invention. Figs. 2A and 2B are schematic diagrams of the character input device 10 according to the first embodiment of the present invention. Fig. 3 is a flowchart showing an operation of the character input device 10 according to the first embodiment of the present invention.

[0035] An example structure will be described in more detail with reference to Figs. 2A and 2B based on the structure of the character input device 10 shown in Fig. 1.

[0036] As shown in Figs. 1 and 2A, a user 40 wears a virtual reality (VR) headset 30. The user 40 can view the virtual space 20 through the VR headset 30. The virtual space 20 includes the display 200.

[0037] The user 40 holds the character input device 10. As shown in Fig. 2B, the character input device 10 includes the operation direction determination unit 120. The operation direction determination unit 120 is, for example, a touchpad. More specifically, the touchpad includes a flat sensor and is operated with a digit sliding on the sensor.

[0038] The user 40 identifies the display 200 in the virtual space 20 and inputs the character No. The user 40 first identifies the hiragana N-column. More specifically, the user 40 identifies the hiragana N-column on the display 200 in the virtual space 20 by, for example, pointing at the column with a laser pointer. Alternatively, the position of the hiragana N-column may be identified with a gaze cursor included in the VR headset 30. In response, the operation detector 130 detects an input for the hiragana N-column.

[0039] The operation detector 130 outputs information indicating the input for the hiragana N-column to the controller 140. The controller 140 outputs information indicating the input for the hiragana N-column to the display 200.

[0040] The user 40 operates the touchpad as the operation direction determination unit 120 with the same method as flick input while pointing at the character Na on the display 200. In this state, a flick input guide 220 displays the characters included in the hiragana N-column. The user 40 performs a sliding operation downward from the character Na to select the character No (see the sketch after this walkthrough). The character Na corresponds to a starting position in an aspect of the present invention.

[0041] To input the character Na itself, an operation described below may be performed. The user 40 taps the operation direction determination unit 120 while pointing at the character Na. In response, the character Na is input.

[0042] The operation detector 130 detects the character No selected by the user 40. The operation detector 130 outputs, to the controller 140, information indicating that the character No is selected. The controller 140 outputs the character No to the display 200.
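To illustrate how the operation direction determination unit might classify a slide on the touchpad, here is a minimal, hypothetical sketch: the slide vector from the touch-down point (the starting position) is compared against an assumed dead-zone radius, and the dominant axis picks the direction. The constant `TAP_RADIUS` and the function name are assumptions, not the patent's implementation.

```python
import math

# Hypothetical sketch of direction detection for the touchpad in [0040]:
# a slide from the touch-down point is classified as a tap or as one of
# four flick directions.

TAP_RADIUS = 0.1  # assumed dead zone, in normalized touchpad units

def classify_slide(x0: float, y0: float, x1: float, y1: float) -> str:
    """Classify a slide from (x0, y0) to (x1, y1) as a flick direction.

    Returns "center" for a tap (movement within the dead zone),
    otherwise the dominant axis direction: left/right/up/down.
    """
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < TAP_RADIUS:
        return "center"  # a tap inputs the starting character itself ([0041])
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # y grows downward on the pad

# A downward slide from the Na key selects No:
print(classify_slide(0.5, 0.5, 0.52, 0.9))  # -> "down"
```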
[0043] A process performed by the character input device 10 will now be described with reference to the flowchart in Fig. 3.

[0044] The character input device 10 receives an operation performed by the user 40 and activates the display 200 in the virtual space 20 (S101).

[0045] The character input device 10 receives an operation performed by the user 40 and identifies a starting character on the display 200 (S102).

[0046] The operation detector 130 determines whether a starting character is identified (S103). When the operation detector 130 determines that a starting character is identified (Yes in S103), the character input device 10 receives an operation performed by the user 40 and selects an input character (S104). The input character may be displayed by the flick input guide 220.

[0047] The operation detector 130 determines whether an input character is selected (S105). When the operation detector 130 determines that an input character is selected (Yes in S105), the controller 140 outputs the input character to the display 200 (S106).

[0048] When no starting character is identified in step S103 (No in S103), the processing in step S102 is repeated.

[0049] When no input character is selected in step S105 (No in S105), the processing in step S104 is repeated.

[0050] In this manner, the user 40 can input characters in the virtual space 20 with an easy operation. The user 40 can use the familiar flick input method, and thus does not have a lower sense of immersion.

[0051] Although the structure described above uses a touchpad as the operation direction determination unit 120, the character input device 10 may have any structure that allows an input character to be selected by a sliding operation such as a flicking operation.

[0052] The character input device 10 may have any shape that does not impair the sense of immersion for the user. The operation direction determination unit 120 may also have any shape.

[0053] The structure described above uses the flick input guide 220 to input characters on the display 200, which improves usability for the user 40. However, the flick input guide 220 may be eliminated when it lowers the sense of immersion for the user 40.
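The flow of Fig. 3 (steps S101-S106) amounts to a simple retry loop. The sketch below is a hypothetical rendering of that loop; `device` and `display` stand in for the character input device 10 and the display 200, and their method names are assumptions, not the patent's API.

```python
# Hypothetical sketch of the Fig. 3 flow (S101-S106). The helper methods
# stand in for user-driven operations and are assumed to return None
# until the user completes the corresponding action.

def input_one_character(device, display) -> str:
    display.activate()                                # S101

    starting_character = None
    while starting_character is None:                 # S103: No -> repeat S102
        starting_character = device.identify_starting_character()   # S102

    input_character = None
    while input_character is None:                    # S105: No -> repeat S104
        # S104: the flick input guide 220 may display the candidates
        input_character = device.select_input_character(starting_character)

    display.output(input_character)                   # S106
    return input_character
```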
Example Structure 2

[0054] An operation for inputting a character will now be described in detail with reference to Fig. 4. Fig. 4 is a schematic diagram of a character input device 10A according to a second embodiment of the present invention.

[0055] The second embodiment differs from the first embodiment in the shape of the operation direction determination unit. The other components and processes are the same as those in the first embodiment and will not be described.

[0056] The character input device 10A includes an operation direction determination unit 120A. The operation direction determination unit 120A is, for example, a cross-shaped key.

[0057] This structure allows the user 40 to determine the identified direction more easily. In other words, it reduces input errors in selecting characters to be input.

[0058] This structure also allows the user 40 to input characters in the virtual space 20 with an easy operation. The user 40 can use a familiar method for inputting characters with a cross-shaped key, and thus does not have a lower sense of immersion.

Example Structure 3

[0059] An operation for inputting a character will now be described in detail with reference to Fig. 5. Fig. 5 is a block diagram of the character input device 10 according to a third embodiment of the present invention. Fig. 6 is a schematic diagram of the character input device 10 according to the third embodiment of the present invention.

[0060] The third embodiment differs from the first embodiment in including a transmissive display 200A. The other components and processes are the same as those in the first embodiment and will not be described.

[0061] A virtual space 20A includes the transmissive display 200A. More specifically, the transmissive display 200A displays the virtual space 20A in a manner superimposed on a real space. In other words, the virtual space 20A is used in augmented reality (AR).

[0062] The transmissive display 200A is included in a thin plate-like device having a camera function. The transmissive display 200A is, for example, a liquid crystal display of a smartphone.

[0063] The user 40 captures an image of a real space 50 using the camera function. The captured image of the real space 50 appears on the transmissive display 200A.

[0064] The user 40 inputs the Japanese hiragana character No on the transmissive display 200A. The user 40 identifies the hiragana N-column on the transmissive display 200A. The operation detector 130 detects an input for the hiragana N-column.

[0065] The operation detector 130 outputs information indicating the input for the hiragana N-column to the controller 140. The controller 140 outputs information indicating the input for the hiragana N-column to the transmissive display 200A.

[0066] The transmissive display 200A uses the flick input guide 220 to display the characters included in the hiragana N-column.

[0067] The user 40 operates the touchpad as the operation direction determination unit 120 with the same method as flick input, performing a sliding operation from the character Na to select the character No. In this case, the user 40 slides downward as viewed from the front of the character input device 10. In response, the character No is selected.

[0068] The operation detector 130 detects the character No selected by the user 40. The operation detector 130 outputs, to the controller 140, information indicating that the character No is selected. The controller 140 outputs the character No to the transmissive display 200A.

[0069] This structure allows the user 40 to input characters in AR with an easy operation. The user 40 can use a familiar method for inputting characters, and thus does not have a lower sense of immersion.

[0070] AR may use either a vision-based technique or a markerless technique.

[0071] The structure in the above example is used in AR. However, the transmissive display 200A can also be used in mixed reality (MR), in which it displays the virtual space and the real space in a mixed manner, or more specifically, displays the real space in a manner superimposed on the virtual space.

[0072] The transmissive display 200A can also be used in a space in which the virtual space having a time axis is displayed in a manner superimposed on the real space.

[0073] The correspondence between the structures of the claimed invention and the structures in the above embodiments may be expressed as specified in the appendix below.
Appendix

[0074] A character input device (10), comprising: a display (200) configured to display a scene image in a virtual space allowing a character input; a starting character determination unit (110) configured to identify a starting position for the character input; an operation direction determination unit (120) configured to detect a direction from the starting position; an operation detector (130) configured to determine the character input in accordance with the direction; and a controller (140) configured to output information indicating an operation of the character input to the display (200).

REFERENCE SIGNS LIST

[0075]
10, 10A character input device
20, 20A virtual space
30 VR headset
40 user
50 real space
110 starting character determination unit
120, 120A operation direction determination unit
130 operation detector
140 controller
200 display
200A transmissive display
210 character string display area
220 flick input guide
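To make the division of labor in the appendix concrete, here is a minimal, hypothetical wiring of the claimed components (reference signs in comments). All class and method names are illustrative assumptions; they do not appear in the patent.

```python
# Hypothetical wiring of the claimed components. Names and interfaces
# are illustrative, not the patent's.

class Display:                                  # display (200)
    def show(self, text: str) -> None:
        print(f"display: {text}")

class StartingCharacterDeterminationUnit:       # (110)
    def identify(self) -> str:
        return "na"        # e.g., the user points at the Na key

class OperationDirectionDeterminationUnit:      # (120)
    def detect(self) -> str:
        return "down"      # e.g., a downward slide on the touchpad

class OperationDetector:                        # (130)
    def determine(self, start: str, direction: str) -> str:
        # Character chosen by the flick direction from the starting position.
        guide = {"center": "na", "left": "ni", "up": "nu",
                 "right": "ne", "down": "no"}
        return guide[direction]

class Controller:                               # (140)
    def __init__(self, display: Display):
        self.display = display
    def output(self, character: str) -> None:
        self.display.show(character)

# Starting at Na and sliding down yields No, as in the embodiments.
controller = Controller(Display())
start = StartingCharacterDeterminationUnit().identify()
direction = OperationDirectionDeterminationUnit().detect()
controller.output(OperationDetector().determine(start, direction))
```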
Claims:
Claims (8)

1. A character input device (10; 10A), comprising: a display (200) configured to display a scene image in a virtual space allowing a character input; a starting character determination unit (110) configured to identify a starting position for the character input; an operation direction determination unit (120; 120A) configured to detect a direction from the starting position; an operation detector (130) configured to determine the character input in accordance with the direction; and a controller (140) configured to output information indicating an operation of the character input to the display (200).

2. The character input device (10; 10A) according to claim 1, wherein the operation direction determination unit (120; 120A) is a touchpad.

3. The character input device (10; 10A) according to claim 1, wherein the operation direction determination unit (120; 120A) is a cross-shaped key.

4. The character input device (10; 10A) according to claim 1, wherein the display (200) displays the virtual space in a manner superimposed on a real space.

5. The character input device (10; 10A) according to claim 1, wherein the display (200) displays a real space in a manner superimposed on the virtual space.

6. The character input device (10; 10A) according to claim 1, wherein the display (200) displays the virtual space having a time axis in a manner superimposed on a real space.

7. A character input method implementable by a computer, the method comprising: displaying a scene image in a virtual space allowing a character input; identifying a starting position for the character input; detecting a direction from the starting position; determining the character input in accordance with the direction; and outputting information indicating an operation of the character input.

8. A character input program causing a computer to implement: displaying a scene image in a virtual space allowing a character input; identifying a starting position for the character input; detecting a direction from the starting position; determining the character input in accordance with the direction; and outputting information indicating an operation of the character input.
Patent family:
Publication number | Publication date
US20200293118A1 | 2020-09-17
JP2020149269A | 2020-09-17
Legal events:
2020-08-14 | PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Original code: 0009012
2020-09-16 | 17P | Request for examination filed | Effective date: 2020-03-05
2020-09-16 | AK | Designated contracting states | Kind code of ref document: A1 | Designated states: AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
2020-09-16 | AX | Request for extension of the European patent | Extension states: BA ME
2021-01-15 | STAA | Information on the status of an EP patent application or granted EP patent | Status: the application has been withdrawn
2021-02-17 | 18W | Application withdrawn | Effective date: 2021-01-11