Patent abstract:

Publication No.: BR112016017206B1
Application No.: R112016017206
Filing date: 2015-01-23
Publication date: 2018-08-28
Inventor: Chen Lei
Applicant: Huawei Device Co., Ltd.
Primary IPC:
Patent description:
(54) Title: INFORMATION PROCESSING METHOD, APPARATUS AND DEVICE. (51) Int. Cl.: G06F 3/0484 (30) Convention Priority: 26/01/2014 CN 201410038752.7 (73) Holder(s): HUAWEI DEVICE CO., LTD.
(72) Inventor(s): CHEN LEI (85) National Phase Start Date: 25/07/2016
INFORMATION PROCESSING METHOD, APPARATUS AND DEVICE
TECHNICAL FIELD [001] The present invention relates to the field of information processing technologies and, more specifically, to an information processing method, apparatus and device.
BACKGROUND [002] In recent years, with the development of science and technology, smart devices have been increasingly and widely applied. To satisfy increasing user demands for richer functions and more diverse forms of smart devices, the input manners of smart devices have become smarter and richer.
[003] In the prior art, to make it convenient for a user to interact with a smart device, a gesture control bracelet has been developed. When in use, the gesture control bracelet is worn around the user's wrist, and implements human-computer interaction by detecting a change in bioelectricity generated by the arm or wrist muscles when the user moves, in combination with monitoring a physical action of the arm. The gesture control bracelet can trigger an operation by using a specific action. For example, an action in which the user makes a fist and then shakes the wrist up and down three times can be set as the action corresponding to enabling a device.
[004] However, there are many operations that can be performed on the smart device. If the user wants to operate the smart device by using the gesture control bracelet in the prior art, the user needs to memorize many correspondences between gesture movements and operations, which makes it extremely inconvenient to use the smart device.
SUMMARY [005] In view of this, the present invention provides an information processing method, apparatus and device, in order to solve a prior-art problem in which, because a specific trigger operation needs to correspond to a specific gesture movement, a user needs to memorize multiple correspondences between gesture movements and operations.
[006] To achieve the foregoing objective, the present invention provides the following technical solutions:
[007] According to a first aspect, the present application discloses an information processing method, which includes:
acquiring mode enable gesture information of a user;
enabling an input mode corresponding to the mode enable gesture information, wherein the input mode includes a keyboard input mode and a mouse input mode;
acquiring gesture information of the user in the input mode, wherein the gesture information includes tap gesture information and/or slide gesture information; and
generating, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information.
[008] In a first possible implementation of the first aspect, when the input mode is the keyboard input mode, there is a one-to-one correspondence between 12 finger joints or finger phalanges of an index finger, a middle finger, a ring finger and a little finger of a user and 12 keys of a 12-key mini keyboard;
the acquiring gesture information of the user in the input mode includes: acquiring, in the keyboard input mode, tap gesture information of the user's tapping any finger joint or phalanx of the index finger, the middle finger, the ring finger and the little finger; and correspondingly, the generating, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information includes:
generating, according to a preset correspondence between tap gesture information and an operation instruction, an operation instruction corresponding to the tap gesture information.
[009] In a second possible implementation of the first aspect, when the input mode is the mouse input mode, the acquiring gesture information of the user in the input mode includes:
acquiring, in the mouse input mode, slide gesture information of the user's sliding on an index finger, a middle finger, a ring finger and a little finger and/or tap gesture information of tapping a finger joint or finger phalanx; and correspondingly, the generating, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information includes:
generating, according to a preset correspondence between slide gesture information and/or tap gesture information and an operation instruction, an operation instruction corresponding to the slide gesture information and/or the tap gesture information; wherein the slide gesture information corresponds to movement tracking information of a mouse pointer, and the tap gesture information corresponds to trigger information of a left or right mouse button.
[010] In a third possible implementation of the first aspect, the enabling an input mode corresponding to the mode enable gesture information includes:
enabling the keyboard input mode when the mode enable gesture information corresponds to a gesture of extending an index finger, a middle finger, a ring finger and a little finger of the user; and enabling the mouse input mode when the mode enable gesture information corresponds to a gesture of shrinking the index finger, the middle finger, the ring finger and the little finger toward the user's palm;
wherein the mode enable gesture information or the gesture information is detected by a sensor disposed on a wrist or palm of the user.
[011] In a fourth possible implementation of the first aspect, the acquiring mode enable gesture information of a user or the acquiring gesture information of the user in the input mode includes:
acquiring a pressure value of each part of the wrist or palm when the sensor disposed on the wrist or palm of the user detects an input-mode enable gesture movement or a gesture movement of the user;
determining a displacement value of the sensor on the user's wrist or palm according to the pressure value; and determining, according to the displacement value, the mode enable gesture information or the gesture information entered by the user.
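The acquire-pressure, determine-displacement, determine-gesture sequence above can be sketched in code. This is an illustrative sketch only, not an implementation from the patent: the function names, the linear pressure-to-displacement model and the template-matching tolerance are all assumptions made for demonstration.

```python
# Hypothetical sketch: a ring of pressure sensors on the wrist or palm reports
# one pressure value per part; displacement is modeled as proportional to the
# change from a baseline, and the resulting displacement pattern is matched
# against preset gesture templates. Names and constants are assumptions.

def displacement_from_pressure(pressures, baseline, stiffness=0.5):
    """Estimate per-sensor displacement from the change in pressure."""
    return [(p - b) * stiffness for p, b in zip(pressures, baseline)]

def classify_gesture(displacement, templates, tolerance=0.2):
    """Return the gesture whose preset displacement template fits best."""
    best, best_err = None, float("inf")
    for name, template in templates.items():
        err = sum((d - t) ** 2 for d, t in zip(displacement, template))
        if err < best_err:
            best, best_err = name, err
    return best if best_err <= tolerance else None
```

A real implementation would calibrate the baseline per user and use a more robust classifier; the sketch merely illustrates the order of the steps: pressure value, then displacement value, then gesture information.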
[012] In any of the foregoing implementations of the first aspect, after the generating, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information, the method further includes:
sending the operation instruction to a terminal, so that the terminal responds to the operation instruction.
[013] According to a second aspect, an information processing method is disclosed, which includes:
determining that a keyboard input mode is entered, wherein, in the keyboard input mode, there is a one-to-one correspondence between 12 finger joints or finger phalanges of an index finger, a middle finger, a ring finger and a little finger of a user and 12 keys of a 12-key mini keyboard;
acquiring, in the keyboard input mode, tap gesture information of the user's tapping any finger joint or phalanx of the index finger, the middle finger, the ring finger and the little finger; and generating, according to a preset correspondence between tap gesture information and an operation instruction, an operation instruction corresponding to the tap gesture information.
[014] According to a third aspect, an information processing method is disclosed, which includes:
determining that a mouse input mode is entered;
acquiring slide gesture information of the user's sliding on an index finger, a middle finger, a ring finger and a little finger and/or tap gesture information of tapping a finger joint or a finger phalanx of the index finger, the middle finger, the ring finger and the little finger; and generating, according to a preset correspondence between slide gesture information and/or tap gesture information and an operation instruction, an operation instruction corresponding to the slide gesture information and/or the tap gesture information; wherein the slide gesture information corresponds to movement tracking information of a mouse pointer, and the tap gesture information corresponds to trigger information of a left or right mouse button.
[015] According to a fourth aspect, an information processing apparatus is disclosed, which includes:
a first acquiring module, configured to acquire mode enable gesture information of a user;
a mode enabling module, configured to enable an input mode corresponding to the mode enable gesture information, wherein the input mode includes a keyboard input mode and a mouse input mode;
a second acquiring module, configured to acquire gesture information of the user in the input mode, wherein the gesture information includes tap gesture information and/or slide gesture information; and an instruction generating module, configured to generate, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information.
[016] In a first possible implementation of the fourth aspect, when the input mode is the keyboard input mode, there is a one-to-one correspondence between 12 finger joints or finger phalanges of an index finger, a middle finger, a ring finger and a little finger of a user and 12 keys of a 12-key mini keyboard; and the second acquiring module is specifically configured to:
acquire, in the keyboard input mode, tap gesture information of the user's tapping any finger joint or finger phalanx of the index finger, the middle finger, the ring finger and the little finger.
[017] In a second possible implementation of the fourth aspect, when the input mode is the mouse input mode, the second acquiring module is specifically configured to:
acquire, in the mouse input mode, slide gesture information of the user's sliding on an index finger, a middle finger, a ring finger and a little finger and/or tap gesture information of tapping a finger joint or a finger phalanx; wherein the slide gesture information corresponds to movement tracking information of a mouse pointer, and the tap gesture information corresponds to trigger information of a left or right mouse button.
[018] In a third possible implementation of the fourth aspect, the mode enabling module includes:
a first mode enabling module, configured to enable the keyboard input mode when the mode enable gesture information corresponds to a gesture of extending an index finger, a middle finger, a ring finger and a little finger of the user; and a second mode enabling module, configured to enable the mouse input mode when the mode enable gesture information corresponds to a gesture of shrinking the index finger, the middle finger, the ring finger and the little finger toward the user's palm.
[019] In a fourth possible implementation of the fourth aspect, the mode enable gesture information or the gesture information is detected by a sensor disposed on a wrist or palm of the user.
[020] In a fifth possible implementation of the fourth aspect, the first acquiring module or the second acquiring module includes:
an acquiring submodule, configured to acquire a pressure value of each part of the wrist or palm when the sensor disposed on the wrist or palm of the user detects an input-mode enable gesture movement or a gesture movement of the user;
a calculating module, configured to determine a displacement value of the sensor on the user's wrist or palm according to the pressure value; and a gesture determining module, configured to determine, according to the displacement value, the mode enable gesture information or the gesture information entered by the user.
[021] In any of the foregoing possible implementations of the fourth aspect, the apparatus further includes:
an instruction transfer module, configured to send the operation instruction to a terminal, so that the terminal responds to the operation instruction.
[022] According to a fifth aspect, an information processing apparatus is disclosed, which includes:
a first input enabling module, configured to determine that a keyboard input mode is entered, wherein, in the keyboard input mode, there is a one-to-one correspondence between 12 finger joints or finger phalanges of an index finger, a middle finger, a ring finger and a little finger of a user and 12 keys of a 12-key mini keyboard;
a first gesture acquiring module, configured to acquire tap gesture information of the user's tapping any finger joint or finger phalanx of the index finger, the middle finger, the ring finger and the little finger; and a first instruction generating module, configured to generate, according to a preset correspondence between tap gesture information and an operation instruction, an operation instruction corresponding to the tap gesture information.
[023] According to a sixth aspect, an information processing apparatus is disclosed, which includes:
a second input enabling module, configured to determine that a mouse input mode is entered;
a second gesture acquiring module, configured to acquire slide gesture information of the user's sliding on an index finger, a middle finger, a ring finger and a little finger and/or tap gesture information of tapping a finger joint or a finger phalanx of the index finger, the middle finger, the ring finger and the little finger; and a second instruction generating module, configured to generate, according to a preset correspondence between slide gesture information and/or tap gesture information and an operation instruction, an operation instruction corresponding to the slide gesture information and/or the tap gesture information.
[024] According to a seventh aspect, an information processing device is disclosed, wherein the information processing device includes any of the foregoing information processing apparatuses.
[025] According to an eighth aspect, an intelligent terminal is disclosed, wherein the intelligent terminal includes any of the foregoing information processing apparatuses.
[026] It can be learned from the foregoing technical solutions that, compared with the prior art, the embodiments of the present invention disclose an information processing method, apparatus and device. In the information processing method, mode enable gesture information entered by a user is first acquired; then, a corresponding input mode is entered according to the mode enable gesture information entered by the user, and gesture information of the user is identified in the given input mode; further, a user intention is identified according to a preset correspondence between gesture information and an operation instruction. The foregoing input mode includes a keyboard input mode and a mouse input mode. In this way, the user can conveniently control a terminal in a familiar input-mode environment by using only a customary input operation manner, for example, a tap operation or a tap-and-slide operation. With this method, apparatus and device, the user does not need to memorize multiple correspondences between specific gesture movements and specific operations. Merely by presetting, in an execution body of the information processing method, a correspondence between the user's customary basic input operations and standard keyboard and/or mouse operation events, the objective of controlling the terminal in a customary operation manner can be achieved.
BRIEF DESCRIPTION OF THE DRAWINGS [027] To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the following briefly introduces the accompanying drawings required for describing the embodiments or the prior art. Apparently, the accompanying drawings in the following description merely show some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
[028] Figure 1 is a flowchart of an information processing method according to an embodiment of the present invention.
[029] Figure 2 is a schematic diagram of a correspondence between the 12 keys of a 12-key mini keyboard and 12 finger joints or finger phalanges according to an embodiment of the present invention.
[030] Figure 3 is a flowchart of acquiring mode enable gesture information or gesture information of a user according to an embodiment of the present invention.
[031] Figure 4 is a schematic diagram of a location of a sensor disposed on a bracelet according to an embodiment of the present invention.
[032] Figure 5 is a flowchart of another information processing method according to an embodiment of the present invention.
[033] Figure 6 is a flowchart of a third information processing method according to an embodiment of the present invention.
[034] Figure 7 is a flowchart of a fourth information processing method according to an embodiment of the present invention.
[035] Figure 8 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present invention.
[036] Figure 9 is a schematic structural diagram of a first acquiring module according to an embodiment of the present invention.
[037] Figure 10 is a schematic structural diagram of another information processing apparatus according to an embodiment of the present invention.
[038] Figure 11 is a schematic structural diagram of a third information processing apparatus according to an embodiment of the present invention.
[039] Figure 12 is a schematic structural diagram of a fourth information processing apparatus according to an embodiment of the present invention.
[040] Figure 13 is a schematic diagram of communication between an information processing apparatus and a terminal according to an embodiment of the present invention.
[041] Figure 14 is a schematic structural diagram of an information processing device according to an embodiment of the present invention.
[042] Figure 15 is a schematic structural diagram of another information processing device according to an embodiment of the present invention.
DESCRIPTION OF EMBODIMENTS [043] The following clearly and completely describes the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts shall fall within the protection scope of the present invention.
[044] Figure 1 is a flowchart of an information processing method according to an embodiment of the present invention. As shown in Figure 1, the method may include:
Step 101: Acquire mode enable gesture information of a user.
[045] In this embodiment of the present invention, there may be multiple input modes for the user, and each input mode corresponds to a mode enable gesture. In this way, an execution body of the information processing method disclosed in this embodiment of the present invention can automatically identify, according to the mode enable gesture information entered by the user, the input mode currently desired by the user, which facilitates use for the user.
Step 102: Enable an input mode corresponding to the mode enable gesture information.
[046] The input mode may include a keyboard input mode and a mouse input mode. The keyboard input mode described herein indicates that, in this mode, the user can directly trigger a character on a known keyboard by tapping a part of a finger, where the character may be a digit, an English letter, or a symbol with a fixed function, for example, a # key. Certainly, a correspondence between tap gesture information of a part of a finger tapped by the user and each known keyboard key needs to be preset. The foregoing mouse input mode indicates that, in this mode, the user can perform a corresponding operation on a finger or palm as if operating a mouse, where the operation may include a slide operation and a tap operation. For example, the user's thumb slides to the left on a finger or palm, which corresponds to a leftward movement of a mouse pointer; the user taps a part of a finger or palm, which corresponds to a click on a left mouse button. Certainly, in this case, a correspondence between the user's gesture information and a standard mouse operation event also needs to be preset.
Step 103: Acquire gesture information of the user in the input mode.
[047] The gesture information may include tap gesture information and/or slide gesture information.
[048] It should be noted that, for the same gesture information, different processing and response results may be obtained in different input modes, and a piece of gesture information may be processed and responded to in one input mode but not in another input mode. As described above, in the keyboard input mode, a character that the user wants to trigger can be determined by acquiring tap gesture information of the user's tapping a part of the palm. In this mode, if the user enters slide gesture information, a system will not process or respond to the slide gesture information. However, if the user enters slide gesture information in the mouse input mode, the slide gesture information will be processed and responded to, because the slide gesture information may correspond to a mouse pointer operation. In addition, such a correspondence accords with a user habit, in the prior art, of performing an input operation on a computer interface by using a mouse or performing an input operation directly on a touchscreen interface. For example, the user can trigger corresponding page-turning operations up, down, left and right on the touchscreen by moving the thumb up, down, left and right on the palm.
Step 104: Generate, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information.
[049] A correspondence between gesture information and an operation instruction is preset in a system program, which can be implemented by using the prior art. However, in this embodiment of the present invention, multiple input modes are introduced, and there is a separate correspondence between gesture information and an operation instruction in each input mode, because, as described above, different processing and response results may be obtained in different input modes for the same gesture information.
[050] In addition, the operation instruction generated according to the user's gesture information may be transferred to a terminal, so that the terminal performs a response operation according to the operation instruction.
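Steps 101 to 104 can be summarized in a small dispatcher. This is a hedged sketch, not the patent's implementation: the mode names, gesture identifiers and correspondence tables are hypothetical placeholders introduced only to show the per-mode lookup structure.

```python
# Hypothetical sketch of steps 101-104: a mode enable gesture selects the
# input mode, and each mode has its own preset gesture-to-instruction
# correspondence. A gesture with no entry in the current mode is ignored,
# matching the behavior described for slide gestures in keyboard mode.

MODE_ENABLE = {
    "extend_four_fingers": "keyboard",  # assumed identifier
    "shrink_four_fingers": "mouse",     # assumed identifier
}

CORRESPONDENCE = {
    "keyboard": {"tap_index_phalanx_1": "key_1"},
    "mouse": {"slide_left": "pointer_left", "tap_fingertip": "left_click"},
}

class GestureProcessor:
    def __init__(self):
        self.mode = None

    def enable_mode(self, enable_gesture):
        """Steps 101-102: enable the input mode matching the enable gesture."""
        self.mode = MODE_ENABLE.get(enable_gesture)
        return self.mode

    def generate_instruction(self, gesture_info):
        """Steps 103-104: map gesture information to an operation instruction."""
        if self.mode is None:
            return None
        return CORRESPONDENCE[self.mode].get(gesture_info)
```

The generated instruction would then be sent to the terminal, which performs the corresponding response operation.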
[051] In the information processing method in this embodiment, mode enable gesture information entered by a user is first acquired; then, a corresponding input mode is entered according to the mode enable gesture information entered by the user, and gesture information of the user is identified in the given input mode; furthermore, a user intention is identified according to a preset correspondence between gesture information and an operation instruction. The foregoing input mode includes a keyboard input mode and a mouse input mode. In this way, the user can conveniently control a terminal in a familiar input-mode environment by using only a customary input operation manner, for example, a tap operation or a tap-and-slide operation. With this method, the user does not need to memorize multiple correspondences between specific gesture movements and specific operations. Merely by presetting, in an execution body of the information processing method, a correspondence between the user's customary basic input operations and standard keyboard and mouse operation events, the objective of properly executing an input operation and controlling the terminal in a customary operation manner can be achieved.
[052] In the foregoing embodiment, when the mode enable gesture information acquired by a system is an enable gesture corresponding to the keyboard input mode, the system enables the keyboard input mode. In this mode, there may be a one-to-one correspondence between 12 finger joints or finger phalanges of an index finger, a middle finger, a ring finger and a little finger of a user and 12 keys of a 12-key mini keyboard. It is common knowledge that the 12-key mini keyboard includes keys 1, 2 (abc), 3 (def), 4 (ghi), 5 (jkl), 6 (mno), 7 (pqrs), 8 (tuv), 9 (wxyz), *, 0 and #. In addition, in a hand structure, each of the index finger, the middle finger, the ring finger and the little finger includes three finger joints and three finger phalanges, and the four fingers together include 12 finger joints and 12 finger phalanges. Moreover, the 4 x 3 arrangement of the 12 keys of the 12-key mini keyboard is the same as the 4 x 3 arrangement of the 12 finger joints or 12 phalanges of the index finger, the middle finger, the ring finger and the little finger in the hand structure. Reference may be made to Figure 2, which is a schematic diagram of a correspondence between the 12 keys of a 12-key mini keyboard and the 12 finger joints or finger phalanges according to an embodiment of the present invention. A first phalanx of the index finger may correspond to the key 1, a second phalanx of the index finger may correspond to the key 2, ..., and a third phalanx of the little finger may correspond to the key #.
[053] Certainly, the correspondence between keys in an input mode and the 12 finger joints or finger phalanges is not fixedly limited to the foregoing manner. It may also be set that the three finger joints of the little finger correspond to the 1 key, the 2 key and the 3 key respectively, and the three finger joints of the index finger correspond to the * key, the 0 key and the # key respectively. A specific correspondence may be set according to user preferences and habits.
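The default correspondence described above (first phalanx of the index finger to the 1 key, through third phalanx of the little finger to the # key) can be expressed as a simple lookup table. A minimal sketch; the (finger, phalanx) identifiers are hypothetical names chosen for illustration.

```python
# The 4 x 3 layout of a 12-key mini keyboard mapped onto the 12 phalanges of
# the index, middle, ring and little fingers, as in Figure 2. Row = finger,
# column = phalanx; identifiers are illustrative assumptions.

FINGERS = ["index", "middle", "ring", "little"]
KEYPAD = ["1", "2", "3", "4", "5", "6", "7", "8", "9", "*", "0", "#"]

PHALANX_TO_KEY = {
    (finger, phalanx): KEYPAD[row * 3 + phalanx - 1]
    for row, finger in enumerate(FINGERS)
    for phalanx in (1, 2, 3)
}

def key_for_tap(finger, phalanx):
    """Return the key triggered by tapping the given finger phalanx."""
    return PHALANX_TO_KEY[(finger, phalanx)]
```

As the text notes, the table can equally be regenerated with the rows reversed or otherwise rearranged to suit a user's preferences and habits.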
[054] Thus, in the foregoing embodiment, step 103 of acquiring gesture information of the user in the input mode may include: acquiring, in the keyboard input mode, tap gesture information of the user's tapping any finger joint or phalanx of the index finger, the middle finger, the ring finger and the little finger.
[055] Correspondingly, step 104 of generating, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information may include: generating, according to a preset correspondence between tap gesture information and an operation instruction, an operation instruction corresponding to the tap gesture information.
[056] In this embodiment, the keys of a 12-key mini keyboard are in one-to-one correspondence with 12 finger joints or finger phalanges, and such a correspondence is preset in a system. Because almost all users are extremely familiar with the method of using the 12-key mini keyboard, and can also accurately locate the multiple finger joints or finger phalanges, users can conveniently and quickly implement input without needing to memorize a correspondence between each key of a 12-key mini keyboard and a fixed gesture movement.
[057] In another embodiment, when the mode enable gesture information acquired by a system is a mode enable gesture corresponding to the mouse input mode, the system enables the mouse input mode. In this mode, step 103 of acquiring gesture information of the user in the input mode may include: acquiring, in the mouse input mode, slide gesture information of the user's sliding on an index finger, a middle finger, a ring finger and a little finger and/or tap gesture information of tapping a finger joint or finger phalanx.
[058] Correspondingly, step 104 of generating, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information may include: generating, according to a preset correspondence between slide gesture information and/or tap gesture information and an operation instruction, an operation instruction corresponding to the slide gesture information and/or the tap gesture information.
[059] In the mouse input mode, it may be preset that the user's slide gesture information corresponds to movement tracking information of a mouse pointer, and the user's tap gesture information corresponds to trigger information of a left or right mouse button. Specifically, the user may shrink the four fingers other than the thumb to form a panel. The thumb may move on the panel to simulate movement of a mouse on a display screen. Certainly, a correspondence between the user's gesture information and a standard mouse operation event needs to be preset. In consideration of a user's habit of using a PAD and of performing an operation by using a mouse, slide gesture information of the user's sliding to the left may directly correspond to a leftward movement of the mouse pointer, slide gesture information of the user's sliding upward may directly correspond to an upward movement of the mouse pointer, and so on. In this way, the user's thumb is equivalent to the mouse on the panel formed by the four fingers, and the panel is equivalent to the display screen. Gesture information of the user's thumb on the panel may also correspond to a user operation event of a physical touchscreen operation, so that the user can experience a touchscreen operation by using the panel formed by the four fingers.
[060] In a real scenario, the mouse includes a left button and a right button. To facilitate use for the user, in this embodiment, in the mouse input mode, a tap gesture movement of the user's tapping any fingertip of the index finger, the middle finger, the ring finger and the little finger may be defined as a left mouse button trigger operation, and a tap gesture movement of the user's tapping any finger root of the index finger, the middle finger, the ring finger and the little finger may be defined as a right mouse button trigger operation. Alternatively, a tap gesture movement of the user's tapping the fingertip of the index finger may be defined as a left mouse button trigger operation, a tap gesture movement of the user's tapping the fingertip of the middle finger may be defined as a right mouse button trigger operation, or the like, which is not limited herein.
[061] In this embodiment, several pieces of fixed gesture information of the user correspond to several basic mouse operations, so that the user directly implements an input operation by performing, on the palm, a gesture movement of a familiar mouse operation or touchscreen operation, which is convenient and efficient, and the user does not need to memorize multiple correspondences between gesture movements and operations.
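The fingertip/finger-root convention above can be sketched as a mapping from gesture information to standard mouse operation events. This is an assumption-laden illustration, not the patent's implementation: the event names and the pointer step size are invented for the example.

```python
# Hypothetical mouse-mode correspondence: thumb slides on the panel formed by
# the four shrunk fingers move the pointer; fingertip taps trigger the left
# button and finger-root taps the right button, as described above.

POINTER_STEP = 10  # assumed pointer movement per detected slide, in pixels

SLIDES = {
    "slide_left": (-POINTER_STEP, 0),
    "slide_right": (POINTER_STEP, 0),
    "slide_up": (0, -POINTER_STEP),
    "slide_down": (0, POINTER_STEP),
}

def mouse_event(gesture_info):
    """Map gesture information to a standard mouse operation event, or None."""
    if gesture_info in SLIDES:
        return ("pointer_move", SLIDES[gesture_info])
    if gesture_info == "tap_fingertip":    # any fingertip: left button
        return ("button_down", "left")
    if gesture_info == "tap_finger_root":  # any finger root: right button
        return ("button_down", "right")
    return None  # not processed or responded to in mouse mode
```

In practice a continuous sensor would report displacement vectors rather than discrete slide tokens; the discrete form only shows how the preset correspondence table is consulted.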
[062] It can be understood that, in the preceding embodiment, the enable gestures corresponding to different input modes are not fixedly limited. To facilitate use for a user, in this embodiment, step 102 of the first embodiment, enabling an input mode corresponding to the mode enable gesture information, is specifically: enabling the keyboard input mode when the mode enable gesture information corresponds to a gesture of extending the index finger, middle finger, ring finger and little finger of the user; and enabling the mouse input mode when the mode enable gesture information corresponds to a gesture of curling the index finger, middle finger, ring finger and little finger toward the user's palm.
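The mode enable rule just stated can be expressed as a short sketch. The boolean-flag representation of the fingers is an assumption for illustration only:

```python
# Sketch of the enable rule: four extended fingers -> keyboard input mode,
# four fingers curled toward the palm -> mouse input mode.
KEYBOARD_MODE = "keyboard"
MOUSE_MODE = "mouse"

def classify_enable_gesture(fingers_extended):
    """fingers_extended maps finger name -> True (extended) / False (curled)."""
    four = [fingers_extended[f] for f in ("index", "middle", "ring", "little")]
    if all(four):
        return KEYBOARD_MODE
    if not any(four):
        return MOUSE_MODE
    return None  # other gestures enable no input mode in this sketch
```

Any mixed gesture falls through without enabling a mode, consistent with the text's statement that the enabling gestures are not fixedly limited.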
[063] Because the user may unconsciously spread the fingers when tapping the finger joints or finger phalanges of the index finger, middle finger, ring finger and little finger, the gesture of extending the index finger, middle finger, ring finger and little finger of the user corresponds to the keyboard input mode; the gesture of curling the index finger, middle finger, ring finger and little finger toward the user's palm corresponds to the mouse input mode, and the panel formed by the four curled fingers can be used to simulate a display screen, which is relatively close to a user's operating habit. Certainly, the gestures enabling the keyboard input mode and the mouse input mode are not fixedly limited.
[064] In the preceding embodiment, the mode enable gesture information or the gesture information can be detected by a sensor placed on the user's wrist or palm.
[065] It should be noted that the sensor placed on the user's wrist or palm can detect a mode enable gesture movement or a gesture movement of the hand wearing the sensor.
[066] Figure 3 is a flowchart of acquiring mode enable gesture information or gesture information of a user according to an embodiment of the present invention. As shown in Figure 3, a specific process of acquiring mode enable gesture information of a user or acquiring user gesture information can include:
Step 301: acquire a pressure value of each part of the wrist or palm when the sensor placed on the user's wrist or palm detects an input mode enable gesture movement or a user gesture movement.
[067] The sensor placed on the user's wrist can be disposed on a bracelet, and the sensor placed on the user's palm can be disposed on a glove or half glove. If the sensor is placed on the user's wrist, the user's gesture movements can be determined according to the different pressures caused by muscle changes and movement changes of each part of the wrist when the user makes different gestures. If the sensor is placed on the user's palm, the user's gesture movements can likewise be determined according to the different pressures caused by muscle changes and movement changes of each part of the palm. If the sensor is placed on a user's finger, the sensor on the finger can directly detect touch information of the user's thumb, to further determine the user's gesture information. It should be noted that, regardless of where the sensor is disposed on the user's hand, the sensor can be used not only to detect information related to a mode enable gesture, but also to detect information related to a gesture when the user performs a sliding operation or a tapping operation on the palm or a finger, after the system enters the keyboard input mode or the mouse input mode.

Step 302: determine a displacement value of the sensor on the user's wrist or palm according to the pressure value.
[068] Generally, a larger pressure change indicates a larger displacement value. After the user puts on a device with a sensor, and before the user makes any gesture movement, the contact pressure between the device and the skin of the user's hand has a basically stable initial value. When the user makes a gesture movement, the pressure value at a part of the hand may become higher or lower, and the displacement of the hand muscles accordingly has different vector directions. A higher pressure indicates a larger displacement of the sensor from its original location. In this embodiment, multiple sensors can be disposed on the user's wrist or palm, and the sensors can be disposed at different parts as required, in order to improve the accuracy of a detection result.
Step 303: determine, according to the displacement value, the mode enable gesture information entered by the user.
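Steps 301 to 303 can be sketched as a small pipeline: pressure changes are converted into per-sensor displacements, and the gesture is identified as the stored pattern closest to those displacements. The linear pressure-to-displacement model, the constant K, and the template values are assumptions for illustration; the original text specifies no concrete model.

```python
# Sketch of steps 301-303: pressure change -> displacement -> gesture.
K = 0.5  # assumed proportionality between pressure change and displacement

def displacements(pressures, baseline):
    """Larger pressure change -> larger (signed) displacement per sensor."""
    return [K * (p - b) for p, b in zip(pressures, baseline)]

def classify_gesture(disp, templates):
    """Return the template name with the smallest squared distance."""
    def dist(t):
        return sum((d - v) ** 2 for d, v in zip(disp, t))
    return min(templates, key=lambda name: dist(templates[name]))

baseline = [10.0, 10.0, 10.0]          # stable initial contact pressures
templates = {                           # illustrative displacement patterns
    "tap_index_tip": [1.0, 0.0, 0.0],
    "tap_middle_tip": [0.0, 1.0, 0.0],
}
gesture = classify_gesture(displacements([12.0, 10.0, 10.0], baseline), templates)
```

A nearest-template rule is only one plausible realization; Solution 1 below (a comprehensive database of detected data versus gestures) corresponds to enlarging the `templates` table.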
[069] Figure 4 is a schematic diagram of locations of sensors disposed on a bracelet according to an embodiment of the present invention. As shown in Figure 4, a group of pressure sensors can be disposed on the bracelet; when a user gesture changes, each sensor can obtain, according to a measured pressure variation value, a deviation of the sensor's location from its initial location (that is, the sensor location when the user makes no movement and the hand is in a natural state). In this way, as long as a muscle group of the user's wrist changes because of a finger movement, the user's gesture movement can finally be determined according to the data detected by the sensors. The muscle group closest to a finger changes most obviously; therefore, the sensors on the bracelet can be disposed at locations close to the fingers.
[070] In a real situation, the location where the user wears the bracelet may differ each time. Therefore, the data detected by the sensors on the bracelet also differs each time, thereby affecting the accuracy of a detection result. In this embodiment, to address the impact of the preceding problem on the accuracy of the detection result, the following provides two solutions.
Solution 1: build a database that is as comprehensive as possible for storing the correspondence between the data detected by a sensor and a gesture movement when the bracelet is at different locations.
Solution 2: dispose multiple groups of sensors on the bracelet. When using the bracelet for the first time, the user can use the data detected by the group of sensors located in the middle of the bracelet and complete a related calibration job, in order to ensure that this group of sensors can identify the user's gesture movements accurately. When the user subsequently wears the bracelet, its location may differ from the previous location. In this case, each group of sensors on the bracelet can directly determine the current bracelet location according to related displacement data (different locations on the wrist have different diameters, and therefore the cross-sectional shape formed by a current group of sensors can be obtained by calculation according to the displacement data detected by the sensors, in order to estimate the location of the bracelet on the wrist), and calculate the location difference between the current sensor location and the sensor location when the bracelet was used for the first time, in order to complete an offset correction. Alternatively, in a bracelet wearing process, the group of sensors closest to the group that was located in the middle of the bracelet when the bracelet was first used is automatically found according to the current location information corresponding to each sensor group, and the data detected by the found group of sensors is used.
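The alternative at the end of Solution 2 — automatically selecting the sensor group closest to the first-use middle group — can be sketched as a nearest-match search. The reading vectors and group names are illustrative assumptions:

```python
# Sketch of Solution 2's group selection: pick the sensor group whose
# current readings best match the reference recorded for the middle group
# at first use, and use that group's data afterwards.
def closest_group(reference, groups):
    """groups maps group name -> current reading vector; return best match."""
    def dist(readings):
        return sum((r - q) ** 2 for r, q in zip(reference, readings))
    return min(groups, key=lambda name: dist(groups[name]))

reference = [5.0, 7.0, 6.0]          # middle group, recorded at first use
groups = {
    "upper": [9.0, 2.0, 4.0],
    "middle": [5.2, 6.9, 6.1],
    "lower": [1.0, 8.0, 9.0],
}
best = closest_group(reference, groups)
```

The squared-distance criterion is an assumption; any measure of how well a group's readings reproduce the calibrated reference would serve the same purpose.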
[071] A calibration process can be as follows: 1. A user wears a bracelet. 2. The bracelet establishes communication with a terminal. 3. A calibration mode is enabled. 4. A calibration interface is displayed on the terminal display screen. 5. The bracelet instructs the terminal to display the key 1. 6. The user uses the thumb of the hand wearing the bracelet to touch the first finger joint of the index finger, which indicates that the key 1 is pressed, and the touch continues until the terminal completes the identification. 7. The current location information of the sensors on the bracelet is recorded on the bracelet, and this location information is used as a reference to complete the calibration of the key 1. For the calibration of the other keys, refer to the preceding calibration process for the key 1.
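The seven calibration steps above amount to recording a reference sensor reading per key, then identifying later taps against those references. The reading vectors below are illustrative assumptions:

```python
# Sketch of the calibration flow: while the terminal displays a key, the
# user holds the corresponding finger joint and the current sensor reading
# is stored as that key's reference; at run time a tap is identified as
# the key with the nearest stored reference.
calibrated = {}

def calibrate_key(key, sensor_reading):
    """Step 7: record the current sensor reading as the key's reference."""
    calibrated[key] = list(sensor_reading)

def identify_key(sensor_reading):
    """Match a run-time tap against the calibrated references."""
    def dist(ref):
        return sum((a - b) ** 2 for a, b in zip(sensor_reading, ref))
    return min(calibrated, key=lambda k: dist(calibrated[k]))

calibrate_key("1", [0.9, 0.1])
calibrate_key("2", [0.1, 0.8])
key = identify_key([0.85, 0.15])
```

Repeating steps 5 to 7 for each of the 12 keys populates the reference table.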
[072] Certainly, the preceding method for acquiring the user's mode enable gesture information is also applicable to acquiring other gesture information of the user.
[073] This embodiment provides a specific method for acquiring mode enable gesture information, but it is not the only method for acquiring a user's gesture information. For example, a user's gesture information can also be acquired by using a method that identifies the gesture information by means of bioelectricity, or by a method in which a contact sensor is installed at a finger location of a glove. It should be noted that, regardless of where a sensor is disposed on the user's hand, the sensor can be used not only to detect information related to a mode enable gesture, but also to detect information related to a gesture when the user performs a sliding operation or a tapping operation on a palm or a finger, after the system enters the keyboard input mode or the mouse input mode.
[074] Figure 5 is a flowchart of another information processing method according to an embodiment of the present invention. As shown in Figure 5, the information processing method can include:
Step 501: acquire a mode enable gesture information from a user.
Step 502: enable an input mode corresponding to the mode enable gesture information.
[075] The input mode includes a keyboard input mode and a mouse input mode.
Step 503: acquire gesture information of the user in the input mode.

[076] The gesture information includes tapping gesture information and/or sliding gesture information.

Step 504: generate, according to a pre-regulated correspondence between gesture information and an operating instruction, an operating instruction corresponding to the gesture information.
Step 505: send the operating instruction to a terminal.
[077] The operating instruction is sent to the terminal in step 505, so that the terminal responds to the operating instruction, thereby implementing a human-computer interaction.
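Step 505 can be sketched as serializing the generated operating instruction into a small message for the link to the terminal. The message fields and the JSON transport are assumptions for illustration; the original text does not specify a wire format.

```python
# Hypothetical sketch of step 505: package an operating instruction for
# transmission to the terminal, and parse it on the terminal side.
import json

def encode_instruction(kind, payload):
    """Serialize an operating instruction for the link to the terminal."""
    return json.dumps({"type": kind, "data": payload}).encode("utf-8")

def decode_instruction(raw):
    """Terminal side: parse the received instruction so it can respond."""
    msg = json.loads(raw.decode("utf-8"))
    return msg["type"], msg["data"]

raw = encode_instruction("pointer_move", {"dx": 12, "dy": -4})
kind, data = decode_instruction(raw)
```

In a real product the link would be wireless or Bluetooth, as later paragraphs note; this sketch only shows the round trip of one instruction.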
[078] In this embodiment, a user can conveniently control a terminal by using a customary input operation method, for example, a tapping operation or a tapping-and-sliding operation. In this method, the user does not need to memorize multiple correspondences between specific gesture movements and specific operations. Simply by deploying in advance, in an execution body of the information processing method, a correspondence between the user's customary basic input operations and standard keyboard and mouse operation events, the purpose of conveniently executing input operations and controlling the terminal can be achieved.
[079] Figure 6 is a flow chart of a third information processing method according to an embodiment of the present invention. As shown in figure 6, the information processing method can include:
Step 601: confirm that a keyboard input mode is entered.
[080] In the keyboard input mode, there is a one-to-one correspondence between the 12 finger joints or finger phalanges of the index finger, middle finger, ring finger and little finger of a user and the 12 keys of a 12-key mini keyboard.
Step 602: acquire tapping gesture information of any finger joint or finger phalanx of the index finger, middle finger, ring finger or little finger by the user.
Step 603: generate, according to a pre-regulated correspondence between tapping gesture information and an operating instruction, an operating instruction corresponding to the tapping gesture information.
[081] In this embodiment, the keys of a 12-key mini keyboard respectively correspond one-to-one to the 12 finger joints or finger phalanges, and such a correspondence is pre-configured in a system. Because basically all users are extremely familiar with the method of using a 12-key mini keyboard, and can also accurately locate the various finger joints or finger phalanges, users can conveniently and quickly enter information, without needing to memorize a correspondence between each key of the 12-key mini keyboard and a fixed gesture movement.
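The one-to-one correspondence can be written down as a 4x3 lookup table. The particular key assignment below (a standard phone keypad layout) is an assumption; a later paragraph notes the specific sequence may follow user preference.

```python
# Sketch of the pre-configured correspondence between the 12 finger
# joints/phalanges (numbered 1-3 from fingertip side, per finger) and the
# 12 keys of a mini keyboard. The exact assignment is assumed.
KEYPAD = {
    ("index", 1): "1", ("index", 2): "2", ("index", 3): "3",
    ("middle", 1): "4", ("middle", 2): "5", ("middle", 3): "6",
    ("ring", 1): "7", ("ring", 2): "8", ("ring", 3): "9",
    ("little", 1): "*", ("little", 2): "0", ("little", 3): "#",
}

def key_for_tap(finger, phalanx):
    """Return the key triggered by tapping the given finger phalanx."""
    return KEYPAD[(finger, phalanx)]
```

Generating the operating instruction of step 603 then reduces to looking up the tapped joint in this table.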
[082] Figure 7 is a flow chart of a fourth information processing method according to an embodiment of the present invention. As shown in figure 7, the method can include:
Step 701: confirm that a mouse input mode is entered.
Step 702: acquire sliding gesture information of the user on the index finger, middle finger, ring finger and little finger and/or tapping gesture information of a finger joint or finger phalanx of the index finger, middle finger, ring finger or little finger by the user.
Step 703: generate, according to a pre-regulated correspondence between sliding gesture information and/or tapping gesture information and an operating instruction, an operating instruction corresponding to the sliding gesture information and/or the tapping gesture information.
[083] The sliding gesture information corresponds to movement tracking information of a mouse pointer, and the tapping gesture information corresponds to trigger information of a left or right mouse button.
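The mouse-mode rules in this paragraph can be sketched as a small event translator: a slide becomes pointer movement, a fingertip tap becomes a left-button trigger, and a finger-root tap becomes a right-button trigger. The gesture and event record formats are assumptions for illustration:

```python
# Sketch of the mouse input mode: translate a detected gesture into a
# mouse operation event, per the correspondences described in the text.
def to_mouse_event(gesture):
    """gesture is a dict describing a detected sliding or tapping gesture."""
    if gesture["kind"] == "slide":
        return {"event": "pointer_move", "dx": gesture["dx"], "dy": gesture["dy"]}
    if gesture["kind"] == "tap":
        # fingertip tap -> left button; finger-root tap -> right button
        button = "left" if gesture["part"] == "fingertip" else "right"
        return {"event": "button_click", "button": button}
    raise ValueError("unrecognized gesture")

left = to_mouse_event({"kind": "tap", "part": "fingertip"})
right = to_mouse_event({"kind": "tap", "part": "finger_root"})
```

Paragraph [060] earlier notes the fingertip/finger-root assignment is only one option; swapping the rule changes a single line here.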
[084] In this embodiment, several pieces of fixed gesture information of a user correspond to basic mouse operations, so that the user directly implements an input operation by performing, on a palm, a gesture movement in a familiar mouse-operation or touch-screen-operation manner, which is convenient and efficient; the user does not need to memorize multiple correspondences between gesture movements and operations.
[085] The foregoing embodiments of the present invention describe the methods in detail. The methods in the present invention can be implemented by apparatuses in multiple forms. Therefore, the present invention further discloses an apparatus, which is described in detail below by using a specific embodiment.
[086] Figure 8 is a schematic structural diagram of an information processing apparatus according to an embodiment of the present invention. As shown in Figure 8, an information processing apparatus 80 may include: a first acquisition module 801, a mode enable module 802, a second acquisition module 803 and an instruction generation module 804.
[087] The first acquisition module 801 is configured to acquire mode enable gesture information of a user.
[088] In this embodiment of the present invention, there can be multiple input modes for the user, and each input mode corresponds to a mode enable gesture.
[089] The mode enable module 802 is configured to enable an input mode corresponding to the mode enable gesture information.
[090] The input mode includes a keyboard input mode and a mouse input mode. The keyboard input mode described here means that, in this mode, the user can directly trigger a character on a known keyboard by tapping a part of a finger, where the character can be a digit, an English letter or a symbol with a fixed function. The preceding mouse input mode means that, in this mode, the user can perform an operation on a palm as if operating a mouse; such an operation can include a sliding operation and a tapping operation. Certainly, in the two preceding cases, a correspondence between the user's gesture information and a standard keyboard or mouse operation event needs to be pre-configured.
[091] The second acquisition module 803 is configured to acquire gesture information of the user in the input mode, where:

the gesture information includes tapping gesture information and/or sliding gesture information.
[092] It should be noted that, for the same gesture information, different processing and response results can be obtained in different input modes, and a piece of gesture information may be processed and responded to in one input mode but not processed or responded to in another input mode.
[093] The instruction generation module 804 is configured to generate, according to a pre-regulated correspondence between gesture information and an operating instruction, an operating instruction corresponding to the gesture information.
[094] A correspondence between gesture information and an operating instruction is deployed beforehand in a system program, which can be implemented by using the prior art. However, in this embodiment of the present invention, multiple input modes are introduced, and there is a separate correspondence between gesture information and an operating instruction in each input mode, because different processing and response results can be obtained in different input modes for the same gesture information, as described above.
[095] In this embodiment, the information processing apparatus can first acquire mode enable gesture information entered by a user; then enter a corresponding input mode according to the mode enable gesture information, and identify the user's intention according to a pre-regulated correspondence between gesture information and an operating instruction. The preceding input mode can include a keyboard input mode and a mouse input mode. In this way, the user can conveniently control a terminal in a familiar input mode environment by using only a customary input operation method, for example, a tapping operation or a tapping-and-sliding operation. The apparatus does not require the user to memorize multiple correspondences between specific gesture movements and specific operations. Simply by pre-deploying, in an execution body of the information processing method, a correspondence between the user's customary basic input operations and standard keyboard and mouse operation events, the purpose of the user conveniently executing input operations and controlling the terminal in a customary operation manner can be achieved.
[096] In the preceding embodiment, when the input mode is the keyboard input mode, there is a one-to-one correspondence between the 12 finger joints or finger phalanges of the index finger, middle finger, ring finger and little finger of a user and the 12 keys of a 12-key mini keyboard; the second acquisition module can be specifically configured to acquire, in the keyboard input mode, tapping gesture information of any finger joint or finger phalanx of the index finger, middle finger, ring finger or little finger by the user.
[097] It can be understood that the 4x3 arrangement of the 12 keys of the 12-key mini keyboard is the same as the 4x3 arrangement of the 12 finger joints or phalanges of the index finger, middle finger, ring finger and little finger of a hand, and therefore the 12 keys of the mini keyboard can be matched one-to-one to the 12 finger joints or finger phalanges. However, in this embodiment, the specific matching sequence is not limited and can be set according to the user's preference and habits.
[098] In this embodiment, the keys of a 12-key mini keyboard respectively correspond one-to-one to the finger joints or finger phalanges, and such a correspondence is pre-configured in a system. Because basically all users are extremely familiar with the method of using a 12-key mini keyboard, and can also accurately locate the various finger joints or finger phalanges, users can quickly and conveniently enter information, without needing to memorize a correspondence between each key of the 12-key mini keyboard and a fixed gesture movement.
[099] In the preceding embodiment, when the input mode is the mouse input mode, the second acquisition module can be specifically configured to acquire, in the mouse input mode, sliding gesture information of the user on the index finger, middle finger, ring finger and little finger and/or tapping gesture information of a finger joint or finger phalanx.
[100] In the mouse input mode, it can be pre-configured that the user's sliding gesture information corresponds to movement tracking information of a mouse pointer, and the user's tapping gesture information corresponds to trigger information of a left or right mouse button.
[101] Specifically, to facilitate use for the user, in this embodiment, in the mouse input mode, tapping gesture information of any fingertip of the index finger, middle finger, ring finger or little finger by the user can be defined as a left mouse button trigger operation; tapping gesture information of any finger root of the index finger, middle finger, ring finger or little finger by the user can be defined as a right mouse button trigger operation.
[102] In this embodiment, several pieces of fixed gesture information of a user correspond to basic mouse operations, so that the user directly implements an input operation by performing, on a palm, a gesture movement in a familiar mouse-operation or touch-screen-operation manner, which is convenient and efficient; the user does not need to memorize multiple correspondences between gesture movements and operations.
[103] It can be understood that, in the preceding embodiment, the mode enable gestures corresponding to different input modes are not fixedly limited. To facilitate use for a user, in this embodiment, the mode enable module can include a first mode enable module and a second mode enable module, where the first mode enable module can be configured to enable the keyboard input mode when the mode enable gesture information corresponds to a gesture of extending the index finger, middle finger, ring finger and little finger of the user; and the second mode enable module can be configured to enable the mouse input mode when the mode enable gesture information corresponds to a gesture of curling the index finger, middle finger, ring finger and little finger toward the user's palm.

[104] Because the user may unconsciously spread the fingers when tapping the finger joints or finger phalanges of the index finger, middle finger, ring finger and little finger, the gesture of extending the index finger, middle finger, ring finger and little finger of the user corresponds to the keyboard input mode; the gesture of curling the index finger, middle finger, ring finger and little finger toward the user's palm corresponds to the mouse input mode, and the panel formed by the four curled fingers can be used to simulate a display screen, which is relatively close to a user's operating habit. Certainly, the gestures enabling the keyboard input mode and the mouse input mode are not fixedly limited.
[105] In the preceding embodiment, the first acquisition module 801 can be specifically configured to acquire mode enable gesture information that is of the user and is detected by a sensor placed on the user's wrist or palm. Correspondingly, the second acquisition module can be specifically configured to acquire gesture information that is of the user and is detected by the sensor placed on the user's wrist or palm. The sensor placed on the user's wrist or palm can detect movement information of the hand wearing the sensor.
[106] Figure 9 is a schematic structural diagram of a first acquisition module according to an embodiment of the present invention. As shown in Figure 9, the first acquisition module 801 can specifically include an acquisition sub-module 901, a calculation module 902 and a gesture determination module 903.
[107] The acquisition sub-module 901 is configured to acquire a pressure value of each part of the wrist or palm when the sensor disposed on the user's wrist or palm detects an input mode enable gesture movement or a gesture movement.
[108] The sensor placed on the user's wrist can be disposed on a bracelet, and the sensor placed on the user's palm can be disposed on a glove or half glove. If the sensor is placed on the user's wrist, the user's gesture movements can be determined according to the different pressures caused by muscle changes and movement changes of each part of the wrist when the user makes different gestures. If the sensor is placed on the user's palm, the user's gesture movements can likewise be determined according to the different pressures caused by muscle changes and movement changes of each part of the palm. If the sensor is placed on a user's finger, the sensor on the finger can directly detect touch information of the user's thumb, to further determine the user's gesture information.
[109] The calculation module 902 is configured to determine a displacement value of the sensor on the user's wrist or palm according to the pressure value.
[110] Generally, a larger pressure change indicates a larger displacement value. There are also cases in which the pressure becomes higher or lower, and the displacement accordingly has different directions. A higher pressure indicates a larger displacement of the sensor from its original location. In this embodiment, multiple sensors can be disposed on the user's wrist or palm, and the sensors can be disposed at different parts as required, in order to improve the accuracy of a detection result.
[111] The gesture determination module 903 is configured to determine, according to the displacement value, the mode enable gesture information entered by the user.
[112] When the sensor is closer to a finger, the muscle group changes more obviously. Therefore, the sensors on the bracelet can be disposed at locations close to the fingers.
[113] This embodiment provides a specific method for acquiring mode enable gesture information, but it is not the only method for acquiring a user's gesture information. For example, a user's gesture information can also be acquired by using a method that identifies the gesture information by means of bioelectricity, or by a method in which a contact sensor is installed at a finger location of a glove.
[114] Figure 10 is a schematic structural diagram of another information processing apparatus according to an embodiment of the present invention. As shown in figure 10, an information processing apparatus 100 can include:
a first acquisition module 801, configured to acquire mode enable gesture information of a user;

a mode enable module 802, configured to enable an input mode corresponding to the mode enable gesture information, where the input mode includes a keyboard input mode and a mouse input mode;

a second acquisition module 803, configured to acquire gesture information of the user in the input mode, where the gesture information includes tapping gesture information and/or sliding gesture information;

an instruction generation module 804, configured to generate, according to a pre-regulated correspondence between gesture information and an operating instruction, an operating instruction corresponding to the gesture information; and

an instruction transfer module 1001, configured to send the operating instruction to a terminal, so that the terminal responds to the operating instruction.
[115] The instruction transfer module 1001 sends the operating instruction to the terminal, so that the terminal responds to the operating instruction, thereby implementing a human-computer interaction.
[116] In this embodiment, a user can conveniently control a terminal by using a customary input operation method, for example, a tapping operation or a tapping-and-sliding operation. In this method, the user does not need to memorize multiple correspondences between specific gesture movements and specific operations. Simply by deploying in advance, in an execution body of the information processing method, a correspondence between the user's customary basic input operations and standard keyboard and mouse operation events, a human-computer interaction process can be conveniently implemented.
[117] Figure 11 is a schematic structural diagram of a third information processing apparatus according to an embodiment of the present invention. As shown in figure 11, an information processing apparatus 110 may include:
a first input enable module 1101, configured to confirm that a keyboard input mode is entered, where, in the keyboard input mode, there is a one-to-one correspondence between the 12 finger joints or finger phalanges of the index finger, middle finger, ring finger and little finger of a user and the 12 keys of a 12-key mini keyboard;
a first gesture acquisition module 1102, configured to acquire tapping gesture information of any finger joint or finger phalanx of the index finger, middle finger, ring finger or little finger by the user; and a first instruction generation module 1103, configured to generate, according to a pre-regulated correspondence between tapping gesture information and an operating instruction, an operating instruction corresponding to the tapping gesture information.
[118] In this embodiment, the keys of a 12-key mini keyboard respectively correspond one-to-one to the finger joints or finger phalanges, and such a correspondence is pre-configured in a system. Because basically all users are extremely familiar with the method of using a 12-key mini keyboard, and can also accurately locate the various finger joints or finger phalanges, users can conveniently and quickly enter information, without needing to memorize a correspondence between each key of the 12-key mini keyboard and a fixed gesture movement.
[119] Figure 12 is a schematic structural diagram of a fourth information processing apparatus according to an embodiment of the present invention. As shown in figure 12, an information processing apparatus 120 can include:
a second input enable module 1201, configured to confirm that a system enters a mouse input mode;
a second gesture acquisition module 1202, configured to acquire sliding gesture information of the user on the index finger, middle finger, ring finger and little finger and/or tapping gesture information of a finger joint or finger phalanx of the index finger, middle finger, ring finger or little finger by the user; and a second instruction generation module 1203, configured to generate, according to a pre-regulated correspondence between sliding gesture information and/or tapping gesture information and an operating instruction, an operating instruction corresponding to the sliding gesture information and/or the tapping gesture information.
[120] The sliding gesture information can correspond to movement tracking information of a mouse pointer, and the tapping gesture information corresponds to trigger information of a left or right mouse button.
[121] In this embodiment, several pieces of fixed gesture information of a user correspond to basic mouse operations, so that the user directly implements an input operation by performing, on a palm, a gesture movement in a familiar mouse-operation or touch-screen-operation manner, which is convenient and efficient; the user does not need to memorize multiple correspondences between gesture movements and operations.
[122] In addition, an embodiment of the present invention further discloses an information processing device, where the information processing device includes any of the information processing apparatuses disclosed in the foregoing embodiments. Because the information processing device includes any of the information processing apparatuses disclosed in the preceding embodiments, the information processing device can likewise first acquire mode enable gesture information entered by a user; then enter a corresponding input mode according to the mode enable gesture information, identify the user's gesture information in the determined input mode, and further identify the user's intention according to a pre-regulated correspondence between gesture information and an operating instruction. The preceding input mode includes a keyboard input mode and a mouse input mode. In this way, the user can conveniently control a terminal in a familiar input mode environment by using only a customary input operation method, for example, a tapping operation or a tapping-and-sliding operation. The device does not require the user to memorize multiple correspondences between specific gesture movements and specific operations. Simply by deploying in advance, in an execution body of the information processing method, a correspondence between the user's customary basic input operations and standard keyboard and mouse operation events, the purpose of the user conveniently executing input operations and controlling the terminal in a customary operation manner can be achieved.
[123] Preferably, the information processing device may be any device that has a processing function and can be worn on one of the user's hands. The information processing device may exchange information with a terminal in a wireless manner or a Bluetooth manner. Specifically, the information processing device may transfer an operation instruction generated according to the gesture information of the user to the terminal, so that the terminal performs a response operation according to the operation instruction. Figure 13 is a schematic diagram of communication between an information processing device and a terminal according to an embodiment of the present invention.
[124] Preferably, the information processing device may be an intelligent terminal integrating the foregoing information processing apparatus and a terminal function, for example, a portable intelligent terminal. A smart watch is used as an example of the portable intelligent terminal. Sensors may be arranged on a watch band, and the sensors on the watch band can determine the user's gesture movements according to the different pressures caused by muscle changes and movement changes of each part of a wrist when the user makes different gestures. In this way, as long as a muscle group of the user's wrist changes because of a finger movement, the user's gesture movement can finally be determined according to the data detected by the sensors, and the sensors can then transfer the detected gesture information to the information processing apparatus. It should be noted that, after acquiring the gesture information of the user and generating a corresponding operation instruction, the information processing apparatus does not need to transfer the operation instruction to the smart watch by using a wireless communications module or a Bluetooth module, but only needs to transfer the operation instruction by using a communications line inside the smart watch. In addition, after receiving the operation instruction, the smart watch can also respond normally to the operation instruction. Therefore, in addition to the foregoing advantage of easy memorization, particularly when the information processing device is an intelligent terminal integrating the foregoing information processing apparatus and the terminal function, for example, a portable intelligent terminal, input accuracy can be improved. A smart watch is used as an example of the portable intelligent terminal. An input interface of the smart watch is relatively small because of the portability of the smart watch. Entering a phone number is used as an example. When the digits 1, 2, ..., 9 and 0 are entered by using a finger, the digits cannot be entered accurately because the input interface is small. However, by using the information processing method in this embodiment of the present invention, each digit has a fixed piece of corresponding gesture information. Therefore, a digit that the user wants to enter can be accurately determined by identifying the gesture information of the user, thereby greatly reducing or even avoiding a case in which the user cannot enter information accurately.
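The fixed correspondence between the 12 finger joints or phalanxes and the 12 keys of a mini keypad, as used in the phone-number example above, can be sketched as follows; the finger and phalanx labels and the keypad layout are illustrative assumptions, not the patent's actual mapping:

```python
# Hypothetical one-to-one layout: 12 phalanxes of four fingers mapped to the
# 12 keys of a mini keypad (1-9, *, 0, #). Labels are assumptions.
FINGERS = ["index", "middle", "ring", "little"]  # one keypad row per finger
PHALANXES = ["top", "middle", "base"]            # one keypad column per phalanx
KEYS = ["1", "2", "3", "4", "5", "6", "7", "8", "9", "*", "0", "#"]

# Build the one-to-one correspondence: 4 fingers x 3 phalanxes = 12 keys.
JOINT_TO_KEY = {
    (finger, phalanx): KEYS[i * 3 + j]
    for i, finger in enumerate(FINGERS)
    for j, phalanx in enumerate(PHALANXES)
}

def key_for_tap(finger, phalanx):
    """Resolve tap gesture information into the key it triggers."""
    return JOINT_TO_KEY[(finger, phalanx)]

# Entering the digit 5: tap the middle phalanx of the middle finger.
print(key_for_tap("middle", "middle"))  # -> 5
```

Because each key has exactly one corresponding phalanx, the digit the user intends can be recovered unambiguously however small the terminal's screen is.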
[125] Figure 14 is a schematic structural diagram of an information processing device according to an embodiment of the present invention. As shown in figure 14, an information processing device 140 may include: a sensor 1401, a processor 1402, a communications device 1403, a memory 1404 and a bus 1405.
[126] The sensor 1401, the processor 1402, the communications device 1403 and the memory 1404 complete mutual communication by using the bus 1405.
[127] The sensor 1401 is configured to collect gesture information of a user. The sensor 1401 may be a contact sensor, a pressure sensor or a bioelectrostatic sensor. As long as a sensor can distinguishably detect mode enable gesture information and gesture information of the user, the sensor can be applied in this embodiment.
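A minimal sketch of how a pressure sensor's readings could be resolved into gesture information, following the pressure-value and displacement-value steps described in the claims of this document; the part names, baseline values, templates and tolerance are illustrative assumptions:

```python
# Sketch: pressure values of each wrist/palm part -> displacement value of
# the sensor -> matched gesture information. Part names, baseline, templates
# and tolerance are illustrative assumptions.
BASELINE = {"part_a": 1.0, "part_b": 1.0, "part_c": 1.0}  # resting pressures

def displacement(pressures):
    """Displacement value derived from per-part pressure deviations."""
    return sum(abs(pressures[p] - BASELINE[p]) for p in BASELINE)

def determine_gesture(pressures, templates, tolerance=0.2):
    """Match the displacement value against pre-stored gesture templates."""
    d = displacement(pressures)
    for gesture, expected in templates.items():
        if abs(d - expected) <= tolerance:
            return gesture
    return None  # no template matched: gesture not recognized

templates = {"extend_four_fingers": 1.5, "shrink_to_palm": 3.0}
print(determine_gesture({"part_a": 1.6, "part_b": 1.4, "part_c": 1.5}, templates))
# -> extend_four_fingers
```

A real device would use many more measurement points and a trained classifier; the point here is only the pressure-to-displacement-to-gesture chain.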
[128] Memory 1404 is configured to store a group of program instructions.
[129] The memory may be a high-speed RAM memory, or may be a non-volatile memory (non-volatile memory), for example, at least one magnetic disk memory.
[130] The processor 1402 is configured to invoke the program instructions stored in the memory 1404 to perform:
acquiring mode enable gesture information of a user, where the mode enable gesture information can be detected by the sensor and then transferred to the processor 1402;
enabling an input mode corresponding to the mode enable gesture information, where the input mode includes a keyboard input mode and a mouse input mode;
acquiring gesture information of the user in the input mode, where the gesture information includes tap gesture information and/or slide gesture information; and
generating, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information.
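The four steps the processor performs can be sketched as a small dispatcher, where a mode enable gesture selects the input mode and later gestures are resolved through that mode's preset correspondence table; all gesture and instruction names here are illustrative assumptions:

```python
# Sketch of the pipeline: acquire mode enable gesture -> enable input mode ->
# acquire gesture -> generate operation instruction. All names are
# illustrative assumptions, not the actual preset correspondences.
MODE_ENABLE = {
    "extend_four_fingers": "keyboard",  # keyboard input mode
    "shrink_to_palm": "mouse",          # mouse input mode
}

CORRESPONDENCE = {
    "keyboard": {"tap_index_top": "key_1"},
    "mouse": {"tap_index_top": "left_click", "slide_right": "move_pointer_right"},
}

class Processor:
    def __init__(self):
        self.mode = None  # no input mode enabled yet

    def on_gesture(self, gesture):
        if gesture in MODE_ENABLE:
            self.mode = MODE_ENABLE[gesture]  # enable the input mode
            return None
        if self.mode is None:
            return None  # gestures are ignored before a mode is enabled
        # Generate the operation instruction via this mode's correspondence.
        return CORRESPONDENCE[self.mode].get(gesture)

p = Processor()
p.on_gesture("extend_four_fingers")
print(p.on_gesture("tap_index_top"))  # keyboard mode -> key_1
p.on_gesture("shrink_to_palm")
print(p.on_gesture("tap_index_top"))  # mouse mode -> left_click
```

Note that the same tap gesture produces different instructions in different input modes, which is exactly the mode-dependent behavior this embodiment describes.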
[131] In this embodiment, there may be multiple input modes, and each input mode corresponds to a mode enable gesture. In this way, the processor 1402 can automatically identify, according to the mode enable gesture information entered by the user, the input mode currently desired by the user, which facilitates use for the user.
[132] The keyboard input mode indicates that, in this mode, the user can directly trigger a character on a known keyboard by tapping a part of a finger, where the character may be a digit, an English letter or a symbol with a fixed function, for example, a # key. Certainly, a correspondence between tap or trigger gesture information of a part of a finger of the user and each key of the known keyboard needs to be preconfigured.
[133] It should be noted that, for the same gesture information, different processing and response results may be obtained in different input modes, and a piece of gesture information may be processed and responded to in one input mode, but may not be processed and responded to in another input mode.
[134] The processor 1402 may be a central processing unit (CPU) or an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement this embodiment of the present invention.
[135] The communications device 1403 is configured to receive data in a service operation task. The communications device 1403 may specifically be a wireless communications device or a Bluetooth device, so that the operation instruction generated by the processor 1402 is transferred to a terminal by using the wireless communications device or the Bluetooth device, and the terminal can respond to the operation instruction.
[136] Figure 15 is a schematic structural diagram of another information processing device according to an embodiment of the present invention. As shown in figure 15, a smart terminal 150 may include: a sensor 1501, a processor 1502, a memory 1503 and a bus 1504.
[137] The sensor 1501, the processor 1502 and the memory 1503 complete mutual communication by using the bus 1504.
[138] The sensor 1501 is configured to collect gesture information of a user. The sensor 1501 may be a contact sensor, a pressure sensor or a bioelectrostatic sensor. As long as a sensor can distinguishably detect mode enable gesture information and gesture information of the user, the sensor can be applied in this embodiment.
[139] Memory 1503 is configured to store a group of program instructions.
[140] The memory may be a high-speed RAM memory, or may be a non-volatile memory (non-volatile memory), for example, at least one magnetic disk memory.
[141] The processor 1502 is configured to invoke the program instructions stored in the memory 1503 to perform:
acquiring mode enable gesture information of a user, where the mode enable gesture information can be detected by the sensor;
enabling an input mode corresponding to the mode enable gesture information, where the input mode includes a keyboard input mode and a mouse input mode;
acquiring gesture information of the user in the input mode, where the gesture information includes tap gesture information and/or slide gesture information;
generating, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information; and
responding to the operation instruction.
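For the integrated smart terminal, the last step is responding to the operation instruction locally rather than transferring it over a wireless or Bluetooth link. A minimal sketch of such a local response dispatch, with hypothetical instruction and handler names:

```python
# Sketch of the integrated terminal responding to an operation instruction
# through an internal dispatch rather than a wireless/Bluetooth transfer.
# Instruction and handler names are illustrative assumptions.
responses = []  # records what the terminal did, for illustration

HANDLERS = {
    "key_5": lambda: responses.append("typed 5"),
    "left_click": lambda: responses.append("clicked left button"),
}

def respond(instruction):
    """The terminal-side response step: act on a generated instruction."""
    handler = HANDLERS.get(instruction)
    if handler is None:
        return False  # unknown instruction: no response in this mode
    handler()
    return True

respond("key_5")
respond("left_click")
print(responses)  # -> ['typed 5', 'clicked left button']
```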
[142] In this embodiment, there may be multiple input modes for the user, and each input mode corresponds to a mode enable gesture. In this way, the processor 1502 can automatically identify, according to the mode enable gesture information entered by the user, the input mode currently desired by the user, which facilitates use for the user.
[143] The keyboard input mode indicates that, in this mode, the user can directly trigger a character on a known keyboard by tapping a part of a finger, where the character may be a digit, an English letter or a symbol with a fixed function, for example, a # key. Certainly, a correspondence between tap or trigger gesture information of a part of a finger of the user and each key of the known keyboard needs to be preconfigured.
[144] It should be noted that, for the same gesture information, different processing and response results may be obtained in different input modes, and a piece of gesture information may be processed and responded to in one input mode, but may not be processed and responded to in another input mode.
[145] The processor 1502 may be a central processing unit (CPU) or an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement this embodiment of the present invention.
[146] The information processing apparatus is integrated into the terminal. Therefore, after acquiring the gesture information of the user and generating a corresponding operation instruction, the processor does not need to transfer the operation instruction to the portable intelligent terminal by using a wireless communications module or a Bluetooth module, but only needs to transfer the operation instruction by using a communications line inside the portable intelligent terminal. In addition, after receiving the operation instruction, the processor of the portable intelligent terminal can also respond normally to the operation instruction. Moreover, when the information processing device is a device integrating the foregoing information processing apparatus and a terminal function, for example, a portable intelligent terminal, input accuracy can also be improved.
[147] The embodiments in this specification are all described progressively; for same or similar parts in the embodiments, reference may be made to these embodiments, and each embodiment focuses on a difference from other embodiments. The device disclosed in an embodiment is basically similar to the method disclosed in an embodiment, and is therefore described briefly; for related parts, reference may be made to the partial descriptions of the method.
[148] It should further be noted that, in this specification, relational terms such as first and second are used only to distinguish one entity or operation from another, and do not necessarily require or imply that any actual relationship or sequence exists between these entities or operations. Moreover, the terms include, comprise or any other variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article or device that includes a list of elements not only includes those elements but also includes other elements that are not expressly listed, or includes elements inherent to such a process, method, article or device. An element preceded by includes a ... does not, without further restrictions, preclude the presence of additional identical elements in the process, method, article or device that includes the element.
[149] In combination with the embodiments disclosed in this specification, method or algorithm steps may be implemented by hardware, a software module executed by a processor, or a combination thereof. The software module may reside in a random access memory (RAM), a memory, a read-only memory (ROM), a register, a hard disk, a removable disk, a CD-ROM, or a storage medium in any other form well known in the art.
[150] The disclosed embodiments are described in the foregoing to enable a person skilled in the art to implement or use the present invention. Various modifications to the embodiments are obvious to the person skilled in the art, and the general principles defined in this specification may be implemented in other embodiments without departing from the scope of the present invention. Therefore, the present invention is not limited to the embodiments described in this specification, but extends to the widest scope that complies with the principles and novelty disclosed in this specification.
Claims (21)
[1]
1. Information processing method, CHARACTERIZED by comprising:
acquiring (101, 501) mode enable gesture information of a user;
enabling (102, 502) an input mode corresponding to the mode enabling gesture information, wherein the input mode comprises a keyboard input mode and a mouse input mode;
acquiring (103, 503) gesture information of the user in the input mode, wherein the gesture information comprises at least one of tap gesture information or slide gesture information; and generating (104, 504), according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information;
wherein the enabling (102, 502) of an input mode corresponding to the mode enable gesture information comprises:
enabling the keyboard input mode when the mode enable gesture information corresponds to a gesture of extending an index finger, a middle finger, a ring finger and a little finger of the user; and enabling the mouse input mode when the mode enable gesture information corresponds to a gesture of shrinking the index finger, the middle finger, the ring finger and the little finger towards the user's palm.
[2]
2. Information processing method, according to claim 1, CHARACTERIZED by the fact that, when the input mode is a keyboard input mode, there is a
Petition 870180011120, of 02/09/2018, p. 94/172
one-to-one correspondence between 12 finger joints or finger phalanxes of an index finger, a middle finger, a ring finger and a little finger of the user and 12 keys of a 12-key mini keyboard;
the acquiring (103, 503) of gesture information of the user in the input mode comprises: acquiring (602), in the keyboard input mode, tap gesture information of any finger joint or finger phalanx of the index finger, middle finger, ring finger and little finger by the user; and correspondingly, the generating (104, 504), according to a preset correspondence between gesture information and an operation instruction, of an operation instruction corresponding to the gesture information comprises:
generating (603), according to a preset correspondence between tap gesture information and an operation instruction, an operation instruction corresponding to the tap gesture information.
[3]
3. Information processing method according to claim 1, CHARACTERIZED by the fact that, when the input mode is the mouse input mode, the acquiring (103, 503) of gesture information of the user in the input mode comprises:
acquiring (702), in the mouse input mode, at least one of slide gesture information of an index finger, a middle finger, a ring finger and a little finger of the user or tap gesture information of a finger joint or a finger phalanx; and
correspondingly, the generating (104, 504), according to a preset correspondence between gesture information and an operation instruction, of an operation instruction corresponding to the gesture information comprises:
generating (703), according to a preset correspondence between at least one of slide gesture information or tap gesture information and an operation instruction, an operation instruction corresponding to the at least one of slide gesture information or tap gesture information.
[4]
4. Information processing method according to claim 3, CHARACTERIZED by the fact that the slide gesture information corresponds to movement track information of a mouse pointer, and the tap gesture information corresponds to trigger information of a left or right mouse button.
[5]
5. Information processing method, according to claim 1, CHARACTERIZED by the fact that the mode enable gesture information or the gesture information is detected by a sensor placed on a wrist or palm of the user.
[6]
6. Information processing method according to claim 5, CHARACTERIZED by the fact that the acquiring of the mode enable gesture information of the user or the acquiring of the gesture information of the user in the input mode comprises:
acquiring (301) a pressure value of each part of the wrist or palm when the sensor disposed on the wrist or palm of the user detects an input mode enabling gesture movement or a gesture movement of the user;
determining (302) a displacement value of the sensor on the wrist or palm of the user according to the pressure value; and
determining (303), according to the displacement value, the mode enable gesture information or the gesture information entered by the user.
[7]
7. Information processing method according to any one of claims 1 to 6, CHARACTERIZED by the fact that, after the generating, according to a preset correspondence between gesture information and an operation instruction, of an operation instruction corresponding to the gesture information, the method further comprises:
sending (505) the operation instruction to a terminal, so that the terminal responds to the operation instruction.
[8]
8. Information processing method, CHARACTERIZED by comprising:
confirming (601) that a keyboard input mode is entered when mode enable gesture information corresponds to a gesture of extending an index finger, a middle finger, a ring finger and a little finger of a user, wherein, in the keyboard input mode, there is a one-to-one correspondence between 12 finger joints or finger phalanxes of the index finger, the middle finger, the ring finger and the little finger of the user and 12 keys of a 12-key mini keyboard;
acquiring (602), in the keyboard input mode, tap gesture information of any finger joint or finger phalanx of the index finger, middle finger, ring finger and little finger by the user; and
generating (603), according to a preset correspondence between tap gesture information and an operation instruction, an operation instruction corresponding to the tap gesture information.
[9]
9. Information processing method, CHARACTERIZED by comprising:
confirming (701) that a mouse input mode is entered when mode enable gesture information corresponds to a gesture of shrinking an index finger, a middle finger, a ring finger and a little finger towards a palm of a user;
acquiring (702) at least one of slide gesture information of the user on the index finger, middle finger, ring finger and little finger or tap gesture information of a finger joint or a phalanx of the index finger, middle finger, ring finger and little finger by the user; and
generating (703), according to a preset correspondence between at least one of slide gesture information or tap gesture information and an operation instruction, an operation instruction corresponding to the at least one of slide gesture information or tap gesture information.
[10]
10. Information processing method according to claim 9, CHARACTERIZED by the fact that the slide gesture information corresponds to movement track information of a mouse pointer, and the tap gesture information corresponds to trigger information of a left or right mouse button.
[11]
11. Information processing apparatus, CHARACTERIZED by comprising:
a first acquisition module (801), configured to acquire mode enable gesture information of a user;
a mode enable module (802), configured to enable an input mode corresponding to the mode enable gesture information, wherein the input mode comprises a keyboard input mode and a mouse input mode;
a second acquisition module (803), configured to acquire gesture information of the user in the input mode, wherein the gesture information comprises at least one of tap gesture information or slide gesture information; and
an instruction generation module (804), configured to generate, according to a preset correspondence between gesture information and an operation instruction, an operation instruction corresponding to the gesture information;
wherein the mode enable module comprises: a first mode enable module, configured to enable the keyboard input mode when the mode enable gesture information corresponds to a gesture of extending an index finger, a middle finger, a ring finger and a little finger of the user; and a second mode enable module, configured to enable the mouse input mode when the mode enable gesture information corresponds to a gesture of shrinking the index finger, the middle finger, the ring finger and the little finger towards the user's palm.
[12]
12. Information processing apparatus according to claim 11, CHARACTERIZED by the fact that, when the input mode is the keyboard input mode, there is a one-to-one correspondence between 12 finger joints or finger phalanxes of an index finger, a middle finger, a ring finger and a little finger of the user and 12 keys of a 12-key mini keyboard; and the second acquisition module (803) is specifically configured to:
acquire, in the keyboard input mode, tap gesture information of any finger joint or finger phalanx of the index finger, middle finger, ring finger and little finger by the user.
[13]
13. Information processing apparatus according to claim 11, CHARACTERIZED by the fact that, when the input mode is the mouse input mode, the second acquisition module (803) is specifically configured to:
acquire, in the mouse input mode, at least one of slide gesture information of an index finger, a middle finger, a ring finger and a little finger of the user or tap gesture information of a finger joint or a finger phalanx.
[14]
14. Information processing apparatus according to claim 13, CHARACTERIZED by the fact that the slide gesture information corresponds to movement track information of a mouse pointer, and the tap gesture information corresponds to trigger information of a left or right mouse button.
[15]
15. Information processing apparatus according to claim 11, CHARACTERIZED by the fact that the mode enable gesture information or the gesture information is detected by a sensor arranged on a wrist or palm of the user.
[16]
16. Information processing apparatus, according to claim 15, CHARACTERIZED by the fact that the first acquisition module or the second acquisition module comprises:
an acquisition sub-module, configured to acquire a pressure value of each part of the wrist or palm when the sensor arranged on the wrist or palm of the user detects an input mode enabling gesture movement or a gesture movement of the user;
a calculation module, configured to determine a displacement value of the sensor on the wrist or palm of the user according to the pressure value; and a gesture determination module, configured to determine, according to the displacement value, the mode enable gesture information or the gesture information entered by the user.
[17]
17. Information processing apparatus according to any one of claims 11 to 16, CHARACTERIZED by the fact that it further comprises:
an instruction transfer module (1001), configured to send the operation instruction to a terminal, so that the terminal responds to the operation instruction.
[18]
18. Information processing apparatus, CHARACTERIZED by comprising:
a first input enable module (1101), configured to confirm that a keyboard input mode is entered when mode enable gesture information corresponds to a gesture of extending an index finger, a middle finger, a ring finger and a little finger of a user, wherein, in the keyboard input mode, there is a one-to-one correspondence between 12 finger joints or finger phalanxes of the index finger, the middle finger, the ring finger and the little finger of the user and 12 keys of a 12-key mini keyboard;
a first gesture acquisition module (1102), configured to acquire tap gesture information of any finger joint or finger phalanx of the index finger, middle finger, ring finger and little finger by the user; and
a first instruction generation module (1103), configured to generate, according to a preset correspondence between tap gesture information and an operation instruction, an operation instruction corresponding to the tap gesture information.
[19]
19. Information processing apparatus,
CHARACTERIZED by comprising:
a second input enable module (1201), configured to confirm that a system enters a mouse input mode when mode enable gesture information corresponds to a gesture of shrinking an index finger, a middle finger, a ring finger and a little finger towards a palm of a user;
a second gesture acquisition module (1202), configured to acquire at least one of slide gesture information of the user on the index finger, middle finger, ring finger and little finger or tap gesture information of a finger joint or a phalanx of the index finger, middle finger, ring finger and little finger by the user; and
a second instruction generation module (1203), configured to generate, according to a preset correspondence between at least one of slide gesture information or tap gesture information and an operation instruction, an operation instruction corresponding to the at least one of slide gesture information or tap gesture information.
[20]
20. Intelligent terminal, CHARACTERIZED by the fact that the intelligent terminal comprises the information processing apparatus defined in any one of claims 11 to 19.
[21]
21. Information processing device, CHARACTERIZED by the fact that the information processing device comprises the information processing apparatus defined in any one of claims 11 to 17; comprises the information processing apparatus defined in claim 18; or comprises the smart terminal defined in claim 19.
Family patents:
Publication number | Publication date
RU2016134720A3|2018-02-28|
JP2017509957A|2017-04-06|
WO2015110063A1|2015-07-30|
US9965044B2|2018-05-08|
KR20160110992A|2016-09-23|
CN103793057A|2014-05-14|
EP3089018A1|2016-11-02|
EP3089018B1|2019-01-09|
CN103793057B|2017-02-01|
RU2016134720A|2018-02-28|
EP3089018A4|2017-01-18|
KR101877823B1|2018-07-12|
RU2662408C2|2018-07-25|
US20160342217A1|2016-11-24|
JP6249316B2|2017-12-20|
引用文献:
公开号 | 申请日 | 公开日 | 申请人 | 专利标题

US7109970B1|2000-07-01|2006-09-19|Miller Stephen S|Apparatus for remotely controlling computers and other electronic appliances/devices using a combination of voice commands and finger movements|
US7362305B2|2004-02-10|2008-04-22|Senseboard Technologies Ab|Data input device|
US20030056278A1|2001-09-26|2003-03-27|Lung Kuo|Structure of finger keyboard|
CN1482527A|2002-09-15|2004-03-17|廖华勇|Finger movement input technology and equipment|
JP4379214B2|2004-06-10|2009-12-09|日本電気株式会社|Mobile terminal device|
KR20060022984A|2004-09-08|2006-03-13|홍광석|Keypad glove apparatus|
RU2457532C2|2006-03-10|2012-07-27|Кенджи Йошида|Input processing system for information processing apparatus|
CN101401059B|2006-03-10|2012-08-15|吉田健治|System for input to information processing device|
CN200944218Y|2006-06-30|2007-09-05|山东大学|Fingerstall type digital keyboard device|
US20080129694A1|2006-11-30|2008-06-05|Liberty Reach Inc.|Keyless user interface device|
JP2008135033A|2007-11-26|2008-06-12|Olympus Corp|Hand posture operation detector|
CN101226438A|2008-01-18|2008-07-23|于非|Glove-shaped computer mouse|
CN201638148U|2009-09-10|2010-11-17|深圳市亿思达显示科技有限公司|Glove-type virtual input device|
US8436821B1|2009-11-20|2013-05-07|Adobe Systems Incorporated|System and method for developing and classifying touch gestures|
US8421634B2|2009-12-04|2013-04-16|Microsoft Corporation|Sensing mechanical energy to appropriate the body for data input|
CN201780561U|2010-07-29|2011-03-30|陈誉航|Finger sleeve type mouse|
US20120242584A1|2011-03-22|2012-09-27|Nokia Corporation|Method and apparatus for providing sight independent activity reports responsive to a touch gesture|
US9218058B2|2011-06-16|2015-12-22|Daniel Bress|Wearable digital input device for multipoint free space data collection and analysis|
US10795448B2|2011-09-29|2020-10-06|Magic Leap, Inc.|Tactile glove for human-computer interaction|
US9880629B2|2012-02-24|2018-01-30|Thomas J. Moscarillo|Gesture recognition devices and methods with user authentication|
US8743052B1|2012-11-24|2014-06-03|Eric Jeffrey Keller|Computing interface system|
JP2014115688A|2012-12-06|2014-06-26|Sharp Corp|Character input system and character input method|
CN103793057B|2014-01-26|2017-02-01|Huawei Device Co., Ltd.|Information processing method, device and equipment|
CN105373321A|2014-08-13|2016-03-02|中兴通讯股份有限公司|Control method and device for functional objects of mobile terminal and mobile terminal|
CN105138136A|2014-09-15|2015-12-09|北京至感传感器技术研究院有限公司|Hand gesture recognition device, hand gesture recognition method and hand gesture recognition system|
CN105278699A|2014-09-29|2016-01-27|北京至感传感器技术研究院有限公司|Easy-wearable gesture identification device|
CN105204645A|2014-10-02|2015-12-30|北京至感传感器技术研究院有限公司|Easy-wearing gesture identification device|
CN104461365A|2014-12-11|2015-03-25|三星电子(中国)研发中心|Touch method and device of terminal|
CN104750253B|2015-03-11|2018-10-12|Electronic device for motion-sensing user input|
CN106073126B|2015-04-27|2018-05-18|陈明洙|LED safety vests|
US10638316B2|2016-05-25|2020-04-28|Intel Corporation|Wearable computer apparatus with same hand user authentication|
CN107490370B|2016-06-13|2020-01-10|原相科技股份有限公司|Measuring device and operation method thereof, track sensing system and track sensing method thereof|
CN107329574A|2017-06-30|2017-11-07|联想有限公司|Input method and system for electronic equipment|
WO2019028650A1|2017-08-08|2019-02-14|方超|Gesture acquisition system|
CN208752575U|2018-07-27|2019-04-16|Keyboard|
US10902250B2|2018-12-21|2021-01-26|Microsoft Technology Licensing, Llc|Mode-changeable augmented reality interface|
CN109947245A|2019-03-12|2019-06-28|Finger-movement instruction set for operating smartwatch devices and generation method thereof|
CN110209265A|2019-04-04|2019-09-06|Input system based on finger touch detection|
CN110162183A|2019-05-30|2019-08-23|Mid-air gesture operation method, wearable device and computer-readable storage medium|
CN110780732A|2019-09-06|2020-02-11|Input system based on spatial positioning and finger clicking|
CN111897477A|2020-08-04|2020-11-06|上海传英信息技术有限公司|Mobile terminal control method, mobile terminal and storage medium|
Legal status:
2018-07-31| B09A| Decision: intention to grant [chapter 9.1 patent gazette]|
2018-08-28| B16A| Patent or certificate of addition of invention granted [chapter 16.1 patent gazette]|Free format text: TERM OF VALIDITY: 20 (TWENTY) YEARS COUNTED FROM 23/01/2015, SUBJECT TO THE LEGAL CONDITIONS. |
2019-06-04| B25D| Requested change of name of applicant approved|Owner name: HUAWEI DEVICE (SHENZHEN) CO., LTD. (CN) |
2021-07-20| B25A| Requested transfer of rights approved|Owner name: HONOR DEVICE CO., LTD. (CN) |
2022-01-11| B25L| Entry of change of name and/or headquarter and transfer of application, patent and certificate of addition of invention: publication cancelled|Owner name: HUAWEI DEVICE (SHENZHEN) CO., LTD. (CN) Free format text: PUBLICATION CODE 25.1 IN RPI NO. 2637 OF 20/07/2021 ANNULLED AS HAVING BEEN IMPROPER. |
2022-01-25| B25A| Requested transfer of rights approved|Owner name: HONOR DEVICE CO., LTD. (CN) |
Priority:
Application number | Filing date | Patent title
CN201410038752.7A|CN103793057B|2014-01-26|2014-01-26|Information processing method, device and equipment|
PCT/CN2015/071424|WO2015110063A1|2014-01-26|2015-01-23|Method, apparatus, and device for information processing|