Patent abstract:
Electronic device and method for controlling screen display using temperature and humidity. A background image is displayed on an electronic device's touchscreen. Superimposed on the background image, a semitransparent layer is displayed. When a touch and drag action is detected on the semitransparent layer, the transparency of the touch and drag region is changed. The transparency of the semitransparent layer can also be changed according to temperature or humidity.
Publication number: BR102014006171A2
Application number: R102014006171-1
Filing date: 2014-03-14
Publication date: 2020-03-31
Inventors: Boyoung Lee; Hyebin PARK; Jiyoung KANG; Minwook NA; Jongwoo SHIN; Yumi AHN; Hyunwoo Yoo; Jungwoo CHOI
Applicant: Samsung Electronics Co Ltd
IPC main class:
Patent description:

ELECTRONIC DEVICE AND METHOD FOR CONTROLLING SCREEN DISPLAY USING TEMPERATURE AND HUMIDITY
TECHNICAL FIELD
[001] The present disclosure relates to display control for an electronic device having a touchscreen. More particularly, the present disclosure relates to an electronic device and method for controlling the display of images and objects on a touchscreen based on at least one of temperature and humidity.
BACKGROUND
[002] Nowadays a touchscreen is used in various electronic devices in order to display graphics and text and to offer a user interface that allows interaction between a user and an electronic device. A touchscreen detects contact on a surface and reacts to that contact. Typically, a touchscreen can also display one or more programmable keys, one or more menu items, and any other object for a user interface. User contact with any object on a touchscreen can cause an interaction between a user and an electronic device.
[003] A touch screen provides an intuitive user interface. For this reason, a touchscreen has been widely used as a display and input device for many types of mobile devices such as mobile phones or smart phones.
[004] Currently a smart phone uses a touchscreen as a basic input / output device and has several sensors, such as a temperature sensor, a humidity sensor and an optical sensor, to detect external environments. By combining an intuitive touch screen interface with multiple sensors, a smart phone can offer natural and enhanced experiences for a user.
[005] The above information is presented as background information only to assist in an understanding of the present disclosure. No determination has been made, and no assertion is made, as to whether any of the above may be applicable as prior art with regard to the present disclosure.
SUMMARY
[006] Aspects of the present disclosure are to address at least the problems and / or disadvantages mentioned above and to provide at least the advantages described below. Accordingly, an aspect of the present disclosure is to provide an electronic device and method for a screen display control that allows a semitransparent layer to be displayed superimposed on a background image on a touchscreen.
[007] Another aspect of the present disclosure is to provide an electronic device and method for a screen display control that can change the transparency of a touch and drag region in response to a user's touch and drag input on a semitransparent layer.
[008] Another aspect of the present disclosure is to provide an electronic device and method for a screen display control that can vary the transparency of a semitransparent layer according to at least one of temperature and humidity.
[009] In accordance with a first aspect of the present invention, an electronic device is provided comprising: a touchscreen configured to display an image and to detect user input; and a control unit configured to control the touchscreen to display a first layer superimposed on the image; wherein, when a user input is detected in a partial region of the first layer, the control unit is further configured to change the transparency of the partial region.
[010] In accordance with a second aspect of the present invention, a display control method is provided for an electronic device having a touchscreen, the method comprising: displaying a first layer superimposed on an image displayed on the touchscreen; detecting user input on the touchscreen; and changing the transparency of a partial region of the first layer in which the user input is detected.
[011] In accordance with one aspect of the present disclosure, an electronic device is provided. The electronic device includes a display unit configured to display an image and having a touchscreen to detect user input, and a control unit configured to control the touchscreen to display a first layer overlaid on the image. In this electronic device, when user input is detected in a partial region of the first layer, the control unit is additionally configured to change transparency of the partial region.
[012] The electronic device can additionally comprise a temperature sensor, and the control unit can be further configured to determine temperature based on information received from the temperature sensor, to control the first layer to display an image of water vapor when the determined temperature is higher than a predefined temperature, and to control the first layer to display an image of frost when the determined temperature is lower than the predefined temperature.
[013] The control unit can be additionally configured to change the transparency of the entire region of the first layer according to variations in the determined temperature.
[014] The control unit can be additionally configured to control the first layer to display a water drop object after a given time has elapsed. The control unit can be further configured to control the water drop object to be displayed as if it were moving in the direction of gravity. The control unit can be further configured to control the water drop object to be displayed at a point where the user input is removed.
[015] The control unit can be additionally configured to change the transparency of the detected user input region as time passes.
[016] The electronic device can additionally comprise a humidity sensor, and the control unit can be additionally configured to change transparency of the first layer in response to a signal received from the humidity sensor.
[017] The control unit can be additionally configured to control the first layer to display an accumulated frost object at a point where the user input is removed when the frost image is displayed.
[018] The control unit can be additionally configured to control the touchscreen to display a second layer superimposed on the first layer, the second layer having an opaque region for displaying information.
[019] The control unit can be additionally configured to determine temperature based on data received from an external entity, to control the first layer to display an image of water vapor when the determined temperature is greater than a predefined temperature, and to control the first layer to display a frost image when the determined temperature is less than the preset temperature.
[020] The control unit can be additionally configured to change the transparency of the entire region of the first layer according to meteorological information. The meteorological information can include at least one of temperature and humidity.
[021] The first layer can have at least two attributes that are determined according to temperature.
[022] The control unit can be additionally configured to assign a first attribute to the first layer when the temperature is greater than a predefined temperature, and to assign a second attribute to the first layer when the temperature is less than the predefined temperature.
[023] According to another aspect of the present disclosure, an electronic device is provided. The electronic device includes a display unit configured to display an image and having a touchscreen to detect user input, and a control unit configured to control the touchscreen to display a semitransparent layer overlaid on the image. In this electronic device, when user input is detected in a partial region of the semitransparent layer, the control unit is additionally configured to change transparency of the partial region.
[024] According to another aspect of the present disclosure, an electronic device is provided. The electronic device includes a display unit configured to display a user interface and having a touchscreen to detect user input, and a control unit configured to maintain a lock state of the user interface while controlling the touchscreen to display a semitransparent layer superimposed on the user interface, to change the transparency of a partial region of the semitransparent layer when user input is detected in the partial region, and to release the lock state of the user interface when the partial region occupies a specific percentage or more of the total touchscreen region.
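For illustration only, the lock-release rule described in this aspect can be modeled as a small Kotlin sketch. The grid model, the 90% threshold (suggested by figures 18A to 18C described below) and all names are illustrative assumptions, not part of the disclosure:

```kotlin
// Illustrative model: the screen is reduced to a grid of cells, and the lock
// is released once the cleared (transparent) cells reach the threshold share.
class LockScreenModel(
    private val totalCells: Int,
    private val releaseThreshold: Double = 0.9  // assumed "specific percentage"
) {
    private val clearedCells = mutableSetOf<Int>()

    val isLocked: Boolean
        get() = clearedCells.size.toDouble() / totalCells < releaseThreshold

    // A touch and drag over a cell makes that part of the layer transparent.
    fun clearCell(index: Int) {
        clearedCells.add(index)
    }
}

fun main() {
    val lock = LockScreenModel(totalCells = 100)
    (0 until 89).forEach(lock::clearCell)
    println(lock.isLocked)  // true: 89% cleared, still below the threshold
    lock.clearCell(89)
    println(lock.isLocked)  // false: 90% reached, lock state released
}
```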
[025] According to another aspect of the present disclosure, an electronic device is provided. The electronic device includes a display unit configured to display a plurality of icons for running applications and having a touchscreen to detect user input, and a control unit configured to disable selection of the icons while controlling the touchscreen to display a semitransparent layer superimposed on the icons, and to place at least one of the icons in a selectable state by changing the transparency of a partial region of the semitransparent layer when user input is detected in the partial region.
[026] According to another aspect of the present disclosure, a display control method for an electronic device having a touchscreen is provided. The method includes displaying a first layer superimposed on an image displayed on the touchscreen, detecting a user input on the touchscreen, and changing the transparency of a partial region of the first layer in which the user input is detected.
[027] The screen display control method may further comprise determining temperature, and displaying an image of water vapor on the first layer when the determined temperature is greater than a predefined temperature, and displaying a frost image on the first layer when the determined temperature is less than the preset temperature.
[028] The screen display control method may further comprise changing the transparency of the entire region of the first layer according to variations in the determined temperature.
[029] The screen display control method may additionally comprise displaying a water drop object on the first layer after a given time has elapsed.
[030] The screen display control method may further comprise displaying the water drop object as if it were moving in the direction of gravity.
[031] The screen display control method may additionally comprise displaying the water drop object at a point where the user input is removed.
[032] The screen display control method can further comprise changing transparency of the detected user input region as time passes.
[033] The screen display control method can additionally comprise detecting humidity, and changing the transparency of the first layer according to the detected humidity.
[034] The screen display control method may further comprise displaying an accumulated frost object on the first layer at a point where the user input is removed when the frost image is displayed.
[035] The screen display control method may further comprise displaying a second layer superimposed on the first layer on the touchscreen, the second layer having an opaque region for displaying information.
[036] The screen display control method may additionally comprise receiving temperature, and displaying an image of water vapor on the first layer when the received temperature is greater than a predefined temperature, and displaying a frost image on the first layer when the received temperature is lower than the preset temperature.
[037] The screen display control method can further comprise changing transparency of the entire region of the first layer according to meteorological information. The meteorological information can include at least one of temperature and humidity.
[038] The first layer can have at least two attributes that are determined according to temperature.
[039] The screen display control method may further comprise assigning a first attribute to the first layer when the temperature is greater than a preset temperature, and assigning a second attribute to the first layer when the temperature is less than the preset temperature.
[040] According to another aspect of the present disclosure, a method for a display control for an electronic device screen having a touch screen is provided. The method includes displaying an image on the touchscreen, displaying a semitransparent layer superimposed on the image, detecting a user input on the touchscreen, and changing transparency of a partial region of the semitransparent layer on which the user input is detected.
[041] According to another aspect of the present disclosure, a method for display control for an electronic device screen having a touchscreen is provided. The method includes displaying a user interface on the touchscreen, maintaining a lock state of the user interface while controlling the touchscreen to display a semitransparent layer superimposed on the user interface, changing the transparency of a partial region of the semitransparent layer when a user input is detected in the partial region, and releasing the lock state of the user interface when the partial region occupies a specific percentage or more of the total region of the touchscreen.
[042] According to another aspect of the present disclosure, a method for display control for an electronic device screen having a touchscreen is provided. The method includes displaying a plurality of icons for running applications on the touchscreen, disabling selection of the icons while controlling the touchscreen to display a semitransparent layer superimposed on the icons, and controlling at least one of the icons to be in a selectable state by changing the transparency of a partial region of the semitransparent layer when a user input is detected in the partial region.
[043] Another aspect of the invention provides a computer program comprising instructions arranged, when executed, to implement a method according to any of the aspects described above. An additional aspect provides machine-readable storage storing such a program.
[044] Other aspects, advantages and salient features of the disclosure will become apparent to those skilled in the art from the detailed description below, which, considered in combination with the accompanying drawings, discloses various embodiments of the present disclosure.
BRIEF DESCRIPTION OF THE DRAWINGS
[045] The above and other aspects, features and advantages of certain embodiments of the present disclosure will be more apparent from the description below considered in combination with the accompanying drawings, in which:
[046] Figure 1 is a block diagram illustrating an electronic device according to one embodiment of the present disclosure;
[047] Figures 2A and 2B are exploded views illustrating screen layers displayed on an electronic device's touchscreen according to one embodiment of the present disclosure;
[048] Figure 3 is a flowchart illustrating a process associated with a touch and drag action detected on an electronic device's touchscreen according to one embodiment of the present disclosure;
[049] Figures 4A and 4B show screenshots illustrating graphic objects represented on a screen by a touch and drag action detected on an electronic device's touchscreen according to one embodiment of the present disclosure;
[050] Figure 5 is a flow chart illustrating a process of displaying a layer with a variable attribute according to temperature on an electronic device touch screen according to one embodiment of the present disclosure;
[051] Figures 6A and 6B show screenshots illustrating layers displayed with different attributes according to temperature on an electronic device touch screen according to one embodiment of the present disclosure;
[052] Figure 7 is a flow chart illustrating a process associated with changes in temperature in an electronic device according to one embodiment of the present disclosure;
[053] Figures 8A and 8B show screenshots illustrating variations in transparency according to changes in temperature on a touchscreen of an electronic device according to one embodiment of the present disclosure;
[054] Figure 9 is a flow chart illustrating a process associated with the removal of a touch and drag action from a touch screen on an electronic device according to one embodiment of the present disclosure;
[055] Figure 10 shows a screen capture illustrating a water drop object displayed at the removal point of a touch and drag action on the first layer of an electronic device's touchscreen according to one embodiment of the present disclosure;
[056] Figure 11 is a flowchart illustrating a process after a given time has elapsed from the display of a water drop object on a touchscreen of an electronic device according to one embodiment of the present disclosure;
[057] Figures 12A and 12B show screenshots illustrating the movement of a water drop object after a certain time has elapsed from the display of the water drop object on an electronic device's touchscreen according to one embodiment of the present disclosure;
[058] Figure 13 is a flowchart illustrating a process associated with a touch and drag action detected in a layer with a frost image on an electronic device's touchscreen according to one embodiment of the present disclosure;
[059] Figures 14A, 14B and 14C show screenshots illustrating an accumulated frost object displayed in response to the removal of a touch and drag action detected in a layer with a frost image on an electronic device's touchscreen according to one embodiment of the present disclosure;
[060] Figure 15 is a flow chart illustrating a process after a given time has elapsed from the display of a region whose transparency has been changed on a touchscreen of an electronic device according to one embodiment of the present disclosure;
[061] Figures 16A, 16B and 16C show screenshots illustrating the disappearance of a region whose transparency has been changed, after a given time has elapsed from the display of the region on a touchscreen of an electronic device according to one embodiment of the present disclosure;
[062] Figures 17A, 17B, 17C and 17D show screenshots illustrating the sequential disappearance of a region whose transparency has been changed, after a certain time has elapsed from the display of the region on a touchscreen of an electronic device according to one embodiment of the present disclosure;
[063] Figures 18A, 18B and 18C show screenshots illustrating a layer changed to an opaque state when a touch and drag action is detected over about 90% of the region of an electronic device's touchscreen according to one embodiment of the present disclosure;
[064] Figure 19 is a flow chart illustrating a process associated with changes in humidity in an electronic device according to one embodiment of the present disclosure;
[065] Figures 20A, 20B, 20C, 20D and 20E show screenshots illustrating various ways of displaying a layer according to changes in temperature and humidity on an electronic device's touchscreen according to one embodiment of the present disclosure;
[066] Figure 21 is a view illustrating a variation in the way of displaying a layer at a touch point according to the duration of a touch on an electronic device's touchscreen according to one embodiment of the present disclosure;
[067] Figures 22A, 22B, 22C and 22D show screenshots illustrating a process of releasing a lock state of a user interface in response to a user input on an electronic device's touchscreen according to one embodiment of the present disclosure; and
[068] Figures 23A, 23B and 23C show screenshots illustrating a process of releasing a lock state of a user interface in a partial region on an electronic device's touchscreen according to one embodiment of the present disclosure.
[069] Throughout the drawings, it should be noted that equal reference numbers are used to represent the same or similar elements, resources and structures.
DETAILED DESCRIPTION
[070] The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of the various embodiments of the present disclosure as defined by the claims and their equivalents. It includes several specific details to assist in this understanding, but these are to be considered as exemplary only. Accordingly, persons of ordinary skill in the art will recognize that various changes and modifications to the various embodiments described in this document can be made without departing from the scope and spirit of the present disclosure. Furthermore, descriptions of well-known functions and constructions may be omitted for clarity and conciseness.
[071] The terms and words used in the following description and claims are not limited to bibliographical meanings and are used by the inventor merely to enable a clear and consistent understanding of the present disclosure. Accordingly, it should be apparent to those skilled in the art that the following description of various embodiments of the present disclosure is provided for the purpose of illustration only and not for the purpose of limiting the present disclosure as defined by the appended claims and their equivalents.
[072] It is to be understood that the singular forms "a", "an" and "the" include plural referents unless the context clearly dictates otherwise. Thus, for example, reference to "a layer" includes reference to one or more of such layers.
[073] Figure 1 is a block diagram illustrating an electronic device according to one embodiment of the present disclosure.
[074] Referring to figure 1, the electronic device 100 can be connected to any external device (not shown), using a mobile communication unit 120, a subcommunication unit 130 and a connector 165. An external device can be a mobile phone, a smart phone, an input device, a tablet, a Personal Computer (PC), a server, or any other electronic device. The electronic device 100 is portable, is capable of transmitting / receiving data, and can have at least one touchscreen. The electronic device 100 can be a mobile phone, a smart phone, a tablet, a three-dimensional (3D) Television (TV), a smart TV, a Light Emitting Diode (LED) TV, a Liquid Crystal Display (LCD) TV, or any other device that allows data to be transmitted to or received from a peripheral device or other remote device.
[075] The electronic device 100 includes a touchscreen 190 and a touchscreen controller 195, which collectively can be a display unit (not shown).
[076] In addition, the electronic device 100 includes a control unit 110, the mobile communication unit 120, the subcommunication unit 130, a multimedia unit 140, a camera unit 150, a Global Positioning System (GPS) unit 155, an input / output unit 160, a sensor unit 170, a storage unit 175 and a power supply unit 180. The subcommunication unit 130 includes at least one of a Wireless Local Area Network (WLAN) unit 131 and a short-range communication unit 132. The multimedia unit 140 includes at least one of a broadcast communication unit 141, an audio reproduction unit 142 and a video reproduction unit 143. The camera unit 150 includes at least one of a first camera 151, a second camera 152 and a flash 153. The input / output unit 160 includes at least one of a key 161, a microphone 162, a speaker 163, a vibration motor 164, the connector 165, a keypad 166 and an input unit 167. The sensor unit 170 includes a temperature sensor 171 and a humidity sensor 172.
[077] The control unit 110 may include a Central Processing Unit (CPU) 111, a Read-Only Memory (ROM) 112 that stores a control program to control the electronic device 100, and a Random Access Memory (RAM) 113 that stores a signal or data received from any external entity or is used as a storage region for a task performed on the electronic device 100.
[078] The control unit 110 controls general operations of the electronic device 100 and signal flows between the internal components 120 to 195 mentioned above, and also performs a function to process data. The control unit 110 controls the supply of electrical energy from the power supply unit 180 to the internal components 120 to 195 mentioned above. In addition, control unit 110 runs an Operating System (OS) and an application stored in storage unit 175.
[079] The CPU 111 may include a Graphics Processing Unit (GPU) (not shown). The CPU 111 can be a System on Chip (SoC) that contains a core (not shown) and the GPU. The CPU 111 may include a single core, a dual core, a triple core, a quad core, or any other multiple of cores. The CPU 111, the ROM 112 and the RAM 113 can be connected to each other via an internal bus.
[080] The control unit 110 can control the mobile communication unit 120, the subcommunication unit 130, the multimedia unit 140, the camera unit 150, the GPS unit 155, the input / output unit 160, the sensor unit 170, storage unit 175, power supply unit 180, touchscreen 190 and touchscreen controller 195.
[081] Mobile communication unit 120 enables electronic device 100 to be connected to any external device via mobile communication, using one or more antennas (not shown) under the control of control unit 110. The communication unit mobile 120 transmits or receives a wireless signal for a voice call, a video call, a Short Message Service (SMS), a Multimedia Message Service (MMS), or a data communication to or from a mobile phone, a smart phone, a tablet or any other electronic device.
[082] The subcommunication unit 130 can include at least one of the WLAN unit 131 and the short range communication unit 132. For example, the subcommunication unit 130 can include only the WLAN unit 131, only the communication unit of small range 132, or both.
[083] The WLAN unit 131 can be connected to the Internet via an Access Point (AP) (not shown) under the control of the control unit 110. The WLAN unit 131 supports related standards, such as the Institute of Electrical and Electronics Engineers (IEEE) 802.11x standard. The short-range communication unit 132 can perform short-range communication between the electronic device 100 and any external device under the control of the control unit 110. For example, the short-range communication can be, but is not limited to, Bluetooth, Infrared Data Association (IrDA), or Near Field Communication (NFC).
[084] The electronic device 100 can include at least one of the mobile communication unit 120, the WLAN unit 131 and the short-range communication unit 132. For example, the electronic device 100 can have any combination of the communication unit mobile 120, WLAN unit 131 and short range communication unit 132. In various embodiments of this disclosure, "a communication unit" is the term used to include mobile communication unit 120 and subcommunication unit 130.
[085] The multimedia unit 140 may include the broadcast communication unit 141, the audio reproduction unit 142 or the video reproduction unit 143. The broadcast communication unit 141 may receive a broadcast signal (for example, a TV broadcast signal, a radio broadcast signal, or a data broadcast signal) and any additional information (for example, an Electronic Program Guide (EPG) or an Electronic Service Guide (ESG)) from a broadcasting station via a suitable antenna (not shown) under the control of the control unit 110. Then the broadcast communication unit 141 can reproduce the received signal and information, using the touchscreen, a video encoder / decoder unit (not shown) and an audio encoder / decoder unit (not shown).
[086] The audio reproduction unit 142 can play an audio source (for example, an audio file having the file extension mp3, wma, ogg or wav) stored in the storage unit 175 or received from any external entity, using the audio encoder / decoder unit under the control of the control unit 110.
[087] The video reproduction unit 143 can play a digital video file (for example, a file having the file extension mpeg, mpg, mp4, avi, mov or mkv) stored in the storage unit 175 or received from any external entity, using the video encoder / decoder unit under the control of the control unit 110. Many applications that can be installed on the electronic device 100 may allow audio and / or video playback using the audio encoder / decoder unit and / or the video encoder / decoder unit.
[088] As will be understood by those skilled in the art, many types of video / audio encoding / decoding units have been manufactured and sold. In addition, the video playback unit 143 can play an audio source, using the video encoder / decoder unit or the audio encoder / decoder unit.
[089] The multimedia unit 140 may include the audio reproduction unit 142 and the video reproduction unit 143, excluding the broadcast communication unit 141. In another embodiment, the audio reproduction unit 142 and the unit video playback units 143 can be included in control unit 110. In various embodiments of this disclosure, "a video encoder / decoder unit" is the term used to include one or more video encoder / decoder units. Similarly, "an audio encoder / decoder unit" is the term used to include one or more audio encoder / decoder units.
[090] The camera unit 150 can include at least one of the first camera 151 and the second camera 152, which are arranged on the front and rear sides of the electronic device 100 respectively. Each of the first and second cameras 151 and 152 can capture an image and / or record a video. Each camera 151 or 152 can have an auxiliary light source (for example, flash 153) to provide a sufficient amount of light.
[091] The camera unit 150 can further include at least one additional camera (not shown) that is arranged close to the first and second cameras 151 and 152. For example, the distance between the additional camera and the first or second camera may be within a range of 2 cm to 8 cm. This combination of cameras can allow a three-dimensional image or a 3D video to be obtained.
[092] The GPS unit 155 receives radio waves from a plurality of GPS satellites (not shown) in Earth orbit. The electronic device 100 can calculate a current position, using the Time of Arrival (ToA) from such GPS satellites to the GPS unit 155.
[093] The input / output unit 160 can include at least one of the key 161, the microphone 162, the speaker 163, the vibration motor 164, the connector 165, the keypad 166 and the input unit 167.
[094] The key 161 can be formed as a touch-sensitive virtual key instead of a mechanical key. Additionally, the key 161 can be displayed on the touchscreen 190.
[095] The microphone 162 receives voice or sound from the outside and creates an electrical signal under the control of the control unit 110. This electrical signal can be converted by the audio encoder / decoder unit and then stored in the storage unit 175 or output through the speaker 163. One or more microphones 162 may be located at the front, side and rear of the electronic device 100. Alternatively, one or more microphones 162 may be located only on the side.
[096] The speaker 163 can output, to the outside, sounds corresponding to various signals (for example, a wireless signal, a broadcast signal, an audio source, a video file, an image, etc.) from the mobile communication unit 120, the subcommunication unit 130, the multimedia unit 140 or the camera unit 150, using the audio encoder / decoder unit under the control of the control unit 110. The speaker 163 can also output a sound (for example, a keystroke tone or a ringtone) corresponding to a particular function performed by the electronic device 100.
[097] The vibration motor 164 can convert an electrical signal into mechanical vibration under the control of the control unit 110. For example, when a request for a voice call is received from a calling device (not shown), the vibration motor 164 operates in the case where the electronic device 100 is in a vibration mode. One or more vibration motors 164 can be used in the electronic device 100 to transfer a vibration to the entire electronic device 100 or to a part of it.
[098] The connector 165 can be used as an interface to connect the electronic device 100 to an external device (not shown) or to a power source (not shown). Using a cable connected to connector 165, electronic device 100 can transmit data stored in storage unit 175 to any external device or can receive data from any external device under control of control unit 110. Additionally, using a cable connected to connector 165, electronic device 100 can receive electrical energy from a power source and recharge a battery (not shown) incorporated in it.
[099] The keypad 166 can receive a key input from a user to control the electronic device 100. The keypad 166 may be a mechanical keypad (not shown) formed on the electronic device 100 and / or a virtual keypad (not shown) displayed on the touchscreen 190. In various embodiments, a mechanical keypad may not be used depending on the performance or structure of the electronic device 100.
[100] The input unit 167 can touch or select any object (for example, a menu, text, an image, a figure, an icon, etc.) displayed on the touchscreen 190. For example, the input unit 167 can touch a touchscreen of capacitive type, resistive type, electromagnetic induction type, or Electromagnetic Resonance (EMR) type, or can enter characters and more by means of a virtual keyboard.
[101] The sensor unit 170 may include the temperature sensor 171 to detect the outside temperature and the humidity sensor 172 to detect the outside humidity. Various sensors can be added to or removed from the sensor unit 170, depending on the performance of the electronic device 100. For example, a proximity sensor (not shown) to detect the approach of something, a luminance sensor (not shown) to detect the amount of light around the electronic device 100, an acceleration sensor (not shown) to detect a triaxial slope of the electronic device 100 and / or a gravity sensor to detect the direction of gravity can be included in the sensor unit 170.
[102] At least one sensor included in sensor unit 170 can detect conditions from electronic device 100 or environmental information around electronic device 100, and then transmit a detection signal to control unit 110.
[103] The storage unit 175 can store signals or data input / output in connection with operations of the mobile communication unit 120, the subcommunication unit 130, the multimedia unit 140, the camera unit 150, the GPS unit 155, the input / output unit 160, the sensor unit 170 and the touchscreen 190 under the control of the control unit 110. The storage unit 175 can store a specific control program for the control of the electronic device 100 or the control unit 110, various applications offered by a manufacturer or received from any external entity, a related Graphical User Interface (GUI), images used to provide such a GUI, user information, documents, databases and / or related data.
[104] In various embodiments of this disclosure, a "storage unit" is the term used to include the storage unit 175, the ROM 112 and the RAM 113 in the control unit 110, and / or a memory card (not shown) (for example, a micro Secure Digital (SD) card) mounted on the electronic device 100. This storage unit can be formed from non-volatile memory, volatile memory, a Hard Disk Drive (HDD), or a Solid State Drive (SSD).
[105] The power supply unit 180 can supply electrical power to one or more batteries (not shown) arranged in the electronic device 100 under the control of the control unit 110. Additionally, the power supply unit 180 can receive electrical power from an external power source (not shown) via a cable (not shown) connected to the connector 165.
[106] The touchscreen 190 can offer a user a Graphical User Interface (GUI) corresponding to various services (for example, calling, data transmission, broadcasting, image capture, video, applications, etc.). The touchscreen 190 transmits analog signals corresponding to one or more touch inputs introduced via the GUI to the touchscreen controller 195. The touchscreen 190 can receive one or more touch inputs from a user's body (for example, a finger) or from the input unit 167.
[107] In various embodiments of this disclosure, a touch is not limited to contact between the touchscreen 190 and a user's body or the input unit 167, and can include a non-contact action (for example, hovering within a detectable distance of 30 mm or less). This detectable distance can vary depending on the performance or structure of the electronic device 100.
[108] The touchscreen 190 can be of resistive type, capacitive type, infrared type or acoustic wave type.
[109] In addition, the touchscreen 190 may include an Electromagnetic Resonance (EMR) pad capable of detecting contact from an active-type pointing pen (not shown).
[110] This pen can contain a coil and can form a magnetic field at a certain point on the EMR pad through the coil. The EMR pad can detect the touch point of the pen on the touchscreen 190 by detecting the location of the magnetic field.
[111] The touchscreen controller 195 receives signals (for example, X and Y coordinates of a touch point) corresponding to one or more touch inputs on the touchscreen 190 and transmits the received signals to the control unit 110. Then the control unit 110 can control the touchscreen 190, using the signals received from the touchscreen controller 195. For example, in response to a touch input, the control unit 110 can indicate selection of a shortcut icon (not shown) displayed on the touchscreen 190, or can run an application corresponding to the selected shortcut icon.
[112] The control unit 110 can calculate the X and Y coordinates corresponding to a touch point, using signals received from the touchscreen controller 195. In various embodiments of this disclosure, the touchscreen controller 195 controls the touchscreen 190 and, depending on the performance or structure of the electronic device 100, can be contained in the control unit 110.
[113] The elements of the electronic device 100 discussed above in figure 1 may be essential or optional depending on the performance of the electronic device 100. Additionally, as will be understood by those skilled in the art, the locations of such elements can be varied according to the performance or structure of the electronic device 100.
[114] Figures 2A and 2B are exploded views illustrating layers of screen displayed on an electronic device's touchscreen according to one embodiment of the present disclosure.
[115] The electronic device may include the touch screen to allow an image to be displayed and to detect user input. In particular, the electronic device may display a semitransparent layer overlaid on the image. In addition, the electronic device can include the control unit to change the transparency of a part of the semitransparent layer corresponding to a specific point on the touchscreen from which a user input is detected.
[116] Referring to Figures 1 and 2A, an image 210 is displayed on the 190 touchscreen. Image 210 can be a background image displayed in an idle state of the electronic device.
[117] Additionally, the first layer 220 can be displayed to overlay the image 210. The image 210 can be considered to be displayed on a layer underlying the first layer 220. At least part of the first layer 220 may be in a semitransparent (or translucent) state in such a way that the image 210 can be seen obscurely through the first layer 220. However, the first layer 220 may contain graphic objects displayed on it, such as a temperature object 230, a humidity object 240 and a current-time object 250. The graphic objects can be opaque, or they can also be semitransparent.
[118] The first layer 220 can have its transparency changed under the control of the control unit 110. Transparency can be expressed through numbers from 0 to 100. For example, transparency '0' indicates an opaque state in which the image 210 is completely invisible. Transparency '50' indicates a semitransparent state in which the image 210 is obscured. Transparency '10' indicates another semitransparent state in which the image 210 is almost invisible. Transparency '100' indicates a transparent state in which the image 210 is completely visible. A semitransparent state can mean any state having a transparency excluding '0' and '100'. Under the control of the control unit 110, the transparency of the first layer 220 or of at least part of the first layer (for example, excluding any graphic objects) can be varied from 0 to 100.
[119] Graphic objects 230, 240 and 250 can be displayed opaque in the first layer 220. Even if the transparency of the first layer 220 is changed, graphic objects 230, 240 and 250 indicating temperature, humidity and current time can remain in an opaque state. The first layer 220 can have one region to allow a change in transparency and the other region to reject a change in transparency. Regions of graphic objects 230, 240 and 250 can reject a change in transparency, and the other regions can allow a change in transparency.
[120] Depending on the transparency of the first layer 220, image 210 may be displayed obscurely or clearly on the touchscreen. For example, when the first layer 220 is displayed, the initial transparency can be '70' indicating a semi-transparent state. In this semi-transparent state the background image 210 can be displayed less obscurely.
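For illustration, the transparency scale of paragraph [118] and the object regions of paragraph [119] that reject transparency changes can be modeled as a minimal Kotlin sketch in which the layer is reduced to a grid of cells; the grid model and all names are assumptions, not part of the disclosure:

```kotlin
// Illustrative model of the first layer: each cell holds a transparency value
// from 0 (opaque) to 100 (fully transparent); cells covered by graphic objects
// (clock, temperature, humidity) are locked and reject transparency changes.
class SemiTransparentLayer(rows: Int, cols: Int, initialTransparency: Int = 70) {
    private val transparency = Array(rows) { IntArray(cols) { initialTransparency } }
    private val locked = Array(rows) { BooleanArray(cols) }

    fun lockCell(r: Int, c: Int) { locked[r][c] = true }  // e.g., a graphic object region

    fun setTransparency(r: Int, c: Int, value: Int) {
        if (r !in transparency.indices || c !in transparency[r].indices) return
        if (!locked[r][c]) transparency[r][c] = value.coerceIn(0, 100)
    }

    fun transparencyAt(r: Int, c: Int): Int = transparency[r][c]
}

// How clearly the background image shows through a cell: 0.0 invisible, 1.0 clear.
fun backgroundVisibility(transparency: Int): Double = transparency / 100.0
```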
[121] Figure 2B shows another example of an image and a screen layer overlaid on the touchscreen. Referring to figures 1 and 2B, an image 292 is displayed on touch screen 190. Image 292 may be a background image displayed in an idle state of the electronic device. In addition, the first layer 290 can be displayed to overlay the image 292.
Since the first layer 290 is in a semi-transparent state, the image 292 can be seen obscurely through the first layer 290.
[122] The first layer 290 can have its transparency changed by the control unit 110. Transparency can be expressed by numbers from 0 to 100. For example, transparency '0' indicates an opaque state in which the image 292 is completely invisible. Transparency '50' indicates a semitransparent state in which the image 292 is obscured. Transparency '10' indicates another semitransparent state in which the image 292 is almost invisible. Transparency '100' indicates a transparent state in which the image 292 is completely visible. A semitransparent state can mean any state having a transparency excluding '0' and '100'. Under the control of the control unit 110 the transparency of the first layer 290 can be varied from 0 to 100.
[123] Depending on the transparency of the first layer 290, the image 292 can be displayed obscurely or clearly on the touchscreen. For example, when the first layer 290 is displayed, the initial transparency can be '70' indicating a semitransparent state. In this semitransparent state the image 292 can be displayed less obscurely.
[124] Additionally, the second layer 280 can be displayed to overlap the first layer 290. The second layer 280 can contain an opaque region to display information associated with conditions of the electronic device. For example, the second layer 280 may contain graphic objects displayed therein, such as a temperature object 288, a humidity object 286, a current time object 284 and a status bar 282.
[125] Graphic objects 282, 284, 286 and 288 can be displayed opaque on second layer 280. Second layer 280 may have an opaque region for displaying information such as graphic objects 282, 284, 286 and 288, and the other region that remains in a transparent state.
[126] Figure 3 is a flow chart illustrating a process associated with a touch and drag action detected on an electronic device's touchscreen according to one embodiment of the present disclosure. Figures 4A and 4B show screenshots illustrating graphic objects represented on a screen by a touch and drag action detected on an electronic device's touchscreen according to one embodiment of the present disclosure.
[127] Referring to figures 1, 3, 4A and 4B, an image is displayed on the touchscreen 190 in operation 310. The touchscreen 190 can detect a user input, which can be a touch contacting the touchscreen with the user's finger or a separate input unit such as a stylus, and / or a drag action moving such a finger or stylus while maintaining contact with the touchscreen. When a user's finger touches the touchscreen 190, the touchscreen 190 can transmit the coordinates of the touch point to the control unit 110.
[128] In operation 320, the control unit 110 can control a display of the first layer 400 superimposed on the image. The first layer 400 is displayed semitransparent, so that the image displayed on the touchscreen 190 can be seen obscurely.
[129] The first layer 400 can contain several graphic objects displayed on it, such as a temperature icon 410, a humidity icon 420, a temperature value 430 and a humidity value 440.
[130] In operation 330, the control unit 110 can detect a touch and drag action on the touchscreen 190. Specifically, the first layer 400 is displayed on the touchscreen 190. When a user's finger 450 touches the touchscreen 190, the touchscreen 190 detects a touch action at a touch point on it. Then the touchscreen 190 can transmit the coordinates of the touch point to the control unit 110. In addition, the control unit 110 can detect a continuous movement of the touch action. This movement can be a drag action.
[131] Referring to figure 4B, a user's finger 470 comes into contact with the first layer displayed on the touchscreen 190 and moves to create a drag path 460. The drag path 460 can comprise a center line of the user's finger movement and a predetermined area on either side of the center line. Alternatively, the drag path 460 may comprise the entire touch area of the user's finger at each point along the drag movement. In response to a touch and drag action like this, in operation 340 the control unit 110 changes the transparency of a specific region of the first layer in which the touch and drag action is detected. For example, if the drag path 460 is created on the first layer via the user's finger 470, the transparency of the drag path 460 is changed in such a way that the drag path 460 becomes transparent. The control unit 110 can receive the coordinates of the drag path 460 from the touchscreen 190 and then can change the transparency of a specific region corresponding to the drag path 460 in the first layer.
[132] For example, in the case where the initial transparency of the first layer is '70', the transparency of the drag path 460 can be changed to '100'. Therefore, drag path 460 becomes transparent, and the image underlying the drag path can be seen clearly.
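Reusing the SemiTransparentLayer sketch above, operation 340 can be illustrated by marking every cell along the drag path as fully transparent, including an assumed one-cell area on either side of the center line (the path representation and radius are assumptions):

```kotlin
// Each reported touch coordinate clears a small neighborhood of cells,
// approximating the center line plus a predetermined area on either side.
fun applyDragPath(
    layer: SemiTransparentLayer,
    path: List<Pair<Int, Int>>,   // (row, col) samples along the drag
    radius: Int = 1               // assumed half-width of the cleared path
) {
    for ((r, c) in path) {
        for (dr in -radius..radius) {
            for (dc in -radius..radius) {
                layer.setTransparency(r + dr, c + dc, 100)  // '100' = transparent
            }
        }
    }
}
```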
[133] Figure 5 is a flow chart illustrating a process of displaying a layer with a variable attribute according to temperature on an electronic device touch screen according to one embodiment of the present disclosure. Figures 6A and 6B show screen shots illustrating layers displayed with different attributes according to temperature on an electronic device touch screen according to one embodiment of the present disclosure.
[134] The first layer displayed on the touchscreen can have at least two attributes, which can be determined depending on the temperature. If a current temperature is higher than a predefined temperature, the control unit 110 can assign the first attribute to the first layer. In contrast, if a current temperature is equal to or less than a predefined temperature, the control unit 110 can assign the second attribute to the first layer.
[135] Referring to figures 1, 5, 6A and 6B, the touch screen 190 displays an image in operation 510. The electronic device may include temperature sensor 171. In operation 520, temperature sensor 171 detects a current temperature. Then the temperature sensor 171 sends a detected temperature value to the control unit 110.
[136] Control unit 110 can determine the attribute of the first layer, based on a temperature value received from temperature sensor 171. If a current temperature is greater than a predefined temperature, control unit 110 can display an image of water vapor in the first layer. In contrast, if a current temperature is equal to or less than a predefined temperature, the control unit 110 can display a frost image on the first layer.
[137] For example, in case a predefined temperature is zero degrees, the control unit 110 determines in operation 530 whether a current temperature is greater than zero degrees. If a current temperature is greater than zero degrees, the control unit 110 assigns a 'water vapor' attribute to the first layer. For this reason, in operation 540, an image of water vapor or a similar pattern is displayed on the first layer. Figure 6A shows the first layer 610 having a 'water vapor' attribute and covered with a water vapor image. If a current temperature is equal to or less than zero degrees, the control unit 110 assigns a 'frost' attribute to the first layer. For this reason, in operation 550, a frost image or a similar pattern is displayed on the first layer. Figure 6B shows the first layer 620 having a 'frost' attribute and covered with a frost image. Although the first layer has two attributes (that is, water vapor and frost) in this modality, any other types of attributes or many more attributes can be used alternatively for the first layer.
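For illustration, the attribute decision of operations 530 to 550 reduces to a threshold comparison; a minimal sketch, assuming the zero-degree threshold of the example and illustrative attribute names:

```kotlin
// The first layer carries one of two illustrative attributes chosen by
// comparing the current temperature with a predefined threshold.
enum class LayerAttribute { WATER_VAPOR, FROST }

fun attributeFor(currentTempCelsius: Double, thresholdCelsius: Double = 0.0): LayerAttribute =
    if (currentTempCelsius > thresholdCelsius) LayerAttribute.WATER_VAPOR
    else LayerAttribute.FROST  // at or below the threshold, frost is displayed
```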
[138] The control unit 110 can determine a current temperature based on data received from any external entity. For example, by accessing the Internet via the mobile communication unit 120 or the subcommunication unit 130, the control unit 110 can determine a current temperature outside the electronic device 100 or in a specific area.
[139] Figure 7 is a flow chart illustrating a process associated with changes in temperature in an electronic device according to one embodiment of the present disclosure. Figures 8A and 8B show screenshots illustrating variations in transparency according to temperature changes on an electronic device touch screen according to one embodiment of the present disclosure.
[140] Referring to figures 1, 7, 8A and 8B, the temperature sensor 171 detects a current temperature in operation 710. Then, in operation 720, the control unit 110 receives temperature data from the temperature sensor 171 and determines whether there is a change in temperature. Alternatively, the control unit 110 can determine a current temperature or temperature changes based on data received from any external entity. In operation 730, in response to a change in temperature, the control unit 110 changes the transparency of the first layer displayed in a semitransparent state. For example, as shown in figure 8A, the first layer 810 initially has a transparency of '60'. If a change in temperature is detected, the control unit 110 changes the transparency of the first layer. Specifically, the transparency can be increased according to an ascending temperature and decreased according to a descending temperature, or vice versa. Figure 8B shows that the transparency of the first layer 820 has been changed to '70'. Because of the increase in transparency, the background image underlying the first layer can be seen more clearly.
[141] The control unit 110 can change the transparency of all regions of the first layer according to weather information that includes at least one of temperature, humidity and wind direction. By accessing an external weather server via the mobile communication unit 120 or the subcommunication unit 130, the control unit 110 can receive weather information or data from the weather server.
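For illustration, the adjustment of operation 730 can be sketched as a mapping from temperature to transparency. The disclosure only states that transparency rises and falls with temperature (or vice versa), so the linear scale, the step per degree and the function name below are assumptions:

```kotlin
// Transparency grows with temperature above a reference point and shrinks
// below it, clamped to the 0..100 scale used throughout the disclosure.
fun transparencyForTemperature(
    baseTransparency: Int,          // e.g., '60' as in figure 8A
    referenceTempCelsius: Double,
    currentTempCelsius: Double,
    stepPerDegree: Double = 2.0     // assumed sensitivity
): Int {
    val delta = (currentTempCelsius - referenceTempCelsius) * stepPerDegree
    return (baseTransparency + delta.toInt()).coerceIn(0, 100)
}
```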
[142] Figure 9 is a flow chart illustrating a process associated with the removal of a touch and drag action from a touchscreen of an electronic device according to one embodiment of the present disclosure. Figure 10 shows a screen capture illustrating a water drop object displayed at the removal point of a touch and drag action on the first layer of an electronic device's touchscreen according to one embodiment of the present disclosure.
[143] Referring to figures 1, 9 and 10, the first layer is displayed in a semitransparent state, superimposed on the background image displayed on the 190 touch screen. In operation 910, an image of water vapor is displayed on the first layer. That is, the first layer 1020 has a water vapor attribute, so that the image of water vapor 1022 can be displayed on the first layer 1020. When a user performs a touch and drag action over a certain region of the first layer 1020 displayed on touch screen 190, touch screen 190 detects touch and drag action in operation 920 and then sends coordinates of the detected region to control unit 110.
[144] In operation 930, the control unit 110 changes the transparency of the detected region of the first layer in such a way that a background image 1024 can be seen. After the touch and drag action is removed, the control unit 110 can control a water drop object 1026 to be displayed at the point where the touch and drag action is removed in operation 940. The water drop object 1026 is a type of graphic object shaped like a drop of water. The water drop object 1026 can be displayed at a specific location on the first layer under the control of the control unit 110. Also, the water drop object 1026 can be moved in the direction of gravity under the control of the control unit 110. In other embodiments, water drop objects can be displayed at one or more locations or at each locally lowest point along the drag path, with the lowest point being defined according to the direction of gravity and a locally lowest point referring to a part of the drag path that is lower than the immediately adjacent areas.
[145] That is, in operation 930, the transparency of a touch and drag region is changed, and for this reason the background image 1024 is seen clearly. Additionally, in operation 940, the water drop object 1026 is displayed at the end point of the touch and drag region.
[146] Figure 11 is a flow chart illustrating a process after a given time has elapsed from the display of a water drop object on a touchscreen of an electronic device according to one embodiment of the present disclosure. Figures 12A and 12B show screenshots illustrating the movement of a water drop object after a certain time has elapsed from the display of the water drop object on an electronic device's touchscreen according to one embodiment of the present disclosure.
[147] Referring to figures 1, 11, 12A and 12B, the first layer 1240 having an attribute of a water vapor image 1242 is displayed in a semitransparent state on the touchscreen 190, superimposed on the background image. When a user's finger (not shown) touches and is dragged along part of the first layer 1240, a touch and drag region 1244 of the first layer 1240 is transparently displayed in operation 1110.
[148] When a certain time has elapsed in operation 1120 after the transparent display, a water drop object is displayed on the first layer in operation 1130. That is, after a predetermined time has elapsed, the control unit 110 can control the first layer 1240 to display the water drop object.
[149] The water drop object can be displayed close to a specific region in which a touch and drag action is detected. Alternatively, the water drop object can be displayed in a random position near the edge of the first layer. When another period of time has elapsed in operation 1140 after the display of the water drop object on the first layer, the water drop object can be moved in the direction of gravity in operation 1150.
[150] For example, as shown in figure 12A, a user finger touches the touchscreen 190 and is dragged in a specific direction 1214. Then the control unit 110 receives coordinates from a touch and drag region 1212 from the 190 touch screen, and controls the 1212 touch and drag region to be displayed transparently. That is, a specific region 1212 of the first layer 1210, corresponding to a finger drag path, is displayed in a transparent state on the touch screen 190. When a given time has elapsed after the touch and drag region 1212 is displayed transparently, a water drop object 1216 is displayed at the end point of the touch and drag region 1212 on the first layer 1210. Alternatively, as soon as a touch and drag action is removed, the water drop object 1216 can be displayed in the first layer 1210. As noted earlier, the drop water object can also or alternatively be positioned at lower located points along the drag path.
[151] When another period of time has elapsed after the water drop object 1216 is displayed on the first layer 1210, the control unit 110 can control the water drop object 1216 to be moved in the direction of gravity. That is, the water drop object 1216 created at a drag removal point as shown in figure 12A begins to move in the direction of gravity after a given time has elapsed, as indicated by reference number 1246 in figure 12B. The control unit 110 can determine the direction of gravity when receiving signals from a gyro sensor (not shown) and / or the acceleration sensor (not shown).
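For illustration, the behavior of operations 1130 to 1150 can be sketched by stepping the water drop object along the gravity direction obtained from an accelerometer-like reading; the data model, step size and names are assumptions:

```kotlin
import kotlin.math.hypot

// Illustrative water drop position in layer coordinates.
data class WaterDrop(var x: Double, var y: Double)

// Moves the drop one step along the measured in-plane gravity direction.
fun moveInGravityDirection(
    drop: WaterDrop,
    gravityX: Double,   // e.g., from an acceleration sensor reading
    gravityY: Double,
    step: Double = 1.0  // assumed distance per animation tick
) {
    val norm = hypot(gravityX, gravityY)
    if (norm == 0.0) return  // no in-plane gravity component detected
    drop.x += step * gravityX / norm
    drop.y += step * gravityY / norm
}
```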
[152] Figure 13 is a flow chart illustrating a process associated with a touch and drag action detected on a layer with a frost image on an electronic device's touch screen according to an embodiment of the present disclosure. Figures 14A to 14C show screenshots illustrating an accumulated frost object displayed in response to the removal of a touch and drag action detected on a layer with a frost image on an electronic device touch screen according to an embodiment of the present disclosure.
[153] In the event that a frost image is displayed on the first layer, the control unit 110 can create an accumulated frost object at a drag removal point.
[154] Referring to figures 1, 13 and 14A, the first layer 1410 is displayed in a semitransparent state, superimposed on the background image displayed on the touch screen 190. In operation 1310, a frost image is displayed on the first layer 1410. That is, the first layer 1410 has a frost attribute, so the frost image can be displayed on the first layer 1410. When a user performs a touch and drag action over a certain region of the first layer 1410 displayed on the touch screen 190, the touch screen 190 detects the touch and drag action in operation 1320 and then sends coordinates of the detected region to the control unit 110. In operation 1330, the control unit 110 changes the transparency of the detected region of the first layer in such a way that a background image 1412 can be seen clearly. After the touch and drag action is removed, the control unit 110 can control an accumulated frost object 1416 to be displayed at the touch and drag removal point in operation 1340.
[155] For example, as shown in figure 14B, a user's finger touches the touch screen 190 and is dragged in a specific direction 1430. Then the control unit 110 receives coordinates of a touch and drag region 1420 from the touch screen 190, and controls the touch and drag region 1420 to be displayed transparently. In the case where the initial transparency of the first layer is '70', the transparency of the touch and drag region 1420 is changed to '100', indicating a transparent state. For this reason, the touch and drag region 1420 is displayed transparently. Additionally, at the end point 1440 of the touch and drag region 1420, an accumulated frost object 1450 is displayed. As shown in greater detail in figure 14C, the accumulated frost object 1450 can be displayed in a raised, heaped-up form at the drag removal point.
[156] Figure 15 is a flow chart illustrating a process performed after a given time has elapsed from the display of a region whose transparency has been changed on a touch screen of an electronic device according to an embodiment of the present disclosure. Figures 16A to 16C show screenshots illustrating the disappearance of a region whose transparency has been changed, after a given time has elapsed from the display of the region on an electronic device's touch screen according to an embodiment of the present disclosure.
[157] The control unit 110 can control a specific region (for example, one where a user input occurs) to be changed in transparency as time passes. The user input can be a touch and drag action detected on the touch screen.
[158] Referring to figures 1, 15 and 16A to 16C, the first layer 1600 is displayed in a semitransparent state, superimposed on the background image displayed on the touch screen 190. When a user performs a touch and drag action along a certain region of the first layer 1600 displayed on the touch screen 190, the touch screen 190 detects the touch and drag action and then sends coordinates of the detected region to the control unit 110. In operation 1510, the control unit 110 changes the transparency of the detected region of the first layer. Then the control unit 110 determines whether a given time has elapsed after the transparent display of the detected region. When a given time has elapsed in operation 1520, the control unit 110 changes the region back toward an opaque state in operation 1530. The change toward an opaque state is carried out by means of a gradual decrease in transparency. That is, the control unit 110 gradually decreases the transparency back to the initial transparency of the first layer. For example, the first layer is displayed at the initial transparency of '70' on the touch screen. When a touch and drag action is received on a specific region of the first layer, the transparency of the specific region is changed from '70' to '100'. Then, after twenty seconds, the control unit 110 begins to gradually decrease the transparency of the specific region from '100' to '70'.
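A minimal sketch of the timing behavior described above, assuming a linear fade and an illustrative five-second fade duration (the disclosure only says the decrease is gradual):

```python
# Sketch only: transparency of a touched region as a function of elapsed
# time, matching the behavior described above. The 5-second fade duration
# is an assumed illustrative value; the patent only says "gradual".

INITIAL = 70      # transparency of the untouched first layer
TOUCHED = 100     # fully transparent right after the touch and drag
HOLD_S  = 20.0    # seconds the region stays fully transparent
FADE_S  = 5.0     # assumed duration of the gradual return

def region_transparency(elapsed_s):
    if elapsed_s <= HOLD_S:
        return TOUCHED
    t = min((elapsed_s - HOLD_S) / FADE_S, 1.0)   # 0..1 fade progress
    return TOUCHED + (INITIAL - TOUCHED) * t      # linear 100 -> 70

for s in (0, 20, 22.5, 25, 60):
    print(s, region_transparency(s))
# 0 -> 100, 20 -> 100, 22.5 -> 85.0, 25 -> 70.0, 60 -> 70.0
```

If each touched region keeps its own start timestamp, the regions return to opacity in the order in which they were formed, which is consistent with the sequential disappearance shown in figures 17A to 17D.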
[159] Figure 16A shows that a touch and drag region 1610 is transparently displayed on the first layer 1600, which is in a semitransparent state. A dotted line is shown merely to indicate the touch and drag region 1610 and need not actually be displayed. Figure 16B shows that the touch and drag region 1630 changes to a slightly semitransparent state as time passes. At this time, the transparency of the touch and drag region 1630 is still different from that of the first layer 1620. Figure 16C shows that the touch and drag region becomes completely invisible in the first layer 1650 as more time passes.
[160] Figures 17A to 17D show screenshots illustrating the sequential disappearance of regions whose transparency was changed, after a certain time has elapsed from the display of each region on an electronic device's touch screen according to an embodiment of the present disclosure.
[161] Figure 17A shows that a user's finger 1710 forms a path 1712 with a touch and drag action on the first layer 1700. This path 1712 is displayed transparently on the first layer 1700. Figure 17B shows that the user's finger additionally forms another path 1724 with a touch and drag action on the first layer 1720. At this time, the previously formed path 1722 is displayed more opaquely because its transparency has decreased as time has passed. Figure 17C shows that the user's finger additionally forms yet another path 1734 with a touch and drag action on the first layer 1730. At this time, the previously formed path 1732 is displayed more opaquely because its transparency has decreased over time. In addition, the path formed first (1722 in figure 17B) is completely invisible, having assumed the same opacity as the surrounding first layer. Figure 17D shows a screen after a further lapse of time. Only the last-formed path 1742 is displayed on the first layer 1740, and the other, previously formed paths are completely invisible.
[162] As time passes, the display on the first layer changes sequentially. Also, the several paths gradually disappear in the order in which they were formed, as shown in figures 17A to 17D. That is, the path formed first, 1712, disappears first, and the path formed last, 1724, disappears last.
[163] Figures 18A to 18C show screenshots illustrating a layer whose opacity is changed when a touch and drag action has been performed over about 90% of the region of an electronic device's touch screen according to an embodiment of the present disclosure.
[164] Referring to figures 18A to 18C, a user's finger (not shown) forms a path 1810 as a contact region on the first layer 1800. Then the transparency of the contact path 1810 is changed. Additional, continuous contact on the first layer 1800 enlarges the transparent region of the first layer. If the contact region occupies about 90% of the total region of the first layer, the control unit 110 controls the total region of the first layer to return to a semitransparent state.
[165] Figure 18A shows that the path 1810 caused by a touch and drag action is displayed transparently on the first layer 1800. Figure 18B shows that about 90% of the region of the first layer 1820 is displayed transparently. Figure 18C shows that the total region of the first layer 1830 returns to the initial semitransparent state because a touch and drag action has been detected over about 90% of the region of the first layer.
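One plausible way to decide that a touch and drag action has covered "about 90%" of the first layer is to rasterize the contact paths onto a coarse occupancy grid; the cell size and all names below are illustrative assumptions, not part of the disclosure:

```python
# Sketch only: estimating what fraction of the first layer has been made
# transparent by touch, using a coarse occupancy grid. Cell size and names
# are illustrative assumptions.

class CoverageGrid:
    def __init__(self, width, height, cell=20):
        self.cols = max(1, width // cell)
        self.rows = max(1, height // cell)
        self.cell = cell
        self.touched = set()

    def mark(self, x, y):
        """Mark the grid cell containing a touch coordinate as transparent."""
        self.touched.add((int(x) // self.cell, int(y) // self.cell))

    def fraction(self):
        return len(self.touched) / (self.cols * self.rows)

grid = CoverageGrid(200, 100, cell=20)   # 10 x 5 = 50 cells
for x in range(0, 200, 20):
    for y in range(0, 100, 20):
        grid.mark(x, y)                  # drag touches every cell
if grid.fraction() >= 0.9:
    print("reset layer to semitransparent")   # the behavior of figure 18C
```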
[166] Figure 19 is a flow chart illustrating a process associated with changes in humidity in an electronic device according to one embodiment of the present disclosure. Figures 20A to 20E show screenshots illustrating various ways of displaying a layer according to changes in temperature and humidity on an electronic device touch screen according to one embodiment of the present disclosure.
[167] Referring to figures 1, 19 and 20A to 20E, the humidity sensor 172 detects the current humidity in operation 1910. Then the control unit 110 receives detection results from the humidity sensor 172 and determines the humidity based on the detection results. In operation 1912, depending on the determined humidity, the control unit 110 can change the transparency of the first layer, which is in a semitransparent state.
[168] Alternatively, the humidity can be determined based on data received from an external entity.
[169] In operation 1914, depending on the determined humidity, the control unit 110 can change the display frequency of water drop objects displayed on the first layer. For example, in the case where the first layer has a 'water vapor' attribute, the control unit 110 can designate a higher display frequency for water drop objects and can also increase the speed of water drop objects moving in the direction of gravity when the humidity is between 100% and 90%. If the humidity is between 90% and 20%, the control unit 110 can designate a slightly higher display frequency for water drop objects and can also slightly increase the speed of water drop objects moving in the direction of gravity. If the humidity is between 20% and 10%, the control unit 110 can designate a lower display frequency for water drop objects and can also decrease the speed of water drop objects moving in the direction of gravity. If the humidity is between 10% and 0%, the control unit 110 can designate a much lower display frequency, or none at all, for water drop objects and can also control water drop objects so that they do not move in the direction of gravity.
[170] For example, if the humidity is between 100% and 90%, ten water drop objects are displayed on the touch screen. If the humidity is between 90% and 20%, five water drop objects are displayed on the touch screen. If the humidity is between 20% and 10%, one or two water drop objects are displayed on the touch screen. If the humidity is between 10% and 0%, no water drop object is displayed on the touch screen.
[171] The control unit 110 can also change the transparency of the first layer depending on the humidity. For example, the control unit 110 can set the transparency of the first layer to 30 when the humidity is between 100% and 90%. Also, the control unit 110 can set the transparency of the first layer to 50 when the humidity is between 90% and 20%. Further, the control unit 110 can set the transparency of the first layer to 70 when the humidity is between 20% and 10%. Also, the control unit 110 can set the transparency of the first layer to 100 when the humidity is between 10% and 0%.
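Paragraphs [169] to [171] together describe a banded mapping from humidity to drop count, drop speed and layer transparency. A compact sketch of that mapping follows (the drop counts and transparency values follow the text; the speeds and the handling of exact boundary values are assumptions for illustration):

```python
# Sketch only: the humidity bands described above, collapsed into one lookup.
# Each tuple is (water-drop count, drop speed in px/frame, layer transparency).
# Drop counts and transparency values come from the text; the speeds are
# assumed illustrative values, and boundary handling (>=) is an assumption.

BANDS = [
    (90, (10, 4.0, 30)),   # 90%-100%: many fast drops, obscured layer
    (20, (5,  2.0, 50)),   # 20%-90%
    (10, (2,  1.0, 70)),   # 10%-20%: "one or two" drops in the text
    (0,  (0,  0.0, 100)),  # 0%-10%: no drops, fully transparent layer
]

def vapor_params(humidity_percent):
    for lower_bound, params in BANDS:
        if humidity_percent >= lower_bound:
            return params
    return BANDS[-1][1]

print(vapor_params(95))   # -> (10, 4.0, 30)
print(vapor_params(15))   # -> (2, 1.0, 70)
```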
[172] Figure 20A shows a case where the humidity is between 100% and 90%. In this case, the first layer 2000 is displayed on the touch screen 190. Because of the higher humidity, the background image 2010 is seen obscurely. The transparency of the contact paths 2014 and 2016 formed by a user's finger (not shown) is changed. Water drop objects are formed on the contact paths 2014 and 2016. Another water drop object 2012 is displayed near an edge of the first layer 2000.
[173] Figure 20B shows another case, in which the humidity is between 20% and 10%. In this case, the contact paths 2026 and 2028 formed by a user's finger (not shown) are displayed on the first layer 2020. The background image 2022 is seen more clearly than the background image 2010 shown in figure 20A. Another water drop object 2024 is displayed near an edge of the first layer 2020. In the case where the humidity is between 20% and 10%, the control unit 110 can set the transparency of the first layer to 70.
[174] Figure 20C shows yet another case, in which the humidity is 0%. In this case the transparency of the first layer 2030 is changed, and the background image 2032 is seen clearly.
[175] However, in the case where the first layer has a 'frost' attribute, the control unit 110 can set the transparency of the first layer to 50 when the humidity is between 100% and 20%. Figure 20D shows the first layer 2040 in this case. Since the first layer 2040 has a 'frost' attribute, a frost image is displayed on the first layer 2040 and the background image 2042 is obscured. If a user's finger performs a touch and drag action on the touch screen 190, the resulting paths 2044 and 2046 are displayed transparently on the first layer 2040.
[176] Figure 20E shows the first layer 2050 displayed on the touch screen when the humidity is between 10% and 0%. In this case, the control unit 110 can set the transparency of the first layer 2050 to 100.
[177] In the case where the first layer has a 'frost' attribute, the control unit 110 may not create a water drop object.
[178] Figure 21 is a view illustrating a variation in the way a layer is displayed at a touch point according to the duration of a touch on an electronic device's touch screen according to an embodiment of the present disclosure.
[179] Referring to figure 21, when a user's finger 2110 comes into contact with the first layer displayed on the touch screen, the transparency of a touch point 2100 is changed. If the user's finger 2110 maintains the contact for a given time, the region whose transparency has been changed is gradually enlarged. That is, the touch point 2100, whose transparency was changed by the initial finger contact, changes to an enlarged point 2120 as time passes. In addition, the control unit can create a water drop object 2130 displayed on the first layer.
[180] The touch screen 190 detects contact of a user's finger on the first layer displayed on it. Then the control unit 110 receives coordinates of the contact region from the touch screen 190 and, based on the received coordinates, determines a specific region to have its transparency changed. In addition, the control unit 110 controls the touch screen 190 to change the transparency of the contact region. For this reason, the partial region 2100 of the first layer is displayed transparently. In addition, the control unit 110 detects whether the finger contact is maintained continuously. If the contacting finger is kept motionless for a given time, the control unit 110 can enlarge the region thus displayed transparently on the first layer.
[181] For example, if an initial region whose transparency was changed by a finger contact has a size of 70, this region can be enlarged to a size of 100 by continuous finger contact on the touch screen.
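The enlargement under a held touch can be sketched as a size value growing with the hold duration up to a cap; the sizes 70 and 100 echo the example above, while the growth rate and the linear growth model are assumptions for illustration:

```python
# Sketch only: enlarging the transparent region around a held touch point.
# The sizes 70 and 100 echo the example in the text; the growth rate and
# the linear growth model are assumptions.

START_SIZE = 70      # size of the region right after the initial contact
MAX_SIZE   = 100     # size reached after long enough continuous contact
GROWTH_PER_S = 10    # assumed enlargement per second of held contact

def region_size(hold_seconds):
    return min(START_SIZE + GROWTH_PER_S * hold_seconds, MAX_SIZE)

for s in (0, 1.5, 3, 10):
    print(s, region_size(s))   # 70 -> 85 -> 100, then capped at 100
```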
[182] Figures 22A to 22D show screenshots illustrating a process of releasing a locking state of a user interface in response to a user input on an electronic device's touch screen according to an embodiment of the present disclosure.
[183] The electronic device of this disclosure may include the display unit configured to display the user interface and having the touch screen. In addition, the electronic device may include the control unit configured to maintain a locking state of the user interface by controlling the touch screen to display a semitransparent layer superimposed on the user interface, to change the transparency of a partial region of the semitransparent layer when a user input is detected in the partial region, and to release the locking state of the user interface when the partial region occupies a specific percentage or more of the total region of the touch screen.
[184] Referring to figures 1 and 22A, the first layer 2210 is displayed in a semitransparent state on the touch screen 190. A plurality of icons for running applications is arranged behind the first layer 2210 and is obscured. In the case where the total region of the first layer 2210 is in a semitransparent state, the control unit 110 may reject the selection of such icons. For example, even if one of the icons displayed obscurely behind the first layer 2210 is touched, the touched icon is not selected and thus a corresponding application is not run.
[185] Returning to figure 2B, the touch screen 190 can display the image 292 discussed above, in which a plurality of icons are arranged. This image 292 will hereinafter be referred to as a desktop image. When one of these icons is touched and thus selected, a specific application linked to the selected icon can be run. That is, the plurality of icons can act as a kind of user interface.
[186] Returning to figure 22A, in the case where the first layer 2210 is displayed in a semitransparent state and superimposed on the desktop image, the control unit 110 can maintain the user interface in a locking state. The user interface can be changed from a locking state to a non-locking state, and vice versa, under the control of the control unit 110.
[187] Referring to figure 22B, the first layer 2220 is displayed in a semitransparent state on the touch screen 190. A user interface having a plurality of icons arranged on it is displayed behind the first layer 2220. If a touch and drag action happens on the first layer 2220, the control unit 110 receives coordinates of a touch and drag region 2222 from the touch screen 190. Then the control unit 110 changes the transparency of the touch and drag region 2222 in such a way that the region 2222 is in a transparent state. For this reason, some icons behind the first layer 2220 are clearly displayed in the transparent region 2222. As a result, the first layer 2220 now contains the transparent region 2222 and the semitransparent region 2224, depending on the user input. When the transparent region 2222 occupies 90% or more of the total region, the control unit 110 can change the locking state of the user interface to a non-locking state.
[188] Referring to figure 22C, many regions of the first layer 2230 displayed on the touch screen 190 are in a transparent state, and the remaining peripheral regions are in a semitransparent state. If the transparent region occupies 90% or more of the first layer 2230, the control unit 110 can change the locking state of the user interface to a non-locking state.
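The release rule described in paragraphs [187] and [188] amounts to a one-way state change driven by the fraction of the layer that has become transparent; a minimal sketch follows (the 90% threshold follows the text; everything else, including the names, is assumed):

```python
# Sketch only: releasing the user-interface lock once the transparent region
# reaches 90% of the first layer. `coverage` would come from something like
# the occupancy grid sketched earlier; names are illustrative.

class LockScreen:
    UNLOCK_THRESHOLD = 0.9

    def __init__(self):
        self.locked = True

    def on_coverage_changed(self, coverage):
        if self.locked and coverage >= self.UNLOCK_THRESHOLD:
            self.locked = False   # figure 22D: icons become selectable

screen = LockScreen()
screen.on_coverage_changed(0.5)
print(screen.locked)    # True: not enough of the layer is transparent yet
screen.on_coverage_changed(0.93)
print(screen.locked)    # False: locking state released
```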
[189] Figure 22D shows a user interface screen that is in a non-locking state. The first layer (not shown) is displayed transparently on the touch screen 190, so that a desktop image 2240 having icons arranged thereon is clearly seen. For this reason, a user can touch a desired icon among the icons arranged in the desktop image 2240 in order to run a corresponding application.
[190] Figures 23A to 23C show screenshots illustrating a process of releasing a locking state of a user interface in a partial region on an electronic device's touch screen according to an embodiment of the present disclosure.
[191] The electronic device may include the display unit configured to display a plurality of icons for running applications and having the touch screen to detect user input. In addition, the electronic device can include the control unit configured to reject a selection of the icons by controlling the touch screen to display a semitransparent layer superimposed on the icons, and to control at least one of the icons to remain in a selectable state by changing the transparency of a partial region of the semitransparent layer when a user input is detected in the partial region.
[192] Referring to figures 1 and 23A, a desktop image having a plurality of icons arranged thereon is displayed on the touch screen 190, and the first layer is displayed on top of it in a semitransparent state. The icons are obscured because of the first layer. In the case where the total region of the first layer is in a semitransparent state, the control unit 110 can maintain the user interface in a locking state. For example, even if an icon 2310 displayed obscurely behind the first layer is touched, the touched icon 2310 is not selected because of the locking state of the user interface.
[193] Referring to figure 23B, a desktop image having a plurality of icons arranged thereon is displayed on the touch screen 190, and the first layer 2320 is displayed on it in a semitransparent state. The first layer 2320 can contain a transparent region 2322 and a semitransparent region 2324. The transparent region 2322, which received a user input, is changed from a semitransparent state to a transparent state, while the semitransparent region 2324, which received no user input, remains in a semitransparent state. Some icons are located in the transparent region 2322 and are clearly displayed in comparison with the other icons located behind the semitransparent region 2324. The control unit 110 can allow the icons located in the transparent region 2322 to be selected. That is, if the transparency of a partial region of the first layer 2320 is changed in response to a user input, the control unit 110 can partially release the locking state of the user interface, so that only the transparent region 2322 is placed in a non-locking state. For this reason, if a certain icon 2326 located in the transparent region 2322 is touched and thus selected, the control unit 110 runs a specific application linked to the selected icon 2326.
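The partial release in paragraph [193] can be read as a per-icon hit test: an icon is selectable only when it lies within an already-transparent part of the layer. A sketch under that reading follows (the rectangle-containment rule and all names are assumptions, not the disclosure's stated method):

```python
# Sketch only: per-icon partial unlock. An icon is selectable when its
# bounds fall inside a transparent region of the first layer. Rectangles
# are (left, top, right, bottom); the containment rule is an assumption.

def contains(outer, inner):
    return (outer[0] <= inner[0] and outer[1] <= inner[1]
            and outer[2] >= inner[2] and outer[3] >= inner[3])

def selectable(icon_bounds, transparent_regions):
    return any(contains(r, icon_bounds) for r in transparent_regions)

transparent = [(0, 0, 120, 200)]        # region 2322 made transparent
collections_icon = (10, 20, 90, 100)    # inside the transparent region
obscured_icon    = (200, 20, 280, 100)  # behind the semitransparent part

print(selectable(collections_icon, transparent))   # True  -> app can launch
print(selectable(obscured_icon, transparent))      # False -> touch rejected
```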
[194] Figure 23C shows a particular screen 2330 of the application executed when the visible icon (2326 in figure 23B) is selected. That is, when a collections icon 2326 is selected as shown in figure 23B, a collections application screen 2330 is displayed.
[195] The method discussed above is described in this document with reference to flowchart illustrations of user interfaces, methods and computer program products according to various embodiments of the present disclosure. It will be understood that each block of the flowchart illustrations, and combinations of blocks in the flowchart illustrations, can be implemented by computer program instructions. These computer program instructions can be provided to a processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which are executed via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart block or blocks. These computer program instructions can also be stored in a non-transitory computer-usable or computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the non-transitory computer-usable or computer-readable memory produce an article of manufacture including instruction means that implement the function specified in the flowchart block or blocks. The computer program instructions can also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process, such that the instructions executed on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
[196] Each block of the flowchart illustrations can represent a module, segment or portion of code, which comprises one or more executable instructions for implementing the specified logical function or functions. It should also be noted that in some alternative implementations the functions noted in the blocks can occur out of order. For example, two blocks shown in succession can in fact be executed substantially concurrently, or the blocks can sometimes be executed in reverse order, depending on the functionality involved.
[197] Although the present disclosure has been shown and described with reference to various embodiments thereof, it will be understood by those skilled in the art that various changes in form and details can be made therein without departing from the spirit and scope of the present disclosure as defined by the appended claims and their equivalents.
Claims (36)
[1]
1. Claims missing in original document.
[2]
2. Claims missing in original document.
[3]
3. Claims missing in original document.
[4]
4. Claims missing in original document.
[5]
5. Claims missing in original document.
[6]
6. Claims missing in original document.
[7]
7. Claims missing in original document.
[8]
8. Claims missing in original document.
[9]
9. Claims missing in original document.
[10]
10. Claims missing in original document.
[11]
11. Claims missing in original document.
[12]
12. Claims missing in original document.
[13]
13. Claims missing in original document.
[14]
14. Claims missing in original document.
[15]
15. Electronic device according to claim 14, characterized by the fact that the control unit is additionally configured to designate a first attribute for the first layer when the temperature is greater than a preset temperature, and to designate a second attribute for the first layer when the temperature is lower than the preset temperature.
[16]
16. Electronic device, characterized by the fact that it comprises: a touch screen configured to display an image and to detect user input; and a control unit configured to control the touch screen to display a semitransparent layer superimposed on the image, where, when a user input is detected in a partial region of the semitransparent layer, the control unit is additionally configured to change the transparency of the partial region.
[17]
17. Electronic device, characterized by the fact that it comprises: a touch screen configured to display a user interface and to detect user input; and a control unit configured to maintain a locking state of the user interface by controlling the touch screen to display a semitransparent layer superimposed on the user interface, to change the transparency of a partial region of the semitransparent layer when a user input is detected in the partial region, and to release the locking state of the user interface when the partial region occupies a specific percentage or more of the total region of the touch screen.
[18]
18. Electronic device, characterized by the fact that it comprises: a touch screen configured to display a plurality of icons for running applications and to detect user input; and a control unit configured to reject a selection of the icons by controlling the touch screen to display a semitransparent layer superimposed on the icons, and to control at least one of the icons to be in a selectable state by changing the transparency of a partial region of the semitransparent layer when a user input is detected in the partial region.
[19]
19. Method for a display control for an electronic device screen having a touch screen, the method characterized by the fact that it comprises: displaying a first layer superimposed on an image displayed on the touch screen; detecting user input on the touch screen; and changing the transparency of a partial region of the first layer in which the user input is detected.
[20]
20. Method according to claim 19, characterized in that it additionally comprises: determining temperature; and displaying an image of water vapor on the first layer when the determined temperature is greater than a preset temperature, and displaying a frost image on the first layer when the determined temperature is less than the preset temperature.
[21]
21. Method, according to claim 20, characterized by the fact that it additionally comprises: changing the transparency of the entire region of the first layer according to variations in the determined temperature.
[22]
22. Method, according to claim 20, characterized by the fact that it additionally comprises: displaying a water drop object on the first layer after a given time has elapsed.
[23]
23. Method, according to claim 22, characterized by the fact that it additionally comprises: displaying the water drop object as if it were moving in the direction of gravity.
[24]
24. Method, according to claim 22, characterized by the fact that it additionally comprises: displaying the water drop object at a point of removal of the user input.
[25]
25. Method, according to claim 19, characterized by the fact that it additionally comprises: changing the transparency of the region in which the user input was detected as time passes.
[26]
26. Method, according to claim 19, characterized by the fact that it additionally comprises: detecting humidity; and changing the transparency of the first layer according to the detected humidity.
[27]
27. Method, according to claim 20, characterized by the fact that it additionally comprises: displaying an accumulated frost object at a point of removal of the user input on the first layer when the frost image is displayed.
[28]
28. Method according to claim 19, characterized by the fact that it further comprises: displaying a second layer superimposed on the first layer on the touchscreen, the second layer having an opaque region for displaying information.
[29]
29. Method, according to claim 19, characterized by the fact that it additionally comprises: receiving temperature; and displaying an image of water vapor on the first layer when the received temperature is greater than a preset temperature, and displaying a frost image on the first layer when the received temperature is less than the preset temperature.
[30]
30. Method, according to claim 19, characterized by the fact that it additionally comprises: changing the transparency of the entire region of the first layer according to meteorological information.
[31]
31. Method, according to claim 30, characterized by the fact that the meteorological information includes at least one of temperature and humidity.
[32]
32. Method according to claim 19, characterized by the fact that the first layer has at least two attributes that are determined according to temperature.
[33]
33. Method according to claim 32, characterized in that it further comprises: assigning a first attribute to the first layer when the temperature is greater than a preset temperature, and assigning a second attribute to the first layer when the temperature is lower than the preset temperature.
[34]
34. Method for a display control for an electronic device screen having a touch screen, the method characterized by the fact that it comprises: displaying an image on the touch screen; displaying a semitransparent layer superimposed on the image; detecting user input on the touch screen; and changing the transparency of a partial region of the semitransparent layer in which the user input is detected.
[35]
35. Method for a display control for an electronic device screen having a touch screen, the method characterized by the fact that it comprises: displaying a user interface on the touch screen; maintaining a locking state of the user interface by controlling the touch screen to display a semitransparent layer superimposed on the user interface; changing the transparency of a partial region of the semitransparent layer when a user input is detected in the partial region; and releasing the locking state of the user interface when the partial region occupies a specific percentage or more of the total region of the touch screen.
[36]
36. Method for a display control for an electronic device screen having a touch screen, the method characterized by the fact that it comprises: displaying a plurality of icons for running applications on the touch screen; rejecting a selection of the icons by controlling the touch screen to display a semitransparent layer superimposed on the icons; and controlling at least one of the icons to be in a selectable state by changing the transparency of a partial region of the semitransparent layer when a user input is detected in the partial region.