CONTRAST CONTROL OF OVERLAY ON AUGMENTED REALITY SCREENS
Patent Abstract:
Example embodiments provide systems and methods for controlling the contrast of an overlay on augmented reality displays. In an example method, contrast enhancement data is accessed that describes at least a portion of a background image for presentation under an informational overlay on a display device. The informational overlay includes at least one of a graphical or textual element that partially obscures a scene. Data is generated that identifies at least a portion of the informational overlay under which the background image is to be presented. The background image is presented on the display device on top of the portion of the scene corresponding to the identified portion of the informational overlay, and the informational overlay is presented on the display device on top of the background image and the scene.
Publication number: BR112019014333A2
Application number: BR112019014333-5
Filing date: 2018-01-16
Publication date: 2020-02-27
Inventor: Tinsman John
Applicant: Opentv, Inc.
IPC main class:
Patent Description:
“CONTRAST CONTROL OF OVERLAY ON AUGMENTED REALITY SCREENS” PRIORITY CLAIM [001] This application claims the benefit of U.S. Application No. 15/407,491, filed on January 17, 2017, which is incorporated herein by reference in its entirety. FIELD [002] The present disclosure generally relates to information display technology and, in a specific example embodiment, to contrast control of an informational overlay on augmented reality displays. BACKGROUND [003] Augmented reality (AR) displays have been employed in a wide variety of environments in which a displayed physical or virtual scene is augmented or enhanced with additional information (for example, text, graphics, and so on) presented over the scene on the display from the perspective of the viewer. One example of an AR display is a heads-up display (HUD) for aircraft, automobiles, and the like, in which the informational overlay is projected onto glass or another medium through which a user can view a physical scene, such as a roadway. Another example of an AR display is the screen of a smart phone or tablet computer equipped with a camera, such that a physical scene captured by the camera is displayed on the screen, with an informational overlay also being presented on the screen over the scene. In another example of an AR display, a virtual scene, such as that shown in a first-person shooter (FPS) game, can be enhanced with an informational overlay, with the virtual scene and the informational overlay being displayed on a video monitor or the like. [004] In some situations, the scene that is displayed may render the informational overlay less readable or visible, possibly making that information less useful. For example, in an informational overlay that identifies points of interest (for example, streets, shops, and so on) through text, graphics, or the like, a portion of the scene over which part of the informational overlay is presented may be of a color and/or texture similar to that part of the overlay, possibly making that specific part of the informational overlay less noticeable to the user. BRIEF DESCRIPTION OF THE DRAWINGS [005] The various attached drawings merely illustrate example embodiments of the present disclosure and cannot be considered as limiting its scope. [006] Figure 1A is a block diagram of an example user device that includes a display device for displaying a scene captured through a camera, over which an informational overlay can be displayed. [007] Figure 1B is a block diagram of an example user device that includes a projector for projecting an informational overlay onto a transparent surface through which a scene can be viewed. [008] Figure 1C is a block diagram of an example user device that includes a transparent display device that can present an informational overlay and through which a scene can be viewed. [009] Figure 2 is a block diagram of an example overlay module usable in one or more of the user devices of Figures 1A, 1B, and 1C. [010] Figure 3 is a flow diagram of an example method of modifying an emphasis of an informational overlay on an augmented reality display. [011] Figures 4A and 4B are data flow diagrams for example methods of generating scene change data for modifying an emphasis of an informational overlay. [012] Figure 5A is a graphical representation of a scene with an informational overlay.
[013] Figure 5B is a graphical representation of a scene related to the scene of Figure 5A, in which an emphasis of the informational overlay is modified. [014] Figure 6 is a flow diagram of an example method of providing contrast control for an informational overlay on an augmented reality display. [015] Figure 7 is a flow diagram of an example method of using scene change data to modify contrast enhancement data used to provide contrast control for an informational overlay. [016] Figure 8 is a graphical representation of a scene in which an informational overlay is provided, showing contrast enhancement of a portion of the informational overlay by way of a background image. [017] Figure 9A is a graphical representation of a scene in which a virtual video screen is presented without contrast enhancement. [018] Figure 9B is a graphical representation of the scene of Figure 9A in which the virtual video screen is presented with contrast enhancement. [019] Figure 10 is a simplified block diagram of a machine in an example form of a computing system within which a set of instructions for causing the machine to perform any one or more of the methodologies discussed in this document can be executed. DETAILED DESCRIPTION [020] In the following description, for purposes of explanation, several specific details are presented in order to provide an understanding of various example embodiments of the present subject matter. It will be evident, however, to those skilled in the art, that example embodiments of the subject matter can be practiced without these specific details. [021] Figures 1A, 1B, and 1C are block diagrams of example user devices 101A, 101B, 101C (collectively, 101) through which an augmented reality (AR) display can be presented to a user, and which can thereby serve as platforms on which the example embodiments discussed in greater detail below can be provided. Figure 1A is a block diagram of an example user device 101A that includes a display device 102 for displaying a scene (e.g., primary scene 110) captured by means of a camera 104, and for displaying a corresponding informational overlay. User device 101A includes an overlay module 106 that can be configured to provide the informational overlay, such as text, graphics, and/or other visual elements, to be displayed "on top of" (for example, appearing in front of or upon, and thereby blocking or partially obscuring) a representation of the primary scene 110 on the display device 102. Scene 110 is a representation of a real or virtual environment (for example, an external or internal view of a physical area, such as may be captured by means of a camera; a view of a virtual world from the point of view of a player in a gaming environment; and the like). In an example embodiment, overlay module 106, which is described more fully below in conjunction with Figure 2, can provide overlay emphasis modification, overlay contrast control, and other functions described below in relation to the presentation of the informational overlay. Such functions can be assisted, in some example embodiments, by the use of data generated by the camera 104 employed to capture the primary scene 110. In an example embodiment, the overlay module 106 can be configured to add to or replace one or more portions of images captured by the camera 104 before presenting the resulting composite images to the user via the display device 102.
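A minimal sketch of the compositing described in paragraph [021], assuming 8-bit RGB frames from camera 104 and an RGBA informational overlay; the function name and array conventions are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

def composite_overlay(frame: np.ndarray, overlay_rgba: np.ndarray) -> np.ndarray:
    """Blend an RGBA informational overlay onto an RGB camera frame.

    frame:        (H, W, 3) uint8 image captured by the camera.
    overlay_rgba: (H, W, 4) uint8 overlay; alpha 0 leaves the scene
                  visible, alpha 255 fully obscures it.
    """
    rgb = overlay_rgba[..., :3].astype(np.float32)
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = alpha * rgb + (1.0 - alpha) * frame.astype(np.float32)
    return blended.astype(np.uint8)
```

An opaque overlay pixel (alpha 255) fully obscures the scene, matching the "on top of" behavior described above, while intermediate alpha values leave the scene partially visible.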
The display device 102, in various example embodiments, can be a flat panel display, such as a touchscreen, but other types of display devices 102 can be incorporated into user device 101A in other embodiments. Example embodiments of the user device 101A may include, but are not limited to, portable devices such as smart phones, tablet computers, virtual reality (VR) headsets, and so on. However, other example embodiments, including non-portable devices such as desktop computers, as well as devices in which the camera 104 may be located outside user device 101A, may also employ the various example embodiments of the overlay module 106 described below. [022] Figure 1B is a block diagram of an example user device 101B that includes a projector 103 for projecting an informational overlay onto a transparent projection surface 112 (for example, glass, acrylic, and so on) through which the primary scene 110 can be viewed. In this example, an overlay module 106 can generate the informational overlay and provide that overlay in a format used by the projector 103. In addition, user device 101B can include a camera 104 for capturing the primary scene 110 and employing the resulting data as input to overlay module 106, as described more fully below. An example embodiment of the user device 101B can be a heads-up display (HUD), which can be used in land vehicles, aircraft, spacecraft, and so on. However, other types of user devices 101B that employ the architecture shown in Figure 1B are also possible. [023] Figure 1C is a block diagram of an example user device 101C that includes a transparent display device 105 that can display an informational overlay generated by an overlay module 106 and through which a primary scene 110 can be viewed. In an example embodiment, the transparent display device 105 may include light-emitting diodes (LEDs), such as organic LEDs (OLEDs), affixed to or embedded within a transparent structure, such as glass or acrylic. As in the example embodiments of Figure 1B, user device 101C may also include a camera 104 for capturing the primary scene 110 and generating data for use by overlay module 106 to modify or otherwise process the informational overlay to be presented to the user. In an example embodiment, the user device 101C can be another type of heads-up display usable in vehicles, aircraft, and the like. In addition, user device 101C, like user device 101B of Figure 1B, can be a gaming device such that each player employs a separate user device 101 through which the player views a common action screen, and in which each user device 101 provides information intended solely for the corresponding player. In some example embodiments, the transparent display device 105 may include a dimming structure (for example, a liquid crystal display (LCD) structure) or another light-blocking structure or system that can be controlled to cause partial or total blocking of all, or one or more portions, of the primary scene 110 from the perspective of the viewer, thereby modifying the transparency of one or more portions of the transparent display device 105. [024] In example embodiments, components other than the display device 102, the projector 103, the camera 104, the overlay module 106, and so on can also be incorporated into the user device 101, such as a human-machine interface (HMI), a data storage device (for example, flash memory, magnetic disk drive,
optical disc drive, and so on), a wired or wireless communication interface (for example, an interface to a wide area network (WAN) (for example, the Internet), a cellular network (for example, a third generation (3G) or fourth generation (4G) network), a cable television network, a satellite communication network, or any other communication network or connection, or combinations thereof), and so on. [025] Figure 2 is a block diagram of an example overlay module 106 usable in one or more of the user devices 101A, 101B, 101C of Figures 1A, 1B, and 1C, as mentioned above. In an example embodiment, overlay module 106 may include one or more of an overlay data access module 202, a scene change data access module 204, a scene data analyzer 206, a scene object identifier 208, a device motion detector 210, an overlay emphasis modifier 212, an overlay contrast enhancement data access module 214, an overlay background image generator 216, or an overlay output generator 218. In various example embodiments, the various modules 202-218 of the overlay module 106 can be implemented in hardware (for example, logic circuits and other electronic components), in software or firmware in the form of instructions stored in a memory and configured to be executed on one or more hardware processors (for example, microprocessors, microcontrollers, and the like), or in some combination thereof. In addition, depending on the particular user device 101 on which the overlay module 106 is implemented, fewer than all of the modules 202-218 may be implemented, and additional modules beyond those explicitly shown in Figure 2 may be employed. [026] The overlay data access module 202 can be configured to access overlay data from a data source inside or outside overlay module 106 or user device 101. In an example embodiment, the overlay data includes data describing textual and/or graphical elements to be presented as an informational overlay over a scene 110 that is displayed to a user. The overlay data, in an example embodiment, can specify the color, intensity, and/or other characteristics of each image element (pixel) of the informational overlay in a compressed or uncompressed format. [027] In some example embodiments, the overlay data access module 202 can generate the overlay data using other accessible data, such as map data corresponding to the scene 110 that is presented to or viewed by the user. In addition, user device 101 may include a location detector (for example, a Global Positioning System (GPS) receiver) and an orientation sensor (for example, one or more gyroscopic sensors) to determine a location and orientation of the user device 101, and correlate the determined location and orientation of the user device 101 with the map data to determine the overlay data. In other example embodiments, the overlay data access module 202 may employ image data (e.g., data from camera 104) in combination with the map data to generate the overlay data. [028] In yet other example embodiments, the overlay data access module 202 can receive overlay data or related information referring to a physical or virtual environment shown in the scene 110 that is presented to the user of the user device 101. For example, the overlay data access module 202 can receive, from a game operating on user device 101 or on an external device, information referring to a location and orientation within a virtual gaming world and/or
overlay data describing that location and orientation. The overlay module 106 can present the resulting informational overlay on top of a view of the virtual gaming world for the user (for example, via the display device 102). [029] The scene change data access module 204 can be configured to access or receive scene change data referring to the primary scene 110 from a module or component inside or outside overlay module 106. In an example embodiment, the scene change data can identify one or more areas or portions of the primary scene 110 that have changed over a period of time. Such information can also indicate how, by how much, or how quickly one or more portions of the primary scene 110 have changed. Changes in the primary scene 110 may include changes in the intensity, color, and other characteristics of one or more portions of the scene 110. As described below, the scene change data can be used to modify the informational overlay as it is presented to the user. In an example embodiment, the scene change data is stored in a data structure that can indicate one or more portions of the scene 110 that have changed, as well as possibly a magnitude of the change. In other example embodiments, the scene change data may omit changes (for example, in object movement, brightness, color, and/or other aspects) in the scene 110 that fall below some limit value. [030] The scene data analyzer 206 can be configured to analyze data describing the primary scene 110 (for example, data generated by camera 104 or some other device outside of user device 101), such as image capture data of a real or virtual environment shown in the scene 110. The scene data analyzer 206 can then generate the scene change data that is accessed by the scene change data access module 204 described above. In an example embodiment, the scene data analyzer 206 can compare a current version of the scene 110 (for example, a current image of the scene 110) with a previous version of the scene 110 to detect changes in the scene 110. The data describing these changes can be generated as the scene change data. Other means of analyzing the scene 110 and generating the scene change data can also be employed in other example embodiments. [031] In another example embodiment, the scene data analyzer 206 can analyze data that describes the primary scene 110 and generate data that indicates one or more aspects of the current primary scene 110, as opposed to generating scene change data. For example, the scene data analyzer 206 can generate data that correlates particular areas of the primary scene 110 with one or more values indicating the color, brightness, complexity, and/or other visual aspects of each particular area. In other example embodiments, the scene data analyzer 206 can generate data that correlates particular areas of the primary scene 110 with one or more objects represented in the current primary scene 110. Other types of data describing the primary scene 110 can be generated in other example embodiments. [032] The scene object identifier 208 can be configured to detect one or more objects in a scene 110. Such objects can include man-made objects (for example, vehicles, signage, streets, curbs, buildings, and so on), natural objects (for example, trees, mountains, ponds, and so on), people, animals, and the like.
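As one illustrative reading of paragraphs [029] and [030], the scene change data might be derived by differencing consecutive frames and omitting blocks whose change falls below a limit value. The block size and threshold below are assumed parameters, not values from the disclosure:

```python
import numpy as np

def scene_change_data(prev: np.ndarray, curr: np.ndarray,
                      block: int = 32, threshold: float = 12.0):
    """Return a list of (row, col, magnitude) entries, one per block of
    the scene whose mean absolute change exceeds `threshold`.

    prev, curr: (H, W) grayscale frames of scene 110 (uint8 or float).
    """
    diff = np.abs(curr.astype(np.float32) - prev.astype(np.float32))
    h, w = diff.shape
    changes = []
    for r in range(0, h - h % block, block):
        for c in range(0, w - w % block, block):
            magnitude = float(diff[r:r + block, c:c + block].mean())
            if magnitude > threshold:  # changes below the limit are omitted
                changes.append((r, c, magnitude))
    return changes
```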
The scene object identifier 208, in an example embodiment, can identify such objects by means of shape, color, visual texture, and other characteristics, possibly by comparing image data corresponding to a scene 110 with stored information describing one or more visual aspects of various objects. The scene object identifier 208, based on the object detection, can generate data indicating the presence of one or more objects detected in the scene 110, including a location of the object within the scene 110, a distance or range from the vantage point of the user device 101 to the object, and other information describing the object. [033] In an example embodiment, the scene object identifier 208 can identify one or more objects based on information provided, directly or indirectly, by the user. For example, the user can indicate a particular destination (for example, through mapping software running on user device 101). Critical locations along the route (such as turns at intersections, the location of an entrance to the destination, and so on) can be identified by the mapping software and forwarded to the scene object identifier 208. In turn, the scene object identifier 208, based on the physical appearance of the critical location, the captured data of the scene, the location and/or orientation of the user device 101, and possibly other information, can identify a particular area or portion of the scene 110 in which the critical location appears. Other means of identifying one or more objects in the scene 110 are also possible. In an example embodiment, the scene object identifier 208 can forward an indication of the detected object data to the scene data analyzer 206, which can process that data to determine the movement of one or more objects within the scene 110 over time. Such information can be included in the scene change data produced by the scene data analyzer 206. [034] In other examples, another system, device, or application may determine the objects to be detected by the scene object identifier 208. For example, a game application may provide information referring to one or more objects (for example, characters, physical items, and so on) that are shown in the scene 110, such as their identity, location, range or distance, and other characteristics, based on the actions of the user of the user device 101, of other users, and so on. [035] In addition to indicating the location of an object within the scene 110, the scene object identifier 208 can determine a distance from the object to a particular vantage point, such as a distance from the user device 101 or the camera 104 that captures the scene 110. For example, the scene object identifier 208 can compare an apparent size of the object within the scene 110 with a known size of the object to determine the distance between the camera 104 and the object. In other example embodiments, the scene object identifier 208 may employ other object detection technologies, such as radar, sonar, lidar, and the like, to determine a distance or range to an identified object within the scene 110. [036] The device motion detector 210 can be configured to detect motion of the user device 101, such as translational and/or rotational motion of the user device 101, and generate data indicating such movement. The device motion detector 210 may include, for example, a GPS receiver, one or more gyroscopic sensors, and/or other components.
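The apparent-size comparison of paragraph [035] reduces, under a simple pinhole-camera assumption, to a one-line range estimate. The focal length below is an assumed calibration value, not a value from the disclosure:

```python
def estimate_distance(known_height_m: float, apparent_height_px: float,
                      focal_length_px: float = 1400.0) -> float:
    """Pinhole-camera range estimate: distance = f * H / h.

    known_height_m:     real-world height of the identified object.
    apparent_height_px: height of the object in the captured frame.
    focal_length_px:    camera focal length in pixels (assumed calibration).
    """
    return focal_length_px * known_height_m / apparent_height_px

# e.g., a 2.5 m tall parking entrance spanning 125 px in the frame:
# estimate_distance(2.5, 125.0) == 1400 * 2.5 / 125 == 28.0 meters
```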
In response to the detected motion, the device motion detector 210 can generate data indicative of the motion of the device 101, which can be provided to the scene data analyzer 206 for use in analyzing the scene 110. In other example embodiments, the generated motion data can be provided to the overlay emphasis modifier 212 and/or the overlay background image generator 216. [037] The overlay emphasis modifier 212 can be configured to modify one or more visual characteristics of the informational overlay that is presented to the user (for example, through the display device 102 or the projector 103) to emphasize or de-emphasize the informational overlay, or a portion thereof, over time. In an example embodiment, the overlay emphasis modifier 212 can modify a brightness, sharpness, color, size, line width, text font, or other characteristic of one or more portions of the informational overlay. For example, a portion of the informational overlay can be emphasized by increasing the brightness or sharpness of that portion of the overlay, increasing the line size or width of one or more features of the overlay, increasing the font size or using a bold or underlined text font in the overlay, changing a color of the overlay to contrast with the underlying scene, and so on. In addition, one or more features of a portion of the informational overlay can be modulated over time (for example, periodic "flashing" or "blinking" of a portion of the overlay) to emphasize that portion of the overlay. Conversely, de-emphasizing at least a portion of the informational overlay can be accomplished by decreasing a brightness or sharpness (for example, by blurring or defocusing) of the overlay, decreasing a line width of one or more features of the overlay, decreasing a size of the text font in the overlay, using a color that is chromatically close to the underlying scene, and the like. [038] In some example embodiments, the emphasis or de-emphasis of one or more portions or areas of an informational overlay may be based on a current state of the scene 110, or on a recent or current change to the scene 110. For example, a portion of the informational overlay can be emphasized if the portion of the scene 110 presented under or behind that overlay portion is visually complex or not of significant importance. Conversely, the informational overlay, or a portion thereof, can be de-emphasized in response to a corresponding portion of the scene 110 being important or time-critical in some way that warrants maximizing the visibility of that portion of the scene 110. [039] In an example embodiment, the overlay emphasis modifier 212 can receive the scene change data from the scene change data access module 204 and emphasize or de-emphasize one or more visual characteristics of at least a portion of the informational overlay in response to a change in the scene 110. More specifically, the scene change data can indicate a particular portion or area of the scene 110 that has undergone some type of change. In response, the overlay emphasis modifier 212 can identify a portion of the informational overlay (for example, a particular visual object, such as a text string or a graphical object) that corresponds to the particular portion of the scene 110, and emphasize or de-emphasize that overlay portion.
For example, the overlay emphasis modifier 212 can determine that one or more portions of the overlay overlap the particular portion of the scene 110 that changes, and then emphasize or de-emphasize that one or more portions. [040] In an example embodiment, the scene change data can indicate a portion of the scene 110 that corresponds to an object or location of interest (for example, a point of interest, such as an entrance to a building or parking lot; a potential obstacle; a character in a game; and so on) located in the indicated portion of the scene 110. In an example embodiment, such information can be generated by the scene data analyzer 206 and/or the scene object identifier 208. In another example embodiment, the scene change data can identify an object or location of interest by accessing data in a database that indicates typically occupied locations or locations that may involve increased user attention (for example, parking lots, intersections, sharp curves, and so on). [041] The operation of the overlay emphasis modifier 212, in an example embodiment, may depend on how the association of the object of interest with the scene 110 changes over time, such as how the location of the object of interest within the scene 110 changes. For example, the overlay emphasis modifier 212 can respond to a detected object of interest by modifying one or more characteristics of the entire overlay, or of a portion of the overlay that corresponds to the detected object. For example, if the object of interest enters or leaves a particular area of the scene 110 (for example, a central portion of the scene 110), the overlay emphasis modifier 212 may reduce the brightness of, or otherwise de-emphasize, or alternatively emphasize, the portion of the overlay that corresponds to that central portion of the scene 110. [042] In another example embodiment, the overlay emphasis modifier 212 can modify an emphasis of (for example, emphasize or de-emphasize) all or a portion of the informational overlay in response to some change in a distance or range of an object of interest from the viewer, the device 101, or another vantage point. For example, if the object of interest appears to advance within some particular physical distance of the vantage point of the viewer (for example, an object at a particular physical distance from the camera 104 or the user device 101) or within some perceived distance of the vantage point of the viewer (for example, a virtual distance of a character from the viewer within the context of a game), the overlay emphasis modifier 212 can emphasize or de-emphasize all or a portion of the informational overlay, such as a portion that corresponds to the portion of the scene 110 occupied by the object. In another example embodiment, the overlay emphasis modifier 212 can emphasize or de-emphasize all or a portion of the informational overlay in response to the object of interest moving beyond a particular physical or virtual (for example, perceived) distance between the vantage point of the viewer and the object. [043] The overlay emphasis modifier 212 can also modify one or more characteristics of all or a portion of the informational overlay, thereby emphasizing or de-emphasizing that portion of the overlay, in response to a particular magnitude of a change in the scene, as indicated in the scene change data.
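A minimal sketch of the distance-driven emphasis of paragraph [042]: as an object of interest encroaches within a near limit, the brightness or opacity multiplier applied to the corresponding overlay portion is ramped down. Both distance limits and the 0.25 floor are assumptions:

```python
def emphasis_factor(object_distance_m: float,
                    near_m: float = 10.0, far_m: float = 50.0) -> float:
    """Return a multiplier in [0.25, 1.0] applied to the brightness or
    opacity of the overlay portion tied to an object of interest.

    The portion is de-emphasized as the object encroaches within
    `near_m` (so the object itself remains visible) and is fully
    emphasized beyond `far_m`; both limits are illustrative.
    """
    if object_distance_m <= near_m:
        return 0.25
    if object_distance_m >= far_m:
        return 1.0
    span = (object_distance_m - near_m) / (far_m - near_m)
    return 0.25 + 0.75 * span
```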
In an example embodiment, if the speed or acceleration of an object shown in the scene 110 exceeds, or falls below, a particular level, the overlay emphasis modifier 212 can emphasize or de-emphasize all or a portion of the overlay accordingly. For example, an object in the scene 110 that is moving quickly, or accelerating quickly, can cause the overlay emphasis modifier 212 to de-emphasize the overlay so that the visibility of the object in the scene 110 can be improved, or the overlay emphasis modifier 212 can emphasize a single object indicator in the overlay to draw the user's attention to the object. [044] In other examples, rapid or slow translational or rotational movements of the device 101 (for example, as detected by the device motion detector 210) can cause the overlay emphasis modifier 212 to emphasize or de-emphasize all or one or more portions of the informational overlay. For example, the overlay emphasis modifier 212 can de-emphasize the informational overlay during times when the device 101 undergoes rapid translational or rotational movements. [045] In another example embodiment, instead of relying on scene change data, the overlay emphasis modifier 212 can modify an emphasis of (for example, emphasize or de-emphasize) all or a portion of the informational overlay based on an analysis of data describing the current scene 110. For example, the overlay emphasis modifier 212 can receive data from one or more of the scene data analyzer 206 or the scene object identifier 208 that indicates one or more aspects of the current scene 110, and modify an emphasis of all or a portion of the informational overlay in relation to the current scene 110 based on these data. In an example embodiment, the overlay emphasis modifier 212 can perform the emphasis modification in relation to a default informational overlay based on initial scene data, prior to the generation of scene change data. [046] Continuing with Figure 2, modules such as the overlay contrast enhancement data access module 214 and the overlay background image generator 216, in conjunction with other modules of overlay module 106, can cooperate to provide a background image between the scene 110 and the informational overlay to render the overlay, or one or more portions thereof, more readable or visible to the user of the user device 101. In an example embodiment, the overlay contrast enhancement data access module 214 can be configured to access or generate contrast enhancement data that describes the background image to be used. For example, the background image can be a translucent or semi-transparent (for example, pixelated) background image of some color (for example, black, gray, red, yellow, etc.) and/or brightness that allows at least some of the scene 110 appearing through the background image to be viewed by the user of the user device 101. In another example embodiment, the background image may be opaque to prevent any portion of the scene 110 corresponding to the background image from being seen. In addition, the background image described by the contrast enhancement data may or may not match the resolution of the scene 110. In an example embodiment, the contrast enhancement data can provide a pattern that can then be repeated to cover some portion of the scene 110, as determined by the overlay background image generator 216. [047] The overlay background image generator 216 can be
configured to determine a portion of the informational overlay under which the background or contrast image should be displayed. In example embodiments, the overlay background image generator 216 can determine the portion of the informational overlay based on information from one or more of the modules in overlay module 106. For example, the overlay background image generator 216 can determine that an overlay portion is to be backed with the background image based on data from the scene data analyzer 206, such as data indicating a visual complexity of one or more portions of the scene 110. Based on the visual complexity of a portion of the scene 110 exceeding a particular limit, the overlay background image generator 216 can identify a portion of the informational overlay corresponding to that portion of the scene 110 to which the background image is to be applied. [048] In another example embodiment, the overlay background image generator 216 can determine which portion of the informational overlay is to be backed by the background image based on information from the device motion detector 210. For example, when the device motion detector 210 detects translational or rotational movement of the user device 101 that exceeds a particular threshold, the overlay background image generator 216 can determine that no portion of the informational overlay should be backed by a background image, thereby allowing at least portions of the informational overlay to be less discernible or readable while the movement of user device 101 continues to exceed the threshold. [049] In yet another example embodiment, the overlay background image generator 216 can base its identification of the informational overlay portion on the scene change data discussed above, such as that accessed by the scene change data access module 204, or generated by the scene data analyzer 206 or the scene object identifier 208. For example, in response to data identifying a location of an object of interest identified by the scene object identifier 208, the overlay background image generator 216 may identify a portion of the informational overlay that indicates or identifies the object (e.g., through text, or an arrow or other graphic) and to which the background image should be applied to provide additional contrast for that portion of the overlay. In an example embodiment, such additional contrast can be applied under particular circumstances involving the object of interest, such as the object entering or leaving a particular portion or area of the scene 110; encroaching within a particular distance of a vantage point, such as the vantage point of the user of the user device 101; moving beyond a particular distance from the vantage point; and so on, as described above. [050] The overlay background image generator 216, in an example embodiment, can determine that an overlay portion is to be backed by the background image based on a magnitude of a change in the scene, as determined by the scene data analyzer 206 and/or the device motion detector 210, exceeding or falling below a particular magnitude or level, as described above. For example, the overlay background image generator 216 can determine that no background image is warranted while the scene 110 is changing or moving rapidly, while such a background image may be desirable during a slowly changing scene 110.
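Paragraphs [047] through [050] together suggest a simple decision rule: back an overlay portion with the background image only when the underlying scene patch is visually complex and the device is not moving rapidly. The sketch below uses mean gradient magnitude as a stand-in complexity measure; both limits are assumed values:

```python
import numpy as np

def needs_backdrop(scene_patch: np.ndarray,
                   device_motion: float,
                   complexity_limit: float = 20.0,
                   motion_limit: float = 5.0) -> bool:
    """Decide whether an overlay portion should be backed by a
    background image.

    scene_patch:   2-D grayscale pixels of scene 110 under the overlay
                   portion.
    device_motion: magnitude of translational/rotational motion reported
                   by device motion detector 210 (units are illustrative).
    """
    # Use local gradient magnitude as a crude visual-complexity proxy.
    gy, gx = np.gradient(scene_patch.astype(np.float32))
    complexity = float(np.hypot(gx, gy).mean())
    return complexity > complexity_limit and device_motion < motion_limit
```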
[051] In some example embodiments, the overlay background image generator 216 may vary, over time, the manner in which the background image is applied over the scene 110, such as by turning the background image on and off (for example, flashing or blinking it), changing the nature of the background image (for example, through the overlay contrast enhancement data that is used), and so on, to emphasize the portion of the overlay that corresponds to the background image. In an example embodiment, the overlay background image generator 216 can reduce the darkness of, or otherwise change one or more aspects of, the background image (e.g., pattern, color, texture, etc.) over time, in relation to the location of the corresponding portion of the overlay within the scene 110. For example, the darkness of the background image may be reduced as the corresponding portion of the overlay moves away from the center of the scene 110, as the overlay portion moves more quickly within the scene 110, as the user device 101 moves more quickly, and so on. [052] The overlay output generator 218 can be configured to cause presentation (for example, through the display device 102 of the user device 101A, the projector 103 of the user device 101B, or the transparent display device 105 of the user device 101C) of the informational overlay and/or the background image, as generated or modified by the overlay emphasis modifier 212 and/or the overlay background image generator 216. In some example embodiments, the overlay module 106 can include an overlay emphasis modifier 212 but no overlay background image generator 216, or vice versa, with the overlay output generator 218 causing the presentation of the informational overlay as modified by the output of the module that is present in overlay module 106 (for example, the overlay emphasis modifier 212 or the overlay background image generator 216). In other example embodiments, overlay module 106 can include both the overlay emphasis modifier 212 and the overlay background image generator 216, which can operate on the same scene 110. Consequently, the overlay output generator 218 can receive the output of both the overlay emphasis modifier 212 and the overlay background image generator 216, to be applied to the informational overlay and the associated scene 110 simultaneously. [053] For example, overlay module 106 (for example, through the scene data analyzer 206, the scene object identifier 208, and/or the device motion detector 210) can determine that the scene 110 is currently undergoing a change that exceeds some limit, such as through movement of objects detected in the scene, rotational or translational movement of the user device 101 itself, or the like. In response to these changes, the overlay emphasis modifier 212 can determine that at least a portion of the informational overlay should be de-emphasized (for example, lower brightness, smaller text font, and so on). Simultaneously, in response to the same changes in the scene 110, the overlay background image generator 216 may determine that a background image should not be employed, or that the background image should be modified to allow more of the underlying scene 110 to be visible. In response, the overlay output generator 218 can modify the background image and the informational overlay according to the output of both the overlay emphasis modifier 212 and the overlay background image generator 216, simultaneously.
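A sketch of the combined operation described in paragraph [053], reusing the illustrative composite_overlay, emphasis_factor, and needs_backdrop helpers from the earlier sketches; the 60% darkening used for the backdrop is an assumption:

```python
import numpy as np

def render(frame: np.ndarray, overlay_rgba: np.ndarray,
           object_distance_m: float, device_motion: float) -> np.ndarray:
    """Apply both modifiers in one pass: scale the overlay's alpha by
    emphasis_factor(), darken the scene under the overlay only when
    needs_backdrop() allows it, then composite the result.
    """
    overlay = overlay_rgba.copy()
    overlay[..., 3] = (overlay[..., 3] *
                       emphasis_factor(object_distance_m)).astype(np.uint8)
    out = frame.astype(np.float32)
    mask = overlay[..., 3] > 0
    rows, cols = np.where(mask)
    if rows.size:
        r0, r1 = rows.min(), rows.max() + 1
        c0, c1 = cols.min(), cols.max() + 1
        patch = frame[r0:r1, c0:c1].mean(axis=2)  # grayscale under overlay
        if needs_backdrop(patch, device_motion):
            out[r0:r1, c0:c1] *= 0.4              # semi-transparent backdrop
    return composite_overlay(out.astype(np.uint8), overlay)
```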
[054] In another example embodiment, the overlay background image generator 216 and/or the overlay output generator 218 can be configured to relocate a portion of the informational overlay, such as one or more objects or elements of the overlay, to an area of the scene 110 that is more compatible with the use of a background image. For example, a particular element of the informational overlay may correspond to a portion of the scene 110 that is of some importance for the user to view, such as a central portion of the scene 110. In this example, the location within the scene 110 of the informational overlay portion for which the background image is to be used can be moved to a more peripheral area of the scene, thereby allowing the use of the background image without negatively impacting the visibility of the central portion of the scene 110. [055] In an example embodiment, the overlay output generator 218 can separately generate data or signals representing the informational overlay and the background image. For example, in relation to user device 101C, the overlay output generator 218 can provide data representing the informational overlay to the transparent display device 105 to produce the informational overlay (for example, via LEDs), and can provide data representing the background image to the transparent display device 105 to partially or totally block the light received from all, or one or more portions, of the primary scene 110 (for example, via an LCD structure or system coupled to the transparent display device 105). [056] Figure 3 is a flow diagram of an example method 300 of modifying an informational overlay on an augmented reality display. Although method 300 and the other methods discussed below are described as being performed by overlay module 106 and the various modules included therein, these methods can be performed by other structures not explicitly described in this document. In addition, although the operations of method 300 of Figure 3 and of the other methods discussed below are presented in a particular order, other orders of execution are also possible, including the simultaneous, concurrent, or overlapping execution of two or more operations. [057] In method 300, overlay module 106 may cause an informational overlay to be presented on top of a scene 110 on a display device (for example, via the overlay output generator 218, on the display device 102 of the user device 101A, on the
[058] Figures 4A and 4B are data flow diagrams of example methods 400A, 400B of generating scene change data to modify an emphasis of an informational overlay. In method 400A, a current version of scene 110 can be compared to a previous version of scene 110 (for example, by the scene data analyzer 206 or scene object identifier 208) (operation 402), and the change data of scene can be based on this comparison (operation 404). In method 400B, motion data that indicates the movement of a display device (for example, display device 102 of user device 101A, projector 103 of user device 101B (or the transparent projection surface 112 associated with the same), or the transparent display device 105 of user device 101C) can be accessed (for example, through device motion detector 210) (operation 412). Scene change data can be generated based on motion data (operation Petition 870190064983, of 10/07/2019, p. 32/71 24/44 414), as described above. [059] Figure 5A is a graphical representation of a 500A scene with an exemplary informational overlay. Scene 500A is a view of a street scene that includes a 502A entrance to a parking lot for a business of interest to the user. The related informational overlay includes several elements that provide user guidance so that the user can navigate to the parking lot and the associated business. More particularly, the informational overlay includes an arrow 508A that indicates the location of the 502A parking entrance, 504A text supply instructions (“Turn Here”) that directs the user to turn at the 502A parking entrance, and a 506A graphic element (a box) that highlights the 504A text instructions. The informational overlay additionally includes an arrow 528A for the building interest entry, a textual label 524A ("Entrance") for the building entrance, and a graphic element 526A that highlights the textual label 524A. A textual label 514A (for example, suite number) for the building entrance (“23C”) and an associated graphic element 516A are also included in the informational overlay. [060] Figure 5B is a graphical representation of a scene 500B related to scene 500A in Figure 5A, in which an emphasis of the informational overlay is modified. As the user approaches parking entrance 502A, entrance 502A appears to be closer to the user in scene 500B than in the previous scene 500A. Such a difference in appearance can be detected and indicated by scene object identifier 208 and / or scene data analyzer 206, which can determine that the apparent size of parking entrance 502A has increased and / or the location of parking entrance 502A if situates more centrally within the current 500B scene. In another example, a GPS receiver of user device 101 may indicate that the location of interest is within some particular distance to user device 101. In response, the Petition 870190064983, of 10/07/2019, p. 33/71 25/44 overlapping emphasis 212 can determine that at least the overlapping elements associated with parking entrance 502A (for example, arrow 508A, textual instructions 504A and / or associated graphic 506A) should be de-emphasized to increase visibility from parking entrance 502A in the scene. As shown in Figure 5B, this de-emphasis may include making the resulting arrow 508B less colorful or more translucent, and removing the corresponding graphic element 506A entirely, leaving only the 504A text instructions intact. 
Similarly, the textual label 514A for the suite number and the textual label 524A for the building entrance can be left intact, while the arrow 528A and the graphical elements 516A and 526A of the original scene 500A have been completely removed in the current scene 500B. Other methods of emphasizing or de-emphasizing various portions of the informational overlay can be employed in other embodiments. [061] Figure 6 is a flow diagram of an example method 600 of providing contrast control for an informational overlay on an augmented reality display. In method 600, contrast enhancement data describing at least a portion of a background image for presentation under an informational overlay is accessed (for example, through the overlay contrast enhancement data access module 214) (operation 602). At least a portion of the informational overlay under which the background image is to be presented is identified (for example, using the scene data analyzer 206, the scene object identifier 208, and/or the device motion detector 210) (operation 604). The overlay background image generator 216 can cause the background image to be displayed on top of a portion of the scene 110 that corresponds to the identified portion of the informational overlay (operation 606). In addition, the overlay output generator 218 may cause the informational overlay to be presented on top of the background image and the scene 110 (operation 608). [062] Figure 7 is a flow diagram of an example method 700 of using scene change data to modify the contrast enhancement data used to provide contrast control for an informational overlay. In method 700, scene change data indicating a change in the scene 110 can be accessed (for example, via the scene change data access module 204) (operation 702). Data identifying at least a portion of the informational overlay under which the background image is to be displayed is generated based on the scene change data (for example, through the overlay background image generator 216, based on data from the scene data analyzer 206, the scene object identifier 208, and/or the device motion detector 210) (operation 704). In other example embodiments, the data identifying a portion of the overlay under which the background image should be displayed is generated based on data describing one or more aspects (for example, color, brightness, complexity, object location, object type) of the current scene 110, as opposed to the scene change data. [063] Figure 8 is a graphical representation of a scene 800 in which an informational overlay is provided, showing the contrast enhancement of a portion of the informational overlay by way of a background image. Aspects of scene 800 are similar to those of scene 500A, described above, in which a parking entrance 802 of interest is indicated by an arrow 808, instructional text 804 ("Turn Here"), and a graphical element 806 (for example, a box) of an informational overlay. A textual label 814 ("Tree") and an associated graphical element 816 indicate a tree in the scene 800, while two labels 824, 834 ("Sky" and "More Sky") and corresponding graphical elements 826, 836 are also included in the informational overlay. [064] Also shown in Figure 8 is a background image 840 that appears
on top of the scene 800 and behind the "More Sky" label 834 and the associated graphical element 836, demonstrating the use of the background image 840 to increase the contrast of the label 834 and the graphical element 836, thereby increasing the legibility or visibility of those portions of the overlay relative to other portions of the overlay, such as, for example, the label 824 and the associated graphical element 826, where the corresponding portion of the scene 800 (for example, the blue sky) renders those elements 824, 826 less readable. The textual instructions 804, the label 814, and the corresponding graphical elements 806, 816 can also be considered less readable or visible due to the visually complex portion of the scene (for example, the tree) lying beneath these particular elements of the overlay. As mentioned above, the particular portion of the informational overlay under which the background image 840 is presented can be based on several different factors in other embodiments, such as the movement of objects of interest into or out of one or more areas of the scene 800, the movement of objects toward or away from a particular vantage point (e.g., the user device 101), the movement of the user device 101, and the like. [065] Figure 9A is a graphical representation of a scene 900A in which a virtual video screen 902 is presented without contrast enhancement. More specifically, the virtual video screen 902 (for example, a video clip) of Figure 9A is displayed directly on top of a white wall 904 in scene 900A. Assuming that the virtual video screen 902 is configured to allow at least a portion of the wall 904 of scene 900A to show through the virtual video screen 902, the virtual video screen 902 may appear excessively light or "washed out," thereby reducing a viewer's ability to observe at least some of the finer or more detailed aspects of the image shown on the virtual video screen 902. [066] Figure 9B is a graphical representation of the scene 900A of Figure 9A in which the virtual video screen 902 is presented with contrast enhancement. More specifically, on top of the resulting scene 900B, a background image 906 has been rendered, over which the virtual video screen 902 is displayed, resulting in a display with greater perceived contrast. In this particular example embodiment, the background image 906 is a gray pixelated image that partially blocks the view of the portion of scene 900B underlying the virtual video screen 902. Furthermore, in the example embodiment of Figure 9B, the resolution of the background image 906 is lower than that of the scene 900B itself, although the resolution of other background images may be equal to or greater than that of the corresponding scene. [067] In view of at least some of the example embodiments described above, all or portions of an informational overlay, such as one employed on an AR display, can be emphasized or de-emphasized by changing one or more aspects of the informational overlay. In addition, the visual contrast of all or portions of the overlay can be modified using a background image. Such changes may be based on the particular state of the underlying scene, changes in the scene, and other factors. By changing the visual aspects of an overlay in such ways, the overall utility of the overlay for the user can be enhanced.
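As an illustrative counterpart to Figure 9B, the gray pixelated background image 906 might be generated at a deliberately lower effective resolution than scene 900B, leaving alternating blocks transparent so part of the underlying scene still shows through. The cell size, shade, and opacity below are assumptions:

```python
import numpy as np

def pixelated_backdrop(height: int, width: int, cell: int = 8,
                       shade: int = 96, opacity: float = 0.6) -> np.ndarray:
    """Build an RGBA backdrop in the spirit of image 906: a gray,
    pixelated patch whose effective resolution (`cell` pixels per block)
    is lower than the scene's, with alternating blocks left transparent.
    """
    rgba = np.zeros((height, width, 4), dtype=np.uint8)
    rgba[..., :3] = shade
    rows = (np.arange(height) // cell)[:, None]
    cols = (np.arange(width) // cell)[None, :]
    checker = (rows + cols) % 2 == 0
    rgba[..., 3] = np.where(checker, int(opacity * 255), 0).astype(np.uint8)
    return rgba
```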
[068] Figure 10 is a block diagram illustrating components of a machine 1000, according to some example embodiments, able to read instructions 1024 from a machine-readable medium 1022 (for example, a non-transitory machine-readable medium, a machine-readable storage medium, or a computer-readable storage medium) and perform any one or more of the methodologies discussed in this document, in whole or in part. Specifically, Figure 10 shows the machine 1000 in the example form of a computing device (for example, a computer) within which the instructions 1024 (for example, software, firmware, a program, an application, an applet, an app, or other executable code) for causing the machine 1000 to perform any one or more of the methodologies discussed in this document, in whole or in part, can be executed. [069] For example, the instructions 1024 can cause the machine 1000 to execute the flow diagrams of Figures 3, 4A, 4B, 6, and 7, as well as all the example embodiments associated therewith. The instructions 1024 can transform the general, unprogrammed machine 1000 into a particular machine (for example, a specially configured machine) programmed to perform the functions described and illustrated in the manner described. In addition, in example embodiments, the machine 1000 may operate as one or more of the modules 202-218 of the overlay module 106 of Figure 2, as well as one or more of the user devices 101A to 101C of Figures 1A to 1C, or any other system or computing device described in this document. [070] In example embodiments, the machine 1000 operates as a standalone device or can be connected (for example, networked) to other machines. The machine 1000 can be a server computer, a client computer, a personal computer (PC), a tablet computer, a laptop computer, a netbook, a set-top box (STB), a personal digital assistant (PDA), a cell phone, a smart phone, a web appliance, a network router, a network switch, a network bridge, a power adapter, or any machine 1000 capable of executing the instructions 1024, sequentially or otherwise, that specify actions to be taken by that machine 1000. Additionally, although only a single machine 1000 is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute the instructions 1024 to perform any one or more of the methodologies discussed in this document. [071] The machine 1000 includes a processor 1002 (for example, a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a radio frequency integrated circuit (RFIC), or any suitable combination thereof), a main memory 1004, and a static memory 1006, which are configured to communicate with each other via a bus 1008. The processor 1002 can contain microcircuits that are configurable, temporarily or permanently, by some or all of the instructions 1024 such that the processor 1002 is configurable to perform any one or more of the methodologies described in this document, in whole or in part. For example, a set of one or more microcircuits of the processor 1002 may be configurable to execute one or more modules (for example, software modules) described in this document.
[072] The machine 1000 may additionally include a graphics display 1010 (for example, a plasma display panel (PDP), a light-emitting diode (LED) display, a liquid crystal display (LCD), a projector, a cathode ray tube (CRT), or any other display capable of displaying graphics or video). The machine 1000 may also include an alphanumeric input device 1012 (for example, a keyboard or numeric keypad), a cursor control device 1014 (for example, a mouse, a touchpad, a trackball, a joystick, a motion sensor, an eye tracking device, or other pointing instrument), a storage unit 1016, a signal generation device 1018 (for example, a sound card, an amplifier, a speaker, a headphone jack, or any suitable combination thereof), and a network interface device 1020. [073] The storage unit 1016 includes the machine-readable medium 1022 (for example, a tangible machine-readable storage medium) on which are stored the instructions 1024 embodying any one or more of the methodologies or functions described in this document. The instructions 1024 can also reside, completely or at least partially, within the main memory 1004, within the processor 1002 (for example, within a cache memory of the processor 1002), or both, before or during execution thereof by the machine 1000. Consequently, the main memory 1004 and the processor 1002 can be considered machine-readable media 1022 (for example, tangible and non-transitory machine-readable media). [074] In some example embodiments, the machine 1000 can be a mobile or portable computing device and have one or more additional input components (for example, sensors or meters). Examples of such input components include an image input component (for example, one or more cameras), an audio input component (for example, a microphone), a direction input component (for example, a compass), a location input component (for example, a Global Positioning System (GPS) receiver), an orientation component (for example, a gyroscope), a motion detection component (for example, one or more accelerometers), an altitude detection component (for example, an altimeter), and a gas detection component (for example, a gas sensor). Inputs collected by one or more of these input components can be accessible and available for use by any of the modules described in this document. [075] As used herein, the term "memory" refers to a machine-readable medium 1022 capable of storing data temporarily or permanently, and can be taken to include, but not be limited to, random access memory (RAM), read-only memory (ROM), buffer memory, flash memory, and cache memory. Although the machine-readable medium 1022 is shown in an example embodiment as a single medium, the term "machine-readable medium" should be taken to include a single medium or multiple media (for example, a centralized or distributed database, or associated caches and servers) capable of storing the instructions 1024. The term "machine-readable medium" shall also be taken to include any medium, or combination of multiple media, capable of storing the instructions 1024 for execution by a machine (for example, machine 1000), such that the instructions 1024, when executed by one or more processors of the machine 1000 (for example, the processor 1002), cause the machine 1000 to perform any one or more of the methodologies described in this document.
The term "machine-readable medium" should therefore be taken to include, but not be limited to, one or more data repositories in the form of a solid-state memory, an optical medium, a magnetic medium, or any suitable combination thereof.

[076] In addition, the machine-readable medium 1022 is non-transitory because it does not embody a propagating or transitory signal. However, labeling the machine-readable medium 1022 as "non-transitory" should not be interpreted to mean that the medium is incapable of movement; the medium should be considered as being transportable from one physical location to another in some exemplary modalities. In addition, since the machine-readable medium 1022 is tangible, the medium can be considered a machine-readable device.

[077] The instructions 1024 can additionally be transmitted or received over a communication network 1026 using a transmission medium via the network interface device 1020 and using any one of several well-known transfer protocols (for example, the Hypertext Transfer Protocol (HTTP)). Examples of communication networks 1026 include a local area network (LAN), a wide area network (WAN), the Internet, mobile phone networks, plain old telephone service (POTS) networks, and wireless data networks (for example, Wi-Fi®, LTE® and WiMAX™ networks). The term "transmission medium" should be taken to include any intangible medium that has the capacity to store, encode or carry the instructions 1024 for execution by the machine 1000, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.

[078] A carrier medium may comprise a computer-readable storage medium or a transmission medium. A carrier medium comprises any medium for carrying instructions that are executable by a processor of a machine.

[079] In an exemplary modality, a computerized method comprises accessing, using at least one of one or more hardware processors, contrast enhancement data that describe at least a portion of a background image for presentation under an informational overlay on a display device, the informational overlay comprising at least one of a graphic element or a textual element to partially obscure a scene; generating, using at least one of the one or more hardware processors, data that identify at least a portion of the informational overlay under which the background image is to be presented; causing, on the display device, presentation of the background image on top of a portion of the scene that corresponds to the at least a portion of the informational overlay; and causing presentation of the informational overlay on top of the background image and the scene on the display device.
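Purely as an illustration of the method of paragraph [079], and not as a definitive implementation of the claims, the following is a minimal sketch in Python, assuming 8-bit RGB/RGBA raster images held as NumPy arrays; the names ContrastEnhancementData and apply_overlay_contrast are hypothetical and appear nowhere in this disclosure.

```python
# Minimal sketch of the method of paragraph [079]; names and defaults are
# illustrative assumptions, not part of the disclosure.
from dataclasses import dataclass

import numpy as np


@dataclass
class ContrastEnhancementData:
    """Describes the background image presented under the overlay."""
    color: tuple[int, int, int]  # indicated color of the background image
    opacity: float               # 0.0 = invisible .. 1.0 = fully opaque


def apply_overlay_contrast(scene, overlay_rgba, portion_mask, enhancement):
    """Present the background image on top of the portion of the scene that
    corresponds to the identified portion of the informational overlay, then
    present the overlay on top of the background image and the scene."""
    frame = scene.astype(np.float32)

    # Background image on top of the corresponding portion of the scene.
    background = np.array(enhancement.color, dtype=np.float32)
    frame[portion_mask] = ((1.0 - enhancement.opacity) * frame[portion_mask]
                           + enhancement.opacity * background)

    # Informational overlay on top of the background image and the scene.
    alpha = overlay_rgba[..., 3:].astype(np.float32) / 255.0
    frame = (1.0 - alpha) * frame + alpha * overlay_rgba[..., :3]
    return frame.astype(np.uint8)
```

Here, portion_mask stands for the generated data that identify the portion of the overlay under which the background image is to be presented; one way such a mask could be derived is sketched after paragraph [085] below.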
[080] In another exemplary modality, which includes all the previous exemplary modalities, the method additionally comprises causing presentation of the scene on the display device.

[081] In another exemplary modality, which includes all the previous exemplary modalities, the contrast enhancement data comprise data that indicate a background pattern to be used for the background image, with the background image being presented using the indicated background pattern.

[082] In another exemplary modality, which includes all the previous exemplary modalities, the contrast enhancement data comprise data that indicate a color for the background image, with the background image being presented using the indicated color.

[083] In another exemplary modality, which includes all the previous exemplary modalities, the contrast enhancement data comprise data that indicate a brightness for the background image, with the background image being presented using the indicated brightness.

[084] In another exemplary modality, which includes all the previous exemplary modalities, the generation of the data identifies the at least a portion of the informational overlay based on a brightness of the portion of the scene that corresponds to the identified portion of the informational overlay.

[085] In another exemplary modality, which includes all the previous exemplary modalities, the generation of the data identifies the at least a portion of the informational overlay based on a visual complexity of the portion of the scene that corresponds to the identified portion of the informational overlay.
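The selection criteria of paragraphs [084] and [085], brightness and visual complexity of the underlying scene portion, could be sketched as follows; the window size and thresholds are illustrative assumptions, with the local standard deviation standing in for visual complexity.

```python
# Hedged sketch of paragraphs [084]-[085]; thresholds are arbitrary examples.
import numpy as np


def identify_overlay_portion(scene_gray, overlay_mask,
                             brightness_level=180.0,
                             complexity_level=25.0,
                             window=8):
    """Flag overlay pixels whose underlying scene block is too bright or too
    visually complex for the overlay to remain legible without help."""
    height, width = scene_gray.shape
    needs_background = np.zeros((height, width), dtype=bool)

    for y in range(0, height, window):
        for x in range(0, width, window):
            block = scene_gray[y:y + window, x:x + window].astype(np.float32)
            too_bright = block.mean() > brightness_level   # paragraph [084]
            too_complex = block.std() > complexity_level   # paragraph [085]
            if too_bright or too_complex:
                needs_background[y:y + window, x:x + window] = True

    # Only scene portions that actually lie under the informational overlay
    # are candidates for the background image.
    return needs_background & overlay_mask
```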
[086] In another exemplary modality, which includes all the previous exemplary modalities, causing the presentation of the background image on top of the portion of the scene comprises causing temporal modulation of the background image.

[087] In another exemplary modality, which includes all the previous exemplary modalities, the method additionally comprises accessing scene change data that indicate a change in the scene, with the generation of the data identifying the at least a portion of the informational overlay based on the scene change data.

[088] In another exemplary modality, which includes all the previous exemplary modalities, the scene change data comprise an indication of a portion of the scene that corresponds to an object of interest in the scene indicated by the identified portion of the informational overlay.

[089] In another exemplary modality, which includes all the previous exemplary modalities, the generation of the data identifying the at least a portion of the informational overlay occurs in response to the object of interest entering the portion of the scene.

[090] In another exemplary modality, which includes all the previous exemplary modalities, the generation of the data identifying the at least a portion of the informational overlay occurs in response to the object of interest leaving the portion of the scene.

[091] In another exemplary modality, which includes all the previous exemplary modalities, the generation of the data identifying the at least a portion of the informational overlay occurs in response to the object of interest coming within a particular perceived distance of a vantage point of the scene.

[092] In another exemplary modality, which includes all the previous exemplary modalities, the generation of the data identifying the at least a portion of the informational overlay occurs in response to the object of interest moving beyond a particular perceived distance from a vantage point of the scene.

[093] In another exemplary modality, which includes all the previous exemplary modalities, the generation of the data identifying the at least a portion of the informational overlay occurs in response to the scene change data indicating a magnitude of the change in the scene that exceeds a particular level over a period of time.

[094] In another exemplary modality, which includes all the previous exemplary modalities, the generation of the data identifying the at least a portion of the informational overlay occurs in response to the scene change data indicating a magnitude of the change in the scene that falls below a particular level over a period of time.

[095] In another exemplary modality, which includes all the previous exemplary modalities, the method additionally comprises comparing a current version of the scene with a previous version of the scene; and generating the scene change data based on the comparison of the current version of the scene with the previous version of the scene.

[096] In another exemplary modality, which includes all the previous exemplary modalities, the method additionally comprises receiving motion data that indicate movement of the display device; and generating the scene change data based on the motion data.

[097] In another exemplary modality, which includes all the previous exemplary modalities, the motion data indicate translational movement of the display device.

[098] In another exemplary modality, which includes all the previous exemplary modalities, the motion data indicate rotational movement of the display device.

[099] In an exemplary modality, a display system comprises one or more hardware processors; and a memory that has instructions stored therein that, when executed by at least one of the one or more hardware processors, cause the display system to perform operations comprising accessing contrast enhancement data that describe at least a portion of a background image for presentation under an informational overlay on a display device, the informational overlay comprising at least one of a graphic element or a textual element to partially obscure a scene; generating data that identify at least a portion of the informational overlay under which the background image is to be presented; causing, on the display device, presentation of the background image on top of a portion of the scene that corresponds to the at least a portion of the informational overlay; and causing presentation of the informational overlay on top of the background image and the scene on the display device.

[0100] In another exemplary modality, which includes all the previous exemplary modalities, the display system additionally comprises the display device.

[0101] In another exemplary modality, which includes all the previous exemplary modalities, the operations additionally comprise accessing scene change data that indicate a change in the scene, with the generation of the data identifying the at least a portion of the informational overlay based on the scene change data.

[0102] In another exemplary modality, which includes all the previous exemplary modalities, the display system additionally comprises a motion detection device coupled to the display device, the motion detection device being configured to generate motion data that indicate movement of the display device, and the operations additionally comprise generating the scene change data based on the motion data.

[0103] In another exemplary modality, which includes all the previous exemplary modalities, the operations additionally comprise comparing a current version of the scene with a previous version of the scene; and generating the scene change data based on the comparison of the current version of the scene with the previous version of the scene.
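One way to read paragraphs [093] to [098] and [0101] to [0103] together is sketched below: the scene change data come either from comparing successive versions of the scene or from motion data of the display device. The magnitude reduction, the units and the levels chosen are assumptions for illustration only.

```python
# Hedged sketch of scene change data per paragraphs [093]-[098];
# the levels and units are illustrative assumptions.
import numpy as np


def scene_change_magnitude(current_scene, previous_scene):
    """Compare a current version of the scene with a previous version and
    reduce the per-pixel difference to a single magnitude of change."""
    difference = np.abs(current_scene.astype(np.float32)
                        - previous_scene.astype(np.float32))
    return float(difference.mean())


def generate_scene_change_data(current_scene, previous_scene,
                               translation_speed, rotation_rate,
                               level=12.0, max_translation=1.0,
                               max_rotation=0.5):
    """Combine the frame comparison with motion data (translational and
    rotational movement of the display device) into scene change data."""
    magnitude = scene_change_magnitude(current_scene, previous_scene)
    return {
        "magnitude": magnitude,
        "exceeds_level": magnitude > level,                    # paragraph [093]
        "translating": translation_speed > max_translation,   # [097], assumed m/s
        "rotating": abs(rotation_rate) > max_rotation,         # [098], assumed rad/s
    }
```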
[0104] In another exemplary modality, which includes all the previous exemplary modalities, the operations additionally comprise causing presentation of the scene on the display device.

[0105] In another exemplary modality, which includes all the previous exemplary modalities, the scene comprises a view of a physical environment; the display device comprises a transparent material through which the physical environment is viewed; the system additionally comprises a projection device; and causing the presentation of the background image and causing the presentation of the informational overlay comprise projecting the background image and the informational overlay onto the transparent material using the projection device.

[0106] In another exemplary modality, which includes all the previous exemplary modalities, the scene comprises a view of a physical environment; the display device comprises a transparent material through which the physical environment is viewed, and a light-emitting system coupled to the transparent material; and causing the presentation of the background image and causing the presentation of the informational overlay comprise causing the presentation of the background image and the informational overlay using the light-emitting system.

[0107] In another exemplary modality, which includes all the previous exemplary modalities, the scene comprises a view of a physical environment; the display device comprises a transparent material through which the physical environment is viewed, a light-emitting system coupled to the transparent material, and a light-blocking system coupled to the transparent material to at least partially block light from one or more portions of the scene; and the operations comprise causing the presentation of the background image using the light-blocking system and causing the presentation of the informational overlay using the light-emitting system.
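The three hardware variants of paragraphs [0105] to [0107] can be contrasted with a toy dispatch routine; the enum members and the command tuples it returns are hypothetical, intended only to show which subsystem handles the background image versus the informational overlay in each variant.

```python
# Toy sketch of the display variants of paragraphs [0105]-[0107]; all names
# are assumptions for exposition, not hardware interfaces from the patent.
from enum import Enum, auto


class DisplayVariant(Enum):
    PROJECTION = auto()      # paragraph [0105]: project onto transparent material
    LIGHT_EMITTING = auto()  # paragraph [0106]: emissive elements on the material
    LIGHT_BLOCKING = auto()  # paragraph [0107]: blocking plus emissive elements


def presentation_commands(variant, background_image, overlay):
    """Return (subsystem, image) pairs showing which subsystem presents the
    background image and which presents the informational overlay."""
    if variant is DisplayVariant.PROJECTION:
        return [("projector", background_image), ("projector", overlay)]
    if variant is DisplayVariant.LIGHT_EMITTING:
        return [("emitter", background_image), ("emitter", overlay)]
    # LIGHT_BLOCKING: the background image is formed by blocking scene light,
    # while the overlay is presented by the light-emitting system.
    return [("blocker", background_image), ("emitter", overlay)]
```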
[0108] In another exemplary modality, which includes all the previous exemplary modalities, the operations additionally comprise generating one or more signals that include information describing the scene, the background image and the informational overlay, and the display device is configured to emit light representing the scene, the background image and the informational overlay based on the one or more signals.

[0109] In an exemplary modality, a non-transitory computer-readable storage medium comprises instructions that, when executed by one or more hardware processors of a machine, cause the machine to perform operations comprising accessing contrast enhancement data that describe at least a portion of a background image for presentation under an informational overlay on a display device, the informational overlay comprising at least one of a graphic element or a textual element to partially obscure a scene; generating data that identify at least a portion of the informational overlay under which the background image is to be presented; causing, on the display device, presentation of the background image on top of a portion of the scene that corresponds to the at least a portion of the informational overlay; and causing presentation of the informational overlay on top of the background image and the scene on the display device.

[0110] Throughout this specification, plural instances may implement components, operations or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations can be performed simultaneously, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in the example configurations can be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component can be implemented as separate components. These and other variations, modifications, additions and improvements fall within the scope of the subject matter in this document.

[0111] Certain exemplary modalities are described in this document as including logic or a number of components, modules or mechanisms. The modules can constitute software modules (for example, code embodied on a machine-readable medium 1022 or in a transmission signal) or hardware modules. A "hardware module" is a tangible unit capable of performing certain operations and can be configured or arranged in a certain physical manner. In various exemplary modalities, one or more computer systems (for example, a standalone computer system, a client computer system or a server computer system) or one or more hardware modules of a computer system (for example, a processor 1002 or a group of processors 1002) can be configured by software (for example, an application or portion of an application) as a hardware module that operates to perform certain operations, as described in this document.

[0112] In some exemplary modalities, a hardware module can be implemented mechanically, electronically, or in any suitable combination thereof. For example, a hardware module can include dedicated circuitry or logic that is permanently configured to perform certain operations. For example, a hardware module can be a special-purpose processor, such as a field-programmable gate array (FPGA) or an ASIC. A hardware module can also include programmable logic or circuitry that is temporarily configured by software to perform certain operations. For example, a hardware module can include software encompassed within a general-purpose processor or another programmable processor. It will be noted that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (for example, configured by software) can be driven by cost and time considerations.

[0113] Consequently, the phrase "hardware module" should be understood to encompass a tangible entity, whether an entity that is physically constructed, permanently configured (for example, hardwired), or temporarily configured (for example, programmed) to perform certain operations described in this document. As used in this document, "hardware-implemented module" refers to a hardware module. Considering the exemplary modalities in which the hardware modules are temporarily configured (for example, programmed), each of the hardware modules does not need to be configured or instantiated at any single instance in time. For example, when a hardware module comprises a general-purpose processor configured by software to become a special-purpose processor, the general-purpose processor can be configured as respectively different special-purpose processors (for example, comprising different hardware) at different times.
The software can therefore configure a processor 1002, for example, to constitute a particular hardware module at one instance in time and to constitute a different hardware module at a different instance in time.

[0114] The various operations of the exemplary methods described in this document can be performed, at least partially, by one or more processors that are temporarily configured (for example, by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors can constitute processor-implemented modules that operate to perform one or more operations or functions described in this document. As used in this document, "processor-implemented module" refers to a hardware module implemented using one or more processors.

[0115] Similarly, the methods described in this document can be at least partially processor-implemented, with a processor being an example of hardware. For example, at least some of the operations of a method can be performed by one or more processors or processor-implemented modules.

[0116] Some portions of the subject matter discussed in this document may be presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (for example, a computer memory). Such algorithmic or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used in this document, an "algorithm" is a self-consistent sequence of operations or similar processing that leads to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities can take the form of electrical, magnetic or optical signals capable of being stored, accessed, transferred, combined, compared or otherwise manipulated by a machine. It is sometimes convenient, mainly for reasons of common use, to refer to such signals with the use of words such as "data", "content", "bits", "values", "elements", "symbols", "characters", "terms", "numbers", "numerals" or the like. These words, however, are merely convenient labels and must be associated with appropriate physical quantities.

[0117] Unless specifically stated otherwise, discussions in this document that use words such as "process", "compute", "calculate", "determine", "present", "display", or the like may refer to the actions or processes of a machine (for example, a computer) that manipulates or transforms data represented as physical quantities (for example, electronic, magnetic or optical) within one or more memories (for example, volatile memory, non-volatile memory, or any suitable combination thereof), registers, or other machine components that receive, store, transmit, or display information. In addition, unless specifically stated otherwise, the terms "a" or "an" are used in this document, as is common in patent documents, to include one or more than one instance. Finally, as used in this document, the conjunction "or" refers to a non-exclusive "or", unless specifically stated otherwise.
[0118] Although an overview of the inventive subject matter has been described with reference to specific exemplary modalities, several modifications and changes can be made to these exemplary modalities without departing from the broader scope of the modalities of the present disclosure. For example, various modalities or features thereof can be mixed and matched or made optional by a person of ordinary skill in the art. Such modalities of the inventive subject matter may be referred to in this document, individually or collectively, by the term "invention" merely for the sake of convenience and without the intention of voluntarily limiting the scope of this application to any single inventive concept if more than one is, in fact, disclosed.

[0119] The exemplary modalities illustrated in this document are described in sufficient detail to allow those skilled in the art to practice the disclosed teachings. Other modalities can be used and derived from them, so that structural and logical substitutions and changes can be made without departing from the scope of this disclosure. The detailed description, therefore, should not be considered in a limiting sense, and the scope of the various modalities is defined only by the appended claims, together with the full range of equivalents to which such claims are entitled.

[0120] In addition, plural instances can be provided for features, operations or structures described in this document as a single instance. In addition, boundaries between various features, operations, modules, mechanisms and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of various modalities of this disclosure. In general, structures and functionality presented as separate features in the example configurations can be implemented as a combined structure or feature. Similarly, structures and functionality presented as a single feature can be implemented as separate features. These and other variations, modifications, additions and improvements fall within the scope of the modalities of this disclosure as represented by the appended claims. The specification and the drawings should therefore be considered in an illustrative rather than a restrictive sense.
Claims (23)

1. Computerized method CHARACTERIZED by the fact that it comprises:
accessing contrast enhancement data that describe at least a portion of a background image for presentation under an informational overlay on a display device, the informational overlay comprising at least one of a graphic element or a textual element to partially obscure a scene;
generating data that identify at least a portion of the informational overlay under which the background image is to be presented;
causing, on the display device, presentation of the background image over a portion of the scene that corresponds to the at least a portion of the informational overlay; and
causing presentation of the informational overlay over the background image and the scene on the display device.

2. Computerized method, according to claim 1, CHARACTERIZED by the fact that it additionally comprises: causing presentation of the scene on the display device.

3. Computerized method, according to claim 1 or 2, CHARACTERIZED by the fact that the contrast enhancement data comprise data that indicate a background pattern to be used for the background image, with the background image being presented using the indicated background pattern.

4. Computerized method, according to claim 1 or 2, CHARACTERIZED by the fact that the contrast enhancement data comprise data that indicate a color for the background image, with the background image being presented using the indicated color.

5. Computerized method, according to claim 1 or 2, CHARACTERIZED by the fact that the contrast enhancement data comprise data that indicate a brightness for the background image, with the background image being presented using the indicated brightness.

6. Computerized method, according to any one of claims 1 to 5, CHARACTERIZED by the fact that the generation of the data identifies the at least a portion of the informational overlay based on a brightness of the portion of the scene that corresponds to the identified portion of the informational overlay.

7. Computerized method, according to any one of claims 1 to 5, CHARACTERIZED by the fact that the generation of the data identifies the at least a portion of the informational overlay based on a visual complexity of the portion of the scene that corresponds to the identified portion of the informational overlay.

8. Computerized method, according to any one of claims 1 to 7, CHARACTERIZED by the fact that causing the presentation of the background image over the portion of the scene comprises causing temporal modulation of the background image.

9. Method, according to any one of claims 1 to 8, CHARACTERIZED by the fact that it additionally comprises: accessing scene change data that indicate a change in the scene; the generation of the data identifying the at least a portion of the informational overlay being based on the scene change data.

10. Computerized method, according to claim 9, CHARACTERIZED by the fact that the scene change data comprise an indication of a portion of the scene that corresponds to an object of interest in the scene indicated by the identified portion of the informational overlay.

11. Computerized method, according to claim 10, CHARACTERIZED by the fact that the generation of the data identifying the at least a portion of the informational overlay occurs in response to the object of interest entering the portion of the scene.
12. Computerized method, according to claim 10, CHARACTERIZED by the fact that the generation of the data identifying the at least a portion of the informational overlay occurs in response to the object of interest leaving the portion of the scene.

13. Computerized method, according to claim 10, CHARACTERIZED by the fact that the generation of the data identifying the at least a portion of the informational overlay occurs in response to the object of interest coming within a particular perceived distance of a vantage point of the scene.

14. Computerized method, according to claim 10, CHARACTERIZED by the fact that the generation of the data identifying the at least a portion of the informational overlay occurs in response to the object of interest moving beyond a particular perceived distance from a vantage point of the scene.

15. Computerized method, according to claim 10, CHARACTERIZED by the fact that the generation of the data identifying the at least a portion of the informational overlay occurs in response to the scene change data indicating a magnitude of the change in the scene that exceeds a particular level over a period of time.

16. Computerized method, according to claim 10, CHARACTERIZED by the fact that the generation of the data identifying the at least a portion of the informational overlay occurs in response to the scene change data indicating a magnitude of the change in the scene that falls below a particular level over a period of time.

17. Computerized method, according to any one of claims 9 to 16, CHARACTERIZED by the fact that it additionally comprises: comparing a current version of the scene with a previous version of the scene; and generating the scene change data based on the comparison of the current version of the scene with the previous version of the scene.

18. Computerized method, according to any one of claims 9 to 17, CHARACTERIZED by the fact that it additionally comprises: receiving motion data that indicate movement of the display device; and generating the scene change data based on the motion data.

19. Computerized method, according to claim 18, CHARACTERIZED by the fact that the motion data indicate translational movement of the display device.

20. Computerized method, according to claim 18, CHARACTERIZED by the fact that the motion data indicate rotational movement of the display device.

21. Display system CHARACTERIZED by the fact that it comprises: one or more hardware processors; and a memory that has instructions stored therein that, when executed by at least one of the one or more hardware processors, cause the display system to perform the method, according to any one of claims 1 to 20.

22. Computer-readable storage medium CHARACTERIZED by the fact that it comprises instructions that, when executed by one or more hardware processors of a machine, cause the machine to perform the method, according to any one of claims 1 to 20.

23. Carrier medium CHARACTERIZED by the fact that it carries instructions that, when executed by one or more hardware processors of a machine, cause the machine to perform the method, according to any one of claims 1 to 20.
Family patents:

Publication number | Publication date
EP3571672A1 | 2019-11-27
US20190259189A1 | 2019-08-22
CN110168616A | 2019-08-23
US10235788B2 | 2019-03-19
US11200716B2 | 2021-12-14
US20180204362A1 | 2018-07-19
WO2018136426A1 | 2018-07-26
Legal status:

2021-10-13 | B350 | Update of information on the portal [chapter 15.35 patent gazette]
Priority:

Application number | Publication number | Filing date | Patent title
US 15/407,491 | US10235788B2 | 2017-01-17 | Overlay contrast control in augmented reality displays
PCT/US2018/013893 | WO2018136426A1 | 2018-01-16 | Overlay contrast control in augmented reality displays