Patent abstract:
The present invention relates to a method of evaluating wireless device performance and/or wireless network performance and/or wireless network usage trends on a wireless electronic device, the method comprising: providing wireless device software on the wireless electronic device, the wireless device software being incorporated into an application or memory of the wireless electronic device and being operable to perform at least one test associated with the characteristics and/or location of the device, and/or the performance of the device and/or network, and/or the use of the device by a user; communicating with at least one of: i) an operating system of the wireless electronic device; and ii) an external test server to perform the at least one test, wherein the at least one test is performed according to at least one test parameter provided in a configuration and/or the wireless device software, the at least one test parameter comprising at least one behavioral criterion that controls the impact of the test on the operation of the wireless electronic device; collecting test data resulting from the performance of the at least one test; and providing the test data to a collection server.
Publication number: BR112019014658A2
Application number: R112019014658
Filing date: 2018-01-16
Publication date: 2020-05-26
Inventors: Sean Kroeker Anthony; Stephen Chow Brennen; Daniel Yarish David; Banford Bulmer Macdonald Hunter; William Neville Stephen; E Darcie Thomas
Applicant: Tutela Tech Ltd;
IPC main classification:
Patent description:

Invention Patent Report for SYSTEM AND METHOD FOR EVALUATING WIRELESS DEVICE AND/OR WIRELESS NETWORK PERFORMANCE.
CROSS-REFERENCE TO RELATED PATENT APPLICATION(S) [0001] This patent application claims priority to US Provisional Patent Application No. 62/447,255, filed on January 17, 2017, the contents of which are incorporated by reference in this specification.
TECHNICAL FIELD [0002] The present invention relates to systems and methods for evaluating wireless device and/or wireless network performance, and trends in wireless network usage.
DESCRIPTION OF THE RELATED ART [0003] The number of wireless devices that access wireless communication networks is continuously growing. These devices can access the various networks through cellular, WiFi and other access points. As the number of devices grows, the demand on these networks grows, affecting the performance of both the networks and the devices.
[0004] To determine the performance of wireless devices and wireless networks, network service providers, device manufacturers, application developers and other entities that have a stake in this performance require performance and usage data. There are several techniques for collecting and evaluating performance and usage data, for example, standalone applications on devices, or modules that perform periodic testing. Wireless carriers can also have native applications that have access to certain performance data that can be evaluated.
[0005] However, these techniques can be invasive to the devices
Petition 870190067243, of 07/16/2019, p. 33/100
and to the users of those devices, be limited to the type of network and/or the type of device and/or the type of user on which the applications or modules are employed, or be rigid in when and how the techniques are implemented.
SUMMARY [0006] The following describes a system and method that embeds or otherwise uses software functionality in wireless electronic devices (for example, embedded in applications, memory, an operating system (OS) or elsewhere) to collect data regarding the use and performance of the device, applications, OS, network, etc. To balance the competing goals of interacting with devices to collect data while having minimal effect on device applications and operations, the software functionality is integrated in a flexible and transparent manner.
[0007] By using this software functionality on a large number of devices on a large number of networks, large and significant data sets spanning different applications, devices, networks, etc. can be assembled without adversely impacting the operability of the devices during their periods of normal use.
[0008] In one aspect, a method of evaluating wireless device performance and/or wireless network performance and/or wireless network usage trends on a wireless electronic device is provided, the method comprising: providing wireless device software on the wireless electronic device, the wireless device software being incorporated into an application or memory of the wireless electronic device and being operable to perform at least one test associated with the characteristics and/or location of the device, and/or device and/or network performance, and/or use of the device by a user; communicating with at least one of: i) an operating system of the wireless electronic device; and ii) an external test server to perform the at least one test, wherein the at least one test is performed according to at least one test parameter provided in a configuration and/or the wireless device software, the at least one test parameter comprising at least one behavioral criterion that controls the impact of the test on the operation of the wireless electronic device; collecting test data resulting from the performance of the at least one test; and providing the test data to a collection server.
[0009] In other aspects, systems and a computer-readable medium configured or operable to perform the method are provided.
BRIEF DESCRIPTION OF THE DRAWINGS [0010] The embodiments will be described below by way of example with reference to the attached drawings, in which:
Figure 1 is a schematic block diagram of a wireless communication environment, which includes several electronic communication devices;
Figure 2A is a block diagram of a configuration for a wireless device;
Figure 2B is a block diagram of another configuration for a wireless device;
Figure 2C is a block diagram of another configuration for a wireless device, which includes a media classification toolkit (MAT);
Figure 3 is a block diagram of a configuration for collecting data from mobile devices using wireless device software (WDS);
Figure 4 is a block diagram of a configuration for collecting data from mobile devices using WDS, with data processing, data distribution and device software support;
Figure 5 is a block diagram illustrating the configuration shown in Figure 4 for various third party devices and systems;
Figure 6 is a schematic diagram illustrating a configuration for a game or application, which includes a WDS and / or a MAT-WDS;
Figure 7 is a schematic diagram illustrating a configuration for centrally managing multiple WDS on one device;
Figure 8 is a block diagram illustrating additional details for the configuration shown in Figure 4;
Figure 9 is a block diagram illustrating a configuration in which feedback and configuration servers are used to communicate with and/or control applications using wireless device software on various mobile devices; and
Figure 10 is a flow chart illustrating computer-executable instructions executed in collecting performance and/or usage data on a device.
DETAILED DESCRIPTION [0011] The following describes a system that allows collaborative data from multiple devices across many applications, networks and environments to be gathered and analyzed in a way that allows more meaningful and complete data to be obtained without unduly or adversely impacting the devices and networks with which the system operates.
[0012] As will be described in this specification, in order to have a minimal impact on the applications and processes running on the devices in which the software functionality is employed, several factors must be considered.
[0013] For example, the software functionality should not use too much battery power, CPU processing power, network bandwidth or memory. Data collection rates must also be adjustable, so that they can be reduced to accommodate user actions, network-intensive actions and, depending on the device's battery level, the device's location and activities in the app. To employ the software functionality more broadly and obtain a more comprehensive set of data, the software functionality relies on integration with applications, games, operating systems, tools/utilities, etc., which are widely distributed and used. As such, the software functionality should not interfere with the performance of, for example, advertisements that provide revenue to an app/game partner. For example, the software functionality should not run a complete test while an advertisement is being loaded over the network.
[0014] The software functionality described in this specification is also configured to be responsive to a configuration server, which establishes the rules of when and where tests should be performed, which tests should be performed, and in what quantity. This allows the software functionality to be adjusted to accommodate different networks/locations, different applications/games and different users. For example, the software functionality should wait until the device connects to WiFi before transferring collected data records to the server/system, in order to minimize the impact on a user's data plan (and/or establish limits on monthly usage for any tests done on a particular device or account). In this way, the software functionality can safeguard against adversaries wishing to use the present system to target devices and applications.
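The behavioral criteria described above (deferring uploads until Wi-Fi is available, capping monthly cellular usage, respecting battery state) can be sketched as a simple gating check. All names and threshold values here are illustrative assumptions, not taken from the actual WDS implementation:

```python
from dataclasses import dataclass

@dataclass
class BehaviorConfig:
    """Hypothetical behavioral criteria pushed down by a configuration server."""
    require_wifi_for_upload: bool = True                # defer log uploads to Wi-Fi
    monthly_mobile_quota_bytes: int = 50 * 1024 * 1024  # cap on cellular test data
    min_battery_percent: int = 20                       # skip tests on a low battery

def may_run_active_test(cfg, on_wifi, mobile_bytes_used, battery_percent):
    """Return True only if an active test would respect the behavioral criteria."""
    if battery_percent < cfg.min_battery_percent:
        return False  # do not drain an already-low battery
    if not on_wifi and mobile_bytes_used >= cfg.monthly_mobile_quota_bytes:
        return False  # monthly cellular quota exhausted
    return True

def may_upload_logs(cfg, on_wifi):
    """Uploads of collected records wait for Wi-Fi when so configured."""
    return on_wifi or not cfg.require_wifi_for_upload
```

In this sketch the configuration server would only need to ship new `BehaviorConfig` values to retune the software's footprint on a per-network or per-application basis.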
[0015] As will be described in this specification, the software functionality can perform active tests that generate network traffic to measure network performance. This may involve pinging a test server to test network latency, or downloading a file of a predetermined size from that server to determine transfer speeds and throughput. Passive tests are also performed, in which the software functionality does not generate any additional traffic, but instead monitors the existing traffic generated by the user, the applications, the network infrastructure, etc. For example, the software functionality can monitor the speed at which an advertisement is downloaded and the size of that advertisement to estimate speed and throughput. The software functionality can also be configured to be incorporated within, or to interface between, the application and the OS, to access information that is already being tracked or is otherwise available through the OS. With these different techniques, the software functionality can adapt and adjust its operations to gather meaningful and complete data for the system, while considering the performance of the device, the application and the network.
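The passive-measurement idea above (estimating throughput from traffic the application already generates, such as an advertisement download, rather than creating new traffic) reduces to a simple calculation. The function name is an illustrative assumption:

```python
def passive_throughput_bps(payload_bytes, start_time_s, end_time_s):
    """Estimate throughput in bits per second from an already-observed transfer,
    e.g. an ad download whose size and timing the software merely recorded."""
    elapsed = end_time_s - start_time_s
    if elapsed <= 0:
        raise ValueError("transfer interval must be positive")
    return payload_bytes * 8 / elapsed
```

For example, a 1 MB advertisement observed to download in 2 seconds implies roughly 4 Mbit/s of usable throughput, with no extra bytes placed on the network.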
[0016] Turning now to the figures, Figure 1 illustrates an example of a wireless environment 10, which can include one or more wireless networks 14. The networks 14 can be of different types, for example, different types of cellular networks (2G, 3G, 4G, etc.). The different network types can also include other types of wireless networks 14, such as WiFi networks accessible via available WiFi access points. Within the wireless network environment 10, various electronic communication devices 12 having wireless capabilities operate by connecting to one or more of the different networks/network types 14 and/or directly to each other through point-to-point or mesh-type connections.
[0017] To obtain collaborative data 16 from the environment 10, each device 12 includes software functionality that is able to run tests, monitor the operations and usage of the device, and otherwise collect data on, or related to, one or more applications on that device 12. This software functionality is referred to in this specification as wireless device software (WDS), and in the example shown in Figure 1, the WDS 40 is incorporated into at least one application 38 on the devices 12. However, as will be illustrated below, the WDS 40 can be introduced on the devices 12 in configurations different from those shown in Figure 1. In partnership with publishers of various mobile applications (e.g., games), operating systems or tools/utilities, the WDS 40 can be distributed to millions of mobile devices 12 to anonymously collect QoS, network usage trends, and device and application usage data, among other things. The WDS 40 can also be integrated with mobile applications or other embedding applications to be used on various mobile devices 12. The WDS 40 can be incorporated into applications and games on various operating systems (OSs) for mobile platforms, such as Android, iOS, Windows, etc., as well as on other mobile platforms, such as those for wearables, games, vehicle systems, wireless sensors, etc. That is, any other device/platform with location tracking capabilities (for example, GPS, network-based location, etc.) and network connectivity (for example, internet), with the ability to run the software functionality described in this specification, is applicable to the data collection, analysis and feedback/recording mechanisms described in this specification. In some implementations, devices with only network connectivity and no location-based capabilities can also be incorporated into the system.
[0018] The data 16 that is collected is preferably tied to a location or otherwise considered location-based, so that the data 16, or information derived from the data 16, can be placed on a map. The data 16 is also preferably collected in an anonymous way, so that no personally identifiable information is collected and/or stored by the system 18. For example, the system 18 should be configured not to collect the device advertising ID, device ID or other information that could be used with another data set to identify the user of the device 12. In one implementation, the software functionality described in this specification can be configured to generate and attach a unique random number, which is specific to the particular installation, and which is reset (for example, regenerated) periodically (for example, every day). This can be done to ensure that an adversary cannot observe recorded data from a device over a period of several days to determine to whom that device belongs.
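The periodically regenerated random identifier described in [0018] can be sketched as follows. The class name and the choice of a 128-bit hex token are illustrative assumptions; the point is that the UID carries no device information and changes every day, so records from different days cannot be linked:

```python
import secrets

class RotatingInstallId:
    """Hypothetical per-installation anonymous UID, regenerated daily."""

    def __init__(self):
        self._day = None
        self._uid = None

    def uid_for_day(self, day):
        """Return the anonymous UID for the given day, regenerating on day change."""
        if day != self._day:
            self._day = day
            self._uid = secrets.token_hex(16)  # 128 random bits, no device data
        return self._uid
```

Within one day, all records share a UID (so they can be grouped); across days, the UIDs are unlinkable.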
[0019] The data 16 may include, without limitation: device location, device manufacturer name, device model, OS name and version, network operator ID, % free memory, CPU usage, battery drain rate, storage usage (i.e., device metrics); application name, bytes transferred, bytes sent, time of first installation, time of last update (i.e., mobile application metrics); send throughput, transfer throughput, latency, connection speed, signal strength, jitter, packet download speed, packet loss, number of radio-frequency conflicts (i.e., network QoS metrics); BSSID, SSID, signal strength (i.e., Wi-Fi scan metrics); connection start/end times, connection type, technology, service provider, cell ID, LAC, MCC, MNC, DHCP response time (i.e., connection metrics), etc.
[0020] The collected data 16 is fed to a central system 18, which includes modules and processes to collect the data 16, process and analyze the data 16, generate feedback for the WDS 40 and the devices 12, and prepare the user interfaces and reports. It should be understood that multiple central systems 18 can be used, for example, to comply with data-handling laws that require data from a particular jurisdiction to be stored in that jurisdiction, etc. The data 16 can be securely stored in cloud-based databases and transmitted over secure connections (for example, HTTPS). The databases can be dispersed globally and can be configured to provide direct access to customers of the system 18.
[0021] Reports and user interfaces are generated and made available to one or more third parties 22, in particular to game or application developers, device manufacturers, network operators, etc. The reports and user interfaces can be provided using data visualization tools, such as graphical reports, interactive dashboards, network tools, etc. Reports can be issued periodically or in real time, with dashboards available online on demand at any time.
[0022] Other third parties 22, such as self-organizing networks (SONs), can also benefit from the data 16 collated by the system 18. SONs dynamically adjust the network, antennas or waveform characteristics in response to network quality. The idea is that the network is self-correcting, so that if there is a problem, the network will adjust itself to eliminate the problem on its own. SONs typically require access to large amounts of field data to operate, a requirement that can be satisfied with the large data sets obtainable using the system 18. The system 18 and, optionally, the third parties 22 can also communicate with a media rating system 28, which, as explained below, is an example of a special-purpose data collection system that uses a customized version of the WDS 40.
[0023] As will be explained in more detail below, the system 18 can have, or otherwise control or operate, a configuration server, which generally represents any entity or service that can be controlled by the system 18, or used during its execution, to interact with the WDS 40 introduced on the devices 12. Thus, the behavior of the WDS 40, or its effects on and/or the behavior of the applications 38, can be modified by the system 18 or even directly by third parties 22, if registered or otherwise permitted by the system 18. This allows commands or feedback 20 to be provided by the system 18 to the WDS 40 on any or all devices 12, if desired. In one example, the configuration server can be used to provide a kill switch or other functionality that can instruct the WDS 40 and/or the application 38 itself to terminate or suspend operations, for example, in the event of a security breach, or to avoid a potential application or server failure. It should be understood that the kill switch module 126 can also be integrated into the configuration server 124 and/or the application 38 or WDS 40.
[0024] Turning now to Figure 2A, an example of a configuration for an electronic device 12 is shown having at least one instance of the WDS 40. The device 12 includes a processor 30, memory 32 and an operating system 42. The device 12 is also operable in this example to provide graphical user interfaces to a user via a display 34. For example, a visual component can be provided directly on the device 12 to display part of the collected information to the user. The device 12 also includes one or more communication interfaces 36, which are operable to connect the device 12 to one or more networks 14. As also shown in Figure 2A, the device 12 can include multiple applications 38 of different types, as discussed above. To collect and send data 16 that is relevant across multiple applications 38 and application types, each application 38 (or any one or more applications 38) can incorporate the WDS 40 mentioned above (which is embedded and runs in the background in the application 38) to gather private data, run tests, etc. The WDS 40 is able not only to access components on the device 12, such as the processor 30, the battery (not shown in Figure 2A) and the OS 42; the WDS 40 can also be configured to communicate with one or more networks 14, directly or through the application 38 in which it resides, by interacting with one or more of the communication interfaces 36. It should be understood that the WDS 40 can have hard-coded limits on the number of tests that can be performed over a period of time, which limits are unalterable by the configuration server. The WDS 40 can also be operable to identify its own code running in a different application on the same electronic device, and be responsive to identifying its own code running in the different application, so that only one instance of the wireless device software operates at a time.
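One way to realize the single-active-instance rule above, when several applications on one device each embed the WDS, is for every copy to compute the same deterministic election so they agree without coordinating. The election rule and all names below are invented for illustration; the patent does not specify the mechanism:

```python
def elect_active_instance(instance_ids):
    """Deterministically pick one instance ID; every copy computes the same winner."""
    return min(instance_ids)

def should_collect(my_id, instance_ids):
    """A WDS copy collects data only if it won the election among visible copies."""
    return my_id == elect_active_instance(instance_ids)
```

Any shared, observable attribute (installation time, package name, etc.) could serve as the instance ID, as long as all copies order it the same way.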
[0025] The WDS 40 can be implemented as a software library, which is incorporated into mobile device applications, as shown in Figure 2A, and designed to register and integrate with the system 18, as shown below. The WDS 40 is a mobile library configured for one or more OS types, for example, iOS, Android, Windows, BlackBerry, etc. The library is configured to be easily incorporated into an existing application to collect device, connection, network QoS, Wi-Fi and basic application key performance indicators (KPIs). Using this unintrusive approach requires that the WDS 40 have the ability to communicate with the system 18 over a network connection, whether Wi-Fi or mobile (for example, cellular). This provides flexibility of deployment through a network infrastructure anywhere in the world. The WDS 40 interacts with several sets of servers during its operation, details of which are provided below, namely: an authentication server, a configuration server, test servers and logging servers. The licensing server dictates which API keys and applications are permitted to operate and collect data using the WDS 40. The configuration server establishes rules and parameters specific to the operation of the WDS 40. The WDS 40 can use the test servers to run active tests on the connected network. The WDS 40 can have built-in intelligence, responsive to the configuration server, that determines which test server to interact with for active testing. Globally, there are many servers that can be used for these tests, and the system must dynamically select the correct one for the WDS 40 each time. For example, the system can always select the geographically closest test server.
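Selecting the geographically closest test server, as mentioned above, amounts to minimizing the great-circle distance between the device and each candidate server. The server names and coordinates below are invented for illustration:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def closest_server(device_lat, device_lon, servers):
    """servers: iterable of (name, lat, lon) tuples; return the nearest server's name."""
    return min(servers,
               key=lambda s: haversine_km(device_lat, device_lon, s[1], s[2]))[0]
```

In practice the candidate list (with latitude and longitude per server) would come from the configuration server, which already carries those fields in the DSC parameters described later.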
[0026] The logging servers are used to send payloads of data from the WDS 40 to the system for subsequent analysis, logging, feedback, etc.
[0027] It should be understood that, preferably, no personally identifiable information is recorded or transmitted, including no key press information, no contact information, etc.
[0028] Below is a non-exhaustive list of KPIs that can be collected by the WDS 40 and stored locally in a database (for example, SQLite):
[0029] For network QoS -> device manufacturer name, device model, OS name and version, device screen resolution width, device screen resolution height, total device memory, total device disk storage, language, connection type, mobile technology, service provider, connection start time stamp, connection end time stamp, cell ID, location area code (LAC), basic service set identifier (BSSID), service set identifier (SSID), mobile country code (MCC), mobile network code (MNC), time zone, physical cell ID (PCI), QoS collection time stamp, send throughput, transfer throughput, latency, connection speed, signal strength, latitude, longitude, altitude, horizontal accuracy, vertical accuracy, jitter, packet loss, bearing, free device memory, device CPU, device battery level, device battery status, bytes transmitted in delta, bytes received in delta, reference signal received power (RSRP), reference signal received quality (RSRQ), reference signal signal-to-noise ratio (RSSNR), channel quality indicator (CQI), timing advance (TA), system uptime, country, region, city, geohash, etc.
[0030] For application data usage -> device manufacturer name, device model, OS name and version, device screen resolution width, device screen resolution height, total device memory, total device disk storage, language, connection type, mobile technology, service provider, connection start time stamp, connection end time stamp, cell ID, LAC, BSSID, SSID, MCC, MNC, time zone, PCI, application name, application package name, bytes transmitted by the application in delta, bytes received by the application in delta, application active time in delta, application first installation time, application last update time, country, region, city, average latitude, average longitude, geohash, etc.
[0031] For the WDS database -> primary key (device), identifier, device manufacturer name, device model, OS name and version, device screen resolution width, device screen resolution height, total device memory, total device disk storage, language, WDS version, database version, active time, date created, host application package name, host application version, connection ID, connection type, mobile technology, service provider, connection start time stamp, connection end time stamp, cell ID, LAC, BSSID (connection), SSID (connection), response time, authentication time, MCC, MNC, time zone, PCI, primary key (QoS), connection ID (QoS), QoS collection time stamp, send throughput, transfer throughput, average latency, maximum latency, minimum latency, connection speed, signal strength, latitude, longitude, altitude, horizontal accuracy, vertical accuracy, average jitter, maximum jitter, minimum jitter, server response packets sent, server response packets discarded, server response percentage discarded, bytes transmitted, packets transmitted, bytes received, packets received, server response packets sent, server response packets lost, server response percentage lost, speed, bearing, free device memory, device CPU, device battery level, device battery status, bytes transmitted in delta, bytes received in delta, RSRP, RSRQ, RSSNR, CQI, TA, system uptime, transfer throughput test size, send throughput test size, QoS test enabler, transfer throughput test server DNS, transfer throughput test server IP, send throughput test server DNS, send throughput test server IP, server response test server DNS, server response test server IP, primary key (Wi-Fi scan), Wi-Fi scan time stamp, BSSID (Wi-Fi scan), SSID (Wi-Fi scan), signal strength, frequency, latitude (Wi-Fi scan), longitude (Wi-Fi scan), altitude (Wi-Fi scan), horizontal accuracy (Wi-Fi scan), vertical accuracy (Wi-Fi scan), access point capabilities, primary key (Wi-Fi scan), connection ID (Wi-Fi scan), ADU time stamp, application name, application package name, bytes transmitted by the application in delta, bytes received by the application in delta, bytes transmitted by the application, bytes received by the application, application active time in delta, application active time, application first installation time, application last update time, etc.
[0032] For configuration server parameters (DSC - dynamic server configuration) -> the UNIX time stamp of the DSC configuration, the version of the DSC configuration, the JSON schema version of the DSC, the frequency at which the DSC configuration is enabled, a Boolean flag to enable testing on variations of device connectivity, a Boolean flag to enable testing on variations of device location, the minimum horizontal distance a device must travel before the location is updated, the minimum time that must elapse before the location is updated, the minimum bearing by which a device must vary before the location is updated, the quality of service for location updates that defines the device's location accuracy and power consumption, the list of location limits that define the specific locations for tests to be performed within the limits, the longitude toward the southwest of the location limit, the latitude toward the northeast of the location limit, the longitude toward the northeast of the location limit, a Boolean flag to allow only passive tests, the minimum time that must elapse between passive tests, a Boolean flag to enable server response tests, the frequency at which server response tests will be performed, a Boolean flag to enable server response tests over a cellular connection, the minimum number of location variations between server response tests, the minimum time that must elapse between server response tests, the list of strings that define specific networks on which tests are to be performed, the Wi-Fi network SSID or the name of the mobile service provider, a Boolean flag to enable periodic throughput tests, the frequency at which throughput tests will be performed, a Boolean flag to enable throughput tests over a cellular connection, the minimum number of location variations between throughput tests, the minimum time that must elapse between throughput tests, a Boolean flag to enable mobile connections, a Boolean flag to enable Wi-Fi connections, the frequency at which the database will be exported, a Boolean flag to enable the export of SDK error records with the database export, the maximum size of the database files before the SDK attempts to force the database export, the maximum size of the database files before the SDK attempts to force the database export and, if unsuccessful, stops data collection, a Boolean flag to collect scans of Wi-Fi access points, a Boolean flag to collect application data usage information, the maximum number of applications for which to track application data usage information, a Boolean flag to collect Android system applications, the list of system application package strings to include in ADU reports, the package name of the system applications, the URL of the server for performing server response tests, the number of packets to be sent when performing a server response test, the size of the packets to be sent when performing a server response test, the timeout when performing a server response test, the URL of the server for performing transfer throughput tests, the URL of the server for performing send throughput tests, the size of the test file to be used when performing a transfer throughput test, the timeout when executing a transfer throughput test, the timeout when executing a send throughput test, the list of QoS test servers for running tests on the network based on the location of the device, the latitude of the QoS test server, the longitude of the QoS test server, the URL of the QoS test server for running server response tests, the URL of the QoS test server for running transfer throughput tests, the URL of the QoS test server for running send throughput tests, the URL of the server for registering applications and receiving a valid input key if a valid API key is provided, the URL of the server for checking that the input key is valid for the particular application package name, the frequency at which the device UID is reset, the URL of the server for exporting log files from the database export, the URL of the server for exporting log files, the maximum number of daily SDK errors before the SDK is shut down for 24 hours, the maximum allowed monthly mobile data quota, a Boolean flag to record Wi-Fi connection change information, the URL of the server for recording Wi-Fi connection change information, a Boolean flag to register the server UID with the Wi-Fi connection change information, a Boolean flag to enable the host application to provide additional location information to the SDK, the URL of the server for the host application to obtain additional location information for the SDK, a Boolean flag to check the external IP address, the URL of the server for determining the external IP address, etc.
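To make the DSC parameter list above concrete, a configuration payload of this kind might be expressed as JSON. The key names and values below are invented assumptions modelled on the parameter descriptions, not the actual DSC schema:

```python
import json

# Hypothetical DSC payload exercising a few of the parameters listed in [0032].
dsc_example = json.dumps({
    "dscVersion": 3,                     # version of the DSC configuration
    "timestamp": 1484611200,             # UNIX time stamp of this configuration
    "passiveTestsOnly": False,           # allow active tests too
    "serverResponseTests": {"enabled": True, "minIntervalSeconds": 900},
    "throughputTests": {"enabled": True, "minIntervalSeconds": 3600},
    "monthlyMobileQuotaBytes": 52428800, # maximum monthly mobile data quota
    "exportOnWifiOnly": True,            # defer database export to Wi-Fi
})

# The WDS would parse the payload on receipt and apply the rules.
config = json.loads(dsc_example)
```

A payload like this lets the operator of the system retune test frequency, data quotas and export behavior per device population without shipping new application code.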
[0033] As such, the WDS 40 is configured to conduct tests and gather large and varied amounts of data 16, which can be stored on, sent to and used by the system 18, as described in this specification.
[0034] In general, there are two types of KPIs that can be captured by the WDS 40, as shown in Table 1 below:

Test type: Active
Description: An active test is one that uses network traffic to measure the quality of the network connection. These tests require the WDS to interact with a QoS test server to complete. Active tests include, for example:
1. transfer throughput tests;
2. send throughput tests; and
3. server response tests:
a. latency;
b. jitter; and
c. packet loss.

Test type: Passive
Description: A passive test is one that only transfers data from the device's OS without using any network traffic. Passive tests include, for example:
1. all device information;
2. all connection information;
3. most QoS KPIs, including data usage and location;
4. all Wi-Fi scan information; and
5. all application data usage information.

Table 1: Summary of KPI types.
[0035] As indicated above, the collection rate of the KPIs is preferably determined by the configuration provided to WDS 40 on first start-up. The frequency of active tests can be adjusted by the configuration server. Passive tests can be performed when an active test is not being conducted, to prevent one test from influencing the other and to ensure that the amount of CPU in use is minimized. However, passive and active tests can be performed at the same time when appropriate. It should be understood that WDS 40 can be configured to collect only passive measurements. As also indicated above, preferably, no personally identifiable information is recorded or transmitted.
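The scheduling rule above, active tests fired on a configured interval and passive tests deferred while an active test holds the network, can be sketched as follows. The class and method names are illustrative, not taken from the SDK:

```python
class TestScheduler:
    """Sketch of the WDS rule that passive tests are deferred while an
    active test is running, so one test does not influence the other.
    Names and structure are assumptions for illustration."""

    def __init__(self, active_interval_s):
        # Interval between active tests, as provided by the configuration server.
        self.active_interval_s = active_interval_s
        self.last_active = float("-inf")
        self.active_running = False

    def should_run_active(self, now):
        # Active tests fire once the configured interval has elapsed.
        return now - self.last_active >= self.active_interval_s

    def should_run_passive(self):
        # Passive tests only read data from the OS, but are still deferred
        # while an active test runs, keeping CPU use minimal.
        return not self.active_running

    def start_active(self, now):
        self.active_running = True
        self.last_active = now

    def finish_active(self):
        self.active_running = False

sched = TestScheduler(active_interval_s=900)  # e.g. one active test every 15 minutes
sched.start_active(now=0)
print(sched.should_run_passive())        # False while the active test runs
sched.finish_active()
print(sched.should_run_passive())        # True again
print(sched.should_run_active(now=600))  # False: interval not yet elapsed
```

Note that this sketch also leaves room for the exception mentioned above, where passive and active tests run at the same time: the caller can simply skip the `should_run_passive` check when appropriate.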
[0036] Active tests on a network are performed by communicating with external test servers. These test servers can host several files of different sizes for performing transfer throughput tests. For sending throughput tests, the test servers can provide an unrestricted endpoint for receiving uploaded files of any size. In addition, the test servers can also echo UDP packets sent from WDS 40 for server response tests.
[0037] It should be understood that while in Figure 2A each application 38 includes an embedded instance of WDS 40, to monitor and test application 38 and/or device 12, WDS 40 can be deployed in several other configurations. For example, Figure 2B illustrates that WDS 40 may instead (or additionally) reside in OS 42 and interact centrally with multiple applications 38. WDS 40 can also reside as a standalone application, or in another location or component of device 12, as shown in dashed lines, with functionality to interact with several (or all) of the applications 38. Similarly, one or more of the applications 38 may additionally have WDS 40 residing in them (also shown in dashed lines), for example, applications 38 that need to have these operations controlled internally instead of being open to an external program, module or routine. WDS 40 can therefore be installed in several different applications (that is, in a weather application and also in a totally different game), and these different applications can potentially be installed on the same phone or on several different phones. This allows for the scenario in which WDS 40 is installed several times on the same phone (for example, as illustrated), in which case WDS 40 must identify that it is obtaining data from the same device 12.
[0038] Turning now to Figure 2C, WDS 40 can also be tailored to specific applications or specific types of media. In the example shown in Figure 2C, a media classification toolkit (MAT) WDS 44 is shown, which is configured to monitor and report on a particular type of media, such as video, particularly its quality of experience (QoE). The MAT WDS 44 provides the ability to collect video, user interaction and device metrics directly from mobile devices 12.
[0039] In the configuration shown in Figure 2C, the MAT WDS software library acts as endpoint middleware. The software library can easily be incorporated into the service provider's video applications 38. Its purpose is to monitor video QoE and report this information to a video QoE server 48 for data storage and processing by system 18. In this example, video content is provided through a video content server 46. The MAT WDS 44 can include several different libraries for particular use cases and analyses. For example, video asset analytics provide a detailed understanding of the video assets and the device used to view the video, providing metrics relating to video resources and device information, for example: manifest bit rate list, count of ABR change requests by bit rate, length of segments received by bit rate, number of segments per bit rate, rebuffering time, device serial number, device manufacturer/model, OS and version, device MAC address, device IP address, network operator ID, location of the device, percentage of free memory on the device, CPU usage of the device, software (library) version, time stamp (start/end times for the viewing period), name of the video resource, URL of the video resource, stream type (HLS, MSS, HDS, MPEG-DASH, etc.), device connectivity type (Wi-Fi vs. mobile), and access point identifier (CID, BSSID, etc.).
[0040] User interaction analytics provide a detailed view of the video experience and user interaction, to provide a better understanding of user engagement with the video application. The feature set for this type of analysis can include: average session duration, buffer underrun (rebuffering) count, server response time, resource playback start time, playback end time, pause start time, pause end time, fast-forward start time, fast-forward end time, rewind start time, rewind end time, etc.
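A small container for a subset of the video-asset and user-interaction metrics listed above can illustrate how such records might be structured on the device. The field names are assumptions for illustration, not the MAT WDS schema:

```python
from dataclasses import dataclass, field

@dataclass
class VideoQoERecord:
    """Illustrative container for a few of the video QoE metrics named
    above; field names are hypothetical, not the actual MAT WDS schema."""
    asset_name: str
    stream_type: str                       # e.g. "HLS", "MPEG-DASH"
    manifest_bitrates_kbps: list = field(default_factory=list)
    abr_switches_by_bitrate: dict = field(default_factory=dict)
    session_duration_s: float = 0.0
    rebuffer_count: int = 0

    def total_abr_switches(self):
        # Count of ABR change requests, summed over all bit rates.
        return sum(self.abr_switches_by_bitrate.values())

rec = VideoQoERecord("trailer", "HLS",
                     manifest_bitrates_kbps=[500, 1500, 3000],
                     abr_switches_by_bitrate={1500: 2, 3000: 1})
print(rec.total_abr_switches())  # 3
```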
[0041] An extended video analysis can also be performed to give better insight into the user's network performance and the impact on the device, providing a better picture of the network performance experienced by the device. The feature set for this type of analysis can include: battery drain rate of the device, customer ID, event record (manifest transfer: order time stamp, URL, transfer time (how long the transfer took), sending throughput, transfer throughput, latency, packet loss, jitter, data usage, signal strength, number of radio frequency conflicts, network connection time, network authentication time), event log (initial request to playback time, set time in ms), etc.
[0042] Video resource monitoring is performed as a software library within a video application 38, and thus there are features that require the library to be properly initialized when application 38 is opened by the user. A feature may require a reference to be passed to a media player, for example, for each media player that is initialized by video application 38. A feature may also require a reference to be passed to a URI or URL, may require an API to be opened for the associated time stamps, may require a reference to be passed to the appropriate click listener, a reference to be passed to the application creation and deletion functions, an API to be opened for associated information, etc.
[0043] The MAT WDS software library inherently uses device resources for data collection. Tests recording memory and CPU usage over a period of more than ten minutes were conducted:
1. Baseline test: This test represents the video playback application being run without playing any video. This provides a baseline showing the activity footprint of the application's user interface and background processes.
2. Video test: This test represents the video playback application during video playback. In this test, video QoE metrics are collected and monitored.
3. QoS test: This test extends the video test by collecting network-specific metrics. Transfer throughput, sending throughput, latency, jitter and packet loss tests are conducted every thirty seconds; increases in activity above the video test represent the impact of the network performance tests on the device.
[0044] For memory usage, the percentage differences between the three tests were found to be reasonably minimal, with the QoS test requiring more memory than the video or baseline test because of the addition of the network rating tests performed every thirty seconds. At most, the difference in memory use between the baseline test and the QoS test was found to be approximately 4.5% at any given time. It should be noted that memory usage by any application on a mobile device 12 can affect the performance and battery consumption of the device.
[0045] For CPU usage, the three tests were also conducted using a device having two cores. The percentage difference between the video and QoS tests was found to be reasonably minimal, and in the baseline test, CPU usage dropped as the device went into a sleep state. At most, the difference in CPU usage between the video test and the QoS test was found to be approximately 1.0% at any given time, on any given processor core. A large difference was observed between both the video and QoS tests and the baseline test, particularly over time. This is because device 12 moves into a sleep or idle state when the user is no longer interacting with application 38. CPU usage by any application on a mobile device 12 can affect the device's performance and battery consumption, similarly to memory usage.
[0046] A data collection configuration is shown at a high level in Figure 3. Each mobile device 12 that is configured to operate WDS 40 (using one or more applications 38) provides data to a collection server 50, which is introduced as part of system 18. The collected data 16 is processed as described in this specification, together with data 16 obtained from other devices 12, to generate information and data for third-party systems 22. Figure 4 provides further details for the configuration shown in Figure 3, in which the collection server 50 collects the data and has the collected data processed by a data processing stage 52. The data thus processed is then provided to a data distribution stage 54, for distribution to third-party systems 22. Figure 4 also illustrates that the data distribution stage 54 can allow system 18 to provide feedback to the mobile device 12, for communication with a device software support feature 56, which is connectable to WDS 40 to complete the feedback loop. Device software support functionality 56 may include the configuration server 124, shown generically in Figure 1, as discussed in more detail below.
[0047] Because WDS 40 can be embedded in multiple different types of applications on multiple different types of devices operating on multiple different types of networks, not only can data be collected from a wider range of sources to provide a more meaningful, comprehensive and complete data set, but a more comprehensive feedback network can also be established, providing, therefore, the ability to reach a wider range of devices 12. This feedback network can be used for several purposes, including modifying the behavior of WDS 40. In addition to modifying the behavior of WDS 40, the feedback can also be leveraged to create a two-way channel between third parties 22 and devices 12, applications 38, and even networks 14 (for example, network devices and infrastructure).
[0048] Figure 5 illustrates a configuration similar to that shown in Figure 4, but showing the collection of data 16 from multiple devices 12 by multiple WDSs 40. As shown in Figure 5, the various mobile devices 12 shown can be served by a common device software support entity 56 and can provide data 16 to a common collection server 50. System 18 can employ multiple regional collection servers 50 and multiple device software support entities 56 if necessary, and as such the example shown in Figure 5 is illustrative only.
[0049] Figure 6 provides a schematic block diagram of a game/application 38, to illustrate the flexibility with which WDS 40 (or MAT WDS 44) can be configured to collect desired data sets and monitor desired KPIs. In this example, application 38 includes an application scaling module 50, which can include code or software functionality in application 38 that is responsible for interacting with WDS 40, 44, for scaling the operations of application 38, for example, when advertising is released, when playing certain types of media, etc. That is, in addition to WDS 40 being configured to be minimally invasive in application 38, application 38 can also be reactive, to allow certain tests to be conducted and/or to use the information gathered by WDS 40 (for example, network quality) to proactively adjust actions in the application to improve the user experience. Figure 6 also illustrates that WDS 40 can communicate with, or receive information from, the processor of device 30, the memory of device 32 and a battery of device 33. As indicated above, this information may also be available by communicating with OS 42.
[0050] By way of example, several WDS modules are exemplified in Figure 6, including a data collection adjustment module 52 for storing parameters (for example, configuration server settings) dictating how / when to run certain tests, a passive test parameter module 54 to store certain passive test methodologies, and an active test parameter module 56 to store certain active test methodologies.
[0051] As discussed above, passive testing methodologies include taking data from device components, such as OS 42, without using network traffic.
[0052] Active tests use network traffic to measure the quality of the network connection. These tests require the WDS 40 to interact with a test server. The test methodology for three exemplary active tests is described below.
Transfer throughput tests [0053] The transfer throughput test is calculated by measuring the time required to transfer, for example, a 2 MB file from a test server over an HTTPS TCP connection. WDS 40 establishes a connection to a test server, which uses, for example, Amazon's CloudFront service to achieve the fastest response. WDS 40 then begins transferring a file of a defined size to measure throughput. WDS 40 calculates throughput by dividing the size of the test file by the time taken to transfer the file completely. This methodology can be used in all tests, except for a continuous throughput test, in which throughput is determined continuously until the transfer reaches a constant speed. This speed can be recorded as the throughput.
[0054] The frequency of the transfer throughput tests can also be adjusted, as can which server is used to transfer the file. The size of the throughput tests can be adjusted using the configuration server 124, according to Table 2 below:
Micro: 250 kB
Small: 500 kB
Medium: 1 MB
Medium-large: 2 MB
Large: 5 MB
Huge: 10 MB
Continuous: transfer until a constant speed is reached
Table 2: Transfer throughput test sizes. [0055] Transfer throughput tests are not designed to measure the maximum throughput of the current connection; instead, a medium-large test (2 MB file) is selected to simulate loading a web page or starting a video session on a mobile device. A continuous test can be used to measure the maximum transfer throughput of a connection.
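The throughput calculation described above, file size divided by the time taken to transfer the file completely, can be sketched as follows. The function name and the choice of Mbit/s as the reported unit are ours, not the specification's:

```python
def download_throughput_mbps(file_size_bytes, elapsed_s):
    """Throughput as the document defines it: test-file size divided by
    the time taken to transfer it completely, reported here in Mbit/s."""
    if elapsed_s <= 0:
        raise ValueError("elapsed time must be positive")
    return (file_size_bytes * 8) / (elapsed_s * 1_000_000)

# A medium-large test (2 MB file) completing in 1.6 s:
print(round(download_throughput_mbps(2_000_000, 1.6), 2))  # 10.0
```

The same arithmetic applies to the sending throughput test below, with the upload duration in place of the download duration.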
Sending throughput tests [0056] The sending throughput test is calculated by measuring the length of time taken to send, for example, a 1 MB file of random bytes to a test server over an HTTPS TCP connection. WDS 40 begins sending a file of a defined size to measure throughput. WDS 40 calculates throughput by dividing the size of the test file by the time taken to complete sending the file.
[0057] The frequency of the sending throughput tests can also be adjusted, as can which server is used to receive the file. The size of the sending throughput tests can be adjusted using the configuration server 124, according to Table 3 below:
Micro: 250 kB
Small: 250 kB
Medium: 500 kB
Medium-large: 1 MB
Large: 1 MB
Huge: 5 MB
Continuous: send until a constant speed is reached
Table 3: Sending throughput test sizes. [0058] The sending throughput tests are not designed to measure the maximum throughput of the current connection; instead, a medium-large test (1 MB file) is selected to simulate sending a medium-definition image to social media. A continuous test can be used to measure the maximum sending throughput of a connection.
Server response tests [0059] A server response test includes a test for latency, jitter and packet loss over a connection. The server response test is calculated by measuring the round-trip time of sending a predetermined number of protocol packets (for example, UDP) to a test server over a protocol connection (for example, UDP). Packets are sent with precise timing in a pre-established sequence by WDS 40, are received by the test server, and are returned immediately to WDS 40. WDS 40 calculates latency based on the total time taken for packets to be sent and received back by WDS 40. Jitter is calculated by measuring the difference, or deviation, in the times at which ordered packets are received. Packet loss is based on packets that are returned out of order or are missing from the sequence.
[0060] The number of packets to be sent, the size of the packets and the destination server can be adjusted using the configuration server 124. Generally, 20 packets of 1 byte each are used to perform the server response tests. Latency is measured for each packet that is sent and received back at WDS 40. From this set of per-packet latency measurements, the maximum, minimum and average latencies are calculated and stored. Similarly, jitter is measured for each set of packets received at WDS 40. From this set of jitter measurements, the maximum, minimum and average jitter are calculated and stored. Packet loss is calculated based on the number of packets not returned to WDS 40 in the same order as they were sent. The packet drop rate is calculated based on the number of packets not returned to WDS 40, regardless of the order received.
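The per-packet statistics described above can be sketched as follows. The document does not give the exact jitter formula used by WDS, so the mean absolute difference between consecutive round-trip times (in the spirit of the RFC 3550 inter-arrival definition) is used here as an assumption; sequence numbers and RTT values are hypothetical:

```python
def server_response_stats(sent_seq, received, rtts_ms):
    """Latency min/avg/max over per-packet round-trip times, jitter as
    the mean absolute difference between consecutive RTTs (an assumed
    formula), loss as packets missing from the returned sequence, and a
    count of packets returned out of order."""
    lost = [s for s in sent_seq if s not in received]
    out_of_order = sum(1 for a, b in zip(received, received[1:]) if b < a)
    return {
        "latency_min_ms": min(rtts_ms),
        "latency_avg_ms": sum(rtts_ms) / len(rtts_ms),
        "latency_max_ms": max(rtts_ms),
        "jitter_ms": (sum(abs(b - a) for a, b in zip(rtts_ms, rtts_ms[1:]))
                      / max(len(rtts_ms) - 1, 1)),
        "loss_pct": 100.0 * len(lost) / len(sent_seq),
        "out_of_order": out_of_order,
    }

s = server_response_stats(
    sent_seq=list(range(5)),
    received=[0, 1, 3, 2],            # packet 4 never came back; 3 and 2 swapped
    rtts_ms=[20.0, 22.0, 30.0, 24.0])
print(s["loss_pct"])        # 20.0
print(s["latency_max_ms"])  # 30.0
print(s["out_of_order"])    # 1
```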
[0061] In the example described in this specification, UDP packets are selected to simulate latency-sensitive applications, such as video calls and VoIP. High jitter indicates a poor user experience and can significantly affect voice and video playback. High packet loss typically indicates network congestion, but it can also be caused by a weak radio signal, or faulty network hardware or drivers. High packet loss most often affects the user experience in audio and/or video playback applications and online games.
[0062] WDS 40 must be implemented to minimize battery usage. In standard configurations, WDS 40 monitoring can reduce battery life by approximately 1%. However, since system 18 is unlikely to actively monitor QoE 24 hours a day, over the course of an average day the consumption can be considerably below 1%. This number can be modified by increasing or decreasing the frequency of the QoE tests.
[0063] Below is an estimate of the battery usage associated with introducing WDS monitoring capabilities into an application 38. Four separate tests were performed on more than 10 different types of devices, with the average results shown in the present specification.
Baseline test [0064] The baseline test is intended to represent an average application being run without any QoE tests running.
Tests every 15 minutes [0065] This test represents the battery usage of a running application, while WDS collects the QoE metric once every 15 minutes.
Tests every 5 minutes [0066] This test represents the battery usage of a running application, while WDS collects the QoE metric once every 5 minutes.
Tests every 2 minutes [0067] This test represents the battery usage of a running application, while WDS collects the QoE metrics once every 2 minutes.
15 minutes: during a 30-minute session, 23 seconds of battery life is lost
5 minutes: during a 30-minute session, 1.1 minutes of battery life is lost
2 minutes: during a 30-minute session, 2.6 minutes of battery life is lost
Table 4: Battery test results. [0068] WDS 40 should also be configured to minimize data usage. In standard configurations, WDS 40 can be configured to use only 25 MB of mobile data per month per device 12, with most data usage being associated with running active tests on the network 14 and exporting the database to the reporting servers. The KPIs that use network data are summarized below:
QoS, sending throughput: 1 MB
QoS, transfer throughput: 2 MB
QoS, latency: 1 kB*
QoS, jitter: 1 kB*
QoS, packet loss (server response): 1 kB*
* 1 kB used to obtain the latency, jitter and packet loss KPIs as part of the server response test.
Table 5: KPIs using network data. [0069] Depending on the client and the network being monitored, different behaviors for the WDS are suitable. The configurations presented below indicate the amount of data used during a period of 31 days.
Configuration A (standard): ~3 MB per day, <100 MB per 31 days
- transfer throughput every 2 hours
- sending throughput every 2 hours
- latency tests every 30 seconds
- jitter tests every 30 seconds
- packet loss tests every 30 seconds
Configuration B: ~16 MB per day, <500 MB per 31 days
- transfer throughput every 15 minutes
- sending throughput every 15 minutes
- latency tests every 30 seconds
- jitter tests every 30 seconds
- packet loss tests every 30 seconds
Configuration C: ~33 MB per day, <1 GB per 31 days
- transfer throughput every 8 minutes
- sending throughput every 8 minutes
- latency tests every 30 seconds
- jitter tests every 30 seconds
- packet loss tests every 30 seconds
Configuration D: ~260 MB per day, <8 GB per 31 days
- transfer throughput every 60 seconds
- sending throughput every 60 seconds
- latency tests every 30 seconds
- jitter tests every 30 seconds
- packet loss tests every 30 seconds
- export data once a day
Table 6: Data usage configurations. [0070] With respect to the CPU, WDS 40 should not have a considerable impact on the device's CPU usage and should not decrease the performance of the wireless device 12.
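The per-configuration data budgets in Table 6 can be approximated with simple arithmetic from the test frequencies. The per-test file sizes used below are hypothetical inputs, since the document gives only the resulting totals, not the sizes used to produce them:

```python
def daily_data_use_mb(download_mb, download_interval_min,
                      upload_mb, upload_interval_min,
                      response_kb, response_interval_s):
    """Back-of-the-envelope daily data estimate: tests per day times
    data per test, summed across the three active test types.
    All per-test sizes here are hypothetical."""
    downloads = (24 * 60 / download_interval_min) * download_mb
    uploads = (24 * 60 / upload_interval_min) * upload_mb
    responses = (24 * 3600 / response_interval_s) * response_kb / 1024
    return downloads + uploads + responses

# Configuration A cadence (throughput pair every 2 hours, server
# response tests every 30 s), with assumed per-test sizes:
print(round(daily_data_use_mb(0.25, 120, 0.1, 120, 1, 30), 2))  # 7.01
```

The estimate is sensitive to the assumed file sizes, which is precisely why the configuration server exposes the test sizes of Tables 2 and 3 as tunable parameters.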
[0071] Regarding data storage and recording frequency, WDS monitoring results are stored locally on device 12 until they are recorded to a server. All recorded data must then be deleted from the wireless device's storage. The recording frequency is configurable using WDS 40, although the default can be set to once per day, for example, every 24 hours while connected to a Wi-Fi network.
[0072] WDS 40 can record data as SQLite files using an HTTPS POST on port 443. Once stored on the server side, the information can be encrypted and protected by firewalls and the intrusion-prevention mechanisms of the private data store. The collection of identifiable user data is an optional feature of WDS 40. When collected, system 18 can process all device identification information to make the information anonymous.
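The store-then-export cycle described above, results kept in a local SQLite database until recorded to a server and then deleted from device storage, can be sketched as follows. Table and column names are illustrative, and the network upload is stubbed out:

```python
import sqlite3, json

# Local store: in the SDK this would be a file on device 12; an
# in-memory database is used here for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE results (id INTEGER PRIMARY KEY, payload TEXT)")
db.execute("INSERT INTO results (payload) VALUES (?)",
           (json.dumps({"kpi": "latency", "value_ms": 42}),))
db.commit()

def export_and_purge(conn):
    """Export all stored results, then delete them locally, as the
    document requires. The real SDK would POST the payloads over
    HTTPS on port 443 to the registration server; here we just
    return them."""
    rows = conn.execute("SELECT id, payload FROM results").fetchall()
    conn.execute("DELETE FROM results")
    conn.commit()
    return [json.loads(p) for _, p in rows]

exported = export_and_purge(db)
print(len(exported))                                             # 1
print(db.execute("SELECT COUNT(*) FROM results").fetchone()[0])  # 0
```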
[0073] Any IP connection used by wireless device 12 can be monitored by WDS 40. This includes, but is not limited to, any Wi-Fi, 2G, 3G and 4G connections. Furthermore, since all monitoring and logging is performed over-the-top (OTT), WDS 40 works regardless of the service provider's consent to such monitoring, which is particularly advantageous for competitive network analysis.
[0074] In one implementation, GPS data is used only if it has been natively activated on wireless device 12. Since many users choose to disable GPS to save energy, when GPS is not available WDS 40 can still access location information collected by the device's native network-based location system (for example, triangulation of cell towers and Wi-Fi access points).
[0075] As shown in Figure 2A, a device 12 can have multiple applications 38, each with WDS 40 incorporated. Figure 7 illustrates an alternative configuration, in which a device WDS manager 60 is included on device 12 to manage the test operations that can be performed by the different WDSs. This can prevent duplicate tests from being performed by the WDSs and allow data to be shared, particularly the data that is used for passive tests. Also, an external connection 62 can be provided to other WDSs 40 or WDS managers 60 on other devices, to allow point-to-point coordination between devices, for example, on the same network 14 or communicating with the same test servers, for greater test efficiency. An instance of WDS 40 can therefore communicate with another instance of WDS 40 on the same device 12, on a nearby device, or on a device 12 on the same network.
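The coordination role of the device WDS manager 60, preventing duplicate tests and sharing passive data between WDS instances, can be sketched as follows. The coordination API here is an assumption for illustration, not the actual interface:

```python
class WDSManager:
    """Sketch of the device-level WDS manager described above: multiple
    WDS instances on one device claim pending tests here so the same
    test is not run twice, and passive data can be shared between them.
    Names and methods are illustrative."""

    def __init__(self):
        self._claimed = set()
        self._shared_passive = {}

    def claim(self, test_name):
        # True for the first WDS instance to ask; later callers skip the
        # duplicate test and reuse the shared result.
        if test_name in self._claimed:
            return False
        self._claimed.add(test_name)
        return True

    def publish_passive(self, key, value):
        self._shared_passive[key] = value

    def get_passive(self, key):
        return self._shared_passive.get(key)

mgr = WDSManager()
print(mgr.claim("download_throughput"))  # True  (first WDS runs it)
print(mgr.claim("download_throughput"))  # False (second WDS reuses result)
mgr.publish_passive("signal_strength_dbm", -67)
print(mgr.get_passive("signal_strength_dbm"))  # -67
```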
[0076] Further details regarding the functional blocks shown in Figures 4 and 5 are provided in Figure 8. Starting with the mobile device 12, WDS 40 is, in this example, incorporated into a mobile application 38 and includes a software interface 84 to promote interaction between application 38 and a software controller 86, to control tests and other operations of WDS 40. WDS 40 also includes a test data store 88 to store data obtained during tests, and a test SDK 90 to perform one or more particular tests, which involve the operation of application 38 and/or of the device itself via the device OS 42. WDS 40 also includes a utilities SDK 92, which includes methods, functions and APIs that can be used to extract data and information from the device OS 42. These methods can also be used to export data to the collection server 50.
[0077] The utilities SDK 92 is also operable to communicate with the collection server 50. The collection server 50 includes a registration server 96, to receive test and other data being registered by WDS 40, and a log database 98, to store test data for use by the data processing module 52.
[0078] The data processing module 52 includes a central data services (CDS) server 100, which provides data source APIs for different third-party data sources and metadata. The CDS server 100 can also provide storage for quick responses to data aggregation operations. The CDS server 100 also interacts externally with one or more third-party data sources 64 and internally with the data aggregation module 60 discussed above. The data aggregation module 60 obtains (that is, extracts, requests or otherwise receives) the data collected by the collection server 50. The data aggregation module 60 also performs the aggregation of the various data and data types and stores the aggregated data in a report database 104, which is accessed by a report generation module 106 to generate various types of reports, dashboards, etc. It should be appreciated that data can be extracted from third-party data sources and not just from the collection server. For example, external databases can be consulted because they help translate latitude and longitude into the names of the cities in which the data was collected.
[0079] The report generation module 106 can generate several types of data for distribution to third parties 22, as shown in Figure 8. For example, the report generation module 106 can generate reports 110 and/or dashboards 112, and can prepare raw data 114 to be analyzed elsewhere. The report generation module 106 can also prepare feedback data 116 to be sent to the device support software 56; in this example configuration, to a feedback server 126, which is part of that device support software 56.
[0080] The device support software 56 may include multiple servers, which can communicate with and control, monitor, update, secure, eliminate, or otherwise interact with WDS 40 on the various devices 12. In this example, the device support software 56 includes the feedback server 126 mentioned above, as well as the configuration server 124, to manage the settings for WDS 40, and an authentication server 122, to authenticate WDS 40 to ensure that it is from a suitable application or application developer. The device support software 56 also includes a test server 120 to interact with the test SDK 90, to provide and update/configure tests and test suites to be run by WDS 40.
[0081] WDS 40 can be configured as a software library, which is incorporated into mobile device applications 38 to register and integrate with the collection server 50 and the data processing module 52. WDS 40 libraries can be added to an existing application for the collection of device, connection, network QoS and Wi-Fi KPIs. As shown in Figure 8, WDS 40 interacts with the device software support entity 56, which can include different servers with which WDS 40 can communicate during its operation. The exemplary configuration shown in Figure 8 includes the servers responsible for authentication and initiation (authentication server 122), configuration (configuration server 124), testing (test server 120) and registration (registration server 96), which communicate with WDS 40. The authentication server 122 can be used to dictate which application programming interface (API) keys and applications 38 are allowed to operate and collect data via WDS 40. The configuration server 124 can be used to establish specific rules and parameters for the operation of WDS 40. WDS 40 can also use the test servers 120 to run active tests on the connected network 14. The registration servers 96 are used to send data payloads from WDS 40 to system 18.
[0082] As indicated above, the authentication server 122 can be used to verify that applications 38 are using the correct API key for each developer, and to provide each application with a unique installation key. Each application developer can receive an API key, which is used to generate a unique installation key for each application 38. This installation key is used to control the configuration of WDS 40, as well as to track the data collected by each application 38.
[0083] The authentication server 122 can also check that application 38 has not previously registered with system 18. This ensures that the data collected by WDS 40 is associated with the correct application 38 and developer, for example, to justify a revenue share. The authentication server 122 also provides control for suspending specific applications or developers from collecting data at any time, for example, by implementing a deletion key.
[0084] WDS 40 can be configured to check in with the authentication server 122 on the first initialization of WDS 40 and, periodically (for example, every few days), on subsequent initializations. This allows the authentication server 122 to prevent any application 38 from collecting data 16. All communications and data transferred between WDS 40 and the authentication server 122 are preferably protected and encrypted. For example, WDS 40 can maintain a three-day local cache on device 12, to avoid WDS 40 having to check with the authentication server 122 on every startup, preventing extra traffic or chattiness over the network 14.
[0085] As discussed more generally above, the test servers 120 are used to run active tests on a network 14 through interaction with WDS 40. The test servers 120 can host several files of different sizes for performing transfer throughput tests. For sending throughput tests, the test servers 120 can provide an unrestricted endpoint for receiving files of any size. Furthermore, the test servers 120 can also echo packets of the corresponding protocol (for example, UDP packets) sent from WDS 40 for server response tests. Multiple test servers 120 can be established as needed around the world. The test servers 120 can be deployed in a cloud or on-premises hosting environment. WDS 40 determines which server 120 to use to perform active tests by selecting the most suitable server 120, based on the geographic location of the servers. For example, the nearest route may need to use a submarine cable, while a server a little further away may be reachable over faster terrestrial cable (that is, more than just geographic proximity should be considered). The test servers 120 used by WDS 40 can be configured by the configuration server 124. All communications and data transferred between WDS 40 and the test servers 120 are preferably protected and encrypted.
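A first-pass server choice by geographic distance can be sketched as follows, using the great-circle (haversine) distance between the device and each QoS test server's configured latitude and longitude. As the paragraph above notes, distance alone is not sufficient (undersea versus terrestrial routes), so a real implementation would also weigh measured latency; the server records below are hypothetical:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth (radius ~6371 km).
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))

def pick_test_server(device_lat, device_lon, servers):
    """Choose the geographically nearest QoS test server; a sketch of
    the first criterion only, since route quality also matters."""
    return min(servers,
               key=lambda s: haversine_km(device_lat, device_lon,
                                          s["lat"], s["lon"]))

servers = [{"url": "qos-eu.example.net", "lat": 50.1, "lon": 8.7},
           {"url": "qos-us.example.net", "lat": 39.0, "lon": -77.5}]
print(pick_test_server(48.9, 2.4, servers)["url"])  # qos-eu.example.net
```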
[0086] Configuration server 124 is designed to provide full control over WDS 40. Configuration server 124 allows system 18 to adjust the data collection frequencies, data logging frequencies and types of sensed data collected by devices 12 in the field. Each WDS deployment can be assigned a unique deployment key, used by WDS 40 to periodically check which data collection/logging behaviors WDS 40 should adhere to. This provides dynamic adjustment of WDS 40's behavior to fine-tune battery consumption, network chattiness and other parameters.
[0087] A configuration profile maintained by configuration server 124 is transferred to WDS 40 upon initialization of WDS 40. For example, configuration server 124 may hold a new policy that says "Do not collect data in country X". This new policy, or new data collection profile, will be transferred to and enforced by WDS 40. A new configuration profile is pulled into WDS 40 at a specific frequency. WDS 40 can also keep a local cache of the configuration on device 12 (for example, for three days), to prevent WDS 40 from pulling the configuration from configuration server 124 too often. All communications and data transferred between WDS 40 and configuration server 124 are preferably protected and encrypted.
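The local configuration cache described above can be sketched as a simple time-to-live wrapper. This is an illustrative sketch, assuming a hypothetical `fetch_fn` callable standing in for the network request to configuration server 124; the three-day TTL comes from the example in the paragraph above.

```python
import time

CACHE_TTL_SECONDS = 3 * 24 * 3600  # three-day local cache, per the example above


class ConfigCache:
    """Sketch of the WDS-side configuration cache.

    `fetch_fn` is a hypothetical callable standing in for the request to
    configuration server 124; `clock` is injectable so the TTL behavior
    can be exercised without waiting.
    """

    def __init__(self, fetch_fn, ttl=CACHE_TTL_SECONDS, clock=time.time):
        self._fetch_fn = fetch_fn
        self._ttl = ttl
        self._clock = clock
        self._config = None
        self._fetched_at = None

    def get(self):
        # Only contact the configuration server when the cache is stale,
        # preventing WDS from pulling configuration too often.
        now = self._clock()
        stale = self._fetched_at is None or now - self._fetched_at > self._ttl
        if stale:
            self._config = self._fetch_fn()
            self._fetched_at = now
        return self._config
```

A fetch inside the TTL window returns the cached profile without touching the server; once the TTL elapses, the next `get()` refreshes it.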
[0088] The configuration file and data can be signed by the service with a known, trusted security certificate. The signature is passed along with the configuration from the configuration server and verified by WDS 40 on device 12. WDS 40 can then attempt to match the signature from the configuration server with one generated locally on device 12, using the same certificate used on the server side. If the signature generated on WDS 40 does not reproduce the one provided by configuration server 124, WDS 40 can be configured to discard the configuration and continue to use the previous configuration, or a default. This joint signature check between server 124 and WDS 40 ensures that the configuration has not been compromised. Compromising the configuration provided to WDS 40 could have varying degrees of impact on the user's device, the amount of data used, battery drain, etc.
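The joint signature check described above, including the fallback to the previous configuration on mismatch, can be sketched as follows. As an assumption for illustration, an HMAC over a shared key stands in for the certificate-based signature the specification describes; the key and payload bytes are made up.

```python
import hashlib
import hmac

# Sketch only: an HMAC stands in for the certificate-based signature.
# The shared key and configuration payloads below are assumptions.

def verify_and_apply(new_config: bytes, server_signature: str,
                     shared_key: bytes, previous_config: bytes) -> bytes:
    """Recompute the signature locally and compare it with the one the
    server provided. On a match, the new configuration is applied; on a
    mismatch, it is discarded and the previous (or default) configuration
    remains in use."""
    local_signature = hmac.new(shared_key, new_config, hashlib.sha256).hexdigest()
    if hmac.compare_digest(local_signature, server_signature):
        return new_config
    return previous_config  # discard the possibly compromised configuration
```

`hmac.compare_digest` is used for the comparison so that the check itself does not leak timing information.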
[0089] With the configuration shown in Figure 8, the following process flow can be implemented. On boot, WDS 40 can check with authentication server 122 whether or not it should run. WDS 40 then pulls a configuration file from configuration server 124 to direct the operation of WDS 40. Data is then collected by WDS 40 through interaction with the device's OS, to capture various KPIs on the device, network connection, network QoS, Wi-Fi scanning information, application data usage, etc., as discussed in this specification. WDS 40 can also perform network performance tests against the test server or servers 120.
[0090] The data collected by WDS 40 is stored in a database (for example, SQLite) for a particular period of time, for example, a period of 24 hours. The database is then exported to the registry server or servers 96. The registry servers 94 can parse the database to split the data into different tables, for example, within BigQuery. In this example, the data is stored in several BigQuery registration tables, depending on the type of data. Periodically, for example, hourly, data flow tasks can be performed to add additional metadata to the raw data sent from WDS 40. This metadata includes annotating the raw data with country, region and city metadata, etc. Once the data is processed by the data flow tasks, it is made available in various tables and views. These tables and views provide raw data export views and standardized report generation with other tools, as described in this specification. It should be understood that standardized reports, customer reports, customer dashboards and raw data can all be made available by a combination of reports and customer dashboards, or through different views and table exports (for example, from BigQuery).
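The buffer-then-export path above (local SQLite storage, later split into per-type tables) can be sketched minimally as follows. The table name, column names and record types are illustrative assumptions, not the specification's schema.

```python
import sqlite3

# Sketch of the collection path: test records are buffered in a local
# SQLite database on the device, then exported split by record type, as
# the registry server would divide them into different tables.

def buffer_record(conn, record_type, payload):
    """Append one collected record to the local buffer table."""
    conn.execute("CREATE TABLE IF NOT EXISTS buffer (type TEXT, payload TEXT)")
    conn.execute("INSERT INTO buffer VALUES (?, ?)", (record_type, payload))


def export_by_type(conn):
    """Return the buffered records grouped by type, mirroring the split
    into per-type registration tables described above."""
    rows = conn.execute("SELECT type, payload FROM buffer").fetchall()
    tables = {}
    for record_type, payload in rows:
        tables.setdefault(record_type, []).append(payload)
    return tables
```

In the real system the export target would be the registry servers rather than an in-process dictionary; the grouping step is the part being illustrated.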
[0091] As illustrated in Figure 9, collection server 50 is configured to collect data from multiple mobile devices 12, with report server 96 interacting or otherwise communicating with WDS 40 on each of the multiple devices 12. It should be understood that, while collection server 50 can communicate with multiple devices 12, the broader system can include multiple collection servers 50, for example, placed regionally, with each collection server 50 being able to communicate with processing module 52. Figure 9 also illustrates that feedback data 116, generated by reporting module 106, can be provided to configuration server 124, to allow configuration server 124 to control the behavior of WDS 40 on each device 12.
[0092] Configuration server 124 can also use third-party data sources 64 and human input 82 to control the operation of WDS 40 on any one or more of devices 12. Third-party data sources 64 and human input 82 may be provided by any party authorized to control configuration server 124 for a particular purpose. For example, an administrator of system 18 or a third party 22 (for example, a game/application developer) can access feedback data 116 in the form of usage and performance data and apply human input 82 directly to configuration server 124, to modify the behavior of a target application 38 (using WDS 40 or other functionality).
[0093] Figure 10 illustrates the data flow in the joining, aggregation and analysis of data from mobile devices 12, as described in this specification. In step 200, the mobile application (or the operating system, etc.) containing WDS 40 starts WDS 40 to begin collecting test data on the mobile device 12. It should be understood that, as shown in Figure 2, OS 42 or other components of device 12 can be used to start WDS 40 to begin data collection. The operation of WDS 40 is, in this example, controlled by a controller 202, which coordinates authenticating WDS 40 in step 206 and obtaining the appropriate configuration in step 204. Controller 202 can also coordinate obtaining information from OS 42 in step 208, for example, to collect data in step 210 for passive testing. Data collection in step 210 is also done based on network tests conducted in conjunction with device support software 56 in step 212.
[0094] The collected data is stored in step 214 and sent to system 18 in step 216. The sent data is collected in step 218 for subsequent aggregation, analysis, processing, registration, etc.
[0095] The data can be aggregated by appending the sent data to a large set of tables, for example, partitioned by day. The large set of tables can then be queried according to certain variables. In one configuration, data for all applications 38, devices 12 and networks 14 can be placed in the same data store, and can be grouped in various ways, depending on what is meant to be shown in reports, dashboards, etc.
[0096] The data is analyzed in several ways by system 18 and/or by third parties 22. For example, data 16 can be broken down by country, region, city, etc., as well as by time period (for example, by month). Groupings can also be made by network type (2G versus 3G versus 4G) and the statistics determined and displayed for those groups. Groupings can also be made to determine application package names and application names. It should be understood that determining application names is not trivial, since a single application can have multiple packages as part of its installation, and also different names in different languages. System 18 is configured to join packages to obtain a single-language list of application names and their associated package names (since package names are globally unique). Groupings can also be prepared for service providers based on MCCs and MNCs. This allows brands to be associated with operators for a given network 14, instead of relying only on the network 14 reported by device 12 (for example, since there may be a roaming situation or another scenario in which the provider listed by devices 12 may be inconsistent).
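The package-name join described above, which relies on package names being globally unique while display names vary by language, can be sketched as follows. The package names, localized names and the preference for English are illustrative assumptions.

```python
# Sketch of joining localized observations into a single-language list of
# application names keyed by globally unique package names. All names
# below are made-up examples, and "prefer English" is an assumption.

def unify_app_names(observations):
    """observations: iterable of (package_name, localized_app_name, language).
    Returns one display name per package, preferring an English name when
    one has been observed."""
    by_package = {}
    for package, name, language in observations:
        current = by_package.get(package)
        # Keep the first observation, but upgrade to English if seen later.
        if current is None or (language == "en" and current[1] != "en"):
            by_package[package] = (name, language)
    return {package: name for package, (name, language) in by_package.items()}
```

Note that two packages of the same application ("com.example.game" and "com.example.game.data" below) remain separate keys, reflecting that a single application can install multiple packages.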
[0097] System 18 can therefore combine data sent from several different mobile applications 38 and deployments on various devices across various networks, regions, etc. System 18 is also capable of deriving data 16 that is specific and relevant to particular game and application developers, network operators, device manufacturers, etc., by parsing or packaging this comprehensive data set accordingly. System 18 is also capable of extracting additional metadata from several other third-party sources and open data 82. System 18 can generate raw data files, as well as provide views through user interfaces (for example, dashboards).
[0098] For example, a set of data flow tasks can be used to add additional metadata to the raw data being sent from WDS 40. These data flow tasks can be performed periodically, for example, hourly over the last hour of data sent from WDS 40. The results can then be grouped into daily tables at a particular time, for example, midnight GMT (Greenwich Mean Time), for querying.
[0099] The following is a summary of the processes that can occur during different data flow tasks:
1. For many fields, enumerations can be used in WDS 40 to simplify and reduce the amount of data sent. Data flow tasks can be used to exchange the enumerations for human-readable strings.
2. Country, region and city indicators can be added to the data based on the recorded latitude and longitude.
3. The geohash can be calculated for the recorded latitude and longitude.
4. The remaining device storage and the remaining device memory can be calculated.
5. The mapping of MCC and MNC to a service provider brand can be added.
6. The mapping of an application package name to an application name can also be added.
[00100] It should be understood that several open and paid third party sources can be used to complement the raw data collected by WDS 40.
[00101] Data reports can therefore be prepared in a variety of ways and, if desired, other third-party data sources 82 can be incorporated. Since the data is collected from various WDSs 40, embedded within various types of applications running on various types of OSs 42 and device types, all on, across and/or interacting with various types of networks 14 and regions, a more comprehensive view of how a network, device, application, operating system or electronic environment generally behaves can be obtained. The data that is collected and stored can be queried in various ways for many purposes, to suit the needs of different third parties 22 wishing to access this more comprehensive and more complete data set. Since WDS 40 can be embedded in various types of applications 38, such as games, which enjoy wide circulation and reach across multiple platforms and regions, an unobtrusive tool is introduced that can leverage the joining of the desired data on a periodic and continuous basis, without adversely affecting the performance of devices 12 or applications 38.
[00102] In addition to providing a system that allows collaborative data from various devices across many applications, networks and environments to be gathered and analyzed, in a way that provides more meaningful and complete data without excessively impacting the devices and networks within which the system operates, as described above, several other applications, configurations and use cases, using or configuring the associated system 18, are described below.
User-informed testing [00103] System 18 described above contemplates testing networks 14 and generating test data in a few different ways, namely:
a) requesting information from the mobile device's OS 42 (that is, device API calls);
b) creating network traffic and performing active tests; for example, determining throughput by transferring a file from a controlled test server 120 and then measuring the performance of that owned and controlled transfer. In this case, the network traffic being analyzed was created for the specific purpose of running a test.
c) observing network traffic initiated by the user or by some other mobile device service, which was not generated for the specific purpose of performing a test, that is, passive testing; for example, a network testing service can examine how quickly a user is able to upload a photo to Facebook or download a video from YouTube, and then determine performance by passively measuring these uncontrolled operations.
[00104] It is recognized that access to more information about the user makes it possible to improve these types of tests. For example, the actions, behaviors or locations of users (or mobile services) can dictate which of the three types of tests to perform. These same actions, behaviors or locations can also provide additional information, which can inform the testing approach or how the results should be interpreted to generate more valuable and accurate insights.
[00105] Traditionally, passive testing has been found to be less accurate than active testing. This is because less is known about the traffic being analyzed, that is, the passive test is less controlled. System 18, described in this specification, can be configured to run tests on the network that are initiated by user actions or informed by user actions. This can be done by obtaining or otherwise accessing additional information from the user or the mobile service, which can greatly improve the passive test (and testing in general). This is because mobile apps 38 can track user actions, such as a user clicking a button to upload a photo. When the mobile app 38 notices that a user has clicked the send-photo button, it can perform a passive test on the network during this upload, knowing: 1) that it was a photo; 2) the size of the photo being sent; and 3) the address of the destination server. In other words, the mobile app 38 and WDS 40 are in a position to leverage a better understanding of the nature of the file transfer, to perform a more effective and accurate passive performance test. This can be done, for example, by having WDS 40 use an API to ingest information from the mobile app 38. In this way, the mobile app 38 passes information to WDS 40, such as that the user has just clicked a button to send a photo of size x. Access to this information provides context that may not have been previously available for passive testing, for example, when a file was sent, whether it was a photo, the resolution or size of the photo, or the destination server and routing details.
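The core calculation enabled by this extra context, computing throughput from a transfer the user initiated anyway once the app has disclosed the payload size, can be sketched as follows. The payload size and timestamps in the example are made-up values.

```python
# Sketch of a user-informed passive throughput measurement: the app
# reports that a photo of known size was just uploaded, so throughput can
# be computed from observed start/end times without generating any test
# traffic. All values in the example are illustrative.

def passive_throughput_mbps(payload_bytes, start_time_s, end_time_s):
    """Throughput in megabits per second for an observed transfer of a
    payload whose size the application disclosed (for example, a photo
    upload)."""
    duration = end_time_s - start_time_s
    if duration <= 0:
        raise ValueError("transfer must have positive duration")
    return (payload_bytes * 8) / (duration * 1_000_000)


# Example: a 2.5 MB photo observed uploading over 2 seconds.
measured = passive_throughput_mbps(2_500_000, 10.0, 12.0)
```

Without the disclosed payload size, a passive observer would have to estimate the transferred volume, which is exactly the uncertainty the user-informed approach removes.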
[00106] System 18 can therefore be adapted so that the user's interaction with a mobile service dictates what type of passive network test to perform and how to interpret the results. For example, if the user sends a photo on a particular mobile service, such as Instagram, system 18 can use the additional information to perform a passive network test that is designed to monitor the network's ability to handle photo uploads. This additional information can be provided by a mobile application 38 and is typically provided by the mobile application 38 that contains the network test code, although other sources of this additional information are possible. In this case, the system's passive test will have access to additional information, such as: 1) that the user is trying to send a photo; 2) the size of this photo; and 3) the destination server, etc.
[00107] It should be understood that user-informed testing need not be limited to passive network tests. The behavior, characteristics, location, etc. of the mobile user can dictate the specific active tests that should be performed, based on the types of tests desired by the system controller. User-informed testing also allows the system to consider when an active test or a passive test is most appropriate. For example, it may be better to run only passive tests, which do not create new network traffic, when the user is watching a video or doing something with device 12 that is sensitive to network performance. In other words, this additional information and user-informed testing can help dictate when and where tests should be performed so as to: 1) not interfere with the user experience; or 2) provide the information that is most needed by the system.
[00108] Furthermore, as wireless networks increasingly become virtualized or software-defined, user-informed test results can be used to modify or dictate the hardware, software or implementation of network 14 itself, informing network requirements based on the services and applications 38 being used by users and the actions they are taking.
[00109] System 18, described in this specification, can therefore be used to perform user-informed/user-dictated testing, that is, testing performed when the user does not specifically select to perform a network test. In this case, network tests are selected and initiated based on actions taken by a user of a mobile device 12 that contains the network test software (for example, the transfer of a photo). The details of these user actions can be used as an input for analyzing the results (for example, a network's ability to deliver a photo). The action taken by the user is something other than the user selecting to perform a network test.
[00110] It should be understood that although the examples mentioned above are framed in terms of a user and the buttons that the user selects in an application, a non-human service can also provide the additional information.
Device churn tracking & advertising [00111] The systems and methods described above contemplate tracking mobile devices 12 as they access and use mobile networks 14. These mobile devices 12 and their users can be identified and tracked in various ways, including:
a) the mobile device ID: for example, MAC address, IMEI or IMSI of the mobile device;
b) the device's advertising ID: the advertising ID or IDFA are non-persistent IDs of the mobile device 12, used to deliver targeted mobile advertisements;
c) cookies: IDs that are installed on devices as they access and use networks and network services;
d) the mobile software ID (or WDS ID): a unique ID generated by the mobile device to identify a specific installation of the software;
e) an ID used to sign in to the mobile software: for example, a Facebook ID, a Netflix ID or a Gmail ID, which is used by a user to sign in to a mobile application 38;
f) a set of behavioral characteristics: for example, a set of characteristics that can be defined based on various factors, which may include device locations, IP addresses used by the device, or WiFi/cellular access points commonly used by the user.
[00112] Each device tracking approach has its own privacy implications, which typically need to be considered and managed. That is, a selected tracking approach will normally need to be both acceptable to the user of the mobile device and compliant with applicable legal requirements.
[00113] By tracking how these IDs flow through networks 14, system 18 can be used to inform wireless service providers about user churn (cancellation). For example, if an application ID is used to register a phone on a first network 14a one day, and later the same application ID is used to register a phone on a second network 14b, then that user can be registered as likely churned. That is, in that case, it can be expected that the user left the first network 14a and became a customer of the second network 14b. This churn record alone provides a valuable service for wireless providers. However, the record is even more powerful when combined with other data sets to enable predictive capabilities, which create the possibility of advertising to influence churn.
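The churn signal just described, the same application ID registering on a different network than it was last seen on, can be sketched as a scan over chronological registrations. The IDs and network names in the example are made up.

```python
# Sketch of the churn signal above: the same application ID seen
# registering on network 14a one day and on network 14b later suggests
# the user switched operators. IDs and network names are illustrative.

def detect_churn_events(registrations):
    """registrations: chronological list of (app_id, network) pairs.
    Returns a list of (app_id, old_network, new_network) churn events."""
    last_network = {}
    events = []
    for app_id, network in registrations:
        previous = last_network.get(app_id)
        if previous is not None and previous != network:
            events.append((app_id, previous, network))
        last_network[app_id] = network
    return events
```

In a full system each detected event would be only a candidate, to be combined with coverage, subscription and social-media signals as the following paragraphs describe.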
[00114] For example, this historical network churn information, when combined with other information sets, such as wireless network coverage, wireless network performance, network website cookies, recent searches, mobile device hardware/software, user network subscription plans, what people are saying about the wireless operator on social media, and other information sets, can be used to perform churn prediction for individual users or for large aggregate segments of the population.
[00115] This allows better targeted advertising by wireless operators to users who are: 1) high-probability candidates to leave their own networks 14; or 2) high-probability candidates to leave competing networks 14. The same mobile IDs can be used to target specific users or IDs with appropriate advertisements.
[00116] As an example, the system's wireless network performance tests can be used to compare networks and drive targeted advertising campaigns. If the second network provider finds that it has the better wireless network in a specific city, it can tailor its advertising to devices in that city, to promote its network as the highest performer. It is also possible for mobile applications 38 and services to recommend wireless operators to their users. Users can choose to allow a mobile service, such as Facebook, to track network performance, usage patterns and location, and then suggest to them the best wireless network 14 for their needs.
[00117] As an alternative approach to user churn tracking, system 18 can track which groups of mobile devices 12 tend to be present on specific networks 14. For example, if the same four mobile devices consistently access the same WiFi access point, or access networks from the same IP address, it is reasonable to assume that this is a household or an associated group. If, suddenly, one of these devices 12 leaves the group and a new device 12 appears that is authenticated with a different cellular wireless network 14, it can reasonably be inferred that a network churn event occurred for the user of that newly appeared device.
[00118] As such, tracking one or more IDs associated with a user or device 12, and gaining access to, or otherwise tracking, user-related events, such as social media posts, can improve churn identification and churn registration and/or targeted advertising. System 18 can be adapted for this churn prediction by tracking a user as he or she moves through networks 14 and mobile devices 12, using social media login IDs, so that a network/device churn analysis can be made.
Network neutrality and service level agreement tracking [00119] Tracking wireless network performance with system 18, which can be done through collaboration with mobile endpoints as described above, can also be used to determine which areas, users or services are being throttled, as well as which areas, users or services are being provided with the best service levels.
[00120] The identification and comparison of low-performance and high-performance cases can be used, for example, in several ways:
a) to inform cities and governments which areas are being adequately served by wireless service providers; wireless regulators often require carriers to provide certain levels of service to underprivileged rural areas and/or neighborhoods, and violators can be identified and penalized using the test data;
b) to inform Mobile Virtual Network Operators (MVNOs) whether or not a host network is providing adequate levels of service, or whether the host network operator is providing inferior service to MVNO subscribers as compared to its own; this allows the MVNO to determine whether its host operator is in breach of service level agreement (SLA) rules;
c) to inform wireless networks 14 about which networks 14 will host their subscribers when roaming, whether or not those roaming networks 14 are adhering to or violating the SLAs, and how the quality of experience received by their subscribers while roaming compares with the quality being received by those networks' home subscribers;
d) to determine whether or not net neutrality laws are being adhered to or violated; for example, it can be noted whether a network operator is throttling a third-party streaming service while promoting its own streaming service, and to what degree.
[00121] System 18 can therefore be adapted so that the results of network tests or the quality of service are compared with a quality threshold dictated by a wireless regulator or a host network provider, to see if the requirements are being met.
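The threshold comparison just described can be sketched as follows: per-service throughput samples are reduced to medians and checked against an SLA threshold, and a service running far below its peers may indicate throttling. The service names, sample values and threshold are illustrative assumptions.

```python
from statistics import median

# Sketch of the SLA/neutrality check above: reduce per-service throughput
# samples to medians and flag those below a regulator- or SLA-dictated
# threshold. All names and numbers are illustrative.

def sla_violations(samples_by_service, threshold_mbps):
    """Return, sorted, the services whose median throughput falls below
    the given threshold."""
    return sorted(
        service
        for service, samples in samples_by_service.items()
        if median(samples) < threshold_mbps
    )
```

A third-party streaming service whose median sits far below a comparable first-party service on the same network would surface here as a candidate throttling case for further investigation.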
Event-triggered testing - autonomous driving / cyber-physical vehicles [00122] Network quality and coverage are often considered critical for certain emerging cyber-physical domains, such as autonomous driving vehicles and cyber-medicine. In such cases, the end mobile device 12 has a core purpose that is network-sensitive. It is important that these devices 12 maintain access to network quality that is good enough to satisfy their core requirements. For example, a cyber-medicine device, designed to inform hospitals of heart attacks, must be able to send a message to hospitals or emergency dispatchers when a heart attack is detected. [00123] The network test capabilities of these devices 12 can then be considered critical to their performance, with testing being triggered by events that are inherent to the end device's core purpose.
[00124] In one example, an autonomous driving vehicle or vehicle network can select to perform tests at times when the vehicles do not need to perform emergency maneuvers (for example, avoiding an animal or other obstruction on the road), to track the network performance available for such maneuvers. Alternatively, the vehicle fleet can run tests only in cases where it is known that there are parts of the road or route where network performance information is missing. In such cases, a network test system can have its tests triggered by external events. The resulting network data set can be combined with information about the cyber-physical device's operation and requirements, to determine whether network 14 is suitable for those cyber-physical device requirements.
[00125] In another example, a cyber-medicine device 12 can perform event-driven tests on network 14, to ensure that network 14 is performing well enough to meet the network requirements of an emergency (and that the device is connected to the appropriate server). Exemplary events can be, in this case: 1) the user is sleeping or the user is not at immediate health risk; 2) the user's health indications are reaching dangerous levels that can worsen; and 3) the user is in danger.
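The event-to-test mapping for the cyber-medicine example can be sketched as a small policy table. The event names come from the three exemplary events above; the specific test intensities assigned to them are assumptions used only to illustrate event-driven test selection.

```python
# Sketch of event-triggered test selection for the cyber-medicine
# example. The mapping of events to test intensity is an assumption:
# thorough tests in safe windows, and no test traffic when the network
# must stay free for an emergency alert.

TEST_POLICY = {
    "user_sleeping": "full_active_test",      # safe window: run thorough tests
    "health_degrading": "lightweight_check",  # verify the emergency path works
    "user_in_danger": "no_test",              # keep the network free for alerts
}


def select_test(event):
    """Return the test to run for a device-inherent event; unknown events
    default to passive observation only."""
    return TEST_POLICY.get(event, "passive_only")
```

The point of the sketch is that the trigger is inherent to the device's core purpose, not a fixed schedule: the same device runs its heaviest tests precisely when its core function is least at risk.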
[00126] It should be understood that in applications such as autonomously driven vehicles, devices 12 are in an excellent position to map network quality over huge areas and, therefore, may underpin or otherwise play a greater role in future network testing. It should also be understood that vehicles are not limited to automobiles and may include drones or other autonomous devices.
Privacy in mobile device testing [00127] Mobile devices 12 used to perform network testing typically need to be able to maintain user privacy to the extent that users have been informed. For example, if a user indicates a choice to opt in to or out of the service, or parts of it, the entire system must respect that choice and adjust what is collected accordingly. The analysis and handling of this data must also be informed by these same user inputs.
[00128] System 18 can also be adapted to ensure that it is able to consume information about jurisdictional and geographical differences in privacy rules and to be responsive to those rules. For example, a global testing system may operate differently in Russia than in the European Union, depending on the current privacy legislation in each area.
[00129] It may also be important for system 18 to orchestrate the tests performed across the network, including determining the endpoints for testing, so as to maintain user privacy. For example, system 18 may choose to distribute tests among mobile devices 12 in a way that makes it more difficult to track the movement or characteristics of a specific device 12. Or, for example, if a specific area is known to be private property with a very low population density, system 18 can be configured to treat data from that area differently, or not to collect data from that area, since it would be easier than normal to associate tests done in this low-population area with the person or persons known to live in or have access to that area. There may also be specific geographic areas where it is illegal to run tests or measure location, and system 18 may need to be adapted accordingly.
MIMO / SON - Interference elimination and beamforming [00130] Multiple-input multiple-output (MIMO) and SON 22b systems can have several channels available, all of which need to be evaluated. MIMO and SON 22b systems can also use beamforming to direct specific channels and network resources to specific mobile devices 12, that is, based on their unique requirements. Therefore, each user on network 14 can experience something completely different, so the importance of collaborative network quality information increases.
[00131] The collaborative information from the mobile devices 12 themselves can then be used to inform network 14 about the network characteristics that need to be delivered to each mobile device 12 and how this beamforming needs to occur (generally, based on the application being used or the user's subscription). As waveform selection and beamforming take place, information from the experiences of the mobile device's applications and the network (collaborated through system 18) can be used in a feedback loop to inform the waveform and beamforming processes.
[00132] In other words, beamforming allows each user to be given access to different network characteristics. However, to understand whether this is working well, there needs to be a feedback loop informed by network collaboration, as described in this specification.
Security [00133] Abnormal mobile device behavior: the network test/monitoring agent (for example, WDS 40) can be used to detect/identify compromised mobile devices 12. For example, if WDS 40 observes that a mobile device 12, or an IoT device 12, normally uses only 2 MB/day of data, and that usage then suddenly rises to 100 MB, system 18 can be used to identify this abnormal network behavior and flag device 12 as possibly compromised.
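The abnormal-usage check just described can be sketched as a comparison of today's usage against a historical baseline. The multiplier is an illustrative assumption; the 2 MB/day and 100 MB figures in the example come from the paragraph above.

```python
# Sketch of the abnormal-usage check: a device that normally uses about
# 2 MB/day and suddenly uses 100 MB is flagged as possibly compromised.
# The 10x multiplier is an assumption chosen for illustration.

def is_usage_abnormal(daily_usage_mb, today_mb, multiplier=10.0):
    """Flag today's usage when it exceeds `multiplier` times the
    historical average daily usage."""
    if not daily_usage_mb:
        return False  # no baseline yet, nothing to compare against
    baseline = sum(daily_usage_mb) / len(daily_usage_mb)
    return today_mb > multiplier * baseline
```

A production version would likely use a robust statistic (for example, a median over a rolling window) rather than a plain mean, so one earlier anomalous day does not inflate the baseline.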
[00134] Abnormal access point behavior: it is recognized that attackers are beginning to use rogue access points and counterfeit cell towers to lure mobile devices 12 into connecting. They can then monitor traffic over network 14 or use these malicious connections to install malware. System 18 can also be used to identify abnormal access point behavior. For example, if users are accessing the same access point from multiple distant locations, then that access point may be a rogue access point that is being moved around to attract connections. Alternatively, if the cell tower ID, or some other cell tower identifier, changes, or there is a sudden variation in the cell tower's characteristics, this can be flagged as possibly a fake tower made to look similar to the non-malicious access point.
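The "same access point seen from multiple locations" heuristic above can be sketched with a great-circle distance check over collected sightings. The 1 km spread limit and the coordinates in the example are illustrative assumptions; the haversine formula itself is standard.

```python
from math import asin, cos, radians, sin, sqrt

# Sketch of the rogue-access-point heuristic: a fixed access point should
# not be observed at widely separated locations. The 1 km limit and the
# example coordinates are assumptions for illustration.

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))


def looks_like_rogue_ap(sightings, max_spread_km=1.0):
    """sightings: list of (lat, lon) where the same BSSID was observed.
    Flags the AP when any two sightings lie farther apart than expected
    for fixed infrastructure."""
    return any(
        haversine_km(*a, *b) > max_spread_km
        for i, a in enumerate(sightings)
        for b in sightings[i + 1:]
    )
```

Crowd-sourced sightings from many WDS instances make this check possible: no single device sees the access point in two places, but the collaborative data set does.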
[00135] The system 18 can therefore be adapted so that the performance and details of the mobile devices 12 and of the network access points are compared against the expected details and performance, to address network problems and compromised systems.
[00136] Private network leak: Certain networks are not intended to be visible outside specific geographic areas and certain installations. The system 18 can record whether certain networks 14 are seen where they should not be seen.
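A geofence check of the kind suggested by paragraph [00136] might look like the following sketch, where the allowed areas are simplified to latitude/longitude bounding boxes (an assumption made purely for illustration):

```python
def network_leak_detected(observation, allowed_areas):
    """Check whether a private network (identified by SSID) was
    observed outside all of its allowed bounding boxes."""
    ssid, lat, lon = observation
    for lat_min, lat_max, lon_min, lon_max in allowed_areas.get(ssid, []):
        if lat_min <= lat <= lat_max and lon_min <= lon <= lon_max:
            return False  # seen inside an allowed area
    return True  # seen where it should not be seen
```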
[00137] Additional aspects that can make the system 18 more secure include:
a) the network of mobile devices 12 can be controlled by several network controllers instead of just one (i.e., a fragmented system). For example, subsets of the mobile devices 12 can use differently configured servers 24. It should be understood that there may also be benefits in a fragmentation that requires subsets of the device 12 population to use entirely different servers (i.e., different test servers 120, different authentication servers 122 and different configuration servers 24). Thus, if one of the controllers is compromised, the entire system 18 will not be compromised at the same time. Within the framework of the principles mentioned above, network controllers are generally used to control which devices 12 will perform which tests and under which conditions. Network controllers are also used to control which servers are used for these tests. If these servers were compromised, the entire system could otherwise be used to perform a DDOS attack;
b) the mobile device agents (for example, the WDS 40), which perform the tests, can be adjusted so that they must frequently re-authenticate or otherwise become inactive. This feature can be hard-coded in the WDS 40, so that if the WDS 40 is compromised (for example, made to execute a DDOS attack), then, after a certain period of time, the WDS 40 shuts down, because once stopped it will require re-authentication.
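The fragmentation described in item a) could, for example, be realized by deterministically hashing each device identifier to one of several independent controllers, so that a compromise of one controller affects only a subset of the device population. This is a sketch under assumed naming, not the patent's prescribed mechanism:

```python
import hashlib

def assign_controller(device_id, controllers):
    """Deterministically assign a device to one of several independent
    network controllers (each controller would front its own test,
    authentication and configuration servers)."""
    digest = hashlib.sha256(device_id.encode()).hexdigest()
    return controllers[int(digest, 16) % len(controllers)]
```

Because the assignment is a pure function of the device identifier, every device consistently reaches the same controller without any central coordination.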
Exemplary use cases

[00138] Application monitoring: the network tests described above can be used to record the performance, or the likely performance, of applications 38 on the network, such as Skype, YouTube, Netflix, etc., without ever interacting directly with the proprietary servers used by these applications. Instead, the network requirements of these applications 38 are understood and compared with the characteristics of the network that are observed and collected by the network test agent (for example, the WDS 40), to record application performance. The system 18 can therefore be configured so that the results are used to record the performance or the probable performance of the applications 38 on the network.
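To illustrate paragraph [00138], the sketch below compares measured network characteristics against per-application requirements. The requirement values are hypothetical illustrations; they are not taken from the patent or from the application vendors:

```python
# Assumed per-application network requirements (illustrative values only).
APP_REQUIREMENTS = {
    "video_call": {"min_down_kbps": 1500, "min_up_kbps": 1500, "max_latency_ms": 150},
    "sd_streaming": {"min_down_kbps": 3000, "min_up_kbps": 0, "max_latency_ms": 5000},
}

def likely_app_performance(app, measured):
    """Estimate whether an application would perform well, using only
    measured network characteristics, without contacting the
    application's own servers."""
    req = APP_REQUIREMENTS[app]
    ok = (measured["down_kbps"] >= req["min_down_kbps"]
          and measured["up_kbps"] >= req["min_up_kbps"]
          and measured["latency_ms"] <= req["max_latency_ms"])
    return "good" if ok else "degraded"
```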
[00139] Network operations: the collaboration described above can provide alarms to network operators, indicating the specific areas or access points of the network that are providing sub-optimal performance. These alarms and information can be used to inform network maintenance, or to indicate which areas of a network 14 need further testing by other methods. The system 18 can therefore be configured so that the performance and details of the mobile devices 12 and of the network access points are compared with the expected details and performance, to search for network problems and compromised systems.
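The alarms of paragraph [00139] could, for instance, be generated by aggregating crowdsourced samples per access point, as in this sketch; the minimum sample count and the throughput threshold are assumed values:

```python
from statistics import median

def raise_alarms(results, min_samples=5, threshold_kbps=1000):
    """Return (sorted) the access points whose median crowdsourced
    throughput falls below the threshold, provided enough samples
    exist for the aggregate to be meaningful."""
    alarms = [ap for ap, samples in results.items()
              if len(samples) >= min_samples and median(samples) < threshold_kbps]
    return sorted(alarms)
```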
[00140] Network planning: the system 18 can detail areas with high foot traffic or high population densities that are also poorly served by wireless service providers. These are the areas where network improvements are predicted to provide the biggest gains for the entire subscriber base. By comparing this performance with that of competitors, the system 18 can suggest areas on which the network operator should focus to be more competitive and to improve customer acquisition. The system 18 can therefore be configured so that the results are used in conjunction with user density information, collected from the system 18 or from external sources, to inform a network operator of the most beneficial locations for maintenance, expansions and improvements of the network.
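A simple, assumed scoring rule for the planning analysis of paragraph [00140] might rank areas by user density relative to measured performance (the patent leaves the exact analysis unspecified):

```python
def rank_improvement_areas(areas):
    """Rank geographic areas for network investment: high user density
    combined with poor measured performance scores highest."""
    def score(area):
        # More users and worse speed => higher priority.
        return area["users"] / max(area["median_mbps"], 0.1)
    return sorted(areas, key=score, reverse=True)
```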
[00141] Competitor screening: the system 18 can be used to inform a network operator about: 1) what new towers or technology are being deployed by competitors; 2) which network operators are gaining more subscribers, and where; 3) what types of applications and services are being run on the competing network, and how this is changing over time; and 4) the performance of competing networks, and how it is evolving over time. The system 18 can therefore be configured to inform a wireless operator of the performance being achieved by competitors' subscribers, and so that new deployments of and changes to a competitor's network are recorded, forecasted and reported.
Interaction with connection management platforms

[00142] Furthermore, the system 18 can also be configured to interact with a device connection management platform (not shown), as can be provided in a mobile operating system or controlled by the network operator, to assist a mobile device 12 in selecting a suitable network 14 or access point connection. In this case, the data collected by the WDS 40 is transmitted, either in its raw form or after an analysis of the data, to the connection management platform through an API, for use in the network or access point selection process.
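The API interaction of paragraph [00142] might be sketched as follows; the class and method names are hypothetical, as the patent does not define the connection management platform's interface:

```python
class ConnectionManagerAPI:
    """Minimal sketch of an API through which WDS-collected data
    (raw or analyzed) could feed a connection management platform."""

    def __init__(self):
        self._measurements = {}  # access point id -> throughput samples

    def submit(self, ap_id, throughput_kbps):
        """Accept one measurement from the wireless device software."""
        self._measurements.setdefault(ap_id, []).append(throughput_kbps)

    def best_access_point(self, candidates):
        """Pick the candidate with the best average measured throughput."""
        def avg(ap):
            samples = self._measurements.get(ap, [0])
            return sum(samples) / len(samples)
        return max(candidates, key=avg)
```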
Artificial intelligence and machine learning

[00143] Furthermore, the system can also benefit from the use of Artificial Intelligence (AI) and Machine Learning (ML) in addition to data analysis. The data recorded by the WDS 40 can be fed into AI and ML platforms (not shown) for processing into improved information to be used by network operators for the purposes of network planning, network maintenance, customer service, software-defined networking (SDN), network function virtualization (NFV), or MIMO systems, so that the network 14 can be responsive to this improved information produced by the AI and ML processes executed on the data provided by the WDS 40. Groups other than network operators can similarly benefit from the improved information produced by AI and ML applied to the WDS test data.
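As the simplest possible stand-in for the ML models contemplated in paragraph [00143], the sketch below fits a one-feature least-squares line, for example predicting throughput from signal strength; the feature choice and the values in the example are illustrative assumptions:

```python
def fit_line(xs, ys):
    """Ordinary least squares on a single feature: returns (slope,
    intercept) for the best-fit line through the (x, y) samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def predict(model, x):
    slope, intercept = model
    return slope * x + intercept
```

Real deployments would use far richer models, but the pipeline is the same: train on the collected WDS test data, then serve the improved information back to the network.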
[00144] For simplicity and clarity of illustration, where considered appropriate, reference numbers may be repeated among the figures to indicate corresponding or similar elements. In addition, numerous specific details are set forth to provide a thorough understanding of the examples described in this specification. However, those skilled in the art will understand that the examples described in this specification can be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the examples described in this specification. Also, the description should not be considered as limiting the scope of the examples described in this specification.
[00145] It should be understood that the examples and corresponding diagrams used in this specification are for illustrative purposes only. Different configurations and terminology can be used without departing from the principles indicated in this specification. For example, components and modules can be added, eliminated, modified or arranged with different connections without departing from these principles.
[00146] It should also be understood that any module or component exemplified in this specification that executes instructions may include or otherwise have access to computer-readable media, such as storage media, computer storage media, or data storage devices (removable or non-removable), such as, for example, magnetic disks, optical discs or tape. Computer storage media may include removable and non-removable, volatile and non-volatile media, implemented in any method or technology for storing information, such as computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVDs) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used by an application, module, or both. Any such computer storage media can be part of the system 18, of any component of or related to the system 18, etc., or accessible or connectable to it. Any application or module described in this specification can be implemented using computer-readable and executable instructions, which can be stored or otherwise retained by such computer-readable media.
[00147] The steps or operations in the flowcharts and diagrams described in this specification are exemplary only. There can be many variations on these steps or operations without departing from the principles discussed in this specification. For example, the steps can be performed in a different order, or steps can be added, deleted or modified.
[00148] Although the principles mentioned above have been described with reference to certain specific examples, various modifications thereof will become apparent to those skilled in the art, as outlined in the appended claims.
Claims (30)
[1]
1. Method of evaluating wireless device performance and/or wireless network performance, and/or wireless network usage trends, in a wireless electronic device, characterized by the fact that the method comprises:
providing wireless device software on the wireless electronic device, the wireless device software being incorporated into an application or memory of the wireless electronic device, and being operable to perform at least one test associated with characteristics and/or location of the device, and/or performance of the device and/or network, and/or use of the device by a user;
communicating with at least one of: i) an operating system of the wireless electronic device; and ii) an external test server, to perform the at least one test, wherein the at least one test is performed according to at least one test parameter provided in a configuration and/or in the wireless device software, the at least one test parameter comprising at least one behavioral criterion which controls the impact of the test on the operation of the wireless electronic device;
collecting test data resulting from the performance of the at least one test; and providing the test data to a collection server.
[2]
2. Method, according to claim 1, characterized by the fact that the test data is collected from at least one passive test, performed without using network traffic.
[3]
3. Method, according to claim 1, characterized by the fact that the test data is collected from at least one active test, performed using network traffic through communication with the test server.
[4]
4. Method according to claim 3, characterized by the fact that the at least one active test corresponds to a download throughput test, an upload throughput test or a server response test.
[5]
5. Method, according to claim 4, characterized by the fact that the at least one active test comprises monitoring a video resource by obtaining video content from a video content server, and recording the video data in a video quality of experience server.
[6]
6. Method, according to claim 2, characterized by the fact that the at least one passive test corresponds to device information, connection information, data usage and/or location information, WiFi scan information or application data usage information.
[7]
7. Method according to claim 1, characterized by the fact that it further comprises communicating with a configuration server to obtain the configuration, wherein the configuration server is operable to control and modify the test behavior of the wireless device software.
[8]
8. Method according to claim 7, characterized by the fact that the test behavior is modifiable to increase or decrease the test frequency, determine when and/or where the tests are performed, and/or activate or deactivate the testing.
[9]
9. Method according to claim 8, characterized by the fact that the configuration is controllable, at least in part, by input from an external entity or system, to adjust the test behavior according to a desired behavior.
[10]
10. Method, according to claim 1, characterized by the fact that at least one behavioral criterion is determined in the device.
[11]
11. Method according to claim 10, characterized by the fact that an application scheduling module on the device is configured to provide the at least one behavioral criterion.
[12]
12. Method according to claim 1, characterized by the fact that the wireless device software communicates with a device manager to coordinate testing with at least one additional wireless device software in at least one other corresponding application.
[13]
13. Method, according to claim 1, characterized by the fact that it further comprises communicating with an authentication server, to register the wireless device software and approve the use of the wireless device software on the electronic device.
[14]
14. Method, according to claim 7, characterized by the fact that it further comprises receiving a request to modify the test behavior from a third party through the configuration server.
[15]
15. Method, according to claim 7, characterized by the fact that it also comprises receiving a command to terminate the operation of the application through the configuration server.
[16]
16. Method, according to claim 1, characterized by the fact that it also comprises obtaining metadata from at least one third party data source, and incorporating the metadata in the test data.
[17]
17. Method, according to claim 1, characterized by the fact that it also comprises waiting to provide test data to the collection server, until the wireless device software detects a WiFi connection.
[18]
18. Method, according to claim 1, characterized by the fact that it further comprises monitoring a data usage limit, to impose a maximum amount of data that can be used during the at least one test.
[19]
19. Method according to claim 1, characterized by the fact that it further comprises repeating the at least one test according to any one or more of: a predetermined frequency, movement by a predetermined distance, or establishment of a new connection.
[20]
20. Method, according to claim 1, characterized by the fact that it comprises anonymizing the data collected to prevent the storage of user identifiable information.
[21]
21. Method according to claim 1, characterized by the fact that the wireless device software exposes one or more APIs to the application, to allow the application to have direct access and be responsive to the test data.
[22]
22. Method according to claim 1, characterized by the fact that the wireless device software is configured to be able to run while the application is being used in the foreground, or when the application is in the background on the device.
[23]
23. Method according to claim 7, characterized by the fact that the wireless device software generates a random identifier, and wherein the configuration server is able to determine, through the test behaviors, for how long the random identifier is used and how often the random identifier is replaced with a new random identifier.
[24]
24. Method according to claim 1, characterized by the fact that the wireless device software comprises a hard-coded limit on the number of error codes it can generate over a period of time, after which the wireless device software stops operating.
[25]
25. Method according to claim 1, characterized by the fact that the wireless device software limits or stops activity during periods of high network usage by the application or memory.
[26]
26. Method, according to claim 1, characterized by the fact that the wireless device software operates according to the permissions received.
[27]
27. Method according to claim 1, characterized by the fact that the wireless device software comprises a sequential queue to avoid simultaneous tests.
[28]
28. Method, according to claim 1, characterized by the fact that it further comprises:
selecting and initiating at least one test based on one or more user actions detected by the wireless electronic device; and providing details associated with the one or more user actions as an input for analyzing the results of the test data, wherein the one or more user actions are different from a user specifically selecting to perform a network test.
[29]
29. Computer-readable medium, characterized by the fact that it comprises instructions executable by computer to execute the method as defined in any one of claims 1 to 28.
[30]
30. System, characterized by the fact that it comprises a processor and a memory, the memory comprising the computer-readable medium as defined in claim 29.
Patent family:
Publication number | Publication date
AU2018209352A1|2019-08-08|
EP3571860A1|2019-11-27|
EP3571860A4|2021-01-27|
WO2018132902A1|2018-07-26|
US20180206136A1|2018-07-19|
US20210006993A1|2021-01-07|
CA3050165A1|2018-07-26|
US20210076237A1|2021-03-11|
US10827371B2|2020-11-03|
CN110366858A|2019-10-22|
Legal status:
2021-10-19| B350| Update of information on the portal [chapter 15.35 patent gazette]|
Priority:
Application number | Filing date | Patent title
US201762447255P| true| 2017-01-17|2017-01-17|
PCT/CA2018/050043|WO2018132902A1|2017-01-17|2018-01-16|System and method for evaluating wireless device and/or wireless network performance|