Electronic devices and methods for treating depressive symptoms associated with multiple sclerosis
Patent abstract:
Certain implementations of the disclosed technology may include electronic devices and methods for treating depressive symptoms associated with multiple sclerosis. According to an example implementation, a method is provided. The method may include displaying a feeling selection interface; receiving a feeling selection input; displaying a feeling spectrum interface; receiving a first feeling intensity input; displaying an automatic thought selection interface; receiving an automatic thought selection input; displaying an alternative thought selection interface; receiving an alternative thought selection input; displaying the feeling spectrum interface again; receiving a second feeling intensity input; and generating a journal entry indicating at least any difference between the first feeling intensity input and the second feeling intensity input.
Publication number: DK201970328A1
Application number: DKP201970328
Filing date: 2019-05-24
Publication date: 2020-11-23
Inventors: Paul Kersanske Brent; Brown Michael; R Weingardt Kenneth; Christine Ahrens Jillian
Applicant: Pear Therapeutics Inc
Primary IPC class:
Patent description:
[0001] [0001] This disclosure relates, generally, to the treatment of depression and, more particularly, to electronic devices and methods for the treatment of depressive symptoms associated with multiple sclerosis utilizing computerized behavioral therapy.
BACKGROUND
[0002] [0002] Multiple sclerosis is a chronic disease involving damage to sheaths of nerve cells in the brain and spinal cord, causing symptoms including numbness, pain, fatigue, impaired speech and muscle coordination, and vision loss. The National Multiple Sclerosis Society estimates nearly one million people in the United States alone live with multiple sclerosis. Generally, there are four types of multiple sclerosis: relapsing-remitting multiple sclerosis, secondary-progressive multiple sclerosis, primary-progressive multiple sclerosis, and progressive-relapsing multiple sclerosis. Relapsing-remitting multiple sclerosis is the most common type of multiple sclerosis, as 85% of all multiple sclerosis patients are initially diagnosed with relapsing-remitting multiple sclerosis. Patients with relapsing-remitting multiple sclerosis experience clearly defined attacks of new or increasing neurological symptoms (relapses) followed by periods of partial or complete recovery (remission). [0003] [0003] Relapsing-remitting multiple sclerosis is defined by inflammatory attacks on nerve fibers and myelin, the layers of insulating membrane surrounding the nerve fibers in the central nervous system. This can cause a patient to experience common symptoms of multiple sclerosis including fatigue, walking difficulties, numbness or tingling, spasticity, weakness, vision issues, dizziness or vertigo, urinary incontinence or bowel incontinence, and cognitive or emotional changes. Each patient's experience with relapsing-remitting multiple sclerosis is unique; no two patients will present the same symptoms or have the same disease course. [0004] [0004] Relapsing-remitting multiple sclerosis often leads to depressive symptoms and anxiety. Depressive symptoms are a natural reaction to the unpredictable course of the disease. [0005] [0005] There is no association between the severity of symptoms and likelihood of a patient experiencing depressive symptoms; any patient with relapsing-remitting multiple sclerosis can experience depressive symptoms at any point in the disease progression. But a variety of factors may influence depressive symptoms in patients with relapsing-remitting multiple sclerosis. A patient's initial diagnosis of multiple sclerosis may be followed by a period of depressive symptoms. Patients may also experience depressive symptoms due to the physical symptoms associated with multiple sclerosis. For example, a patient suffering from fatigue may be depleted of the emotional energy required to fight depressive symptoms. Furthermore, a patient's high level of uncertainty about new symptoms and the future may cause patients to experience depressive symptoms. Physiological causes, such as damage to the central nervous system, and chemical changes, such as expression of pro-inflammatory protein molecules involved in cell-to-cell communications, may cause patients to experience depressive symptoms as well. Medication side effects can worsen depressive symptoms. Steroids, for example, can cause euphoria in the short term, followed by depressive symptoms once the euphoria has stopped.
[0006] [0006] Depressive symptoms significantly affect the mood of a patient suffering from multiple sclerosis, thereby negatively affecting the patient's quality of life. Patients with multiple sclerosis may prioritize physical health over emotional health and leave depressive symptoms untreated. Leaving depressive symptoms untreated can lead to reduced quality of life and impaired cognitive function. For example, depressed patients may seek to withdraw from daily life activities, resulting in reduced social stimulation. Patients with multiple sclerosis also experience an increased risk of suicide: they are 7.5 times more likely to commit suicide than members of the general population. [0007] [0007] Current treatment options for depressive symptoms in multiple sclerosis patients generally include antidepressant medication and face-to-face therapy with a clinician. However, these treatment options have proven sub-optimal. Accordingly, improved electronic devices and methods for treating depressive symptoms associated with multiple sclerosis are needed.
SUMMARY
[0008] [0008] The instant disclosure provides various electronic devices, methods, and digital therapeutics for treating depressive symptoms associated with multiple sclerosis. According to one aspect of the disclosure, an electronic device for treating depressive symptoms associated with multiple sclerosis is provided. The electronic device includes a display, an input device, one or more processors, and memory storing one or more programs configured to be executed by the one or more processors. According to this aspect, the one or more programs include instructions for carrying out a method. The method includes displaying, on the display, a feeling selection interface. The feeling selection interface presents a plurality of feeling interface elements, and each feeling interface element is associated with a particular feeling. The method further includes, while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs. The first sequence of inputs includes a feeling selection input that corresponds to a particular feeling interface element. The method further includes, in response to receiving the feeling selection input, displaying, on the display, a feeling spectrum interface presenting a plurality of intensities associated with the particular feeling. The method further includes, while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs. The second sequence of inputs includes a first feeling intensity input. The first feeling intensity input corresponds to a first intensity of the plurality of intensities. The method further includes, in response to receiving the first feeling intensity input, displaying, on the display, an automatic thought selection interface. The automatic thought selection interface presents a plurality of automatic thought interface elements, and each automatic thought interface element is associated with a particular automatic thought. The method further includes, while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs. The third sequence of inputs includes an automatic thought selection input. [0009] [0009] This aspect may include one or more of the following optional features as well.
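The method summarized in paragraph [0008] is, at bottom, an ordered series of selection steps whose end product is a journal entry recording how the intensity of the selected feeling changed. The sketch below is only an illustration of that flow, not the claimed implementation: the display and input steps are collapsed into plain function parameters, and names such as record_session and JournalEntry are invented for the example.

from dataclasses import dataclass

@dataclass
class JournalEntry:
    """Hypothetical journal record for one completed session."""
    feeling: str
    first_intensity: int       # e.g., a 0-100 rating taken from the feeling spectrum interface
    second_intensity: int      # the re-rating taken after the alternative thought step
    automatic_thought: str
    alternative_thought: str

    @property
    def intensity_difference(self) -> int:
        # The entry indicates at least any difference between the two intensity inputs.
        return self.second_intensity - self.first_intensity

def record_session(feeling: str, first_intensity: int, automatic_thought: str,
                   alternative_thought: str, second_intensity: int) -> JournalEntry:
    """Collects the five core inputs in the order the method receives them."""
    return JournalEntry(feeling, first_intensity, second_intensity,
                        automatic_thought, alternative_thought)

entry = record_session("Scared", 59, "I hate to bother people",
                       "I'm going to get through this eventually", 20)
print(entry.intensity_difference)   # -39, i.e., the rated intensity of feeling scared decreased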
In some aspects, the instructions implement a method that includes, in response to receiving the automatic thought selection input, displaying, on the display, a thinking traps interface. The thinking traps interface presents a plurality of thinking trap interface elements associated with the particular automatic thought interface element. Each thinking trap interface element is associated with a particular thinking trap. The method may further include, while displaying the thinking traps interface, receiving, via the input device, a sixth sequence of inputs. The sixth sequence of inputs includes one or more thinking trap selection inputs. The one or more thinking trap selection inputs correspond to one or more particular thinking trap interface elements. The method may further include the journal entry being modified to further indicate the one or more particular thinking trap interface elements. [0010] [0010] In some aspects, the method includes, in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element. The quick recap interface element indicates the particular automatic thought and the one or more particular thinking trap interface elements. [0011] [0011] In one aspect, the method includes the journal entry being modified to further indicate the particular alternative thought interface element. [0012] [0012] In another aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a company selection interface. The company selection interface presents a plurality of company interface elements. Each company interface element is associated with a particular relationship type. The method may further include, while displaying the company selection interface, receiving, via the input device, a seventh sequence of inputs. The seventh sequence of inputs includes a company selection input. The company selection input corresponds to a particular company interface element. The method may further include the journal entry being modified to further indicate the particular company interface element. [0013] [0013] In some aspects, the method includes, in response to receiving the feeling selection input, displaying, on the display, a location selection interface. The location selection interface presents a plurality of location interface elements. Each location interface element is associated with a particular location. The method may further include, while displaying the location selection interface, receiving, via the input device, an eighth sequence of inputs. The eighth sequence of inputs includes a location selection input. The location selection input corresponds to a particular location interface element. The method may further include the journal entry being modified to further indicate the particular location interface element. [0014] [0014] In another aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a multiple sclerosis symptoms selection interface. The multiple sclerosis symptoms selection interface presents a plurality of multiple sclerosis symptom interface elements. Each multiple sclerosis symptom interface element is associated with a particular multiple sclerosis symptom. The method may further include, while displaying the multiple sclerosis symptoms selection interface, receiving, via the input device, a ninth sequence of inputs. The ninth sequence of inputs includes one or more multiple sclerosis symptom selection inputs.
The one or more multiple sclerosis symptom selection inputs correspond to one or more particular multiple sclerosis symptom interface elements. [0015] [0015] According to another aspect of the disclosure, a computerized method for treating depressive symptoms associated with multiple sclerosis is provided. The method includes, at an electronic device including a display and an input device, displaying, on the display, a feeling selection interface presenting a plurality of feeling interface elements. Each interface element is associated with a particular feeling. While displaying the feeling selection interface, the method further includes receiving, via the input device, a first sequence of inputs. The first sequence of inputs includes a feeling selection input. The feeling selection input corresponds to a particular feeling interface element. The method further includes, in response to receiving the feeling selection input, displaying, on the display, a feeling spectrum interface. The feeling spectrum interface presents a plurality of intensities associated with the particular feeling. The method further includes, while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs. The second sequence of inputs includes a first feeling intensity input that corresponds to a first intensity of the plurality of intensities. In response to receiving the first feeling intensity input, the method further includes displaying, on the display, an automatic thought selection interface. The automatic thought selection interface presents a plurality of automatic thought interface elements, and each automatic thought interface element is associated with a particular automatic thought. The method further includes, while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs. The third sequence of inputs includes an automatic thought selection input. The automatic thought selection input corresponds to a particular automatic thought interface element. In response to receiving the automatic thought selection input, the method further includes displaying, on the display, an alternative thought selection interface. The alternative thought selection interface presents a plurality of alternative thought interface elements, and each alternative thought interface element is associated with a particular alternative thought. The method further includes, while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs. [0016] [0016] Aspects of the disclosure may include one or more of the following features as well. In one exemplary aspect, the method includes, in response to receiving the automatic thought selection input, displaying, on the display, a thinking traps interface. The thinking traps interface presents a plurality of thinking trap interface elements associated with the particular automatic thought interface element. Each thinking trap interface element is associated with a particular thinking trap. The method may also further include, while displaying the thinking traps interface, receiving, via the input device, a sixth sequence of inputs. The sixth sequence of inputs includes one or more thinking trap selection inputs. The one or more thinking trap selection inputs correspond to one or more particular thinking trap interface elements. The method may also further include the journal entry being modified to further indicate the one or more particular thinking trap interface elements.
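Paragraphs [0009] through [0016] describe thinking trap, company, location, and multiple sclerosis symptom selections as optional steps that each further modify the same journal entry. One way to picture that relationship, offered purely as an illustrative sketch rather than as the disclosed implementation, is an entry whose optional fields are only populated when the corresponding selection inputs arrive; all of the names below are invented for the example.

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ContextualJournalEntry:
    feeling: str
    intensity_difference: int
    # Optional context captured only if the patient reaches those interfaces.
    thinking_traps: List[str] = field(default_factory=list)
    company: Optional[str] = None        # relationship type, e.g., "My Sibling"
    location: Optional[str] = None       # e.g., "Home"
    ms_symptoms: List[str] = field(default_factory=list)

def apply_optional_selections(entry: ContextualJournalEntry, thinking_traps=None,
                              company=None, location=None, ms_symptoms=None):
    """Modify the already-generated entry to further indicate each optional selection."""
    if thinking_traps:
        entry.thinking_traps.extend(thinking_traps)
    if company:
        entry.company = company
    if location:
        entry.location = location
    if ms_symptoms:
        entry.ms_symptoms.extend(ms_symptoms)
    return entry

entry = ContextualJournalEntry(feeling="Scared", intensity_difference=-39)
apply_optional_selections(entry, thinking_traps=["Overgeneralizing"],
                          company="My Sibling", location="Home")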
[0017] [0017] In another aspect, the method includes, in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element. The quick recap interface element indicates the particular automatic thought and the one or more particular thinking trap elements. [0018] [0018] In still another aspect, the method includes, in response to receiving the alternative thought selection input, the journal entry being modified to further indicate the particular alternative thought interface element. [0019] [0019] In one aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a company selection interface. The company selection interface presents a plurality of company interface elements, and each company interface element is associated with a particular relationship type. [0020] [0020] In another aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a location selection interface. The location selection interface presents a plurality of location interface elements, and each location interface element is associated with a particular location. The method may further include, while displaying the location selection interface, receiving, via the input device, an eighth sequence of inputs. The eighth sequence of inputs includes a location selection input. The location selection input corresponds to a particular location interface element. The method may further include the journal entry being modified to further indicate the particular location interface element. [0021] [0021] In one aspect, the method includes, in response to receiving the feeling selection input, displaying, on the display, a multiple sclerosis symptoms selection interface that presents a plurality of multiple sclerosis symptom interface elements. Each multiple sclerosis symptom interface element is associated with a particular multiple sclerosis symptom. The method may further include, while displaying the multiple sclerosis symptoms selection interface, receiving, via the input device, a ninth sequence of inputs. The ninth sequence of inputs includes one or more multiple sclerosis symptom selection inputs. The one or more multiple sclerosis symptom selection inputs correspond to one or more particular multiple sclerosis symptom interface elements. The method may further include the journal entry being modified to further indicate the one or more particular multiple sclerosis symptom interface elements. [0022] [0022] An exemplary non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and input device including instructions for performing the foregoing method is also included as part of the instant disclosure. [0023] [0023] An exemplary method for treatment of depressive symptoms associated with multiple sclerosis, in a subject in need thereof, including administering to said subject a computerized method of the foregoing method is also included as part of the instant disclosure. [0024] [0024] According to another aspect of the disclosure, another computerized method for treating depressive symptoms associated with multiple sclerosis is provided. The method includes, at an electronic device including a display and an input device, receiving, via the input device, feeling assessment data describing a feeling associated with a user. The method further includes receiving, via the input device, first feeling intensity data describing a first intensity of the feeling associated with the user. The method further includes identifying a plurality of potential automatic thoughts based on the feeling associated with the user.
Each potential automatic thought of the plurality of potential automatic thoughts corresponds to a negative thought. Additionally, the method includes receiving, via the input device, automatic thought selection data. The automatic thought selection data identifies a particular potential automatic thought from among the plurality of potential automatic thoughts. The method also includes identifying a plurality of potential alternative thoughts based on the automatic thought selection data. Each potential alternative thought of the plurality of potential alternative thoughts corresponds to a positive thought. Further, the method includes receiving, via the input device, alternative thought selection data. The alternative thought selection data identifies a particular potential alternative thought from among the plurality of potential alternative thoughts. Continuing, the method includes receiving, via the input device, second feeling intensity data describing a second intensity of the feeling associated with the user. The method also includes determining any difference between the first intensity and the second intensity to provide feeling intensity difference data. Finally, according to this aspect of the disclosure, the method includes displaying, on the display, the feeling intensity difference data. [0025] [0025] Exemplary electronic devices for performing the foregoing method are also included as part of the instant disclosure. [0026] [0026] An exemplary non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and input device including instructions for performing the foregoing method is also included as part of the instant disclosure. [0027] [0027] An exemplary method for treatment of depressive symptoms associated with multiple sclerosis, in a subject, in need thereof, including administering to said subject a computerized method of the foregoing method is also included as part of the instant disclosure. [0028] [0028] According to another aspect of the disclosure, a digital therapeutic for treating depressive symptoms associated with multiple sclerosis is provided. The digital therapeutic includes an automatic thought identification module. The automatic thought identification module is configured to identify a plurality of potential automatic thoughts based on feeling assessment data describing a feeling associated with a user. Each potential automatic thought of the plurality of potential automatic thoughts corresponds to a negative thought. The automatic thought identification module is also configured to receive automatic thought selection data. The automatic thought selection data identifies a particular potential automatic thought from among the plurality of potential automatic thoughts. The digital therapeutic further includes an alternative thought identification module. The alternative thought identification module is configured to identify a plurality of potential alternative thoughts based on the automatic thought selection data. Each potential alternative thought of the plurality of potential alternative thoughts corresponds to a positive thought. The alternative thought identification module is also configured to receive alternative thought selection data. The alternative thought selection data identifies a particular potential alternative thought from among the plurality of potential alternative thoughts. 
The digital therapeutic further includes a feeling intensity module. The feeling intensity module is configured to receive first feeling intensity data describing a first intensity of the feeling associated with the user at a first point in time. The feeling intensity module is also configured to receive second feeling intensity data describing a second intensity of the feeling associated with the user at a second point in time. The second point in time is later than the first point in time. The feeling intensity module is also configured to generate feeling intensity difference data. The feeling intensity difference data indicates any difference between the first intensity and the second intensity. [0029] [0029] This aspect may include one or more of the following optional features as well. In some aspects, the digital therapeutic further includes a feeling assessment module. The feeling assessment module is configured to receive the feeling assessment data describing the feeling associated with the user. [0030] [0030] According to another aspect of the disclosure, the digital therapeutic further includes a thinking traps module. The thinking traps module is configured to identify a plurality of potential thinking traps based on the feeling assessment data and receive thinking trap selection data. The thinking trap selection data identifies one or more particular potential thinking traps from among the plurality of potential thinking traps. [0031] [0031] In another aspect, the digital therapeutic further includes a journal module. The journal module is configured to generate a journal entry comprising at least the feeling intensity difference data. [0032] [0032] According to another aspect, the digital therapeutic further includes a company module. The company module is configured to receive company selection data. The company selection data identifies, by relationship type, a person who accompanied the user at a time in which the user experienced the feeling. The journal entry further includes the company selection data. [0033] [0033] In yet another aspect of the disclosure, the digital therapeutic further includes a location module. The location module is configured to receive location selection data identifying a location of the user at a time in which the user experienced the feeling. The journal entry further includes the location selection data. [0034] [0034] According to another aspect, the digital therapeutic further includes a multiple sclerosis (MS) symptom module. The multiple sclerosis symptom module is configured to receive multiple sclerosis symptom selection data identifying one or more multiple sclerosis symptoms associated with the user. The journal entry further includes the multiple sclerosis symptom selection data. [0035] [0035] In another aspect, the digital therapeutic further includes the journal entry further including the thinking trap selection data. [0036] [0036] An exemplary method for treatment of depressive symptoms associated with multiple sclerosis, in a subject in need thereof, including administering to said subject a digital therapeutic of the foregoing digital therapeutic is also included as part of the instant disclosure. [0037] [0037] The details of one or more implementations of the disclosure are set forth in the accompanying drawings and the description below. Other aspects, features, and advantages will be apparent from the description and drawings, and from the claims.
BRIEF DESCRIPTION OF THE FIGURES
[0038] [0038] Reference will now be made to the accompanying Figures, which are not necessarily drawn to scale, and wherein: [0039] [0039] FIG.
1 is a schematic view of an example system implementing a computerized method for treating depressive symptoms associated with multiple sclerosis. [0040] [0040] FIG. 2A illustrates a feeling selection interface in accordance with an exemplary embodiment of the disclosure. [0041] [0041] FIG. 2B illustrates a feeling spectrum interface in accordance with an exemplary embodiment of the disclosure. [0042] [0042] FIG. 2C illustrates an automatic thought selection interface in accordance with an exemplary embodiment of the disclosure. [0043] [0043] FIG. 2D illustrates an alternative thought selection interface in accordance with an exemplary embodiment of the disclosure. [0044] [0044] FIG. 2E illustrates a feeling spectrum interface in accordance with an exemplary embodiment of the disclosure. [0045] [0045] FIG. 2F illustrates a thinking traps interface in accordance with an exemplary embodiment of the disclosure. [0046] [0046] FIG. 2G illustrates another view of the thinking traps interface in accordance with an exemplary embodiment of the disclosure. [0047] [0047] FIG. 2H illustrates yet another view of the thinking traps interface in accordance with an exemplary embodiment of the disclosure. [0048] [0048] FIG. 2I illustrates a company selection interface in accordance with an exemplary embodiment of the disclosure. [0049] [0049] FIG. 2J illustrates a location selection interface in accordance with an exemplary embodiment of the disclosure. [0050] [0050] FIG. 2K illustrates a symptoms selection interface in accordance with an exemplary embodiment of the disclosure. [0051] [0051] FIG. 2L illustrates a recap interface element in accordance with an exemplary embodiment of the disclosure. [0052] [0052] FIG. 2M illustrates a journal interface in accordance with an exemplary embodiment of the disclosure. [0053] [0053] FIG. 2N illustrates a positive feeling selection interface in accordance with an exemplary embodiment of the disclosure. [0054] [0054] FIG. 2O illustrates a situation selection interface in accordance with an exemplary embodiment of the disclosure. [0055] [0055] FIG. 2P illustrates a positive reflection element in accordance with an exemplary embodiment of the disclosure. [0056] [0056] FIG. 2Q illustrates a positive journal interface in accordance with an exemplary embodiment of the disclosure. [0057] [0057] FIG. 3 is a flowchart illustrating a computerized method for treating depressive symptoms associated with multiple sclerosis in accordance with an exemplary embodiment of the disclosure. [0058] [0058] FIG. 4 is a flowchart illustrating another computerized method for treating depressive symptoms associated with multiple sclerosis in accordance with an exemplary embodiment of the disclosure. [0059] [0059] FIG. 5 is a schematic view of an example electronic device for treating depressive symptoms associated with multiple sclerosis in accordance with an exemplary embodiment of the disclosure. [0060] [0060] FIG. 6 is a functional block diagram illustrating a digital therapeutic for treating depressive symptoms associated with multiple sclerosis in accordance with an exemplary embodiment of the disclosure. [0061] [0061] Like reference symbols in the various drawings indicate like elements.
DETAILED DESCRIPTION
[0062] [0062] Some implementations of the disclosed technology will be described more fully with reference to the accompanying drawings. This disclosed technology may, however, be embodied in many different forms and should not be construed as limited to the implementations set forth herein.
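The digital therapeutic of paragraphs [0028] to [0035] above is described in terms of cooperating modules: one that proposes automatic (negative) thoughts for a reported feeling, one that proposes alternative (positive) thoughts for a selected automatic thought, one that compares two time-stamped intensity ratings, and one that writes the journal entry. The sketch below restates that decomposition in plain Python purely for orientation; the class shapes, the small hard-coded catalogs, and the method names are assumptions made for the example and are not taken from the disclosure.

from datetime import datetime

class AutomaticThoughtIdentificationModule:
    """Maps a reported feeling to candidate negative (automatic) thoughts."""
    CATALOG = {"Scared": ["I hate to bother people", "Why can't I have that?"]}  # illustrative only

    def identify(self, feeling: str) -> list:
        return self.CATALOG.get(feeling, [])

class AlternativeThoughtIdentificationModule:
    """Maps a selected automatic thought to candidate positive (alternative) thoughts."""
    CATALOG = {"I hate to bother people": ["I'm going to get through this eventually"]}  # illustrative only

    def identify(self, automatic_thought: str) -> list:
        return self.CATALOG.get(automatic_thought, [])

class FeelingIntensityModule:
    """Receives two time-stamped intensity ratings and reports the difference."""
    def __init__(self):
        self.readings = []                       # (timestamp, intensity) pairs

    def receive(self, intensity: int) -> None:
        self.readings.append((datetime.now(), intensity))

    def difference(self) -> int:
        (_, first), (_, second) = self.readings[0], self.readings[-1]
        return second - first

class JournalModule:
    """Generates a journal entry comprising at least the intensity difference data."""
    def generate(self, difference: int, **context) -> dict:
        return {"intensity_difference": difference, **context}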
[0063] [0063] Example implementations of the disclosed technology provide electronic devices, methods, and digital therapeutics for treating depressive symptoms associated with multiple sclerosis. [0064] [0064] Example implementations of the disclosed technology will now be described with reference to the accompanying figures. [0065] [0065] Referring to FIG. 1, in some implementations, a therapy prescription system 100 provides a patient 101 access to a prescription digital therapeutic 120 prescribed to the patient 101 and monitors events associated with the patient's 101 interaction with the prescription digital therapeutic 120. Although the digital therapeutic 120 is described herein as being a "prescription" digital therapeutic, it is understood that, according to some implementations, the digital therapeutic 120 will not require a prescription from a clinician. Rather, in such implementations, the digital therapeutic 120 may be available to a patient without a prescription, and the digital therapeutic 120 nonetheless otherwise functions in accordance with the description of the prescription digital therapeutic 120 described herein. According to implementations in which the digital therapeutic 120 is not prescribed, the person using or being administered the digital therapeutic may be referred to as a "user." A "user" may include a patient 101 or any other person using or being administered the digital therapeutic 120, irrespective of whether the digital therapeutic 120 was prescribed to that person. [0066] [0066] As used herein, a digital therapy may also be referred to as a digital therapeutic configured to deliver evidence-based psychosocial intervention techniques for treating a patient with a particular disease or disorder, as well as symptoms and/or behaviors associated with the particular disease or disorder. In the instant case, the patient 101 is diagnosed with multiple sclerosis (MS) and the prescription digital therapeutic 120 is specifically tailored for addressing one or more depressive symptoms associated with MS that the patient 101 may experience. An authorized healthcare provider (HCP) 109 may prescribe the prescription digital therapeutic 120 to the patient 101. [0067] [0067] In some examples, the system 100 includes a network 106, a patient device 102, an HCP system 140, and a multiple sclerosis therapy service 160. The network 106 provides access to cloud computing resources 150 (e.g., distributed system) that execute the multiple sclerosis therapy service 160 to provide for the performance of services on remote devices. Accordingly, the network 106 allows for interaction between patients 101 and HCPs 109 with the multiple sclerosis therapy service 160. For instance, the multiple sclerosis therapy service 160 may provide the patient 101 access to the prescription digital therapeutic 120 and receive event data 122 inputted by the patient 101 associated with the patient's 101 interaction with the prescription digital therapeutic 120. In turn, the multiple sclerosis therapy service 160 may store the event data 122 on a storage resource 156. [0068] [0068] The network 106 may include any type of network that allows sending and receiving communication signals, such as a wireless telecommunication network, a cellular telephone network, a time division multiple access (TDMA) network, a code division multiple access (CDMA) network, a Global System for Mobile Communications (GSM) network, a third generation (3G) network, a fourth generation (4G) network, a satellite communications network, and other communication networks.
The network 106 may include one or more of a Wide Area Network (WAN), a Local Area Network (LAN), and a Personal Area Network (PAN). In some examples, the network 106 includes a combination of data networks, telecommunication networks, and a combination of data and telecommunication networks. The patient device 102, the HCP system 140, and the multiple sclerosis therapy service 160 communicate with each other by sending and receiving signals (wired or wireless) via the network 106. In some examples, the network 106 provides access to cloud computing resources, which may be elastic/on-demand computing and/or storage resources 156 available over the network 106. [0069] [0069] The patient device 102 may include, but is not limited to, a portable electronic device (e.g., smartphone, cellular phone, personal digital assistant, personal computer, or wireless tablet device), a desktop computer, or any other electronic device capable of sending and receiving information via the network 106. The patient device 102 includes data processing hardware 112 (a computing device that executes instructions), memory hardware 114, and a display 116 in communication with the data processing hardware 112. In some examples, the patient device 102 includes a keyboard 148, mouse, microphones, and/or a camera for allowing the patient 101 to input data. In addition to or in lieu of the display 116, the patient device 102 may include one or more speakers to output audio data to the patient 101. For instance, audible alerts may be output by the speaker to notify the patient 101 about some time-sensitive event associated with the prescription digital therapeutic 120. In some implementations, the patient device 102 executes a patient application 103 (or accesses a web-based patient application) for establishing a connection with the multiple sclerosis therapy service 160 to access the prescription digital therapeutic 120. [0070] [0070] The patient application 103 may send notifications to the patient device 102. [0071] [0071] The storage resources 156 may provide data storage 158 for storing the event data 122 received from the patient 101 in a corresponding patient record 105 as well as the prescription digital therapeutic 120 prescribed to the patient 101. The patient record 105 may be encrypted while stored in the data storage 158 so that any information identifying the patient 101 is anonymized, but may later be decrypted when the patient 101 or supervising HCP 109 requests the patient record 105 (assuming the requester is authorized/authenticated to access the patient record 105). All data transmitted over the network 106 between the patient device 102 and the cloud computing system 150 may be encrypted and sent over secure communication channels. For instance, the patient application 103 may encrypt the event data 122 before transmitting to the multiple sclerosis therapy service 160 via the HTTPS protocol and decrypt a patient record 105 received from the multiple sclerosis therapy service 160. When network connectivity is not available, the patient application 103 may store the event data 122 in an encrypted queue within the memory hardware 114 until network connectivity is available. [0072] [0072] The HCP system 140 may be located at a clinic, doctor's office, or facility administered by the HCP 109 and includes data processing hardware 142, memory hardware 144, and a display 146. The memory hardware 144 and the display 146 are in communication with the data processing hardware 142.
For instance, the data processing hardware 142 may reside on a desktop computer or portable electronic device for allowing the HCP 109 to input and retrieve data to and from the multiple sclerosis therapy service 160. In some examples, the HCP 109 may initially onboard some or all of the patient data 107 at the time of prescribing the prescription digital therapeutic 120 to the patient 101. The HCP system 140 includes a keyboard 148, mouse, microphones, and/or a camera for allowing the HCP 109 to input data. [0073] [0073] The cloud computing resources 150 may be a distributed system (e.g., remote environment) having scalable/elastic resources 152. The resources 152 include computing resources 154 (e.g., data processing hardware) and/or the storage resources 156 (e.g., memory hardware). The cloud computing resources 150 execute the multiple sclerosis therapy service 160 for facilitating communications with the patient device 102 and the HCP system 140 and storing data on the storage resources 156 within the data storage 158. In some examples, the multiple sclerosis therapy service 160 and the data storage 158 reside on a standalone computing device. The multiple sclerosis therapy service 160 may provide the patient 101 with the patient application 103 (e.g., a mobile application, a web-site application, or a downloadable program that includes a set of instructions) executable on the data processing hardware 112 and accessible through the network 106 via the patient device 102 when the patient 101 provides a valid access code 104. Similarly, the multiple sclerosis therapy service 160 may provide the HCP 109 with the HCP application 110 (e.g., a mobile application, a web-site application, or a downloadable program that includes a set of instructions) executable on the data processing hardware 142 and accessible through the network 106 via the HCP system 140. [0074] [0074] FIGS. 2A-2Q illustrate schematic views of exemplary GUIs of the prescription digital therapeutic 120 (e.g., by execution of the patient application 103) displayed on the display 116 of the patient device 102 for treating depressive symptoms associated with MS. The example GUIs are configured to display graphical elements (e.g., buttons) that the patient 101 may select via user inputs such as touch inputs, speech inputs, or other input techniques such as via a mouse, stylus, keyboard, gesture, or eye gaze. [0075] [0075] Referring to FIG. 2A, in some implementations, upon launching the patient application 103 associated with the prescription digital therapeutic 120 prescribed to the patient 101, the patient application 103 displays a feeling selection GUI 204 that allows the patient 101 to input a particular feeling they are presently experiencing or have recently experienced. In the example shown, the feeling selection GUI 204 provides a plurality of feeling interface elements 205, each 205a-n associated with a corresponding feeling the patient 101 is experiencing or has recently experienced. While the example shown depicts interface elements 205a-205g, the patient 101 may view additional interface elements 205n by scrolling (e.g., via a swipe gesture). The plurality of feeling interface elements 205 may be prepopulated based on common feelings a typical patient diagnosed with MS may be experiencing. The patient 101 may indicate their current feelings by selecting the corresponding feeling interface element 205 displayed in the feeling selection GUI 204.
In the example shown, a first feeling interface element 205a ("Anxious") indicates that the patient 101 is feeling anxious, a second feeling interface element 205b ("Scared") indicates that the patient 101 is feeling scared, a third feeling interface element 205c ("Dreadful") indicates that the patient 101 is feeling dreadful, a fourth feeling interface element 205d ("Panicked") indicates that the patient 101 is feeling panicked, a fifth feeling interface element 205e ("Angry") indicates that the patient 101 is feeling angry, a sixth feeling interface element 205f ("Frustrated") indicates that the patient 101 is feeling frustrated, and a seventh feeling interface element 205g ("Grieved") indicates that the patient 101 is feeling grieved. [0076] [0076] The feeling interface elements 205a-205g do not represent an exhaustive list of all feeling interface elements, but rather an exemplary list of feeling interface elements that may be included as part of the feeling selection GUI 204. Furthermore, the feeling selection GUI 204 may include other feeling interface elements in addition to feeling interface elements 205a-205g, or may omit one or more of feeling interface elements 205a-205g.
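Each selection described in the following paragraphs is also reported to the multiple sclerosis therapy service 160 as time-stamped event data 122, and paragraph [0071] above notes that events are held in an encrypted queue on the device whenever network connectivity is unavailable. The snippet below sketches that store-and-forward pattern only in outline; the event fields, the EventQueue class, and the no-op transport are assumptions made for illustration, and the encryption itself is omitted.

import json
import time
from collections import deque

class EventQueue:
    """Holds selection events locally until they can be transmitted to the service."""
    def __init__(self, send):
        self._send = send            # callable that posts one serialized event
        self._pending = deque()      # stand-in for the encrypted on-device queue

    def record(self, kind, value):
        event = {"kind": kind, "value": value, "timestamp": time.time()}
        self._pending.append(json.dumps(event))   # serialize; encryption omitted here
        self.flush()

    def flush(self):
        while self._pending:
            try:
                self._send(self._pending[0])
            except ConnectionError:
                return               # keep queued events until connectivity returns
            self._pending.popleft()

queue = EventQueue(send=lambda payload: None)      # no-op transport for illustration
queue.record("feeling_selection", "Scared")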
The patient 101 may indicate the present intensity of their current feelings by moving a Slider button 238 to select a corresponding intensity. In some configurations, Slider button 238 translates up and down a Scale 241, and the position of Slider button 238 relative to the Scale 241 indicates a particular intensity. For example, the location of the Slider button 238 relative to the Scale 241 is reflected in an intensity value 239. The intensity value 239 will provide the patient 101 with a numerical percentage value of their intensity of their current feeling. For example, if the patient 101 translates the Slider button 238 more than half way up the Scale 241, the intensity value 239 will reflect a higher percentage value. As seen in FIG. 2B, the location of the Slider button 238 relative to the Scale indicates the intensity of the feeling scared that the patient 101 is feeling, and the intensity value 239 indicates that the patient 101 is 59% scared. [0081] [0081] With continued reference to FIG. 2B, in some configurations, the location of Slider button 238 relative to the Scale 241 will correspond to one of the plurality of intensities 208. The patient 101 may indicate a feeling intensity of the particular feeling that they are currently feeling by translating the Slider button 238 relative to the Scale 241 to correspond to one of the plurality of intensities 208 displayed in the feeling spectrum GUI 207. The plurality of intensities 208 correspond to the feeling selection input 206 that was selected in the prior GUI, feeling selection GUI 204. In the example shown, the plurality of intensities 208 correspond to the feeling of “scared”; a first intensity 208a (“Extremely”) indicates that the patient is feeling extremely scared, a second intensity 208b (“Very”) indicates that the patient is feeling very scared, a third intensity 208c (“Fairly”) indicates that the patient is feeling fairly scared, a fourth intensity 208d (“A little”) indicates that the patient is feeling a little scared, and the fifth intensity 208e (“Barely”) indicates that the patient is feeling barely scared. The intensities 208a-208e do not represent an exhaustive list of all 21 [0082] [0082] In the example shown, the patient device 102 detects a second sequence of inputs, the second sequence of inputs including a first feeling intensity input 209 (e.g., touch or spoken) that selects the intensity 208c, corresponding to the intensity value 239, indicating that they are feeling fairly scared. In some implementations, the first feeling intensity input 209 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient is presently feeling fairly scared. [0083] [0083] In some examples, the first feeling intensity input 209 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected feeling intensity. In other examples, the first feeling intensity input 209 causes the patient application 103 to modify the already- generated plurality of journal interface elements 231 to indicate the selected feeling intensity. [0084] [0084] After detecting selection of the plurality of intensities 208, the patient application 103 advances to display an automatic thought selection GUI 210 (FIG. 
2C) on the display 116 of the patient device 102. In some configurations, the first feeling intensity input 209 selecting one of the plurality of intensities 208 causes the patient application 103 to automatically display the automatic thought selection GUI 210. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected intensity by selecting a Feeling Spectrum Done Button 240 (e.g., as shown in FIG. 2B). In these configurations, the patient application 103 displays the automatic thought selection GUI 210 in response to a selection indication indicating selection of the Feeling Spectrum Done Button 240. [0085] [0085] At FIG. 2C, in some configurations, the patient application 103 causes the patient device 102 to display the automatic thought selection GUI 210 that allows the patient 101 to input a particular automatic thought corresponding to their thoughts. In the example shown, the automatic thought selection GUI 210 provides a plurality of automatic thought interface elements 211, each individual automatic thought interface element 211a-211n being associated with a corresponding automatic thought that the patient 101 may have recently had, or currently has. While the example shown depicts automatic thought interface elements 211a-211j, the patient 101 may view additional interface elements 211n by scrolling (e.g., via a swipe gesture). The automatic thoughts represent thoughts that are common in patients with MS. As depicted in FIG. 2C, in the example shown, the particular thoughts are negative thoughts that users with MS experience that can cause depressive symptoms. Displaying common automatic thoughts advantageously allows the patient 101 to identify a particular thought that the patient has that may be associated with one or more depressive symptoms. The plurality of automatic thought interface elements 211 may be prepopulated based on common automatic thoughts a typical patient diagnosed with MS may have had or currently has. The patient 101 may indicate the automatic thought associated with them by selecting the corresponding automatic thought interface element 211 displayed in the automatic thought selection GUI 210. In the example shown, a first automatic thought interface element 211a ("Relax and calm down") indicates that the patient 101 has or had the thought to relax and calm down, a second automatic thought interface element 211b ("When you get her/him going you can't stop her at all.") indicates that the patient 101 has or had the thought that when you get him/her going you can't stop her at all, a third automatic thought interface element 211c ("I need to calm down") indicates that the patient 101 has or had the thought that they need to calm down, a fourth automatic thought interface element 211d ("Why is my wife with me?") indicates that the patient 101 has or had the thought asking why their wife is still with them, a fifth automatic thought interface element 211e ("Why can't I have that?") indicates that the patient 101 has or had the thought asking why they can't have that, a sixth automatic thought interface element 211f ("I hate to bother people") indicates that the patient 101 has or had the thought that they hate to bother people, a seventh automatic thought interface element 211g indicates a further such thought, and so on. [0086] [0086] The automatic thought interface elements 211a-211j do not represent an exhaustive list of all automatic thought interface elements, but rather an exemplary list of automatic thought interface elements that may be included on the automatic thought selection GUI 210. Furthermore, the automatic thought selection GUI 210 may include other automatic thought interface elements in addition to automatic thought interface elements 211a-211j, or may omit one or more automatic thought interface elements 211a-211j.
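Paragraphs [0080] and [0081] above describe the feeling spectrum GUI 207 as reporting both a continuous percentage (the intensity value 239 driven by the position of the Slider button 238 on the Scale 241) and one of five discrete intensities 208a-208e. A simple way to reconcile the two readings, sketched below under the assumption that the slider position is normalized against the scale height, is to bucket the percentage into the five labels; the function names and thresholds are illustrative, not values taken from the disclosure.

INTENSITY_LABELS = ["Barely", "A little", "Fairly", "Very", "Extremely"]

def slider_to_percentage(position: float, scale_height: float) -> int:
    """Convert the slider's position along the scale into a 0-100 intensity value."""
    return round(100 * max(0.0, min(position, scale_height)) / scale_height)

def percentage_to_intensity(percentage: int) -> str:
    """Bucket the 0-100 value into one of the five discrete intensities (208a-208e)."""
    index = min(percentage * len(INTENSITY_LABELS) // 100, len(INTENSITY_LABELS) - 1)
    return INTENSITY_LABELS[index]

value = slider_to_percentage(position=295, scale_height=500)   # slider just over halfway up
print(value, percentage_to_intensity(value))                   # 59 Fairly

With these example values, a slider just over halfway up the scale yields the 59% intensity value from FIG. 2B and falls in the "Fairly" bucket, matching the intensity 208c selected in paragraph [0082].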
[0087] [0087] In the example shown, the patient device 102 detects a third sequence of inputs, the third sequence of inputs including an automatic thought selection input 212 (e.g., touch or spoken) corresponding to the automatic thought interface element 211f (“I hate to bother people”) indicating that the patient 101 has or has recently had the thought that they hate to bother people. In some implementations, the automatic thought selection input 212 causes the patient application 103 to transmit time- stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient 101 has or had the thought that they hate to bother people. [0088] [0088] In some examples, the automatic thought selection input 212 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected automatic thought. In other examples, the automatic thought selection input 212 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected automatic thought. [0089] [0089] After detecting selection of an automatic thought interface element 211, the patient application 103 advances to display an alternative thought selection GUI 213 (FIG. 2D) on the display 116 of the patient device 102. In some configurations, the automatic thought selection input 212 selecting the automatic thought interface element 211 causes the patient application 103 to automatically display the alternative thought selection GUI 213. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected automatic thought interface element 211 by selecting an Automatic Thought Selection Done Button 242. In these configurations, the patient application 103 displays the alternative thought selection GUI 213 in response to a selection indication indicating selection of the Automatic Thought Selection Done Button 242. [0090] [0090] At FIG. 2D, in some configurations, the patient application 103 causes the patient device 102 to display the alternative thought selection GUI 213 that allows a patient 101 to input a particular alternative thought corresponding to their thoughts. In the example shown, the alternative thought selection GUI 213 provides a plurality of alternative thought interface elements 214, each individual alternative thought interface element 214a-214n being associated with a corresponding alternative thought that the patient 101 can use to modify their thoughts and feelings. While the example shown depicts alternative thought interface elements 214a—214h, the patient 101 may view additional interface elements 214n by scrolling (e.g., via a swipe gesture). The alternative thoughts represent thoughts that can help users with MS modify their automatic thoughts by changing the distortion of their thoughts. The alternative thoughts reflect positive thoughts that patients with depressive symptoms associated with MS can think about to modify their automatic thought(s) that are related to their depressive symptoms. The plurality of alternative thought interface elements 214 may be prepopulated based on recommended alternative thoughts a typical patient diagnosed with MS would find beneficial to think about in order to modify automatic thought(s). 
The patient 101 may indicate the alternative thought that they would like to use to modify their feelings and thoughts by selecting the corresponding alternative thought interface element 214 displayed in the alternative thought selection GUI 213. In the example shown, a first alternative thought interface element 214a ("I'm going to get through this eventually") indicates that the patient 101 can adopt the thought that they are going to get through this eventually, and so on for the remaining alternative thought interface elements. [0091] [0091] The alternative thought interface elements 214a-214h do not represent an exhaustive list of all alternative thought interface elements, but rather an exemplary list of alternative thought interface elements that may be included on the alternative thought selection GUI 213. Furthermore, the alternative thought selection GUI 213 may include other alternative thought interface elements in addition to alternative thought interface elements 214a-214h, or may omit one or more alternative thought interface elements 214a-214h. [0092] [0092] In the example shown, the patient device 102 detects a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input 215 (e.g., touch or spoken) corresponding to a particular alternative thought interface element 214. [0093] [0093] In some examples, the alternative thought selection input 215 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected alternative thought. In other examples, the alternative thought selection input 215 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected alternative thought. [0094] [0094] After detecting selection of an alternative thought interface element 214, the patient application 103 advances to display the feeling spectrum GUI 207 (FIG. 2E) on the display 116 of the patient device 102. In some configurations, the alternative thought selection input 215 selecting the alternative thought interface element 214 causes the patient application 103 to automatically display the feeling spectrum GUI 207. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected alternative thought interface element 214 by selecting an Alternative Thought Selection Done Button 243 (e.g., as shown in FIG. 2D). In these configurations, the patient application 103 displays the feeling spectrum GUI 207 in response to a selection indication indicating selection of the Alternative Thought Selection Done Button 243. [0095] [0095] At FIG. 2E, in some configurations, the patient application 103 causes the patient device 102 to display again the feeling spectrum GUI 207 that allows a patient 101 to, again, input a feeling intensity of the particular feeling that they are presently experiencing or recently felt. In the example shown, the patient device 102 detects a fifth sequence of inputs, the fifth sequence of inputs including a second feeling intensity input 216 (e.g., touch or spoken) that selects the fifth intensity 208e, corresponding to an updated intensity value 244, indicating that they are feeling barely scared.
In some implementations, the second feeling intensity input 216 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient is presently feeling barely scared. [0096] [0096] In some examples, the second feeling intensity input 216 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating at least any difference between the first feeling intensity input 209 and the second feeling intensity input 216 (e.g., as reflected through a percentage decrease or the like). In other examples, the second feeling intensity input 216 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate at least any difference between the first feeling intensity input 209 and the second feeling intensity input 216. [0097] [0097] After detecting selection of one of the plurality of intensities 208, the patient application 103 advances to display a next GUI on the display 116 of the patient device 102. In some configurations, the second feeling intensity input 216 selecting one of the plurality of intensities 208 causes the patient application 103 to automatically display the next GUI. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected one of the plurality of intensities 208 by selecting the Feeling Spectrum Done Button 240. In these configurations, the patient application 103 displays the next GUI in response to a selection indication indicating selection of the Feeling Spectrum Done Button 240. [0098] [0098] Referring now to FIGs. 2F-2M, the patient application 103 may display some or all of the GUIs corresponding to the figures. The GUIs corresponding to FIGs. 2F-2M may be displayed, if at all, in any particular order at any time the patient 101 interacts with the patient application 103. [0099] [0099] At FIGs. 2F-2H, in some configurations, the patient application 103 causes the patient device 102 to display a thinking traps GUI 217 that allows the patient 101 to input a thinking trap associated with the particular thoughts they are having. In the examples shown, the thinking traps GUI 217 provides a plurality of thinking trap interface elements 218, each individual thinking trap interface element 218a-218n being associated with a corresponding thinking trap the patient 101 may be presently thinking or has recently thought. It should be noted that while the example shown depicts a limited number of thinking trap interface elements 218, the patient 101 may view additional thinking trap interface elements 218n by scrolling (e.g., via a swipe gesture). [00100] [00100] In the example shown, the patient device 102 detects a sixth sequence of inputs, the sixth sequence of inputs including a thinking trap selection input 219a (e.g., touch or spoken) corresponding to a Sounds Like Me Button 245a that corresponds to the thinking trap interface element 218a ("Overgeneralizing") indicating that the patient 101 is overgeneralizing. In some implementations, the patient 101 can select one or more thinking trap interface elements by selecting more than one Sounds Like Me Button 245, each Sounds Like Me Button 245 corresponding to a thinking trap interface element 218. In other implementations, the patient 101 may opt not to select any thinking trap interface elements.
In an example in which the patient opts to select one or more thinking trap interface elements, the patient 101 could select the Sounds Like Me Button 245a that corresponds to the thinking trap interface element 218a. [00101] [00101] In some implementations, the thinking trap selection input 219a causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient is presently overgeneralizing. [00102] [00102] In some examples, the thinking trap selection input 219 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected thinking trap. In other examples, the thinking trap selection input 219 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected thinking trap. [00103] [00103] In some examples, a company selection GUI 221 (FIG. 2I) is provided on the display 116 of the patient device 102. The patient application 103 may advance to the company selection GUI 221, according to one example, in response to the patient 101 selecting one or more thinking trap interface elements 218a-218b. In some configurations, the thinking trap selection input 219 selecting the Sounds Like Me Button 245 causes the patient application 103 to automatically display the company selection GUI 221. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected thinking trap interface element 218 by selecting a Done button 246 (e.g., as shown in FIG. 2F). In these configurations, the patient application 103 displays the company selection GUI 221 in response to a selection indication indicating selection of the Done button 246. [00104] [00104] At FIG. 2I, in some configurations, the patient application 103 causes the patient device 102 to display the company selection GUI 221 that allows a patient 101 to input the company that they were with when they felt the particular feeling. In the examples shown, the company selection GUI 221 provides a plurality of company interface elements 233, each individual company interface element 233a-n being associated with a corresponding person (as identified by relationship type) that the patient 101 may have been with prior to, or when experiencing, the particular feeling. [00105] [00105] The company interface elements 233a-e do not represent an exhaustive list of all company interface elements, but rather an exemplary list of company interface elements that may be included on company selection GUI 221. Furthermore, company selection GUI 221 may include other company interface elements in addition to company interface elements 233a-233e, or may omit one or more of company interface elements 233a-233e. [00106] [00106] In the example shown, the patient device 102 detects a seventh sequence of inputs, the seventh sequence of inputs including a company selection input 223 (e.g., touch or spoken) corresponding to the company interface element 233d (“My Sibling”) indicating that the patient 101 was with their sibling when they felt the particular feeling. In some implementations, the company selection input 223 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG.
1) that includes a selection indication indicating that the patient was with their sibling when they felt the particular feeling. [00107] [00107] In some examples, the company selection input 223 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected company. In other examples, the company selection input 223 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected company. [00108] [00108] In some examples, a location selection GUI 224 (FIG. 2J) is provided on the display 116 of the patient device 102. The patient application 103 may advance to the location selection GUI 224, according to one example, in response to the patient 101 selecting a company interface element 233. In some configurations, the company selection input 223 selecting the company interface element 233 causes the patient application 103 to automatically display the location selection GUI 224. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected company interface element 233 by selecting a Company Selection Done Button 247 (e.g., as shown in FIG. 2I). In these configurations, the patient application 103 displays the location selection GUI 224 in response to a selection indication indicating selection of the Company Selection Done Button 247. [00109] [00109] At FIG. 2J, in some configurations, the patient application 103 causes the patient device 102 to display the location selection GUI 224 that allows a patient 101 to input the location that patient 101 was at prior to, or when, the patient 101 felt the particular feeling. In the examples shown, the location selection GUI 224 provides a plurality of location interface elements 225, each individual location interface element 225a-n being associated with a corresponding location that the patient 101 may have been at prior to, or when, experiencing the particular feeling. While the example shown depicts location interface elements 225a-225e, the patient 101 may view additional location interface elements 225n by scrolling (e.g., via a swipe gesture). The plurality of location interface elements 225 may be prepopulated based on locations commonly frequented by patients diagnosed with MS. The patient 101 may indicate the location that they were at prior to, or when, they experienced the particular feeling by selecting the corresponding location interface element 225 displayed in the location selection GUI 224. In the example shown, a first location interface element 225a (“Home”) indicates that the patient 101 was at home when 32 [00110] [00110] The location interface elements 225a-e do not represent an exhaustive list of all location interface elements, but rather an exemplary list of location interface elements that may be included on location selection GUI 224. Furthermore, location selection GUI 224 may include other location interface elements in addition to location interface elements 225a-225e, or may omit one or more of location interface elements 225a—225e. 
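As a rough sketch of the time-stamped event data 122 mentioned above, a selection indication might be packaged as a small record and encoded before transmission to the multiple sclerosis therapy service 160. The field names and wire format below are hypothetical; the disclosure only specifies that the data is time-stamped and carries a selection indication.

```kotlin
import java.time.Instant

// Hypothetical sketch of the time-stamped event data 122 that the patient
// application might send to the multiple sclerosis therapy service 160 when a
// selection is made. Field names and the encoding are assumptions.

data class SelectionEvent(
    val timestamp: Instant,
    val screen: String,      // e.g., "company_selection"
    val selection: String    // e.g., "My Sibling"
)

// A deliberately simple, dependency-free encoding for illustration only.
fun SelectionEvent.toPayload(): String =
    """{"timestamp":"$timestamp","screen":"$screen","selection":"$selection"}"""

fun main() {
    val event = SelectionEvent(Instant.now(), "company_selection", "My Sibling")
    println(event.toPayload())   // would be transmitted to the therapy service
}
```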
[00111] [00111] In the example shown, the patient device 102 detects an eighth sequence of inputs, the eighth sequence of inputs including a location selection input 226 (e.g., touch or spoken) corresponding to the location interface element 225d (“Commute”) indicating that the patient 101 was commuting to or from a location when they felt the particular feeling. In some implementations, the location selection input 226 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient was commuting to or from a location when they experienced the particular feeling. [00112] [00112] In some examples, the location selection input 226 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected location. In other examples, the location selection input 226 causes the patient application 103 to modify the already-generated plurality of journal interface elements 231 to indicate the selected location. [00113] [00113] In some examples, an MS symptom selection GUI 227 (FIG. 2K) is provided on the display 116 of the patient device 102. The patient application 103 may advance to the MS symptom selection GUI 227, according to one example, in response to the patient 101 selecting a location interface element 225. [00114] [00114] At FIG. 2K, in some configurations, the patient application 103 causes the patient device 102 to display the MS symptom selection GUI 227 that allows a patient 101 to input one or more MS symptoms that they experienced associated with the particular feeling. In the examples shown, the MS symptom selection GUI 227 provides a plurality of MS symptom interface elements 228, each individual MS symptom interface element 228a-n being associated with a corresponding symptom that the patient 101 may have experienced associated with the particular feeling. While the example shown depicts MS symptom interface elements 228a-228h, the patient 101 may view additional MS symptom interface elements 228n by scrolling (e.g., via a swipe gesture). The plurality of MS symptom interface elements 228 may be prepopulated based on MS symptoms a patient diagnosed with MS may experience related to the selected feeling (e.g., as selected through GUI 204 shown at FIG. 2A). The patient 101 may indicate the MS symptom that they experienced associated with the particular feeling by selecting the corresponding MS symptom interface element 228 displayed in the MS symptom selection GUI 227.
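One possible way to prepopulate the plurality of MS symptom interface elements 228 based on the feeling selected through GUI 204 is sketched below. The feeling-to-symptom ordering is illustrative only and is not taken from the disclosure.

```kotlin
// Sketch only: one way the plurality of MS symptom interface elements 228 might
// be prepopulated based on the feeling selected earlier (GUI 204, FIG. 2A).
// The ordering hints below are hypothetical.

val defaultSymptoms = listOf(
    "Relapse", "Fatigue", "Brain Fog", "Tremor",
    "Focus", "Memory", "Balance Problems", "Vision Problems"
)

// Hypothetical per-feeling ordering hints; any feeling not listed falls back
// to the default ordering.
val symptomOrderingByFeeling = mapOf(
    "Scared" to listOf("Relapse", "Tremor", "Vision Problems"),
    "Anxious" to listOf("Fatigue", "Brain Fog", "Focus")
)

fun prepopulateSymptoms(selectedFeeling: String): List<String> {
    val prioritized = symptomOrderingByFeeling[selectedFeeling].orEmpty()
    // Show the prioritized symptoms first, then the remaining defaults.
    return prioritized + defaultSymptoms.filterNot { it in prioritized }
}

fun main() {
    println(prepopulateSymptoms("Scared"))
}
```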
In the example shown, a first MS symptom interface element 228a (“Relapse”) indicates that the patient 101 had a relapse associated with the particular feeling, a second MS symptom interface element 228b (“Fatigue”) indicates that the patient 101 experienced fatigue associated with the particular feeling, a third MS symptom interface element 228c (“Brain Fog”) indicates that the patient 101 experienced brain fog associated with the particular feeling, a fourth MS symptom interface element 228d (“Tremor”) indicates that the patient 101 experienced at least one tremor associated with the particular feeling, a fifth MS symptom interface element 228e (“Focus”) indicates that the patient 101 experienced 34 [00115] [00115] The MS symptoms interface elements 228a-h do not represent an exhaustive list of all MS symptom interface elements, but rather an exemplary list of symptom interface elements that may be included on MS symptom selection GUI [00116] [00116] In the example shown, the patient device 102 detects a ninth sequence of inputs, the ninth sequence of inputs including a MS symptom selection input 229 (e.g., touch or spoken) corresponding to the MS symptom interface element 228d (“Tremor”) indicating that the patient 101 felt one or more tremors when they experienced the particular feeling. In some implementations, the MS symptom selection input 229 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient felt tremors when they experienced the particular feeling. [00117] [00117] In some examples, the MS symptom selection input 229 causes the patient application 103 to generate, for display on the patient device 102, a journal interface element of the plurality of journal interface elements 231 (FIG. 2M), the journal interface element indicating the selected MS symptom. In other examples, the MS symptom selection input 229 causes the patient application 103 to modify the already- generated plurality of journal interface elements 231 to indicate the selected MS symptom. [00118] [00118] In some examples, a journal GUI 230 (FIG. 2M) is provided on the display 116 of the patient device 102. The patient application 103 may advance to the journal GUI 230, according to one example, in response to the patient 101 selecting a MS [00119] [00119] At FIG. 2M, in some configurations, the patient application 103 causes the patient device 102 to display the journal GUI 230 that allows a patient 101 to view information corresponding to a history of past interactions between the patient 101 and the patient application 103. In the examples shown, the journal GUI 230 provides a timestamp interface element 232 associated with a particular time and date that the patient application recorded the interaction between the patient 101 and the patient application 103, a plurality of journal interface elements 231, each individual journal interface element being associated with corresponding journal information that the patient 101 may have entered in while interacting with the patient application 103. While the example shown depicts journal interface elements 231a-231h, the patient 101 may view additional journal interface elements 231n by scrolling (e.g., via a swipe gesture). 
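A minimal, assumed data model for a record shown on the journal GUI 230, pairing the timestamp interface element 232 with a list of labeled journal interface elements, might look like the following. Only the displayed labels and example values are drawn from the description; the type and field names are hypothetical.

```kotlin
import java.time.LocalDateTime

// A minimal, assumed data model for the journal GUI 230: a timestamp plus a
// list of journal interface elements, each with a label and a value.

data class JournalElement(val label: String, val value: String)

data class JournalRecord(
    val recordedAt: LocalDateTime,
    val elements: List<JournalElement>
)

fun main() {
    val record = JournalRecord(
        recordedAt = LocalDateTime.of(2019, 1, 30, 14, 58),
        elements = listOf(
            JournalElement("Start Feeling", "Scared (59%)"),
            JournalElement("Who I Was With", "Alone"),
            JournalElement("Where", "Doctor's office")
        )
    )
    println("${record.recordedAt}: ${record.elements.size} entries")
}
```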
The plurality of journal interface elements 231 may be prepopulated based on interactions between the patient 101 and the patient application 103 at the time and day corresponding to the timestamp interface element 232. The patient 101 may view past interactions between the patient 101 and the patient application 103. In the example shown, at a time and day corresponding to the timestamp interface element 232 (“January 30th 2019, 2:58pm”), a first journal interface element 231a (“Start Feeling”) indicates that the patient 101 first selected the scared feeling and a feeling intensity of 59%, a second journal interface element 231b (“Who I Was With”) indicates that the patient 101 was alone when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 232, a third journal interface element 231c (“Where”) indicates that the patient 101 was at the doctor’s office when they interacted with the patient application 103 at the time and day corresponding to the timestamp interface element 232, a fourth journal 36 [00120] [00120] The journal interface elements 231a-h do not represent an exhaustive list of all journal interface elements, but rather an exemplary list of journal interface elements that may be included on journal GUI 230. Furthermore, journal GUI 230 may include other journal interface elements in addition to journal interface elements 231a-231h, or may omit one or more of journal interface elements 231a—231h. [00121] [00121] AtFIG. 2L, in some configurations, the patient application 103 causes the patient device 102 to display a recap interface element 220. This may occur at any point during the interaction between the patient 101 and the patient application 103, but in the example shown, occurs at least after the patient 101 has selected an automatic thought and one or more thinking traps. In the example shown, the recap interface element 220 provides information to patient 101 corresponding to an automatic thought and a thinking trap selected by the patient 101 while the patient 101 interacted with the patient application 103. The information in the recap interface element 220 does not represent an exhaustive list of all information capable of representation in the recap interface element 220, but rather an example of the type of information that can be presented in the recap interface 220. Furthermore, the recap 37 [00122] [00122] At FIG. 2N, in some configurations, the patient application 103 causes the patient device 102 to display a positive feeling selection GUI 250 that allows the patient 101 to input a particular feeling they are presently experiencing, or has recently experienced. In the example shown, the positive feeling selection GUI 250 provides a plurality of positive feeling interface elements 251, each 251a-n associated with a corresponding feeling the patient 101 is experiencing or has recently experienced. While the example shown depicts interface elements 251a-251h, the patient 101 may view additional interface elements 251n by scrolling (e.g., via a swipe gesture). The plurality of positive feelings interface elements 251 may be prepopulated based on common feelings a typical patient with MS may be experiencing. The patient 101 may indicate their current feelings by selecting the corresponding positive feeling interface element 251 displayed in the positive feeling selection GUI 250. 
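The recap interface element 220 described above with regard to FIG. 2L might, as a sketch, compose the selected automatic thought and thinking traps into a single summary string; the wording, function name, and example values below are hypothetical.

```kotlin
// Illustrative sketch (not the disclosure's wording) of how the recap interface
// element 220 might summarize the selected automatic thought and thinking traps.

fun buildRecapText(automaticThought: String, thinkingTraps: List<String>): String {
    val traps = if (thinkingTraps.isEmpty()) "no thinking traps"
                else thinkingTraps.joinToString(" and ")
    return "You noticed the thought \"$automaticThought\" and identified $traps."
}

fun main() {
    // Hypothetical example values.
    println(buildRecapText("(an automatic thought)", listOf("Overgeneralizing")))
}
```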
In the example shown, a first positive feeling interface element 251a (“Calm”) indicates that the patient 101 is feeling calm, a second positive feeling interface element 251b (“Okay”) indicates that the patient 101 is feeling okay, a third positive feeling interface element 251c (“Proud”) indicates that the patient 101 is feeling proud, a fourth positive feeling interface element 251d (“Hopeful”) indicates that the patient 101 is feeling hopeful, a fifth positive feeling interface element 251e (“Happy”) indicates that the patient 101 is feeling happy, a sixth positive feeling interface element 251f (“Optimistic”) indicates that the patient 101 is feeling optimistic, a seventh positive feeling interface element 251g (“Determined”) indicates that the patient 101 is feeling determined, and an eighth positive feeling interface element 251h (“Grateful”) indicates that the patient 101 is feeling grateful. [00123] [00123] The positive feeling interface elements 251a-251h do not represent an exhaustive list of all positive feeling interface elements, but rather an exemplary list of positive feeling interface elements that may be included as part of the positive feeling selection GUI 250. Furthermore, the positive feeling selection GUI 250 may include other positive feeling interface elements in addition to positive feeling interface elements 251a-251h, or may omit one or more of positive feeling interface elements 251a-251h, without departing from the teachings herein. [00124] [00124] In the example shown, the patient device 102 detects a tenth sequence of inputs, the tenth sequence of inputs including a positive feeling selection input 254 (e.g., touch or spoken) corresponding to the positive feeling interface element 251c (“Proud”) indicating they are feeling proud. In some implementations, the positive feeling selection input 254 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient is presently feeling proud. [00125] [00125] In some examples, the positive feeling selection input 254 causes the patient application 103 to generate, for display on the patient device 102, a positive journal interface element of a plurality of journal interface elements 260 (FIG. 2Q), the positive journal interface element indicating the selected feeling. In other examples, the positive feeling selection input 254 causes the patient application 103 to modify the already-generated plurality of positive journal interface elements 260 to indicate the selected feeling. [00126] [00126] After detecting selection of a positive feeling interface element 251, in some embodiments, the patient application 103 advances to display a situation selection GUI 255 (FIG. 2O) on the display 116 of the patient device 102. In some configurations, the positive feeling selection input 254 selecting the positive feeling interface element 251 causes the patient application 103 to automatically display the situation selection GUI 255. In other configurations, the patient application 103 requires the patient 101 to first confirm the selected positive feeling interface element 251 by selecting a Positive Feeling Selection Done Button 253 (e.g., as shown in FIG. 2O). In these configurations, the patient application 103 displays the situation selection GUI 255 in response to a selection indication indicating selection of the Positive Feeling Selection Done Button 253. [00127] [00127] At FIG.
20, in some configurations, the patient application 103 causes the patient device 102 to display the situation selection GUI 255 that allows the patient 101 to input a situation corresponding to what they did. The situation may correspond to an activity the patient 101 did recently. The situation may also correspond to an 39 [00128] [00128] The situation interface elements 256a-256k do not represent an exhaustive list of all situation interface elements, but rather an exemplary list of situation interface elements that may be included on the situation selection GUI 255. Furthermore, the situation selection GUI 255 may include other situation interface 40 [00129] [00129] In the example shown, the patient device 102 detects an eleventh sequence of inputs, the eleventh sequence of inputs including a situation selection input 257 (e.g., touch or spoken) corresponding to the situation interface 256e (“Ate healthy”) indicating that the patient 101 ate healthy. In some implementations, the situation selection input 257 causes the patient application 103 to transmit time-stamped event data 122 to the multiple sclerosis therapy service 160 (FIG. 1) that includes a selection indication indicating that the patient 101 ate healthy. [00130] [00130] In some examples, the situation selection input 257 causes the patient application 103 to generate, for display on the patient device 102, a positive journal interface element of a plurality of journal interface elements 260 (FIG. 2Q), the positive journal interface element indicating the selected situation. In other examples, the situation selection input 257 causes the patient application 103 to modify the already-generated plurality of positive journal interface elements 260 to indicate the selected situation. [00131] [00131] AtFIG. 2P, in some configurations, the patient application 103 causes the patient device 102 to display a positive reflection element 258. This may occur at any point during the interaction between the patient 101 and the patient application 103, but in the example shown, occurs at least after the patient 101 has selected a positive feeling and a situation. In the example shown, the positive reflection element 258 provides information to patient 101 corresponding to a positive feeling and a situation selected by the patient 101 while the patient 101 interacted with the patient application 103. The information in the positive reflection element 258 does not represent an exhaustive list of all information capable of representation in the positive reflection element 258, but rather an example of the type of information that can be presented in the positive reflection element 258. Furthermore, the positive reflection element 258 may include other information in addition to the information depicted in the example in FIG. 2P, or may omit information depicted in the example in FIG. 2P. [00132] [00132] At FIG. 2Q, in some configurations, the patient application 103 causes the patient device 102 to display a positive journal GUI 259 that allows a patient 101 to view information corresponding to a history of past interactions between the patient 41 [00133] [00133] The positive journal interface elements 260a-e do not represent an exhaustive list of all journal interface elements, but rather an exemplary list of 42 [00134] [00134] FIG. 3 is a flow chart illustrating a method 300 for treating depressive symptoms associated with multiple sclerosis in accordance with an example implementation of the disclosed technology. 
According to one example, the method 300 may be performed by an electronic device, such as the patient device 102. The method 300 begins at block 302 where a feeling selection interface (e.g., the feeling selection GUI 204) is displayed. The feeling selection interface presents a plurality of feeling interface elements (e.g., the plurality of feeling interface elements 205), each feeling interface element being associated with a particular feeling. At block 304, a first sequence of inputs including a feeling selection input (e.g., the feeling selection input 206) is received. The feeling selection input corresponds to a particular feeling interface element (e.g., the second feeling interface element 205b). At block 306, the electronic device displays a feeling spectrum interface (e.g., the feeling spectrum GUI 207). The feeling spectrum interface presents a plurality of intensities (e.g., the plurality of intensities 208) associated with the particular feeling. [00135] [00135] At block 308, the electronic device receives a second sequence of inputs including a first feeling intensity input (e.g., the first feeling intensity input 209). The first feeling intensity input corresponds to a first intensity (e.g., the third intensity 208c) of the plurality of intensities. At block 310, the electronic device displays an automatic thought selection interface (e.g., the automatic thought selection GUI 210). The automatic thought selection interface presenting a plurality of automatic thought interface elements (e.g., the plurality of automatic thought interface elements 211). Each automatic thought interface element is associated with a particular automatic thought. At block 312, the electronic device receives a third sequence of inputs including an automatic thought selection input (e.g., the automatic thought selection input 212). The automatic thought selection input corresponds to a particular automatic thought interface element. At block 314, the electronic device displays an alternative thought selection interface (e.g., the alternative thought selection GUI 213). The alternative thought selection interface presents a plurality of alternative 43 [00136] [00136] At block 316, the electronic device receives a fourth sequence of inputs including an alternative thought selection input (e.g., the alternative thought selection input 215). The alternative thought selection input corresponds to a particular alternative thought interface element. At block 318, the electronic device displays the feeling spectrum interface. At block 320, the electronic device receives a fifth sequence of inputs including a second feeling intensity input (e.g., the second feeling intensity input 216). The second feeling intensity input corresponds to a second intensity (e.g., the fifth intensity 208e) of the plurality of intensities. At block 322, the electronic device generates a journal entry (e.g., the eighth journal interface element 231h). The journal entry indicates at least any difference between the first feeling intensity input and the second feeling intensity input. Following block 322, the method 300 concludes. [00137] [00137] FIG. 4 is a flow chart illustrating another method 400 for treating depressive symptoms associated with multiple sclerosis in accordance with an example implementation of the disclosed technology. According to one example, the method 400 may be performed by an electronic device, such as the patient device 102. 
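A compressed, non-interactive sketch of method 300 (FIG. 3), blocks 302 through 322, described above, is shown below with the user inputs simulated as constants. It is a paraphrase for illustration, not the claimed implementation; the automatic thought text and the second intensity value are assumptions.

```kotlin
// Compressed sketch of method 300 (FIG. 3), blocks 302-322, with simulated inputs.

data class JournalEntry(val text: String)

fun method300Sketch(): JournalEntry {
    // Blocks 302-304: display feeling selection interface, receive feeling selection.
    val feeling = "Scared"                                  // simulated input 206
    // Blocks 306-308: display feeling spectrum, receive first intensity.
    val firstIntensity = 59                                 // simulated input 209
    // Blocks 310-312: display automatic thoughts, receive selection.
    val automaticThought = "(simulated automatic thought)"  // simulated input 212
    // Blocks 314-316: display alternative thoughts, receive selection.
    val alternativeThought = "I'm going to get through this eventually" // input 215
    // Blocks 318-320: redisplay feeling spectrum, receive second intensity.
    val secondIntensity = 20                                // simulated input 216
    // Block 322: generate a journal entry indicating at least any difference.
    val delta = secondIntensity - firstIntensity
    return JournalEntry(
        "Feeling: $feeling; thought \"$automaticThought\" reframed as " +
        "\"$alternativeThought\"; intensity changed by $delta points."
    )
}

fun main() = println(method300Sketch().text)
```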
[00138] [00138] At block 406, the electronic device identifies a plurality of potential automatic thoughts based on the feeling associated with the user (e.g., as shown in FIG. 2C). Each potential automatic thought of the plurality of potential automatic thoughts correspond to a negative thought. At block 408, the electronic device receives automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts (e.g., as shown in FIG. 20). [00139] [00139] Atblock 410, the electronic device identifies a plurality of potential alternative thoughts based on the automatic thought selection data (e.g., as shown in 44 [00140] [00140] At block 414, the electronic device receives second feeling intensity data describing a second intensity of the feeling associated with the user (e.g., as shown in FIG. 2E). At block 416, the electronic device determines any difference between the first intensity and the second intensity to provide feeling intensity difference data. At block 418, the electronic device displays the feeling intensity difference data (e.g., as shown in FIG. 2M). Following block 418, the method 400 concludes. [00141] [00141] FIG. 5 is schematic view of an example electronic device 500 (e.g., a computing device) that may be used to implement the systems and methods described in this document. The electronic device 500 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document. [00142] [00142] The electronic device 500 includes a processor 510, memory 520, a storage device 530, a high-speed interface/controller 540 connecting to the memory 520 and high-speed expansion ports 550, and a low speed interface/controller 560 connecting to a low speed bus 570 and a storage device 530. Each of the components 510, 520, 530, 540, 550, and 560, is interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate. The processor 510 can process instructions for execution within the electronic device 500, including instructions stored in the memory 520 or on the storage device 530 to display graphical information for a graphical user interface (GUI) on an external input/output device, such as display 580 coupled to high speed interface 540. In other implementations, multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory. Also, multiple electronic device 500 may be connected, with each device providing portions of the 45 [00143] [00143] The memory 520 stores information non-transitorily within the electronic device 500. The memory 520 may be a computer-readable medium, a volatile memory unit(s), or non-volatile memory unit(s). The non-transitory memory 520 may be physical devices used to store programs (e.g., sequences of instructions) or data (e.g., program state information) on a temporary or permanent basis for use by the electronic device 500. 
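The data-level flow of method 400 (FIG. 4), blocks 406 through 418, described above, might be sketched as follows, with hypothetical lookup tables standing in for the thought databases alluded to elsewhere in the disclosure and with the selection and intensity values simulated.

```kotlin
// Sketch of the data-level flow of method 400 (FIG. 4). The thought tables and
// all example values are hypothetical; this is not an implementation of the claims.

val automaticThoughtsByFeeling = mapOf(
    "Scared" to listOf("Something is badly wrong", "This will never get better")
)
val alternativeThoughtsByAutomaticThought = mapOf(
    "This will never get better" to listOf("Hard days pass; I have gotten through them before")
)

fun feelingIntensityDifference(first: Int, second: Int): Int = second - first

fun main() {
    val feeling = "Scared"
    // Block 406: identify potential automatic thoughts based on the feeling.
    val automatic = automaticThoughtsByFeeling.getValue(feeling)
    // Block 408: receive automatic thought selection data (simulated here).
    val selectedAutomatic = automatic[1]
    // Block 410: identify potential alternative thoughts based on that selection.
    val alternatives = alternativeThoughtsByAutomaticThought.getValue(selectedAutomatic)
    // First and second feeling intensity data (simulated; see blocks 414-416).
    val first = 59
    val second = 20
    // Blocks 416-418: determine and display the feeling intensity difference data.
    println("Alternative offered: ${alternatives.first()}")
    println("Intensity difference: ${feelingIntensityDifference(first, second)}")
}
```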
Examples of non-volatile memory include, but are not limited to, flash memory and read-only memory (ROM) / programmable read-only memory (PROM) / erasable programmable read-only memory (EPROM) / electronically erasable programmable read-only memory (EEPROM) (e.g., typically used for firmware, such as boot programs). Examples of volatile memory include, but are not limited to, random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), phase change memory (PCM) as well as disks or tapes. [00144] [00144] The storage device 530 is capable of providing mass storage for the electronic device 500. In some implementations, the storage device 530 is a computer- readable medium. In various different implementations, the storage device 530 may be a floppy disk device, a hard disk device, an optical disk device, or a tape device, a flash memory or other similar solid state memory device, or an array of devices, including devices in a storage area network or other configurations. In additional implementations, a computer program product is tangibly embodied in an information carrier. The computer program product contains instructions that, when executed, perform one or more methods, such as those described above. The information carrier is a computer- or machine-readable medium, such as the memory 520, the storage device 530, or memory on processor 510. [00145] [00145] The high speed controller 540 manages bandwidth-intensive operations for the electronic device 500, while the low speed controller 560 manages lower bandwidth-intensive operations. Such allocation of duties is exemplary only. In some implementations, the high-speed controller 540 is coupled to the memory 520, the display 580 (e.g., through a graphics processor or accelerator), and to the high-speed expansion ports 550, which may accept various expansion cards (not shown). [00146] [00146] The electronic device 500 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a standard server 500a or multiple times in a group of such servers 500a, as a laptop computer 500b, or as part of a rack server system 500c. [00147] [00147] Referring now to FIG. 6, one example of a digital therapeutic 600 for treating depressive symptoms associated with multiple sclerosis is illustrated, in functional block form. As shown, the digital therapeutic 600 includes a feeling assessment module 604, an automatic thought identification module 606, an alternative thought identification module 614, a feeling intensity module 622, a thinking traps module 634, a company module 644, a location module 648, a multiple sclerosis symptom module 652, a journal module 654, and a display module 630. According to one example, the digital therapeutic 600 may be implemented as a computer program executed on an electronic device, such as device 102. According to this example, executing the computer program on the electronic device may serve to administer therapeutic treatment to a user of the electronic device in a manner designed to mitigate, or alleviate, depressive symptoms associated with multiple sclerosis. [00148] [00148] In operation, the digital therapeutic 600 may function as follows. The feeling assessment module 604 is configured to receive feeling assessment data 602 (e.g. input 206; block 304). The feeling assessment data 602 may constitute data describing a feeling associated with a user (e.g., anxious, scared, dreadful, etc.). 
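How the modules named with respect to FIG. 6 might be wired together is sketched below in skeletal form; the interface shapes and method names are assumptions, and only the module names and the general direction of data flow come from the description.

```kotlin
// Skeletal, hypothetical wiring of a few of the modules of the digital
// therapeutic 600 (FIG. 6). Method names and shapes are assumed.

class FeelingAssessmentModule { fun receive(feeling: String) = feeling }

class FeelingIntensityModule {
    fun difference(first: Int, second: Int): Int = second - first
}

class JournalModule {
    fun generateEntry(feeling: String, intensityDifference: Int): String =
        "Feeling: $feeling; intensity change: $intensityDifference"
}

class DisplayModule {
    fun render(journalEntry: String) = println(journalEntry)
}

fun main() {
    val feeling = FeelingAssessmentModule().receive("Scared")   // feeling assessment data 602
    val diff = FeelingIntensityModule().difference(59, 20)      // data 624/626 -> difference data 628
    val entry = JournalModule().generateEntry(feeling, diff)    // journal entry 656
    DisplayModule().render(entry)                               // display data 632
}
```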
According to one example, the feeling assessment data 602 may be provided to the feeling assessment module 604 via user input as discussed, for example, with regard to FIG. 2A above. [00149] [00149] The automatic thought identification module 606 is configured to receive the feeling assessment data 602 from the feeling assessment module 604. In addition, the automatic thought identification module 606 is configured to identify a plurality of potential automatic thoughts 608 based on the feeling assessment data 602. By way of example and not limitation, the plurality of potential automatic thoughts 608 may be identified from within a database or the like (not shown) storing a variety of automatic thoughts. Each potential automatic thought of the plurality of potential automatic thoughts 608 may correspond to a negative thought (although, according to some 47 [00150] [00150] The alternative thought identification module 614 is configured to receive the automatic thought selection data 612. In addition, the alternative thought identification module 614 is configured to identify a plurality of potential alternative thoughts 616 based on the automatic thought selection data 612. By way of example and not limitation, the plurality of potential alternative thoughts 616 may be identified from within a database or the like (not shown) storing a variety of alternative thoughts. Each potential alternative thought of the plurality of potential alternative thoughts 616 may correspond to a positive thought. Further, the alternative thought identification module 614 is configured to receive alternative thought selection data 620 (e.g., input 215; block 316). The alternative thought selection data 620 may identify a particular potential alternative thought 618 from among the plurality of potential alternative thoughts 616. According to one example, the alternative thought selection data 620 may be provided to the alternative thought identification module 614 via user input as discussed, for example, with regard to FIG. 2D above. [00151] [00151] The feeling intensity module 622 is configured to receive first feeling intensity data 624 and second feeling intensity data 626 (e.g., input 209 and input 216; block 308 and block 320). The first feeling intensity data 624 may describe a first intensity of the feeling associated with the user (e.g., as indicated by the feeling assessment data 602) being treated via the digital therapeutic 600 at a first point in time. The second feeling intensity data 626 may describe a second intensity of the feeling associated with the user at a second point in time. According to one example, the second point in time is later than the first point in time. According to one example, the first feeling intensity data 624 may be provided to the feeling intensity module 622 via user input as discussed, for example, with regard to FIG. 2B above. Similarly, 48 [00152] [00152] In response to receiving the first feeling intensity data 624 and the second feeling intensity data 626, the feeling intensity module is configured to generate feeling intensity difference data 628 (e.g., interface element 23 1h of FIG. 2M; block 322). The feeling intensity difference data 628 may indicate any difference (including, in some examples, no difference) between the first feeling intensity data 624 and the second feeling intensity data 626. For example, and as discussed with respect to element 231h of FIG. 
2M above, the feeling intensity difference data 628 may indicate a change (e.g., a drop) in the intensity of a particular feeling experienced by the user receiving treatment via the digital therapeutic 600. [00153] [00153] The thinking traps module 634 is configured to receive the automatic thought selection data 612. In addition, the thinking traps module 634 is configured to identify a plurality of potential thinking traps 636 based on the feeling assessment data 602. By way of example and not limitation, the plurality of potential thinking traps 636 may be identified from within a database or the like (not shown) storing a variety of thinking traps. Each potential thinking trap of the plurality of potential thinking traps 636 may correspond to a negative emotional tendency, such as overgeneralizing, catastrophizing, etc. Further, the thinking traps module 634 is configured to receive thinking trap selection data 640 (e.g., input 219). The thinking trap selection data 640 may identify a particular potential thinking trap 638 from among the plurality of potential thinking traps 636. According to one example, the thinking trap selection data 640 may be provided to the thinking traps module 634 via user input as discussed, for example, with regard to FIGS. 2F-2H above. [00154] [00154] The company module 644 is configured to receive company selection data 642 (e.g., input 223). The company selection data 642 may identify, by relationship type (e.g., partner, children, sibling, parent, friend, co-worker, etc.), a person who accompanied a user of the digital therapeutic 600 at a time in which the user experienced the feeling described by the feeling assessment data 602, or, whether the user was alone when they experienced the feeling described by the feeling assessment data 602. As discussed in additional detail below, in some examples, the company 49 [00155] [00155] The location module 648 is configured to receive location selection data 646 (e.g., input 226). The location selection data 646 may identify a location (e.g., home, doctor, work, commute, store, etc.) of the user at the time in which the user experienced the feeling described by the feeling assessment data 602. As discussed in additional detail below, in some examples, the location selection data 646 may be provided to the journal module 654 for use in generating a journal entry 656. [00156] [00156] The multiple sclerosis symptom module 652 is configured to receive multiple sclerosis symptom selection data 650 (e.g., input 229). The multiple sclerosis symptom selection data 650 may identify one or more multiple sclerosis symptoms (e.g., relapse, fatigue, brain fog, tremor, focus, memory, balance problems, vision problems, etc.) associated with the user. As discussed in additional detail below, in some examples, the multiple sclerosis symptom selection data 650 may be provided to the journal module 654 for use in generating a journal entry 656. [00157] [00157] The journal module 654 is configured to receive the company selection data 642, location selection data 646, multiple sclerosis symptom selection data 650, the particular potential thinking trap 638, the feeling intensity difference data 628, particular potential automatic thought 610, and the particular potential alternative thought 618. In response to receiving one or more of the foregoing types of data, the journal module 654 is configured to generate a journal entry 656 including some or all of the foregoing types of data. 
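The journal module 654 assembling a journal entry 656 from some or all of the foregoing types of data could be sketched as follows; the field and function names are assumed, and the example values echo those used earlier in the description.

```kotlin
// Hypothetical sketch of the journal module 654 assembling a journal entry 656.
// Any subset of the inputs may be present, as noted in the description.

data class JournalEntry656(
    val company: String?,
    val location: String?,
    val msSymptoms: List<String>,
    val thinkingTrap: String?,
    val intensityDifference: Int?,
    val automaticThought: String?,
    val alternativeThought: String?
)

fun generateJournalEntry(
    company: String? = null,
    location: String? = null,
    msSymptoms: List<String> = emptyList(),
    thinkingTrap: String? = null,
    intensityDifference: Int? = null,
    automaticThought: String? = null,
    alternativeThought: String? = null
) = JournalEntry656(
    company, location, msSymptoms, thinkingTrap,
    intensityDifference, automaticThought, alternativeThought
)

fun main() {
    val entry = generateJournalEntry(
        company = "My Sibling",
        location = "Home",
        msSymptoms = listOf("Tremor"),
        intensityDifference = -39
    )
    println(entry)
}
```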
On example of a generated journal entry 656 is shown with regard to FIG. 2M and discussed above. [00158] [00158] The display module 630 is configured to receive the generated journal entry 656 and generate display data 632 representing the generated journal entry 656. For example, according to one embodiment, the display module 630 is configured to generate display data 632 representing a generated journal entry 656 that includes all of the following types of data: company selection data 642, location selection data 646, multiple sclerosis symptom selection data 650, particular potential thinking trap 638, feeling intensity difference data 628, particular potential automatic thought 610, and particular potential alternative thought 618, as shown, for example, in FIG. 2M. According to another embodiment, the display module 630 is configured to generate 50 [00159] [00159] Among other advantages, the present disclosure provides electronic devices and methods for implementing a prescription digital therapeutic configured to treat depressive symptoms associated with MS. The digital therapeutic may administer cognitive behavioral therapy (CBT) to treat the depressive symptoms. More specifically, the digital therapeutic may implement both cognitive therapy as well as behavioral activation as part of the administered CBT. Administration of CBT via the digital therapeutics described herein may serve to correct distorted cognitions that can cause patients to have a negative view of themselves, the world, and the future. [00160] [00160] The present disclosure also provides a digital therapeutic that includes a plurality of GUIs to help a user/patient understand situations, symptoms, and automatic thoughts related to their negative feelings; check their thoughts against a set of common cognitive distortions or “thinking traps”; and identify alternative thoughts that are more helpful and realistic. The patient/user may be provided with examples of automatic and alternative thoughts that were obtained from a large sample of people with MS. [00161] [00161] The present disclosure also provides a digital therapeutic to help patients/users focus on developing skills to cope with MS symptoms, such as brain fog and fatigue, related to depression. The digital therapeutic of the present disclosure provides 24/7 access to support and resources for treating depressive symptoms associated with MS. [00162] [00162] The present disclosure also provides a digital therapeutic to reduce depressive symptoms associated with multiple sclerosis according to clinical measurements. For example, the digital therapeutic described herein improves patient condition according to one or more of the following clinical measurements: MADRS, BDI-II, and PHQ-9. For example, the digital therapeutic described herein creates physiological changes in patients. [00163] [00163] Certain implementations of the disclosed technology are described above with reference to block and flow diagrams of systems and methods and/or computer program products according to example implementations of the disclosed technology. It will be understood that one or more blocks of the block diagrams and flow diagrams, and combinations of blocks in the block diagrams and flow diagrams, respectively, can be implemented by computer-executable program instructions. 
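Along the same lines, the display module 630 turning a generated journal entry 656 into display data 632 might, as a self-contained sketch, render labeled rows resembling those of FIG. 2M. The row labels echo the journal GUI description above, while the type and function names (and the "Feeling Change" label) are assumptions.

```kotlin
// Self-contained, hypothetical sketch of the display module 630 producing
// labeled rows resembling the journal GUI of FIG. 2M.

data class JournalData(
    val startFeeling: String,
    val company: String,
    val location: String,
    val intensityChange: String
)

fun toDisplayRows(entry: JournalData): List<Pair<String, String>> = listOf(
    "Start Feeling" to entry.startFeeling,
    "Who I Was With" to entry.company,
    "Where" to entry.location,
    "Feeling Change" to entry.intensityChange   // label assumed
)

fun main() {
    val rows = toDisplayRows(
        JournalData("Scared (59%)", "Alone", "Doctor's office", "-39 points")
    )
    rows.forEach { (label, value) -> println("$label: $value") }
}
```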
Likewise, some blocks of the block diagrams and flow diagrams may not necessarily need to be performed in the order presented, may be repeated, or may not necessarily need to be performed at all, according to some implementations of the disclosed technology. [00164] [00164] The terminology used herein is for the purpose of describing particular exemplary configurations only and is not intended to be limiting. As used herein, the singular articles “a,” “an,” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. The terms “comprises,” “comprising,” “including,” and “having,” are inclusive and therefore specify the presence of features, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups thereof. The method steps, processes, and operations described herein are not to be construed as necessarily requiring their performance in the particular order discussed or illustrated, unless specifically identified as an order of performance. Additional or alternative steps may be employed. [00165] [00165] Although the following description uses terms “first,” “second,” etc. to describe various elements, these elements should not be limited by the terms. These terms are only used to distinguish one element from another. For example, a first touch could be termed a second touch, and, similarly, a second touch could be termed a first touch, without departing from the scope of the various described embodiments. The first touch and the second touch are both touches, but they are not the same touch. [00166] [00166] Various implementations of the electronic devices, systems, techniques, and modules described herein can be realized in digital electronic and/or optical circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations 52 [00167] [00167] These computer programs (also known as programs, software, software applications or code) include machine instructions for a programmable processor, and can be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms “machine- readable medium” and "computer-readable medium” refer to any computer program product, non- transitory computer readable medium, apparatus and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor. [00168] [00168] The processes and logic flows described in this specification can be performed by one or more programmable processors, also referred to as data processing hardware, executing one or more computer programs to perform functions by operating on input data and generating output. The processes and logic flows can also be performed by special purpose logic circuitry, e.g., an FPGA (field programmable gate array) or an ASIC (application specific integrated circuit). 
Processors suitable for the execution of a computer program include, by way of example, both general and special purpose microprocessors, and any one or more processors of any kind of digital computer. Generally, a processor will receive instructions and data from a read only memory or a random access memory or both. The essential elements of a computer are a processor for performing instructions and one or more memory devices for storing instructions and data. Generally, a computer will also include, or be operatively coupled to receive data from or transfer data to, or both, one or more mass storage devices for storing data, e.g., magnetic, magneto optical disks, or optical disks. However, a computer need not have such devices. [00169] [00169] To provide for interaction with a user, one or more aspects of the disclosure can be implemented on a computer having a display device, e.g., a CRT (cathode ray tube), LCD (liquid crystal display) monitor, or touch screen for displaying information to the user and optionally a keyboard and a pointing device, e.g., amouse or a trackball, by which the user can provide input to the computer. Other kinds of devices can be used to provide interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user can be received in any form, including acoustic, speech, or tactile input. In addition, a computer can interact with a user by sending documents to and receiving documents from a device that is used by the user; for example, by sending web pages to a web browser on a user's client device in response to requests received from the web browser. [00170] [00170] A number of implementations have been described. Nevertheless, it will be understood that various modifications may be made without departing from the spirit and scope of the disclosure. Accordingly, other implementations are within the scope of the following claims, including the following implementations, expressed as interrelated items: Item 1. An electronic device for displaying feeling intensity inputs, the electronic device comprising: a display; an input device; one or more processors; and memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for: 54 DK 2019 70328 A1 generating, for display on the display, a journal entry, the journal entry indicating at least any difference between a first feeling intensity input and a second feeling intensity input. Item 2. The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for: displaying, on the display, a feeling selection interface, the feeling selection interface presenting a plurality of feeling interface elements, each feeling interface element being associated with a particular feeling; and while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs, the first sequence of inputs including a feeling selection input, the feeling selection input corresponding to a particular feeling interface element. Item 3. 
The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for: displaying, on the display, a feeling spectrum interface, the feeling spectrum interface presenting a plurality of intensities associated with the particular feeling; and while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs, the second sequence of inputs including a first feeling intensity input, the first feeling intensity input corresponding to a first intensity of the plurality of intensities. Item 4. The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for: displaying, on the display, an automatic thought selection interface, the automatic thought selection interface presenting a plurality of automatic thought interface elements, each automatic thought interface element being associated with a particular automatic thought; and while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs, the third sequence of inputs including an 55 DK 2019 70328 A1 automatic thought selection input, the automatic thought selection input corresponding to a particular automatic thought interface element. Item 5. The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for: displaying, on the display, an alternative thought selection interface, the alternative thought selection interface presenting a plurality of alternative thought interface elements, each alternative thought interface element being associated with a particular alternative thought; and while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input, the alternative thought selection input corresponding to a particular alternative thought interface element. Item 6. The electronic device as in any of the preceding items, wherein the journal entry is modified to further indicate one or more particular thinking trap interface elements. Item 7. The electronic device as in any of the preceding items, wherein the one or more programs also include instructions for: in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element, the quick recap interface element indicating the particular automatic thought and the one or more particular thinking trap elements. Item 8. The electronic device as in any of the preceding items, wherein the journal entry is modified to further indicate the particular alternative thought interface element. Item 9. A computerized method for displaying feeling intensity inputs, the method comprising: at an electronic device including a display and an input device: 56 DK 2019 70328 A1 receiving data corresponding to a first feeling intensity input; receiving data corresponding to a second feeling intensity input; generating, for display on the display, a journal entry, the journal entry indicating at least any difference between the data corresponding to the first feeling intensity input and the data corresponding to the second feeling intensity input. Item 10. 
The computerized method of Item 9, wherein the method further comprising: at the electronic device including a display and an input device: displaying, on the display, a feeling selection interface, the feeling selection interface presenting a plurality of feeling interface elements, each feeling interface element being associated with a particular feeling; and while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs, the first sequence of inputs including a feeling selection input, the feeling selection input corresponding to a particular feeling interface element. Item 11. The computerized method as in any one of Items 9 and 10, wherein the method further comprising: at the electronic device including a display and an input device: displaying, on the display, a feeling spectrum interface, the feeling spectrum interface presenting a plurality of intensities associated with the particular feeling; and while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs, the second sequence of inputs including a first feeling intensity input, the first feeling intensity input corresponding to a first intensity of the plurality of intensities. Item 12. The computerized method as in any one of Items 9, 10, and 11, wherein the method further comprising: at the electronic device including a display and an input device: 57 DK 2019 70328 A1 displaying, on the display, an automatic thought selection interface, the automatic thought selection interface presenting a plurality of automatic thought interface elements, each automatic thought interface element being associated with a particular automatic thought; and while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs, the third sequence of inputs including an automatic thought selection input, the automatic thought selection input corresponding to a particular automatic thought interface element. Item 13. The computerized method as in any one of Items 9, 10, 11, and 12, wherein the method further comprising: at the electronic device including a display and an input device: displaying, on the display, an alternative thought selection interface, the alternative thought selection interface presenting a plurality of alternative thought interface elements, each alternative thought interface element being associated with a particular alternative thought; while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input, the alternative thought selection input corresponding to a particular alternative thought interface element. Item 14. The computerized method as in any one of Items 9, 10, 11, 12, and 13, wherein the method further comprising: at the electronic device including a display and an input device: displaying, on the display, the feeling spectrum interface; and while displaying the feeling spectrum interface, receiving, via the input device, a fifth sequence of inputs, the fifth sequence of inputs including a second feeling intensity input, the second feeling intensity input corresponding to a second intensity of the plurality of intensities. 58 DK 2019 70328 A1 Item 15. The computerized method as in any one of Items 9, 10, 11, 12, 13, and 14, wherein the journal entry is modified to further indicate one or more particular thinking trap interface elements. 
Item 16. The computerized method as in any one of Items 9, 10, 11, 12, 13, 14, and 15, wherein the method further comprises:
at the electronic device including a display and an input device:
displaying, on the display, a quick recap interface element, the quick recap interface element indicating the particular automatic thought and the one or more particular thinking trap elements.

Item 17. The computerized method as in any one of Items 9, 10, 11, 12, 13, 14, 15, and 16, wherein the journal entry is modified to further indicate the particular alternative thought interface element.

Item 18. A computerized method for displaying feeling intensity inputs, the method comprising:
at an electronic device including a display and an input device:
determining any difference between a first intensity and a second intensity to provide feeling intensity difference data; and
displaying, on the display, the feeling intensity difference data.

Item 19. The computerized method of Item 18, the method further comprising:
receiving, via the input device, feeling assessment data, the feeling assessment data describing a feeling associated with a user; and
receiving, via the input device, first feeling intensity data, the first feeling intensity data describing a first intensity of the feeling associated with the user.

Item 20. The computerized method as in any one of Items 18 and 19, the method further comprising:
identifying a plurality of potential automatic thoughts based on the feeling associated with the user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought; and
receiving, via the input device, automatic thought selection data, the automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts.

Item 21. The computerized method as in any one of Items 18, 19, and 20, the method further comprising:
identifying a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought; and
receiving, via the input device, alternative thought selection data, the alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts.

Item 22. An electronic device, comprising:
a display;
an input device;
one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
determining any difference between a first intensity and a second intensity to provide feeling intensity difference data; and
displaying, on the display, the feeling intensity difference data.

Item 23. The electronic device of Item 22, wherein the one or more programs also include instructions for:
receiving, via the input device, feeling assessment data, the feeling assessment data describing a feeling associated with a user; and
receiving, via the input device, first feeling intensity data, the first feeling intensity data describing a first intensity of the feeling associated with the user.
Item 24. The electronic device as in any one of Items 22 and 23, wherein the one or more programs also include instructions for:
identifying a plurality of potential automatic thoughts based on the feeling associated with the user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought; and
receiving, via the input device, automatic thought selection data, the automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts.

Item 25. The electronic device as in any one of Items 22, 23, and 24, wherein the one or more programs also include instructions for:
identifying a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought; and
receiving, via the input device, alternative thought selection data, the alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts.

Item 26. A digital therapeutic for treating depressive symptoms associated with multiple sclerosis, the digital therapeutic comprising:
a display module, the display module configured to generate display data representing feeling intensity difference data.

Item 27. The digital therapeutic of Item 26, further comprising:
an automatic thought identification module, the automatic thought identification module configured to (i) identify a plurality of potential automatic thoughts based on feeling assessment data describing a feeling associated with a user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought and (ii) receive automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts.

Item 28. The digital therapeutic as in any one of Items 26 and 27, further comprising:
an alternative thought identification module, the alternative thought identification module configured to (i) identify a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought and (ii) receive alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts.

Item 29. The digital therapeutic as in any one of Items 26, 27, and 28, further comprising:
a feeling intensity module, the feeling intensity module configured to (i) receive first feeling intensity data describing a first intensity of the feeling associated with the user at a first point in time; (ii) receive second feeling intensity data describing a second intensity of the feeling associated with the user at a second point in time, the second point in time being later than the first point in time; and (iii) generate feeling intensity difference data, the feeling intensity difference data indicating any difference between the first intensity and the second intensity.

Item 30. The digital therapeutic as in any one of Items 26, 27, 28, and 29, further comprising:
a feeling assessment module, the feeling assessment module configured to receive the feeling assessment data describing the feeling associated with the user.
Item 31. The digital therapeutic as in any one of Items 26, 27, 28, 29, and 30, further comprising:
a thinking traps module, the thinking traps module configured to (i) identify a plurality of potential thinking traps based on the feeling assessment data and (ii) receive thinking trap selection data identifying one or more particular potential thinking traps from among the plurality of potential thinking traps.
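For readers approaching the itemized embodiments above from an implementation angle, the following sketch illustrates one way the module decomposition recited in Items 26-31 (feeling assessment, automatic and alternative thought identification, feeling intensity, and display modules) could be organized in software. It is a minimal, hypothetical example only: the class names, the thought catalogues, and the numeric intensity values are illustrative assumptions and are not taken from the disclosure.

```python
# Illustrative sketch only -- hypothetical names, not the patented implementation.
from dataclasses import dataclass


@dataclass
class FeelingAssessment:
    """Feeling assessment data describing a feeling associated with a user."""
    feeling: str  # e.g. "sad" or "anxious" (hypothetical labels)


class AutomaticThoughtIdentificationModule:
    """(i) Identify potential automatic (negative) thoughts for a feeling;
    (ii) receive the user's selection (cf. Item 27)."""

    CATALOGUE = {
        "sad": ["Nothing will ever get better", "I am a burden to others"],
        "anxious": ["I will not be able to cope with new symptoms"],
    }

    def identify(self, assessment: FeelingAssessment) -> list:
        return list(self.CATALOGUE.get(assessment.feeling, []))

    def receive_selection(self, candidates: list, index: int) -> str:
        return candidates[index]


class AlternativeThoughtIdentificationModule:
    """(i) Identify potential alternative (positive) thoughts for the selected
    automatic thought; (ii) receive the user's selection (cf. Item 28)."""

    REFRAMES = {
        "Nothing will ever get better": ["My symptoms have eased before"],
        "I am a burden to others": ["The people around me choose to help"],
    }

    def identify(self, automatic_thought: str) -> list:
        return list(self.REFRAMES.get(automatic_thought,
                                      ["There may be another way to see this"]))

    def receive_selection(self, candidates: list, index: int) -> str:
        return candidates[index]


class FeelingIntensityModule:
    """Receive a first and a later second intensity and generate feeling
    intensity difference data (cf. Item 29)."""

    def difference(self, first_intensity: int, second_intensity: int) -> int:
        return second_intensity - first_intensity


class DisplayModule:
    """Generate display data representing the feeling intensity difference (cf. Item 26)."""

    def render(self, difference: int) -> str:
        if difference < 0:
            return f"Your feeling intensity decreased by {abs(difference)}."
        if difference > 0:
            return f"Your feeling intensity increased by {difference}."
        return "Your feeling intensity did not change."


if __name__ == "__main__":
    assessment = FeelingAssessment(feeling="sad")
    auto_module = AutomaticThoughtIdentificationModule()
    alt_module = AlternativeThoughtIdentificationModule()

    negative = auto_module.receive_selection(auto_module.identify(assessment), 0)
    positive = alt_module.receive_selection(alt_module.identify(negative), 0)

    diff = FeelingIntensityModule().difference(first_intensity=8, second_intensity=5)
    print(negative, "->", positive)
    print(DisplayModule().render(diff))
```

The separation into small modules mirrors the itemized structure: each module owns one step of the cognitive-reframing exercise, and only the feeling intensity module needs to compare the before and after inputs.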
Claims (29)

1. An electronic device for treating depressive symptoms associated with multiple sclerosis, the electronic device comprising:
a display;
an input device;
one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
displaying, on the display, a feeling selection interface, the feeling selection interface presenting a plurality of feeling interface elements, each feeling interface element being associated with a particular feeling;
while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs, the first sequence of inputs including a feeling selection input, the feeling selection input corresponding to a particular feeling interface element;
in response to receiving the feeling selection input, displaying, on the display, a feeling spectrum interface, the feeling spectrum interface presenting a plurality of intensities associated with the particular feeling;
while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs, the second sequence of inputs including a first feeling intensity input, the first feeling intensity input corresponding to a first intensity of the plurality of intensities;
in response to receiving the first feeling intensity input, displaying, on the display, an automatic thought selection interface, the automatic thought selection interface presenting a plurality of automatic thought interface elements, each automatic thought interface element being associated with a particular automatic thought;
while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs, the third sequence of inputs including an automatic thought selection input, the automatic thought selection input corresponding to a particular automatic thought interface element;
in response to receiving the automatic thought selection input, displaying, on the display, an alternative thought selection interface, the alternative thought selection interface presenting a plurality of alternative thought interface elements, each alternative thought interface element being associated with a particular alternative thought;
while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input, the alternative thought selection input corresponding to a particular alternative thought interface element;
in response to receiving the alternative thought selection input, displaying, on the display, the feeling spectrum interface;
while displaying the feeling spectrum interface, receiving, via the input device, a fifth sequence of inputs, the fifth sequence of inputs including a second feeling intensity input, the second feeling intensity input corresponding to a second intensity of the plurality of intensities; and
generating, for display on the display, a journal entry, the journal entry indicating at least any difference between the first feeling intensity input and the second feeling intensity input.
2. The electronic device of claim 1, wherein the one or more programs further include instructions for:
in response to receiving the automatic thought selection input:
displaying, on the display, a thinking traps interface, the thinking traps interface presenting a plurality of thinking trap interface elements associated with the particular automatic thought interface element, each thinking trap interface element being associated with a particular thinking trap;
while displaying the thinking traps interface, receiving, via the input device, a sixth sequence of inputs, the sixth sequence of inputs including one or more thinking trap selection inputs, the one or more thinking trap selection inputs corresponding to one or more particular thinking trap interface elements; and
wherein the journal entry is modified to further indicate the one or more particular thinking trap interface elements.

3. The electronic device of claim 2, wherein the one or more programs further include instructions for:
in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element, the quick recap interface element indicating the particular automatic thought and the one or more particular thinking trap elements.

4. The electronic device as in any one of the preceding claims, wherein the journal entry is modified to further indicate the particular alternative thought interface element.

5. The electronic device as in any one of the preceding claims, wherein the one or more programs further include instructions for:
in response to receiving the feeling selection input:
displaying, on the display, a company selection interface, the company selection interface presenting a plurality of company interface elements, each company interface element being associated with a particular relationship type;
while displaying the company selection interface, receiving, via the input device, a seventh sequence of inputs, the seventh sequence of inputs including a company selection input, the company selection input corresponding to a particular company interface element; and
wherein the journal entry is modified to further indicate the particular company interface element.

6. The electronic device as in any one of the preceding claims, wherein the one or more programs further include instructions for:
in response to receiving the feeling selection input:
displaying, on the display, a location selection interface, the location selection interface presenting a plurality of location interface elements, each location interface element being associated with a particular location;
while displaying the location selection interface, receiving, via the input device, an eighth sequence of inputs, the eighth sequence of inputs including a location selection input, the location selection input corresponding to a particular location interface element; and
wherein the journal entry is modified to further indicate the particular location interface element.
7. The electronic device as in any one of the preceding claims, wherein the one or more programs further include instructions for:
in response to receiving the feeling selection input:
displaying, on the display, a multiple sclerosis symptoms selection interface, the multiple sclerosis symptoms selection interface presenting a plurality of multiple sclerosis symptom interface elements, each multiple sclerosis symptom interface element being associated with a particular multiple sclerosis symptom;
while displaying the multiple sclerosis symptoms selection interface, receiving, via the input device, a ninth sequence of inputs, the ninth sequence of inputs including one or more multiple sclerosis symptom selection inputs, the one or more multiple sclerosis symptom selection inputs corresponding to one or more particular multiple sclerosis symptom interface elements; and
wherein the journal entry is modified to further indicate the one or more particular multiple sclerosis symptom interface elements.

8. A computerized method for treating depressive symptoms associated with multiple sclerosis, the method comprising:
at an electronic device including a display and an input device:
displaying, on the display, a feeling selection interface, the feeling selection interface presenting a plurality of feeling interface elements, each feeling interface element being associated with a particular feeling;
while displaying the feeling selection interface, receiving, via the input device, a first sequence of inputs, the first sequence of inputs including a feeling selection input, the feeling selection input corresponding to a particular feeling interface element;
in response to receiving the feeling selection input, displaying, on the display, a feeling spectrum interface, the feeling spectrum interface presenting a plurality of intensities associated with the particular feeling;
while displaying the feeling spectrum interface, receiving, via the input device, a second sequence of inputs, the second sequence of inputs including a first feeling intensity input, the first feeling intensity input corresponding to a first intensity of the plurality of intensities;
in response to receiving the first feeling intensity input, displaying, on the display, an automatic thought selection interface, the automatic thought selection interface presenting a plurality of automatic thought interface elements, each automatic thought interface element being associated with a particular automatic thought;
while displaying the automatic thought selection interface, receiving, via the input device, a third sequence of inputs, the third sequence of inputs including an automatic thought selection input, the automatic thought selection input corresponding to a particular automatic thought interface element;
in response to receiving the automatic thought selection input, displaying, on the display, an alternative thought selection interface, the alternative thought selection interface presenting a plurality of alternative thought interface elements, each alternative thought interface element being associated with a particular alternative thought;
while displaying the alternative thought selection interface, receiving, via the input device, a fourth sequence of inputs, the fourth sequence of inputs including an alternative thought selection input, the alternative thought selection input corresponding to a particular alternative thought interface element;
in response to receiving the alternative thought selection input, displaying, on the display, the feeling spectrum interface;
while displaying the feeling spectrum interface, receiving, via the input device, a fifth sequence of inputs, the fifth sequence of inputs including a second feeling intensity input, the second feeling intensity input corresponding to a second intensity of the plurality of intensities; and
generating, for display on the display, a journal entry, the journal entry indicating at least any difference between the first feeling intensity input and the second feeling intensity input.

9. The computerized method of claim 8, further comprising:
in response to receiving the automatic thought selection input:
displaying, on the display, a thinking traps interface, the thinking traps interface presenting a plurality of thinking trap interface elements associated with the particular automatic thought interface element, each thinking trap interface element being associated with a particular thinking trap;
while displaying the thinking traps interface, receiving, via the input device, a sixth sequence of inputs, the sixth sequence of inputs including one or more thinking trap selection inputs, the one or more thinking trap selection inputs corresponding to one or more particular thinking trap interface elements; and
wherein the journal entry is modified to further indicate the one or more particular thinking trap interface elements.

10. The computerized method of claim 9, further comprising:
in response to receiving the one or more thinking trap selection inputs, displaying, on the display, a quick recap interface element, the quick recap interface element indicating the particular automatic thought and the one or more particular thinking trap elements.

11. The computerized method as in any one of claims 8, 9, and 10, wherein the journal entry is modified to further indicate the particular alternative thought interface element.

12. The computerized method as in any one of claims 8, 9, 10, and 11, further comprising:
in response to receiving the feeling selection input:
displaying, on the display, a company selection interface, the company selection interface presenting a plurality of company interface elements, each company interface element being associated with a particular relationship type;
while displaying the company selection interface, receiving, via the input device, a seventh sequence of inputs, the seventh sequence of inputs including a company selection input, the company selection input corresponding to a particular company interface element; and
wherein the journal entry is modified to further indicate the particular company interface element.

13. The computerized method as in any one of claims 8, 9, 10, 11, and 12, further comprising:
in response to receiving the feeling selection input:
displaying, on the display, a location selection interface, the location selection interface presenting a plurality of location interface elements, each location interface element being associated with a particular location;
while displaying the location selection interface, receiving, via the input device, an eighth sequence of inputs, the eighth sequence of inputs including a location selection input, the location selection input corresponding to a particular location interface element; and
wherein the journal entry is modified to further indicate the particular location interface element.
14. The computerized method as in any one of claims 8, 9, 10, 11, 12, and 13, further comprising:
in response to receiving the feeling selection input:
displaying, on the display, a multiple sclerosis symptoms selection interface, the multiple sclerosis symptoms selection interface presenting a plurality of multiple sclerosis symptom interface elements, each multiple sclerosis symptom interface element being associated with a particular multiple sclerosis symptom;
while displaying the multiple sclerosis symptoms selection interface, receiving, via the input device, a ninth sequence of inputs, the ninth sequence of inputs including one or more multiple sclerosis symptom selection inputs, the one or more multiple sclerosis symptom selection inputs corresponding to one or more particular multiple sclerosis symptom interface elements; and
wherein the journal entry is modified to further indicate the one or more particular multiple sclerosis symptom interface elements.

15. A computerized method for treating depressive symptoms associated with multiple sclerosis, the method comprising:
at an electronic device including a display and an input device:
receiving, via the input device, feeling assessment data, the feeling assessment data describing a feeling associated with a user;
receiving, via the input device, first feeling intensity data, the first feeling intensity data describing a first intensity of the feeling associated with the user;
identifying a plurality of potential automatic thoughts based on the feeling associated with the user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought;
receiving, via the input device, automatic thought selection data, the automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts;
identifying a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought;
receiving, via the input device, alternative thought selection data, the alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts;
receiving, via the input device, second feeling intensity data, the second feeling intensity data describing a second intensity of the feeling associated with the user;
determining any difference between the first intensity and the second intensity to provide feeling intensity difference data; and
displaying, on the display, the feeling intensity difference data.
16. An electronic device, comprising:
a display;
an input device;
one or more processors; and
memory storing one or more programs configured to be executed by the one or more processors, the one or more programs including instructions for:
receiving, via the input device, feeling assessment data, the feeling assessment data describing a feeling associated with a user;
receiving, via the input device, first feeling intensity data, the first feeling intensity data describing a first intensity of the feeling associated with the user;
identifying a plurality of potential automatic thoughts based on the feeling associated with the user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought;
receiving, via the input device, automatic thought selection data, the automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts;
identifying a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought;
receiving, via the input device, alternative thought selection data, the alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts;
receiving, via the input device, second feeling intensity data, the second feeling intensity data describing a second intensity of the feeling associated with the user;
determining any difference between the first intensity and the second intensity to provide feeling intensity difference data; and
displaying, on the display, the feeling intensity difference data.
17. A digital therapeutic for treating depressive symptoms associated with multiple sclerosis, the digital therapeutic comprising:
an automatic thought identification module, the automatic thought identification module configured to (i) identify a plurality of potential automatic thoughts based on feeling assessment data describing a feeling associated with a user, each potential automatic thought of the plurality of potential automatic thoughts corresponding to a negative thought and (ii) receive automatic thought selection data identifying a particular potential automatic thought from among the plurality of potential automatic thoughts;
an alternative thought identification module, the alternative thought identification module configured to (i) identify a plurality of potential alternative thoughts based on the automatic thought selection data, each potential alternative thought of the plurality of potential alternative thoughts corresponding to a positive thought and (ii) receive alternative thought selection data identifying a particular potential alternative thought from among the plurality of potential alternative thoughts;
a feeling intensity module, the feeling intensity module configured to (i) receive first feeling intensity data describing a first intensity of the feeling associated with the user at a first point in time; (ii) receive second feeling intensity data describing a second intensity of the feeling associated with the user at a second point in time, the second point in time being later than the first point in time; and (iii) generate feeling intensity difference data, the feeling intensity difference data indicating any difference between the first intensity and the second intensity; and
a display module, the display module configured to generate display data representing the feeling intensity difference data.

18. The digital therapeutic of claim 17, further comprising:
a feeling assessment module, the feeling assessment module configured to receive the feeling assessment data describing the feeling associated with the user.

19. The digital therapeutic of claim 17, further comprising:
a thinking traps module, the thinking traps module configured to (i) identify a plurality of potential thinking traps based on the feeling assessment data and (ii) receive thinking trap selection data identifying one or more particular potential thinking traps from among the plurality of potential thinking traps.

20. The digital therapeutic as in any one of claims 17, 18, and 19, further comprising:
a journal module, the journal module configured to generate a journal entry comprising at least the feeling intensity difference data.

21. The digital therapeutic of claim 20, further comprising:
a company module, the company module configured to receive company selection data identifying, by relationship type, a person who accompanied the user at a time in which the user experienced the feeling; and
wherein the journal entry further comprises the company selection data.

22. The digital therapeutic as in any one of claims 20 and 21, further comprising:
a location module, the location module configured to receive location selection data identifying a location of the user at a time in which the user experienced the feeling; and
wherein the journal entry further comprises the location selection data.
23. The digital therapeutic as in any one of claims 20-22, further comprising:
a multiple sclerosis symptom module, the multiple sclerosis symptom module configured to receive multiple sclerosis symptom selection data identifying one or more multiple sclerosis symptoms associated with the user; and
wherein the journal entry further comprises the multiple sclerosis symptom selection data.

24. The digital therapeutic as in any one of claims 20-23, wherein the journal entry further comprises the thinking trap selection data.

25. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and an input device, the one or more programs including instructions for performing the method of any of claims 8-14.

26. A non-transitory computer-readable storage medium storing one or more programs configured to be executed by one or more processors of an electronic device with a display and an input device, the one or more programs including instructions for performing the method of claim 15.

27. A method for treatment of depressive symptoms of multiple sclerosis in a subject in need thereof, comprising administering to said subject a computerized method of any of claims 8-14.

28. A method for treatment of depressive symptoms of multiple sclerosis in a subject in need thereof, comprising administering to said subject a computerized method of claim 15.

29. A method for treatment of depressive symptoms of multiple sclerosis in a subject in need thereof, comprising administering to said subject a digital therapeutic of any of claims 17-24.
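To make the sequence recited in claims 1 and 8 concrete, the sketch below walks through the same flow (feeling selection, feeling spectrum, automatic thought, alternative thought, feeling spectrum again, journal entry) reduced to console prompts. It is a minimal, hypothetical illustration: the 1-10 intensity scale, the sample thought labels, and the `JournalEntry` structure are assumptions of this sketch, and the claims do not prescribe any particular data structure or user-interface toolkit.

```python
# Hypothetical walk-through of the claim 1 / claim 8 interface sequence.
from dataclasses import dataclass


@dataclass
class JournalEntry:
    """Journal entry indicating at least any difference between the first and
    second feeling intensity inputs (further annotations per claims 2-7 omitted)."""
    feeling: str
    first_intensity: int
    second_intensity: int
    automatic_thought: str
    alternative_thought: str

    @property
    def intensity_difference(self) -> int:
        return self.second_intensity - self.first_intensity


def select(prompt: str, elements: list) -> str:
    """Stand-in for a selection interface: display elements, receive one input."""
    for number, element in enumerate(elements, start=1):
        print(f"  {number}. {element}")
    return elements[int(input(prompt)) - 1]


def run_session() -> JournalEntry:
    # Feeling selection interface, then feeling spectrum interface.
    feeling = select("Which feeling fits best? ", ["sad", "anxious", "frustrated"])
    first = int(input(f"How intense is '{feeling}' right now (1-10)? "))

    # Automatic thought selection interface, then alternative thought selection interface.
    automatic = select("Which automatic thought matches? ",
                       ["Nothing will ever get better", "I am a burden to others"])
    alternative = select("Pick an alternative thought: ",
                         ["My symptoms have eased before",
                          "The people around me choose to help"])

    # Feeling spectrum interface shown again, then the journal entry is generated.
    second = int(input(f"How intense is '{feeling}' now (1-10)? "))
    return JournalEntry(feeling, first, second, automatic, alternative)


if __name__ == "__main__":
    entry = run_session()
    print(f"Change in intensity: {entry.intensity_difference}")
```

In this sketch a negative `intensity_difference` after a completed session corresponds to the reduction in feeling intensity that the generated journal entry is intended to surface to the user.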
Sulfonates, polymers, resist compositions and patterning process
Washing machine
Washing machine
Device for fixture finishing and tension adjusting of membrane
Structure for Equipping Band in a Plane Cathode Ray Tube
Process for preparation of 7 alpha-carboxyl 9, 11-epoxy steroids and intermediates useful therein an
国家/地区
|