Research in Gerontological Nursing

Empirical Research 

Testing a Theory-Based Mobility Monitoring Protocol Using In-Home Sensors: A Feasibility Study

Blaine Reeder, PhD; Jane Chung, MSN, RN; Amanda Lazar, BS; Jonathan Joe, BS; George Demiris, PhD; Hilaire J. Thompson, PhD, RN


Mobility is a key factor in the performance of many everyday tasks required for independent living as a person ages. The purpose of this mixed-methods study was to test a theory-based mobility monitoring protocol by comparing sensor-based measures to self-report measures of mobility and assess the acceptability of in-home sensors with older adults. Standardized instruments to measure physical, psychosocial, and cognitive parameters were administered to 8 community-dwelling older adults at baseline, 3-month, and 6-month visits. Semi-structured interviews to characterize acceptability of the technology were conducted at the 3-month and 6-month visits. Technical issues prevented comparison of sensor-based measures with self-report measures. In-home sensor technology for monitoring mobility is acceptable to older adults. Implementing our theory-based mobility monitoring protocol in a field study in the homes of older adults is a feasible undertaking but requires more robust technology for sensor-based measure validation.

[Res Gerontol Nurs. 2013; 6(4):253–263.]



Mobility, the ability to move oneself from place to place in the environment without assistance (Shumway-Cook, Ciol, Yorkston, Hoffman, & Chan, 2005), is at the core of independent living for older adults because it underpins the performance of everyday tasks. For example, mobility is a key factor in basic activities of daily living (ADLs), such as bathing and getting out of bed, that are required for independence in personal care (Katz, Ford, Moskowitz, Jackson, & Jaffe, 1963). Likewise, mobility is a factor in performance of instrumental activities of daily living (IADLs), such as preparing food and doing laundry, that contribute to independence beyond basic personal care (Lawton & Brody, 1969). With the rapid growth of the older adult population in the next 2 decades, the number of older adults with mobility limitations will increase substantially, and the current health care system will be strained to deliver needed services to this population. Given that mobility is at the core of independent living, scalable approaches to monitor mobility levels of community-dwelling older adults are needed to help inform interventions that allow them to age in place. For example, the ability to monitor mobility in the home could allow family members or health care providers to identify the onset of depression if a decrease in activity levels is detected over time. It could also help older adults adhere to programs of behavioral change by allowing them to see their own physical activity levels.


Performance-based measures, such as the performance-oriented mobility assessment (Tinetti, 1986), can reliably and accurately assess mobility in older adults. Self-report measures of mobility, such as the Physical Activity Scale for the Elderly (Washburn, Smith, Jette, & Janney, 1993) and the Self-reported Physical Function Measures (SPFM, Alexander et al., 2000) instrument, correlate with performance-based measures. However, a drawback of performance-based and self-report measures is that they require face-to-face interactions between older adults and health care providers. Therefore, there is a need for mobility measures that can scale to a community level with a minimum of human effort. In-home sensor technologies that unobtrusively collect activity data may meet this need through sensor-based measures.

Webber, Porter, and Menec (2010) described a theoretical framework that incorporates physical, cognitive, psychosocial, environmental, and biographical determinants of mobility at increasing orders of “life space” along mobility zones that include the bedroom, home, outdoor area of the home, neighborhood, local service community (e.g., stores, financial institutions, health care facilities), region or country, and the world. They hypothesized that moving up each zone of the mobility framework requires a greater degree of independent mobility. Other life space research indicates that contextual factors related to living situation and home size play a role in mobility. Cress, Orini, and Kinsler (2011) found that the living space of retirement community residents was approximately 60% of that of community-dwelling older adults, and that retirement community residents took approximately 3,000 fewer steps per day.

To operationalize the determinants of mobility framework, we defined a protocol that includes self-report measures and seeks to correlate them with sensor-based measures to create cost-effective, technology-facilitated capabilities for the assessment of older adults’ mobility in community settings (Demiris & Thompson, 2012). The protocol operates at the home level of the mobility framework. Specifically, it implements methods to assess the various life space determinants of mobility using validated self-report measures, such as the Falls Efficacy Scale (FES, Tinetti, Richman, & Powell, 1990) for physical determinants, the Mini-Cog (Borson, Scanlan, Brush, Vitaliano, & Dokmak, 2000) for cognitive determinants, and the Geriatric Depression Scale, short form (GDS-SF, Yesavage et al., 1982–1983) for psychosocial determinants in conjunction with in-home sensor technology and observer scans of the home environment (Demiris & Thompson, 2012). The goal of the sensor-based measures is to automate detection of individual changes in mobility and inform timely interventions to support independence in older adults.

A sensor-based measure is a mobility measure derived through analysis of activity data collected by sensor technology. One approach to detecting human activity patterns is through sensors that are worn or carried on the body (Zijlstra & Aminian, 2007). While these kinds of sensors can provide location-based data for an individual person, they present inherent adherence and feasibility drawbacks because a person must remember to wear the equipment and to wear it properly (Allet, Knols, Shirato, & de Bruin, 2010).

A less obtrusive approach for monitoring mobility is to use sensor technologies installed in the home environment (Demiris, 2009). This approach shows promise without the drawbacks of body-worn sensors. A systematic review of evidence for health smart-home technologies to support independent living found that 12 of 13 studies classified as promising or effective evidence included an in-home monitoring technology component (Reeder, Meyer, et al., 2013). In one smart-home project, retrospective analysis of activity data after a fall by a community-dwelling 90-year-old woman showed a marked increase in restlessness detected by bed sensors leading up to the fall (Rantz, Skubic, & Miller, 2009). In another case, decreased overall activity levels correlated with a clinical test that indicated increasing depressive symptoms (Skubic, Alexander, Popescu, Rantz, & Keller, 2009). In a third study, retrospective analysis of activity data collected by sensors showed mixed results but demonstrated that decreased activity levels correlated with a major health event in the case of one participant (Brownsell, Blackburn, & Hawley, 2008). In addition, the feasibility of using in-home sensors to detect total daily activity, time out of home, and walking speed has been demonstrated in a longitudinal study that enrolled 232 cognitively intact, community-dwelling older adults (Kaye et al., 2011).

Evaluation studies of sensors and other home-based health technologies have shown that older adults find these technologies acceptable (Mahoney, Mahoney, & Liss, 2009; Reeder, Demiris, & Marek, 2013). However, there are challenges to implementing home-based technologies for use by older adults because home settings vary widely, and consumer technology is often designed with younger people as the target audience. Special consideration for the needs, preferences, and abilities of older adults must be taken with regard to design of technologies intended to support their independence at home.

Prior research demonstrates the potential for technology to assess mobility and older adults’ acceptance of home-based technology, but further research that engages older adults in home settings to evaluate sensor-based measures is needed. For example, a systematic review of technologies for frailty assessment, which included studies of mobility sensing, identified the need for feasibility studies of the reliability, validity, and acceptability of technologies that might predict increased risk of morbidity, disability, and death for older adults (Zaslavsky, Thompson, & Demiris, 2012). Therefore, we conducted this study in an effort to better define the relationship between sensor-based measures and self-report measures that relate to mobility by engaging older adults in their homes.

Purpose and Aims

This study was a feasibility test of the theory-based mobility monitoring protocol using validated self-report measures and sensor-based measures with healthy community-dwelling older adults (Demiris & Thompson, 2012). The study aims were to: (a) determine the extent to which novel sensor-based activity data correlate with validated self-report measures that represent physical, psychosocial, and cognitive mobility parameters; and (b) assess the acceptability of sensor technology installed in the homes of community-dwelling older adults over the course of a 6-month study. In addition, this field test identified challenges to implementing the mobility monitoring protocol in a real-world setting.



Method

Participants were recruited from an independent retirement community in Seattle, Washington. To be included in the study, each participant needed to be 65 or older, a resident of the participating community, and able to speak and understand written English. Exclusion criteria included having a known life expectancy of 6 months or less, inability to provide written informed consent, or unwillingness to install the sensor technology in the home. All study procedures were approved by the University Institutional Review Board.


Potential study participants were identified through a presentation at the facility that explained the study as well as snowball sampling procedures. Interested individuals completed information cards for later contact about enrollment. Consent was obtained in the privacy of each participant’s residence during the baseline visit of the study, prior to any procedures. Participants were compensated with a $10 gift card after the 3- and 6-month interview visits.


The retirement community is a mix of 1-, 2-, and 3-bedroom apartments with additional communal living areas (e.g., gym, patio, library). All residences in the community are provided with Internet access as part of the normal housing agreement between residents and the facility.

Sensor System Description

The sensor system was provided by a technology partner from another university. The system components included motion sensors, a purpose-built gateway, and a Web portal with administration interfaces and data visualization capabilities. The motion sensors used in this study were commercially available passive infrared sensors that have been used in other in-home monitoring studies (Kaye et al., 2011; Rantz et al., 2009; Skubic et al., 2009). These sensors are 2.5 inches by 2.5 inches and detect changes in motion through changes in ambient temperature when a person moves about a room. Passive infrared sensors detect motion in a cone-shaped area with a range of approximately 20 feet from the lens of the sensor. Figure 1 shows a photograph of a sensor installed in the home of a study participant. Activity data from individual sensors are wirelessly transmitted to the gateway and then routed to a remote server via the Internet. Server-side processes aggregated and presented data for viewing through a secure Web-based interface.
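As a rough illustration of this data flow, the sketch below aggregates motion events into daily per-room activity counts. The event format, room names, and function are our own hypothetical illustration, not the partner system's actual interface.

```python
from collections import Counter
from datetime import datetime

# Hypothetical motion events as (ISO timestamp, sensor location) pairs;
# the actual system's server-side data format was proprietary to the partner.
events = [
    ("2012-01-15T08:02:11", "kitchen"),
    ("2012-01-15T08:05:43", "kitchen"),
    ("2012-01-15T09:30:02", "bedroom"),
    ("2012-01-16T07:55:20", "kitchen"),
]

def daily_room_counts(events):
    """Aggregate raw motion events into counts per (date, room)."""
    counts = Counter()
    for timestamp, room in events:
        date = datetime.fromisoformat(timestamp).date()
        counts[(date.isoformat(), room)] += 1
    return dict(counts)

print(daily_room_counts(events))
# {('2012-01-15', 'kitchen'): 2, ('2012-01-15', 'bedroom'): 1, ('2012-01-16', 'kitchen'): 1}
```

Aggregates of this kind are the raw material from which sensor-based mobility measures (e.g., overall daily activity level) can be derived.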

Figure 1. Motion sensor installed in the home of a study participant. This figure illustrates how a motion sensor used in the study looks after it has been installed in the kitchen of a residence.

To assess the system before the study, we conducted a successful field test of an installation of the sensor system in the home of a research team member for approximately 4 weeks. Prior to enrollment of study participants, the research team conducted a study site visit that included training of all team members in installation of the system and a test of all equipment used for installations in participants’ homes. Our technology partner provided remote telephone and e-mail support to the onsite study leader during the field test, during installations at participant enrollment, and throughout the study period. The study leader from our research team (B.R.) was an experienced software engineer, system designer, and senior support technician with doctoral training in health informatics research. Figure 2 displays an example floor plan for sensor installation, showing the location of the gateway and five sensors with cone-shaped overlays to illustrate motion detection areas. Please note that this figure is for illustrative purposes only; overlay areas are not to scale nor intended to be precise engineering representations of device motion detection behavior.

Figure 2. Example floor plan configuration for sensor installation. This figure shows the locations of a gateway, sensors, and the detection direction of sensors.


Two members of the research team (B.R., J.C.) conducted baseline, 3-month (midpoint), and 6-month (exit) visits (January, April, and July 2012) with each participant that lasted 60 to 90 minutes. During the baseline visit, after participants provided consent, demographic data were collected for age, sex, race, ethnicity, history of chronic conditions, and current medications. Sensor systems were installed by one member of the research team during the baseline visit and removed during the exit visit at the end of the 6-month study period. Study data were collected through self-report instruments, interviews, and monthly calendars completed by participants to monitor falls. Table 1 shows the data collection schedule for the self-report measures, interviews, and fall calendars. The study procedures are described in greater detail below.

Table 1: Mobility Parameters, Self-Report Instruments, and Data Collection Schedule

At baseline and exit visits, all participants completed validated self-report instruments that assessed physical, psychosocial, and cognitive parameters of the mobility framework on which this study is based (Demiris & Thompson, 2012; Webber et al., 2010). Physical mobility parameters were measured using the SPFM (Alexander et al., 2000) instrument, FES (Tinetti et al., 1990), and Short Form 12-item Survey version 2 Physical Component Summary Measures (SF-12v2 PCS, Ware, Kosinski, & Keller, 1996). The SPFM consists of seven Katz Index of Independence in Activities of Daily Living (Katz ADLs) items (Katz et al., 1963), three Rosow-Breslau Scale items (Rosow & Breslau, 1966), and five Nagi performance scale items (Nagi, 1976). Psychosocial mobility parameters were measured using Short Form 12-item Survey version 2 Mental Component Summary Measures (SF-12v2 MCS, Ware et al., 1996), GDS-SF (Yesavage et al., 1982–1983), and Medical Outcomes Study Social Support Survey (MOS-SS, Sherbourne & Stewart, 1991). The cognitive mobility parameter was measured using the Mini-Cog, a 3-minute, three-item recall test used to screen for cognitive impairment in older adults (Borson et al., 2000). Each participant completed the SPFM during the baseline, 3-, and 6-month visits.

Participants also completed a monthly calendar to track any falls. They were asked to update the calendar on a daily basis with an “X” for any day they experienced a fall without injury, an asterisk to denote a fall that resulted in injury, or to leave the day blank if no fall occurred (Ganz, Higashi, & Rubenstein, 2005). A member of the research team (J.C.) visited each participant at the beginning of each month to exchange a blank fall calendar for a completed one. In addition, we conducted scans of the residential environment for barriers (e.g., tripping hazards, blocked access ways) and made inquiries about changes in the use of mobility aids (e.g., canes, walkers) during the baseline, 3-, and 6-month visits.

Data collected during each visit were entered into REDCap™, which was used to facilitate study management. Semi-structured interviews to assess participant perceptions with regard to obtrusiveness, usefulness, and acceptability of the sensor technology were conducted by two researchers (B.R., J.C.) during the 3- and 6-month visits. Interviews were recorded using a digital audio recorder. Table 2 lists the questions included in the acceptability interview guide.

Table 2: Acceptability Interview Guide Questions

Quantitative Data Analysis

For quantitative data analyses, nonparametric statistical tests were used because of the small sample size. Wilcoxon signed-rank tests were used to compare data between baseline and 6 months for all variables, except mobility; Friedman Analysis of Variance by Ranks was conducted to determine whether there were differences in mobility as measured at the three time points; and Mann-Whitney U tests were conducted for comparisons between fallers and non-fallers. All quantitative analyses were performed with SPSS version 17.
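To make the paired-comparison step concrete, the sketch below computes the Wilcoxon signed-rank W statistic in pure Python for hypothetical baseline and 6-month scores. The values are illustrative, not study data; the study itself used SPSS, and statistical libraries such as SciPy provide the same tests with p values.

```python
def wilcoxon_w(before, after):
    """Return the Wilcoxon signed-rank W statistic (the smaller signed-rank sum)."""
    # Keep nonzero paired differences (zero differences are discarded by the test)
    diffs = [a - b for b, a in zip(before, after) if a != b]
    # Rank the absolute differences, averaging ranks across ties
    ordered = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * len(diffs)
    i = 0
    while i < len(ordered):
        j = i
        while j + 1 < len(ordered) and abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
            j += 1
        avg_rank = (i + j) / 2 + 1  # average of 1-based rank positions i..j
        for k in range(i, j + 1):
            ranks[ordered[k]] = avg_rank
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    return min(w_plus, w_minus)

# Illustrative paired scores (e.g., a physical parameter at baseline and 6 months)
baseline = [52, 48, 55, 50, 47, 53, 49, 51]
month6 = [50, 47, 53, 49, 45, 52, 48, 50]
print(wilcoxon_w(baseline, month6))  # 0.0: every illustrative score declined
```

With eight pairs, W = 0 corresponds to a uniform decline, the most extreme result the test can produce for this sample size.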

Thematic Analysis

Recordings for the 3- and 6-month interviews were transcribed verbatim by members of the research team. Accuracy of each transcript was verified by another member of the research team by listening to the recording and reading the transcript. Thematic analysis was conducted to identify themes related to acceptability and perceptions of technology according to procedures described by Boyatzis (1998). Three coders independently coded one transcript for one of the 3-month interviews to create a codebook. Coders met to standardize codes and reconcile disagreements until consensus was reached about code application. Codes from the codebook were reviewed for content validity by the principal investigators (PIs) of the study (G.D., H.J.T.) after several transcripts were coded. Subsequent transcripts were independently coded by three coders and reconciled through weekly in-person meetings. The codebook was updated following each meeting. After all eight midpoint interviews were coded, the PIs and study leader (B.R., G.D., H.J.T.) met for a coding review before analyzing the remaining seven interview transcripts. Exit interview transcripts were coded in full and reconciled by at least three coders following the same procedure. Final results were summarized by the study leader with assistance from the other coders.

Quantitative Results

Demographic Characteristics

Eleven individuals expressed interest in the study and were contacted by the research team to discuss study specifics and to schedule a consent visit. One individual chose not to enroll after follow-up because his spouse preferred not to participate. Eight participants enrolled in the study, ranging in age from 79 to 86 (mean age = 83, SD = 2.2 years). Six participants were married couples living in three individual units, and the remaining 2 participants were single occupants of their residential units. Six of 8 participants (75%) had completed graduate school, 8 of 8 (100%) were White, and 7 of 8 (87.5%) were married. One participant died between the midpoint and exit interviews. Two individual participants experienced a single fall, each between the midpoint and exit interviews, for a total of two falls during the study period.

Self-Report Measures

There were no significant differences in mobility parameter measures between the baseline and 6-month visits. Overall, mobility decreased for most participants, due to lower Nagi scores, but this change was not significant across the three measurement time points (Table 3). Individuals who fell during the study period (n = 2) had higher baseline FES scores and lower SPFM scores than those who did not fall (Table 3); at later time points, these trends were reversed, with fallers having lower FES scores and higher SPFM scores. There was an overall decrease in mean GDS-SF score, indicating a downward trend for depressive symptoms, but this difference was not statistically significant (p = 0.06, Table 3).

Table 3: Assessment of Mobility Parameters at Baseline, 3-Month, and 6-Month Visits

Sensor-Based Measures

Implementation challenges exceeded the capacity of our technology partner to support the sensor system during the study. Early in the study, technical issues were related to cross-talk between similarly addressed sensors in adjacent installations. Cross-talk issues were resolved by reallocating sensor addresses within unique ranges and replacing the sensors that had conflicting addresses during scheduled study visits. Subsequent technical issues were related to breakdowns in transmission of activity data from the gateway hardware to the remote server in our technology partner’s data center. Data collected between resolution of the cross-talk issues and onset of the data transmission issues were insufficient to allow for analysis of activity patterns. As a result, technical issues prevented correlation of sensor-based measures with self-report measures. However, a positive outcome of these challenges was the opportunity to observe the effects of the technical issues on the attitudes of our study participants.
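The address-reallocation remedy can be illustrated schematically: each installation receives a non-overlapping block of sensor addresses so that adjacent units cannot produce conflicting transmissions. The function and unit names below are hypothetical, as the partner system's actual addressing scheme was not published.

```python
def allocate_address_ranges(homes, sensors_per_home, start=1):
    """Assign each home a unique, non-overlapping block of sensor addresses."""
    ranges = {}
    next_addr = start
    for home in homes:
        ranges[home] = list(range(next_addr, next_addr + sensors_per_home))
        next_addr += sensors_per_home  # advance past this home's block
    return ranges

print(allocate_address_ranges(["unit_A", "unit_B"], 5))
# {'unit_A': [1, 2, 3, 4, 5], 'unit_B': [6, 7, 8, 9, 10]}
```

Because the blocks are disjoint by construction, a sensor in one unit can never share an address with a sensor in a neighboring unit.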

Qualitative Results: Acceptability of In-Home Sensors

Thematic analysis of the 15 interview transcripts (8 midpoint, 7 exit) resulted in three emergent themes: Perceptions of Technology (subthemes: Understanding of Technology and Obtrusiveness), Perceived Usefulness of Data (subthemes: Usefulness for Different Stakeholders and Desire to Access Personal Activity Data), and Privacy (subthemes: Privacy Concerns and Willingness to Share Data). In addition, we compared participant responses for agreement from the midpoint to exit interviews to determine changes in attitudes over time.

Perceptions of Technology

Understanding of Technology. An explanation of the sensor system and its function was given during the recruiting presentation and study enrollment process. Nevertheless, some participants held misperceptions of technology features and functionality. For example, one participant thought the system was sending real-time data and alerts about specific activities, including detection of falls, leading him to comment: “We are pleased that they’re here because they may be very valuable if we fall.”

Obtrusiveness. Seven participants explicitly stated that the sensors in their residences were not obtrusive to them, and one stated that she thought of them when visitors asked about them. Common comments about the sensors were: “We don’t think about them at all” and “I don’t even notice them. Once in a while I look up and see one, but they don’t bother me at all.” Most participants reported that visitors did not notice the sensors. One participant’s comment exemplified a more common response: “No one has commented on them, and we’ve had a fair number of people in and out. Nobody has said: ‘What’s that?’ or even commented.”

Perceived Usefulness of Data

Usefulness for Different Stakeholders. Participants had positive views of the potential usefulness of personal activity data in general, but they had mixed opinions about its usefulness for themselves and greater agreement about its potential usefulness for other stakeholders. These views are exemplified by the following quote: “I am the major player at gathering the data but I’m a minor player in interpreting it, because I don’t know what’s going on with me and I don’t recognize it.” Participants identified potential uses for personal activity data, such as generating automatic alerts to family or community staff about sudden changes in activity levels or detecting changes caused by the aging process. One participant noted the need to begin activity monitoring as a preventive action before a change in health status to establish a baseline: “You don’t wait until you [have] got some changes and then start monitoring.”

Desire to Access Personal Activity Data. Most participants expressed a desire to access their personal activity data if the data were made available to them. One participant cited entertainment value as a possible motivator to view her personal activity data: “It would be kind of fun instead of playing solitaire to look up the data, yeah. If I could do it I might do it.” For those who wished to access their personal activity data, the desired frequency of access ranged from every month to every 6 months, with approximately every 3 months being the most commonly reported.


Privacy

Privacy Concerns. When asked, all participants explicitly stated they had no major privacy concerns. Most participants were willing to disclose the presence of the sensors to others as indicated by stories of telling their family members and other residents about the sensor systems in their homes. Participants did acknowledge that they might have fewer privacy concerns than others in the population. One participant commented: “Privacy is a tremendously important issue for some people and for others, they could care less.” This same participant also noted the potential for privacy to cause harm: “The only thing it does is maybe isolate you to a greater extent.”

Willingness to Share Data. All participants expressed a willingness to share their personal activity data with family members and health care providers. However, some raised issues about the circumstances of data sharing related to data security, the potential for judgment, and loss of control in decision making. One participant was concerned about possible judgment of her activities: “If it showed something, the lazy part of my life, maybe I wouldn’t show it. I don’t know.” Another participant related data sharing to managing finances with her daughter: “I trust my daughter but we don’t think alike about everything.” When asked about sharing data with health care providers, she responded: “Yeah, I think so. In fact, I’d probably be more truthful with them than I will with my daughter at times.”

Changes in Attitudes Over Time

At the 6-month point, there were no substantive changes from the 3-month mark for any participant in the areas of privacy concerns (generally “none”), willingness to share data (generally “yes”), willingness to disclose presence of sensors to others (generally “yes”), desire to access personal data (generally “yes”), or desire to control when and where sensors function (generally “no”).

None of the participants reported that their perceptions of the sensor technology had changed during the study period. However, at the 3-month mark, one participant remarked:

I can see a lot more value to this study than I did before I started talking about it but I see what you are really trying to do now. It’s much better. I didn’t understand it that well to start with.

Statements made by 2 participants in the exit interview did not support their expressed opinions about lack of attitude change, as exemplified by the following quote:

I think I was kind of excited about it [the sensor system] at first but the longer it was here the less—I don’t know. I think I convinced myself that there was not enough time for any useful information to be gleaned.

At 3 months, 2 of 8 participants explicitly reported they thought the technology would be useful to them. One of these participants died before the 6-month exit interview. At 6 months, 3 of 7 participants stated they thought the sensor system would be personally useful to them. Both participants who did not perceive the sensor system as useful at 3 months maintained this opinion at 6 months. Perception of a positive state of personal health seemed to correlate with lack of perceived personal need for the technology and the idea that it would be good for “someone else.” Interestingly, it appears that technical issues with the sensor system negatively influenced two of the participants’ perceptions about the usefulness of the technology, even though they both reported a desire to see their personal data.


Discussion

This feasibility study evaluated the acceptability of in-home sensor technology with older adults and tested a theory-based mobility monitoring protocol in a community setting (Demiris & Thompson, 2012). A broad goal of the study was to inform sensor-based measures as a means to easily detect individual changes in mobility and inform timely responses to environmental demands to support independent living for older adults.

Unfortunately, technical issues, as previously described, prevented us from validating sensor-based measures against self-report measures in our mobility monitoring protocol. These technical issues were unexpected because the commercially available sensors used in the study have been deployed successfully in numerous other studies. For example, the feasibility of using passive infrared sensors to detect total daily activity, time out of home, and walking speed has been demonstrated in a large longitudinal study of community-dwelling older adults (Kaye et al., 2011). In addition, several other studies have demonstrated the feasibility of using passive infrared sensors with older adults in home settings over the long term (Rantz et al., 2009; Skubic et al., 2009; Tomita, Mann, Stanton, Tomita, & Sundar, 2007).

While these cases demonstrate successful implementations of sensor technology for in-home health monitoring with older adults, other cases detail the challenges of technology installation in home settings (Bowles et al., 2012; Mahoney, 2004; Williams, Arthur, Niedens, Moushey, & Hutfles, 2013). A systematic review of the barriers and drivers of the use of consumer health technologies with older adults found that usability and reliability issues are often associated with the use of early-stage systems in research (Jimison et al., 2008). Our results indicate a requirement for study designs that use more mature technology with greater resources to support technology installations. In addition, participants should be explicitly informed about the possibility of technical problems to avoid overdependence on study technology (Mahoney et al., 2007).

Perceived usefulness has been shown to be a determinant of a person’s intention to adopt a new technology (Venkatesh & Davis, 2000), and older adults are motivated to use new technology if they perceive a benefit from its use (Melenhorst, Rogers, & Bouwhuis, 2006; Tomita et al., 2007; Wild, Boise, Lundell, & Foucek, 2008). For those participants in our study who were uncertain about the usefulness of the technology, their opinions appear to have been influenced by their experience of technical issues with the in-home sensor system. The implication for research is that if perceived usefulness is a determinant of technology adoption, and experiences with unreliable technology cause some older adults to question a technology’s usefulness, then researchers must ensure that technologies deployed in the homes of older adults are reliable. Looking beyond research to broader implementations, the implication is that some portion of the older adult population, after experiencing early technical issues, may never again use a novel technology that could help them stay independent. In this study, however, our participants considered the sensor technology acceptable regardless of their views about its usefulness.

Participants stated that the potential usefulness of activity data for themselves and their health care providers would be maximized if sensor data collection began before any health issues, such as falls, depression, or sleep interruptions, occurred. However, in contrast to their own professed views, when asked whether they would install a sensor system to monitor their own activity patterns, some participants said they felt no need to do so because they were enjoying good health. This finding suggests a need for research to understand how to promote technology adoption by those at risk for mobility declines once reliable technologies that indicate functional decline become available. In addition, given the error inherent in self-report measures, future research in this area should include efforts to develop and validate objective measures of physical activity.

One recognized issue related to deployment of passive infrared sensors for activity monitoring is the ability to distinguish between multiple residents or visitors in collected sensor data (Reeder, Meyer, et al., 2013). Recent research has shown that it is possible to detect visitors in the homes of study participants with a high degree of sensitivity and specificity using only data collected from passive infrared sensors (Petersen, Larimer, Kaye, Pavel, & Hayes, 2012). Other research has demonstrated the ability to disambiguate in-home walking speed of multiple residents using sensor data and clinical assessment data (Austin, Hayes, Kaye, Mattek, & Pavel, 2011). A third approach is to use additional sensors in conjunction with passive infrared sensors to collect height data that can identify individual residents as they pass through doorways (Hnat, Griffiths, Dawson, & Whitehouse, 2012). All three approaches make it feasible to identify individual residents in multi-person homes without body-worn sensors. In any case, disambiguating individual residents from sensor data is a post-processing issue, and the technical outcomes of the current study precluded our addressing it.

There was a downward trend in mean GDS-SF score from baseline to study exit in this observational study. This improvement may be explained by seasonal effects such as little natural sunlight at baseline (winter) in the Pacific Northwest where this study was conducted (Sumaya, Rienzi, Deegan, & Moss, 2001). As we were not specifically examining this effect, we cannot rule out other factors, and a longer follow-up study with a larger sample would be required for confirmation.

Studies in home settings with older adults must account for great variation in participant characteristics due to individual changes that result from normal aging. Further, there may be as yet undiscovered patterns of change in the way people perceive the usefulness of technology as they age. This study provides new insights into factors that can affect technology implementations in the homes of older adults. Practical observations were made with regard to implementing the theory-based mobility monitoring protocol. For instance, enrollment, instrument administration, and technology installation required a team of two researchers working in tandem to shorten study visit duration and minimize participant burden. Good documentation of installations is key to efficient troubleshooting of technical issues. For example, detailed floor plans with sensor locations help identify problems and support resolution planning prior to an onsite visit. One of the most valuable lessons learned is that a selected technology may not be as robust as it appears, even after initial field testing.


Limitations

The participant sample in this study was racially homogeneous and of a higher socioeconomic status than the general U.S. population of adults aged 65 and older. Thus, the findings presented in this article may not generalize to larger populations of older adults within or outside the United States. In addition, the small sample of this pilot study limits the generalizability of the findings.


Conclusion

This study demonstrated the feasibility of implementing a theory-based mobility monitoring protocol over a 6-month study period with simultaneous enrollment of multiple community-dwelling older adults. Technical issues precluded successful comparison of activity data with self-report measures to inform sensor-based mobility measures (our first aim). However, these challenges presented an opportunity to observe the effect of technical issues on the acceptability and perceived usefulness of in-home sensor technology during a field study with older adults (our second aim). Future work will involve identification of reliable technology and a pilot study with a greater number of participants to compare self-report measures and sensor-based measures using our theory-based mobility monitoring protocol.


References

  • Alexander, N.B., Guire, K.E., Thelen, D.G., Ashton-Miller, J.A., Schultz, A.B., Grunawalt, J.C. & Giordani, B. (2000). Self-reported walking ability predicts functional mobility performance in frail older adults. Journal of the American Geriatrics Society, 48, 1408–1413.
  • Allet, L., Knols, R.H., Shirato, K. & de Bruin, E.D. (2010). Wearable systems for monitoring mobility-related activities in chronic disease: A systematic review. Sensors, 10, 9026–9052. doi:10.3390/s101009026 [CrossRef]
  • Austin, D., Hayes, T.L., Kaye, J., Mattek, N. & Pavel, M. (2011). On the disambiguation of passively measured in-home gait velocities from multi-person smart homes. Journal of Ambient Intelligence and Smart Environments, 3, 165–174. doi:10.3233/ais-2011-0107 [CrossRef]
  • Borson, S., Scanlan, J., Brush, M., Vitaliano, P. & Dokmak, A. (2000). The Mini-Cog: A cognitive ‘vital signs’ measure for dementia screening in multi-lingual elderly. International Journal of Geriatric Psychiatry, 15, 1021–1027. doi:10.1002/1099-1166(200011)15:11<1021::AID-GPS234>3.0.CO;2-6 [CrossRef]
  • Bowles, K.H., O’Connor, M., Hanlon, A., Naylor, M.D., Riegel, B., Weiner, M. & Glick, H. (2012, January–February). Barriers to cost and clinical efficiency with telehomecare and proposed solutions. Paper presented at eTELEMED 2012, The Fourth International Conference on eHealth, Telemedicine, and Social Medicine, Valencia, Spain.
  • Boyatzis, R.E. (1998). Transforming qualitative information: Thematic analysis and code development. Thousand Oaks, CA: Sage.
  • Brownsell, S., Blackburn, S. & Hawley, M.S. (2008). An evaluation of second and third generation telecare services in older people’s housing. Journal of Telemedicine and Telecare, 14, 8–12. doi:10.1258/jtt.2007.070410 [CrossRef]
  • Cress, M.E., Orini, S. & Kinsler, L. (2011). Living environment and mobility of older adults. Gerontology, 57, 287–294. doi:10.1159/000322195 [CrossRef]
  • Demiris, G. (2009, September). Privacy and social implications of distinct sensing approaches to implementing smart homes for older adults. Paper presented at the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN. doi:10.1109/IEMBS.2009.5333800 [CrossRef]
  • Demiris, G. & Thompson, H.J. (2012). Mobilizing older adults: Harnessing the potential of smart home technologies. Contribution of the IMIA Working Group on Smart Homes and Ambient Assisted Living. Yearbook of Medical Informatics, 7(1), 94–99.
  • Ganz, D.A., Higashi, T. & Rubenstein, L.Z. (2005). Monitoring falls in cohort studies of community-dwelling older people: Effect of the recall interval. Journal of the American Geriatrics Society, 53, 2190–2194. doi:10.1111/j.1532-5415.2005.00509.x [CrossRef]
  • Hnat, T.W., Griffiths, E., Dawson, R. & Whitehouse, K. (2012, November). Doorjamb: Unobtrusive room-level tracking of people in homes using doorway sensors. Paper presented at the 10th Association for Computing Machinery Conference on Embedded Network Sensor Systems, Toronto, Ontario. doi:10.1145/2426656.2426687 [CrossRef]
  • Jimison, H., Gorman, P., Woods, S., Nygren, P., Walker, M., Norris, S. & Hersh, W. (2008). Barriers and drivers of health information technology use for the elderly, chronically ill, and underserved (AHRQ Publication No. 09-E004). Retrieved from the Agency for Healthcare Research and Quality website.
  • Katz, S., Ford, A.B., Moskowitz, R.W., Jackson, B.A. & Jaffe, M.W. (1963). Studies of illness in the aged. The index of ADL: A standardized measure of biological and psychosocial function. Journal of the American Medical Association, 185, 914–919. doi:10.1001/jama.1963.03060120024016 [CrossRef]
  • Kaye, J.A., Maxwell, S.A., Mattek, N., Hayes, T.L., Dodge, H., Pavel, M. & Zitzelberger, T.A. (2011). Intelligent systems for assessing aging changes: Home-based, unobtrusive, and continuous assessment of aging. Journals of Gerontology. Series B, Psychological Sciences and Social Sciences, 66(Suppl. 1), i180–i190. doi:10.1093/geronb/gbq095 [CrossRef]
  • Lawton, M.P. & Brody, E.M. (1969). Assessment of older people: Self-maintaining and instrumental activities of daily living. The Gerontologist, 9, 179–186. doi:10.1093/geront/9.3_Part_1.179 [CrossRef]
  • Mahoney, D.F. (2004). Linking home care and the workplace through innovative wireless technology: The Worker Interactive Networking (WIN) project. Home Health Care Management & Practice, 16, 417–428. doi:10.1177/1084822304264616 [CrossRef]
  • Mahoney, D.F., Mahoney, E.L. & Liss, E. (2009). AT EASE: Automated technology for elder assessment, safety, and environmental monitoring. Gerontechnology, 8(1), 11–25. doi:10.4017/gt.2009. [CrossRef]
  • Mahoney, D.F., Purtilo, R.B., Webbe, F.M., Alwan, M., Bharucha, A.J., Adlam, T.D. & Becker, S.A. (2007). In-home monitoring of persons with dementia: Ethical guidelines for technology research and development. Alzheimer’s and Dementia, 3, 217–226. doi:10.1016/j.jalz.2007.04.388 [CrossRef]
  • Melenhorst, A.S., Rogers, W.A. & Bouwhuis, D.G. (2006). Older adults’ motivated choice for technological innovation: Evidence for benefit-driven selectivity. Psychology and Aging, 21, 190–195. doi:10.1037/0882-7974.21.1.190 [CrossRef]
  • Nagi, S.Z. (1976). An epidemiology of disability among adults in the United States. The Milbank Memorial Fund Quarterly. Health and Society, 54, 439–467. doi:10.2307/3349677 [CrossRef]
  • Petersen, J., Larimer, N., Kaye, J.A., Pavel, M. & Hayes, T.L. (2012, August–September). SVM to detect the presence of visitors in a smart home environment. Paper presented at the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA. doi:10.1109/EMBC.2012.6347324 [CrossRef]
  • Rantz, M.J., Skubic, M. & Miller, S.J. (2009, September). Using sensor technology to augment traditional healthcare. Paper presented at the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN. doi:10.1109/IEMBS.2009.5334587 [CrossRef]
  • Reeder, B., Demiris, G. & Marek, K.D. (2013). Older adults’ satisfaction with a medication dispensing device in home care. Informatics for Health and Social Care, 38, 211–222. doi:10.3109/17538157.2012.741084 [CrossRef]
  • Reeder, B., Meyer, E., Lazar, A., Chaudhuri, S., Thompson, H.J. & Demiris, G. (2013). Framing the evidence for health smart homes and home-based consumer health technologies as a public health intervention for independent aging: A systematic review. International Journal of Medical Informatics, 82, 565–579. doi:10.1016/j.ijmedinf.2013.03.007 [CrossRef]
  • Rosow, I. & Breslau, N. (1966). A Guttman Health Scale for the Aged. Journal of Gerontology, 21, 556–559. doi:10.1093/geronj/21.4.556 [CrossRef]
  • Sherbourne, C.D. & Stewart, A.L. (1991). The MOS Social Support Survey. Social Science and Medicine, 32, 705–714. doi:10.1016/0277-9536(91)90150-B [CrossRef]
  • Shumway-Cook, A., Ciol, M.A., Yorkston, K.M., Hoffman, J.M. & Chan, L. (2005). Mobility limitations in the Medicare population: Prevalence and sociodemographic and clinical correlates. Journal of the American Geriatrics Society, 53, 1217–1221. doi:10.1111/j.1532-5415.2005.53372.x [CrossRef]
  • Skubic, M., Alexander, G., Popescu, M., Rantz, M. & Keller, J. (2009). A smart home application to eldercare: Current status and lessons learned. Technology and Health Care, 17, 183–201. doi:10.3233/THC-2009-0551 [CrossRef]
  • Sumaya, I.C., Rienzi, B.M., Deegan, J.F. 2nd. & Moss, D.E. (2001). Bright light treatment decreases depression in institutionalized older adults: A placebo-controlled crossover study. Journals of Gerontology. Series A, Biological Sciences and Medical Sciences, 56, M356–M360. doi:10.1093/gerona/56.6.M356 [CrossRef]
  • Tinetti, M.E. (1986). Performance-oriented assessment of mobility problems in elderly patients. Journal of the American Geriatrics Society, 34, 119–126.
  • Tinetti, M.E., Richman, D. & Powell, L. (1990). Falls efficacy as a measure of fear of falling. Journal of Gerontology, 45(6), P239–P243. doi:10.1093/geronj/45.6.P239 [CrossRef]
  • Tomita, M.R., Mann, W.C., Stanton, K., Tomita, A.D. & Sundar, V. (2007). Use of currently available smart home technology by frail elders: Process and outcomes. Topics in Geriatric Rehabilitation, 23, 24–34. doi:10.1097/00013614-200701000-00005 [CrossRef]
  • Venkatesh, V. & Davis, F.D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46, 186–204. doi:10.1287/mnsc. [CrossRef]
  • Ware, J. Jr., Kosinski, M. & Keller, S.D. (1996). A 12-item Short-Form Health Survey: Construction of scales and preliminary tests of reliability and validity. Medical Care, 34, 220–233. doi:10.1097/00005650-199603000-00003 [CrossRef]
  • Washburn, R.A., Smith, K.W., Jette, A.M. & Janney, C.A. (1993). The Physical Activity Scale for the Elderly (PASE): Development and evaluation. Journal of Clinical Epidemiology, 46, 153–162. doi:10.1016/0895-4356(93)90053-4 [CrossRef]
  • Webber, S.C., Porter, M.M. & Menec, V.H. (2010). Mobility in older adults: A comprehensive framework. The Gerontologist, 50, 443–450. doi:10.1093/geront/gnq013 [CrossRef]
  • Wild, K., Boise, L., Lundell, J. & Foucek, A. (2008). Unobtrusive in-home monitoring of cognitive and physical health: Reactions and perceptions of older adults. Journal of Applied Gerontology, 27, 181–200. doi:10.1177/0733464807311435 [CrossRef]
  • Williams, K., Arthur, A., Niedens, M., Moushey, L. & Hutfles, L. (2013). In-home monitoring support for dementia caregivers: A feasibility study. Clinical Nursing Research, 22, 139–150. doi:10.1177/1054773812460545 [CrossRef]
  • Yesavage, J.A., Brink, T.L., Rose, T.L., Lum, O., Huang, V., Adey, M. & Leirer, V.O. (1982–1983). Development and validation of a geriatric depression screening scale: A preliminary report. Journal of Psychiatric Research, 17, 37–49. doi:10.1016/0022-3956(82)90033-4 [CrossRef]
  • Zaslavsky, O., Thompson, H. & Demiris, G. (2012). The role of emerging information technologies in frailty assessment. Research in Gerontological Nursing, 5, 216–228. doi:10.3928/19404921-20120410-02 [CrossRef]
  • Zijlstra, W. & Aminian, K. (2007). Mobility assessment in older people: New possibilities and challenges. European Journal of Ageing, 4(1), 3–12. doi:10.1007/s10433-007-0041-9 [CrossRef]

Mobility Parameters, Self-Report Instruments, and Data Collection Schedule

| Mobility Parameter | Self-Report Instrument | Description | Scoring | Schedule |
| --- | --- | --- | --- | --- |
| Physical | SPFM | 15-question functional mobility self-assessment instrument (items below) | 0 to 15; higher score = greater disability | Baseline, 3 months, 6 months |
| | • Katz ADLs | 7 items focused on walking and transferring | 0 or 1 for each item | |
| | • Rosow-Breslau scale | 3 items focused on walking | 0 or 1 for each item | |
| | • Nagi performance scale | 5 items focused on stooping, reaching, and lifting | 0 or 1 for each item | |
| | SF-12v2 PCS | 6-item SF-12v2 summary score of overall physical function | 0 to 100; higher score = better health | Baseline, 6 months |
| Psychosocial | SF-12v2 MCS | 6-item SF-12v2 summary score of overall mental function | 0 to 100; higher score = better health | Baseline, 6 months |
| | FES | 10-question instrument to rate confidence in task performance without falling | ⩾70 indicates fear of falling | Baseline, 6 months |
| | GDS-SF | 15-question survey used to screen for depression | 0 to 15; ⩾5 suggests depression | Baseline, 6 months |
| | MOS-SS | 19-item instrument to assess social support | 0 to 100; higher score = greater support | Baseline, 6 months |
| Cognitive | Mini-Cog | Cognitive screening tool with three-word recall and clock drawing tests (items below) | 0 to 5; 0 to 2 = possible impairment | Baseline, 6 months |
| | • TWR | Three-word recall test | 0 to 3; 1 for each word recalled | |
| | • CDT | Clock drawing test | 0 or 2; 2 = normal clock, 0 = abnormal clock | |
| | Fall calendars | | | Monthly |
| | Interviews | | | 3 months, 6 months |

Acceptability Interview Guide Questions

Midpoint and Exit Visits

What do you think about the sensors installed in your home?

Do you think often about the sensor system?

Did you have any visitors or family who asked about the sensors?

Does the system in your home change the way you carry out your daily activities?

Would data from these sensors that show how active you are in your home have any usefulness for you?

Would you like to see these data sets?

Would you share these data with your family? With your health care provider?

How often would you like to see the data about your own activities of daily living?

Would you like to be able to turn the sensor system on and off depending on your preferences at any given time?

Do you have any general privacy or other concerns with the use of home-based monitoring technology for health purposes?

Exit Visit Only

Did your overall attitude towards the system change over time?

Are there any other thoughts you would like to share about the system or this research study?

Assessment of Mobility Parameters at Baseline, 3-Month, and 6-Month Visits

| Mobility Parameter | Self-Report Instrument | Baseline (n = 8), Mean (SD) | 3 Months (n = 8), Mean (SD) | 6 Months (n = 7), Mean (SD) | p Value |
| --- | --- | --- | --- | --- | --- |
| Physical | SPFM | 20.3 (4.5) | 18.8 (1.5) | 19.1 (3.0) | 0.96 |
| | • Katz ADLs | 7.5 (0.8) | 7.5 (0.5) | 7.3 (0.5) | 0.78 |
| | • Rosow-Breslau scale | 2.3 (0.7) | 2.0 (0.5) | 2.4 (0.5) | 0.14 |
| | • Nagi performance scale | 10.5 (4.7) | 9.3 (1.5) | 9.4 (2.9) | 0.96 |
| | SF-12v2 PCS | 43.5 (9.2) | | 45.3 (7.4) | 0.50 |
| Psychosocial | SF-12v2 MCS | 52.8 (5.8) | | 60.3 (4.4) | 0.006 |
| | FES | 9.4 (1.2) | | 9.5 (0.7) | 0.18 |
| | GDS-SF | 1.1 (1.1) | | 0.3 (0.8) | 0.06 |
| | MOS-SS | 83.8 (5.7) | | 84.1 (8.7) | 0.53 |
| Cognitive | Mini-Cog TWR | 2.8 (0.7) | | 2.7 (0.8) | 1.00 |
| | Mini-Cog CDT | 2.0 (0.0) | | 2.0 (0.0) | 1.00 |

Dr. Reeder is Assistant Professor, College of Nursing, University of Colorado Anschutz Medical Campus, Aurora, Colorado. At the time this article was written, Dr. Reeder was Postdoctoral Fellow, Biobehavioral Nursing and Health Systems, School of Nursing, University of Washington, Seattle, Washington. Ms. Chung is a PhD student and Dr. Thompson is Associate Professor, Biobehavioral Nursing and Health Systems, School of Nursing; Ms. Lazar and Mr. Joe are PhD students, Biomedical Informatics and Medical Education, School of Medicine; and Dr. Demiris is Professor in both the School of Nursing and the School of Medicine, University of Washington, Seattle, Washington.

The authors have disclosed no potential conflicts of interest, financial or otherwise. This work was supported by National Institute of Nursing Research training grant T32NR007106 and National Library of Medicine training grant T15LM007442. The authors thank the study participants for sharing their time and homes with them.

Address correspondence to Blaine Reeder, PhD, Assistant Professor, College of Nursing, University of Colorado Anschutz Medical Campus, Mail Stop C288-19, 13120 E. 19th Avenue, Education 2 North, Aurora, CO 80045; e-mail:

Received: February 28, 2013
Accepted: July 09, 2013
Posted Online: August 05, 2013

